Sample records for quantitative approach based

  1. A simple approach to quantitative analysis using three-dimensional spectra based on selected Zernike moments.

    PubMed

    Zhai, Hong Lin; Zhai, Yue Yuan; Li, Pei Zhen; Tian, Yue Li

    2013-01-21

    A very simple approach to quantitative analysis is proposed based on digital image processing of three-dimensional (3D) spectra obtained by high-performance liquid chromatography coupled with a diode array detector (HPLC-DAD). Zernike moments, region-based shape features of a grayscale image with inherent invariance properties, were employed to establish linear quantitative models. This approach was applied to the quantitative analysis of three compounds in mixed samples using 3D HPLC-DAD spectra, and three linear models were obtained. The correlation coefficients (R²) for the training and test sets were greater than 0.999, and the statistical parameters and strict validation supported the reliability of the established models. The analytical results suggest that the Zernike moments selected by stepwise regression can be used in the quantitative analysis of target compounds. Our study provides a new idea for quantitative analysis using 3D spectra, which can be extended to other 3D spectra obtained by different methods or instruments.
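
    The pipeline in this abstract (image moments as features, then a linear model) can be sketched compactly. Below is a minimal illustration assuming the mahotas and scikit-learn libraries; the image scaling, moment degree, and the random placeholder data are assumptions for demonstration, and the authors' stepwise moment selection is omitted.

```python
# Sketch: quantify a compound from a 3D HPLC-DAD spectrum treated as a
# grayscale image, via Zernike moments plus a linear model.
import numpy as np
import mahotas
from sklearn.linear_model import LinearRegression

def zernike_features(spectrum_2d, degree=8):
    """spectrum_2d: absorbance matrix (retention time x wavelength)."""
    scaled = (spectrum_2d - spectrum_2d.min()) / (np.ptp(spectrum_2d) + 1e-12)
    img = (255 * scaled).astype(np.uint8)
    radius = min(img.shape) // 2          # moments over the inscribed disk
    return mahotas.features.zernike_moments(img, radius, degree=degree)

rng = np.random.default_rng(0)
spectra = [rng.random((120, 80)) for _ in range(10)]  # placeholder spectra
conc = rng.uniform(1, 10, size=10)                    # placeholder concentrations

X = np.array([zernike_features(s) for s in spectra])
model = LinearRegression().fit(X, conc)               # linear quantitative model
print(model.predict(X[:3]))
```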

  2. Predictive value of EEG in postanoxic encephalopathy: A quantitative model-based approach.

    PubMed

    Efthymiou, Evdokia; Renzel, Roland; Baumann, Christian R; Poryazova, Rositsa; Imbach, Lukas L

    2017-10-01

    The majority of comatose patients after cardiac arrest do not regain consciousness due to severe postanoxic encephalopathy. Early and accurate outcome prediction is therefore essential in determining further therapeutic interventions. The electroencephalogram (EEG) is a standardized and commonly available tool used to estimate prognosis in postanoxic patients. The identification of pathological EEG patterns with poor prognosis relies, however, primarily on visual EEG scoring by experts. We introduced a model-based approach to EEG analysis (state space model) that allows for an objective and quantitative description of spectral EEG variability. We retrospectively analyzed standard EEG recordings in 83 comatose patients after cardiac arrest between 2005 and 2013 in the intensive care unit of the University Hospital Zürich. Neurological outcome was assessed one month after cardiac arrest using the Cerebral Performance Category. For a dynamic and quantitative EEG analysis, we implemented a model-based approach (state space analysis) to quantify EEG background variability independent of visual scoring of EEG epochs. Spectral variability was compared between groups and correlated with clinical outcome parameters and visual EEG patterns. Quantitative assessment of spectral EEG variability (state space velocity) revealed significant differences between patients with poor and good outcome after cardiac arrest: lower mean velocity in temporal electrodes (T4 and T5) was significantly associated with poor outcome (p<0.005) and correlated with independently identified visual EEG patterns such as generalized periodic discharges (p<0.02). Receiver operating characteristic (ROC) analysis confirmed the predictive value of lower state space velocity for poor clinical outcome after cardiac arrest (AUC 80.8, 70% sensitivity, 15% false positive rate). Model-based quantitative EEG analysis (state space analysis) provides a novel, complementary marker for prognosis in postanoxic encephalopathy.
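
    A rough stand-in for the state-space idea, not the authors' model: band powers per epoch form a spectral feature vector, a low-dimensional projection serves as the "state", and velocity is the mean step size between consecutive states. Sampling rate, epoch length, frequency bands, and the PCA projection are all illustrative assumptions.

```python
# Sketch: a spectral "state space velocity" for one EEG channel.
import numpy as np
from scipy.signal import welch
from sklearn.decomposition import PCA

def state_space_velocity(eeg, fs=250, epoch_s=10,
                         bands=((1, 4), (4, 8), (8, 12), (12, 25))):
    n = int(fs * epoch_s)
    epochs = [eeg[i:i + n] for i in range(0, len(eeg) - n + 1, n)]
    feats = []
    for ep in epochs:
        f, pxx = welch(ep, fs=fs, nperseg=min(n, 1024))
        feats.append([np.log(pxx[(f >= lo) & (f < hi)].mean() + 1e-20)
                      for lo, hi in bands])
    states = PCA(n_components=2).fit_transform(np.array(feats))
    steps = np.linalg.norm(np.diff(states, axis=0), axis=1)
    return steps.mean()   # lower velocity ~ less spectral variability

rng = np.random.default_rng(1)
print(state_space_velocity(rng.standard_normal(250 * 600)))  # 10 min of noise
```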

  3. Decoding brain cancer dynamics: a quantitative histogram-based approach using temporal MRI

    NASA Astrophysics Data System (ADS)

    Zhou, Mu; Hall, Lawrence O.; Goldgof, Dmitry B.; Russo, Robin; Gillies, Robert J.; Gatenby, Robert A.

    2015-03-01

    Brain tumor heterogeneity remains a challenge for probing brain cancer evolutionary dynamics. In light of evolution, it is a priority to inspect the cancer system from a time-domain perspective, since this explicitly tracks the dynamics of cancer variations. In this paper, we study the problem of exploring brain tumor heterogeneity from temporal clinical magnetic resonance imaging (MRI) data. Our goal is to discover evidence-based knowledge from such temporal imaging data, where multiple clinical MRI scans from glioblastoma multiforme (GBM) patients are generated during therapy. In particular, we propose a quantitative histogram-based approach that builds a prediction model to measure the difference between histograms obtained pre- and post-treatment. The study could significantly assist radiologists by providing a metric to identify distinctive patterns within each tumor, which is crucial for the goal of providing patient-specific treatments. We examine the proposed approach on a practical application - clinical survival group prediction. Experimental results show that our approach achieved 90.91% accuracy.
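
    The core feature construction, a per-bin difference between pre- and post-treatment intensity histograms fed to a classifier, could look like the sketch below; the bin count, intensity range, and classifier are assumptions, and the data are random placeholders.

```python
# Sketch: histogram-difference features from pre-/post-treatment MRI
# intensities, fed to a classifier for survival-group prediction.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def hist_diff_features(pre_vox, post_vox, bins=32, rng=(0.0, 1.0)):
    h_pre, _ = np.histogram(pre_vox, bins=bins, range=rng, density=True)
    h_post, _ = np.histogram(post_vox, bins=bins, range=rng, density=True)
    return h_post - h_pre   # per-bin change in the intensity distribution

rng_ = np.random.default_rng(2)
X = np.array([hist_diff_features(rng_.random(5000), rng_.random(5000))
              for _ in range(20)])
y = rng_.integers(0, 2, size=20)          # toy survival groups
clf = RandomForestClassifier(random_state=0).fit(X, y)
print(clf.score(X, y))
```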

  4. A Quantitative Corpus-Based Approach to English Spatial Particles: Conceptual Symmetry and Its Pedagogical Implications

    ERIC Educational Resources Information Center

    Chen, Alvin Cheng-Hsien

    2014-01-01

    The present study aims to investigate how conceptual symmetry plays a role in the use of spatial particles in English and to further examine its pedagogical implications via a corpus-based evaluation of the course books in senior high schools in Taiwan. More specifically, we adopt a quantitative corpus-based approach to investigate whether bipolar…

  5. A novel logic-based approach for quantitative toxicology prediction.

    PubMed

    Amini, Ata; Muggleton, Stephen H; Lodhi, Huma; Sternberg, Michael J E

    2007-01-01

    There is a pressing need for accurate in silico methods to predict the toxicity of molecules that are being introduced into the environment or are being developed into new pharmaceuticals. Predictive toxicology is in the realm of structure-activity relationships (SAR), and many approaches have been used to derive such SAR. Previous work has shown that inductive logic programming (ILP) is a powerful approach that circumvents several major difficulties, such as molecular superposition, faced by some other SAR methods. The ILP approach reasons with chemical substructures within a relational framework and yields chemically understandable rules. Here, we report a general new approach, support vector inductive logic programming (SVILP), which extends the essentially qualitative ILP-based SAR to quantitative modeling. First, ILP is used to learn rules, the predictions of which are then used within a novel kernel to derive a support-vector generalization model. For a highly heterogeneous dataset of 576 molecules with known fathead minnow fish toxicity, the cross-validated correlation coefficients (R²CV) from a chemical descriptor method (CHEM) and SVILP are 0.52 and 0.66, respectively. The ILP, CHEM, and SVILP approaches correctly predict 55, 58, and 73%, respectively, of toxic molecules. In a set of 165 unseen molecules, the R² values from the commercial software TOPKAT and SVILP are 0.26 and 0.57, respectively. In all calculations, SVILP showed significant improvements in comparison with the other methods. The SVILP approach has a major advantage in that it uses ILP automatically and consistently to derive rules, mostly novel, describing fragments that are toxicity alerts. SVILP is a general machine-learning approach and has the potential of tackling many problems relevant to chemoinformatics, including in silico drug design.
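
    The kernel step of SVILP can be illustrated with a toy stand-in: each ILP-learned rule becomes a binary feature (rule covers molecule or not), and a kernel over shared rules feeds a support-vector regressor. The rule-coverage matrix below is random placeholder data, not real ILP output.

```python
# Sketch of the SVILP idea: rules as binary features, kernel = number of
# rules firing on both molecules, support-vector regression on toxicity.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(3)
n_molecules, n_rules = 60, 40
coverage = rng.integers(0, 2, size=(n_molecules, n_rules)).astype(float)
log_toxicity = rng.normal(size=n_molecules)   # placeholder targets

K = coverage @ coverage.T                     # K[i, j] = shared firing rules
model = SVR(kernel="precomputed").fit(K, log_toxicity)
print(model.predict(K[:5]))
```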

  6. Poem Generator: A Comparative Quantitative Evaluation of a Microworlds-Based Learning Approach for Teaching English

    ERIC Educational Resources Information Center

    Jenkins, Craig

    2015-01-01

    This paper is a comparative quantitative evaluation of an approach to teaching poetry in the subject domain of English that employs a "guided discovery" pedagogy using computer-based microworlds. It uses a quasi-experimental design in order to measure performance gains in computational thinking and poetic thinking following a…

  7. Standardization approaches in absolute quantitative proteomics with mass spectrometry.

    PubMed

    Calderón-Celis, Francisco; Encinar, Jorge Ruiz; Sanz-Medel, Alfredo

    2017-07-31

    Mass spectrometry-based approaches have enabled important breakthroughs in quantitative proteomics in the last decades. This development is reflected in better quantitative assessment of protein levels as well as in an improved understanding of post-translational modifications and of protein complexes and networks. Nowadays, the focus of quantitative proteomics has shifted from relative determination of proteins (i.e., differential expression between two or more cellular states) to absolute quantity determination, required for a more thorough characterization of biological models and comprehension of proteome dynamics, as well as for the search and validation of novel protein biomarkers. However, the physico-chemical environment of the analyte species strongly affects the ionization efficiency in most mass spectrometry (MS) types, thereby requiring specially designed standardization approaches to provide absolute quantification. The most common of such approaches nowadays include (i) the use of stable isotope-labeled peptide standards, isotopologues of the target proteotypic peptides expected after tryptic digestion of the target protein; (ii) the use of stable isotope-labeled protein standards to compensate for sample preparation, sample loss, and proteolysis steps; (iii) isobaric reagents, which after fragmentation in the MS/MS analysis provide a final detectable mass shift and can be used to tag both analyte and standard samples; (iv) label-free approaches, in which the absolute quantitative data are obtained not through any kind of labeling but from computational normalization of the raw data and adequate standards; (v) elemental mass spectrometry-based workflows able to provide absolute quantification of peptides/proteins that contain an ICP-detectable element directly. A critical insight, from the analytical chemistry perspective, into the different standardization approaches and their combinations used so far for absolute quantitative MS-based (molecular and elemental) analyses is provided.
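
    Approach (i), the stable isotope-labeled peptide standard, reduces to a simple ratio calculation once peak areas are integrated. A minimal sketch, with made-up numbers:

```python
# Sketch: absolute quantification with a stable isotope-labeled peptide
# standard. A known amount of the heavy isotopologue is spiked in; the
# light/heavy peak-area ratio scales that amount.
def absolute_amount(light_area: float, heavy_area: float,
                    spiked_heavy_fmol: float) -> float:
    """Amount of endogenous (light) peptide, in the spike's units."""
    return (light_area / heavy_area) * spiked_heavy_fmol

# e.g. light/heavy = 42000/56000 with 100 fmol heavy spiked -> 75 fmol
print(absolute_amount(42000.0, 56000.0, 100.0))
```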

  8. Two approaches to improving mental health care: positivist/quantitative versus skill-based/qualitative.

    PubMed

    Luchins, Daniel

    2012-01-01

    The quality improvement model currently used in medicine and mental health was adopted from industry, where it developed out of early 20th-century efforts to apply a positivist/quantitative agenda to improving manufacturing. This article questions the application of this model to mental health care. It argues that (1) developing "operational definitions" for something as value-laden as "quality" risks conflating two realms, what we measure with what we value; (2) when measurements that are tied to individuals are aggregated to establish benchmarks and goals, unwarranted mathematical assumptions are made; (3) choosing clinical outcomes is problematic; (4) there is little relationship between process measures and clinical outcomes; and (5) since changes in quality indices do not relate to improved clinical care, management's reliance on such indices provides an illusory sense of control. An alternative model is the older, skill-based/qualitative approach to knowing, which relies on "implicit/expert" knowledge. These two approaches offer a series of contrasts: quality versus excellence, competence versus expertise, management versus leadership, extrinsic versus intrinsic rewards. The article concludes that we need not totally dispense with the current quality improvement model, but rather should balance quantitative efforts with the older qualitative approach in a mixed methods model.

  9. A GIS-based Quantitative Approach for the Search of Clandestine Graves, Italy.

    PubMed

    Somma, Roberta; Cascio, Maria; Silvestro, Massimiliano; Torre, Eliana

    2018-05-01

    Previous research on RAG color-coded prioritization systems for the discovery of clandestine graves has not considered all the factors influencing the choice of burial site within a GIS project. The goal of this technical note was to discuss a GIS-based quantitative approach for the search of clandestine graves. The method is based on cross-referencing RAG maps with cumulative suitability factors for hosting a burial, leading to the production of different search scenarios for ground searches showing high- (Red), medium- (Amber), and low- (Green) priority areas. The application of this procedure allowed several outcomes to be determined: if the concealment occurs at night, then the "search scenario without the visibility" will be the most effective one; if the concealment occurs in daylight, then the "search scenario with the DSM-based visibility" will be most appropriate; and the different search scenarios may be cross-referenced with offenders' confessions and eyewitnesses' testimonies to verify the veracity of their statements. © 2017 American Academy of Forensic Sciences.
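
    One way to picture the cross-referencing of suitability layers into Red/Amber/Green priorities is the raster sketch below; the layers, weights, and class breaks are illustrative assumptions, not the paper's criteria.

```python
# Sketch: combining suitability layers into a RAG search-priority raster.
import numpy as np

rng = np.random.default_rng(7)
shape = (100, 100)
# each layer in [0, 1]: 1 = more suitable for concealing a burial
diggability = rng.random(shape)
cover = rng.random(shape)            # vegetation screening
road_access = rng.random(shape)
visibility = rng.random(shape)       # e.g. from a DSM viewshed, 1 = hidden

weights = {"dig": 0.3, "cover": 0.2, "road": 0.2, "vis": 0.3}
suitability = (weights["dig"] * diggability + weights["cover"] * cover
               + weights["road"] * road_access + weights["vis"] * visibility)

# class breaks -> 0 = Green (low priority), 1 = Amber, 2 = Red
rag = np.digitize(suitability, bins=[0.45, 0.6])
print(np.bincount(rag.ravel(), minlength=3))

# a night-time scenario would simply drop the visibility layer and reweight
```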

  10. Reproducibility and quantitation of amplicon sequencing-based detection

    PubMed Central

    Zhou, Jizhong; Wu, Liyou; Deng, Ye; Zhi, Xiaoyang; Jiang, Yi-Huei; Tu, Qichao; Xie, Jianping; Van Nostrand, Joy D; He, Zhili; Yang, Yunfeng

    2011-01-01

    To determine the reproducibility and quantitation of the amplicon sequencing-based detection approach for analyzing microbial community structure, a total of 24 microbial communities from a long-term global change experimental site were examined. Genomic DNA obtained from each community was used to amplify 16S rRNA genes with two or three barcode tags as technical replicates, in the presence of a small quantity (0.1% wt/wt) of genomic DNA from Shewanella oneidensis MR-1 as the control. The technical reproducibility of the amplicon sequencing-based detection approach is quite low, with an average operational taxonomic unit (OTU) overlap of 17.2%±2.3% between two technical replicates, and 8.2%±2.3% among three technical replicates, which is most likely due to problems associated with random sampling processes. Such variations in technical replicates could have substantial effects on estimates of β-diversity but less so on α-diversity. A high variation was also observed in the control across different samples (for example, 66.7-fold for the forward primer), suggesting that the amplicon sequencing-based detection approach may not be quantitative. In addition, various strategies were examined to improve the comparability of amplicon sequencing data, such as increasing biological replicates, and removing singleton sequences and less-representative OTUs across biological replicates. Finally, as expected, various statistical analyses with preprocessed experimental data revealed clear differences in the composition and structure of microbial communities between warming and non-warming, or between clipping and non-clipping. Taken together, these results suggest that amplicon sequencing-based detection is useful for analyzing microbial community structure, even though it is not reproducible or quantitative. However, great caution should be taken in experimental design and data interpretation when the amplicon sequencing-based detection approach is used for quantitative analyses.
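
    The replicate-overlap statistic can be computed directly from OTU sets. The sketch below expresses overlap as a percentage of the union of detected OTUs, which is one plausible definition; the paper's exact formula may differ.

```python
# Sketch: OTU overlap between two technical replicates.
def otu_overlap(rep1: set, rep2: set) -> float:
    shared = len(rep1 & rep2)
    union = len(rep1 | rep2)
    return 100.0 * shared / union if union else 0.0

a = {"OTU_%d" % i for i in range(100)}
b = {"OTU_%d" % i for i in range(80, 180)}
print(otu_overlap(a, b))   # 20 shared / 180 total -> ~11.1%
```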

  11. An overview of quantitative approaches in Gestalt perception.

    PubMed

    Jäkel, Frank; Singh, Manish; Wichmann, Felix A; Herzog, Michael H

    2016-09-01

    Gestalt psychology is often criticized as lacking quantitative measurements and precise mathematical models. While this is true of the early Gestalt school, today there are many quantitative approaches in Gestalt perception and the special issue of Vision Research "Quantitative Approaches in Gestalt Perception" showcases the current state-of-the-art. In this article we give an overview of these current approaches. For example, ideal observer models are one of the standard quantitative tools in vision research and there is a clear trend to try and apply this tool to Gestalt perception and thereby integrate Gestalt perception into mainstream vision research. More generally, Bayesian models, long popular in other areas of vision research, are increasingly being employed to model perceptual grouping as well. Thus, although experimental and theoretical approaches to Gestalt perception remain quite diverse, we are hopeful that these quantitative trends will pave the way for a unified theory. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Advantages and limitations of quantitative PCR (Q-PCR)-based approaches in microbial ecology.

    PubMed

    Smith, Cindy J; Osborn, A Mark

    2009-01-01

    Quantitative PCR (Q-PCR or real-time PCR) approaches are now widely applied in microbial ecology to quantify the abundance and expression of taxonomic and functional gene markers within the environment. Q-PCR-based analyses combine 'traditional' end-point detection PCR with fluorescent detection technologies to record the accumulation of amplicons in 'real time' during each cycle of the PCR amplification. By detection of amplicons during the early exponential phase of the PCR, this enables the quantification of gene (or transcript) numbers when these are proportional to the starting template concentration. When Q-PCR is coupled with a preceding reverse transcription reaction, it can be used to quantify gene expression (RT-Q-PCR). This review firstly addresses the theoretical and practical implementation of Q-PCR and RT-Q-PCR protocols in microbial ecology, highlighting key experimental considerations. Secondly, we review the applications of (RT)-Q-PCR analyses in environmental microbiology and evaluate the contribution and advances gained from such approaches. Finally, we conclude by offering future perspectives on the application of (RT)-Q-PCR in furthering understanding in microbial ecology, in particular, when coupled with other molecular approaches and more traditional investigations of environmental systems.
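
    Absolute quantification by Q-PCR usually rests on a standard curve in which Cq is linear in log10 of the starting copy number, with amplification efficiency E = 10^(-1/slope) - 1. A minimal sketch with toy calibration data:

```python
# Sketch: Q-PCR standard curve, efficiency, and unknown-sample quantification.
import numpy as np

copies = np.array([1e3, 1e4, 1e5, 1e6, 1e7])
cq = np.array([30.1, 26.8, 23.4, 20.1, 16.7])   # toy calibration data

slope, intercept = np.polyfit(np.log10(copies), cq, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0          # 1.0 = perfect doubling

def quantify(sample_cq: float) -> float:
    """Starting template copies for an unknown sample."""
    return 10 ** ((sample_cq - intercept) / slope)

print(f"E = {efficiency:.2f}, unknown at Cq 22 ~ {quantify(22.0):.3g} copies")
```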

  13. A Quantitative Approach to Scar Analysis

    PubMed Central

    Khorasani, Hooman; Zheng, Zhong; Nguyen, Calvin; Zara, Janette; Zhang, Xinli; Wang, Joyce; Ting, Kang; Soo, Chia

    2011-01-01

    Analysis of collagen architecture is essential to wound healing research. However, to date no consistent methodologies exist for quantitatively assessing dermal collagen architecture in scars. In this study, we developed a standardized approach for quantitative analysis of scar collagen morphology by confocal microscopy using fractal dimension and lacunarity analysis. Full-thickness wounds were created on adult mice, closed by primary intention, and harvested at 14 days after wounding for morphometrics and standard Fourier transform-based scar analysis as well as fractal dimension and lacunarity analysis. In addition, transmission electron microscopy was used to evaluate collagen ultrastructure. We demonstrated that fractal dimension and lacunarity analysis were superior to Fourier transform analysis in discriminating scar versus unwounded tissue in a wild-type mouse model. To fully test the robustness of this scar analysis approach, a fibromodulin-null mouse model that heals with increased scar was also used. Fractal dimension and lacunarity analysis effectively discriminated unwounded fibromodulin-null versus wild-type skin as well as healing fibromodulin-null versus wild-type wounds, whereas Fourier transform analysis failed to do so. Furthermore, fractal dimension and lacunarity data also correlated well with transmission electron microscopy collagen ultrastructure analysis, adding to their validity. These results demonstrate that fractal dimension and lacunarity are more sensitive than Fourier transform analysis for quantification of scar morphology. PMID:21281794
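
    Box-counting fractal dimension and a simple lacunarity estimate can both be computed from a thresholded image. The sketch below uses non-overlapping boxes (the gliding-box variant uses overlapping ones) and illustrative box sizes; it is not the authors' exact protocol.

```python
# Sketch: box-counting fractal dimension and lacunarity of a binary image.
import numpy as np

def fractal_dimension(binary, sizes=(2, 4, 8, 16, 32)):
    counts = []
    for s in sizes:
        h, w = (binary.shape[0] // s) * s, (binary.shape[1] // s) * s
        blocks = binary[:h, :w].reshape(h // s, s, w // s, s)
        counts.append((blocks.sum(axis=(1, 3)) > 0).sum())  # occupied boxes
    # N(s) ~ s^(-D)  =>  D = -slope of log N vs. log s
    return -np.polyfit(np.log(sizes), np.log(counts), 1)[0]

def lacunarity(binary, box=8):
    h, w = (binary.shape[0] // box) * box, (binary.shape[1] // box) * box
    blocks = binary[:h, :w].reshape(h // box, box, w // box, box)
    mass = blocks.sum(axis=(1, 3)).ravel().astype(float)
    return mass.var() / mass.mean() ** 2 + 1.0   # <M^2> / <M>^2

rng = np.random.default_rng(4)
img = rng.random((256, 256)) > 0.7               # placeholder "collagen" mask
print(fractal_dimension(img), lacunarity(img))
```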

  14. A quantitative approach to evolution of music and philosophy

    NASA Astrophysics Data System (ADS)

    Vieira, Vilson; Fabbri, Renato; Travieso, Gonzalo; Oliveira, Osvaldo N., Jr.; da Fontoura Costa, Luciano

    2012-08-01

    The development of new statistical and computational methods is increasingly making it possible to bridge the gap between hard sciences and humanities. In this study, we propose an approach based on a quantitative evaluation of attributes of objects in fields of humanities, from which concepts such as dialectics and opposition are formally defined mathematically. As case studies, we analyzed the temporal evolution of classical music and philosophy by obtaining data for 8 features characterizing the corresponding fields for 7 well-known composers and philosophers, which were treated with multivariate statistics and pattern recognition methods. A bootstrap method was applied to avoid statistical bias caused by the small sample data set, with which hundreds of artificial composers and philosophers were generated, influenced by the 7 names originally chosen. Upon defining indices for opposition, skewness and counter-dialectics, we confirmed the intuitive analysis of historians in that classical music evolved according to a master-apprentice tradition, while in philosophy changes were driven by opposition. Though these case studies were meant only to show the possibility of treating phenomena in humanities quantitatively, including a quantitative measure of concepts such as dialectics and opposition, the results are encouraging for further application of the approach presented here to many other areas, since it is entirely generic.

  15. An analytical approach based on ESI-MS, LC-MS and PCA for the quali-quantitative analysis of cycloartane derivatives in Astragalus spp.

    PubMed

    Napolitano, Assunta; Akay, Seref; Mari, Angela; Bedir, Erdal; Pizza, Cosimo; Piacente, Sonia

    2013-11-01

    Astragalus species are widely used as health foods and dietary supplements, as well as drugs in traditional medicine. To rapidly evaluate metabolite similarities and differences among the EtOH extracts of the roots of eight commercial Astragalus spp., an approach based on direct analyses by ESI-MS, followed by PCA of the ESI-MS data, was carried out. Subsequently, quali-quantitative analyses of cycloartane derivatives in the eight Astragalus spp. by LC-ESI-MS(n) and PCA of the LC-ESI-MS data were performed. This approach allowed metabolite similarities and differences among the various Astragalus spp. to be promptly highlighted. The PCA results from the LC-ESI-MS data of the Astragalus samples were in reasonable agreement with both the PCA results of the ESI-MS data and the quantitative results. This study affords an analytical method for the quali-quantitative determination of cycloartane derivatives in herbal preparations used as health and food supplements. Copyright © 2013 Elsevier B.V. All rights reserved.

  16. Liquid crystal-based biosensor with backscattering interferometry: A quantitative approach.

    PubMed

    Khan, Mashooq; Park, Soo-Young

    2017-01-15

    We developed a new technology that uses backscattering interferometry (BSI) for quantitative measurement with nematic liquid crystal (NLC)-based biosensors, which usually rely on texture reading for on/off signals. The LC-based BSI comprised an octadecyltrichlorosilane (OTS)-coated square capillary filled with 4-cyano-4'-pentylbiphenyl (5CB, a nematic LC at room temperature). The LC/water interface in the capillary was functionalized with a coating of poly(acrylic acid-b-4-cyanobiphenyl-4'-oxyundecylacrylate) (PAA-b-LCP) and immobilized with the enzymes glucose oxidase (GOx) and horseradish peroxidase (HRP) through covalent linkage to the PAA chains (5CB(PAA-GOx:HRP)) for glucose detection. Laser irradiation of the LC near the LC/water interface resulted in backscattered fringes with high contrast. The change in the spatial position of the fringes (because of the change in the orientation of the LC caused by the GOx:HRP enzymatic reaction of glucose) altered the output voltage of the photodetector when its active area was aligned with the edge of one of the fringes. The change in intensity at the photodetector allowed a detection limit as low as 0.008 mM, with a linear range of 0.02-9 mM and a short response time (~60 s). This LC-based BSI technique allows for quantitative, sensitive, selective, reproducible, easily obtainable, and interference-free detection over a large linear dynamic range, and for practical applications with human serum. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. A template-based approach to semi-quantitative SPECT myocardial perfusion imaging: Independent of normal databases.

    PubMed

    Hughes, Tyler; Shcherbinin, Sergey; Celler, Anna

    2011-07-01

    Normal patient databases (NPDs) are used to distinguish between normal and abnormal perfusion in SPECT myocardial perfusion imaging (MPI) and have gained wide acceptance in the clinical environment, yet there are limitations to this approach. This study introduces a template-based method for semi-quantitative MPI that attempts to overcome some of the NPD limitations. Our approach involves the construction of a 3D digital healthy-heart template from the delineation of the patient's left ventricle in the SPECT image. This patient-specific template of the heart, filled with uniform activity, is then analytically projected and reconstructed using the same algorithm as the original image. Subsequent to generating bull's-eye maps for the patient image (PB) and the template image (TB), a ratio (PB/TB) is calculated, which produces a reconstruction-artifact-corrected image (CB). Finally, a threshold is used to define defects within CB, enabling measurement of the perfusion defect extent (EXT). The SPECT-based template (TS) measurements were compared to those of a CT-based "ideal" template (TI). Twenty digital phantoms were simulated: male and female, each with one healthy heart and nine hearts with various defects. Four physical phantom studies were performed, modeling a healthy heart and three hearts with different defects. The phantom represented a thorax with spine, lung, and left ventricle inserts. Images were acquired on General Electric's (GE) Infinia Hawkeye SPECT/CT camera using a standard clinical MPI protocol. Finally, our method was applied to 14 patient MPI rest/stress studies acquired on the GE Infinia Hawkeye SPECT/CT camera and compared to the results obtained from Cedars-Sinai's QPS software. In the simulation studies, the true EXT correlated well with the TI (slope = 1.08; offset = -0.40%; r = 0.99) and TS (slope = 0.90; offset = 0.27%; r = 0.99) methods, with no significant differences between them. Similarly, strong correlations were measured for EXT
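
    The ratio-and-threshold step is straightforward to express numerically. In the sketch below, PB and TB stand for the patient and template bull's-eye maps; the normalization and the 70% threshold are assumptions for illustration.

```python
# Sketch: artifact-corrected map CB = PB / TB and defect extent (EXT).
import numpy as np

def defect_extent(pb, tb, threshold=0.7):
    cb = pb / np.maximum(tb, 1e-9)       # reconstruction-artifact correction
    cb = cb / cb.max()                   # normalize to peak uptake
    return 100.0 * (cb < threshold).mean()   # EXT as % of the map

rng = np.random.default_rng(5)
pb = rng.uniform(0.6, 1.0, size=(64, 64))
pb[:16, :16] *= 0.4                      # synthetic perfusion defect
tb = rng.uniform(0.9, 1.1, size=(64, 64))
print(f"EXT = {defect_extent(pb, tb):.1f}%")
```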

  18. A framework for organizing and selecting quantitative approaches for benefit-harm assessment.

    PubMed

    Puhan, Milo A; Singh, Sonal; Weiss, Carlos O; Varadhan, Ravi; Boyd, Cynthia M

    2012-11-19

    Several quantitative approaches for benefit-harm assessment of health care interventions exist but it is unclear how the approaches differ. Our aim was to review existing quantitative approaches for benefit-harm assessment and to develop an organizing framework that clarifies differences and aids selection of quantitative approaches for a particular benefit-harm assessment. We performed a review of the literature to identify quantitative approaches for benefit-harm assessment. Our team, consisting of clinicians, epidemiologists, and statisticians, discussed the approaches and identified their key characteristics. We developed a framework that helps investigators select quantitative approaches for benefit-harm assessment that are appropriate for a particular decisionmaking context. Our framework for selecting quantitative approaches requires a concise definition of the treatment comparison and population of interest, identification of key benefit and harm outcomes, and determination of the need for a measure that puts all outcomes on a single scale (which we call a benefit and harm comparison metric). We identified 16 quantitative approaches for benefit-harm assessment. These approaches can be categorized into those that consider single or multiple key benefit and harm outcomes, and those that use a benefit-harm comparison metric or not. Most approaches use aggregate data and can be used in the context of single studies or systematic reviews. Although the majority of approaches provides a benefit and harm comparison metric, only four approaches provide measures of uncertainty around the benefit and harm comparison metric (such as a 95 percent confidence interval). None of the approaches considers the actual joint distribution of benefit and harm outcomes, but one approach considers competing risks when calculating profile-specific event rates. Nine approaches explicitly allow incorporating patient preferences. The choice of quantitative approaches depends on the

  19. A framework for organizing and selecting quantitative approaches for benefit-harm assessment

    PubMed Central

    2012-01-01

    Background: Several quantitative approaches for benefit-harm assessment of health care interventions exist but it is unclear how the approaches differ. Our aim was to review existing quantitative approaches for benefit-harm assessment and to develop an organizing framework that clarifies differences and aids selection of quantitative approaches for a particular benefit-harm assessment. Methods: We performed a review of the literature to identify quantitative approaches for benefit-harm assessment. Our team, consisting of clinicians, epidemiologists, and statisticians, discussed the approaches and identified their key characteristics. We developed a framework that helps investigators select quantitative approaches for benefit-harm assessment that are appropriate for a particular decisionmaking context. Results: Our framework for selecting quantitative approaches requires a concise definition of the treatment comparison and population of interest, identification of key benefit and harm outcomes, and determination of the need for a measure that puts all outcomes on a single scale (which we call a benefit and harm comparison metric). We identified 16 quantitative approaches for benefit-harm assessment. These approaches can be categorized into those that consider single or multiple key benefit and harm outcomes, and those that use a benefit-harm comparison metric or not. Most approaches use aggregate data and can be used in the context of single studies or systematic reviews. Although the majority of approaches provides a benefit and harm comparison metric, only four approaches provide measures of uncertainty around the benefit and harm comparison metric (such as a 95 percent confidence interval). None of the approaches considers the actual joint distribution of benefit and harm outcomes, but one approach considers competing risks when calculating profile-specific event rates. Nine approaches explicitly allow incorporating patient preferences. Conclusion: The choice of

  20. Less label, more free: approaches in label-free quantitative mass spectrometry.

    PubMed

    Neilson, Karlie A; Ali, Naveid A; Muralidharan, Sridevi; Mirzaei, Mehdi; Mariani, Michael; Assadourian, Gariné; Lee, Albert; van Sluyter, Steven C; Haynes, Paul A

    2011-02-01

    In this review we examine techniques, software, and statistical analyses used in label-free quantitative proteomics studies for area under the curve and spectral counting approaches. Recent advances in the field are discussed in an order that reflects a logical workflow design. Examples of studies that follow this design are presented to highlight the requirement for statistical assessment and further experiments to validate results from label-free quantitation. Limitations of label-free approaches are considered, label-free approaches are compared with labelling techniques, and forward-looking applications for label-free quantitative data are presented. We conclude that label-free quantitative proteomics is a reliable, versatile, and cost-effective alternative to labelled quantitation. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. [Reconstituting evaluation methods based on both qualitative and quantitative paradigms].

    PubMed

    Miyata, Hiroaki; Okubo, Suguru; Yoshie, Satoru; Kai, Ichiro

    2011-01-01

    Debate about the relationship between quantitative and qualitative paradigms is often muddled and confusing, and the clutter of terms and arguments has resulted in the concepts becoming obscure and unrecognizable. In this study we conducted a content analysis of evaluation methods in qualitative healthcare research. We extracted descriptions of four types of evaluation paradigm (validity/credibility, reliability/dependability, objectivity/confirmability, and generalizability/transferability), and classified them into subcategories. In quantitative research, there have been many evaluation methods based on qualitative paradigms, and vice versa. Thus, it may not be useful to treat evaluation methods of the qualitative paradigm as isolated from those of quantitative methods. Choosing practical evaluation methods based on the situation and prior conditions of each study is an important approach for researchers.

  2. Distance-based microfluidic quantitative detection methods for point-of-care testing.

    PubMed

    Tian, Tian; Li, Jiuxing; Song, Yanling; Zhou, Leiji; Zhu, Zhi; Yang, Chaoyong James

    2016-04-07

    Equipment-free devices with quantitative readout are of great significance to point-of-care testing (POCT), which provides real-time readout to users and is especially important in low-resource settings. Among various equipment-free approaches, distance-based visual quantitative detection methods rely on reading the visual signal length for corresponding target concentrations, thus eliminating the need for sophisticated instruments. The distance-based methods are low-cost, user-friendly and can be integrated into portable analytical devices. Moreover, such methods enable quantitative detection of various targets by the naked eye. In this review, we first introduce the concept and history of distance-based visual quantitative detection methods. Then, we summarize the main methods for translation of molecular signals to distance-based readout and discuss different microfluidic platforms (glass, PDMS, paper and thread) in terms of applications in biomedical diagnostics, food safety monitoring, and environmental analysis. Finally, the potential and future perspectives are discussed.

  3. A Systematic Approach for Quantitative Analysis of Multidisciplinary Design Optimization Framework

    NASA Astrophysics Data System (ADS)

    Kim, Sangho; Park, Jungkeun; Lee, Jeong-Oog; Lee, Jae-Woo

    An efficient Multidisciplinary Design and Optimization (MDO) framework for an aerospace engineering system should use and integrate distributed resources such as various analysis codes, optimization codes, Computer-Aided Design (CAD) tools, and Data Base Management Systems (DBMS) in a heterogeneous environment, and needs to provide user-friendly graphical user interfaces. In this paper, we propose a systematic approach for determining a reference MDO framework and for evaluating MDO frameworks. The proposed approach incorporates two well-known methods, the Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD), in order to provide a quantitative analysis of the qualitative criteria of MDO frameworks. Identification and hierarchy of the framework requirements and the corresponding solutions for the reference MDO frameworks, a general one and an aircraft-oriented one, were carefully investigated. The reference frameworks were also quantitatively identified using AHP and QFD. An assessment of three in-house frameworks was then performed. The results produced clear and useful guidelines for improvement of the in-house MDO frameworks and showed the feasibility of the proposed approach for evaluating an MDO framework without human interference.
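
    The AHP step can be illustrated with the classical eigenvector computation: criterion weights are the normalized principal eigenvector of a pairwise-comparison matrix. The comparison values below are illustrative, and the QFD stage is not shown.

```python
# Sketch: AHP priority weights and Saaty's consistency index.
import numpy as np

# A[i, j] = how much more important criterion i is than j (Saaty scale)
A = np.array([[1.0, 3.0, 5.0],
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                       # normalized priority weights

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)   # CI = (lambda_max - n) / (n - 1)
print(w, ci)
```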

  4. Quantitative methods to direct exploration based on hydrogeologic information

    USGS Publications Warehouse

    Graettinger, A.J.; Lee, J.; Reeves, H.W.; Dethan, D.

    2006-01-01

    Quantitatively Directed Exploration (QDE) approaches based on information such as model sensitivity, input data covariance and model output covariance are presented. Seven approaches for directing exploration are developed, applied, and evaluated on a synthetic hydrogeologic site. The QDE approaches evaluate input information uncertainty, subsurface model sensitivity and, most importantly, output covariance to identify the next location to sample. Spatial input parameter values and covariances are calculated with the multivariate conditional probability calculation from a limited number of samples. A variogram structure is used during data extrapolation to describe the spatial continuity, or correlation, of subsurface information. Model sensitivity can be determined by perturbing input data and evaluating output response or, as in this work, sensitivities can be programmed directly into an analysis model. Output covariance is calculated by the First-Order Second Moment (FOSM) method, which combines the covariance of input information with model sensitivity. A groundwater flow example, modeled in MODFLOW-2000, is chosen to demonstrate the seven QDE approaches. MODFLOW-2000 is used to obtain the piezometric head and the model sensitivity simultaneously. The seven QDE approaches are evaluated based on the accuracy of the modeled piezometric head after information from a QDE sample is added. For the synthetic site used in this study, the QDE approach that identifies the location of hydraulic conductivity that contributes the most to the overall piezometric head variance proved to be the best method to quantitatively direct exploration. © IWA Publishing 2006.
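
    The FOSM step combines input covariance with model sensitivity in a single matrix product, Cov_y = J Cov_x J^T. A minimal numeric sketch with toy matrices:

```python
# Sketch: First-Order Second Moment (FOSM) output-covariance propagation.
import numpy as np

J = np.array([[0.8, 0.1],        # d(head_i)/d(parameter_j): model
              [0.3, 0.5],        # sensitivities (toy values)
              [0.1, 0.9]])
cov_x = np.array([[0.04, 0.01],  # input-parameter covariance (toy values)
                  [0.01, 0.09]])

cov_y = J @ cov_x @ J.T          # FOSM output covariance
head_variance = np.diag(cov_y)   # per-location piezometric-head variance
print(head_variance)             # sample next where the variance is largest
```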

  5. Conventional liquid chromatography/triple quadrupole mass spectrometer-based metabolite identification and semi-quantitative estimation approach in the investigation of dabigatran etexilate in vitro metabolism

    PubMed Central

    Hu, Zhe-Yi; Parker, Robert B.; Herring, Vanessa L.; Laizure, S. Casey

    2012-01-01

    Dabigatran etexilate (DABE) is an oral prodrug that is rapidly converted by esterases to dabigatran (DAB), a direct inhibitor of thrombin. To elucidate the esterase-mediated metabolic pathway of DABE, a high-performance liquid chromatography/tandem mass spectrometry (LC-MS/MS)-based metabolite identification and semi-quantitative estimation approach was developed. To overcome the poor full-scan sensitivity of conventional triple quadrupole mass spectrometry, precursor-product ion pairs were predicted to search for potential in vitro metabolites. The detected metabolites were confirmed by product ion scans. A dilution method was introduced to evaluate the matrix effects of tentatively identified metabolites without chemical standards. Quantitative information on detected metabolites was obtained using 'metabolite standards' generated from incubation samples that contain a high concentration of metabolite, in combination with a correction factor for mass spectrometry response. Two in vitro metabolites of DABE (M1 and M2) were identified and quantified by the semi-quantitative estimation approach. It is noteworthy that CES1 converts DABE to M1, while CES2 mediates the conversion of DABE to M2. M1 (or M2) was further metabolized to DAB by CES2 (or CES1). The approach presented here provides a solution to a bioanalytical need for fast identification and semi-quantitative estimation of CES metabolites in preclinical samples. PMID:23239178

  6. Towards quantitative mass spectrometry-based metabolomics in microbial and mammalian systems.

    PubMed

    Kapoore, Rahul Vijay; Vaidyanathan, Seetharaman

    2016-10-28

    Metabolome analyses are a suite of analytical approaches that enable us to capture changes in the metabolome (small molecular weight components, typically less than 1500 Da) in biological systems. Mass spectrometry (MS) has been widely used for this purpose. The key challenge is to capture changes in a reproducible and reliable manner that is representative of the events that take place in vivo. Typically, the analysis is carried out in vitro, by isolating the system and extracting the metabolome. MS-based approaches enable us to capture metabolomic changes with high sensitivity and resolution. When developing the technique for different biological systems, there are similarities in the challenges, as well as differences that are specific to the system under investigation. Here, we review some of the challenges in capturing quantitative changes in the metabolome with MS-based approaches, primarily in microbial and mammalian systems. This article is part of the themed issue 'Quantitative mass spectrometry'. © 2016 The Author(s).

  7. The Functional Resonance Analysis Method for a systemic risk based environmental auditing in a sinter plant: A semi-quantitative approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patriarca, Riccardo, E-mail: riccardo.patriarca@uniroma1.it; Di Gravio, Giulio; Costantino, Francesco

    Environmental auditing is a main issue for any production plant, and assessing environmental performance is crucial to identifying risk factors. The complexity of current plants arises from interactions among technological, human and organizational system components, which are often transient and not easily detectable. Auditing thus requires a systemic perspective, rather than a focus on individual behaviors, as has emerged in recent research in the safety domain for socio-technical systems. We explore the significance of modeling the interactions of system components in everyday work by applying a recent systemic method, the Functional Resonance Analysis Method (FRAM), in order to define the system structure dynamically. We also present an innovative evolution of traditional FRAM following a semi-quantitative approach based on Monte Carlo simulation. This paper represents the first contribution related to the application of FRAM in the environmental context, moreover considering a consistent evolution based on Monte Carlo simulation. The case study of an environmental risk audit in a sinter plant validates the research, showing the benefits in terms of identifying potential critical activities, related mitigating actions and comprehensive environmental monitoring indicators. - Highlights: • We discuss the relevance of a systemic risk-based environmental audit. • We present FRAM to represent functional interactions of the system. • We develop a semi-quantitative FRAM framework to assess environmental risks. • We apply the semi-quantitative FRAM framework to build a model for a sinter plant.

  8. MR Imaging-based Semi-quantitative Methods for Knee Osteoarthritis

    PubMed Central

    JARRAYA, Mohamed; HAYASHI, Daichi; ROEMER, Frank Wolfgang; GUERMAZI, Ali

    2016-01-01

    Magnetic resonance imaging (MRI)-based semi-quantitative (SQ) methods applied to knee osteoarthritis (OA) have been introduced during the last decade and have since fundamentally changed our understanding of knee OA pathology. Several epidemiological studies and clinical trials have used MRI-based SQ methods to evaluate different outcome measures. Interest in MRI-based SQ scoring systems has led to their continuous update and refinement. This article reviews the different SQ approaches for MRI-based whole-organ assessment of knee OA and also discusses practical aspects of whole-joint assessment. PMID:26632537

  9. Sexual Harassment Prevention Initiatives: Quantitative and Qualitative Approaches

    DTIC Science & Technology

    2010-10-28

    design, and the time series with nonequivalent control group design. The experimental research approach will randomly assign participants... (Leedy & Ormrod, 2005). According to Fife-Schaw (2006) there are three quasi-experimental designs: the nonequivalent control group design, the time... that have controlled and isolated variables. A specific quantitative approach available to the researcher is the use of surveys. Surveys, in

  10. Quantitative approaches in climate change ecology

    PubMed Central

    Brown, Christopher J; Schoeman, David S; Sydeman, William J; Brander, Keith; Buckley, Lauren B; Burrows, Michael; Duarte, Carlos M; Moore, Pippa J; Pandolfi, John M; Poloczanska, Elvira; Venables, William; Richardson, Anthony J

    2011-01-01

    Contemporary impacts of anthropogenic climate change on ecosystems are increasingly being recognized. Documenting the extent of these impacts requires quantitative tools for analyses of ecological observations to distinguish climate impacts in noisy data and to understand interactions between climate variability and other drivers of change. To assist the development of reliable statistical approaches, we review the marine climate change literature and provide suggestions for quantitative approaches in climate change ecology. We compiled 267 peer-reviewed articles that examined relationships between climate change and marine ecological variables. Of the articles with time series data (n = 186), 75% used statistics to test for a dependency of ecological variables on climate variables. We identified several common weaknesses in statistical approaches, including marginalizing other important non-climate drivers of change, ignoring temporal and spatial autocorrelation, averaging across spatial patterns and not reporting key metrics. We provide a list of issues that need to be addressed to make inferences more defensible, including the consideration of (i) data limitations and the comparability of data sets; (ii) alternative mechanisms for change; (iii) appropriate response variables; (iv) a suitable model for the process under study; (v) temporal autocorrelation; (vi) spatial autocorrelation and patterns; and (vii) the reporting of rates of change. While the focus of our review was marine studies, these suggestions are equally applicable to terrestrial studies. Consideration of these suggestions will help advance global knowledge of climate impacts and understanding of the processes driving ecological change.

  11. Tau-U: A Quantitative Approach for Analysis of Single-Case Experimental Data in Aphasia.

    PubMed

    Lee, Jaime B; Cherney, Leora R

    2018-03-01

    Tau-U is a quantitative approach for analyzing single-case experimental design (SCED) data. It combines nonoverlap between phases with intervention phase trend and can correct for a baseline trend (Parker, Vannest, & Davis, 2011). We demonstrate the utility of Tau-U by comparing it with the standardized mean difference approach (Busk & Serlin, 1992) that is widely reported within the aphasia SCED literature. Repeated writing measures from 3 participants with chronic aphasia who received computer-based writing treatment are analyzed visually and quantitatively using both Tau-U and the standardized mean difference approach. Visual analysis alone was insufficient for determining an effect between the intervention and writing improvement. The standardized mean difference yielded effect sizes ranging from 4.18 to 26.72 for trained items and 1.25 to 3.20 for untrained items. Tau-U yielded significant (p < .05) effect sizes for 2 of 3 participants for trained probes and 1 of 3 participants for untrained probes. A baseline trend correction was applied to data from 2 of 3 participants. Tau-U has the unique advantage of allowing for the correction of an undesirable baseline trend. Although further study is needed, Tau-U shows promise as a quantitative approach to augment visual analysis of SCED data in aphasia.
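
    Tau-U's core computation is a pairwise comparison count. The sketch below implements one common formulation (A versus B nonoverlap minus baseline trend, divided by nA x nB, following Parker et al., 2011); published variants differ in the denominator used after correction, so treat this as illustrative.

```python
# Sketch: Tau (A vs. B nonoverlap) with Tau-U's baseline-trend correction.
from itertools import combinations

def sgn(x):
    return (x > 0) - (x < 0)

def tau_u(baseline, intervention, correct_baseline=True):
    s_ab = sum(sgn(b - a) for a in baseline for b in intervention)
    s_aa = (sum(sgn(b - a) for a, b in combinations(baseline, 2))
            if correct_baseline else 0)       # baseline-trend pairs
    return (s_ab - s_aa) / (len(baseline) * len(intervention))

baseline = [2, 3, 3, 4]            # toy writing-probe scores
intervention = [5, 6, 7, 7, 8]
print(tau_u(baseline, intervention))   # (20 - 5) / 20 = 0.75
```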

  12. A Quantitative Approach to the Formal Verification of Real-Time Systems.

    DTIC Science & Technology

    1996-09-01

    Computer Science. A Quantitative Approach to the Formal Verification of Real-Time Systems. Sergio Vale Aguiar Campos, September 1996, CMU-CS-96-199... A Quantitative Approach to the Formal Verification of Real-Time Systems, Sergio Vale Aguiar Campos... implied, of NSF, the Semiconductor Research Corporation, ARPA or the U.S. government. Keywords: real-time systems, formal verification, symbolic

  13. Enhancing quantitative approaches for assessing community resilience

    USGS Publications Warehouse

    Chuang, W. C.; Garmestani, A.S.; Eason, T. N.; Spanbauer, T. L.; Fried-Peterson, H. B.; Roberts, C.P.; Sundstrom, Shana M.; Burnett, J.L.; Angeler, David G.; Chaffin, Brian C.; Gunderson, L.; Twidwell, Dirac; Allen, Craig R.

    2018-01-01

    Scholars from many different intellectual disciplines have attempted to measure, estimate, or quantify resilience. However, there is growing concern that lack of clarity on the operationalization of the concept will limit its application. In this paper, we discuss the theory, research development and quantitative approaches in ecological and community resilience. Upon noting the lack of methods that quantify the complexities of the linked human and natural aspects of community resilience, we identify several promising approaches within the ecological resilience tradition that may be useful in filling these gaps. Further, we discuss the challenges for consolidating these approaches into a more integrated perspective for managing social-ecological systems.

  14. Enhancing quantitative approaches for assessing community resilience.

    PubMed

    Chuang, W C; Garmestani, A; Eason, T N; Spanbauer, T L; Fried-Petersen, H B; Roberts, C P; Sundstrom, S M; Burnett, J L; Angeler, D G; Chaffin, B C; Gunderson, L; Twidwell, D; Allen, C R

    2018-05-01

    Scholars from many different intellectual disciplines have attempted to measure, estimate, or quantify resilience. However, there is growing concern that lack of clarity on the operationalization of the concept will limit its application. In this paper, we discuss the theory, research development and quantitative approaches in ecological and community resilience. Upon noting the lack of methods that quantify the complexities of the linked human and natural aspects of community resilience, we identify several promising approaches within the ecological resilience tradition that may be useful in filling these gaps. Further, we discuss the challenges for consolidating these approaches into a more integrated perspective for managing social-ecological systems. Published by Elsevier Ltd.

  15. A novel approach for quantitative harmonization in PET.

    PubMed

    Namías, M; Bradshaw, T; Menezes, V O; Machado, M A D; Jeraj, R

    2018-05-04

    Positron emission tomography (PET) imaging allows for measurement of activity concentrations of a given radiotracer in vivo. The quantitative capabilities of PET imaging are particularly important in the context of monitoring response to treatment, where quantitative changes in tracer uptake could be used as a biomarker of treatment response. Reconstruction algorithms and settings have a significant impact on PET quantification. In this work we introduce a novel harmonization methodology requiring only a simple cylindrical phantom and show that it can match the performance of more complex harmonization approaches based on phantoms with spherical inserts. Resolution and noise measurements from cylindrical phantoms are used to simulate the spherical inserts of NEMA image quality phantoms. An optimization algorithm was used to find the smoothing filters for the simulated NEMA phantom images that best harmonized the PET scanners. Our methodology was tested on seven different PET models from two manufacturers installed at five institutions. Our methodology is able to predict contrast recovery coefficients (CRCs) from NEMA phantoms with errors within ±5.2% for CRCmax and ±3.7% for CRCmean (limits of agreement = 95%). After applying the proposed harmonization protocol, all the CRC values were within the tolerances from EANM. Quantitative harmonization in compliance with the EARL FDG-PET/CT accreditation program is thus achieved in a simpler way, without the need for NEMA phantoms. This may lead to simplified scanner harmonization workflows that are more accessible to smaller institutions.
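
    The paper's harmonization optimizes filters against simulated NEMA spheres, which is not reproduced here. A much simpler related principle, shown below as a hedged stand-in, is that Gaussian blurs add in quadrature, so a scanner can be brought to a coarser target resolution with a post-filter of width sqrt(target² - scanner²).

```python
# Sketch: quadrature rule for a harmonizing Gaussian post-filter.
# This is a simplified stand-in, not the paper's optimization workflow.
import math

def harmonizing_fwhm(fwhm_scanner_mm: float, fwhm_target_mm: float) -> float:
    if fwhm_target_mm <= fwhm_scanner_mm:
        raise ValueError("scanner already at or beyond target resolution")
    return math.sqrt(fwhm_target_mm**2 - fwhm_scanner_mm**2)

print(harmonizing_fwhm(5.0, 7.0))   # ~4.9 mm filter FWHM
```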

  16. Standardized Approach to Quantitatively Measure Residual Limb Skin Health in Individuals with Lower Limb Amputation.

    PubMed

    Rink, Cameron L; Wernke, Matthew M; Powell, Heather M; Tornero, Mark; Gnyawali, Surya C; Schroeder, Ryan M; Kim, Jayne Y; Denune, Jeffrey A; Albury, Alexander W; Gordillo, Gayle M; Colvin, James M; Sen, Chandan K

    2017-07-01

    Objective: (1) Develop a standardized approach to quantitatively measure residual limb skin health. (2) Report reference residual limb skin health values in people with transtibial and transfemoral amputation. Approach: Residual limb health outcomes in individuals with transtibial (n = 5) and transfemoral (n = 5) amputation were compared to able-limb controls (n = 4) using noninvasive imaging (hyperspectral imaging and laser speckle flowmetry) and probe-based approaches (laser Doppler flowmetry, transcutaneous oxygen, transepidermal water loss, surface electrical capacitance). Results: A standardized methodology that employs noninvasive imaging and probe-based approaches to measure residual limb skin health is described. Compared to able-limb controls, individuals with transtibial and transfemoral amputation have significantly lower transcutaneous oxygen tension, higher transepidermal water loss, and higher surface electrical capacitance in the residual limb. Innovation: Residual limb health, as a critical component of prosthesis rehabilitation for individuals with lower limb amputation, is understudied, in part due to a lack of clinical measures. Here, we present a standardized approach to measure residual limb health in people with transtibial and transfemoral amputation. Conclusion: Technology advances in noninvasive imaging and probe-based measures are leveraged to develop a standardized approach to quantitatively measure residual limb health in individuals with lower limb loss. Compared to able-limb controls, resting residual limb physiology in people who have had transfemoral or transtibial amputation is characterized by lower transcutaneous oxygen tension and poorer skin barrier function.

  17. Using the Blended Learning Approach in a Quantitative Literacy Course

    ERIC Educational Resources Information Center

    Botts, Ryan T.; Carter, Lori; Crockett, Catherine

    2018-01-01

    The efforts to improve the quantitative reasoning (quantitative literacy) skills of college students in the United States have been gaining momentum in recent years. At the same time, the blended learning approach to course delivery has gained in popularity, promising better learning with flexible modalities and pace. This paper presents the…

  18. MilQuant: a free, generic software tool for isobaric tagging-based quantitation.

    PubMed

    Zou, Xiao; Zhao, Minzhi; Shen, Hongyan; Zhao, Xuyang; Tong, Yuanpeng; Wang, Qingsong; Wei, Shicheng; Ji, Jianguo

    2012-09-18

    Isobaric tagging techniques such as iTRAQ and TMT are widely used in quantitative proteomics and are especially useful for samples that demand in vitro labeling. Due to the diversity of MS acquisition approaches, identification algorithms, and relative abundance deduction strategies, researchers are faced with a plethora of possibilities when it comes to data analysis. However, the lack of a generic and flexible software tool often makes it cumbersome for researchers to perform the analysis entirely as desired. In this paper, we present MilQuant (mzXML-based isobaric labeling quantitator), a pipeline of freely available programs that supports native acquisition files produced by all mass spectrometer types and collection approaches currently used in isobaric tagging-based MS data collection. Moreover, aside from effective normalization and abundance-ratio deduction algorithms, MilQuant exports various intermediate results along each step of the pipeline, making it easy for researchers to customize the analysis. The functionality of MilQuant was demonstrated by four distinct datasets from different laboratories. The compatibility and extendibility of MilQuant make it a generic and flexible tool that can serve as a full solution to data analysis of isobaric tagging-based quantitation. Copyright © 2012 Elsevier B.V. All rights reserved.

  19. Toward a quantitative approach to migrants integration

    NASA Astrophysics Data System (ADS)

    Barra, A.; Contucci, P.

    2010-03-01

    Migration phenomena and all the related issues, like the integration of different social groups, are intrinsically complex problems, since they strongly depend on several competing mechanisms such as economic factors, cultural differences and many others. By identifying a few essential assumptions, and using the statistical mechanics of complex systems, we propose a novel quantitative approach that provides a minimal theory for those phenomena. We show that the competitive interactions in decision making between a population of N host citizens and P immigrants, a bi-partite spin-glass, give rise to a social consciousness inside the host community in the sense of the associative memory of neural networks. The theory leads to a natural quantitative definition of migrants' "integration" inside the community. From the technical point of view, this minimal picture assumes, as control parameters, only general notions like the strength of the random interactions, the ratio between the sizes of the two parties and the cultural influence. Steps toward more refined models are discussed at the end, including the kinds of experiences felt, structure on the random interaction topology (such as dilution, to move beyond the plain mean-field approach) and correlations between the experiences felt by the two parties (biasing the distribution of the couplings); there we also show the robustness of our approach.
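
    The abstract does not spell the model out; under its stated assumptions, one minimal way to write a bipartite spin-glass between host spins and immigrant spins (this normalization is illustrative, not necessarily the authors' exact choice) is

    \[
    H_{N,P}(\sigma,\tau) \;=\; -\frac{1}{\sqrt{NP}} \sum_{i=1}^{N} \sum_{\mu=1}^{P} J_{i\mu}\,\sigma_i\,\tau_\mu ,
    \qquad \sigma_i,\tau_\mu \in \{-1,+1\},
    \]

    with random couplings \(J_{i\mu}\) and the size ratio \(\alpha = P/N\) among the control parameters. Marginalizing over the \(\tau\) variables yields an effective Hopfield-like model for the host community, which is one way to read the "associative memory" said to emerge in the host population.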

  20. IWGT report on quantitative approaches to genotoxicity risk ...

    EPA Pesticide Factsheets

    This is the second of two reports from the International Workshops on Genotoxicity Testing (IWGT) Working Group on Quantitative Approaches to Genetic Toxicology Risk Assessment (the QWG). The first report summarized the discussions and recommendations of the QWG related to the need for quantitative dose–response analysis of genetic toxicology data, the existence and appropriate evaluation of threshold responses, and methods to analyze exposure-response relationships and derive points of departure (PoDs) from which acceptable exposure levels could be determined. This report summarizes the QWG discussions and recommendations regarding appropriate approaches to evaluate exposure-related risks of genotoxic damage, including extrapolation below identified PoDs and across test systems and species. Recommendations include the selection of appropriate genetic endpoints and target tissues, uncertainty factors and extrapolation methods to be considered, the importance and use of information on mode of action, toxicokinetics, metabolism, and exposure biomarkers when using quantitative exposure-response data to determine acceptable exposure levels in human populations or to assess the risk associated with known or anticipated exposures. The empirical relationship between genetic damage (mutation and chromosomal aberration) and cancer in animal models was also examined. It was concluded that there is a general correlation between cancer induction and mutagenic and/or clast

  1. An effective approach to quantitative analysis of ternary amino acids in foxtail millet substrate based on terahertz spectroscopy.

    PubMed

    Lu, Shao Hua; Li, Bao Qiong; Zhai, Hong Lin; Zhang, Xin; Zhang, Zhuo Yong

    2018-04-25

    Terahertz time-domain spectroscopy has been applied in many fields; however, it still encounters difficulties in the analysis of multicomponent mixtures due to serious spectral overlap. Here, an effective approach to quantitative analysis is proposed and applied to the determination of ternary amino acids in a foxtail millet substrate. Utilizing three parameters derived from the THz-TDS spectra, images were constructed, and Tchebichef image moments were used to extract the information of the target components. Quantitative models were then obtained by stepwise regression. The leave-one-out cross-validation correlation coefficients (R(2)loo-cv) were above 0.9595. For the external test set, the predictive correlation coefficients (R(2)p) were above 0.8026 and the root mean square errors of prediction (RMSEp) were below 1.2601. Compared with the traditional methods (PLS and N-PLS), our approach is more accurate, robust and reliable, and is a promising approach for quantifying multiple components with THz-TDS spectroscopy. Copyright © 2017 Elsevier Ltd. All rights reserved.
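
    For illustration, a compact way to compute discrete Tchebichef-style moments of an image in Python (a sketch, not the authors' implementation): with a uniform weight on the pixel grid, the discrete Tchebichef polynomials are the orthonormal polynomials of that grid, so QR-orthonormalizing the monomial basis reproduces them (up to sign) without the closed-form recurrence.

        import numpy as np

        def discrete_orthonormal_basis(n_points, order):
            # QR-orthonormalizing 1, x, x^2, ... on the uniform pixel grid
            # yields (up to sign) the discrete Tchebichef polynomials, the
            # orthogonal polynomials for a uniform discrete weight.
            x = np.arange(n_points, dtype=float)
            V = np.vander(x, order + 1, increasing=True)
            Q, _ = np.linalg.qr(V)
            return Q                      # shape (n_points, order + 1)

        def tchebichef_moments(image, order):
            # Moment matrix M[p, q] = sum over pixels of t_p(y) t_q(x) I(y, x).
            Ty = discrete_orthonormal_basis(image.shape[0], order)
            Tx = discrete_orthonormal_basis(image.shape[1], order)
            return Ty.T @ image @ Tx

        # Toy stand-in for an image built from THz-derived parameters.
        img = np.random.default_rng(0).random((64, 64))
        M = tchebichef_moments(img, order=4)
        print(M.shape)                    # (5, 5) moment features for regression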

  2. A quantitative microscopic approach to predict local recurrence based on in vivo intraoperative imaging of sarcoma tumor margins

    PubMed Central

    Mueller, Jenna L.; Fu, Henry L.; Mito, Jeffrey K.; Whitley, Melodi J.; Chitalia, Rhea; Erkanli, Alaattin; Dodd, Leslie; Cardona, Diana M.; Geradts, Joseph; Willett, Rebecca M.; Kirsch, David G.; Ramanujam, Nimmi

    2015-01-01

    The goal of resection of soft tissue sarcomas located in the extremity is to preserve limb function while completely excising the tumor with a margin of normal tissue. With surgery alone, one-third of patients with soft tissue sarcoma of the extremity will have local recurrence due to microscopic residual disease in the tumor bed. Currently, a limited number of intraoperative pathology-based techniques are used to assess margin status; however, few have been widely adopted due to sampling error and time constraints. To aid in intraoperative diagnosis, we developed a quantitative optical microscopy toolbox, which includes acriflavine staining, fluorescence microscopy, and analytic techniques called sparse component analysis and circle transform, to yield quantitative diagnosis of tumor margins. A series of variables were quantified from images of resected primary sarcomas and used to optimize a multivariate model. The sensitivity and specificity for differentiating positive from negative ex vivo resected tumor margins were 82% and 75%. The utility of this approach was tested by imaging the in vivo tumor cavities of 34 mice after resection of a sarcoma, with local recurrence as the benchmark. When applied prospectively to images from the tumor cavity, the sensitivity and specificity for predicting local recurrence were 78% and 82%. For comparison, if pathology were used to predict local recurrence in this data set, it would achieve a sensitivity of 29% and a specificity of 71%. These results indicate a robust approach for detecting microscopic residual disease, which is an effective predictor of local recurrence. PMID:25994353

  3. Infusing Quantitative Approaches throughout the Biological Sciences Curriculum

    ERIC Educational Resources Information Center

    Thompson, Katerina V.; Cooke, Todd J.; Fagan, William F.; Gulick, Denny; Levy, Doron; Nelson, Kären C.; Redish, Edward F.; Smith, Robert F.; Presson, Joelle

    2013-01-01

    A major curriculum redesign effort at the University of Maryland is infusing all levels of our undergraduate biological sciences curriculum with increased emphasis on interdisciplinary connections and quantitative approaches. The curriculum development efforts have largely been guided by recommendations in the National Research Council's "Bio…

  4. Quantitative Assessment of Arrhythmia Using Non-linear Approach: A Non-invasive Prognostic Tool

    NASA Astrophysics Data System (ADS)

    Chakraborty, Monisha; Ghosh, Dipak

    2017-12-01

    An accurate prognostic tool to identify the severity of arrhythmia has yet to be established, owing to the complexity of the ECG signal. In this paper, we show that quantitative assessment of arrhythmia is possible using a non-linear technique based on "Hurst Rescaled Range Analysis". Although the concept of applying "non-linearity" to the study of various cardiac dysfunctions is not entirely new, the novel objective of this paper is to identify the severity of the disease, to monitor different medicines and their doses, and to assess the efficiency of those medicines. The approach presented in this work is simple, which in turn will help doctors in efficient disease management. In this work, arrhythmia ECG time series were collected from the MIT-BIH database, and normal ECG time series were acquired using the POLYPARA system. Both time series were analyzed in the light of the non-linear approach following the method of rescaled range analysis, and the quantitative parameter "Fractal Dimension" (D) was obtained from both types of time series. The major finding is that arrhythmia ECG exhibits lower values of D than normal ECG. This information can be used to assess the severity of arrhythmia quantitatively, which is a new direction for prognosis, and adequate software may be developed for use in medical practice.
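
    A minimal sketch of rescaled range (R/S) analysis in Python (illustrative only; the data below are synthetic, not the MIT-BIH recordings): the Hurst exponent H is the slope of log(R/S) against log(window size), and the fractal dimension of the trace follows as D = 2 - H.

        import numpy as np

        def hurst_exponent(series, min_window=8):
            # For windows of doubling size n, average R/S over the series and
            # estimate H as the slope of log(R/S) versus log(n).
            x = np.asarray(series, dtype=float)
            sizes, rs_means = [], []
            n = min_window
            while n <= len(x) // 2:
                rs = []
                for start in range(0, len(x) - n + 1, n):
                    w = x[start:start + n]
                    dev = np.cumsum(w - w.mean())   # cumulative deviation
                    s = w.std()
                    if s > 0:
                        rs.append((dev.max() - dev.min()) / s)
                sizes.append(n)
                rs_means.append(np.mean(rs))
                n *= 2
            return np.polyfit(np.log(sizes), np.log(rs_means), 1)[0]

        rng = np.random.default_rng(1)
        trace = rng.normal(size=4096)   # white-noise stand-in for an ECG trace
        H = hurst_exponent(trace)
        print(f"H = {H:.2f}, fractal dimension D = {2 - H:.2f}")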

  6. Competency-Based Education: A Quantitative Study of the U.S. Air Force Noncommissioned Officer Academy

    ERIC Educational Resources Information Center

    Houser, Bonnie L.

    2017-01-01

    There are relatively few empirical studies that examine whether using a competency-based education (CBE) approach results in increased student learning or achievement when compared to traditional education approaches. This study uses a quantitative research methodology, a nonexperimental comparative descriptive research design, and a two-group…

  7. Surrogate matrix and surrogate analyte approaches for definitive quantitation of endogenous biomolecules.

    PubMed

    Jones, Barry R; Schultz, Gary A; Eckstein, James A; Ackermann, Bradley L

    2012-10-01

    Quantitation of biomarkers by LC-MS/MS is complicated by the presence of endogenous analytes. This challenge is most commonly overcome by calibration using an authentic standard spiked into a surrogate matrix devoid of the target analyte. A second approach involves use of a stable-isotope-labeled standard as a surrogate analyte to allow calibration in the actual biological matrix. For both methods, parallelism between calibration standards and the target analyte in biological matrix must be demonstrated in order to ensure accurate quantitation. In this communication, the surrogate matrix and surrogate analyte approaches are compared for the analysis of five amino acids in human plasma: alanine, valine, methionine, leucine and isoleucine. In addition, methodology based on standard addition is introduced, which enables a robust examination of parallelism in both surrogate analyte and surrogate matrix methods prior to formal validation. Results from additional assays are presented to introduce the standard-addition methodology and to highlight the strengths and weaknesses of each approach. For the analysis of amino acids in human plasma, comparable precision and accuracy were obtained by the surrogate matrix and surrogate analyte methods. Both assays were well within tolerances prescribed by regulatory guidance for validation of xenobiotic assays. When stable-isotope-labeled standards are readily available, the surrogate analyte approach allows for facile method development. By comparison, the surrogate matrix method requires greater up-front method development; however, this deficit is offset by the long-term advantage of simplified sample analysis.
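
    As a worked illustration of the standard-addition idea (all numbers below are hypothetical): fit the response against the spiked concentration and take the magnitude of the x-intercept as the endogenous level; comparing slopes across matrices is the parallelism check the abstract describes.

        import numpy as np

        # Spike increasing known amounts of analyte into the biological matrix,
        # fit response vs. spiked concentration, and read the endogenous
        # concentration off the magnitude of the x-intercept.
        spiked   = np.array([0.0, 5.0, 10.0, 20.0])   # added analyte, uM
        response = np.array([4.1, 6.1, 8.2, 12.3])    # peak-area ratio

        slope, intercept = np.polyfit(spiked, response, 1)
        endogenous = intercept / slope                # |x-intercept|
        print(f"estimated endogenous concentration ~ {endogenous:.1f} uM")
        # Parallelism check: compare this slope with the calibration slope in
        # the surrogate matrix; similar slopes support accurate quantitation.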

  8. Developing a Research Program Using Qualitative and Quantitative Approaches.

    ERIC Educational Resources Information Center

    Beck, Cheryl Tatano

    1997-01-01

    A research program on postpartum depression is used to illustrate the use of both qualitative and quantitative approaches. The direction of a research program is thus not limited by the type of methods in which a researcher has expertise. (SK)

  9. Development of a genus-specific next generation sequencing approach for sensitive and quantitative determination of the Legionella microbiome in freshwater systems.

    PubMed

    Pereira, Rui P A; Peplies, Jörg; Brettar, Ingrid; Höfle, Manfred G

    2017-03-31

    Next Generation Sequencing (NGS) has revolutionized the analysis of natural and man-made microbial communities by using universal primers for bacteria in a PCR-based approach targeting the 16S rRNA gene. In our study we narrowed primer specificity to a single, monophyletic genus, because for many questions in microbiology only a specific part of the whole microbiome is of interest. We chose the genus Legionella, comprising more than 20 pathogenic species, due to its high relevance for water-based respiratory infections. A new NGS-based approach was designed by sequencing 16S rRNA gene amplicons specific for the genus Legionella using the Illumina MiSeq technology. This approach was validated and applied to a set of representative freshwater samples. Our results revealed that the generated libraries presented a low average raw error rate per base (<0.5%) and substantiated the use of high-fidelity enzymes, such as KAPA HiFi, for increased sequence accuracy and quality. The approach also showed high in situ specificity (>95%) and very good repeatability. More than 1% non-Legionella sequences were observed only in samples in which the gammaproteobacterial clade SAR86 was present. Next-generation sequencing read counts did not reveal considerable amplification/sequencing biases and showed a sensitive as well as precise quantification of L. pneumophila along a dilution range using a spiked-in, certified genome standard. The genome standard and a mock community consisting of six different Legionella species demonstrated that the developed NGS approach was quantitative and specific at the level of individual species, including L. pneumophila. The sensitivity of our genus-specific approach was at least one order of magnitude higher compared to the universal NGS approach. Comparison with quantification by real-time PCR showed consistency with the NGS data. Overall, our NGS approach can determine the quantitative abundances of Legionella species, i.e., the complete Legionella

  10. Serum proteome profiling in canine idiopathic dilated cardiomyopathy using TMT-based quantitative proteomics approach.

    PubMed

    Bilić, Petra; Guillemin, Nicolas; Kovačević, Alan; Beer Ljubić, Blanka; Jović, Ines; Galan, Asier; Eckersall, Peter David; Burchmore, Richard; Mrljak, Vladimir

    2018-05-15

    Idiopathic dilated cardiomyopathy (iDCM) is a primary myocardial disorder of unknown aetiology, characterized by reduced contractility and ventricular dilation of the left or both ventricles. Naturally occurring canine iDCM was used herein to identify the serum proteomic signature of the disease compared to the healthy state, providing insight into underlying mechanisms and revealing proteins with biomarker potential. To achieve this, we used a high-throughput label-based quantitative LC-MS/MS proteomics approach and bioinformatics analysis of the in silico inferred interactome protein network created from the initial list of differential proteins. To complement the proteomic analysis, serum biochemical parameters and levels of known biomarkers of cardiac function were measured. Several proteins with biomarker potential were identified, such as inter-alpha-trypsin inhibitor heavy chain H4, microfibril-associated glycoprotein 4 and apolipoprotein A-IV, which were validated using an independent method (Western blotting) and showed high specificity and sensitivity according to receiver operating characteristic curve analysis. Bioinformatics analysis revealed the involvement of different pathways in iDCM, such as complement cascade activation, lipoprotein particle dynamics, elastic fibre formation, GPCR signalling and the respiratory electron transport chain. Idiopathic dilated cardiomyopathy is a severe primary myocardial disease of unknown cause, affecting both humans and dogs. This study contributes to canine heart disease research by means of state-of-the-art proteomic and bioinformatic analyses, following a similar approach to human iDCM research. Importantly, we used serum as a non-invasive and easily accessible biological source of information and contributed to the scarce data on biofluid proteome research on this topic. Bioinformatics analysis revealed biological pathways modulated in canine iDCM with potential for further targeted research. Also, several

  11. Multidimensional NMR approaches towards highly resolved, sensitive and high-throughput quantitative metabolomics.

    PubMed

    Marchand, Jérémy; Martineau, Estelle; Guitton, Yann; Dervilly-Pinel, Gaud; Giraudeau, Patrick

    2017-02-01

    Multi-dimensional NMR is an appealing approach for dealing with the challenging complexity of biological samples in metabolomics. This article describes how spectroscopists have recently challenged their imagination in order to make 2D NMR a powerful tool for quantitative metabolomics, based on innovative pulse sequences combined with meticulous analytical chemistry approaches. Clever time-saving strategies have also been explored to make 2D NMR a high-throughput tool for metabolomics, relying on alternative data acquisition schemes such as ultrafast NMR. Currently, much work is aimed at drastically boosting the NMR sensitivity thanks to hyperpolarisation techniques, which have been used in combination with fast acquisition methods and could greatly expand the application potential of NMR metabolomics. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Using PSEA-Quant for Protein Set Enrichment Analysis of Quantitative Mass Spectrometry-Based Proteomics

    PubMed Central

    Lavallée-Adam, Mathieu

    2017-01-01

    PSEA-Quant analyzes quantitative mass spectrometry-based proteomics datasets to identify enrichments of annotations contained in repositories such as the Gene Ontology and Molecular Signature databases. It allows users to identify the annotations that are significantly enriched for reproducibly quantified high abundance proteins. PSEA-Quant is available on the web and as a command-line tool. It is compatible with all label-free and isotopic labeling-based quantitative proteomics methods. This protocol describes how to use PSEA-Quant and interpret its output. The importance of each parameter as well as troubleshooting approaches are also discussed. PMID:27010334

  13. Allometric Trajectories and "Stress": A Quantitative Approach.

    PubMed

    Anfodillo, Tommaso; Petit, Giai; Sterck, Frank; Lechthaler, Silvia; Olson, Mark E

    2016-01-01

    The term "stress" is an important but vague term in plant biology. We show situations in which thinking in terms of "stress" is profitably replaced by quantifying distance from functionally optimal scaling relationships between plant parts. These relationships include, for example, the often-cited one between leaf area and sapwood area, which presumably reflects mutual dependence between source and sink tissues and which scales positively within individuals and across species. These relationships seem to be so basic to plant functioning that they are favored by selection across nearly all plant lineages. Within a species or population, individuals that are far from the common scaling patterns are thus expected to perform poorly. For instance, "too little" leaf area (e.g., due to herbivory or disease) per unit of active stem mass would be expected to yield low carbon income per respiratory cost and thus lead to lower growth. We present a framework that allows quantitative study of phenomena traditionally assigned to "stress," without need for recourse to this term. Our approach contrasts with traditional approaches for studying "stress," e.g., revealing that small "stressed" plants likely are in fact well suited to local conditions. We thus offer a quantitative perspective to the study of phenomena often referred to under such terms as "stress," plasticity, adaptation, and acclimation.

  14. A hybrid agent-based approach for modeling microbiological systems.

    PubMed

    Guo, Zaiyi; Sloot, Peter M A; Tay, Joc Cing

    2008-11-21

    Models for systems biology commonly adopt differential-equation or agent-based modeling approaches for simulating the processes as a whole. Models based on differential equations presuppose phenomenological intracellular behavioral mechanisms, while models based on the multi-agent approach often use directly translated, quantitatively less precise if-then rule constructs. We propose an extendible systems model based on a hybrid agent-based approach, where biological cells are modeled as individuals (agents) while molecules are represented by quantities. This hybridization in entity representation entails a combined modeling strategy with agent-based behavioral rules and differential equations, thereby balancing the requirements of extendible model granularity with computational tractability. We demonstrate the efficacy of this approach with models of chemotaxis involving an assay of 10^3 cells and 1.2x10^6 molecules. The model produces cell migration patterns that are comparable to laboratory observations.
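
    A minimal sketch of this hybridization in Python (illustrative only, not the authors' model): the molecule field is a quantity updated by a finite-difference diffusion step, while cells are discrete agents following a simple if-then chemotaxis rule. Grid size, rates and the gradient rule are all assumptions of this sketch.

        import numpy as np

        rng = np.random.default_rng(0)
        GRID = 50
        field = np.zeros((GRID, GRID))          # chemoattractant (quantity)
        field[GRID // 2, GRID // 2] = 100.0     # point source
        cells = rng.integers(0, GRID, size=(30, 2))  # agent positions

        D, dt = 0.2, 1.0
        for _ in range(200):
            # Continuum part: explicit diffusion update (5-point Laplacian).
            lap = (np.roll(field, 1, 0) + np.roll(field, -1, 0) +
                   np.roll(field, 1, 1) + np.roll(field, -1, 1) - 4 * field)
            field += D * dt * lap
            field[GRID // 2, GRID // 2] += 5.0  # source keeps secreting
            # Agent part: each cell moves toward the neighboring site with the
            # highest concentration (if-then rule), with occasional random tumbles.
            for k, (i, j) in enumerate(cells):
                moves = [((i + di) % GRID, (j + dj) % GRID)
                         for di in (-1, 0, 1) for dj in (-1, 0, 1)]
                if rng.random() < 0.1:
                    cells[k] = moves[rng.integers(len(moves))]
                else:
                    cells[k] = max(moves, key=lambda m: field[m])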

  15. Molecule kernels: a descriptor- and alignment-free quantitative structure-activity relationship approach.

    PubMed

    Mohr, Johannes A; Jain, Brijnesh J; Obermayer, Klaus

    2008-09-01

    Quantitative structure-activity relationship (QSAR) analysis is traditionally based on extracting a set of molecular descriptors and using them to build a predictive model. In this work, we propose a QSAR approach based directly on the similarity between the 3D structures of a set of molecules, measured by a so-called molecule kernel, which is independent of the spatial prealignment of the compounds. Predictors can be built using the molecule kernel in conjunction with the potential support vector machine (P-SVM), a recently proposed machine learning method for dyadic data. The resulting models make direct use of the structural similarities between the compounds in the test set and a subset of the training set and do not require explicit descriptor construction. We evaluated the predictive performance of the proposed method on one classification and four regression QSAR datasets and compared its results to the results reported in the literature for several state-of-the-art descriptor-based and 3D QSAR approaches. In this comparison, the proposed molecule kernel method performed better than the other QSAR methods.
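
    The molecule kernel itself is not specified in the abstract; as a stand-in, the sketch below builds a prealignment-free similarity from histograms of interatomic distances (a deliberately simplified surrogate, not the paper's kernel), which could then be plugged into a kernel machine such as the P-SVM.

        import numpy as np

        def distance_histogram(coords, bins=20, r_max=10.0):
            # Rotation- and translation-invariant summary: histogram of all
            # interatomic distances, so no spatial prealignment is needed.
            d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
            iu = np.triu_indices(len(coords), k=1)
            hist, _ = np.histogram(d[iu], bins=bins, range=(0.0, r_max),
                                   density=True)
            return hist

        def molecule_kernel(coords_a, coords_b, gamma=5.0):
            ha, hb = distance_histogram(coords_a), distance_histogram(coords_b)
            return float(np.exp(-gamma * np.sum((ha - hb) ** 2)))

        rng = np.random.default_rng(2)
        mol_a = rng.normal(size=(12, 3))          # hypothetical 3D coordinates
        rot = np.linalg.qr(rng.normal(size=(3, 3)))[0]
        mol_b = mol_a @ rot                       # rigidly rotated copy
        print(molecule_kernel(mol_a, mol_b))      # ~1.0: invariant to rotation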

  16. Classification-based quantitative analysis of stable isotope labeling by amino acids in cell culture (SILAC) data.

    PubMed

    Kim, Seongho; Carruthers, Nicholas; Lee, Joohyoung; Chinni, Sreenivasa; Stemmer, Paul

    2016-12-01

    Stable isotope labeling by amino acids in cell culture (SILAC) is a practical and powerful approach for quantitative proteomic analysis. A key advantage of SILAC is the ability to simultaneously detect the isotopically labeled peptides in a single instrument run, guaranteeing relative quantitation for a large number of peptides without introducing variation caused by separate experiments. However, only a few approaches are available for assessing protein ratios, and none of the existing algorithms pays particular attention to proteins having only one peptide hit. We introduce new quantitative approaches to dealing with SILAC protein-level summaries using classification-based methodologies, such as Gaussian mixture models with EM algorithms and their Bayesian counterparts, as well as K-means clustering. In addition, a new approach is developed using a Gaussian mixture model and a stochastic, metaheuristic global optimization algorithm, particle swarm optimization (PSO), to avoid premature convergence or becoming stuck in a local optimum. Our simulation studies show that the newly developed PSO-based method performs the best in terms of F1 score, and the proposed methods further demonstrate the ability to detect potential markers through real SILAC experimental data. The developed approach is applicable regardless of how many peptide hits a protein has, rescuing many proteins that would otherwise be removed. Furthermore, no additional correction for multiple comparisons is necessary for the developed methods, enabling direct interpretation of the analysis outcomes. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
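
    As a toy illustration of the classification idea (simulated log-ratios; scikit-learn's EM-based mixture fit rather than the authors' PSO-initialized implementation):

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(1)
        # log2 heavy/light ratios: a bulk of unchanged proteins plus a shifted
        # minority of differentially abundant ones (simulated).
        ratios = np.concatenate([rng.normal(0.0, 0.3, 900),
                                 rng.normal(2.0, 0.4, 100)])[:, None]

        gmm = GaussianMixture(n_components=2, random_state=0).fit(ratios)
        post = gmm.predict_proba(ratios)
        changed = int(np.argmax(gmm.means_.ravel()))   # shifted component
        candidates = np.where(post[:, changed] > 0.95)[0]
        print(len(candidates), "proteins flagged as differentially abundant")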

  17. High throughput and quantitative approaches for measuring circadian rhythms in cyanobacteria using bioluminescence

    PubMed Central

    Shultzaberger, Ryan K.; Paddock, Mark L.; Katsuki, Takeo; Greenspan, Ralph J.; Golden, Susan S.

    2016-01-01

    The temporal measurement of a bioluminescent reporter has proven to be one of the most powerful tools for characterizing circadian rhythms in the cyanobacterium Synechococcus elongatus. Primarily, two approaches have been used to automate this process: (1) detection of cell culture bioluminescence in 96-well plates by a photomultiplier tube-based plate-cycling luminometer (TopCount Microplate Scintillation and Luminescence Counter, Perkin Elmer) and (2) detection of individual colony bioluminescence by iteratively rotating a Petri dish under a cooled CCD camera using a computer-controlled turntable. Each approach has distinct advantages. The TopCount provides a more quantitative measurement of bioluminescence, enabling the direct comparison of clock output levels among strains. The computer-controlled turntable approach has a shorter set-up time and greater throughput, making it a more powerful phenotypic screening tool. While the latter approach is extremely useful, only a few labs have been able to build such an apparatus because of technical hurdles involved in coordinating and controlling both the camera and the turntable, and in processing the resulting images. This protocol provides instructions on how to construct, use, and process data from a computer-controlled turntable to measure the temporal changes in bioluminescence of individual cyanobacterial colonies. Furthermore, we describe how to prepare samples for use with the TopCount to minimize experimental noise, and generate meaningful quantitative measurements of clock output levels for advanced analysis. PMID:25662451

  18. Virtualising the Quantitative Research Methods Course: An Island-Based Approach

    ERIC Educational Resources Information Center

    Baglin, James; Reece, John; Baker, Jenalle

    2015-01-01

    Many recent improvements in pedagogical practice have been enabled by the rapid development of innovative technologies, particularly for teaching quantitative research methods and statistics. This study describes the design, implementation, and evaluation of a series of specialised computer laboratory sessions. The sessions combined the use of an…

  19. A synchrotron-based local computed tomography combined with data-constrained modelling approach for quantitative analysis of anthracite coal microstructure

    PubMed Central

    Chen, Wen Hao; Yang, Sam Y. S.; Xiao, Ti Qiao; Mayo, Sherry C.; Wang, Yu Dan; Wang, Hai Peng

    2014-01-01

    Quantifying three-dimensional spatial distributions of pores and material compositions in samples is a key materials characterization challenge, particularly in samples where compositions are distributed across a range of length scales and have similar X-ray absorption properties, such as coal. Consequently, conventional approaches may not provide the resolution and level of detail one might desire within sub-regions of a multi-length-scale sample. Herein, an approach for quantitative high-definition determination of material compositions from X-ray local computed tomography combined with a data-constrained modelling method is proposed. The approach can dramatically improve the spatial resolution, revealing finer details within a region of interest of a sample larger than the field of view than conventional techniques can. A coal sample containing distributions of porosity and several mineral compositions is employed to demonstrate the approach, with the optimal experimental parameters analyzed beforehand. The quantitative results demonstrated that the approach can reveal significantly finer details of compositional distributions in the sample region of interest. The elevated spatial resolution is crucial for coal-bed methane reservoir evaluation and for understanding the transformation of minerals during coal processing. The method is generic and can be applied to three-dimensional compositional characterization of other materials. PMID:24763649

  20. A quantitative risk analysis approach to port hydrocarbon logistics.

    PubMed

    Ronza, A; Carol, S; Espejo, V; Vílchez, J A; Arnaldos, J

    2006-01-16

    A method is presented that allows quantitative risk analysis to be performed on marine hydrocarbon terminals sited in ports. A significant gap was identified in the technical literature on QRA for the handling of hazardous materials in harbours published prior to this work. The analysis is extended to tanker navigation through port waters and to loading and unloading facilities. The steps of the method are discussed, beginning with data collection. For accident scenario identification, an approach is proposed that takes into account minor and massive spills due to loading arm failures and tank rupture. Frequency estimation is thoroughly reviewed and a shortcut approach is proposed for frequency calculation. This allows for the two-fold possibility of a tanker colliding/grounding at or near the berth, or while navigating to or from the berth. Probability data defining the likelihood of a cargo spill after an external impact on a tanker are discussed. For consequence and vulnerability estimates, a scheme is proposed for the use of ratios between the numbers of fatal victims, injured and evacuated people. Finally, an example application is given, based on a pilot study conducted in the Port of Barcelona, where the method was tested.

  2. Quantitative and qualitative approaches in the study of poverty and adolescent development: separation or integration?

    PubMed

    Leung, Janet T Y; Shek, Daniel T L

    2011-01-01

    This paper examines the use of quantitative and qualitative approaches to study the impact of economic disadvantage on family processes and adolescent development. Quantitative research has the merits of objectivity, good predictive and explanatory power, parsimony, precision and sophistication of analysis. Qualitative research, in contrast, provides a detailed, holistic, in-depth understanding of social reality and allows the illumination of new insights. Given the pragmatic considerations of methodological appropriateness, design flexibility, and situational responsiveness to the research inquiry, a mixed-methods approach can integrate quantitative and qualitative approaches and offers an alternative strategy for studying the impact of economic disadvantage on family processes and adolescent development.

  3. Isobaric Tags for Relative and Absolute Quantification (iTRAQ)-Based Untargeted Quantitative Proteomic Approach To Identify Change of the Plasma Proteins by Salbutamol Abuse in Beef Cattle.

    PubMed

    Zhang, Kai; Tang, Chaohua; Liang, Xiaowei; Zhao, Qingyu; Zhang, Junmin

    2018-01-10

    Salbutamol, a selective β2-agonist, endangers the safety of animal products as a result of its illegal use in food animals. In this study, an iTRAQ-based untargeted quantitative proteomic approach was applied to screen potential protein biomarkers in the plasma of cattle before and after treatment with salbutamol for 21 days. A total of 62 plasma proteins were significantly affected by salbutamol treatment and can serve as potential biomarkers for screening for the illegal use of salbutamol in beef cattle. Enzyme-linked immunosorbent assay measurements of five selected proteins demonstrated the reliability of iTRAQ-based proteomics in screening candidate biomarkers among the plasma proteins. The plasma samples collected before and after salbutamol treatment were well separated by principal component analysis (PCA) using the differentially expressed proteins. These results suggest that an iTRAQ-based untargeted quantitative proteomic strategy combined with PCA pattern recognition can discriminate differences in plasma protein profiles collected before and after salbutamol treatment.
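
    A small sketch of the PCA separation step (simulated abundances standing in for the 62 iTRAQ protein ratios; group sizes and effect are invented for illustration):

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(3)
        # Rows = plasma samples (8 pre- and 8 post-treatment), columns = 62
        # protein abundances; treatment shifts a subset of proteins.
        pre = rng.normal(0.0, 1.0, (8, 62))
        post = rng.normal(0.0, 1.0, (8, 62))
        post[:, :10] += 2.0

        X = StandardScaler().fit_transform(np.vstack([pre, post]))
        scores = PCA(n_components=2).fit_transform(X)
        # Pre and post samples separate along the first principal component.
        print(scores[:8, 0].mean(), scores[8:, 0].mean())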

  4. Testing process predictions of models of risky choice: a quantitative model comparison approach

    PubMed Central

    Pachur, Thorsten; Hertwig, Ralph; Gigerenzer, Gerd; Brandstätter, Eduard

    2013-01-01

    This article presents a quantitative model comparison contrasting the process predictions of two prominent views on risky choice. One view assumes a trade-off between probabilities and outcomes (or non-linear functions thereof) and the separate evaluation of risky options (expectation models). Another view assumes that risky choice is based on comparative evaluation, limited search, aspiration levels, and the forgoing of trade-offs (heuristic models). We derived quantitative process predictions for a generic expectation model and for a specific heuristic model, namely the priority heuristic (Brandstätter et al., 2006), and tested them in two experiments. The focus was on two key features of the cognitive process: acquisition frequencies (i.e., how frequently individual reasons are looked up) and direction of search (i.e., gamble-wise vs. reason-wise). In Experiment 1, the priority heuristic predicted direction of search better than the expectation model (although neither model predicted the acquisition process perfectly); acquisition frequencies, however, were inconsistent with both models. Additional analyses revealed that these frequencies were primarily a function of what Rubinstein (1988) called “similarity.” In Experiment 2, the quantitative model comparison approach showed that people seemed to rely more on the priority heuristic in difficult problems, but to make more trade-offs in easy problems. This finding suggests that risky choice may be based on a mental toolbox of strategies. PMID:24151472
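
    For concreteness, a sketch of the priority heuristic for simple two-outcome gain gambles, following the published description in Brandstätter et al. (2006) (this simplified version is an assumption of the illustration, not code from the article):

        def priority_heuristic(a, b):
            # a, b: two-outcome gain gambles ((min_gain, p_min), (max_gain, p_max)).
            max_gain = max(a[1][0], b[1][0])
            # Reason 1: minimum gains; aspiration level = 1/10 of the maximum gain.
            if abs(a[0][0] - b[0][0]) >= 0.1 * max_gain:
                return a if a[0][0] > b[0][0] else b
            # Reason 2: probabilities of the minimum gains; aspiration level = 0.1.
            if abs(a[0][1] - b[0][1]) >= 0.1:
                return a if a[0][1] < b[0][1] else b   # lower p(min) preferred
            # Reason 3: maximum gains decide.
            return a if a[1][0] > b[1][0] else b

        risky = ((0.0, 0.2), (5000.0, 0.8))    # hypothetical gambles
        safe = ((2400.0, 1.0), (2400.0, 0.0))
        print(priority_heuristic(risky, safe)) # min gains differ enough: safe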

  5. Using the realist perspective to link theory from qualitative evidence synthesis to quantitative studies: Broadening the matrix approach.

    PubMed

    van Grootel, Leonie; van Wesel, Floryt; O'Mara-Eves, Alison; Thomas, James; Hox, Joop; Boeije, Hennie

    2017-09-01

    This study describes an approach for the use of a specific type of qualitative evidence synthesis in the matrix approach, a mixed-studies reviewing method. The matrix approach compares quantitative and qualitative data at the review level by juxtaposing concrete recommendations from the qualitative evidence synthesis against interventions in primary quantitative studies. However, types of qualitative evidence synthesis associated with theory building generate theoretical models instead of recommendations, so their output cannot be used directly in the matrix approach and requires transformation, which the approach described here provides. The approach enables the inference of moderation effects instead of direct effects from the theoretical model developed in a qualitative evidence synthesis. Recommendations for practice are formulated on the basis of interactional relations inferred from the qualitative evidence synthesis. In doing so, we apply the realist perspective to model variables from the qualitative evidence synthesis according to the context-mechanism-outcome configuration. A worked example shows that it is possible to identify recommendations from a theory-building qualitative evidence synthesis using the realist perspective. We created subsets of the interventions from primary quantitative studies based on whether they matched the recommendations or not and compared the weighted mean effect sizes of the subsets. The comparison shows a slight difference in effect sizes between the groups of studies. The study concludes that the approach enhances the applicability of the matrix approach. Copyright © 2017 John Wiley & Sons, Ltd.

  6. Multicomponent quantitative spectroscopic analysis without reference substances based on ICA modelling.

    PubMed

    Monakhova, Yulia B; Mushtakova, Svetlana P

    2017-05-01

    A fast and reliable spectroscopic method for multicomponent quantitative analysis of targeted compounds with overlapping signals in complex mixtures has been established. The analytical approach is based on the preliminary chemometric extraction of qualitative and quantitative information from UV-vis and IR spectral profiles of a calibration system using independent component analysis (ICA). Using this quantitative model and the ICA resolution results of spectral profiling of "unknown" model mixtures, the absolute analyte concentrations in multicomponent mixtures and authentic samples were then calculated without reference solutions. Good recoveries, generally between 95% and 105%, were obtained. The method can be applied to any spectroscopic data that obey the Beer-Lambert-Bouguer law. The proposed method was tested on the analysis of vitamins and caffeine in energy drinks and of aromatic hydrocarbons in motor fuel, with errors within 10%. The results demonstrate that the proposed method is a promising tool for rapid simultaneous multicomponent analysis in the case of spectral overlap and the absence or inaccessibility of reference materials.
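
    A conceptual sketch of the calibration idea with simulated overlapping bands (scikit-learn's FastICA; all spectra, concentrations and band positions below are invented for illustration):

        import numpy as np
        from sklearn.decomposition import FastICA
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(4)
        wl = np.linspace(200, 400, 300)            # wavelength grid, nm
        def band(center, width):                   # Gaussian absorption band
            return np.exp(-0.5 * ((wl - center) / width) ** 2)

        pure = np.vstack([band(250, 15), band(280, 20), band(320, 25)])
        conc = rng.uniform(0.1, 1.0, (20, 3))      # calibration concentrations
        spectra = conc @ pure + rng.normal(0, 1e-3, (20, wl.size))

        ica = FastICA(n_components=3, random_state=0)
        weights = ica.fit_transform(spectra)       # per-sample component weights
        cal = LinearRegression().fit(weights, conc)

        unknown = np.array([0.4, 0.7, 0.2]) @ pure # "unknown" mixture spectrum
        print(cal.predict(ica.transform(unknown[None])).round(2))  # ~[0.4 0.7 0.2]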

  7. Method and platform standardization in MRM-based quantitative plasma proteomics.

    PubMed

    Percy, Andrew J; Chambers, Andrew G; Yang, Juncong; Jackson, Angela M; Domanski, Dominik; Burkhart, Julia; Sickmann, Albert; Borchers, Christoph H

    2013-12-16

    There exists a growing demand in the proteomics community to standardize experimental methods and liquid chromatography-mass spectrometry (LC/MS) platforms in order to enable the acquisition of more precise and accurate quantitative data. This necessity is heightened by the evolving trend of verifying and validating candidate disease biomarkers in complex biofluids, such as blood plasma, through targeted multiple reaction monitoring (MRM)-based approaches with stable isotope-labeled standards (SIS). Considering the lack of performance standards for quantitative plasma proteomics, we previously developed two reference kits to evaluate the MRM with SIS peptide approach using undepleted and non-enriched human plasma. The first kit tests the effectiveness of the LC/MRM-MS platform (kit #1), while the second evaluates the performance of an entire analytical workflow (kit #2). Here, these kits have been refined for practical use and then evaluated through intra- and inter-laboratory testing on 6 common LC/MS platforms. For an identical panel of 22 plasma proteins, similar concentrations were determined, regardless of the kit, instrument platform, and laboratory of analysis. These results demonstrate the value of the kit and reinforce the utility of standardized methods and protocols. The proteomics community needs standardized experimental protocols and quality control methods in order to improve the reproducibility of MS-based quantitative data. This need is heightened by the evolving trend for MRM-based validation of proposed disease biomarkers in complex biofluids such as blood plasma. We have developed two kits to assist in the inter- and intra-laboratory quality control of MRM experiments: the first kit tests the effectiveness of the LC/MRM-MS platform (kit #1), while the second evaluates the performance of an entire analytical workflow (kit #2). In this paper, we report the use of these kits in intra- and inter-laboratory testing on 6 common LC/MS platforms. This

  8. New approaches for the analysis of confluent cell layers with quantitative phase digital holographic microscopy

    NASA Astrophysics Data System (ADS)

    Pohl, L.; Kaiser, M.; Ketelhut, S.; Pereira, S.; Goycoolea, F.; Kemper, Björn

    2016-03-01

    Digital holographic microscopy (DHM) enables high-resolution non-destructive inspection of technical surfaces and minimally invasive label-free live cell imaging. However, the analysis of confluent cell layers represents a challenge, as quantitative DHM phase images in this case do not provide sufficient information for image segmentation, determination of the cellular dry mass or calculation of the cell thickness. We present novel strategies for the analysis of confluent cell layers with quantitative DHM phase contrast utilizing a histogram-based evaluation procedure. The applicability of our approach is illustrated by the quantification of drug-induced cell morphology changes, and it is shown that the method can reliably quantify global morphology changes of confluent cell layers.

  9. Real-time label-free quantitative fluorescence microscopy-based detection of ATP using a tunable fluorescent nano-aptasensor platform

    NASA Astrophysics Data System (ADS)

    Shrivastava, Sajal; Sohn, Il-Yung; Son, Young-Min; Lee, Won-Il; Lee, Nae-Eung

    2015-11-01

    Although real-time label-free fluorescent aptasensors based on nanomaterials are increasingly recognized as a useful strategy for the detection of target biomolecules with high fidelity, the lack of an imaging-based quantitative measurement platform limits their implementation with biological samples. Here we introduce an ensemble strategy for a real-time label-free fluorescent graphene (Gr) aptasensor platform. This platform employs aptamer length-dependent tunability, thus enabling the reagentless quantitative detection of biomolecules through computational processing coupled with real-time fluorescence imaging data. We demonstrate that this strategy effectively delivers dose-dependent quantitative readouts of adenosine triphosphate (ATP) concentration on chemical vapor deposited (CVD) Gr and reduced graphene oxide (rGO) surfaces, thereby providing cytotoxicity assessment. Compared with conventional fluorescence spectrometry methods, our highly efficient, universally applicable, and rational approach will facilitate broader implementation of imaging-based biosensing platforms for the quantitative evaluation of a range of target molecules.

  10. Quantitative genetic models of sexual conflict based on interacting phenotypes.

    PubMed

    Moore, Allen J; Pizzari, Tommaso

    2005-05-01

    Evolutionary conflict arises between reproductive partners when alternative reproductive opportunities are available. Sexual conflict can generate sexually antagonistic selection, which mediates sexual selection and intersexual coevolution. However, despite intense interest, the evolutionary implications of sexual conflict remain unresolved. We propose a novel theoretical approach to study the evolution of sexually antagonistic phenotypes based on quantitative genetics and the measure of social selection arising from male-female interactions. We consider the phenotype of one sex as both a genetically influenced evolving trait as well as the (evolving) social environment in which the phenotype of the opposite sex evolves. Several important points emerge from our analysis, including the relationship between direct selection on one sex and indirect effects through selection on the opposite sex. We suggest that the proposed approach may be a valuable tool to complement other theoretical approaches currently used to study sexual conflict. Most importantly, our approach highlights areas where additional empirical data can help clarify the role of sexual conflict in the evolutionary process.
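
    The abstract's core device, treating one sex's phenotype as part of the other's social environment, is commonly written in the interacting-phenotypes form of Moore et al. (1997); a minimal two-sex version (an illustrative formulation, not necessarily the authors' exact notation) is

    \[
    z_m = a_m + e_m + \psi_{mf}\, z_f, \qquad z_f = a_f + e_f + \psi_{fm}\, z_m ,
    \]

    where \(a\) and \(e\) are additive genetic and environmental components and the \(\psi\) coefficients measure the influence of the partner's phenotype. Solving the pair gives

    \[
    z_m = \frac{a_m + e_m + \psi_{mf}\,(a_f + e_f)}{1 - \psi_{mf}\,\psi_{fm}} ,
    \]

    which makes explicit how selection on one sex propagates to the other through the interaction coefficients, the route by which direct and indirect selection combine in such an analysis.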

  11. Human systems immunology: hypothesis-based modeling and unbiased data-driven approaches.

    PubMed

    Arazi, Arnon; Pendergraft, William F; Ribeiro, Ruy M; Perelson, Alan S; Hacohen, Nir

    2013-10-31

    Systems immunology is an emerging paradigm that aims at a more systematic and quantitative understanding of the immune system. Two major approaches have been utilized to date in this field: unbiased data-driven modeling to comprehensively identify molecular and cellular components of a system and their interactions; and hypothesis-based quantitative modeling to understand the operating principles of a system by extracting a minimal set of variables and rules underlying them. In this review, we describe applications of the two approaches to the study of viral infections and autoimmune diseases in humans, and discuss possible ways by which these two approaches can synergize when applied to human immunology. Copyright © 2012 Elsevier Ltd. All rights reserved.

  12. A quantitative approach to measure road network information based on edge diversity

    NASA Astrophysics Data System (ADS)

    Wu, Xun; Zhang, Hong; Lan, Tian; Cao, Weiwei; He, Jing

    2015-12-01

    The measurement of map information has been one of the key issues in assessing cartographic quality and map generalization algorithms. It is also important for developing efficient approaches to transfer geospatial information. The road network is the most common linear object in the real world, and an appropriate description of road network information will benefit road map generalization, navigation map production and urban planning. Most current approaches focus on node diversities and suppose that all edges are the same, which is inconsistent with real-life conditions and thus shows limitations in measuring network information. As real-life traffic flows are directed and of different quantities, the original undirected vector road map was first converted to a directed topographic connectivity map. Then, in consideration of preferential attachment in complex network studies and the rich-club phenomenon in social networks, from and to weights were assigned to each edge. The from weight of a given edge is defined as the ratio of the connectivity of its end node to the sum of the connectivities of all neighbors of its from node. After obtaining the from and to weights of each edge, edge information, node information and whole-network structure information entropies can be obtained based on information theory. The approach has been applied to several 1-square-mile road network samples. Results show that information entropies based on edge diversities can successfully describe the structural differences of road networks. This approach complements current map information measurements and can be extended to measure other kinds of geographical objects.
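
    A toy sketch of the edge-weight and entropy computation (Python with networkx; the from-weight formula below is a simplified reading of the abstract, and the graph is invented):

        import networkx as nx
        import numpy as np

        # Toy directed road graph; nodes are junctions, edges are one-way links.
        G = nx.DiGraph([(0, 1), (1, 2), (2, 0), (1, 3), (3, 4), (4, 1)])

        def from_weight(G, u, v):
            # Connectivity of the edge's end node relative to the connectivities
            # of all successors of its start node.
            nbrs = list(G.successors(u))
            return G.degree(v) / sum(G.degree(n) for n in nbrs)

        w = np.array([from_weight(G, u, v) for u, v in G.edges()])
        p = w / w.sum()                     # normalize edge weights
        print(f"edge-based network entropy: {-(p * np.log2(p)).sum():.3f} bits")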

  13. A Quantitative Approach for Collocating NEON's Sensor-Based Ecological Measurements and in-situ Field Sampling and Observations

    NASA Astrophysics Data System (ADS)

    Zulueta, R. C.; Metzger, S.; Ayres, E.; Luo, H.; Meier, C. L.; Barnett, D.; Sanclements, M.; Elmendorf, S.

    2013-12-01

    The National Ecological Observatory Network (NEON) is a continental-scale research platform currently in development to assess the causes of ecological change and biological responses to change across a projected 30-year timeframe. A suite of standardized sensor-based measurements (i.e., Terrestrial Instrument System (TIS) measurements) and in-situ field sampling and observations (i.e., Terrestrial Observation System (TOS) activities) will be conducted across 20 ecoclimatic domains in the U.S. where NEON is establishing 60 terrestrial research sites. NEON's TIS measurements and TOS activities are designed to observe the temporal and spatial dynamics of key drivers and ecological processes and responses to change within each of the 60 terrestrial research sites. The TIS measurements are non-destructive and designed to provide in-situ, continuous, and areally integrated observations of the surrounding ecosystem and environment, while TOS sampling and observation activities are designed to encompass a hierarchy of measurable biological states and processes including diversity, abundance, phenology, demography, infectious disease prevalence, ecohydrology, and biogeochemistry. To establish valid relationships between these drivers and site-specific responses, two contradicting requirements must be fulfilled: (i) both types of observations shall be representative of the same ecosystem, and (ii) they shall not significantly influence one another. Here we outline the theoretical background and algorithmic process for determining areas of mutual representativeness and exclusion around NEON's TIS measurements and develop a procedure which quantitatively optimizes this trade-off through: (i) quantifying the source area distributions of TIS measurements, (ii) determining the ratio of user-defined impact threshold to effective impact area for different TOS activities, and (iii) determining the range of feasible distances between TIS locations and TOS activities. This approach

  14. Allelic-based gene-gene interaction associated with quantitative traits.

    PubMed

    Jung, Jeesun; Sun, Bin; Kwon, Deukwoo; Koller, Daniel L; Foroud, Tatiana M

    2009-05-01

    Recent studies have shown that quantitative phenotypes may be influenced not only by multiple single nucleotide polymorphisms (SNPs) within a gene but also by the interaction between SNPs at unlinked genes. We propose a new statistical approach that can detect gene-gene interactions at the allelic level which contribute to the phenotypic variation in a quantitative trait. By testing for the association of allelic combinations at multiple unlinked loci with a quantitative trait, we can detect the SNP allelic interaction whether or not it can be detected as a main effect. Our proposed method assigns a score to unrelated subjects according to their allelic combination inferred from observed genotypes at two or more unlinked SNPs, and then tests for the association of the allelic score with a quantitative trait. To investigate the statistical properties of the proposed method, we performed a simulation study to estimate type I error rates and power, and demonstrated that this allelic approach achieves greater power than the more commonly used genotypic approach to testing for gene-gene interaction. As an example, the proposed method was applied to data obtained as part of a candidate gene study of sodium retention by the kidney. We found that this method detects an interaction between the calcium-sensing receptor gene (CaSR), the chloride channel gene (CLCNKB) and the Na-K-2Cl cotransporter gene (SLC12A1) that contributes to variation in diastolic blood pressure.
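
    A highly simplified sketch of the testing idea (simulated data; the paper scores inferred allelic combinations, while this toy version groups subjects by two-locus genotype combinations and uses a one-way ANOVA as a stand-in):

        import numpy as np
        from scipy.stats import f_oneway

        rng = np.random.default_rng(5)
        n = 300
        snp1 = rng.integers(0, 3, n)   # minor-allele counts at locus 1
        snp2 = rng.integers(0, 3, n)   # minor-allele counts at locus 2
        combo = snp1 * 3 + snp2        # one class per two-locus combination
        # Trait with an interaction effect only when both minor alleles present.
        trait = rng.normal(0, 1, n) + 0.8 * ((snp1 > 0) & (snp2 > 0))

        groups = [trait[combo == g] for g in np.unique(combo)]
        F, p = f_oneway(*groups)
        print(f"F = {F:.2f}, p = {p:.4f}")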

  15. 2D Hydrodynamic Based Logic Modeling Tool for River Restoration Decision Analysis: A Quantitative Approach to Project Prioritization

    NASA Astrophysics Data System (ADS)

    Bandrowski, D.; Lai, Y.; Bradley, N.; Gaeuman, D. A.; Murauskas, J.; Som, N. A.; Martin, A.; Goodman, D.; Alvarez, J.

    2014-12-01

    In the field of river restoration science there is a growing need for analytical modeling tools and quantitative processes to help identify and prioritize project sites. 2D hydraulic models have become more common in recent years, and with the availability of robust data sets and computing technology it is now possible to evaluate large river systems at the reach scale. The Trinity River Restoration Program is now analyzing a 40-mile segment of the Trinity River to determine priority and implementation sequencing for its Phase II rehabilitation projects. A comprehensive approach and quantitative tool, referred to as 2D-Hydrodynamic Based Logic Modeling (2D-HBLM), has recently been developed to analyze this complex river system. This tool utilizes various hydraulic output parameters combined with biological, ecological, and physical metrics at user-defined spatial scales. These metrics and their associated algorithms are the underpinnings of the 2D-HBLM habitat module used to evaluate geomorphic characteristics, riverine processes, and habitat complexity. The habitat metrics are further integrated into a comprehensive logic model framework to perform statistical analyses to assess project prioritization. The logic model will analyze various potential project sites by evaluating connectivity using principal component methods. The 2D-HBLM tool will help inform managers and decision makers by using a quantitative process to optimize desired response variables while balancing important limiting factors in determining the highest-priority locations within the river corridor for implementing restoration projects. Effective river restoration prioritization starts with well-crafted goals that identify the biological objectives, address underlying causes of habitat change, and recognize that social, economic, and land use limiting factors may constrain restoration options (Beechie et al. 2008). Applying natural resources management actions, like restoration prioritization, is

  16. The Impact of Situation-Based Learning to Students’ Quantitative Literacy

    NASA Astrophysics Data System (ADS)

    Latifah, T.; Cahya, E.; Suhendra

    2017-09-01

    Nowadays, the use of quantities can be seen almost everywhere, and quantitative thinking, such as quantitative reasoning and quantitative literacy, has grown in importance within the context of daily life. However, many people are still not fully equipped with quantitative thinking and lack the quantitative skills to perform well in today's society. In response to this issue, the research aims to improve students' quantitative literacy in junior high school. The qualitative analysis of written student work and video observations during the experiment reveals that situation-based learning affects students' quantitative literacy.

  17. Accurate Quantitative Sensing of Intracellular pH based on Self-ratiometric Upconversion Luminescent Nanoprobe.

    PubMed

    Li, Cuixia; Zuo, Jing; Zhang, Li; Chang, Yulei; Zhang, Youlin; Tu, Langping; Liu, Xiaomin; Xue, Bin; Li, Qiqing; Zhao, Huiying; Zhang, Hong; Kong, Xianggui

    2016-12-09

    Accurate quantitation of intracellular pH (pHi) is of great importance in revealing cellular activities and providing early warning of disease. A series of fluorescence-based nano-bioprobes composed of different nanoparticles and/or dye pairs have already been developed for pHi sensing. To date, the biological autofluorescence background under UV-Vis excitation and the severe photobleaching of dyes are the two main factors impeding accurate quantitative detection of pHi. Herein, we have developed a self-ratiometric luminescence nanoprobe based on Förster resonance energy transfer (FRET) for probing pHi, in which pH-sensitive fluorescein isothiocyanate (FITC) and upconversion nanoparticles (UCNPs) serve as energy acceptor and donor, respectively. Under 980 nm excitation, the upconversion emission bands at 475 nm and 645 nm of NaYF4:Yb3+,Tm3+ UCNPs were used as the pHi-responsive and self-ratiometric reference signals, respectively. This direct quantitative sensing approach circumvents the traditional software-based post-processing of images, which can introduce relatively large uncertainty into the results. Owing to the efficient FRET and fluorescence-background-free detection, highly sensitive and accurate sensing was achieved, featuring a response of 3.56 per pH unit over the pHi range 3.0-7.0 with a deviation of less than 0.43. This approach should facilitate research in pHi-related areas and the development of intracellular drug delivery systems.

  18. Accurate Quantitative Sensing of Intracellular pH based on Self-ratiometric Upconversion Luminescent Nanoprobe

    NASA Astrophysics Data System (ADS)

    Li, Cuixia; Zuo, Jing; Zhang, Li; Chang, Yulei; Zhang, Youlin; Tu, Langping; Liu, Xiaomin; Xue, Bin; Li, Qiqing; Zhao, Huiying; Zhang, Hong; Kong, Xianggui

    2016-12-01

    Accurate quantitation of intracellular pH (pHi) is of great importance in revealing cellular activities and providing early warning of disease. A series of fluorescence-based nano-bioprobes composed of different nanoparticles and/or dye pairs have already been developed for pHi sensing. To date, the biological autofluorescence background under UV-Vis excitation and the severe photobleaching of dyes are the two main factors impeding accurate quantitative detection of pHi. Herein, we have developed a self-ratiometric luminescence nanoprobe based on Förster resonance energy transfer (FRET) for probing pHi, in which pH-sensitive fluorescein isothiocyanate (FITC) and upconversion nanoparticles (UCNPs) serve as energy acceptor and donor, respectively. Under 980 nm excitation, the upconversion emission bands at 475 nm and 645 nm of NaYF4:Yb3+,Tm3+ UCNPs were used as the pHi-responsive and self-ratiometric reference signals, respectively. This direct quantitative sensing approach circumvents the traditional software-based post-processing of images, which can introduce relatively large uncertainty into the results. Owing to the efficient FRET and fluorescence-background-free detection, highly sensitive and accurate sensing was achieved, featuring a response of 3.56 per pH unit over the pHi range 3.0-7.0 with a deviation of less than 0.43. This approach should facilitate research in pHi-related areas and the development of intracellular drug delivery systems.
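
    The self-ratiometric readout reduces to dividing the pH-responsive 475 nm band by the pH-inert 645 nm reference and mapping the ratio onto pH through a calibration curve. A minimal sketch assuming a linear calibration anchored at pH 7; the linearity and the intercept are assumptions, with only the 3.56-per-pH-unit sensitivity taken from the abstract:

        import numpy as np

        def ph_from_ratio(i_475, i_645, slope=3.56, ratio_at_ph7=1.0):
            # self-ratiometric signal: pH-sensitive band over reference band
            ratio = np.asarray(i_475) / np.asarray(i_645)
            # invert an assumed linear calibration:
            # ratio = slope * (pH - 7) + ratio_at_ph7
            return 7.0 + (ratio - ratio_at_ph7) / slope

        print(ph_from_ratio([0.86, 1.07], [1.0, 1.0]))  # made-up intensities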

  19. The craniocaudal extension of posterolateral approaches and their combination: a quantitative anatomic and clinical analysis.

    PubMed

    Safavi-Abbasi, Sam; de Oliveira, Jean G; Deshmukh, Pushpa; Reis, Cassius V; Brasiliense, Leonardo B C; Crawford, Neil R; Feiz-Erfan, Iman; Spetzler, Robert F; Preul, Mark C

    2010-03-01

    The aim of this study was to describe quantitatively the properties of the posterolateral approaches and their combination. Six silicone-injected cadaveric heads were dissected bilaterally. Quantitative data were generated with the Optotrak 3020 system (Northern Digital, Waterloo, Canada) and Surgiscope (Elekta Instruments, Inc., Atlanta, GA), including key anatomic points on the skull base and brainstem. All parameters were measured after the basic retrosigmoid craniectomy and then after combination with a basic far-lateral extension. The clinical results of 20 patients who underwent a combined retrosigmoid and far-lateral approach were reviewed. The change in accessibility to the lower clivus was greatest after the far-lateral extension (mean change, 43.62 ± 10.98 mm²; P = .001). Accessibility to the constant landmarks, Meckel's cave, internal auditory meatus, and jugular foramen did not change significantly between the 2 approaches (P > .05). The greatest change in accessibility to soft tissue between the 2 approaches was to the lower brainstem (mean change, 33.88 ± 5.25 mm²; P = .0001). Total removal was achieved in 75% of the cases. The average postoperative Glasgow Outcome Scale score of patients who underwent the combined retrosigmoid and far-lateral approach improved significantly compared with the preoperative scores. The combination of the far-lateral and simple retrosigmoid approaches significantly increases the petroclival working area and access to the cranial nerves. However, the risk of injury to neurovascular structures and the time needed to extend the craniotomy must be weighed against the increased working area and angles of attack.

  20. CRAFT (complete reduction to amplitude frequency table)--robust and time-efficient Bayesian approach for quantitative mixture analysis by NMR.

    PubMed

    Krishnamurthy, Krish

    2013-12-01

    The intrinsic quantitative nature of NMR is increasingly exploited in areas ranging from complex mixture analysis (as in metabolomics and reaction monitoring) to quality assurance/control. Complex NMR spectra are more common than not, and therefore extraction of quantitative information generally involves significant prior knowledge and/or operator interaction to characterize resonances of interest. Moreover, in most NMR-based metabolomic experiments, the signals from metabolites are normally present as a mixture of overlapping resonances, making quantification difficult. Time-domain Bayesian approaches have been reported to be better than conventional frequency-domain analysis at identifying subtle changes in signal amplitude. We discuss an approach that exploits Bayesian analysis to achieve a complete reduction to amplitude frequency table (CRAFT) in an automated and time-efficient fashion, thus converting the time-domain FID to a frequency-amplitude table. CRAFT uses a two-step approach to FID analysis: first, the FID is digitally filtered and downsampled into several sub-FIDs; second, these sub-FIDs are modeled as sums of decaying sinusoids using the Bayesian approach. CRAFT tables can be used for further data mining of quantitative information using fingerprint chemical shifts of compounds of interest and/or statistical analysis of changes in analyte quantities in a biological study (metabolomics), a process study (reaction monitoring), or quality assurance/control. The basic principles behind this approach, as well as results evaluating its effectiveness in mixture analysis, are presented. Copyright © 2013 John Wiley & Sons, Ltd.
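
    To make the modeling step concrete, the sketch below fits a sub-FID as a sum of decaying complex sinusoids; ordinary nonlinear least squares stands in for CRAFT's Bayesian estimation, and all signal parameters are invented:

        import numpy as np
        from scipy.optimize import least_squares

        def fid_model(p, t):
            # p holds [amplitude, frequency (Hz), decay (1/s), phase] per line
            y = np.zeros_like(t, dtype=complex)
            for a, f, r, ph in p.reshape(-1, 4):
                y += a * np.exp(1j * (2 * np.pi * f * t + ph) - r * t)
            return y

        def fit_fid(fid, t, p0):
            # stack real and imaginary residuals of the complex signal
            resid = lambda p: np.concatenate(((fid - fid_model(p, t)).real,
                                              (fid - fid_model(p, t)).imag))
            return least_squares(resid, p0).x

        t = np.arange(2048) / 2000.0                 # toy 2 kHz sub-FID
        truth = np.array([1.0, 120.0, 8.0, 0.1,      # two overlapping lines
                          0.5, 135.0, 12.0, -0.2])
        rng = np.random.default_rng(1)
        fid = fid_model(truth, t) + 0.01 * (rng.normal(size=t.size)
                                            + 1j * rng.normal(size=t.size))
        print(fit_fid(fid, t, truth * 1.05).reshape(-1, 4))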

  1. Fragment-based quantitative structure-activity relationship (FB-QSAR) for fragment-based drug design.

    PubMed

    Du, Qi-Shi; Huang, Ri-Bo; Wei, Yu-Tuo; Pang, Zong-Wen; Du, Li-Qin; Chou, Kuo-Chen

    2009-01-30

    In cooperation with fragment-based design, a new drug design method, the so-called "fragment-based quantitative structure-activity relationship" (FB-QSAR), is proposed. The essence of the new method is that the molecular framework in a family of drug candidates is divided into several fragments according to the substituents being investigated. The bioactivities of the molecules are correlated with the physicochemical properties of the molecular fragments through two sets of coefficients in the linear free-energy equations: one set for the physicochemical properties and the other for the weight factors of the molecular fragments. Meanwhile, an iterative double least squares (IDLS) technique is developed to solve the two sets of coefficients in a training data set alternately and iteratively. The IDLS technique is a feedback procedure with machine-learning ability. The standard two-dimensional quantitative structure-activity relationship (2D-QSAR) is a special case of FB-QSAR in which the whole molecule is treated as one entity. The FB-QSAR approach can remarkably enhance predictive power and provide more structural insights for rational drug design. As an example, FB-QSAR is applied to build a predictive model of neuraminidase inhibitors for drug development against the H5N1 influenza virus. (c) 2008 Wiley Periodicals, Inc.
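
    A minimal sketch of the IDLS idea for the bilinear form y_k = sum_i w_i sum_j c_j x_kij: alternate two ordinary least-squares solves until both coefficient sets stabilize. Variable names and the normalization used to fix the bilinear scale ambiguity are our own:

        import numpy as np

        def idls(X, y, n_iter=50):
            # X: (molecules, fragments, properties); y: bioactivities
            c = np.ones(X.shape[2])                 # property coefficients
            for _ in range(n_iter):
                c /= np.linalg.norm(c)              # fix the scale ambiguity
                A_w = X @ c                         # (molecules, fragments)
                w, *_ = np.linalg.lstsq(A_w, y, rcond=None)  # fragment weights
                A_c = np.einsum('kij,i->kj', X, w)  # (molecules, properties)
                c, *_ = np.linalg.lstsq(A_c, y, rcond=None)
            return w, c

        rng = np.random.default_rng(0)
        X = rng.normal(size=(40, 3, 4))             # toy training set
        w0 = np.array([1.0, -0.5, 2.0])
        c0 = np.array([0.3, 1.2, -0.7, 0.5])
        y = np.einsum('kij,i,j->k', X, w0, c0)
        w, c = idls(X, y)
        print(np.abs(np.outer(w, c) - np.outer(w0, c0)).max())  # ~0 if recovered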

  2. A Quantitative Study of Teacher Readiness to Teach School-Based HIV/AIDS Education in Kenyan Primary Schools

    ERIC Educational Resources Information Center

    Lang'at, Edwin K.

    2014-01-01

    Purpose and Method of Study: The purpose of this study was to investigate teachers' self-perceived readiness to teach school-based HIV/AIDS Awareness and Prevention education in Kenyan primary schools based on their knowledge, attitudes and instructional confidence. This research utilized a non-experimental quantitative approach with a…

  3. Systematic assessment of survey scan and MS2-based abundance strategies for label-free quantitative proteomics using high-resolution MS data.

    PubMed

    Tu, Chengjian; Li, Jun; Sheng, Quanhu; Zhang, Ming; Qu, Jun

    2014-04-04

    Survey-scan-based label-free methods have shown no compelling benefit over fragment-ion (MS2)-based approaches when low-resolution mass spectrometry (MS) was used, but the growing prevalence of high-resolution analyzers may have changed the game. This necessitates an updated, comparative investigation of these approaches for data acquired by high-resolution MS. Here, we compared survey-scan-based (ion current, IC) and MS2-based abundance features, including spectral count (SpC) and MS2 total ion current (MS2-TIC), for quantitative analysis using various high-resolution LC/MS data sets. Key discoveries include: (i) a study with seven different biological data sets revealed that only IC achieved high reproducibility for lower-abundance proteins; (ii) evaluation with 5-replicate analyses of a yeast sample showed that IC provided much higher quantitative precision and less missing data; (iii) IC, SpC, and MS2-TIC all showed good quantitative linearity (R2 > 0.99) over a >1000-fold concentration range; (iv) both MS2-TIC and IC, but not SpC, showed a good linear response to various protein loading amounts; (v) quantification using a well-characterized CPTAC data set showed that IC exhibited markedly higher quantitative accuracy, higher sensitivity, and fewer false positives/false negatives than both SpC and MS2-TIC. Therefore, IC achieved overall superior performance to the MS2-based strategies in terms of reproducibility, missing data, quantitative dynamic range, quantitative accuracy, and biomarker discovery.

  4. Systematic Assessment of Survey Scan and MS2-Based Abundance Strategies for Label-Free Quantitative Proteomics Using High-Resolution MS Data

    PubMed Central

    2015-01-01

    Survey-scan-based label-free methods have shown no compelling benefit over fragment-ion (MS2)-based approaches when low-resolution mass spectrometry (MS) was used, but the growing prevalence of high-resolution analyzers may have changed the game. This necessitates an updated, comparative investigation of these approaches for data acquired by high-resolution MS. Here, we compared survey-scan-based (ion current, IC) and MS2-based abundance features, including spectral count (SpC) and MS2 total ion current (MS2-TIC), for quantitative analysis using various high-resolution LC/MS data sets. Key discoveries include: (i) a study with seven different biological data sets revealed that only IC achieved high reproducibility for lower-abundance proteins; (ii) evaluation with 5-replicate analyses of a yeast sample showed that IC provided much higher quantitative precision and less missing data; (iii) IC, SpC, and MS2-TIC all showed good quantitative linearity (R2 > 0.99) over a >1000-fold concentration range; (iv) both MS2-TIC and IC, but not SpC, showed a good linear response to various protein loading amounts; (v) quantification using a well-characterized CPTAC data set showed that IC exhibited markedly higher quantitative accuracy, higher sensitivity, and fewer false positives/false negatives than both SpC and MS2-TIC. Therefore, IC achieved overall superior performance to the MS2-based strategies in terms of reproducibility, missing data, quantitative dynamic range, quantitative accuracy, and biomarker discovery. PMID:24635752
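
    For concreteness, the two families of abundance features can be contrasted on a toy peptide-spectrum-match (PSM) table; the column names are invented, and real pipelines aggregate far more carefully (peak detection, normalization, shared-peptide handling):

        import pandas as pd

        psms = pd.DataFrame({
            "protein":  ["P1", "P1", "P1", "P2", "P2"],
            "ms1_area": [2.0e7, 1.8e7, 5.0e6, 9.0e6, 1.1e7],
            "ms2_tic":  [3.1e5, 2.9e5, 8.0e4, 1.5e5, 1.7e5],
        })

        per_protein = psms.groupby("protein").agg(
            spc=("ms1_area", "size"),    # spectral count: number of PSMs
            ic=("ms1_area", "sum"),      # ion current: summed MS1 peak areas
            ms2_tic=("ms2_tic", "sum"),  # summed MS2 total ion current
        )
        print(per_protein)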

  5. Research design: qualitative, quantitative and mixed methods approaches. Creswell, John W. Sage. 320pp. £29. ISBN 0761924426.

    PubMed

    2004-09-01

    The second edition of Creswell's book has been significantly revised and updated. The author clearly sets out three approaches to research: quantitative, qualitative and mixed methods. As someone who has used mixed methods in my research, it is refreshing to read a textbook that addresses this. The differences between the approaches are clearly identified, and a rationale for using each methodological stance is provided.

  6. Navigational Traffic Conflict Technique: A Proactive Approach to Quantitative Measurement of Collision Risks in Port Waters

    NASA Astrophysics Data System (ADS)

    Debnath, Ashim Kumar; Chin, Hoong Chor

    Navigational safety analysis relying on collision statistics is often hampered by the low number of observations. A promising alternative approach that overcomes this problem is proposed in this paper. By analyzing critical vessel interactions, this approach proactively measures collision risk in port waters. The proposed method is illustrated for quantitative measurement of collision risks in Singapore port fairways, and validated by examining correlations between the measured risks and those perceived by pilots. This method is an ethically appealing alternative to collision-based analysis for fast, reliable and effective safety assessment, and thus possesses great potential for managing collision risks in port waters.
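
    Conflict-based risk measurement starts from vessel-pair proximity indicators such as the distance and time at the closest point of approach (DCPA/TCPA). The sketch below computes them for straight-line motion; it is a generic illustration, not the paper's specific risk operationalization:

        import numpy as np

        def cpa(p_own, v_own, p_tgt, v_tgt):
            # positions in nautical miles, velocities in knots (2D vectors)
            dp = np.subtract(p_tgt, p_own)        # relative position
            dv = np.subtract(v_tgt, v_own)        # relative velocity
            denom = dv @ dv
            tcpa = 0.0 if denom == 0 else max(0.0, -(dp @ dv) / denom)  # h
            dcpa = np.linalg.norm(dp + dv * tcpa)                       # nm
            return dcpa, tcpa

        # crossing encounter: own ship heading east, target closing from north
        print(cpa([0, 0], [10, 0], [2, 2], [0, -8]))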

  7. Real-time label-free quantitative fluorescence microscopy-based detection of ATP using a tunable fluorescent nano-aptasensor platform.

    PubMed

    Shrivastava, Sajal; Sohn, Il-Yung; Son, Young-Min; Lee, Won-Il; Lee, Nae-Eung

    2015-12-14

    Although real-time label-free fluorescent aptasensors based on nanomaterials are increasingly recognized as a useful strategy for the detection of target biomolecules with high fidelity, the lack of an imaging-based quantitative measurement platform limits their implementation with biological samples. Here we introduce an ensemble strategy for a real-time label-free fluorescent graphene (Gr) aptasensor platform. This platform employs aptamer length-dependent tunability, thus enabling the reagentless quantitative detection of biomolecules through computational processing coupled with real-time fluorescence imaging data. We demonstrate that this strategy effectively delivers dose-dependent quantitative readouts of adenosine triphosphate (ATP) concentration on chemical vapor deposited (CVD) Gr and reduced graphene oxide (rGO) surfaces, thereby providing cytotoxicity assessment. Compared with conventional fluorescence spectrometry methods, our highly efficient, universally applicable, and rational approach will facilitate broader implementation of imaging-based biosensing platforms for the quantitative evaluation of a range of target molecules.

  8. Applications of Microfluidics in Quantitative Biology.

    PubMed

    Bai, Yang; Gao, Meng; Wen, Lingling; He, Caiyun; Chen, Yuan; Liu, Chenli; Fu, Xiongfei; Huang, Shuqiang

    2018-05-01

    Quantitative biology is dedicated to taking advantage of quantitative reasoning and advanced engineering technologies to make biology more predictable. Microfluidics, as an emerging technique, provides new approaches to precisely control fluidic conditions on small scales and to collect data in high-throughput and quantitative manners. In this review, we present the relevant applications of microfluidics to quantitative biology in two major categories (channel-based microfluidics and droplet-based microfluidics) and describe their typical features. We also envision other microfluidic techniques that are not yet employed in quantitative biology but have great potential in the near future. © 2017 Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences. Biotechnology Journal Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  9. Protocol for Standardizing High-to-Moderate Abundance Protein Biomarker Assessments Through an MRM-with-Standard-Peptides Quantitative Approach.

    PubMed

    Percy, Andrew J; Yang, Juncong; Chambers, Andrew G; Mohammed, Yassene; Miliotis, Tasso; Borchers, Christoph H

    2016-01-01

    Quantitative mass spectrometry (MS)-based approaches are emerging as a core technology for addressing health-related queries in systems biology and in the biomedical and clinical fields. In several 'omics disciplines (proteomics included), an approach centered on selected or multiple reaction monitoring (SRM or MRM)-MS with stable isotope-labeled standards (SIS), at the protein or peptide level, has emerged as the most precise technique for quantifying and screening putative analytes in biological samples. To enable the widespread use of MRM-based protein quantitation for disease biomarker assessment studies and its ultimate acceptance for clinical analysis, the technique must be standardized to facilitate precise and accurate protein quantitation. To that end, we have developed a number of kits for assessing method/platform performance, as well as for screening proposed candidate protein biomarkers in various human biofluids. Collectively, these kits utilize a bottom-up LC-MS methodology with SIS peptides as internal standards and quantify proteins using regression analysis of standard curves. This chapter details the methodology used to quantify 192 plasma proteins of high-to-moderate abundance (covering a 6-order-of-magnitude range, from 31 mg/mL for albumin to 18 ng/mL for peroxiredoxin-2), and a 21-protein subset thereof. We also describe the application of this method to patient samples for biomarker discovery and verification studies. Additionally, we introduce our recently developed Qualis-SIS software, which is used to expedite the analysis and assessment of protein quantitation data in control and patient samples.
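
    The quantitation step itself is a regression of the analyte-to-SIS peak-area ratio against known concentrations, followed by inverse prediction for unknowns. A minimal sketch with made-up responses for a single peptide:

        import numpy as np

        def quantify(conc_std, ratio_std, ratio_sample):
            # linear standard curve: ratio = slope * concentration + intercept
            slope, intercept = np.polyfit(conc_std, ratio_std, 1)
            return (np.asarray(ratio_sample) - intercept) / slope

        conc = np.array([1.0, 5.0, 25.0, 125.0, 625.0])     # fmol on column
        ratio = np.array([0.021, 0.098, 0.51, 2.48, 12.6])  # invented data
        print(quantify(conc, ratio, [0.35, 4.1]))           # unknowns, fmol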

  10. Rock Slide Risk Assessment: A Semi-Quantitative Approach

    NASA Astrophysics Data System (ADS)

    Duzgun, H. S. B.

    2009-04-01

    , four of the slides caused the formation of tsunami waves that washed up to 74 m above the lake level. Two of the slides resulted in many fatalities in the inner part of the Loen Valley as well as great damage. There are three predominant joint structures in Ramnefjell Mountain, which control failure and the geometry of the slides. The first joint set is a foliation plane striking northeast-southwest and dipping 35°-40° to the east-southeast. The second and third joint sets are almost perpendicular and parallel to the mountainside and scarp, respectively. These three joint sets form slices of rock columns with widths of 7-10 m and heights of 400-450 m. The joints in set II are reported to be open by 1-2 m, which may allow water to collect during heavy rainfall or snowmelt, causing the slices to be pressed out. It is estimated that water in the vertical joints both reduces the shear strength of the sliding plane and reduces the normal stress on it through the formation of uplift forces. Hence, rock slides on Ramnefjell Mountain occur in a plane-failure mode. The quantitative evaluation of rock slide risk requires probabilistic analysis of rock slope stability and identification of the consequences if a rock slide occurs. In this study, the failure probability of a rock slice is evaluated by the first-order reliability method (FORM). Then, to use the calculated probability of failure (Pf) in risk analyses, this Pf must be associated with frequency-based probabilities (i.e., Pf/year), since a computed failure probability is a measure of hazard, not of risk, unless it is associated with the consequences of the failure. This can be done either by considering the time-dependent behavior of the basic variables in the probabilistic models or by associating the computed Pf with the frequency of failures in the region. In this study, the frequency of previous rock slides in the previous century in
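
    To make the hazard step concrete, the sketch below estimates the failure probability of a single rock slice in plane-failure mode by Monte Carlo simulation of a limit-equilibrium safety factor; crude Monte Carlo stands in for FORM, and every numerical value is hypothetical rather than taken from the Ramnefjell analysis:

        import numpy as np

        rng = np.random.default_rng(0)
        n = 200_000
        beta = np.radians(38.0)              # dip of the sliding plane
        W, A = 9.0e3, 1.2e2                  # slice weight (kN), plane area (m2)

        phi = np.radians(rng.normal(35.0, 3.0, n))  # friction angle
        c = rng.normal(50.0, 15.0, n)               # cohesion (kPa)
        ru = rng.uniform(0.0, 0.4, n)               # pore-pressure ratio

        normal_eff = W * np.cos(beta) * (1.0 - ru)  # effective normal force
        fs = (c * A + normal_eff * np.tan(phi)) / (W * np.sin(beta))
        print("Pf =", np.mean(fs < 1.0))            # probability of failure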

  11. High resolution quantitative phase imaging of live cells with constrained optimization approach

    NASA Astrophysics Data System (ADS)

    Pandiyan, Vimal Prabhu; Khare, Kedar; John, Renu

    2016-03-01

    Quantitative phase imaging (QPI) aims to study weakly scattering and absorbing biological specimens with subwavelength accuracy, without any external staining mechanisms. The use of a reference beam at an angle is one of the necessary conditions for recording high-resolution holograms in most interferometric methods used for quantitative phase imaging. The spatial separation of the dc and twin images is determined by the reference beam angle, and the Fourier-filtered reconstructed image will have very poor resolution if the hologram is recorded below a minimum reference-angle condition. However, it is inconvenient to maintain a large reference beam angle while performing high-resolution microscopy of live cells and biological specimens with nanometric features. In this paper, we treat the reconstruction of digital holographic microscopy images as a constrained optimization problem with a smoothness constraint, in order to recover the complex object field in the hologram plane even with overlapping dc and twin-image terms. We solve this optimization problem iteratively by a gradient-descent approach, and the smoothness constraint is implemented by spatial averaging with an appropriate window size. This approach gives excellent high-resolution image recovery compared with Fourier filtering while keeping a very small reference angle. We demonstrate the approach on digital holographic microscopy of live cells by recovering their quantitative phase from a hologram recorded with a nearly zero reference angle.
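
    A minimal sketch of the optimization loop, assuming the hologram model I = |R + O|^2 and implementing the smoothness constraint as the spatial averaging described above; step size, window size, and iteration count are arbitrary choices:

        import numpy as np
        from scipy.ndimage import uniform_filter

        def reconstruct(hologram, ref, n_iter=200, step=1e-3, window=5):
            # recover the complex object field O from I = |R + O|^2
            O = np.zeros_like(ref, dtype=complex)
            for _ in range(n_iter):
                total = ref + O
                misfit = np.abs(total) ** 2 - hologram
                grad = 2.0 * misfit * total   # Wirtinger gradient of data term
                O = O - step * grad
                # smoothness constraint: local averaging of the complex field
                O = (uniform_filter(O.real, window)
                     + 1j * uniform_filter(O.imag, window))
            return O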

  12. Quantitative assessment of key parameters in qualitative vulnerability methods applied in karst systems based on an integrated numerical modelling approach

    NASA Astrophysics Data System (ADS)

    Doummar, Joanna; Kassem, Assaad

    2017-04-01

    In the framework of a three-year PEER (USAID/NSF) funded project, flow in a karst system in Lebanon (Assal) dominated by snow and semi-arid conditions was simulated and successfully calibrated using an integrated numerical model (MIKE-She 2016) based on high-resolution input data and detailed catchment characterization. Point-source infiltration and fast flow pathways were simulated by a bypass function and a highly conductive lens, respectively. The approach consisted of identifying all the factors used in qualitative vulnerability methods (COP, EPIK, PI, DRASTIC, GOD) applied in karst systems and assessing their influence on recharge signals in the different hydrological karst compartments (atmosphere, unsaturated zone and saturated zone) based on the integrated numerical model. These parameters are usually assigned different weights according to their estimated impact on groundwater vulnerability. The aim of this work is to quantify the importance of each of these parameters and to outline parameters that are not accounted for in standard methods but that might play a role in the vulnerability of a system. The spatial distribution of the detailed evapotranspiration, infiltration, and recharge signals from atmosphere to unsaturated zone to saturated zone was compared and contrasted among different surface settings and under varying flow conditions (e.g., varying slopes, land cover, precipitation intensity, and soil properties, as well as point-source infiltration). Furthermore, a sensitivity analysis of individual or coupled major parameters allows their impact on recharge, and indirectly on vulnerability, to be quantified. The preliminary analysis yields a new methodology that accounts for most of the factors influencing vulnerability while refining the weights attributed to each of them, based on a quantitative approach.

  13. Quantitative DIC microscopy using an off-axis self-interference approach.

    PubMed

    Fu, Dan; Oh, Seungeun; Choi, Wonshik; Yamauchi, Toyohiko; Dorn, August; Yaqoob, Zahid; Dasari, Ramachandra R; Feld, Michael S

    2010-07-15

    Traditional Nomarski differential interference contrast (DIC) microscopy is a very powerful method for imaging unstained biological samples. However, one of its major limitations is the nonquantitative nature of the imaging. To overcome this problem, we developed a quantitative DIC microscopy method based on off-axis sample self-interference. A digital holography algorithm is applied to obtain quantitative phase gradients in orthogonal directions, which leads to a quantitative phase image through a spiral integration of the phase gradients. This method is simple to implement on any standard microscope without stringent requirements on polarization optics. Optical sectioning can be obtained through an enlarged illumination NA.
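
    The final step, recovering a phase map from two orthogonal gradient images, can be illustrated with a least-squares Fourier integrator (Frankot-Chellappa style) standing in for the paper's spiral integration:

        import numpy as np

        def integrate_gradients(gx, gy):
            # least-squares phase whose x/y derivatives best match gx, gy
            ny, nx = gx.shape
            fx = 2j * np.pi * np.fft.fftfreq(nx)
            fy = 2j * np.pi * np.fft.fftfreq(ny)
            FX, FY = np.meshgrid(fx, fy)
            denom = np.abs(FX) ** 2 + np.abs(FY) ** 2
            denom[0, 0] = 1.0          # DC term: absolute offset is arbitrary
            F = (np.conj(FX) * np.fft.fft2(gx)
                 + np.conj(FY) * np.fft.fft2(gy)) / denom
            return np.real(np.fft.ifft2(F))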

  14. Quantitative Outcomes of a One Health approach to Study Global Health Challenges.

    PubMed

    Falzon, Laura C; Lechner, Isabel; Chantziaras, Ilias; Collineau, Lucie; Courcoul, Aurélie; Filippitzi, Maria-Eleni; Laukkanen-Ninios, Riikka; Peroz, Carole; Pinto Ferreira, Jorge; Postma, Merel; Prestmo, Pia G; Phythian, Clare J; Sarno, Eleonora; Vanantwerpen, Gerty; Vergne, Timothée; Grindlay, Douglas J C; Brennan, Marnie L

    2018-03-01

    Having gained momentum in the last decade, the One Health initiative promotes a holistic approach to addressing complex global health issues. Before recommending its adoption to stakeholders, however, it is paramount to first compile quantitative evidence of the benefit of such an approach. The aim of this scoping review was to identify and summarize primary research that describes monetary and non-monetary outcomes following adoption of a One Health approach. An extensive literature search yielded a total of 42,167 references, of which 85 were included in the final analysis. The top two biotic health issues addressed in these studies were rabies and malaria; the top abiotic health issue was air pollution. Most studies described collaborations between human and animal disciplines (n = 42) or human and environmental disciplines (n = 41); commonly reported interventions included vector control and animal vaccination. Monetary outcomes were commonly expressed as cost-benefit or cost-utility ratios; non-monetary outcomes were described using disease frequency or disease burden measurements. The majority of the studies reported positive or partially positive outcomes. This paper illustrates the variety of health challenges that can be addressed using a One Health approach, and provides tangible quantitative measures that can be used to evaluate future implementations of the One Health approach.

  15. Comprehensive Quantitative Analysis on Privacy Leak Behavior

    PubMed Central

    Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan

    2013-01-01

    Privacy information is prone to being leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue in cyber security. However, existing approaches can only qualitatively analyze the privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on a Privacy Petri Net (PPN). In order to compare privacy leak behavior among different software applications, we further propose a comprehensive metric, the overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects. PMID:24066046

  16. Comprehensive quantitative analysis on privacy leak behavior.

    PubMed

    Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan

    2013-01-01

    Privacy information is prone to being leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue in cyber security. However, existing approaches can only qualitatively analyze the privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on a Privacy Petri Net (PPN). In order to compare privacy leak behavior among different software applications, we further propose a comprehensive metric, the overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects.

  17. A spectral approach for the quantitative description of cardiac collagen network from nonlinear optical imaging.

    PubMed

    Masè, Michela; Cristoforetti, Alessandro; Avogaro, Laura; Tessarolo, Francesco; Piccoli, Federico; Caola, Iole; Pederzolli, Carlo; Graffigna, Angelo; Ravelli, Flavia

    2015-01-01

    The assessment of collagen structure in cardiac pathology, such as atrial fibrillation (AF), is essential for a complete understanding of the disease. This paper introduces a novel methodology for the quantitative description of collagen network properties, based on the combination of nonlinear optical microscopy with a spectral approach to image processing and analysis. Second-harmonic generation (SHG) microscopy was applied to atrial tissue samples from cardiac surgery patients, providing label-free, selective visualization of the collagen structure. The spectral analysis framework, based on the 2D FFT, was applied to the SHG images, yielding a multiparametric description of collagen fiber orientation (angle and anisotropy indexes) and texture scale (dominant wavelength and peak dispersion indexes). A proof-of-concept application of the methodology demonstrated the capability of our approach to detect and quantify differences in the structural properties of the collagen network in AF versus sinus-rhythm patients. These results suggest the potential of our approach for the assessment of collagen properties in cardiac pathologies with a fibrotic structural component.
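
    The spectral descriptors can be illustrated on a single image: bin the 2D power spectrum by angle, take the dominant direction as the fiber angle (fibers run perpendicular to the spectral peak), and derive an anisotropy index from the spread of angular energy. The index below is a simplified stand-in for the paper's exact parameters:

        import numpy as np

        def fiber_orientation(img):
            spec = np.abs(np.fft.fftshift(np.fft.fft2(img - img.mean()))) ** 2
            ny, nx = spec.shape
            y, x = np.mgrid[-(ny // 2):ny - ny // 2, -(nx // 2):nx - nx // 2]
            angle = np.arctan2(y, x) % np.pi               # fold to [0, pi)
            bins = np.linspace(0.0, np.pi, 181)
            energy, _ = np.histogram(angle, bins=bins, weights=spec)
            centers = 0.5 * (bins[:-1] + bins[1:])
            peak = centers[energy.argmax()]
            fiber_angle = np.degrees(peak + np.pi / 2) % 180.0
            anisotropy = 1.0 - energy.min() / energy.max()  # 0 = isotropic
            return fiber_angle, anisotropy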

  18. The quantitation of buffering action II. Applications of the formal & general approach.

    PubMed

    Schmitt, Bernhard M

    2005-03-16

    The paradigm of "buffering" originated in acid-base physiology but was subsequently extended to other fields and is now used for a wide and diverse set of phenomena. In the preceding article, we presented a formal and general approach to the quantitation of buffering action. Here, we use that buffering concept for a systematic treatment of selected classical and other buffering phenomena. H+ buffering by weak acids and "self-buffering" in pure water represent "conservative buffered systems" whose analysis reveals buffering properties that contrast in important respects with classical textbook descriptions. The buffering of organ perfusion in the face of variable perfusion pressure (also termed "autoregulation") can be treated in terms of "non-conservative buffered systems", the general form of the concept. For the analysis of cytoplasmic Ca++ concentration transients (also termed "muffling"), we develop a related unit that is able to faithfully reflect the time-dependent quantitative aspect of buffering during the pre-steady-state period. Steady-state buffering is shown to represent the limiting case of time-dependent muffling, namely for infinitely long time intervals and infinitely small perturbations. Finally, our buffering concept provides a stringent definition of "buffering" at the level of systems and control theory, resulting in four absolute ratio scales for control performance that are suited to measuring disturbance rejection and setpoint tracking, in both their static and dynamic aspects. Our concept of buffering provides a powerful mathematical tool for the quantitation of buffering action in all its appearances.
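
    For the classical weak-acid case mentioned above, the standard textbook quantitation of H+ buffering is the Van Slyke buffer capacity; it is reproduced below purely as a reference point (this is the conventional formula, not the paper's generalized buffering measure):

        % buffer capacity of a monoprotic weak acid at total concentration C
        \beta = \frac{dC_b}{d\,\mathrm{pH}}
              = \ln 10 \left( \frac{K_a C [\mathrm{H}^+]}{(K_a + [\mathrm{H}^+])^2}
                + [\mathrm{H}^+] + \frac{K_w}{[\mathrm{H}^+]} \right)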

  19. Human Rights-Based Approaches to Mental Health

    PubMed Central

    Bradley, Valerie J.; Sahakian, Barbara J.

    2016-01-01

    The incidence of human rights violations in mental health care across nations has been described as a "global emergency" and an "unresolved global crisis." The relationship between mental health and human rights is complex and bidirectional. Human rights violations can negatively impact mental health. Conversely, respecting human rights can improve mental health. This article reviews cases where an explicitly human rights-based approach was used in mental health care settings. Although the included studies did not exhibit a high level of methodological rigor, the qualitative information obtained was considered useful and informative for future studies. All studies reviewed suggest that human rights-based approaches can lead to clinical improvements at relatively low costs. Human rights-based approaches should be utilized for legal and moral reasons, since human rights are fundamental pillars of justice and civilization. The fact that such approaches can contribute to positive therapeutic outcomes and, potentially, cost savings, is an additional reason for their implementation. However, the small sample size and lack of controlled, quantitative measures limit the strength of conclusions drawn from the included studies. More objective, high-quality research is needed to ascertain the true extent of benefits to service users and providers. PMID:27781015

  20. Temporal Data Set Reduction Based on D-Optimality for Quantitative FLIM-FRET Imaging.

    PubMed

    Omer, Travis; Intes, Xavier; Hahn, Juergen

    2015-01-01

    Fluorescence lifetime imaging (FLIM), when paired with Förster resonance energy transfer (FLIM-FRET), enables the monitoring of nanoscale interactions in living biological samples. FLIM-FRET model-based estimation methods allow the quantitative retrieval of parameters such as the quenched (interacting) and unquenched (non-interacting) fractional populations of the donor fluorophore and/or the distance of the interactions. The quantitative accuracy of such model-based approaches depends on multiple factors, such as the signal-to-noise ratio and the number of temporal points acquired when sampling the fluorescence decays. For high-throughput or in vivo applications of FLIM-FRET, it is desirable to acquire a limited number of temporal points for fast acquisition times. Yet it is critical to acquire temporal data sets with sufficient information content to allow for accurate FLIM-FRET parameter estimation. Herein, an optimal experimental design approach based on sensitivity analysis is presented in order to identify the time points that provide the best quantitative estimates of the parameters for a given number of temporal sampling points. More specifically, the D-optimality criterion is employed to identify, within a sparse temporal data set, the set of time points leading to optimal estimation of the quenched fractional population of the donor fluorophore. Overall, a reduced set of 10 time points (compared with a typical complete set of 90 time points) was found to have minimal impact on parameter estimation accuracy (≈5%), with in silico and in vivo experimental validation. This reduction of the number of needed time points by almost an order of magnitude allows the use of FLIM-FRET for certain high-throughput applications that would be infeasible if the entire set of time sampling points were used.
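
    A minimal sketch of D-optimal time-point selection for a biexponential FLIM-FRET decay: greedily add the time point that most increases det(J^T J), where J is the sensitivity (Jacobian) matrix of the decay model with respect to its parameters. The model form and parameter values are invented for illustration:

        import numpy as np

        def jacobian(t, a1=0.4, tau1=0.4, tau2=2.5):
            # sensitivities of y = a1*exp(-t/tau1) + (1 - a1)*exp(-t/tau2)
            e1, e2 = np.exp(-t / tau1), np.exp(-t / tau2)
            return np.column_stack([e1 - e2,
                                    a1 * t / tau1**2 * e1,
                                    (1 - a1) * t / tau2**2 * e2])

        def d_optimal_subset(t_full, k):
            J = jacobian(t_full)
            chosen = []
            for _ in range(k):
                dets = [np.linalg.det(J[chosen + [i]].T @ J[chosen + [i]]
                                      + 1e-12 * np.eye(3))
                        if i not in chosen else -np.inf
                        for i in range(len(t_full))]
                chosen.append(int(np.argmax(dets)))
            return np.sort(t_full[chosen])

        print(d_optimal_subset(np.linspace(0.05, 9.0, 90), 10))  # 10 of 90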

  1. Extending Theory-Based Quantitative Predictions to New Health Behaviors.

    PubMed

    Brick, Leslie Ann D; Velicer, Wayne F; Redding, Colleen A; Rossi, Joseph S; Prochaska, James O

    2016-04-01

    Traditional null hypothesis significance testing suffers many limitations and is poorly adapted to theory testing. A proposed alternative approach, called Testing Theory-based Quantitative Predictions, uses effect size estimates and confidence intervals to directly test predictions based on theory. This paper replicates findings from previous smoking studies and extends the approach to diet and sun protection behaviors using baseline data from a Transtheoretical Model behavioral intervention (N = 5407). Effect size predictions were developed using two methods: (1) applying refined effect size estimates from previous smoking research or (2) using predictions developed by an expert panel. Thirteen of 15 predictions were confirmed for smoking. For diet, 7 of 14 predictions were confirmed using smoking predictions and 6 of 16 using expert panel predictions. For sun protection, 3 of 11 predictions were confirmed using smoking predictions and 5 of 19 using expert panel predictions. Expert panel predictions and smoking-based predictions predicted effect sizes poorly for diet and sun protection constructs. Future studies should aim to use previous empirical data to generate predictions whenever possible. The best results occur when there have been several iterations of predictions for a behavior, as with smoking, demonstrating that expected values begin to converge on the population effect size. Overall, the study supports the necessity of strengthening and revising theory with empirical data.

  2. The Adoption Process of Ricefield-Based Fish Seed Production in Northwest Bangladesh: An Understanding through Quantitative and Qualitative Investigation

    ERIC Educational Resources Information Center

    Haque, Mohammad Mahfujul; Little, David C.; Barman, Benoy K.; Wahab, Md. Abdul

    2010-01-01

    Purpose: The purpose of the study was to understand the adoption process of ricefield based fish seed production (RBFSP) that has been developed, promoted and established in Northwest Bangladesh. Design/Methodology/Approach: Quantitative investigation based on regression analysis and qualitative investigation using semi-structured interview were…

  3. Project-Based Learning in Undergraduate Environmental Chemistry Laboratory: Using EPA Methods to Guide Student Method Development for Pesticide Quantitation

    ERIC Educational Resources Information Center

    Davis, Eric J.; Pauls, Steve; Dick, Jonathan

    2017-01-01

    Presented is a project-based learning (PBL) laboratory approach for an upper-division environmental chemistry or quantitative analysis course. In this work, a combined laboratory class of 11 environmental chemistry students developed a method based on published EPA methods for the extraction of dichlorodiphenyltrichloroethane (DDT) and its…

  4. Synthesising quantitative and qualitative research in evidence-based patient information.

    PubMed

    Goldsmith, Megan R; Bankhead, Clare R; Austoker, Joan

    2007-03-01

    Systematic reviews have, in the past, focused on quantitative studies and clinical effectiveness, while excluding qualitative evidence. Qualitative research can inform evidence-based practice independently of other research methodologies but methods for the synthesis of such data are currently evolving. Synthesising quantitative and qualitative research in a single review is an important methodological challenge. This paper describes the review methods developed and the difficulties encountered during the process of updating a systematic review of evidence to inform guidelines for the content of patient information related to cervical screening. Systematic searches of 12 electronic databases (January 1996 to July 2004) were conducted. Studies that evaluated the content of information provided to women about cervical screening or that addressed women's information needs were assessed for inclusion. A data extraction form and quality assessment criteria were developed from published resources. A non-quantitative synthesis was conducted and a tabular evidence profile for each important outcome (eg "explain what the test involves") was prepared. The overall quality of evidence for each outcome was then assessed using an approach published by the GRADE working group, which was adapted to suit the review questions and modified to include qualitative research evidence. Quantitative and qualitative studies were considered separately for every outcome. 32 papers were included in the systematic review following data extraction and assessment of methodological quality. The review questions were best answered by evidence from a range of data sources. The inclusion of qualitative research, which was often highly relevant and specific to many components of the screening information materials, enabled the production of a set of recommendations that will directly affect policy within the NHS Cervical Screening Programme. A practical example is provided of how quantitative and

  5. Synthesising quantitative and qualitative research in evidence‐based patient information

    PubMed Central

    Goldsmith, Megan R; Bankhead, Clare R; Austoker, Joan

    2007-01-01

    Background Systematic reviews have, in the past, focused on quantitative studies and clinical effectiveness, while excluding qualitative evidence. Qualitative research can inform evidence‐based practice independently of other research methodologies but methods for the synthesis of such data are currently evolving. Synthesising quantitative and qualitative research in a single review is an important methodological challenge. Aims This paper describes the review methods developed and the difficulties encountered during the process of updating a systematic review of evidence to inform guidelines for the content of patient information related to cervical screening. Methods Systematic searches of 12 electronic databases (January 1996 to July 2004) were conducted. Studies that evaluated the content of information provided to women about cervical screening or that addressed women's information needs were assessed for inclusion. A data extraction form and quality assessment criteria were developed from published resources. A non‐quantitative synthesis was conducted and a tabular evidence profile for each important outcome (eg “explain what the test involves”) was prepared. The overall quality of evidence for each outcome was then assessed using an approach published by the GRADE working group, which was adapted to suit the review questions and modified to include qualitative research evidence. Quantitative and qualitative studies were considered separately for every outcome. Results 32 papers were included in the systematic review following data extraction and assessment of methodological quality. The review questions were best answered by evidence from a range of data sources. The inclusion of qualitative research, which was often highly relevant and specific to many components of the screening information materials, enabled the production of a set of recommendations that will directly affect policy within the NHS Cervical Screening Programme. Conclusions A

  6. A Quantitative Proteomics Approach to Clinical Research with Non-Traditional Samples

    PubMed Central

    Licier, Rígel; Miranda, Eric; Serrano, Horacio

    2016-01-01

    The proper handling of samples to be analyzed by mass spectrometry (MS) can guarantee excellent results and a greater depth of analysis when working in quantitative proteomics. This is critical when trying to assess non-traditional sources such as ear wax, saliva, vitreous humor, aqueous humor, tears, nipple aspirate fluid, breast milk/colostrum, cervical-vaginal fluid, nasal secretions, bronchoalveolar lavage fluid, and stools. We intend to provide the investigator with relevant aspects of quantitative proteomics and to survey the most recent clinical research conducted with atypical samples and analyzed by quantitative proteomics. Taking as reference the most recent approaches used with non-traditional sources allows us to compare new strategies in the development of novel experimental models. On the other hand, these references help us to contribute significantly to the understanding of the proportions of proteins in different proteomes of clinical interest and may lead to potential advances in the emerging field of precision medicine. PMID:28248241

  7. A Quantitative Proteomics Approach to Clinical Research with Non-Traditional Samples.

    PubMed

    Licier, Rígel; Miranda, Eric; Serrano, Horacio

    2016-10-17

    The proper handling of samples to be analyzed by mass spectrometry (MS) can guarantee excellent results and a greater depth of analysis when working in quantitative proteomics. This is critical when trying to assess non-traditional sources such as ear wax, saliva, vitreous humor, aqueous humor, tears, nipple aspirate fluid, breast milk/colostrum, cervical-vaginal fluid, nasal secretions, bronchoalveolar lavage fluid, and stools. We intend to provide the investigator with relevant aspects of quantitative proteomics and to survey the most recent clinical research conducted with atypical samples and analyzed by quantitative proteomics. Taking as reference the most recent approaches used with non-traditional sources allows us to compare new strategies in the development of novel experimental models. On the other hand, these references help us to contribute significantly to the understanding of the proportions of proteins in different proteomes of clinical interest and may lead to potential advances in the emerging field of precision medicine.

  8. Quantitative gene-gene and gene-environment mapping for leaf shape variation using tree-based models.

    PubMed

    Fu, Guifang; Dai, Xiaotian; Symanzik, Jürgen; Bushman, Shaun

    2017-01-01

    Leaf shape traits have long been a focus of many disciplines, but the complex genetic and environmental interactive mechanisms regulating leaf shape variation have not yet been investigated in detail. The question of the respective roles of genes and environment and how they interact to modulate leaf shape is a thorny evolutionary problem, and sophisticated methodology is needed to address it. In this study, we investigated a framework-level approach that inputs shape image photographs and genetic and environmental data, and then outputs the relative importance ranks of all variables after integrating shape feature extraction, dimension reduction, and tree-based statistical models. The power of the proposed framework was confirmed by simulation and a Populus szechuanica var. tibetica data set. This new methodology resulted in the detection of novel shape characteristics, and also confirmed some previous findings. The quantitative modeling of a combination of polygenetic, plastic, epistatic, and gene-environment interactive effects, as investigated in this study, will improve the discernment of quantitative leaf shape characteristics, and the methods are ready to be applied to other leaf morphology data sets. Unlike the majority of approaches in the quantitative leaf shape literature, this framework-level approach is data-driven, without assuming any pre-known shape attributes, landmarks, or model structures. © 2016 The Authors. New Phytologist © 2016 New Phytologist Trust.
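
    The variable-ranking output can be imitated with an off-the-shelf tree ensemble: fit a shape feature (for example, a principal component of the leaf outline) on genetic and environmental predictors and read off the importances. Everything below, including the simulated epistatic and environmental effects, is invented:

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(0)
        X = rng.normal(size=(300, 6))   # 4 SNP dosages + 2 env covariates
        # epistatic effect of predictors 0 and 1, plus an environment effect
        shape_pc1 = X[:, 0] * X[:, 1] + 0.5 * X[:, 4] + rng.normal(0, 0.3, 300)

        rf = RandomForestRegressor(n_estimators=500, random_state=0)
        rf.fit(X, shape_pc1)
        print(np.argsort(rf.feature_importances_)[::-1])  # importance ranks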

  9. Quantitative learning strategies based on word networks

    NASA Astrophysics Data System (ADS)

    Zhao, Yue-Tian-Yi; Jia, Zi-Yang; Tang, Yong; Xiong, Jason Jie; Zhang, Yi-Cheng

    2018-02-01

    Learning English requires considerable effort, but the way vocabulary is introduced in textbooks is not optimized for learning efficiency. With the increasing population of English learners, optimizing the learning process will have a significant impact on English learning and teaching. Recent developments in big data analysis and complex network science provide additional opportunities to design and investigate strategies for English learning. In this paper, quantitative English learning strategies based on a word network and word-usage information are proposed. The strategies integrate word frequency with topological structural information. By analyzing the influence of connected learned words, we study the learning weights for unlearned words and the dynamic updating of the network. The results suggest that the quantitative strategies significantly improve learning efficiency while maintaining effectiveness. In particular, the optimized-weight-first strategy and the segmented strategies outperform the other strategies. The results provide opportunities for researchers and practitioners to reconsider the way English is taught and to design vocabularies quantitatively by balancing efficiency and learning costs based on the word network.
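
    The core of such a strategy fits in a few lines: weight each unlearned word by its corpus frequency boosted by how many of its network neighbors are already learned, pick the heaviest word, and update. The weighting rule and toy network below are hypothetical simplifications of the strategies studied:

        # toy word network: neighbors and corpus frequencies (invented)
        graph = {"make": {"do", "take"}, "do": {"make"},
                 "take": {"make", "bring"}, "bring": {"take"}}
        freq = {"make": 9.1, "do": 8.7, "take": 7.9, "bring": 5.2}
        learned = {"do"}

        def weight(word):
            # frequency boosted by connections to already-learned words
            return freq[word] * (1 + sum(n in learned for n in graph[word]))

        while len(learned) < len(graph):
            nxt = max((w for w in graph if w not in learned), key=weight)
            learned.add(nxt)          # dynamic update of the learned set
            print(nxt, end=" ")       # learning order under this strategy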

  10. Absolute quantitation of intracellular metabolite concentrations by an isotope ratio-based approach

    PubMed Central

    Bennett, Bryson D; Yuan, Jie; Kimball, Elizabeth H; Rabinowitz, Joshua D

    2009-01-01

    This protocol provides a method for quantitating the intracellular concentrations of endogenous metabolites in cultured cells. The cells are grown in stable isotope-labeled media to near-complete isotopic enrichment and then extracted in organic solvent containing unlabeled internal standards in known concentrations. The ratio of endogenous metabolite to internal standard in the extract is determined using mass spectrometry (MS). The product of this ratio and the unlabeled standard amount equals the amount of endogenous metabolite present in the cells. The cellular concentration of the metabolite can then be calculated on the basis of intracellular volume of the extracted cells. The protocol is exemplified using Escherichia coli and primary human fibroblasts fed uniformly with 13C-labeled carbon sources, with detection of 13C-assimilation by liquid chromatography–tandem MS. It enables absolute quantitation of several dozen metabolites over ~1 week of work. PMID:18714298
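
    The arithmetic at the heart of the protocol is a single ratio per metabolite. A worked sketch with invented numbers (note that pmol per microliter of cell volume equals micromolar):

        labeled_peak = 8.4e5     # signal of the 13C-labeled endogenous pool
        standard_peak = 2.1e5    # signal of the unlabeled internal standard
        standard_amount = 2.0    # pmol of standard spiked into the extract

        amount = (labeled_peak / standard_peak) * standard_amount  # 8.0 pmol
        cell_volume_ul = 4.0     # aggregate intracellular volume (invented)
        print(amount / cell_volume_ul, "uM")                       # 2.0 uM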

  11. Framework for quantitative evaluation of 3D vessel segmentation approaches using vascular phantoms in conjunction with 3D landmark localization and registration

    NASA Astrophysics Data System (ADS)

    Wörz, Stefan; Hoegen, Philipp; Liao, Wei; Müller-Eschner, Matthias; Kauczor, Hans-Ulrich; von Tengg-Kobligk, Hendrik; Rohr, Karl

    2016-03-01

    We introduce a framework for quantitative evaluation of 3D vessel segmentation approaches using vascular phantoms. Phantoms are designed using a CAD system and created with a 3D printer, and comprise realistic shapes including branches and pathologies such as abdominal aortic aneurysms (AAA). To transfer ground truth information to the 3D image coordinate system, we use a landmark-based registration scheme utilizing fiducial markers integrated in the phantom design. For accurate 3D localization of the markers we developed a novel 3D parametric intensity model that is directly fitted to the markers in the images. We also performed a quantitative evaluation of different vessel segmentation approaches for a phantom of an AAA.
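
    The landmark-based transfer of ground truth can be sketched as a least-squares rigid alignment (Kabsch algorithm) of the matched fiducial positions, with the phantom design coordinates and the fitted image coordinates as the two point sets:

        import numpy as np

        def rigid_register(src, dst):
            # rotation R and translation t minimizing ||R @ s + t - d||^2
            src_c, dst_c = src - src.mean(0), dst - dst.mean(0)
            U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
            D = np.diag([1.0, 1.0,
                         np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
            R = Vt.T @ D @ U.T
            return R, dst.mean(0) - R @ src.mean(0)

        # toy check: recover a known transform from four fiducials
        rng = np.random.default_rng(0)
        src = rng.normal(size=(4, 3))
        th = 0.3
        R_true = np.array([[np.cos(th), -np.sin(th), 0.0],
                           [np.sin(th),  np.cos(th), 0.0],
                           [0.0, 0.0, 1.0]])
        dst = src @ R_true.T + np.array([1.0, -2.0, 0.5])
        R, t = rigid_register(src, dst)
        print(np.allclose(R, R_true), np.round(t, 3))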

  12. Mixed qualitative and quantitative approach for validating an information booklet before total hip arthroplasty.

    PubMed

    Chabaud, Aurore; Eschalier, Bénédicte; Zullian, Myriam; Plan-Paquet, Anne; Aubreton, Sylvie; Saragaglia, Dominique; Descamps, Stéphane; Coudeyre, Emmanuel

    2018-05-01

    Providing patients with validated information before total hip arthroplasty may help lessen discrepancies between patients' expectations and the surgical result. This study sought to validate an information booklet for candidates for hip arthroplasty by using a mixed qualitative and quantitative approach based on a panel of patients and a sample of healthcare professionals. We developed a booklet in accordance with the standard methods and then conducted focus groups to collect the opinions of a sample of multidisciplinary experts involved in the care of patients with hip osteoarthritis. The number of focus groups and experts was determined according to the data saturation principle. A panel of patients awaiting hip arthroplasty or those in the immediate post-operative period assessed the booklet with self-reporting questionnaires (knowledge, beliefs, and expectations) and semi-structured interviews. All experts and both patient groups validated the booklet in terms of content and presentation. Semi-structured interviews were uninformative, especially for post-operative patients. Reading the booklet significantly (P<0.001) improved the knowledge scores in both groups, with no intergroup differences, but did not affect beliefs in either patient group. Only pre-operative patients significantly changed their expectations. Our mixed qualitative and quantitative approach allowed us to validate a booklet for patients awaiting hip arthroplasty, taking into account the opinions of both patients and healthcare professionals. Copyright © 2018 Elsevier Masson SAS. All rights reserved.

  13. Optimization of metabolite basis sets prior to quantitation in magnetic resonance spectroscopy: an approach based on quantum mechanics

    NASA Astrophysics Data System (ADS)

    Lazariev, A.; Allouche, A.-R.; Aubert-Frécon, M.; Fauvelle, F.; Piotto, M.; Elbayed, K.; Namer, I.-J.; van Ormondt, D.; Graveron-Demilly, D.

    2011-11-01

    High-resolution magic angle spinning (HRMAS) nuclear magnetic resonance (NMR) is playing an increasingly important role in diagnosis. This technique enables metabolite profiles of ex vivo pathological and healthy tissue to be established. The need to monitor diseases and pharmaceutical follow-up requires automatic quantitation of HRMAS 1H signals. However, for several metabolites, the chemical shift values of proton groups may differ slightly according to the micro-environment in the tissue or cells, in particular its pH. This hampers accurate estimation of metabolite concentrations, mainly when using quantitation algorithms based on a metabolite basis set: the metabolite fingerprints are no longer correct. In this work, we propose an accurate method coupling quantum mechanical simulations and quantitation algorithms to handle basis-set changes. The proposed algorithm automatically corrects mismatches between the signals of the simulated basis set and the signal under analysis by maximizing the normalized cross-correlation between the two, yielding optimized chemical shift values for the metabolites. This method, QM-QUEST, provides more robust fitting while limiting user involvement and preserves the correct metabolite fingerprints. Its efficiency is demonstrated by accurately quantitating 33 signals from tissue samples of human brains with oligodendroglioma, obtained at 11.7 tesla. The corresponding chemical shift changes of several metabolites within the series are also analyzed.
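
    The mismatch-correction step can be sketched for a single metabolite: slide the simulated spectrum along the chemical-shift axis and keep the lag that maximizes the normalized cross-correlation with the measured signal (a one-dimensional, single-metabolite simplification of the QM-QUEST optimization):

        import numpy as np

        def align_basis(basis, measured):
            b = (basis - basis.mean()) / basis.std()
            m = (measured - measured.mean()) / measured.std()
            xc = np.correlate(m, b, mode="full")      # all relative lags
            lag = int(xc.argmax()) - (len(b) - 1)     # best-matching shift
            return np.roll(basis, lag), lag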

  14. Models of Quantitative Estimations: Rule-Based and Exemplar-Based Processes Compared

    ERIC Educational Resources Information Center

    von Helversen, Bettina; Rieskamp, Jorg

    2009-01-01

    The cognitive processes underlying quantitative estimations vary. Past research has identified task-contingent changes between rule-based and exemplar-based processes (P. Juslin, L. Karlsson, & H. Olsson, 2008). B. von Helversen and J. Rieskamp (2008), however, proposed a simple rule-based model--the mapping model--that outperformed the…

  15. RGB color calibration for quantitative image analysis: the "3D thin-plate spline" warping approach.

    PubMed

    Menesatti, Paolo; Angelini, Claudio; Pallottino, Federico; Antonucci, Francesca; Aguzzi, Jacopo; Costa, Corrado

    2012-01-01

    In recent years, the need to define color numerically by its coordinates in n-dimensional space has grown strongly. Colorimetric calibration is fundamental in food processing and other biological disciplines for quantitatively comparing sample colors across devices during a workflow. Several software packages are available to perform standardized colorimetric procedures, but they are often too imprecise for scientific purposes. In this study, we applied the thin-plate spline interpolation algorithm to calibrate colors in sRGB space (the corresponding Matlab code is reported in the Appendix). This was compared with two other approaches, the first based on a commercial calibration system (ProfileMaker) and the second on partial least squares analysis. Moreover, to explore device variability and resolution, two different cameras were adopted, and for each sensor three consecutive pictures were acquired under four different light conditions. According to our results, the thin-plate spline approach achieved very high calibration accuracy, opening the way to in-field application of color quantification not only in food science but also in other biological disciplines. These results are of great importance for scientific color evaluation when lighting conditions are not controlled, and they allow the use of low-cost instruments while still returning scientifically sound quantitative data.
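
    For readers who prefer Python to the Matlab code in the paper's Appendix, the same warping idea can be sketched with SciPy's radial basis interpolator, which supports a thin-plate spline kernel. The synthetic chart values below are placeholders, not the study's data.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(0)
reference = rng.uniform(0, 255, size=(24, 3))            # target sRGB of 24 chart patches
measured = reference * 0.8 + rng.normal(0, 5, (24, 3))   # device-distorted readings

# Fit a 3-D thin-plate spline warp from device RGB to reference sRGB
warp = RBFInterpolator(measured, reference, kernel='thin_plate_spline')

pixels = rng.uniform(0, 255, size=(1000, 3))   # flattened image, one row per pixel
calibrated = warp(pixels)                      # color-corrected pixel values
```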

  16. An integrated approach coupling physically based models and probabilistic method to assess quantitatively landslide susceptibility at different scale: application to different geomorphological environments

    NASA Astrophysics Data System (ADS)

    Vandromme, Rosalie; Thiéry, Yannick; Sedan, Olivier; Bernardie, Séverine

    2016-04-01

    Landslide hazard assessment is the estimation of a target area where landslides of a particular type, volume, runout and intensity may occur within a given period. The first step in analyzing landslide hazard is assessing the spatial and temporal failure probability (when the information is available, i.e., susceptibility assessment). Two types of approach are generally recommended to achieve this goal: (i) qualitative approaches (i.e., inventory-based and knowledge-driven methods) and (ii) quantitative approaches (i.e., data-driven methods or deterministic, physically based methods). Among quantitative approaches, deterministic physically based methods (PBM) are generally used at local and/or site-specific scales (1:5,000-1:25,000 and >1:5,000, respectively). The main advantage of these methods is the calculation of the probability of failure (safety factor) under specific environmental conditions, and some models can integrate land use and climate change. Their major drawbacks are the large amount of reliable, detailed data they require (especially material types, thicknesses, and the heterogeneity of geotechnical parameters over a large area) and the fact that only shallow landslides are taken into account; this is why they are often used at site-specific scales (>1:5,000). Thus, to take into account (i) material heterogeneity, (ii) spatial variation of physical parameters, and (iii) different landslide types, the French Geological Survey (i.e., BRGM) has developed a physically based model (PBM) implemented in a GIS environment. This PBM couples a global hydrological model (GARDENIA®), including a transient unsaturated/saturated hydrological component, with a physically based model computing the stability of slopes (ALICE®, Assessment of Landslides Induced by Climatic Events) based on the Morgenstern-Price method for any slip surface. The variability of mechanical parameters is handled by a Monte Carlo approach. The
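
    The Monte Carlo treatment of parameter variability can be illustrated with the classic infinite-slope factor of safety, a much simpler stability model than the Morgenstern-Price method used in ALICE®; all parameter values below are invented for the example. Uncertain cohesion and friction angle are sampled from distributions, and the probability of failure is the fraction of realizations with FS < 1.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
beta = np.radians(30.0)             # slope angle
z, gamma, u = 2.0, 19.0, 0.0        # depth (m), unit weight (kN/m3), pore pressure (kPa)

c = rng.normal(8.0, 2.0, n)                  # cohesion c' (kPa), uncertain
phi = np.radians(rng.normal(28.0, 3.0, n))   # friction angle (deg), uncertain

# Infinite-slope factor of safety for each Monte Carlo realization
fs = (c + (gamma * z * np.cos(beta) ** 2 - u) * np.tan(phi)) / (
    gamma * z * np.sin(beta) * np.cos(beta))
print("probability of failure:", np.mean(fs < 1.0))
```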

  17. Classification of cassava genotypes based on qualitative and quantitative data.

    PubMed

    Oliveira, E J; Oliveira Filho, O S; Santos, V S

    2015-02-02

    We evaluated the genetic variation of cassava accessions based on qualitative (binomial and multicategorical) and quantitative (continuous) traits. We characterized 95 accessions obtained from the Cassava Germplasm Bank of Embrapa Mandioca e Fruticultura, evaluating them for 13 continuous, 10 binary, and 25 multicategorical traits. First, we analyzed the accessions based only on quantitative traits; next, we conducted a joint analysis (qualitative and quantitative traits) based on the Ward-MLM method, which performs clustering in two stages. According to the pseudo-F, pseudo-t2, and maximum likelihood criteria, we identified five groups based on the quantitative traits and four based on the joint analysis. The smaller number of groups identified by the joint analysis may be related to the nature of the data: quantitative data are more subject to environmental effects on phenotype expression, which can produce differentiation among accessions even in the absence of genetic differences. For most of the accessions, the maximum probability of classification was >0.90, independent of the traits analyzed, indicating a good fit of the clustering method. Differences in clustering according to the type of data imply that analyses of quantitative and qualitative traits in cassava germplasm may explore different genomic regions. Moreover, with the joint analysis, the means and ranges of genetic distances were high, indicating that the Ward-MLM method is very useful for clustering genotypes when several phenotypic traits are available, as in the case of genetic resources and breeding programs.
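
    The Ward-MLM procedure itself combines a mixed-data distance with maximum-likelihood group assignment; as a minimal stand-in for the quantitative-traits-only first stage, standard Ward clustering of standardized continuous traits looks like this (synthetic data, not the cassava measurements):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import zscore

rng = np.random.default_rng(1)
traits = rng.normal(size=(95, 13))                   # 95 accessions x 13 continuous traits

Z = linkage(zscore(traits, axis=0), method='ward')   # Ward's minimum-variance criterion
groups = fcluster(Z, t=5, criterion='maxclust')      # cut the dendrogram into 5 groups
print(np.bincount(groups)[1:])                       # group sizes
```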

  18. VIPAR, a quantitative approach to 3D histopathology applied to lymphatic malformations

    PubMed Central

    Hägerling, René; Drees, Dominik; Scherzinger, Aaron; Dierkes, Cathrin; Martin-Almedina, Silvia; Butz, Stefan; Gordon, Kristiana; Schäfers, Michael; Hinrichs, Klaus; Vestweber, Dietmar; Goerge, Tobias; Mansour, Sahar; Mortimer, Peter S.

    2017-01-01

    BACKGROUND. Lack of investigatory and diagnostic tools has been a major contributing factor in the failure to mechanistically understand lymphedema and other lymphatic disorders and to develop effective drug and surgical therapies. One difficulty has been understanding the true changes in lymph vessel pathology from standard 2D tissue sections. METHODS. VIPAR (volume information-based histopathological analysis by 3D reconstruction and data extraction), a light-sheet microscopy-based approach for the analysis of tissue biopsies, is based on digital reconstruction and visualization of microscopic image stacks. VIPAR allows semiautomated segmentation of the vasculature and subsequent nonbiased extraction of characteristic vessel shape and connectivity parameters. We applied VIPAR to analyze biopsies from healthy, lymphedematous, and lymphangiomatous skin. RESULTS. Digital 3D reconstruction provided a directly visually interpretable, comprehensive representation of the lymphatic and blood vessels in the analyzed tissue volumes. The most conspicuous features were disrupted lymphatic vessels in lymphedematous skin and hyperplasia (4.36-fold lymphatic vessel volume increase) in lymphangiomatous skin. Both abnormalities were detected by the connectivity analysis based on extracted vessel shape and structure data. The quantitative evaluation of extracted data revealed a significant reduction of lymphatic segment length (51.3% and 54.2%) and straightness (89.2% and 83.7%) for lymphedematous and lymphangiomatous skin, respectively. Blood vessel length was significantly increased in the lymphangiomatous sample (239.3%). CONCLUSION. VIPAR is a volume-based tissue reconstruction, data extraction, and analysis approach that successfully distinguished healthy from lymphedematous and lymphangiomatous skin. Its application is not limited to the vascular systems or skin. FUNDING. Max Planck Society, DFG (SFB 656), and Cells-in-Motion Cluster of Excellence EXC 1003. PMID

  19. VIPAR, a quantitative approach to 3D histopathology applied to lymphatic malformations.

    PubMed

    Hägerling, René; Drees, Dominik; Scherzinger, Aaron; Dierkes, Cathrin; Martin-Almedina, Silvia; Butz, Stefan; Gordon, Kristiana; Schäfers, Michael; Hinrichs, Klaus; Ostergaard, Pia; Vestweber, Dietmar; Goerge, Tobias; Mansour, Sahar; Jiang, Xiaoyi; Mortimer, Peter S; Kiefer, Friedemann

    2017-08-17

    Lack of investigatory and diagnostic tools has been a major contributing factor in the failure to mechanistically understand lymphedema and other lymphatic disorders and to develop effective drug and surgical therapies. One difficulty has been understanding the true changes in lymph vessel pathology from standard 2D tissue sections. VIPAR (volume information-based histopathological analysis by 3D reconstruction and data extraction), a light-sheet microscopy-based approach for the analysis of tissue biopsies, is based on digital reconstruction and visualization of microscopic image stacks. VIPAR allows semiautomated segmentation of the vasculature and subsequent nonbiased extraction of characteristic vessel shape and connectivity parameters. We applied VIPAR to analyze biopsies from healthy, lymphedematous, and lymphangiomatous skin. Digital 3D reconstruction provided a directly visually interpretable, comprehensive representation of the lymphatic and blood vessels in the analyzed tissue volumes. The most conspicuous features were disrupted lymphatic vessels in lymphedematous skin and hyperplasia (4.36-fold lymphatic vessel volume increase) in lymphangiomatous skin. Both abnormalities were detected by the connectivity analysis based on extracted vessel shape and structure data. The quantitative evaluation of extracted data revealed a significant reduction of lymphatic segment length (51.3% and 54.2%) and straightness (89.2% and 83.7%) for lymphedematous and lymphangiomatous skin, respectively. Blood vessel length was significantly increased in the lymphangiomatous sample (239.3%). VIPAR is a volume-based tissue reconstruction, data extraction, and analysis approach that successfully distinguished healthy from lymphedematous and lymphangiomatous skin. Its application is not limited to the vascular systems or skin. Max Planck Society, DFG (SFB 656), and Cells-in-Motion Cluster of Excellence EXC 1003.

  20. Qualitative and quantitative approaches in the dose-response assessment of genotoxic carcinogens.

    PubMed

    Fukushima, Shoji; Gi, Min; Kakehashi, Anna; Wanibuchi, Hideki; Matsumoto, Michiharu

    2016-05-01

    Qualitative and quantitative approaches are important issues in the field of carcinogenic risk assessment of genotoxic carcinogens. Herein, we provide quantitative data from low-dose hepatocarcinogenicity studies of three genotoxic hepatocarcinogens: 2-amino-3,8-dimethylimidazo[4,5-f]quinoxaline (MeIQx), 2-amino-3-methylimidazo[4,5-f]quinoline (IQ) and N-nitrosodiethylamine (DEN). Hepatocarcinogenicity was examined by quantitative analysis of glutathione S-transferase placental form (GST-P) positive foci, which are the preneoplastic lesions in rat hepatocarcinogenesis and the endpoint carcinogenic marker in the rat liver medium-term carcinogenicity bioassay. We also examined DNA damage and gene mutations occurring during the initiation stage of carcinogenesis. To establish points of departure (PoD) from which cancer-related risk can be estimated, we analyzed the above events by quantitative no-observed-effect level and benchmark dose approaches. MeIQx at low doses induced formation of DNA-MeIQx adducts; somewhat higher doses caused elevation of 8-hydroxy-2'-deoxyguanosine levels; at still higher doses gene mutations occurred; and the highest dose induced formation of GST-P positive foci. These data show the expected trend of lower PoDs for earlier events in the carcinogenic process. Similarly, only the highest dose of IQ caused an increase in the number of GST-P positive foci in the liver, while IQ-DNA adduct formation was observed at low doses. Moreover, treatment with DEN at low doses had no effect on the development of GST-P positive foci in the liver. These data on PoDs for the markers help clarify whether genotoxic carcinogens have a threshold for their carcinogenicity. The most appropriate approach to low dose-response assessment must be chosen on the basis of scientific judgment. © The Author 2015. Published by Oxford University Press on behalf of

  1. Portable smartphone based quantitative phase microscope

    NASA Astrophysics Data System (ADS)

    Meng, Xin; Tian, Xiaolin; Yu, Wei; Kong, Yan; Jiang, Zhilong; Liu, Fei; Xue, Liang; Liu, Cheng; Wang, Shouyu

    2018-01-01

    To realize a portable device with high-contrast imaging capability, we designed a smartphone-based quantitative phase microscope using the transport of intensity equation method. The system employs an objective and an eyepiece as the imaging system and a cost-effective LED as the illumination source; a 3-D printed cradle aligns these components. Images of different focal planes are captured by manual focusing, and the sample phase is then calculated by a self-developed Android application. To validate its accuracy, we first tested the device by measuring a random phase plate with known phases; red blood cell smears, Pap smears, broad bean epidermis sections and monocot roots were then measured to demonstrate its performance. Owing to its accuracy, high contrast, cost-effectiveness and portability, the portable smartphone-based quantitative phase microscope is a promising tool for future adoption in remote healthcare and medical diagnosis.
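
    The transport of intensity equation behind the device can be solved with a standard FFT-based inverse Laplacian when the in-focus intensity is nearly uniform. The sketch below is that textbook variant, not the app's actual implementation; function and parameter names are illustrative.

```python
import numpy as np

def tie_phase(I_minus, I_plus, dz, wavelength, pixel_size):
    """Recover phase from two symmetrically defocused intensity images via
    the transport of intensity equation, assuming near-uniform intensity I0:
    dI/dz = -(I0/k) * laplacian(phi), inverted in Fourier space."""
    k = 2 * np.pi / wavelength
    I0 = 0.5 * (I_plus + I_minus).mean()          # near-uniform intensity assumption
    dIdz = (I_plus - I_minus) / (2 * dz)          # axial derivative from defocus pair
    ny, nx = dIdz.shape
    fx = 2 * np.pi * np.fft.fftfreq(nx, d=pixel_size)
    fy = 2 * np.pi * np.fft.fftfreq(ny, d=pixel_size)
    k2 = fx[None, :] ** 2 + fy[:, None] ** 2
    k2[0, 0] = 1.0                                # avoid division by zero at DC
    rhs = -k * dIdz / I0                          # this equals laplacian(phi)
    phi_hat = np.fft.fft2(rhs) / (-k2)            # inverse Laplacian in Fourier space
    phi_hat[0, 0] = 0.0                           # phase is defined up to a constant
    return np.real(np.fft.ifft2(phi_hat))
```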

  2. Approaches to advancing quantitative human health risk assessment of environmental chemicals in the post-genomic era.

    PubMed

    Chiu, Weihsueh A; Euling, Susan Y; Scott, Cheryl Siegel; Subramaniam, Ravi P

    2013-09-15

    The contribution of genomics and associated technologies to human health risk assessment for environmental chemicals has focused largely on elucidating mechanisms of toxicity, as discussed in other articles in this issue. However, there is interest in moving beyond hazard characterization to making more direct impacts on quantitative risk assessment (QRA)--i.e., the determination of toxicity values for setting exposure standards and cleanup values. We propose that the evolution of QRA of environmental chemicals in the post-genomic era will involve three, somewhat overlapping phases in which different types of approaches begin to mature. The initial focus (in Phase I) has been and continues to be on "augmentation" of weight of evidence--using genomic and related technologies qualitatively to increase the confidence in and scientific basis of the results of QRA. Efforts aimed towards "integration" of these data with traditional animal-based approaches, in particular quantitative predictors, or surrogates, for the in vivo toxicity data to which they have been anchored are just beginning to be explored now (in Phase II). In parallel, there is a recognized need for "expansion" of the use of established biomarkers of susceptibility or risk of human diseases and disorders for QRA, particularly for addressing the issues of cumulative assessment and population risk. Ultimately (in Phase III), substantial further advances could be realized by the development of novel molecular and pathway-based biomarkers and statistical and in silico models that build on anticipated progress in understanding the pathways of human diseases and disorders. Such efforts would facilitate a gradual "reorientation" of QRA towards approaches that more directly link environmental exposures to human outcomes. Published by Elsevier Inc.

  3. Left atrial appendage segmentation and quantitative assisted diagnosis of atrial fibrillation based on fusion of temporal-spatial information.

    PubMed

    Jin, Cheng; Feng, Jianjiang; Wang, Lei; Yu, Heng; Liu, Jiang; Lu, Jiwen; Zhou, Jie

    2018-05-01

    In this paper, we present an approach for fast multi-phase segmentation of the left atrial appendage (LAA) and quantitative assisted diagnosis of atrial fibrillation (AF) based on 4D-CT data. We take full advantage of the temporal information to segment the moving LAA, using a parametric max-flow method and a graph-cut approach to build a 3-D model of each phase. To assist the diagnosis of AF, we calculate the volumes of the 3-D models and generate a "volume-phase" curve, from which we compute important dynamic metrics: the ejection fraction, filling flux, and emptying flux of the LAA's blood by volume. This approach yields more precise results than conventional approaches that calculate metrics by area, and allows quick analysis of LAA volume changes within a cardiac cycle. It may also provide insight into individual differences in LAA lesions. Furthermore, we apply support vector machines (SVMs) to achieve quantitative auto-diagnosis of AF by exploiting seven features derived from volume change ratios of the LAA, and perform multivariate logistic regression analysis for the risk of LAA thrombosis. The 100 cases used in this research were acquired on a Philips 256-iCT scanner. The experimental results demonstrate that our approach constructs the 3-D LAA geometries robustly compared to manual annotations, and reasonably infers that the LAA undergoes filling, emptying and re-filling, re-emptying within a cardiac cycle. This research offers a potential means of exploring various physiological functions of the LAA and quantitatively estimating the risk of stroke in patients with AF. Copyright © 2018 Elsevier Ltd. All rights reserved.
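
    The volume-based metrics are straightforward once per-phase volumes are available from the segmented 3-D models. With hypothetical numbers (one volume per CT phase), and using a simple peak-to-trough definition that may differ in detail from the paper's:

```python
import numpy as np

volumes = np.array([8.1, 9.4, 10.2, 9.0, 7.3, 6.0, 6.8, 7.7, 8.9, 9.8])  # mL per phase

v_max, v_min = volumes.max(), volumes.min()
ejection_fraction = (v_max - v_min) / v_max    # fraction of LAA volume expelled
print(f"LAA ejection fraction = {ejection_fraction:.1%}")
```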

  4. Quantitative genetic bases of anthocyanin variation in grape (Vitis vinifera L. ssp. sativa) berry: a quantitative trait locus to quantitative trait nucleotide integrated study.

    PubMed

    Fournier-Level, Alexandre; Le Cunff, Loïc; Gomez, Camila; Doligez, Agnès; Ageorges, Agnès; Roux, Catherine; Bertrand, Yves; Souquet, Jean-Marc; Cheynier, Véronique; This, Patrice

    2009-11-01

    The combination of QTL mapping studies of synthetic lines and association mapping studies of natural diversity represents an opportunity to throw light on the genetic basis of variation in quantitative traits. With the positional information provided by quantitative trait locus (QTL) mapping, which often yields wide intervals encompassing numerous genes, it is now feasible in completely sequenced genomes to directly target candidate genes likely to be responsible for the observed variation and to test their effects through association genetics. This approach was applied in grape, a newly sequenced genome, to decipher the genetic architecture of anthocyanin content. Grapes may be either white or colored, ranging from the lightest pink to the darkest purple tones according to the amount of anthocyanin accumulated in the berry skin, which is a crucial trait for both wine quality and human nutrition. Although the genetic determinants of the white phenotype have been fully identified, the genetic basis of the quantitative variation of anthocyanin content in berry skin remains unclear. A single QTL responsible for up to 62% of the variation in anthocyanin content was mapped in a Syrah x Grenache F(1) pseudo-testcross. Among the 68 unigenes identified in the grape genome within the QTL interval, a cluster of four Myb-type genes was selected on the basis of physiological evidence (VvMybA1, VvMybA2, VvMybA3, and VvMybA4). In a core collection of natural resources (141 individuals), 32 polymorphisms revealed significant association, and extended linkage disequilibrium was observed. Using a multivariate regression method, we demonstrated that five polymorphisms in the VvMybA genes other than VvMybA4 (one retrotransposon, three single nucleotide polymorphisms and one 2-bp insertion/deletion) accounted for 84% of the observed variation. All these polymorphisms led to either structural changes in the MYB proteins or differences in the VvMybA promoters. We concluded that

  5. Theoretical foundations for a quantitative approach to paleogenetics. I, II.

    NASA Technical Reports Server (NTRS)

    Holmquist, R.

    1972-01-01

    It is shown that neglecting the phenomena of multiple hits, back mutation, and chance coincidence can introduce errors larger than 100% into the calculated value of the average number of nucleotide base differences to be expected between two homologous polynucleotides. Mathematical formulas are derived to correct quantitatively for these effects. It is pointed out that the effects materially change the quantitative aspects of phylogenies, such as the branch lengths of the trees. A number of problems are solved without approximation.
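
    The abstract does not reproduce Holmquist's formulas, but the standard Jukes-Cantor correction addresses the same effects (multiple hits and back mutation) and illustrates how large the bias can get:

```python
import math

def jukes_cantor(p):
    """Corrected substitutions per site from an observed fraction p of
    differing sites, accounting for multiple hits and back mutation:
    d = -(3/4) * ln(1 - 4p/3)."""
    if not 0 <= p < 0.75:
        raise ValueError("JC correction requires 0 <= p < 0.75")
    return -0.75 * math.log(1.0 - 4.0 * p / 3.0)

# An observed 45% difference corresponds to ~0.69 substitutions per site:
# taking p at face value would underestimate divergence by about a third.
print(jukes_cantor(0.45))
```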

  6. Understanding quantitative research: part 1.

    PubMed

    Hoe, Juanita; Hoare, Zoë

    This article, which is the first in a two-part series, provides an introduction to understanding quantitative research, basic statistics and terminology used in research articles. Critical appraisal of research articles is essential to ensure that nurses remain up to date with evidence-based practice to provide consistent and high-quality nursing care. This article focuses on developing critical appraisal skills and understanding the use and implications of different quantitative approaches to research. Part two of this article will focus on explaining common statistical terms and the presentation of statistical data in quantitative research.

  7. Multiplexed and Microparticle-based Analyses: Quantitative Tools for the Large-Scale Analysis of Biological Systems

    PubMed Central

    Nolan, John P.; Mandy, Francis

    2008-01-01

    While the term flow cytometry refers to the measurement of cells, the approach of making sensitive multiparameter optical measurements in a flowing sample stream is a very general analytical approach. The past few years have seen an explosion in the application of flow cytometry technology to molecular analyses and measurements using microparticles as solid supports. While microsphere-based molecular analyses using flow cytometry date back three decades, the need for highly parallel quantitative molecular measurements arising from various genomic and proteomic advances has driven the development of particle encoding technology to enable highly multiplexed assays. Multiplexed particle-based immunoassays are now commonplace, and new assays to study genes, protein function, and molecular assembly are emerging. Numerous efforts are underway to extend the multiplexing capabilities of microparticle-based assays through new approaches to particle encoding and analyte reporting. The impact of these developments will be seen in basic research and clinical laboratories, as well as in drug development. PMID:16604537

  8. 'Stories' or 'snapshots'? A study directed at comparing qualitative and quantitative approaches to curriculum evaluation.

    PubMed

    Pateman, B; Jinks, A M

    1999-01-01

    The focus of this paper is a study designed to explore the validity of quantitative approaches to student evaluation in a pre-registration degree programme. As managers of the students' education, we were concerned that the quantitative method, which used lecturer-devised criteria, might not fully represent students' views. The approach taken is a process-type strategy for curriculum evaluation, as described by Parlett and Hamilton (1972). The aim of the study is to produce illuminative data, or students' 'stories' of their educational experiences, through semi-structured interviews. The results are then compared to the current quantitative measurement tools designed to obtain 'snapshots' of the educational effectiveness of the curriculum, which use Likert-scale measurements of teacher-devised criterion statements. The study yielded a rich source of qualitative data that can be used to inform future curriculum development. However, complete validation of the current quantitative instruments was not achieved in this study: student and teacher agendas with respect to important issues pertaining to the course programme were found to differ. Limitations of the study are given, and the options open to the management team with regard to future development of curriculum evaluation systems are discussed.

  9. Comparative Effectiveness of Context-Based and Traditional Approaches in Teaching Genetics: Student Views and Achievement

    ERIC Educational Resources Information Center

    Kazeni, Monde; Onwu, Gilbert

    2013-01-01

    The study aimed to determine the comparative effectiveness of context-based and traditional teaching approaches in enhancing student achievement in genetics, problem-solving, science inquiry and decision-making skills, and attitude towards the study of life sciences. A mixed method but essentially quantitative research approach involving a…

  10. Quantitative analysis of breast cancer diagnosis using a probabilistic modelling approach.

    PubMed

    Liu, Shuo; Zeng, Jinshu; Gong, Huizhou; Yang, Hongqin; Zhai, Jia; Cao, Yi; Liu, Junxiu; Luo, Yuling; Li, Yuhua; Maguire, Liam; Ding, Xuemei

    2018-01-01

    Breast cancer is the most prevalent cancer in women in most countries of the world. Many computer-aided diagnostic methods have been proposed, but there are few studies on the quantitative discovery of probabilistic dependencies among breast cancer data features and on identifying the contribution of each feature to breast cancer diagnosis. This study aims to fill this void by utilizing a Bayesian network (BN) modelling approach. A K2 learning algorithm and statistical computation methods are used to construct the BN structure and assess the obtained model. The data used in this study were collected from a clinical ultrasound dataset from a local Chinese hospital and a fine-needle aspiration cytology (FNAC) dataset from the UCI machine learning repository. Our study suggests that, for the ultrasound data, cell shape is the most significant feature for breast cancer diagnosis, and the resistance index presents a strong probabilistic dependency on blood signals. For the FNAC data, bare nuclei are the most important feature for discriminating malignant from benign breast tumours, and uniformity of cell size and uniformity of cell shape are tightly interdependent. The BN modelling approach can support clinicians in making diagnostic decisions based on the significant features identified by the model, especially when other features are missing for specific patients. The approach is also applicable to other healthcare data analytics and data modelling for disease diagnosis. Copyright © 2017 Elsevier Ltd. All rights reserved.
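
    A score-based BN structure search in the spirit of the study can be sketched with the pgmpy library. Note two assumptions: the import names (HillClimbSearch, K2Score) follow pgmpy's API as we know it, and the original K2 algorithm also requires a variable ordering, for which greedy hill climbing with a K2 score is a common stand-in. The toy features below are ours, not the clinical datasets.

```python
import pandas as pd
from pgmpy.estimators import HillClimbSearch, K2Score

# Hypothetical discretized ultrasound-style features plus a diagnosis label
df = pd.DataFrame({
    "cell_shape":     [0, 1, 1, 0, 1, 0, 1, 1],
    "blood_signal":   [1, 1, 0, 0, 1, 0, 1, 0],
    "resistance_idx": [1, 1, 0, 0, 1, 0, 1, 1],
    "diagnosis":      [1, 1, 0, 0, 1, 0, 1, 1],
})

# Greedy hill-climbing over DAGs scored with the K2 metric
dag = HillClimbSearch(df).estimate(scoring_method=K2Score(df))
print(sorted(dag.edges()))
```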

  11. Liquid Chromatography-Mass Spectrometry-based Quantitative Proteomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, Fang; Liu, Tao; Qian, Weijun

    2011-07-22

    Liquid chromatography-mass spectrometry (LC-MS)-based quantitative proteomics is increasingly applied to a broad range of biological problems, owing to its growing capabilities for broad proteome coverage and good quantification accuracy. Herein, we review current LC-MS-based quantification methods with respect to their advantages and limitations, and highlight their potential applications.

  12. Affordable, automatic quantitative fall risk assessment based on clinical balance scales and Kinect data.

    PubMed

    Colagiorgio, P; Romano, F; Sardi, F; Moraschini, M; Sozzi, A; Bejor, M; Ricevuti, G; Buizza, A; Ramat, S

    2014-01-01

    The problem of correct fall risk assessment is becoming increasingly critical with the ageing of the population. In spite of available approaches allowing quantitative analysis of the performance of the human movement control system, the clinical assessment and diagnostic approach to fall risk still relies mostly on non-quantitative exams, such as clinical scales. This work documents our current effort to develop a novel method for assessing balance control abilities through a system implementing automatic evaluation of exercises drawn from balance assessment scales. Our aim is to overcome the classical limits of these scales, i.e., limited granularity and limited inter-/intra-examiner reliability, and to obtain objective scores and more detailed information allowing fall risk to be predicted. We used Microsoft Kinect to record subjects' movements while they performed challenging exercises drawn from clinical balance scales. We then computed a set of parameters quantifying the execution of the exercises and fed them to a supervised classifier to perform classification based on the clinical score. We obtained good accuracy (~82%) and especially high sensitivity (~83%).
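
    The abstract does not name the supervised classifier, so the pipeline below uses a generic choice (an RBF support vector machine on standardized features) purely to illustrate the classification step; the feature matrix is random placeholder data, not Kinect measurements.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 10))      # per-exercise movement parameters (placeholder)
y = rng.integers(0, 2, size=120)    # clinical-score class labels (placeholder)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```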

  13. A network-based approach for semi-quantitative knowledge mining and its application to yield variability

    NASA Astrophysics Data System (ADS)

    Schauberger, Bernhard; Rolinski, Susanne; Müller, Christoph

    2016-12-01

    Variability of crop yields is detrimental to food security, and under climate change its amplitude is likely to increase, so it is essential to understand the underlying causes and mechanisms. Crop models are the primary tool for projecting future changes in crop yields under climate change. A systematic overview of the drivers and mechanisms of crop yield variability (YV) can thus inform crop model development and improve understanding of climate change impacts on crop yields. Yet there is a vast body of literature on crop physiology and YV, which makes prioritizing mechanisms for implementation in models challenging. This paper therefore takes a novel approach to systematically mine and organize existing knowledge from the literature, with the aim of identifying important mechanisms lacking in models and so helping to set priorities in model improvement. We structure knowledge from the literature in a semi-quantitative network of complex interactions between growing conditions, plant physiology and crop yield, and we utilize the resulting network structure to assign relative importance to causes of YV and related plant physiological processes. As expected, our findings confirm existing knowledge, in particular the dominant role of temperature and precipitation, but they also highlight other important drivers of YV. More importantly, our method identifies the relevant physiological processes that transmit variability in growing conditions to variability in yield, providing explicit targets for the improvement of crop models. The network can additionally guide model development by outlining complex interactions between processes and by easily retrieving quantitative information for each of the 350 interactions. We show the validity of our network method as a structured, consistent and scalable dictionary of the literature; the method can easily be applied to many other research fields.
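
    A minimal version of such a semi-quantitative network can be held in a directed graph whose edge weights count supporting literature reports; node "strength" (weighted in- plus out-degree) then gives one simple importance proxy, though the paper's actual ranking scheme is more elaborate. All edges and weights below are illustrative.

```python
import networkx as nx

G = nx.DiGraph()
G.add_weighted_edges_from([          # (cause, effect, number of supporting reports)
    ("temperature", "photosynthesis", 12),
    ("precipitation", "soil_water", 9),
    ("soil_water", "photosynthesis", 7),
    ("photosynthesis", "biomass", 15),
    ("biomass", "yield", 20),
    ("temperature", "phenology", 6),
    ("phenology", "yield", 5),
])

# Rank nodes by total weighted connectivity (one simple importance proxy)
strength = {n: G.in_degree(n, weight="weight") + G.out_degree(n, weight="weight")
            for n in G}
for node, s in sorted(strength.items(), key=lambda kv: -kv[1]):
    print(f"{node:15s} {s}")
```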

  14. Optimized protocol for quantitative multiple reaction monitoring-based proteomic analysis of formalin-fixed, paraffin embedded tissues

    PubMed Central

    Kennedy, Jacob J.; Whiteaker, Jeffrey R.; Schoenherr, Regine M.; Yan, Ping; Allison, Kimberly; Shipley, Melissa; Lerch, Melissa; Hoofnagle, Andrew N.; Baird, Geoffrey Stuart; Paulovich, Amanda G.

    2016-01-01

    Despite a clinical, economic, and regulatory imperative to develop companion diagnostics, precious few new biomarkers have been successfully translated into clinical use, due in part to inadequate protein assay technologies to support large-scale testing of hundreds of candidate biomarkers in formalin-fixed paraffin embedded (FFPE) tissues. While the feasibility of using targeted, multiple reaction monitoring-mass spectrometry (MRM-MS) for quantitative analyses of FFPE tissues has been demonstrated, protocols have not been systematically optimized for robust quantification across a large number of analytes, nor has the performance of peptide immuno-MRM been evaluated. To address this gap, we used a test battery approach coupled to MRM-MS with the addition of stable isotope labeled standard peptides (targeting 512 analytes) to quantitatively evaluate the performance of three extraction protocols in combination with three trypsin digestion protocols (i.e. 9 processes). A process based on RapiGest buffer extraction and urea-based digestion was identified to enable similar quantitation results from FFPE and frozen tissues. Using the optimized protocols for MRM-based analysis of FFPE tissues, median precision was 11.4% (across 249 analytes). There was excellent correlation between measurements made on matched FFPE and frozen tissues, both for direct MRM analysis (R2 = 0.94) and immuno-MRM (R2 = 0.89). The optimized process enables highly reproducible, multiplex, standardizable, quantitative MRM in archival tissue specimens. PMID:27462933

  15. Quantitative and qualitative approaches to identifying migration chronology in a continental migrant

    USGS Publications Warehouse

    Beatty, William S.; Kesler, Dylan C.; Webb, Elisabeth B.; Raedeke, Andrew H.; Naylor, Luke W.; Humburg, Dale D.

    2013-01-01

    The degree to which extrinsic factors influence migration chronology in North American waterfowl has not been quantified, particularly for dabbling ducks. Previous studies have examined waterfowl migration using various methods; however, quantitative approaches to defining avian migration chronology over broad spatio-temporal scales are limited, and the implications of using different approaches have not been assessed. We used movement data from 19 adult female mallards (Anas platyrhynchos) equipped with solar-powered global positioning system satellite transmitters to evaluate two individual-level approaches for quantifying migration chronology. The first approach defined migration based on individual movements among geopolitical boundaries (state, provincial, international), whereas the second modeled net displacement as a function of time using nonlinear models. Differences in the migration chronologies identified by the two approaches were examined with analysis of variance. The geopolitical method identified mean autumn migration midpoints at 15 November 2010 and 13 November 2011, whereas the net displacement method identified midpoints at 15 November 2010 and 14 November 2011. The mean midpoints for spring migration were 3 April 2011 and 20 March 2012 using the geopolitical method and 31 March 2011 and 22 March 2012 using the net displacement method. The duration, initiation date, midpoint, and termination date for both autumn and spring migration did not differ between the two individual-level approaches. Although we did not detect differences in migration parameters between the approaches, the net displacement metric offers broad potential to address questions in movement ecology for migrating species. Ultimately, an objective definition of migration chronology will allow researchers to obtain a comprehensive understanding of the extrinsic factors that drive migration at the individual and population levels. As a result, targeted
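
    The net-displacement method lends itself to a compact implementation: fit a sigmoid to displacement versus time and read the migration midpoint off the inflection parameter. The sketch below uses a plain logistic curve on synthetic data; the study's nonlinear models may differ in form.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, L, k, t_mid):
    """Net displacement over time; t_mid is the migration midpoint (days)."""
    return L / (1.0 + np.exp(-k * (t - t_mid)))

days = np.arange(60.0)
rng = np.random.default_rng(3)
disp = logistic(days, 1200.0, 0.3, 30.0) + rng.normal(0, 30, days.size)  # km, synthetic

(L, k, t_mid), _ = curve_fit(logistic, days, disp, p0=[1000.0, 0.1, 25.0])
print(f"estimated migration midpoint: day {t_mid:.1f}")
```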

  16. Quantitative and qualitative approaches to identifying migration chronology in a continental migrant.

    PubMed

    Beatty, William S; Kesler, Dylan C; Webb, Elisabeth B; Raedeke, Andrew H; Naylor, Luke W; Humburg, Dale D

    2013-01-01

    The degree to which extrinsic factors influence migration chronology in North American waterfowl has not been quantified, particularly for dabbling ducks. Previous studies have examined waterfowl migration using various methods; however, quantitative approaches to defining avian migration chronology over broad spatio-temporal scales are limited, and the implications of using different approaches have not been assessed. We used movement data from 19 adult female mallards (Anas platyrhynchos) equipped with solar-powered global positioning system satellite transmitters to evaluate two individual-level approaches for quantifying migration chronology. The first approach defined migration based on individual movements among geopolitical boundaries (state, provincial, international), whereas the second modeled net displacement as a function of time using nonlinear models. Differences in the migration chronologies identified by the two approaches were examined with analysis of variance. The geopolitical method identified mean autumn migration midpoints at 15 November 2010 and 13 November 2011, whereas the net displacement method identified midpoints at 15 November 2010 and 14 November 2011. The mean midpoints for spring migration were 3 April 2011 and 20 March 2012 using the geopolitical method and 31 March 2011 and 22 March 2012 using the net displacement method. The duration, initiation date, midpoint, and termination date for both autumn and spring migration did not differ between the two individual-level approaches. Although we did not detect differences in migration parameters between the approaches, the net displacement metric offers broad potential to address questions in movement ecology for migrating species. Ultimately, an objective definition of migration chronology will allow researchers to obtain a comprehensive understanding of the extrinsic factors that drive migration at the individual and population levels. As a result, targeted

  17. Application of Person-Centered Approaches to Critical Quantitative Research: Exploring Inequities in College Financing Strategies

    ERIC Educational Resources Information Center

    Malcom-Piqueux, Lindsey

    2014-01-01

    This chapter discusses the utility of person-centered approaches to critical quantitative researchers. These techniques, which identify groups of individuals who share similar attributes, experiences, or outcomes, are contrasted with more commonly used variable-centered approaches. An illustrative example of a latent class analysis of the college…

  18. An adaptive model approach for quantitative wrist rigidity evaluation during deep brain stimulation surgery.

    PubMed

    Assis, Sofia; Costa, Pedro; Rosas, Maria Jose; Vaz, Rui; Silva Cunha, Joao Paulo

    2016-08-01

    Intraoperative evaluation of the efficacy of Deep Brain Stimulation includes evaluation of its effect on rigidity, for which a subjective semi-quantitative scale dependent on the examiner's perception and experience is used. A system was previously proposed to tackle this subjectivity, using quantitative data and providing real-time feedback of the computed rigidity reduction, thereby supporting the physician's decision. That system comprised a gyroscope-based motion sensor in a textile band placed on the patient's hand, which communicated its measurements to a laptop; the laptop computed a signal descriptor from the angular velocity of the hand during wrist flexion in DBS surgery. The first approach relied on a general rigidity-reduction model, regardless of the initial severity of the symptom. To enhance the performance of the previously presented system, we therefore aimed to develop separate models for high and low baseline rigidity, according to the examiner's assessment before any stimulation, allowing a more patient-oriented approach. Additionally, usability was improved by performing the processing in situ on a smartphone instead of a computer. The system has proven reliable, presenting an accuracy of 82.0% and a mean error of 3.4%. Relative to previous results, the performance was similar, further supporting the importance of considering cogwheel rigidity to better infer the reduction in rigidity. Overall, we present a simple, wearable, mobile system suitable for intraoperative conditions during DBS, supporting the physician in decision-making when setting stimulation parameters.

  19. Model-Based Linkage Analysis of a Quantitative Trait.

    PubMed

    Song, Yeunjoo E; Song, Sunah; Schnell, Audrey H

    2017-01-01

    Linkage analysis is a family-based method for examining whether any typed genetic markers cosegregate with a given trait, in this case a quantitative trait. If linkage exists, this is taken as evidence in support of a genetic basis for the trait. Historically, linkage analysis was performed using binary disease traits, but it has been extended to quantitative disease measures, which are desirable as they provide more information than binary traits. Linkage analysis can be performed using single-marker methods (one marker at a time) or multipoint methods (using multiple markers simultaneously). In model-based linkage analysis, the genetic model for the trait of interest is specified. There are many software options for performing linkage analysis; here we use the program package Statistical Analysis for Genetic Epidemiology (S.A.G.E.), chosen because it also includes programs to perform data cleaning procedures and to generate and test genetic models for a quantitative trait, in addition to performing linkage analysis. We demonstrate in detail the process of running LODLINK to perform single-marker analysis and MLOD to perform multipoint analysis, using output from SEGREG, which was used to determine the best-fitting statistical model for the trait.

  20. Comparison of quantitative myocardial perfusion imaging CT to fluorescent microsphere-based flow from high-resolution cryo-images

    NASA Astrophysics Data System (ADS)

    Eck, Brendan L.; Fahmi, Rachid; Levi, Jacob; Fares, Anas; Wu, Hao; Li, Yuemeng; Vembar, Mani; Dhanantwari, Amar; Bezerra, Hiram G.; Wilson, David L.

    2016-03-01

    Myocardial perfusion imaging using CT (MPI-CT) has the potential to provide quantitative measures of myocardial blood flow (MBF), which can aid the diagnosis of coronary artery disease. We evaluated the quantitative accuracy of MPI-CT in a porcine model of balloon-induced LAD coronary artery ischemia guided by fractional flow reserve (FFR). We quantified MBF at baseline (FFR = 1.0) and under moderate ischemia (FFR = 0.7) using MPI-CT and compared the results to fluorescent microsphere-based MBF from high-resolution cryo-images. Dynamic, contrast-enhanced CT images were obtained using a spectral detector CT (Philips Healthcare). Projection-based mono-energetic images were reconstructed and processed to obtain MBF. Three MBF quantification approaches were evaluated: singular value decomposition (SVD) with fixed Tikhonov regularization (ThSVD), SVD with regularization determined by the L-curve criterion (LSVD), and Johnson-Wilson parameter estimation (JW). All three approaches over-estimated MBF compared to cryo-images. JW produced the most accurate MBF, with an average error of 33.3 ± 19.2 mL/min/100g, whereas LSVD and ThSVD over-estimated more strongly, at 59.5 ± 28.3 mL/min/100g and 78.3 ± 25.6 mL/min/100g, respectively. Relative blood flow, assessed as the LAD-to-remote-myocardium flow ratio, was strongly correlated between JW and cryo-imaging, with R2 = 0.97, compared to R2 = 0.88 and 0.78 for LSVD and ThSVD, respectively. We inspected the tissue impulse response functions (IRFs) from each approach for sources of error. While JW was constrained to physiologic solutions, both LSVD and ThSVD produced IRFs with non-physiologic properties due to noise. The L-curve provided noise-adaptive regularization but did not eliminate non-physiologic IRF properties or optimize for MBF accuracy. These findings suggest that model-based MPI-CT approaches may be more appropriate for quantitative MBF estimation and that cryo-imaging can support the development of MPI-CT by providing spatial distributions of MBF.
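
    At the core of the SVD approaches is a regularized deconvolution of the tissue enhancement curve by the arterial input function (AIF). A truncated-SVD variant with a fixed relative singular-value cutoff (playing the role of Tikhonov-style regularization) can be sketched as follows; the threshold value and the unit conversion note are illustrative, not the paper's settings.

```python
import numpy as np

def mbf_truncated_svd(aif, tissue, dt, rel_thresh=0.2):
    """Deconvolve tissue = dt * conv(aif, flow-scaled IRF) via truncated SVD;
    MBF is taken as the peak of the recovered impulse response function."""
    n = len(aif)
    A = dt * np.array([[aif[i - j] if i >= j else 0.0 for j in range(n)]
                       for i in range(n)])               # lower-triangular convolution matrix
    U, s, Vt = np.linalg.svd(A)
    s_inv = np.where(s > rel_thresh * s[0], 1.0 / s, 0.0)  # drop noise-dominated modes
    irf = Vt.T @ (s_inv * (U.T @ tissue))                  # flow-scaled IRF
    return irf.max()   # in 1/s; scale by tissue density etc. for mL/min/100g
```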

  1. Efficient quantitative assessment of facial paralysis using iris segmentation and active contour-based key points detection with hybrid classifier.

    PubMed

    Barbosa, Jocelyn; Lee, Kyubum; Lee, Sunwon; Lodhi, Bilal; Cho, Jae-Gu; Seo, Woo-Keun; Kang, Jaewoo

    2016-03-12

    Facial palsy or paralysis (FP) is a symptom involving loss of voluntary muscle movement in one side of the human face, which can be devastating for patients. Traditional assessment methods depend solely on the clinician's judgment and are therefore time consuming and subjective. Hence, a quantitative assessment system is invaluable for physicians beginning the rehabilitation process, yet producing a reliable and robust method is challenging and still underway. We introduce a novel approach to the quantitative assessment of facial paralysis that tackles the classification problems of FP type and degree of severity. Specifically, we present an algorithm that extracts the human iris and detects facial landmarks, and a hybrid approach combining rule-based and machine learning algorithms to analyze and prognosticate facial paralysis using the captured images. A method combining an optimized Daugman's algorithm and a Localized Active Contour (LAC) model is proposed to efficiently extract the iris and facial landmarks (key points). To improve the performance of LAC, appropriate parameters of the initial evolving curve for facial feature segmentation are selected automatically. The symmetry score is measured by the ratio between features extracted from the two sides of the face. Hybrid classifiers (i.e., rule-based with regularized logistic regression) were employed for discriminating healthy and unhealthy subjects, for FP type classification, and for facial paralysis grading based on the House-Brackmann (H-B) scale. Quantitative analysis was performed to evaluate the performance of the proposed approach. Experiments show that the proposed method is efficient: facial movement feature extraction based on iris segmentation and LAC-based key point detection, along with a hybrid classifier, provides a more efficient way of addressing the classification problem of facial palsy type and degree
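
    The symmetry score reduces to a bounded ratio; a minimal form (the normalization choice here is ours, not necessarily the paper's) is:

```python
def symmetry_score(left: float, right: float) -> float:
    """Ratio of the smaller to the larger side's feature value:
    1.0 means perfect left-right symmetry, values near 0 strong asymmetry."""
    lo, hi = sorted((left, right))
    return lo / hi if hi else 1.0

print(symmetry_score(12.0, 15.0))   # e.g. mouth-corner displacements -> 0.8
```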

  2. Quantitative architectural analysis: a new approach to cortical mapping.

    PubMed

    Schleicher, A; Palomero-Gallagher, N; Morosan, P; Eickhoff, S B; Kowalski, T; de Vos, K; Amunts, K; Zilles, K

    2005-12-01

    Recent progress in anatomical and functional MRI has revived the demand for a reliable topographic map of the human cerebral cortex. To date, interpretations of specific activations found in functional imaging studies, and their topographical analysis in a spatial reference system, are often still based on classical architectonic maps. The most commonly used reference atlas is that of Brodmann and his successors, despite its severe inherent drawbacks. One obvious weakness of traditional architectural mapping is the subjective nature of localising borders between cortical areas by purely visual, microscopical examination of histological specimens. To overcome this limitation, more objective, quantitative mapping procedures have been established in recent years. The quantification of the neocortical laminar pattern by defining intensity line profiles across the cortical layers has a long tradition. In recent years, this method has been extended to enable reliable, reproducible mapping of the cortex based on image analysis and multivariate statistics, and methodological approaches to such algorithm-based cortical mapping have been published for various architectural modalities. In our contribution, the principles of algorithm-based mapping are described for cyto- and receptorarchitecture. In a cytoarchitectural parcellation of the human auditory cortex using a sliding window procedure, the classical areal pattern of the human superior temporal gyrus was modified by replacing Brodmann's areas 41, 42, 22 and parts of area 21 with a novel, more detailed map. An extension and optimisation of the sliding window procedure to the specific requirements of receptorarchitectonic mapping is also described, using the macaque central sulcus and adjacent superior parietal lobule as a second, biologically independent example. Algorithm-based mapping procedures, however, are not limited to these two architectural modalities, but can be applied to all images in
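
    Algorithm-based border detection of this kind is commonly implemented as a sliding-window comparison of adjacent blocks of laminar feature vectors, for example with a squared Mahalanobis distance. The sketch below is a generic rendition of that idea, with block size and regularization chosen arbitrarily rather than taken from the paper.

```python
import numpy as np

def border_distance_profile(profiles, block=10):
    """For each position along the cortical ribbon, compute the squared
    Mahalanobis distance between the mean feature vectors of the two
    adjacent blocks of intensity profiles; peaks in the resulting curve
    suggest candidate areal borders."""
    n, d = profiles.shape
    dist = np.full(n, np.nan)
    for i in range(block, n - block):
        a, b = profiles[i - block:i], profiles[i:i + block]
        pooled = np.cov(np.vstack([a, b]).T) + 1e-6 * np.eye(d)  # regularized covariance
        diff = a.mean(axis=0) - b.mean(axis=0)
        dist[i] = float(diff @ np.linalg.solve(pooled, diff))
    return dist
```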

  3. Quantitative and qualitative 5-aminolevulinic acid–induced protoporphyrin IX fluorescence in skull base meningiomas

    PubMed Central

    Bekelis, Kimon; Valdés, Pablo A.; Erkmen, Kadir; Leblond, Frederic; Kim, Anthony; Wilson, Brian C.; Harris, Brent T.; Paulsen, Keith D.; Roberts, David W.

    2011-01-01

    Object Complete resection of skull base meningiomas provides patients with the best chance for a cure; however, surgery is frequently difficult given the proximity of lesions to vital structures, such as cranial nerves, major vessels, and venous sinuses. Accurate discrimination between tumor and normal tissue is crucial for optimal tumor resection. Qualitative assessment of protoporphyrin IX (PpIX) fluorescence following the exogenous administration of 5-aminolevulinic acid (ALA) has demonstrated utility in malignant glioma resection but limited use in meningiomas. Here the authors demonstrate the use of ALA-induced PpIX fluorescence guidance in resecting a skull base meningioma and elaborate on the advantages and disadvantages provided by both quantitative and qualitative fluorescence methodologies in skull base meningioma resection. Methods A 52-year-old patient with a sphenoid wing WHO Grade I meningioma underwent tumor resection as part of an institutional review board–approved prospective study of fluorescence-guided resection. A surgical microscope modified for fluorescence imaging was used for the qualitative assessment of visible fluorescence, and an intraoperative probe for in situ fluorescence detection was utilized for quantitative measurements of PpIX. The authors assessed the detection capabilities of both the qualitative and quantitative fluorescence approaches. Results The patient harboring a sphenoid wing meningioma with intraorbital extension underwent radical resection of the tumor with both visibly and nonvisibly fluorescent regions. The patient underwent a complete resection without any complications. Some areas of the tumor demonstrated visible fluorescence. The quantitative probe detected neoplastic tissue better than the qualitative modified surgical microscope. The intraoperative probe was particularly useful in areas that did not reveal visible fluorescence, and tissue from these areas was confirmed as tumor following histopathological

  4. Conspicuous Strategies in Teaching Expressive Writing: A Quantitative Study Comparing Two Approaches to Process Writing

    ERIC Educational Resources Information Center

    Fontenot, Jennifer; Carney, Karen J.; Hansen, Kay

    2015-01-01

    A process-writing approach (BW) with novel concepts was developed by the authors to teach writing to elementary-level students. They believed the BW approach was effective but was particularly effective for special-needs students. Consequently, they decided to quantitatively test these assertions. Instead of testing students taught using the BW…

  5. The Development of Mathematical Knowledge for Teaching for Quantitative Reasoning Using Video-Based Instruction

    NASA Astrophysics Data System (ADS)

    Walters, Charles David

    Quantitative reasoning (P. W. Thompson, 1990, 1994) is a powerful mathematical tool that enables students to engage in rich problem solving across the curriculum. One way to support students' quantitative reasoning is to develop prospective secondary teachers' (PSTs) mathematical knowledge for teaching (MKT; Ball, Thames, & Phelps, 2008) related to quantitative reasoning. However, this may prove challenging, as prior to entering the classroom, PSTs often have few opportunities to develop MKT by examining and reflecting on students' thinking. Videos offer one avenue through which such opportunities are possible. In this study, I report on the design of a mini-course for PSTs that featured a series of videos created as part of a proof-of-concept NSF-funded project. These MathTalk videos highlight the ways in which the quantitative reasoning of two high school students developed over time. Using a mixed approach to grounded theory, I analyzed pre- and postinterviews using an extant coding scheme based on the Silverman and Thompson (2008) framework for the development of MKT. This analysis revealed a shift in participants' affect as well as three distinct shifts in their MKT around quantitative reasoning with distances, including shifts in: (a) quantitative reasoning; (b) point of view (decentering); and (c) orientation toward problem solving. Using the four-part focusing framework (Lobato, Hohensee, & Rhodehamel, 2013), I analyzed classroom data to account for how participants' noticing was linked with the shifts in MKT. Notably, their increased noticing of aspects of MKT around quantitative reasoning with distances, which features prominently in the MathTalk videos, seemed to contribute to the emergence of the shifts in MKT. Results from this study link elements of the learning environment to the development of specific facets of MKT around quantitative reasoning with distances. These connections suggest that vicarious experiences with two students' quantitative

  6. PCA-based groupwise image registration for quantitative MRI.

    PubMed

    Huizinga, W; Poot, D H J; Guyader, J-M; Klaassen, R; Coolen, B F; van Kranenburg, M; van Geuns, R J M; Uitterdijk, A; Polfliet, M; Vandemeulebroucke, J; Leemans, A; Niessen, W J; Klein, S

    2016-04-01

    Quantitative magnetic resonance imaging (qMRI) is a technique for estimating quantitative tissue properties, such as the T1 and T2 relaxation times, apparent diffusion coefficient (ADC), and various perfusion measures. This estimation is achieved by acquiring multiple images with different acquisition parameters (or at multiple time points after injection of a contrast agent) and by fitting a qMRI signal model to the image intensities. Image registration is often necessary to compensate for misalignments due to subject motion and/or geometric distortions caused by the acquisition. However, large differences in image appearance make accurate image registration challenging. In this work, we propose a groupwise image registration method for compensating misalignment in qMRI. The groupwise formulation of the method eliminates the requirement of choosing a reference image, thus avoiding a registration bias. The method minimizes a cost function that is based on principal component analysis (PCA), exploiting the fact that intensity changes in qMRI can be described by a low-dimensional signal model, but not requiring knowledge on the specific acquisition model. The method was evaluated on 4D CT data of the lungs, and both real and synthetic images of five different qMRI applications: T1 mapping in a porcine heart, combined T1 and T2 mapping in carotid arteries, ADC mapping in the abdomen, diffusion tensor mapping in the brain, and dynamic contrast-enhanced mapping in the abdomen. Each application is based on a different acquisition model. The method is compared to a mutual information-based pairwise registration method and four other state-of-the-art groupwise registration methods. Registration accuracy is evaluated in terms of the precision of the estimated qMRI parameters, overlap of segmented structures, distance between corresponding landmarks, and smoothness of the deformation. In all qMRI applications the proposed method performed better than or equally well as
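
    The PCA-based idea is that a well-aligned qMRI series is intensity-wise low-dimensional, so misalignment spreads variance into trailing principal components. A toy dissimilarity in that spirit, rank-weighted eigenvalues of the intensity correlation matrix, is shown below as an illustration; it is not necessarily the paper's exact metric.

```python
import numpy as np

def pca_groupwise_cost(images):
    """Stack G images, correlate their intensity vectors, and weight the
    sorted eigenvalues by rank: better alignment concentrates variance in
    the leading components and lowers the cost."""
    X = np.stack([im.ravel().astype(float) for im in images])    # G x N
    X = (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)
    K = (X @ X.T) / X.shape[1]                                   # G x G correlation matrix
    eig = np.sort(np.linalg.eigvalsh(K))[::-1]                   # descending eigenvalues
    return float(np.sum(np.arange(1, eig.size + 1) * eig))
```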

  7. Guidelines for reporting quantitative mass spectrometry based experiments in proteomics.

    PubMed

    Martínez-Bartolomé, Salvador; Deutsch, Eric W; Binz, Pierre-Alain; Jones, Andrew R; Eisenacher, Martin; Mayer, Gerhard; Campos, Alex; Canals, Francesc; Bech-Serra, Joan-Josep; Carrascal, Montserrat; Gay, Marina; Paradela, Alberto; Navajas, Rosana; Marcilla, Miguel; Hernáez, María Luisa; Gutiérrez-Blázquez, María Dolores; Velarde, Luis Felipe Clemente; Aloria, Kerman; Beaskoetxea, Jabier; Medina-Aunon, J Alberto; Albar, Juan P

    2013-12-16

    Mass spectrometry is already a well-established protein identification tool and recent methodological and technological developments have also made possible the extraction of quantitative data of protein abundance in large-scale studies. Several strategies for absolute and relative quantitative proteomics and the statistical assessment of quantifications are possible, each having specific measurements and therefore, different data analysis workflows. The guidelines for Mass Spectrometry Quantification allow the description of a wide range of quantitative approaches, including labeled and label-free techniques and also targeted approaches such as Selected Reaction Monitoring (SRM). The HUPO Proteomics Standards Initiative (HUPO-PSI) has invested considerable efforts to improve the standardization of proteomics data handling, representation and sharing through the development of data standards, reporting guidelines, controlled vocabularies and tooling. In this manuscript, we describe a key output from the HUPO-PSI-namely the MIAPE Quant guidelines, which have developed in parallel with the corresponding data exchange format mzQuantML [1]. The MIAPE Quant guidelines describe the HUPO-PSI proposal concerning the minimum information to be reported when a quantitative data set, derived from mass spectrometry (MS), is submitted to a database or as supplementary information to a journal. The guidelines have been developed with input from a broad spectrum of stakeholders in the proteomics field to represent a true consensus view of the most important data types and metadata, required for a quantitative experiment to be analyzed critically or a data analysis pipeline to be reproduced. It is anticipated that they will influence or be directly adopted as part of journal guidelines for publication and by public proteomics databases and thus may have an impact on proteomics laboratories across the world. This article is part of a Special Issue entitled: Standardization and

  8. A powerful approach reveals numerous expression quantitative trait haplotypes in multiple tissues.

    PubMed

    Ying, Dingge; Li, Mulin Jun; Sham, Pak Chung; Li, Miaoxin

    2018-04-26

    Recently, many studies have shown that single nucleotide polymorphisms (SNPs) affect gene expression and contribute to the development of complex traits/diseases in a tissue context-dependent manner. However, little is known about the influence of haplotypes, which reflect interaction effects between SNPs, on gene expression and complex traits. In the present study, we first proposed a regulatory-region-guided eQTL haplotype association analysis approach, and then used it to systematically investigate expression quantitative trait loci (eQTL) haplotypes in 20 different tissues. The approach reduces the computational burden by using regulatory predictions for candidate SNP selection and by correcting for multiple testing over non-independent haplotypes. The application results in multiple tissues showed that haplotype-based eQTLs not only increased the number of eQTL genes in a tissue-specific manner, but were also enriched in loci associated with complex traits in a tissue-matched manner. In addition, we found that tag SNPs of eQTL haplotypes from whole blood were selectively enriched in certain combinations of regulatory elements (e.g. promoters and enhancers) according to predicted chromatin states. In summary, this eQTL haplotype detection approach, together with the application results, sheds light on the synergistic effects of sequence variants on gene expression and their contribution to susceptibility to complex diseases. The executable application "eHaplo" is implemented in Java and is publicly available at http://grass.cgs.hku.hk/limx/ehaplo/. jonsonfox@gmail.com, limiaoxin@mail.sysu.edu.cn. Supplementary data are available at Bioinformatics online.

  9. A New Approach to Image Fusion Based on Cokriging

    NASA Technical Reports Server (NTRS)

    Memarsadeghi, Nargess; LeMoigne, Jacqueline; Mount, David M.; Morisette, Jeffrey T.

    2005-01-01

    We consider the image fusion problem involving remotely sensed data. We introduce cokriging as a method to perform fusion. We investigate the advantages of fusing Hyperion with ALI. The evaluation is performed by comparing the classification of the fused data with that of the input images and by calculating well-chosen quantitative fusion quality metrics. We consider the Invasive Species Forecasting System (ISFS) project as our fusion application. The fusion of ALI with Hyperion data is studied using PCA- and wavelet-based fusion. We then propose utilizing a geostatistics-based interpolation method called cokriging as a new approach for image fusion.

  10. Multivariate calibration in Laser-Induced Breakdown Spectroscopy quantitative analysis: The dangers of a 'black box' approach and how to avoid them

    NASA Astrophysics Data System (ADS)

    Safi, A.; Campanella, B.; Grifoni, E.; Legnaioli, S.; Lorenzetti, G.; Pagnotta, S.; Poggialini, F.; Ripoll-Seguer, L.; Hidalgo, M.; Palleschi, V.

    2018-06-01

    The introduction of the multivariate calibration curve approach in Laser-Induced Breakdown Spectroscopy (LIBS) quantitative analysis has led to a general improvement in LIBS analytical performance, since a multivariate approach makes it possible to exploit the redundant elemental information typically present in a LIBS spectrum. Software packages implementing multivariate methods are available in the most widely used commercial and open-source analytical programs; in most cases, the multivariate algorithms are robust against noise and operate in unsupervised mode. The downside of the availability and ease of use of such packages is the (perceived) difficulty of assessing the reliability of the results, which often leads to multivariate algorithms being treated as 'black boxes' whose inner mechanism remains hidden from the user. In this paper, we discuss the dangers of a 'black box' approach in LIBS multivariate analysis, and show how to overcome them using the chemical-physical knowledge that underlies any quantitative LIBS analysis.
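
    As a concrete illustration of a non-'black box' workflow, here is a minimal scikit-learn sketch on synthetic data; the spectra, the copper target, and the wavelength grid are all hypothetical stand-ins. Inspecting which wavelengths carry the largest regression coefficients, and checking them against known emission lines of the analyte, is one simple way to open up a multivariate calibration.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
wavelengths = np.linspace(200.0, 800.0, 600)     # nm, hypothetical grid
spectra = rng.random((40, 600))                  # stand-in for measured LIBS spectra
# Synthetic "truth": concentration driven by the channel near 470 nm.
copper = 5.0 * spectra[:, 270] + 0.1 * rng.standard_normal(40)

pls = PLSRegression(n_components=4)
pls.fit(spectra, copper)

# Sanity check instead of blind trust: the largest |coefficients| should sit
# at physically meaningful emission lines, not at noise channels.
coef = pls.coef_.ravel()
top = np.argsort(np.abs(coef))[::-1][:5]
for idx in top:
    print(f"{wavelengths[idx]:.1f} nm  coef={coef[idx]:+.3f}")
```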

  11. Does Homework Really Matter for College Students in Quantitatively-Based Courses?

    ERIC Educational Resources Information Center

    Young, Nichole; Dollman, Amanda; Angel, N. Faye

    2016-01-01

    This investigation was initiated by two students in an Advanced Computer Applications course. They sought to examine the influence of graded homework on final grades in quantitatively-based business courses. They were provided with data from three quantitatively-based core business courses over a period of five years for a total of 10 semesters of…

  12. A Ligand-observed Mass Spectrometry Approach Integrated into the Fragment Based Lead Discovery Pipeline

    PubMed Central

    Chen, Xin; Qin, Shanshan; Chen, Shuai; Li, Jinlong; Li, Lixin; Wang, Zhongling; Wang, Quan; Lin, Jianping; Yang, Cheng; Shui, Wenqing

    2015-01-01

    In fragment-based lead discovery (FBLD), a cascade combining multiple orthogonal technologies is required for reliable detection and characterization of fragment binding to the target. Given the limitations of the mainstream screening techniques, we present a ligand-observed mass spectrometry approach to expand the toolkit and increase the flexibility of building an FBLD pipeline, especially for tough targets. In this study, this approach was integrated into an FBLD program targeting the HCV RNA polymerase NS5B. Our ligand-observed mass spectrometry analysis resulted in the discovery of 10 hits from a 384-member fragment library through two independent screens of complex cocktails and a follow-up validation assay. Moreover, this MS-based approach enabled quantitative measurement of the weak binding affinities of fragments, which was in general consistent with SPR analysis. Five of the ten hits were then successfully translated to X-ray structures of fragment-bound complexes to lay a foundation for structure-based inhibitor design. With distinctive strengths in terms of high capacity and speed, minimal method development, easy sample preparation, low material consumption and quantitative capability, this MS-based assay is anticipated to be a valuable addition to the repertoire of current fragment screening techniques. PMID:25666181

  13. Towards quantitative PET/MRI: a review of MR-based attenuation correction techniques.

    PubMed

    Hofmann, Matthias; Pichler, Bernd; Schölkopf, Bernhard; Beyer, Thomas

    2009-03-01

    Positron emission tomography (PET) is a fully quantitative technology for imaging metabolic pathways and dynamic processes in vivo. Attenuation correction of raw PET data is a prerequisite for quantification and is typically based on separate transmission measurements. In PET/CT, however, attenuation correction is performed routinely based on the available CT transmission data. Recently, combined PET/magnetic resonance (MR) has been proposed as a viable alternative to PET/CT. Current concepts of PET/MRI do not include CT-like transmission sources and, therefore, alternative methods of PET attenuation correction must be found. This article reviews existing approaches to MR-based attenuation correction (MR-AC). Most groups have proposed MR-AC algorithms for brain PET studies and more recently also for torso PET/MR imaging. Most MR-AC strategies require the use of complementary MR and transmission images, or morphology templates generated from transmission images. We review and discuss these algorithms and point out challenges for using MR-AC in clinical routine. MR-AC is work in progress, with potentially promising results from a template-based approach applicable to both brain and torso imaging. While efforts are ongoing to make clinically viable MR-AC fully automatic, further studies are required to realize the potential benefits of MR-based motion compensation and partial volume correction of the PET data.

  14. IWGT report on quantitative approaches to genotoxicity risk assessment I. Methods and metrics for defining exposure-response relationships and points of departure (PoDs)

    EPA Science Inventory

    This report summarizes the discussion, conclusions, and points of consensus of the IWGT Working Group on Quantitative Approaches to Genetic Toxicology Risk Assessment (QWG) based on a meeting in Foz do Iguaçu, Brazil October 31–November 2, 2013. Topics addressed incl...

  15. Quantitative traits and diversification.

    PubMed

    FitzJohn, Richard G

    2010-12-01

    Quantitative traits have long been hypothesized to affect speciation and extinction rates. For example, smaller body size or increased specialization may be associated with increased rates of diversification. Here, I present a phylogenetic likelihood-based method (quantitative state speciation and extinction [QuaSSE]) that can be used to test such hypotheses using extant character distributions. This approach assumes that diversification follows a birth-death process where speciation and extinction rates may vary with one or more traits that evolve under a diffusion model. Speciation and extinction rates may be arbitrary functions of the character state, allowing much flexibility in testing models of trait-dependent diversification. I test the approach using simulated phylogenies and show that a known relationship between speciation and a quantitative character could be recovered in up to 80% of the cases on large trees (500 species). Consistent with other approaches, detecting shifts in diversification due to differences in extinction rates was harder than when due to differences in speciation rates. Finally, I demonstrate the application of QuaSSE to investigate the correlation between body size and diversification in primates, concluding that clade-specific differences in diversification may be more important than size-dependent diversification in shaping the patterns of diversity within this group.

  16. a Clustering-Based Approach for Evaluation of EO Image Indexing

    NASA Astrophysics Data System (ADS)

    Bahmanyar, R.; Rigoll, G.; Datcu, M.

    2013-09-01

    The volume of Earth Observation data is increasing immensely, on the order of several terabytes a day. Therefore, to explore and investigate the content of this huge amount of data, more sophisticated Content-Based Information Retrieval (CBIR) systems are in high demand. These systems should be able not only to discover unknown structures behind the data, but also to provide relevant results to the users' queries. Since in any retrieval system the images are processed based on a discrete set of their features (i.e., feature descriptors), the study and assessment of the structure of the feature space built by different feature descriptors is of high importance. In this paper, we introduce a clustering-based approach to study the content of image collections. In our approach, we claim that using both internal and external evaluation of clusters for different feature descriptors helps in understanding the structure of the feature space. Moreover, users' semantic understanding of the images can also be assessed. To validate the performance of our approach, we used an annotated Synthetic Aperture Radar (SAR) image collection. Quantitative results, together with a visualization of the feature space, demonstrate the applicability of our approach.
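
    A minimal sketch of the internal-plus-external cluster evaluation idea using scikit-learn, with random stand-ins for the feature descriptors and the SAR annotations; the paper's actual descriptors and collection are not reproduced here.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score, adjusted_rand_score

rng = np.random.default_rng(2)
features = rng.random((300, 16))         # stand-in for one feature descriptor
labels_true = rng.integers(0, 5, 300)    # stand-in for semantic annotations

pred = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(features)

internal = silhouette_score(features, pred)        # geometry of the feature space
external = adjusted_rand_score(labels_true, pred)  # agreement with user semantics
print(f"silhouette={internal:.3f}  adjusted_rand={external:.3f}")
```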

  17. A traits-based approach for prioritizing species for monitoring and surrogacy selection

    DOE PAGES

    Pracheil, Brenda M.; McManamay, Ryan A.; Bevelhimer, Mark S.; ...

    2016-11-28

    The bar for justifying the use of vertebrate animals for study is being increasingly raised, thus requiring increased rigor for species selection and study design. Although we have power analyses to provide quantitative backing for the numbers of organisms used, quantitative backing for selection of study species is not frequently employed. This can be especially important when measuring the impacts of ecosystem alteration, when study species must be chosen that are both sensitive to the alteration and of sufficient abundance for study. Just as important is providing justification for designation of surrogate species for study, especially when the species of interest is rare or of conservation concern and selection of an appropriate surrogate can have legal implications. In this study, we use a combination of GIS, a fish traits database and multivariate statistical analyses to quantitatively prioritize species for study and to determine potential study surrogate species. We provide two case studies to illustrate our quantitative, traits-based approach for designating study species and surrogate species. In the first case study, we select broadly representative fish species to understand the effects of turbine passage on adult fishes based on traits that suggest sensitivity to turbine passage. In our second case study, we present a framework for selecting a surrogate species for an endangered species. Lastly, we suggest that our traits-based framework can provide quantitative backing and added justification to selection of study species while expanding the inference space of study results.

  18. A Quantitative Approach to Assessing System Evolvability

    NASA Technical Reports Server (NTRS)

    Christian, John A., III

    2004-01-01

    When selecting a system from multiple candidates, the customer seeks the one that best meets his or her needs. Recently the desire for evolvable systems has become more important and engineers are striving to develop systems that accommodate this need. In response to this search for evolvability, we present a historical perspective on evolvability, propose a refined definition of evolvability, and develop a quantitative method for measuring this property. We address this quantitative methodology from both a theoretical and practical perspective. This quantitative model is then applied to the problem of evolving a lunar mission to a Mars mission as a case study.

  19. A new LC-MS based method to quantitate exogenous recombinant transferrin in cerebrospinal fluid: a potential approach for pharmacokinetic studies of transferrin-based therapeutics in the central nervous system

    PubMed Central

    Wang, Shunhai; Bobst, Cedric E.; Kaltashov, Igor A.

    2018-01-01

    Transferrin (Tf) is an 80 kDa iron-binding protein which is viewed as a promising drug carrier to target the central nervous system due to its ability to penetrate the blood-brain barrier (BBB). Among the many challenges during the development of Tf-based therapeutics, sensitive and accurate quantitation of the administered Tf in cerebrospinal fluid (CSF) remains particularly difficult due to the presence of abundant endogenous Tf. Herein, we describe the development of a new LC-MS based method for sensitive and accurate quantitation of exogenous recombinant human Tf in rat CSF. By taking advantage of a His-tag present in recombinant Tf and applying Ni affinity purification, the exogenous hTf can be greatly enriched from rat CSF, despite the presence of the abundant endogenous protein. Additionally, we applied a newly developed 18O-labeling technique that can generate internal standards at the protein level, which greatly improved the accuracy and robustness of quantitation. The developed method was investigated for linearity, accuracy, precision and lower limit of quantitation, all of which met the commonly accepted criteria for bioanalytical method validation. PMID:26307718

  1. Tissue microarrays and quantitative tissue-based image analysis as a tool for oncology biomarker and diagnostic development.

    PubMed

    Dolled-Filhart, Marisa P; Gustavson, Mark D

    2012-11-01

    Translational oncology has been improved by using tissue microarrays (TMAs), which facilitate biomarker analysis of large cohorts on a single slide. This has allowed for rapid analysis and validation of potential biomarkers for prognostic and predictive value, as well as for evaluation of biomarker prevalence. Coupled with quantitative analysis of immunohistochemical (IHC) staining, objective and standardized biomarker data from tumor samples can further advance companion diagnostic approaches for the identification of drug-responsive or resistant patient subpopulations. This review covers the advantages, disadvantages and applications of TMAs for biomarker research. Research literature and reviews of TMAs and quantitative image analysis methodology have been surveyed for this review (with an AQUA® analysis focus). Applications such as multi-marker diagnostic development and pathway-based biomarker subpopulation analyses are described. Tissue microarrays are a useful tool for biomarker analyses including prevalence surveys, disease progression assessment and addressing potential prognostic or predictive value. By combining quantitative image analysis with TMAs, analyses will be more objective and reproducible, allowing for more robust IHC-based diagnostic test development. Quantitative multi-biomarker IHC diagnostic tests that can predict drug response will allow for greater success of clinical trials for targeted therapies and provide more personalized clinical decision making.

  2. A Quantitative Mass Spectrometry-based Approach for Identifying Protein Kinase-Clients and Quantifying Kinase Activity

    USDA-ARS?s Scientific Manuscript database

    The Homo sapiens and Arabidopsis thaliana genomes are believed to encode >500 and >1,000 protein kinases, respectively. Despite this abundance, few bona fide kinase-client relationships have been described in detail. Mass spectrometry (MS)-based approaches have been integral to the large-scale mapp...

  3. RGB Color Calibration for Quantitative Image Analysis: The “3D Thin-Plate Spline” Warping Approach

    PubMed Central

    Menesatti, Paolo; Angelini, Claudio; Pallottino, Federico; Antonucci, Francesca; Aguzzi, Jacopo; Costa, Corrado

    2012-01-01

    In recent years, the need to define color numerically by its coordinates in n-dimensional space has grown strongly. Colorimetric calibration is fundamental in food processing and other biological disciplines to quantitatively compare samples' color across workflows involving many devices. Several software programmes are available to perform standardized colorimetric procedures, but they are often too imprecise for scientific purposes. In this study, we applied the Thin-Plate Spline interpolation algorithm to calibrate colours in sRGB space (the corresponding Matlab code is reported in the Appendix). This was compared with two other approaches. The first is based on a commercial calibration system (ProfileMaker) and the second on a Partial Least Square analysis. Moreover, to explore device variability and resolution, two different cameras were adopted, and for each sensor three consecutive pictures were acquired under four different light conditions. According to our results, the Thin-Plate Spline approach achieved very high calibration accuracy, opening the way to in-field colour quantification not only in the food sciences but also in other biological disciplines. These results are of great importance for scientific color evaluation when lighting conditions are not controlled. Moreover, the approach allows the use of low-cost instruments while still returning scientifically sound quantitative data. PMID:22969337
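
    A minimal sketch of a 3D thin-plate-spline RGB warp using SciPy's RBFInterpolator, assuming a set of chart patches with known reference colours; the 24-patch chart and the simulated device response are hypothetical stand-ins for measured data.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Hypothetical 24-patch chart: measured device RGB -> reference sRGB.
rng = np.random.default_rng(3)
reference = rng.random((24, 3))                      # known chart colours
measured = reference * 0.8 + 0.05 + 0.02 * rng.standard_normal((24, 3))

# 3D thin-plate-spline warp of the RGB cube, fitted on the chart patches.
warp = RBFInterpolator(measured, reference, kernel='thin_plate_spline')

pixels = rng.random((1000, 3))                       # image pixels to calibrate
calibrated = np.clip(warp(pixels), 0.0, 1.0)
print(np.abs(warp(measured) - reference).max())      # ~0 at the training patches
```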

  4. Understanding Pre-Quantitative Risk in Projects

    NASA Technical Reports Server (NTRS)

    Cooper, Lynne P.

    2011-01-01

    Standard approaches to risk management in projects depend on the ability of teams to identify risks and quantify the probabilities and consequences of these risks (e.g., the 5 x 5 risk matrix). However, long before quantification does - or even can - occur, and long after, teams make decisions based on their pre-quantitative understanding of risk. These decisions can have long-lasting impacts on the project. While significant research has looked at the process of how to quantify risk, our understanding of how teams conceive of and manage pre-quantitative risk is lacking. This paper introduces the concept of pre-quantitative risk and discusses the implications of addressing pre-quantitative risk in projects.

  5. Quantitative evaluation of phase processing approaches in susceptibility weighted imaging

    NASA Astrophysics Data System (ADS)

    Li, Ningzhi; Wang, Wen-Tung; Sati, Pascal; Pham, Dzung L.; Butman, John A.

    2012-03-01

    Susceptibility weighted imaging (SWI) takes advantage of the local variation in susceptibility between different tissues to enable highly detailed visualization of the cerebral venous system and sensitive detection of intracranial hemorrhages. Thus, it has been increasingly used in magnetic resonance imaging studies of traumatic brain injury as well as other intracranial pathologies. In SWI, magnitude information is combined with phase information to enhance the susceptibility-induced image contrast. Because of global susceptibility variations across the image, the rate of phase accumulation varies widely across the image, resulting in phase-wrapping artifacts that interfere with the local assessment of phase variation. Homodyne filtering is a common approach to eliminate this global phase variation. However, the filter size requires careful selection in order to preserve image contrast and avoid errors resulting from residual phase wraps. An alternative approach is to apply phase unwrapping prior to high-pass filtering. A suitable phase unwrapping algorithm guarantees no residual phase wraps, but additional computational steps are required. In this work, we quantitatively evaluate these two phase processing approaches on both simulated and real data using different filters and cutoff frequencies. Our analysis leads to an improved understanding of the relationship between phase wraps, susceptibility effects, and acquisition parameters. Although homodyne filtering approaches are faster and more straightforward, phase unwrapping approaches perform more accurately in a wider variety of acquisition scenarios.
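
    The two phase-processing routes compare roughly as in the following sketch (numpy/scipy/scikit-image) on a toy wrapped-phase image; the Gaussian filter, its size, and the synthetic phase map are illustrative choices, not the study's acquisition settings.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage.restoration import unwrap_phase

def homodyne_highpass(phase, sigma=8.0):
    """Remove slow phase variation by dividing out a low-pass complex image."""
    c = np.exp(1j * phase)
    lp = gaussian_filter(c.real, sigma) + 1j * gaussian_filter(c.imag, sigma)
    return np.angle(c * np.conj(lp))

def unwrap_then_highpass(phase, sigma=8.0):
    """Unwrap first, then subtract the smooth background -- no residual wraps."""
    u = unwrap_phase(phase)
    return u - gaussian_filter(u, sigma)

# Toy wrapped phase: smooth global background plus a small local 'vein'.
y, x = np.mgrid[0:128, 0:128]
background = 0.002 * (x - 64) ** 2
local = 1.5 * np.exp(-((x - 40) ** 2 + (y - 70) ** 2) / 20.0)
wrapped = np.angle(np.exp(1j * (background + local)))
print(np.abs(homodyne_highpass(wrapped) - unwrap_then_highpass(wrapped)).mean())
```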

  6. Quality control for quantitative PCR based on amplification compatibility test.

    PubMed

    Tichopad, Ales; Bar, Tzachi; Pecen, Ladislav; Kitchen, Robert R; Kubista, Mikael; Pfaffl, Michael W

    2010-04-01

    Quantitative PCR (qPCR) is a routinely used method for the accurate quantification of nucleic acids. Yet it may generate erroneous results if the amplification process is obscured by inhibition or by generation of aberrant side-products such as primer dimers. Several methods have been established to control for pre-processing performance that rely on the introduction of a co-amplified reference sequence; however, there is currently no method that allows reliable control of the amplification process without directly modifying the sample mix. Herein we present a statistical approach based on multivariate analysis of the amplification response data generated in real time. The amplification trajectory in its most resolved and dynamic phase is fitted with a suitable model. Two parameters of this model, related to amplification efficiency, are then used for calculation of the Z-score statistics. Each studied sample is compared to a predefined reference set of reactions, typically calibration reactions. A probabilistic decision for each individual Z-score is then used to identify the majority of inhibited reactions in our experiments. We compare this approach to univariate methods using only the sample-specific amplification efficiency as a reporter of compatibility. We demonstrate improved identification performance using the multivariate approach compared to the univariate approach. Finally, we stress that the performance of the amplification compatibility test as a quality control procedure depends on the quality of the reference set. Copyright 2010 Elsevier Inc. All rights reserved.
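
    A minimal sketch of the idea, assuming a sigmoidal amplification model and a set of calibration reactions as the reference; the model form, the two parameters, and the 0.01 flagging threshold are illustrative, not the authors' exact implementation.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def logistic(c, fmax, slope, c_half, f0):
    """Sigmoidal model of the real-time amplification trajectory."""
    return f0 + fmax / (1.0 + np.exp(-slope * (c - c_half)))

def fit_params(cycles, fluorescence):
    p0 = [fluorescence.max(), 0.5, np.median(cycles), fluorescence.min()]
    popt, _ = curve_fit(logistic, cycles, fluorescence, p0=p0, maxfev=10000)
    return popt[1], popt[2]          # efficiency-related slope and midpoint

cycles = np.arange(40, dtype=float)
rng = np.random.default_rng(4)
ref = np.array([fit_params(cycles, logistic(cycles, 10, 0.6, 22, 1)
                           + 0.05 * rng.standard_normal(40)) for _ in range(20)])
mu, sd = ref.mean(axis=0), ref.std(axis=0, ddof=1)

# A simulated inhibited reaction: shallower slope, delayed midpoint.
test = fit_params(cycles, logistic(cycles, 10, 0.35, 29, 1)
                  + 0.05 * rng.standard_normal(40))
z = (np.asarray(test) - mu) / sd
p = 2 * norm.sf(np.abs(z))           # two-sided p-value per parameter
print(f"z={z.round(2)}  p={p.round(4)}  flagged={bool((p < 0.01).any())}")
```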

  7. Benefit-risk analysis : a brief review and proposed quantitative approaches.

    PubMed

    Holden, William L

    2003-01-01

    Given the current status of benefit-risk analysis as a largely qualitative method, two techniques for a quantitative synthesis of a drug's benefit and risk are proposed to allow a more objective approach. The recommended methods, relative-value adjusted number-needed-to-treat (RV-NNT) and its extension, minimum clinical efficacy (MCE) analysis, rely upon efficacy or effectiveness data, adverse event data and utility data from patients, describing their preferences for an outcome given potential risks. These methods, using hypothetical data for rheumatoid arthritis drugs, demonstrate that quantitative distinctions can be made between drugs which would better inform clinicians, drug regulators and patients about a drug's benefit-risk profile. If the number of patients needed to treat is less than the relative-value adjusted number-needed-to-harm in an RV-NNT analysis, patients are willing to undergo treatment with the experimental drug to derive a certain benefit knowing that they may be at risk for any of a series of potential adverse events. Similarly, the results of an MCE analysis allow for determining the worth of a new treatment relative to an older one, given not only the potential risks of adverse events and benefits that may be gained, but also by taking into account the risk of disease without any treatment. Quantitative methods of benefit-risk analysis have a place in the evaluative armamentarium of pharmacovigilance, especially those that incorporate patients' perspectives.
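
    A small worked example of the RV-NNT comparison with entirely hypothetical rates and a hypothetical relative value; the decision rule shown (treatment favourable if NNT is below the relative-value-adjusted number-needed-to-harm) follows the description above, not the paper's exact formulas.

```python
def nnt(p_treat, p_control):
    """Number needed to treat for one additional responder."""
    return 1.0 / (p_treat - p_control)

def rv_adjusted_nnh(p_ae_treat, p_ae_control, relative_value):
    """Number needed to harm, weighted by a patient-derived relative value
    expressing how bad the adverse event is relative to the benefit."""
    return relative_value / (p_ae_treat - p_ae_control)

NNT = nnt(0.55, 0.40)                    # assumed response rates
NNH = rv_adjusted_nnh(0.08, 0.03, 0.5)   # assumed AE rates, assumed RV = 0.5
print(f"NNT={NNT:.1f}  RV-NNH={NNH:.1f}  favourable={NNT < NNH}")
```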

  8. Redox environment in stem and differentiated cells: A quantitative approach.

    PubMed

    Lyublinskaya, O G; Ivanova, Ju S; Pugovkina, N A; Kozhukharova, I V; Kovaleva, Z V; Shatrova, A N; Aksenov, N D; Zenin, V V; Kaulin, Yu A; Gamaley, I A; Nikolsky, N N

    2017-08-01

    Stem cells are believed to maintain a specific intracellular redox status through a combination of enhanced removal capacity and limited production of ROS. In the present study, we challenge this assumption by developing a quantitative approach for the analysis of the pro- and antioxidant ability of human embryonic stem cells in comparison with their differentiated descendants, as well as adult stem and non-stem cells. Our measurements showed that embryonic stem cells are characterized by a low ROS level, a low rate of extracellular hydrogen peroxide removal and a low threshold for peroxide-induced cytotoxicity. However, biochemical normalization of these parameters to cell volume/protein brings the normalized values in stem and differentiated cells into agreement and shows that the cells tested in the present study (human embryonic stem cells and their fibroblast-like progenies, adult mesenchymal stem cells, lymphocytes, HeLa) maintain a similar intracellular redox status. Based on these observations, we propose to use ROS concentration averaged over the cell volume, instead of ROS level, as a measure of intracellular redox balance. We show that attempts to use ROS level for comparative analysis of the redox status of morphologically different cells could lead to false conclusions. Methods for the assessment of ROS concentration based on flow cytometry analysis with the use of the H2DCFDA dye and HyPer, a genetically encoded probe for hydrogen peroxide, are discussed. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  9. A novel multi-walled carbon nanotube-based antibody conjugate for quantitative and semi-quantitative lateral flow assays.

    PubMed

    Sun, Wenjuan; Hu, Xiaolong; Liu, Jia; Zhang, Yurong; Lu, Jianzhong; Zeng, Libo

    2017-10-01

    In this study, multi-walled carbon nanotubes (MWCNTs) were applied in lateral flow strips (LFS) for semi-quantitative and quantitative assays. Firstly, the solubility of MWCNTs was improved using various surfactants to enhance their biocompatibility for practical application. The dispersed MWCNTs were conjugated with the methamphetamine (MET) antibody in a non-covalent manner and then manufactured into LFS for the quantitative detection of MET. The MWCNT-based lateral flow assay (MWCNTs-LFA) exhibited an excellent linear relationship between the test-line signal and MET concentration over the range of 62.5 to 1500 ng/mL. The sensitivity of the LFS was evaluated by conjugating MWCNTs with the HCG antibody; the MWCNT-based conjugate was 10 times more sensitive than one based on classical colloidal gold nanoparticles. Taken together, our data demonstrate that MWCNTs-LFA is a more sensitive and reliable assay for semi-quantitative and quantitative detection, which can be used in forensic analysis.

  10. Stable isotope labelling methods in mass spectrometry-based quantitative proteomics.

    PubMed

    Chahrour, Osama; Cobice, Diego; Malone, John

    2015-09-10

    Mass-spectrometry-based proteomics has evolved as a promising technology over the last decade and is undergoing dramatic development in a number of different areas, such as mass spectrometric instrumentation, peptide identification algorithms and bioinformatic computational data analysis. The improved methodology allows quantitative measurement of relative or absolute protein amounts, which is essential for gaining insights into their functions and dynamics in biological systems. Several different strategies involving stable isotope labels (ICAT, ICPL, IDBEST, iTRAQ, TMT, IPTL, SILAC), label-free statistical assessment approaches (MRM, SWATH) and absolute quantification methods (AQUA) are possible, each having specific strengths and weaknesses. Inductively coupled plasma mass spectrometry (ICP-MS), which is still widely recognised as an elemental detector, has recently emerged as a complementary technique to the previous methods. The new application area for ICP-MS targets the fast-growing field of proteomics-related research, allowing absolute protein quantification using suitable element-based tags. This document describes the different stable isotope labelling methods, which incorporate metabolic labelling in live cells, ICP-MS-based detection and post-harvest chemical label tagging for protein quantification, in addition to summarising their pros and cons. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. Quantitative approach for optimizing e-beam condition of photoresist inspection and measurement

    NASA Astrophysics Data System (ADS)

    Lin, Chia-Jen; Teng, Chia-Hao; Cheng, Po-Chung; Sato, Yoshishige; Huang, Shang-Chieh; Chen, Chu-En; Maruyama, Kotaro; Yamazaki, Yuichiro

    2018-03-01

    Tight process margins in advanced semiconductor technology nodes are controlled by e-beam metrology and e-beam inspection systems based on scanning electron microscopy (SEM) images. Larger-area images with higher image quality are required to collect massive amounts of data for metrology and to detect defects over large areas for inspection. Although photoresist patterning is one of the critical processes in semiconductor device manufacturing, observing photoresist patterns in SEM images is both crucial and troublesome, especially in the case of large images. Charging caused by e-beam irradiation of the photoresist degrades image quality, affects CD variation in metrology, and makes it difficult to continue defect inspection over a large area for a long time. In this study, we established a quantitative approach for optimizing e-beam conditions with the "Die to Database" algorithm of NGR3500 on photoresist patterns to minimize the charging effect, and we enhanced measurement and inspection performance on photoresist patterns by using the optimized e-beam conditions. NGR3500 is a geometry verification system based on the "Die to Database" algorithm, which compares SEM images with design data [1]. By comparing SEM images and design data, key performance indicators (KPIs) of the SEM image such as "Sharpness", "S/N", "Gray level variation in FOV" and "Image shift" can be retrieved. These KPIs were analyzed under different e-beam conditions, which consist of "Landing Energy", "Probe Current", "Scanning Speed" and "Scanning Method", and the best e-beam condition could be achieved with maximum image quality, maximum scanning speed and minimum image shift. Through this quantitative approach to optimizing e-beam conditions, we could observe the dependency of photoresist charging on the SEM conditions. By using the optimized e-beam condition, measurement could be continued stably on photoresist patterns for over 24 hours. KPIs of the SEM image proved image quality during measurement and

  12. Deep Learning-Based Noise Reduction Approach to Improve Speech Intelligibility for Cochlear Implant Recipients.

    PubMed

    Lai, Ying-Hui; Tsao, Yu; Lu, Xugang; Chen, Fei; Su, Yu-Ting; Chen, Kuang-Chao; Chen, Yu-Hsuan; Chen, Li-Ching; Po-Hung Li, Lieber; Lee, Chin-Hui

    2018-01-20

    We investigate the clinical effectiveness of a novel deep learning-based noise reduction (NR) approach under noisy conditions with challenging noise types at low signal-to-noise ratio (SNR) levels for Mandarin-speaking cochlear implant (CI) recipients. The deep learning-based NR approach used in this study consists of two modules, a noise classifier (NC) and a deep denoising autoencoder (DDAE), and is thus termed NC + DDAE. In a series of comprehensive experiments, we conduct qualitative and quantitative analyses of the NC module and the overall NC + DDAE approach. Moreover, we evaluate the speech recognition performance of the NC + DDAE NR and classical single-microphone NR approaches for Mandarin-speaking CI recipients under different noisy conditions. The testing set contains Mandarin sentences corrupted by two types of maskers, two-talker babble noise and construction jackhammer noise, at 0 and 5 dB SNR levels. Two conventional NR techniques and the proposed deep learning-based approach are used to process the noisy utterances. We qualitatively compare the NR approaches using the amplitude envelope and spectrogram plots of the processed utterances. Quantitative objective measures include (1) the normalized covariance measure to test the intelligibility of the utterances processed by each of the NR approaches; and (2) speech recognition tests conducted by nine Mandarin-speaking CI recipients. These nine CI recipients used their own clinical speech processors during testing. The experimental results of the objective evaluation and listening tests indicate that under challenging listening conditions, the proposed NC + DDAE NR approach yields higher intelligibility scores than the two compared classical NR techniques, under both matched and mismatched training-testing conditions. When compared to the two well-known conventional NR techniques under challenging listening conditions, the proposed NC + DDAE NR approach has superior noise suppression capabilities and gives less distortion

  13. A novel approach to teach the generation of bioelectrical potentials from a descriptive and quantitative perspective.

    PubMed

    Rodriguez-Falces, Javier

    2013-12-01

    In electrophysiology studies, it is becoming increasingly common to explain experimental observations using both descriptive methods and quantitative approaches. However, some electrophysiological phenomena, such as the generation of extracellular potentials that results from the propagation of the excitation source along the muscle fiber, are difficult to describe and conceptualize. In addition, most traditional approaches aimed at describing extracellular potentials consist of complex mathematical machinery that leaves no room for physical interpretation. The aim of the present study is to introduce a new method for teaching the formation of extracellular potentials around a muscle fiber from both a descriptive and a quantitative perspective. The implementation of this method was tested through a written exam and a satisfaction survey. The new method enhanced the ability of students to visualize the generation of bioelectrical potentials. In addition, the new approach improved students' understanding of how changes in the fiber-to-electrode distance and in the shape of the excitation source are translated into changes in the extracellular potential. The survey results show that combining general principles of electrical fields with accurate graphic imagery gives students an intuitive, yet quantitative, feel for electrophysiological signals and enhances their motivation to continue their studies in the biomedical engineering field.

  14. The selected reaction monitoring/multiple reaction monitoring-based mass spectrometry approach for the accurate quantitation of proteins: clinical applications in the cardiovascular diseases.

    PubMed

    Gianazza, Erica; Tremoli, Elena; Banfi, Cristina

    2014-12-01

    Selected reaction monitoring, also known as multiple reaction monitoring, is a powerful targeted mass spectrometry approach for confident quantitation of proteins/peptides in complex biological samples. In recent years, its optimization and application have become pivotal and of great interest in clinical research to derive useful outcomes for patient care. Thus, selected reaction monitoring/multiple reaction monitoring is now used as a highly sensitive and selective method for the evaluation of protein abundances and biomarker verification, with potential applications in medical screening. This review describes technical aspects of the development of a robust multiplex assay and discusses its recent applications in cardiovascular proteomics: verification of promising disease candidates to select only the highest-quality peptides/proteins for preclinical validation, as well as quantitation of protein isoforms and post-translational modifications.

  15. Quantitative insights for the design of substrate-based SIRT1 inhibitors.

    PubMed

    Kokkonen, Piia; Mellini, Paolo; Nyrhilä, Olli; Rahnasto-Rilla, Minna; Suuronen, Tiina; Kiviranta, Päivi; Huhtiniemi, Tero; Poso, Antti; Jarho, Elina; Lahtela-Kakkonen, Maija

    2014-08-01

    Sirtuin 1 (SIRT1) is the most studied human sirtuin and it catalyzes the deacetylation of acetylated lysine residues of its target proteins, for example histones. It is a promising drug target in the treatment of age-related diseases, such as neurodegenerative diseases and cancer. In this study, a series of known substrate-based sirtuin inhibitors was analyzed with comparative molecular field analysis (CoMFA), a three-dimensional quantitative structure-activity relationship (3D-QSAR) technique. The CoMFA model was validated both internally and externally, yielding a concordance correlation coefficient (CCC) of 0.88, a mean r²m of 0.66 and a Q²F3 of 0.89. Based on the CoMFA interaction contours, 13 new potential inhibitors with high predicted activity were designed, and the activities were verified by in vitro measurements. This work proposes an effective approach for the design and activity prediction of new potential substrate-based SIRT1 inhibitors. Copyright © 2014 Elsevier B.V. All rights reserved.
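
    For reference, the concordance correlation coefficient reported above can be computed as in this short numpy sketch; the activity values are synthetic stand-ins, not the paper's data.

```python
import numpy as np

def concordance_ccc(observed, predicted):
    """Lin's concordance correlation coefficient between observed and
    predicted activities (1.0 = perfect agreement on the identity line)."""
    mx, my = observed.mean(), predicted.mean()
    vx, vy = observed.var(), predicted.var()
    cov = ((observed - mx) * (predicted - my)).mean()
    return 2.0 * cov / (vx + vy + (mx - my) ** 2)

rng = np.random.default_rng(5)
pIC50_obs = rng.uniform(5.0, 8.0, 25)                    # hypothetical activities
pIC50_pred = pIC50_obs + 0.2 * rng.standard_normal(25)   # hypothetical predictions
print(f"CCC = {concordance_ccc(pIC50_obs, pIC50_pred):.3f}")
```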

  16. A Semiparametric Approach for Composite Functional Mapping of Dynamic Quantitative Traits

    PubMed Central

    Yang, Runqing; Gao, Huijiang; Wang, Xin; Zhang, Ji; Zeng, Zhao-Bang; Wu, Rongling

    2007-01-01

    Functional mapping has emerged as a powerful tool for mapping quantitative trait loci (QTL) that control developmental patterns of complex dynamic traits. Original functional mapping has been constructed within the context of simple interval mapping, without consideration of separate multiple linked QTL for a dynamic trait. In this article, we present a statistical framework for mapping QTL that affect dynamic traits by capitalizing on the strengths of functional mapping and composite interval mapping. Within this so-called composite functional-mapping framework, functional mapping models the time-dependent genetic effects of a QTL tested within a marker interval using a biologically meaningful parametric function, whereas composite interval mapping models the time-dependent genetic effects of the markers outside the test interval to control the genome background using a flexible nonparametric approach based on Legendre polynomials. Such a semiparametric framework was formulated by a maximum-likelihood model and implemented with the EM algorithm, allowing for the estimation and the test of the mathematical parameters that define the QTL effects and the regression coefficients of the Legendre polynomials that describe the marker effects. Simulation studies were performed to investigate the statistical behavior of composite functional mapping and compare its advantage in separating multiple linked QTL as compared to functional mapping. We used the new mapping approach to analyze a genetic mapping example in rice, leading to the identification of multiple QTL, some of which are linked on the same chromosome, that control the developmental trajectory of leaf age. PMID:17947431
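
    The nonparametric marker-effect part of such a model rests on Legendre polynomial regression, which numpy supports directly; a minimal sketch with a hypothetical time-dependent effect follows (the full EM-based QTL machinery is beyond a short example).

```python
import numpy as np
from numpy.polynomial import legendre

# Hypothetical time-dependent marker effect measured at 10 developmental ages.
t = np.linspace(0.0, 1.0, 10)                # developmental time (scaled)
rng = np.random.default_rng(6)
effect = 0.8 * t - 1.2 * t**2 + 0.05 * rng.standard_normal(10)

x = 2.0 * t - 1.0                            # map time onto Legendre domain [-1, 1]
coefs = legendre.legfit(x, effect, deg=3)    # flexible nonparametric trend
smooth = legendre.legval(x, coefs)
rms = np.sqrt(np.mean((effect - smooth) ** 2))
print(np.round(coefs, 3), f"RMS residual = {rms:.4f}")
```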

  17. Multi-scale modeling of microstructure dependent intergranular brittle fracture using a quantitative phase-field based method

    DOE PAGES

    Chakraborty, Pritam; Zhang, Yongfeng; Tonks, Michael R.

    2015-12-07

    The fracture behavior of brittle materials is strongly influenced by their underlying microstructure, which needs explicit consideration for accurate prediction of fracture properties and the associated scatter. In this work, a hierarchical multi-scale approach is pursued to model microstructure-sensitive brittle fracture. A quantitative phase-field based fracture model is utilized to capture the complex crack growth behavior in the microstructure, and the related parameters are calibrated from lower-length-scale atomistic simulations instead of engineering-scale experimental data. The workability of this approach is demonstrated by performing porosity-dependent intergranular fracture simulations in UO2 and comparing the predictions with experiments.

  18. Quantitative meta-analytic approaches for the analysis of animal toxicology and epidemiologic data in human health risk assessments

    EPA Science Inventory

    Often, human health risk assessments have relied on qualitative approaches for hazard identification to integrate evidence across multiple studies to conclude whether particular hazards exist. However, quantitative approaches for evidence integration, including the application o...

  19. A correlative and quantitative imaging approach enabling characterization of primary cell-cell communication: Case of human CD4+ T cell-macrophage immunological synapses.

    PubMed

    Kasprowicz, Richard; Rand, Emma; O'Toole, Peter J; Signoret, Nathalie

    2018-05-22

    Cell-to-cell communication engages signaling and spatiotemporal reorganization events driven by highly context-dependent and dynamic intercellular interactions, which are difficult to capture within heterogeneous primary cell cultures. Here, we present a straightforward correlative imaging approach utilizing commonly available instrumentation to sample large numbers of cell-cell interaction events, allowing qualitative and quantitative characterization of rare functioning cell conjugates based on calcium signals. We applied this approach to examine a previously uncharacterized immunological synapse, investigating autologous human blood CD4+ T cells and monocyte-derived macrophages (MDMs) forming functional conjugates in vitro. Populations of signaling conjugates were visualized, tracked and analyzed by combining live imaging, calcium recording and multivariate statistical analysis. Correlative immunofluorescence was added to quantify endogenous molecular recruitment at the cell-cell junction. By analyzing a large number of rare conjugates, we were able to define calcium signatures associated with different states of CD4+ T cell-MDM interactions. Quantitative image analysis of immunostained conjugates detected the propensity of endogenous T cell surface markers and intracellular organelles to polarize towards cell-cell junctions with high and sustained calcium signaling profiles, hence defining immunological synapses. Overall, we developed a broadly applicable approach enabling detailed single cell- and population-based investigations of rare cell-cell communication events with primary cells.

  1. A Markov Chain-based quantitative study of angular distribution of photons through turbid slabs via isotropic light scattering

    NASA Astrophysics Data System (ADS)

    Li, Xuesong; Northrop, William F.

    2016-04-01

    This paper describes a quantitative approach to approximating multiple scattering through an isotropic turbid slab based on Markov chain theory. There is an increasing need to utilize multiple scattering for optical diagnostic purposes; however, existing methods are either inaccurate or computationally expensive. Here, we develop a novel Markov chain approximation approach to solve for the multiple-scattering angular distribution (AD) that can accurately calculate the AD while significantly reducing computational cost compared to Monte Carlo simulation. We expect this work to stimulate ongoing multiple scattering research and deterministic reconstruction algorithm development with AD measurements.
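
    The Markov-chain idea can be sketched as repeated application of a row-stochastic transition matrix over discretized angle bins, as below. The forward-peaked toy kernel is chosen only so that the broadening over successive scattering orders is visible; the paper's model treats an isotropic slab and also tracks spatial states and exit conditions, which this sketch omits.

```python
import numpy as np

# Toy Markov-chain propagation of an angular distribution (AD) over
# successive scattering events, on discretized polar-angle bins.
n_bins = 181
theta = np.linspace(0.0, np.pi, n_bins)

# Transition matrix: probability of scattering from angle i into angle j,
# forward-peaked around the incoming direction and weighted by solid angle.
dtheta = theta[None, :] - theta[:, None]
T = np.exp(-dtheta**2 / (2 * 0.3**2)) * np.sin(theta)[None, :]
T /= T.sum(axis=1, keepdims=True)        # make each row a distribution

p = np.zeros(n_bins)
p[0] = 1.0                               # collimated incident beam
for n in range(1, 6):
    p = p @ T                            # AD after n scattering events
    mean_deg = np.degrees((p * theta).sum())
    print(f"after {n} scatterings: mean angle = {mean_deg:.1f} deg")
```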

  2. A Novel Quantitative Approach to Concept Analysis: The Internomological Network

    PubMed Central

    Cook, Paul F.; Larsen, Kai R.; Sakraida, Teresa J.; Pedro, Leli

    2012-01-01

    Background When a construct such as patients’ transition to self-management of chronic illness is studied by researchers across multiple disciplines, the meaning of key terms can become confused. This results from inherent problems in language where a term can have multiple meanings (polysemy) and different words can mean the same thing (synonymy). Objectives To test a novel quantitative method for clarifying the meaning of constructs by examining the similarity of published contexts in which they are used. Method Published terms related to the concept transition to self-management of chronic illness were analyzed using the internomological network (INN), a type of latent semantic analysis to calculate the mathematical relationships between constructs based on the contexts in which researchers use each term. This novel approach was tested by comparing results to those from concept analysis, a best-practice qualitative approach to clarifying meanings of terms. By comparing results of the two methods, the best synonyms of transition to self-management, as well as key antecedent, attribute, and consequence terms, were identified. Results Results from INN analysis were consistent with those from concept analysis. The potential synonyms self-management, transition, and adaptation had the greatest utility. Adaptation was the clearest overall synonym, but had lower cross-disciplinary use. The terms coping and readiness had more circumscribed meanings. The INN analysis confirmed key features of transition to self-management, and suggested related concepts not found by the previous review. Discussion The INN analysis is a promising novel methodology that allows researchers to quantify the semantic relationships between constructs. The method works across disciplinary boundaries, and may help to integrate the diverse literature on self-management of chronic illness. PMID:22592387
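
    A minimal sketch of the latent-semantic-analysis machinery underlying such an approach, using scikit-learn on four toy 'context' documents; the INN's actual corpus construction and term extraction are not reproduced here.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

# Toy 'contexts of use': one pseudo-document of surrounding text per term.
contexts = {
    "transition":      "patient moves from acute care to managing illness at home",
    "self-management": "patient managing chronic illness at home with daily routines",
    "adaptation":      "patient adjusts routines and coping to chronic illness demands",
    "coping":          "emotional strategies to handle the stress of diagnosis",
}
tfidf = TfidfVectorizer().fit_transform(contexts.values())
lsa = TruncatedSVD(n_components=3, random_state=0).fit_transform(tfidf)

# Terms whose contexts of use are similar end up close in the latent space.
sims = cosine_similarity(lsa)
terms = list(contexts)
for i, j in [(0, 1), (1, 2), (1, 3)]:
    print(f"{terms[i]} ~ {terms[j]}: {sims[i, j]:+.2f}")
```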

  3. Modeling and prediction of peptide drift times in ion mobility spectrometry using sequence-based and structure-based approaches.

    PubMed

    Zhang, Yiming; Jin, Quan; Wang, Shuting; Ren, Ren

    2011-05-01

    The mobile behavior of 1481 peptides in ion mobility spectrometry (IMS), which are generated by protease digestion of the Drosophila melanogaster proteome, is modeled and predicted based on two different types of characterization methods, i.e. sequence-based approach and structure-based approach. In this procedure, the sequence-based approach considers both the amino acid composition of a peptide and the local environment profile of each amino acid in the peptide; the structure-based approach is performed with the CODESSA protocol, which regards a peptide as a common organic compound and generates more than 200 statistically significant variables to characterize the whole structure profile of a peptide molecule. Subsequently, the nonlinear support vector machine (SVM) and Gaussian process (GP) as well as linear partial least squares (PLS) regression is employed to correlate the structural parameters of the characterizations with the IMS drift times of these peptides. The obtained quantitative structure-spectrum relationship (QSSR) models are evaluated rigorously and investigated systematically via both one-deep and two-deep cross-validations as well as the rigorous Monte Carlo cross-validation (MCCV). We also give a comprehensive comparison on the resulting statistics arising from the different combinations of variable types with modeling methods and find that the sequence-based approach can give the QSSR models with better fitting ability and predictive power but worse interpretability than the structure-based approach. In addition, though the QSSR modeling using sequence-based approach is not needed for the preparation of the minimization structures of peptides before the modeling, it would be considerably efficient as compared to that using structure-based approach. Copyright © 2011 Elsevier Ltd. All rights reserved.

  4. A novel approach for evaluating the performance of real time quantitative loop-mediated isothermal amplification-based methods.

    PubMed

    Nixon, Gavin J; Svenstrup, Helle F; Donald, Carol E; Carder, Caroline; Stephenson, Judith M; Morris-Jones, Stephen; Huggett, Jim F; Foy, Carole A

    2014-12-01

    Molecular diagnostic measurements are currently underpinned by the polymerase chain reaction (PCR). There are also a number of alternative nucleic acid amplification technologies which, unlike PCR, work at a single temperature. These 'isothermal' methods reportedly offer potential advantages over PCR, such as simplicity, speed and resistance to inhibitors, and could also be used for quantitative molecular analysis. However, there are currently limited mechanisms to evaluate their quantitative performance, which would assist assay development and study comparisons. This study uses a sexually transmitted infection diagnostic model in combination with an adapted metric termed the isothermal doubling time (IDT), akin to PCR efficiency, to compare quantitative PCR and quantitative loop-mediated isothermal amplification (qLAMP) assays, and to quantify the impact of matrix interference. The performance metric described here facilitates the comparison of qLAMP assays and could assist assay development and validation activities.
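
    A minimal sketch of an isothermal doubling time computed from the exponential phase of an amplification curve, on synthetic qLAMP-like signals; the model F(t) = F0 * 2**(t/IDT) and all numbers are illustrative, not the authors' exact definition.

```python
import numpy as np

def isothermal_doubling_time(t_min, fluorescence):
    """IDT from the exponential phase: F(t) = F0 * 2**(t / IDT), so a
    straight-line fit of log2(F) against time gives slope = 1 / IDT."""
    slope, _ = np.polyfit(t_min, np.log2(fluorescence), 1)
    return 1.0 / slope

t = np.linspace(5.0, 15.0, 21)            # minutes within the exponential phase
f = 0.01 * 2 ** (t / 0.8)                 # synthetic qLAMP signal, true IDT = 0.8
print(f"IDT = {isothermal_doubling_time(t, f):.2f} min")

# Matrix interference shows up as a longer doubling time on the same assay.
f_inhibited = 0.01 * 2 ** (t / 1.3)
print(f"inhibited IDT = {isothermal_doubling_time(t, f_inhibited):.2f} min")
```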

  5. Experimental validation of a Monte-Carlo-based inversion scheme for 3D quantitative photoacoustic tomography

    NASA Astrophysics Data System (ADS)

    Buchmann, Jens; Kaplan, Bernhard A.; Prohaska, Steffen; Laufer, Jan

    2017-03-01

    Quantitative photoacoustic tomography (qPAT) aims to extract physiological parameters, such as blood oxygen saturation (sO2), from measured multi-wavelength image data sets. The challenge of this approach lies in the inherently nonlinear fluence distribution in the tissue, which has to be accounted for by using an appropriate model, and the large scale of the inverse problem. In addition, the accuracy of experimental and scanner-specific parameters, such as the wavelength dependence of the incident fluence, the acoustic detector response, the beam profile and divergence, needs to be considered. This study aims at quantitative imaging of blood sO2, as it has been shown to be a more robust parameter compared to absolute concentrations. We propose a Monte-Carlo-based inversion scheme in conjunction with a reduction in the number of variables achieved using image segmentation. The inversion scheme is experimentally validated in tissue-mimicking phantoms consisting of polymer tubes suspended in a scattering liquid. The tubes were filled with chromophore solutions at different concentration ratios. 3D multi-spectral image data sets were acquired using a Fabry-Perot-based PA scanner. A quantitative comparison of the measured data with the output of the forward model is presented. Parameter estimates of chromophore concentration ratios were found to be within 5% of the true values.

  6. Investigating an approach to the alliance based on interpersonal defense theory.

    PubMed

    Westerman, Michael A; Muran, J Christopher

    2017-09-01

    Notwithstanding consistent findings of significant relationships between the alliance and outcome, questions remain to be answered about the relatively small magnitude of those correlations, the mechanisms underlying the association, and how to conceptualize the alliance construct. We conducted a preliminary study of an approach to the alliance based on interpersonal defense theory, which is an interpersonal reconceptualization of defense processes, to investigate the promise of this alternative approach as a way to address the outstanding issues. We employed qualitative, theory-building case study methodology, closely examining alliance processes at four time points in the treatment of a case in terms of a case formulation based on interpersonal defense theory. The results suggested that our approach made it possible to recognize key processes in the alliance and that it helps explain how the alliance influences outcome. Our analyses also provided a rich set of concrete illustrations of the alliance phenomena identified by the theory. The findings suggest that an approach to the alliance based on interpersonal defense theory holds promise. However, although the qualitative method we employed has advantages, it also has limitations. We offer suggestions about how future qualitative and quantitative investigations could build on this study.

  7. Human Rights-Based Approaches to Mental Health: A Review of Programs.

    PubMed

    Porsdam Mann, Sebastian; Bradley, Valerie J; Sahakian, Barbara J

    2016-06-01

    The incidence of human rights violations in mental health care across nations has been described as a "global emergency" and an "unresolved global crisis." The relationship between mental health and human rights is complex and bidirectional. Human rights violations can negatively impact mental health. Conversely, respecting human rights can improve mental health. This article reviews cases where an explicitly human rights-based approach was used in mental health care settings. Although the included studies did not exhibit a high level of methodological rigor, the qualitative information obtained was considered useful and informative for future studies. All studies reviewed suggest that human rights-based approaches can lead to clinical improvements at relatively low costs. Human rights-based approaches should be utilized for legal and moral reasons, since human rights are fundamental pillars of justice and civilization. The fact that such approaches can contribute to positive therapeutic outcomes and, potentially, cost savings is an additional reason for their implementation. However, the small sample size and lack of controlled, quantitative measures limit the strength of conclusions drawn from included studies. More objective, high quality research is needed to ascertain the true extent of benefits to service users and providers.

  8. Single-Cell Based Quantitative Assay of Chromosome Transmission Fidelity

    PubMed Central

    Zhu, Jin; Heinecke, Dominic; Mulla, Wahid A.; Bradford, William D.; Rubinstein, Boris; Box, Andrew; Haug, Jeffrey S.; Li, Rong

    2015-01-01

    Errors in mitosis are a primary cause of chromosome instability (CIN), generating aneuploid progeny cells. Whereas a variety of factors can influence CIN, under most conditions mitotic errors are rare events that have been difficult to measure accurately. Here we report a green fluorescent protein-based quantitative chromosome transmission fidelity (qCTF) assay in budding yeast that allows sensitive and quantitative detection of CIN and can be easily adapted to high-throughput analysis. Using the qCTF assay, we performed genome-wide quantitative profiling of genes that affect CIN in a dosage-dependent manner and identified genes that elevate CIN when either increased (icCIN) or decreased in copy number (dcCIN). Unexpectedly, qCTF screening also revealed genes whose change in copy number quantitatively suppresses CIN, suggesting that the basal error rate of the wild-type genome is not minimized, but rather, may have evolved toward an optimal level that balances stability with low-level karyotype variation for evolutionary adaptation. PMID:25823586

  9. Statistical design of quantitative mass spectrometry-based proteomic experiments.

    PubMed

    Oberg, Ann L; Vitek, Olga

    2009-05-01

    We review the fundamental principles of statistical experimental design, and their application to quantitative mass spectrometry-based proteomics. We focus on class comparison using Analysis of Variance (ANOVA), and discuss how randomization, replication and blocking help avoid systematic biases due to the experimental procedure, and help optimize our ability to detect true quantitative changes between groups. We also discuss the issues of pooling multiple biological specimens for a single mass analysis, and calculation of the number of replicates in a future study. When applicable, we emphasize the parallels between designing quantitative proteomic experiments and experiments with gene expression microarrays, and give examples from that area of research. We illustrate the discussion using theoretical considerations, and using real-data examples of profiling of disease.
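
    One of the reviewed design questions, calculating the number of replicates for a future study, can be sketched with an off-the-shelf power analysis; the effect size, alpha, and power below are illustrative assumptions, not values from the review:

        from statsmodels.stats.power import TTestIndPower

        # Replicates per group needed to detect an assumed effect size
        # (Cohen's d = 1.0) at alpha = 0.05 with 80% power in a
        # two-group class comparison.
        n_per_group = TTestIndPower().solve_power(effect_size=1.0, alpha=0.05,
                                                  power=0.8,
                                                  alternative='two-sided')
        print(round(n_per_group))  # ~17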

  10. Tools for Implementing an Evidence-Based Approach in Public Health Practice

    PubMed Central

    Jacobs, Julie A.; Jones, Ellen; Gabella, Barbara A.; Spring, Bonnie

    2012-01-01

    Increasing disease rates, limited funding, and the ever-growing scientific basis for intervention demand the use of proven strategies to improve population health. Public health practitioners must be ready to implement an evidence-based approach in their work to meet health goals and sustain necessary resources. We researched easily accessible and time-efficient tools for implementing an evidence-based public health (EBPH) approach to improve population health. Several tools have been developed to meet EBPH needs, including free online resources in the following topic areas: training and planning tools, US health surveillance, policy tracking and surveillance, systematic reviews and evidence-based guidelines, economic evaluation, and gray literature. Key elements of EBPH are engaging the community in assessment and decision making; using data and information systems systematically; making decisions on the basis of the best available peer-reviewed evidence (both quantitative and qualitative); applying program-planning frameworks (often based in health-behavior theory); conducting sound evaluation; and disseminating what is learned. PMID:22721501

  11. Quantitative SIMS Imaging of Agar-Based Microbial Communities.

    PubMed

    Dunham, Sage J B; Ellis, Joseph F; Baig, Nameera F; Morales-Soto, Nydia; Cao, Tianyuan; Shrout, Joshua D; Bohn, Paul W; Sweedler, Jonathan V

    2018-05-01

    After several decades of widespread use for mapping elemental ions and small molecular fragments in surface science, secondary ion mass spectrometry (SIMS) has emerged as a powerful analytical tool for molecular imaging in biology. Biomolecular SIMS imaging has primarily been used as a qualitative technique; although the distribution of a single analyte can be accurately determined, it is difficult to map the absolute quantity of a compound or even to compare the relative abundance of one molecular species to that of another. We describe a method for quantitative SIMS imaging of small molecules in agar-based microbial communities. The microbes are cultivated on a thin film of agar, dried under nitrogen, and imaged directly with SIMS. By use of optical microscopy, we show that the area of the agar is reduced by 26 ± 2% (standard deviation) during dehydration, but the overall biofilm morphology and analyte distribution are largely retained. We detail a quantitative imaging methodology, in which the ion intensity of each analyte is (1) normalized to an external quadratic regression curve, (2) corrected for isomeric interference, and (3) filtered for sample-specific noise and lower and upper limits of quantitation. The end result is a two-dimensional surface density image for each analyte. The sample preparation and quantitation methods are validated by quantitatively imaging four alkyl-quinolone and alkyl-quinoline N-oxide signaling molecules (including Pseudomonas quinolone signal) in Pseudomonas aeruginosa colony biofilms. We show that the relative surface densities of the target biomolecules are substantially different from values inferred through direct intensity comparison and that the developed methodologies can be used to quantitatively compare as many ions as there are available standards.
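
    A hedged sketch of the three quantitation steps listed above, applied per analyte and pixel; the helper name, the interference estimate, the calibration data, and the limit-of-quantitation bounds are illustrative assumptions rather than the authors' implementation:

        import numpy as np

        def quantify_pixel(intensity, cal_conc, cal_intensity,
                           interference=0.0, loq_low=0.1, loq_high=100.0):
            # (2) subtract the estimated isomeric contribution first
            corrected = intensity - interference
            # (1) invert the external quadratic regression I = a*c^2 + b*c + d
            a, b, d = np.polyfit(cal_conc, cal_intensity, 2)
            real = [r.real for r in np.roots([a, b, d - corrected])
                    if abs(r.imag) < 1e-9]
            if not real:
                return np.nan
            conc = max(real)
            # (3) mask values outside the quantifiable range
            return conc if loq_low <= conc <= loq_high else np.nan

        cal_c = np.array([0.5, 1.0, 5.0, 10.0, 50.0])  # calibration densities
        cal_i = 3.0 * cal_c**2 + 40.0 * cal_c          # synthetic intensities
        print(quantify_pixel(900.0, cal_c, cal_i))     # ~11.9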

  12. Quantile-based permutation thresholds for quantitative trait loci hotspots.

    PubMed

    Neto, Elias Chaibub; Keller, Mark P; Broman, Andrew F; Attie, Alan D; Jansen, Ritsert C; Broman, Karl W; Yandell, Brian S

    2012-08-01

    Quantitative trait loci (QTL) hotspots (genomic locations affecting many traits) are a common feature in genetical genomics studies and are biologically interesting since they may harbor critical regulators. Therefore, statistical procedures to assess the significance of hotspots are of key importance. One approach, randomly allocating observed QTL across the genomic locations separately by trait, implicitly assumes all traits are uncorrelated. Recently, an empirical test for QTL hotspots was proposed on the basis of the number of traits that exceed a predetermined LOD value, such as the standard permutation LOD threshold. The permutation null distribution of the maximum number of traits across all genomic locations preserves the correlation structure among the phenotypes, avoiding the detection of spurious hotspots due to nongenetic correlation induced by uncontrolled environmental factors and unmeasured variables. However, by considering only the number of traits above a threshold, without accounting for the magnitude of the LOD scores, relevant information is lost. In particular, biologically interesting hotspots composed of a moderate to small number of traits with strong LOD scores may be neglected as nonsignificant. In this article we propose a quantile-based permutation approach that simultaneously accounts for the number and the LOD scores of traits within the hotspots. By considering a sliding scale of mapping thresholds, our method can assess the statistical significance of both small and large hotspots. Although the proposed approach can be applied to any type of heritable high-volume "omic" data set, we restrict our attention to expression (e)QTL analysis. We assess and compare the performances of these three methods in simulations and we illustrate how our approach can effectively assess the significance of moderate and small hotspots with strong LOD scores in a yeast expression data set.
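
    A toy illustration of the sliding-threshold statistic: at each mapping threshold, the hotspot size of a locus is the number of traits whose LOD exceeds it. The data below are synthetic, and the published permutation null (obtained by permuting individuals and re-mapping, which preserves trait correlations) is not reproduced here:

        import numpy as np

        rng = np.random.default_rng(1)
        lod = rng.gamma(shape=1.0, scale=1.5, size=(1000, 300))  # traits x loci

        for thr in np.arange(2.0, 7.0, 1.0):      # sliding scale of thresholds
            sizes = (lod >= thr).sum(axis=0)      # hotspot size per locus
            print(f"LOD>={thr}: largest hotspot spans {sizes.max()} traits")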

  13. The mathematics of a successful deconvolution: a quantitative assessment of mixture-based combinatorial libraries screened against two formylpeptide receptors.

    PubMed

    Santos, Radleigh G; Appel, Jon R; Giulianotti, Marc A; Edwards, Bruce S; Sklar, Larry A; Houghten, Richard A; Pinilla, Clemencia

    2013-05-30

    In the past 20 years, synthetic combinatorial methods have fundamentally advanced the ability to synthesize and screen large numbers of compounds for drug discovery and basic research. Mixture-based libraries and positional scanning deconvolution combine two approaches for the rapid identification of specific scaffolds and active ligands. Here we present a quantitative assessment of the screening of 32 positional scanning libraries in the identification of highly specific and selective ligands for two formylpeptide receptors. We also compare and contrast two mixture-based library approaches using a mathematical model to facilitate the selection of active scaffolds and libraries to be pursued for further evaluation. The flexibility demonstrated in the differently formatted mixture-based libraries allows for their screening in a wide range of assays.

  14. The Mathematics of a Successful Deconvolution: A Quantitative Assessment of Mixture-Based Combinatorial Libraries Screened Against Two Formylpeptide Receptors

    PubMed Central

    Santos, Radleigh G.; Appel, Jon R.; Giulianotti, Marc A.; Edwards, Bruce S.; Sklar, Larry A.; Houghten, Richard A.; Pinilla, Clemencia

    2014-01-01

    In the past 20 years, synthetic combinatorial methods have fundamentally advanced the ability to synthesize and screen large numbers of compounds for drug discovery and basic research. Mixture-based libraries and positional scanning deconvolution combine two approaches for the rapid identification of specific scaffolds and active ligands. Here we present a quantitative assessment of the screening of 32 positional scanning libraries in the identification of highly specific and selective ligands for two formylpeptide receptors. We also compare and contrast two mixture-based library approaches using a mathematical model to facilitate the selection of active scaffolds and libraries to be pursued for further evaluation. The flexibility demonstrated in the differently formatted mixture-based libraries allows for their screening in a wide range of assays. PMID:23722730

  15. Quantitative filter forensics for indoor particle sampling.

    PubMed

    Haaland, D; Siegel, J A

    2017-03-01

    Filter forensics is a promising indoor air investigation technique involving the analysis of dust that has collected on filters in central forced-air heating, ventilation, and air conditioning (HVAC) or portable systems to determine the presence of indoor particle-bound contaminants. In this study, we summarize past filter forensics research to explore what it reveals about the sampling technique and the indoor environment. There are 60 investigations in the literature that have used this sampling technique for a variety of biotic and abiotic contaminants. Many studies identified differences between contaminant concentrations in different buildings using this technique. Based on this literature review, we identified a lack of quantification as a gap in the past literature. Accordingly, we propose an approach to quantitatively link contaminants extracted from HVAC filter dust to time-averaged integrated air concentrations. This quantitative filter forensics approach has great potential to measure indoor air concentrations of a wide variety of particle-bound contaminants. Future studies directly comparing quantitative filter forensics to alternative sampling techniques are required to fully assess this approach, but analysis of past research suggests the enormous potential of this approach. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
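
    The proposed quantitative link can be sketched as a simple mass balance: contaminant mass recovered from the filter dust, divided by the air volume processed and the filter's capture efficiency, yields a time-averaged airborne concentration. The function name and all parameter values below are illustrative assumptions:

        def filter_forensics_concentration(mass_ng, flow_m3_per_h,
                                           runtime_h, capture_efficiency):
            # Time-averaged concentration (ng/m^3) inferred from the mass
            # extracted from the filter over its documented runtime.
            air_volume_m3 = flow_m3_per_h * runtime_h
            return mass_ng / (air_volume_m3 * capture_efficiency)

        print(filter_forensics_concentration(mass_ng=5000.0,
                                             flow_m3_per_h=1700.0,
                                             runtime_h=400.0,
                                             capture_efficiency=0.3))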

  16. Towards assessing cortical bone porosity using low-frequency quantitative acoustics: A phantom-based study.

    PubMed

    Vogl, Florian; Bernet, Benjamin; Bolognesi, Daniele; Taylor, William R

    2017-01-01

    Cortical porosity is a key characteristic governing the structural properties and mechanical behaviour of bone, and its quantification is therefore critical for understanding and monitoring the development of various bone pathologies such as osteoporosis. Axial transmission quantitative acoustics has been shown to be a promising technique for assessing bone health in a fast, non-invasive, and radiation-free manner. One major hurdle in bringing this approach to clinical application is the entanglement of the effects of individual characteristics (e.g. geometry, porosity, anisotropy etc.) on the measured wave propagation. In order to address this entanglement problem, we therefore propose a systematic bottom-up approach, in which only one bone property is varied, before addressing interaction effects. This work therefore investigated the sensitivity of low-frequency quantitative acoustics to changes in porosity as well as individual pore characteristics using specifically designed cortical bone phantoms. 14 bone phantoms were designed with varying pore size, axial-, and radial pore number, resulting in porosities (bone volume fraction) between 0% and 15%, similar to porosity values found in human cortical bone. All phantoms were manufactured using laser sintering, measured using axial-transmission acoustics and analysed using a full-wave approach. Experimental results were compared to theoretical predictions based on a modified Timoshenko theory. A clear dependence of phase velocity on frequency and porosity produced by increasing pore size or radial pore number was demonstrated, with the velocity decreasing by between 2-5 m/s per percent of additional porosity, which corresponds to -0.5% to -1.0% of wave speed. While the change in phase velocity due to axial pore number was consistent with the results due to pore size and radial pore number, the relative uncertainties for the estimates were too high to draw any conclusions for this parameter. This work has shown the
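
    A sketch of the linear sensitivity the reported numbers imply; the 2-5 m/s per percentage point of porosity comes from the abstract, while the baseline velocity v0 is an assumed value chosen so the drop corresponds to roughly 0.5-1.0% of wave speed:

        def phase_velocity(porosity_pct, v0=400.0, sensitivity=3.5):
            # v0: assumed zero-porosity phase velocity (m/s);
            # sensitivity: m/s lost per percent porosity (2-5 in the study).
            return v0 - sensitivity * porosity_pct

        print(phase_velocity(15.0))  # velocity at 15% porosity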

  17. Towards assessing cortical bone porosity using low-frequency quantitative acoustics: A phantom-based study

    PubMed Central

    Vogl, Florian; Bernet, Benjamin; Bolognesi, Daniele; Taylor, William R.

    2017-01-01

    Purpose Cortical porosity is a key characteristic governing the structural properties and mechanical behaviour of bone, and its quantification is therefore critical for understanding and monitoring the development of various bone pathologies such as osteoporosis. Axial transmission quantitative acoustics has been shown to be a promising technique for assessing bone health in a fast, non-invasive, and radiation-free manner. One major hurdle in bringing this approach to clinical application is the entanglement of the effects of individual characteristics (e.g. geometry, porosity, anisotropy etc.) on the measured wave propagation. In order to address this entanglement problem, we therefore propose a systematic bottom-up approach, in which only one bone property is varied, before addressing interaction effects. This work therefore investigated the sensitivity of low-frequency quantitative acoustics to changes in porosity as well as individual pore characteristics using specifically designed cortical bone phantoms. Materials and methods 14 bone phantoms were designed with varying pore size, axial-, and radial pore number, resulting in porosities (bone volume fraction) between 0% and 15%, similar to porosity values found in human cortical bone. All phantoms were manufactured using laser sintering, measured using axial-transmission acoustics and analysed using a full-wave approach. Experimental results were compared to theoretical predictions based on a modified Timoshenko theory. Results A clear dependence of phase velocity on frequency and porosity produced by increasing pore size or radial pore number was demonstrated, with the velocity decreasing by between 2–5 m/s per percent of additional porosity, which corresponds to -0.5% to -1.0% of wave speed. While the change in phase velocity due to axial pore number was consistent with the results due to pore size and radial pore number, the relative uncertainties for the estimates were too high to draw any conclusions for this

  18. Synthesizing Quantitative Evidence for Evidence-based Nursing: Systematic Review.

    PubMed

    Oh, Eui Geum

    2016-06-01

    As evidence-based practice has become an important issue in healthcare settings, the educational needs for knowledge and skills for the generation and utilization of healthcare evidence are increasing. A systematic review (SR), one way of generating evidence, is a synthesis of primary scientific evidence that summarizes the best evidence on a specific clinical question using a transparent, a priori, protocol-driven approach. SR methodology requires critical appraisal of primary studies, data extraction in a reliable and repeatable way, and examination of the validity of the results. SRs are considered hierarchically the highest form of evidence, as they are a systematic search, identification, and summarization of the available evidence to answer a focused clinical question, with particular attention to the methodological quality of studies or the credibility of opinion and text. The purpose of this paper is to provide an overview of the fundamental knowledge, principles, and processes of SR. The focus of this paper is on SR, especially for the synthesis of quantitative data from primary research studies that examine the effectiveness of healthcare interventions. To activate evidence-based nursing care in various healthcare settings, the best available scientific evidence is an essential component. This paper includes some examples to promote understanding. Copyright © 2016. Published by Elsevier B.V.

  19. Quantitative Assessment of Commutability for Clinical Viral Load Testing Using a Digital PCR-Based Reference Standard

    PubMed Central

    Tang, L.; Sun, Y.; Buelow, D.; Gu, Z.; Caliendo, A. M.; Pounds, S.

    2016-01-01

    Given recent advances in the development of quantitative standards, particularly WHO international standards, efforts have been made to better understand the commutability of reference materials. Existing approaches for evaluating commutability include prediction intervals and correspondence analysis; however, the results obtained from existing approaches may be ambiguous. We have developed a “deviation-from-ideal” (DFI) approach to evaluate the commutability of standards and applied it to the assessment of Epstein-Barr virus (EBV) load testing in four quantitative PCR assays, treating digital PCR as a reference assay. We then discuss advantages and limitations of the DFI approach, as well as experimental design to best evaluate the commutability of an assay in practice. PMID:27076654

  20. NHS-based Tandem Mass Tagging of Proteins at the Level of Whole Cells: A Critical Evaluation in Comparison to Conventional TMT-Labeling Approaches for Quantitative Proteome Analysis.

    PubMed

    Megger, Dominik A; Pott, Leona L; Rosowski, Kristin; Zülch, Birgit; Tautges, Stephanie; Bracht, Thilo; Sitek, Barbara

    2017-01-01

    Tandem mass tags (TMT) are usually introduced at the level of isolated proteins or peptides. Here, for the first time, we report the labeling of whole cells and a critical evaluation of its performance in comparison to conventional labeling approaches. The results indicated that TMT protein labeling of intact cells is generally possible if it is coupled to subsequent enrichment with an anti-TMT antibody. The quantitative results were similar to those obtained after labeling of isolated proteins, and both were found to be slightly complementary to peptide labeling. Furthermore, when using NHS-based TMT, no specificity towards cell surface proteins was observed in the case of cell labeling. In summary, this study provides first evidence for the general feasibility of TMT cell labeling and highlights limitations of NHS-based labeling reagents. Future studies should therefore focus on the synthesis and investigation of membrane-impermeable TMTs to increase specificity towards cell surface proteins.

  1. Single and two-shot quantitative phase imaging using Hilbert-Huang Transform based fringe pattern analysis

    NASA Astrophysics Data System (ADS)

    Trusiak, Maciej; Micó, Vicente; Patorski, Krzysztof; García-Monreal, Javier; Sluzewski, Lukasz; Ferreira, Carlos

    2016-08-01

    In this contribution we propose two Hilbert-Huang Transform based algorithms for fast and accurate single-shot and two-shot quantitative phase imaging, applicable in both on-axis and off-axis configurations. In the first scheme, a single fringe pattern containing information about the biological phase sample under study is adaptively pre-filtered using an empirical mode decomposition based approach. It is then phase demodulated by the Hilbert Spiral Transform, aided by Principal Component Analysis for local fringe orientation estimation. Orientation calculation enables efficient analysis of closed fringes; it can be avoided using an arbitrary phase-shifted two-shot Gram-Schmidt Orthonormalization scheme aided by Hilbert-Huang Transform pre-filtering. This two-shot approach is a trade-off between single-frame and temporal phase shifting demodulation. The robustness of the proposed techniques is corroborated using experimental digital holographic microscopy studies of polystyrene micro-beads and red blood cells. Both algorithms compare favorably with the temporal phase shifting scheme, which is used as a reference method.

  2. Slow erosion of a quantitative apple resistance to Venturia inaequalis based on an isolate-specific Quantitative Trait Locus.

    PubMed

    Caffier, Valérie; Le Cam, Bruno; Al Rifaï, Mehdi; Bellanger, Marie-Noëlle; Comby, Morgane; Denancé, Caroline; Didelot, Frédérique; Expert, Pascale; Kerdraon, Tifenn; Lemarquand, Arnaud; Ravon, Elisa; Durel, Charles-Eric

    2016-10-01

    Quantitative plant resistance affects the aggressiveness of pathogens and is usually considered more durable than qualitative resistance. However, the efficiency of a quantitative resistance based on an isolate-specific Quantitative Trait Locus (QTL) is expected to decrease over time due to the selection of isolates with a high level of aggressiveness on resistant plants. To test this hypothesis, we surveyed scab incidence over an eight-year period in an orchard planted with susceptible and quantitatively resistant apple genotypes. We sampled 79 Venturia inaequalis isolates from this orchard at three dates and we tested their level of aggressiveness under controlled conditions. Isolates sampled on resistant genotypes triggered higher lesion density and exhibited a higher sporulation rate on apple trees carrying the resistance allele of the QTL T1 compared to isolates sampled on susceptible genotypes. Due to this ability to select aggressive isolates, we expected the QTL T1 to be non-durable. However, our results showed that the quantitative resistance based on the QTL T1 remained efficient in the orchard over an eight-year period, with only a slow decrease in efficiency and no detectable increase of the aggressiveness of fungal isolates over time. We conclude that knowledge of the specificity of a QTL is not sufficient to evaluate its durability. Deciphering molecular mechanisms associated with resistance QTLs, genetic determinants of aggressiveness and putative trade-offs within pathogen populations is needed to help in understanding the erosion processes. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. A new liquid chromatography-mass spectrometry-based method to quantitate exogenous recombinant transferrin in cerebrospinal fluid: a potential approach for pharmacokinetic studies of transferrin-based therapeutics in the central nervous systems.

    PubMed

    Wang, Shunhai; Bobst, Cedric E; Kaltashov, Igor A

    2015-01-01

    Transferrin (Tf) is an 80 kDa iron-binding protein that is viewed as a promising drug carrier to target the central nervous system as a result of its ability to penetrate the blood-brain barrier. Among the many challenges during the development of Tf-based therapeutics, the sensitive and accurate quantitation of the administered Tf in cerebrospinal fluid (CSF) remains particularly difficult because of the presence of abundant endogenous Tf. Herein, we describe the development of a new liquid chromatography-mass spectrometry-based method for the sensitive and accurate quantitation of exogenous recombinant human Tf in rat CSF. By taking advantage of a His-tag present in recombinant Tf and applying Ni affinity purification, the exogenous human serum Tf can be greatly enriched from rat CSF, despite the presence of the abundant endogenous protein. Additionally, we applied a newly developed (18)O-labeling technique that can generate internal standards at the protein level, which greatly improved the accuracy and robustness of quantitation. The developed method was investigated for linearity, accuracy, precision, and lower limit of quantitation, all of which met the commonly accepted criteria for bioanalytical method validation.

  4. DICOM for quantitative imaging biomarker development: a standards based approach to sharing clinical data and structured PET/CT analysis results in head and neck cancer research.

    PubMed

    Fedorov, Andriy; Clunie, David; Ulrich, Ethan; Bauer, Christian; Wahle, Andreas; Brown, Bartley; Onken, Michael; Riesmeier, Jörg; Pieper, Steve; Kikinis, Ron; Buatti, John; Beichel, Reinhard R

    2016-01-01

    Background. Imaging biomarkers hold tremendous promise for precision medicine clinical applications. Development of such biomarkers relies heavily on image post-processing tools for automated image quantitation. Their deployment in the context of clinical research necessitates interoperability with the clinical systems. Comparison with the established outcomes and evaluation tasks motivate integration of the clinical and imaging data, and the use of standardized approaches to support annotation and sharing of the analysis results and semantics. We developed the methodology and tools to support these tasks in Positron Emission Tomography and Computed Tomography (PET/CT) quantitative imaging (QI) biomarker development applied to head and neck cancer (HNC) treatment response assessment, using the Digital Imaging and Communications in Medicine (DICOM(®)) international standard and free open-source software. Methods. Quantitative analysis of PET/CT imaging data collected on patients undergoing treatment for HNC was conducted. Processing steps included Standardized Uptake Value (SUV) normalization of the images, segmentation of the tumor using manual and semi-automatic approaches, automatic segmentation of the reference regions, and extraction of the volumetric segmentation-based measurements. Suitable components of the DICOM standard were identified to model the various types of data produced by the analysis. A developer toolkit of conversion routines and an Application Programming Interface (API) were contributed and applied to create a standards-based representation of the data. Results. DICOM Real World Value Mapping, Segmentation and Structured Reporting objects were utilized for standards-compliant representation of the PET/CT QI analysis results and relevant clinical data. A number of correction proposals to the standard were developed. The open-source DICOM toolkit (DCMTK) was improved to simplify the task of DICOM encoding by introducing new API abstractions
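
    The SUV normalization step named above follows the standard body-weight formula; a minimal sketch of the textbook definition, not the authors' code, with decay correction of the injected dose assumed to have been applied already:

        def suv_bw(activity_bq_per_ml, injected_dose_bq, body_weight_g):
            # SUV = tissue activity concentration / (injected dose / body weight)
            return activity_bq_per_ml / (injected_dose_bq / body_weight_g)

        print(suv_bw(activity_bq_per_ml=12000.0,
                     injected_dose_bq=3.7e8, body_weight_g=75000.0))  # ~2.4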

  5. DICOM for quantitative imaging biomarker development: a standards based approach to sharing clinical data and structured PET/CT analysis results in head and neck cancer research

    PubMed Central

    Clunie, David; Ulrich, Ethan; Bauer, Christian; Wahle, Andreas; Brown, Bartley; Onken, Michael; Riesmeier, Jörg; Pieper, Steve; Kikinis, Ron; Buatti, John; Beichel, Reinhard R.

    2016-01-01

    Background. Imaging biomarkers hold tremendous promise for precision medicine clinical applications. Development of such biomarkers relies heavily on image post-processing tools for automated image quantitation. Their deployment in the context of clinical research necessitates interoperability with the clinical systems. Comparison with the established outcomes and evaluation tasks motivate integration of the clinical and imaging data, and the use of standardized approaches to support annotation and sharing of the analysis results and semantics. We developed the methodology and tools to support these tasks in Positron Emission Tomography and Computed Tomography (PET/CT) quantitative imaging (QI) biomarker development applied to head and neck cancer (HNC) treatment response assessment, using the Digital Imaging and Communications in Medicine (DICOM®) international standard and free open-source software. Methods. Quantitative analysis of PET/CT imaging data collected on patients undergoing treatment for HNC was conducted. Processing steps included Standardized Uptake Value (SUV) normalization of the images, segmentation of the tumor using manual and semi-automatic approaches, automatic segmentation of the reference regions, and extraction of the volumetric segmentation-based measurements. Suitable components of the DICOM standard were identified to model the various types of data produced by the analysis. A developer toolkit of conversion routines and an Application Programming Interface (API) were contributed and applied to create a standards-based representation of the data. Results. DICOM Real World Value Mapping, Segmentation and Structured Reporting objects were utilized for standards-compliant representation of the PET/CT QI analysis results and relevant clinical data. A number of correction proposals to the standard were developed. The open-source DICOM toolkit (DCMTK) was improved to simplify the task of DICOM encoding by introducing new API abstractions

  6. Fluorescence-based Western blotting for quantitation of protein biomarkers in clinical samples.

    PubMed

    Zellner, Maria; Babeluk, Rita; Diestinger, Michael; Pirchegger, Petra; Skeledzic, Senada; Oehler, Rudolf

    2008-09-01

    Since most high throughput techniques used in biomarker discovery are very time and cost intensive, highly specific and quantitative analytical alternatives are needed for routine analysis. Conventional Western blotting allows detection of specific proteins down to the level of single isotypes, but its quantitative accuracy is rather limited. We report a novel and improved quantitative Western blotting method. The use of fluorescently labelled secondary antibodies strongly extends the dynamic range of the quantitation and improves the correlation with the protein amount (r=0.997). By additionally staining all proteins fluorescently immediately after their transfer to the blot membrane, it is possible to visualise the antibody binding and the total protein profile simultaneously. This allows an accurate correction for protein load. Applying this normalisation, it could be demonstrated that fluorescence-based Western blotting is able to reproduce a quantitative analysis of two specific proteins in blood platelet samples from 44 subjects with different diseases, as initially conducted by 2D-DIGE. These results show that the proposed fluorescence-based Western blotting is an adequate technique for biomarker quantitation and suggest application possibilities that extend far beyond.

  7. Establishing the Canadian HIV Women's Sexual and Reproductive Health Cohort Study (CHIWOS): Operationalizing Community-based Research in a Large National Quantitative Study.

    PubMed

    Loutfy, Mona; Greene, Saara; Kennedy, V Logan; Lewis, Johanna; Thomas-Pavanel, Jamie; Conway, Tracey; de Pokomandy, Alexandra; O'Brien, Nadia; Carter, Allison; Tharao, Wangari; Nicholson, Valerie; Beaver, Kerrigan; Dubuc, Danièle; Gahagan, Jacqueline; Proulx-Boucher, Karène; Hogg, Robert S; Kaida, Angela

    2016-08-19

    Community-based research has gained increasing recognition in health research over the last two decades. Such participatory research approaches are lauded for their ability to anchor research in lived experiences, ensuring cultural appropriateness, accessing local knowledge, reaching marginalized communities, building capacity, and facilitating research-to-action. While having these positive attributes, the community-based health research literature is predominantly composed of small projects, using qualitative methods, and set within geographically limited communities. Its use in larger health studies, including clinical trials and cohorts, is limited. We present the Canadian HIV Women's Sexual and Reproductive Health Cohort Study (CHIWOS), a large-scale, multi-site, national, longitudinal quantitative study that has operationalized community-based research in all steps of the research process. Successes, challenges and further considerations are offered. Through the integration of community-based research principles, we have been successful in: facilitating a two-year long formative phase for this study; developing a novel survey instrument with national involvement; training 39 Peer Research Associates (PRAs); offering ongoing comprehensive support to PRAs; and engaging in an ongoing iterative community-based research process. Our community-based research approach within CHIWOS demanded that we be cognizant of challenges managing a large national team, inherent power imbalances and challenges with communication, compensation and volunteering considerations, and extensive delays in institutional processes. It is important to consider the iterative nature of community-based research and to work through tensions that emerge given the diverse perspectives of numerous team members. Community-based research, as an approach to large-scale quantitative health research projects, is an increasingly viable methodological option. Community-based research has several

  8. Single-Cell Based Quantitative Assay of Chromosome Transmission Fidelity.

    PubMed

    Zhu, Jin; Heinecke, Dominic; Mulla, Wahid A; Bradford, William D; Rubinstein, Boris; Box, Andrew; Haug, Jeffrey S; Li, Rong

    2015-03-30

    Errors in mitosis are a primary cause of chromosome instability (CIN), generating aneuploid progeny cells. Whereas a variety of factors can influence CIN, under most conditions mitotic errors are rare events that have been difficult to measure accurately. Here we report a green fluorescent protein-based quantitative chromosome transmission fidelity (qCTF) assay in budding yeast that allows sensitive and quantitative detection of CIN and can be easily adapted to high-throughput analysis. Using the qCTF assay, we performed genome-wide quantitative profiling of genes that affect CIN in a dosage-dependent manner and identified genes that elevate CIN when either increased (icCIN) or decreased in copy number (dcCIN). Unexpectedly, qCTF screening also revealed genes whose change in copy number quantitatively suppresses CIN, suggesting that the basal error rate of the wild-type genome is not minimized, but rather, may have evolved toward an optimal level that balances stability with low-level karyotype variation for evolutionary adaptation. Copyright © 2015 Zhu et al.

  9. Multiplexed MRM-based quantitation of candidate cancer biomarker proteins in undepleted and non-enriched human plasma.

    PubMed

    Percy, Andrew J; Chambers, Andrew G; Yang, Juncong; Borchers, Christoph H

    2013-07-01

    An emerging approach for multiplexed targeted proteomics involves bottom-up LC-MRM-MS, with stable isotope-labeled internal standard peptides, to accurately quantitate panels of putative disease biomarkers in biofluids. In this paper, we used this approach to quantitate 27 candidate cancer-biomarker proteins in human plasma that had not been treated by immunoaffinity depletion or enrichment techniques. These proteins have been reported as biomarkers for a variety of human cancers, from laryngeal to ovarian, with breast cancer having the highest correlation. We implemented measures to minimize the analytical variability, improve the quantitative accuracy, and increase the feasibility and applicability of this MRM-based method. We have demonstrated excellent retention time reproducibility (median interday CV: 0.08%) and signal stability (median interday CV: 4.5% for the analytical platform and 6.1% for the bottom-up workflow) for the 27 biomarker proteins (represented by 57 interference-free peptides). The linear dynamic range for the MRM assays spanned four orders-of-magnitude, with 25 assays covering a 10(3) -10(4) range in protein concentration. The lowest abundance quantifiable protein in our biomarker panel was insulin-like growth factor 1 (calculated concentration: 127 ng/mL). Overall, the analytical performance of this assay demonstrates high robustness and sensitivity, and provides the necessary throughput and multiplexing capabilities required to verify and validate cancer-associated protein biomarker panels in human plasma, prior to clinical use. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
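
    The core quantitation step with stable isotope-labeled internal standards reduces to a peak-area ratio scaled by the known spiked-in concentration; a minimal sketch with illustrative numbers, not data from the study:

        def mrm_concentration(area_light, area_heavy, sis_conc_ng_per_ml):
            # Endogenous (light) concentration from the light/heavy MRM
            # peak-area ratio and the known spiked heavy-standard level.
            return (area_light / area_heavy) * sis_conc_ng_per_ml

        print(mrm_concentration(area_light=8.2e5, area_heavy=5.1e5,
                                sis_conc_ng_per_ml=100.0))  # ~161 ng/mL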

  10. Pro-Social Behavior Amongst Students of Tertiary Institutions: An Explorative and a Quantitative Approach

    ERIC Educational Resources Information Center

    Quain, Samuel; Yidana, Xiaaba Dantallah; Ambotumah, Bernard Baba; Mensah-Livivnstone, Ike Joe Nii Annang

    2016-01-01

    The purpose of this paper was to explore antecedents of pro-social behavior amongst university students, using a private university as a case study. Following an explorative research, the study was guided by some theories relating to the phenomenon, focusing on gender and location factors. A quantitative approach was used in the follow up to the…

  11. A quantitative approach to neuropsychiatry: The why and the how.

    PubMed

    Kas, Martien J; Penninx, Brenda; Sommer, Bernd; Serretti, Alessandro; Arango, Celso; Marston, Hugh

    2017-12-12

    The current nosology of neuropsychiatric disorders allows for a pragmatic approach to treatment choice, regulation and clinical research. However, without a biological rationale for these disorders, drug development has stagnated. The recently EU-funded PRISM project aims to develop a quantitative biological approach to the understanding and classification of neuropsychiatric diseases to accelerate the discovery and development of better treatments. By combining clinical data sets from major worldwide disease cohorts and by applying innovative technologies to deeply phenotype stratified patient groups, we will define a set of quantifiable biological parameters for social withdrawal and cognitive deficits common to Schizophrenia (SZ), Major Depression (MD), and Alzheimer's Disease (AD). These studies aim to provide new classification and assessment tools for social and cognitive performance across neuropsychiatric disorders, clinically relevant substrates for treatment development, and predictive, preclinical animal systems. With patients and regulatory agencies, we seek to provide clear routes for the future translation and regulatory approval for new treatments and provide solutions to the growing public health challenges of psychiatry and neurology. Copyright © 2017. Published by Elsevier Ltd.

  12. Quantum dot nanoprobe-based quantitative analysis for prostate cancer (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Kang, Benedict J.; Jang, Gun Hyuk; Park, Sungwook; Lee, Kwan Hyi

    2016-09-01

    Prostate cancer is one of the leading causes of cancer-related death among Caucasian adult males in Europe and the United States of America. However, it has a high recovery rate when proper treatment is delivered to the patient. There are cases of over- or under-treatment that exacerbate the disease state, indicating the importance of a therapeutic approach suited to the stage of the disease. Recognition of these unmet needs has raised the need for stratification of the disease. There have been attempts to stratify based on biomarker expression patterns over the course of disease progression. To closely observe biomarker expression patterns, we propose a quantitative imaging method using a fabricated quantum dot-based nanoprobe to quantify biomarker expression on the surface of prostate cancer cells. To characterize the cell lines and analyze biomarker levels, cluster of differentiation 44 (CD44), prostate specific membrane antigen (PSMA), and epithelial cell adhesion molecule (EpCAM) are used. Each selected biomarker per cell line has been quantified, from which we established a biomarker signature of a prostate cancer cell line.

  13. eSIP: A Novel Solution-Based Sectioned Image Property Approach for Microscope Calibration

    PubMed Central

    Butzlaff, Malte; Weigel, Arwed; Ponimaskin, Evgeni; Zeug, Andre

    2015-01-01

    Fluorescence confocal microscopy represents one of the central tools in modern sciences. Correspondingly, a growing amount of research relies on the development of novel microscopic methods. During the last decade, numerous microscopic approaches were developed for the investigation of various scientific questions, and the former qualitative imaging methods were replaced by advanced quantitative methods that extract more and more information from a given sample. However, modern microscope systems, complex as they are, require very precise and appropriate calibration routines, in particular when quantitative measurements are to be compared over longer time scales or between different setups. Multispectral beads of sub-resolution size are often used to describe the point spread function and thus the optical properties of the microscope. More recently, a fluorescent layer was utilized to describe the axial profile for each pixel, which allows a spatially resolved characterization. However, fabrication of a thin fluorescent layer with matching refractive index is not yet technically solved. Therefore, we propose a novel calibration concept for sectioned image property (SIP) measurements, which is based on a fluorescent solution and makes the calibration concept available to a broader range of users. Compared to the previous approach, additional information can be obtained by application of this extended SIP chart approach, including penetration depth, detected number of photons, and illumination profile shape. Furthermore, due to the fit of the complete profile, our method is less susceptible to noise. Generally, the extended SIP approach represents a simple and highly reproducible method, allowing setup-independent calibration and alignment procedures, which is mandatory for advanced quantitative microscopy. PMID:26244982

  14. Art or Science? An Evidence-Based Approach to Human Facial Beauty: A Quantitative Analysis Towards an Informed Clinical Aesthetic Practice.

    PubMed

    Harrar, Harpal; Myers, Simon; Ghanem, Ali M

    2018-02-01

    Patients often seek guidance from aesthetic practitioners regarding treatments to enhance their 'beauty'. Is there a science behind the art of assessment, and if so, is it measurable? Through the centuries, this question has challenged scholars, artists and surgeons. This study undertakes a review of the evidence behind quantitative facial measurements in assessing beauty, to help the practitioner in everyday aesthetic practice. A Medline and Embase search for beauty, facial features and quantitative analysis was undertaken. Inclusion criteria were studies on adults, and exclusions included studies undertaken for dental, cleft lip, oncology, burns or reconstructive surgery. The abstracts and papers were appraised, and studies considered inappropriate were excluded. The data were extracted using a standardised table. The final dataset was appraised in accordance with the PRISMA checklist and Holland and Rees' critique tools. Of the 1253 studies screened, 1139 were excluded on abstract review and a further 70 on full-text review. The remaining 44 were assessed qualitatively and quantitatively. It became evident that the datasets were not comparable; nevertheless, common themes were obvious, and these were summarised. Despite measures of the beauty of individual components and of the sum of all the parts, such as symmetry and the golden ratio, we are still far from establishing what truly constitutes quantitative beauty. Perhaps beauty is truly in the 'eyes of the beholder' (and perhaps in the eyes of the subject too). This journal requires that authors assign a level of evidence to each article. For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors at www.springer.com/00266.

  15. Quantitative evaluation of specific vulnerability to nitrate for groundwater resource protection based on process-based simulation model.

    PubMed

    Huan, Huan; Wang, Jinsheng; Zhai, Yuanzheng; Xi, Beidou; Li, Juan; Li, Mingxiao

    2016-04-15

    It has been proved that groundwater vulnerability assessment is an effective tool for groundwater protection. Nowadays, quantitative assessment methods for specific vulnerability are scarce due to limited understanding of the complicated contaminant fate and transport processes in the groundwater system. In this paper, a process-based simulation model for specific vulnerability to nitrate, using a 1D flow and solute transport model in the unsaturated vadose zone, is presented for groundwater resource protection. For this case study in Jilin City of northeast China, rate constants of denitrification and nitrification as well as adsorption constants of ammonium and nitrate in the vadose zone were acquired by laboratory experiments. The transfer time to the groundwater table, t50, was taken as the specific vulnerability indicator. Finally, overall vulnerability was assessed by establishing the relationship between groundwater net recharge, layer thickness and t50. The results suggested that the most vulnerable regions of Jilin City were mainly distributed in the floodplain of Songhua River and Mangniu River. The least vulnerable areas mostly appear in the second terrace and the back of the first terrace. The overall area of low, relatively low and moderate vulnerability accounted for 76% of the study area, suggesting a relatively low likelihood of nitrate contamination. In addition, the sensitivity analysis showed that the most sensitive factors of specific vulnerability in the vadose zone included the groundwater net recharge rate, the physical properties of the soil medium and the rate constants of nitrate denitrification. By validating the suitability of the process-based simulation model for specific vulnerability and comparing it with an index-based method using a group of integrated indicators, we show that more realistic and accurate specific vulnerability mapping can be acquired with the process-based simulation model. In addition, the advantages, disadvantages, constraint conditions and
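
    As rough intuition for the t50 indicator, a piston-flow approximation relates travel time to layer thickness, water content, and net recharge; the study used a full 1D flow and solute transport model, so this one-liner and its parameter values are illustrative stand-ins only:

        def t50_piston_flow(thickness_m, water_content, recharge_m_per_yr):
            # Approximate travel time (years) to the water table under
            # piston flow: pore volume traversed divided by net recharge.
            return thickness_m * water_content / recharge_m_per_yr

        print(t50_piston_flow(thickness_m=10.0, water_content=0.25,
                              recharge_m_per_yr=0.3))  # ~8.3 years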

  16. The Effects of Argumentation Based Science Learning Approach on Creative Thinking Skills of Students

    ERIC Educational Resources Information Center

    Küçük Demir, Betül; Isleyen, Tevfik

    2015-01-01

    The aim of this study is to explore the effects of argumentation-based science learning (ABSL) approach on 9th Grade of Secondary Education students' creative thinking skills. The sample of the study included 22 9th grade of Secondary Education students in Bayburt in 2012-2013 academic year. In this study quantitative research method and…

  17. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, an...

  18. An approach for quantitative image quality analysis for CT

    NASA Astrophysics Data System (ADS)

    Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe

    2016-03-01

    An objective and standardized approach to assess image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines, and to identify strengths and weaknesses of different CT imaging technologies in transportation security. To that end, we have designed, developed and constructed phantoms that allow for systematic and repeatable measurement of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB-based image analysis toolkit to analyze CT-generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method that generates a modified set of PCA components with sparse loadings, used in conjunction with the Hotelling T2 statistical analysis method to compare, qualify, and detect faults in the tested systems.
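
    A hedged sketch of the fault-detection idea: project the per-scan metric vectors onto sparse principal components and flag scans whose Hotelling T2 exceeds a control limit. scikit-learn's stock SparsePCA and an approximate chi-square limit stand in for the authors' modified SPCA; the data are synthetic:

        import numpy as np
        from scipy.stats import chi2
        from sklearn.decomposition import SparsePCA

        rng = np.random.default_rng(0)
        X = rng.normal(size=(60, 88))               # scans x image-quality metrics

        spca = SparsePCA(n_components=5, random_state=0)
        scores = spca.fit_transform(X - X.mean(axis=0))

        var = scores.var(axis=0, ddof=1)
        t2 = (scores**2 / var).sum(axis=1)          # Hotelling T2 per scan
        limit = chi2.ppf(0.99, df=scores.shape[1])  # approximate control limit
        print(np.where(t2 > limit)[0])              # indices of suspect scans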

  19. A resilience-oriented approach for quantitatively assessing recurrent spatial-temporal congestion on urban roads.

    PubMed

    Tang, Junqing; Heinimann, Hans Rudolf

    2018-01-01

    Traffic congestion brings not only delay and inconvenience but also associated national concerns, such as greenhouse gases, air pollutants, and road safety issues and risks. Identification, measurement, tracking, and control of recurrent urban congestion are vital for building a livable and smart community. A considerable body of work has contributed to tackling the problem; several methods, such as time-based approaches and level of service, can be effective for characterizing congestion on urban streets. However, few studies have approached congestion quantification from a systemic perspective. Resilience, on the other hand, is an emerging concept that focuses on comprehensive systemic performance and characterizes the ability of a system to cope with disturbance and recover its functionality. In this paper, we treated recurrent congestion as an internal disturbance and proposed a modified metric inspired by the well-established "R4" resilience-triangle framework. We constructed the metric with generic dimensions from both resilience engineering and transport science to quantify recurrent congestion based on spatial-temporal traffic patterns, and compared it with two other approaches in freeway and signal-controlled arterial cases. Results showed that the metric can effectively capture congestion patterns in the study area and provides a quantitative benchmark for comparison; they also suggested not only good comparative measurement performance for the proposed metric but also its capability to account for the discharging process of congestion. Sensitivity tests showed that the proposed metric is robust against parameter perturbation within the Robustness Range (RR), but the number of identified congestion patterns can be influenced by the existence of ϵ. In addition, the Elasticity Threshold (ET) and the spatial dimension of the cell-based platform significantly affect the congestion results in both the number of detected patterns and their intensity. By tackling
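
    A toy sketch of the resilience-triangle quantity such a metric builds on: the integral of lost functionality (here, speed relative to free flow) over a congestion episode. The paper's metric adds further dimensions beyond this; the speed profile below is synthetic:

        import numpy as np

        t = np.linspace(0.0, 3.0, 121)                       # hours
        speed = 60.0 - 25.0 * np.exp(-((t - 1.5) / 0.5)**2)  # synthetic dip (km/h)

        performance = speed / 60.0                 # normalized to free flow
        loss = np.trapz(1.0 - performance, t)      # area of the resilience triangle
        print(f"functionality loss: {loss:.2f} performance-hours")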

  20. A novel approach for quantitative evaluation of the physicochemical interactions between rough membrane surface and sludge foulants in a submerged membrane bioreactor.

    PubMed

    Lin, Hongjun; Zhang, Meijia; Mei, Rongwu; Chen, Jianrong; Hong, Huachang

    2014-11-01

    This study proposed a novel approach for the quantitative evaluation of the physicochemical interactions between a particle and a rough surface. The approach adopts the composite Simpson's rule to numerically calculate the double integrals in the surface element integration of these physicochemical interactions. The calculation can be achieved by a MATLAB program based on this approach. This approach was then applied to assess the physicochemical interactions between a rough membrane surface and sludge foulants in a submerged membrane bioreactor (MBR). The results showed that, compared with a smooth membrane surface, a rough membrane surface had a much lower strength of interaction with sludge foulants. Meanwhile, membrane surface morphology significantly affected the strength and properties of the interactions. This study showed that the newly developed approach is feasible and could serve as a primary tool for investigating membrane fouling in MBRs. Copyright © 2014 Elsevier Ltd. All rights reserved.
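
    The numerical core, a composite Simpson's rule applied along each axis of a double integral, can be sketched briefly; the study's MATLAB program and its interaction-energy integrands are not reproduced, and the test integrand below is purely illustrative:

        import numpy as np

        def simpson_2d(f, ax, bx, ay, by, nx=50, ny=50):
            # Composite Simpson approximation of the double integral of f
            # over [ax, bx] x [ay, by]; nx and ny must be even.
            x = np.linspace(ax, bx, nx + 1)
            y = np.linspace(ay, by, ny + 1)
            wx = np.ones(nx + 1); wx[1:-1:2] = 4; wx[2:-1:2] = 2
            wy = np.ones(ny + 1); wy[1:-1:2] = 4; wy[2:-1:2] = 2
            fx = f(x[:, None], y[None, :])
            hx, hy = (bx - ax) / nx, (by - ay) / ny
            return (hx / 3) * (hy / 3) * (wx @ fx @ wy)

        # Check against the exact integral of x*y over [0,1]^2, which is 0.25
        print(simpson_2d(lambda x, y: x * y, 0.0, 1.0, 0.0, 1.0))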

  1. An Exploration of a Quantitative Reasoning Instructional Approach to Linear Equations in Two Variables with Community College Students

    ERIC Educational Resources Information Center

    Belue, Paul T.; Cavey, Laurie Overman; Kinzel, Margaret T.

    2017-01-01

    In this exploratory study, we examined the effects of a quantitative reasoning instructional approach to linear equations in two variables on community college students' conceptual understanding, procedural fluency, and reasoning ability. This was done in comparison to the use of a traditional procedural approach for instruction on the same topic.…

  2. Gold nanoparticle-based RT-PCR and real-time quantitative RT-PCR assays for detection of Japanese encephalitis virus

    NASA Astrophysics Data System (ADS)

    Huang, Su-Hua; Yang, Tsuey-Ching; Tsai, Ming-Hong; Tsai, I.-Shou; Lu, Huang-Chih; Chuang, Pei-Hsin; Wan, Lei; Lin, Ying-Ju; Lai, Chih-Ho; Lin, Cheng-Wen

    2008-10-01

    Virus isolation and antibody detection are routinely used for diagnosis of Japanese encephalitis virus (JEV) infection, but the low level of transient viremia in some JE patients makes JEV isolation from clinical and surveillance samples very difficult. We describe the use of gold nanoparticle-based RT-PCR and real-time quantitative RT-PCR assays for detection of JEV from its RNA genome. We tested the effect of gold nanoparticles on four different PCR systems, including conventional PCR, reverse-transcription PCR (RT-PCR), and SYBR green real-time PCR and RT-PCR assays for diagnosis in the acute phase of JEV infection. Gold nanoparticles increased the amplification yield of the PCR product and shortened the PCR time compared to the conventional reaction. In addition, nanogold-based real-time RT-PCR showed a linear relationship between Ct and template amount using ten-fold dilutions of JEV. The nanogold-based RT-PCR and real-time quantitative RT-PCR assays were able to detect low levels (1-10 000 copies) of the JEV RNA genomes extracted from culture medium or whole blood, providing early diagnostic tools for the detection of low-level viremia in the acute-phase infection. The assays described here were simple, sensitive, and rapid approaches for detection and quantitation of JEV in tissue cultured samples as well as clinical samples.
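
    The reported linear Ct-versus-template relationship is what underlies absolute quantitation from a ten-fold dilution series; a minimal sketch with synthetic numbers (the efficiency formula is the standard one, not code from the study):

        import numpy as np

        copies = np.array([1e0, 1e1, 1e2, 1e3, 1e4])    # standard dilutions
        ct = np.array([35.1, 31.8, 28.4, 25.1, 21.7])   # synthetic Ct values

        slope, intercept = np.polyfit(np.log10(copies), ct, 1)
        efficiency = 10 ** (-1.0 / slope) - 1.0         # ~1.0 means 100%

        def copies_from_ct(ct_unknown):
            # Invert the standard curve to estimate template copy number
            return 10 ** ((ct_unknown - intercept) / slope)

        print(f"efficiency ~ {efficiency:.0%}, "
              f"unknown ~ {copies_from_ct(27.0):.0f} copies")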

  3. pH-regulated formation of side products in the reductive amination approach for differential labeling of peptides in relative quantitative experiments.

    PubMed

    Levi Mortera, Stefano; Dioni, Ilaria; Greco, Viviana; Neri, Cristina; Rovero, Paolo; Urbani, Andrea

    2014-05-01

    Among the most common stable-isotope labeling strategies, the reaction of formaldehyde with peptides in the presence of NaCNBH₃ has many attractive features for quantitation experiments in proteomics. Reductive amination with formaldehyde and d(2)-formaldehyde is reported to be a fast, easy, and specific reaction, and it is undoubtedly inexpensive compared with commercially available kits for differential isotope coding. Acetaldehyde and d(4)-acetaldehyde could be employed as well without a substantial increase in cost, and should provide wider spacing between the differentially tagged peptides in the mass spectrum. Nevertheless, only a single paper reports a diethylation approach for quantitation. We undertook a systematic analytical investigation of the reductive amination of several standard peptides, pointing out the occasional occurrence of side reactions depending on pH and the order of reagent addition; in particular, we observed the formation of cyclic adducts ascribable to rearrangements involving the Schiff base generated and the nucleophilic sites in its chemical environment. We also evaluated the extent to which these side products may impair isotope-coded relative quantitation.

  4. A clustering approach to segmenting users of internet-based risk calculators.

    PubMed

    Harle, C A; Downs, J S; Padman, R

    2011-01-01

    Risk calculators are widely available Internet applications that deliver quantitative health risk estimates to consumers. Although these tools are known to have varying effects on risk perceptions, little is known about who is more likely to accept objective risk estimates. The aim was to identify clusters of online health consumers that help explain variation in individual improvement in risk perceptions from web-based quantitative disease risk information. A secondary analysis was performed on data collected in a field experiment that measured people's pre-diabetes risk perceptions before and after visiting a realistic health promotion website that provided quantitative risk information. K-means clustering was performed on numerous candidate variable sets, and the different segmentations were evaluated based on between-cluster variation in risk perception improvement. Variation in responses to risk information was best explained by clustering on pre-intervention absolute pre-diabetes risk perceptions and an objective estimate of personal risk. Members of a high-risk overestimator cluster showed large improvements in their risk perceptions, but clusters of both moderate-risk and high-risk underestimators were much more muted in correcting their optimistically biased perceptions. Cluster analysis provided a unique approach for segmenting health consumers and predicting their acceptance of quantitative disease risk information. These clusters suggest that health consumers were very responsive to good news but tended not to incorporate bad news into their self-perceptions. These findings help to quantify variation among online health consumers and may inform the targeted marketing of and improvements to risk communication tools on the Internet.
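
    A minimal sketch of the clustering step described above, assuming hypothetical perceived-risk and objective-risk variables and using scikit-learn's k-means (the number of clusters is arbitrary here, not the study's segmentation):

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import StandardScaler

        # Hypothetical variables: perceived pre-diabetes risk and an objective risk estimate (0-100)
        rng = np.random.default_rng(0)
        X = np.column_stack([rng.uniform(0, 100, 300),   # pre-intervention perceived risk
                             rng.uniform(0, 100, 300)])  # objective personal risk estimate

        # Standardize, then cluster; k = 3 is chosen only for illustration
        labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(
            StandardScaler().fit_transform(X))

        # Characterize each segment, e.g. to spot "overestimator" vs. "underestimator" clusters
        for k in range(3):
            seg = X[labels == k]
            print(k, f"perceived={seg[:, 0].mean():.0f}", f"objective={seg[:, 1].mean():.0f}")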

  5. Quantitative and qualitative approaches in educational research — problems and examples of controlled understanding through interpretive methods

    NASA Astrophysics Data System (ADS)

    Neumann, Karl

    1987-06-01

    In the methodological discussion of recent years it has become apparent that many research problems, including problems relating to the theory of educational science, cannot be solved using quantitative methods. The multifaceted aspects of human behaviour and all its environment-bound subtle nuances, especially the process of education or the development of identity, cannot fully be taken into account within a rigid neopositivist approach. In employing the paradigm of symbolic interactionism as a suitable model for the analysis of processes of education and formation, research generally has to start from complex reciprocal social interactions rather than from unambiguous causal connections. In analysing several particular methodological problems, the article demonstrates some weaknesses of quantitative approaches and then shows the advantages of, and the necessity for, qualitative research tools.

  6. Computational modeling approaches to quantitative structure-binding kinetics relationships in drug discovery.

    PubMed

    De Benedetti, Pier G; Fanelli, Francesca

    2018-03-21

    Simple comparative correlation analyses and quantitative structure-kinetics relationship (QSKR) models highlight the interplay of kinetic rates and binding affinity as an essential feature in drug design and discovery. The choice of the molecular series, and their structural variations, used in QSKR modeling is fundamental to understanding the mechanistic implications of ligand and/or drug-target binding and/or unbinding processes. Here, we discuss the implications of linear correlations between kinetic rates and binding affinity constants and the relevance of the computational approaches to QSKR modeling. Copyright © 2018 Elsevier Ltd. All rights reserved.

  7. Approaches to developing alternative and predictive toxicology based on PBPK/PD and QSAR modeling.

    PubMed Central

    Yang, R S; Thomas, R S; Gustafson, D L; Campain, J; Benjamin, S A; Verhaar, H J; Mumtaz, M M

    1998-01-01

    Systematic toxicity testing, using conventional toxicology methodologies, of single chemicals and chemical mixtures is highly impractical because of the immense numbers of chemicals and chemical mixtures involved and the limited scientific resources. Therefore, the development of unconventional, efficient, and predictive toxicology methods is imperative. Using carcinogenicity as an end point, we present approaches for developing predictive tools for toxicologic evaluation of chemicals and chemical mixtures relevant to environmental contamination. Central to the approaches presented is the integration of physiologically based pharmacokinetic/pharmacodynamic (PBPK/PD) and quantitative structure-activity relationship (QSAR) modeling with focused, mechanistically based experimental toxicology. In this development, molecular and cellular biomarkers critical to the carcinogenesis process are evaluated quantitatively between different chemicals and/or chemical mixtures. Examples presented include the integration of PBPK/PD and QSAR modeling with a time-course medium-term liver foci assay, molecular biology and cell proliferation studies, Fourier transform infrared spectroscopic analyses of DNA changes, and cancer modeling to assess and attempt to predict the carcinogenicity of the series of 12 chlorobenzene isomers. Also presented is an ongoing effort to develop and apply a similar approach to chemical mixtures using in vitro cell culture (Syrian hamster embryo cell transformation assay and human keratinocytes) methodologies and in vivo studies. The promise and pitfalls of these developments are elaborated. When successfully applied, these approaches may greatly reduce the animal usage, personnel, resources, and time required to evaluate the carcinogenicity of chemicals and chemical mixtures. PMID:9860897

  8. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    USDA-ARS?s Scientific Manuscript database

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and human health effect...

  9. A Multiplexed, Probe-Based Quantitative PCR Assay for DNA of Phytophthora sojae

    USDA-ARS?s Scientific Manuscript database

    Phytophthora sojae (Kaufm. & Gerd.) causes seed rot, pre- and post-emergence damping off, and sometimes foliar blight in soybean (Glycine max). Crop loss may approach 100% with susceptible cultivars. We report here the development of a unique quantitative PCR assay specific to DNA of P. sojae, and a...

  10. A Novel Approach to Teach the Generation of Bioelectrical Potentials from a Descriptive and Quantitative Perspective

    ERIC Educational Resources Information Center

    Rodriguez-Falces, Javier

    2013-01-01

    In electrophysiology studies, it is becoming increasingly common to explain experimental observations using both descriptive methods and quantitative approaches. However, some electrophysiological phenomena, such as the generation of extracellular potentials that results from the propagation of the excitation source along the muscle fiber, are…

  11. An Energy-Based Three-Dimensional Segmentation Approach for the Quantitative Interpretation of Electron Tomograms

    PubMed Central

    Bartesaghi, Alberto; Sapiro, Guillermo; Subramaniam, Sriram

    2006-01-01

    Electron tomography allows for the determination of the three-dimensional structures of cells and tissues at resolutions significantly higher than that which is possible with optical microscopy. Electron tomograms contain, in principle, vast amounts of information on the locations and architectures of large numbers of subcellular assemblies and organelles. The development of reliable quantitative approaches for the analysis of features in tomograms is an important problem, and a challenging prospect due to the low signal-to-noise ratios that are inherent to biological electron microscopic images. This is, in part, a consequence of the tremendous complexity of biological specimens. We report on a new method for the automated segmentation of HIV particles and selected cellular compartments in electron tomograms recorded from fixed, plastic-embedded sections derived from HIV-infected human macrophages. Individual features in the tomogram are segmented using a novel robust algorithm that finds their boundaries as global minimal surfaces in a metric space defined by image features. The optimization is carried out in a transformed spherical domain with the center an interior point of the particle of interest, providing a proper setting for the fast and accurate minimization of the segmentation energy. This method provides tools for the semi-automated detection and statistical evaluation of HIV particles at different stages of assembly in the cells and presents opportunities for correlation with biochemical markers of HIV infection. The segmentation algorithm developed here forms the basis of the automated analysis of electron tomograms and will be especially useful given the rapid increases in the rate of data acquisition. It could also enable studies of much larger data sets, such as those which might be obtained from the tomographic analysis of HIV-infected cells from studies of large populations. PMID:16190467

  12. A qualitative and quantitative assessment for a bone marrow harvest simulator.

    PubMed

    Machado, Liliane S; Moraes, Ronei M

    2009-01-01

    Several approaches to assessment in virtual reality-based training simulators have been proposed. There are two kinds of assessment methods: offline and online. The main requirements for online training assessment methodologies applied to virtual reality systems are low computational complexity and high accuracy. Several approaches for general cases that satisfy these requirements can be found in the literature. A drawback of those approaches is that they handle specific cases unsatisfactorily, as in some medical procedures where both quantitative and qualitative information is available for the assessment. In this paper, we present an approach to online training assessment based on a Modified Naive Bayes classifier that can handle qualitative and quantitative variables simultaneously. A specific medical case was simulated in a bone marrow harvest simulator. The results obtained were satisfactory and evidenced the applicability of the method.
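
    A generic sketch of a naive Bayes classifier that handles quantitative (continuous) and qualitative (categorical) variables simultaneously, in the spirit of, but not identical to, the Modified Naive Bayes described above; all names and data are illustrative:

        import numpy as np

        class MixedNaiveBayes:
            # Gaussian likelihoods for quantitative features, Laplace-smoothed category
            # frequencies for qualitative features: a generic mixed-variable construction.
            def fit(self, Xc, Xd, y):
                self.classes = np.unique(y)
                self.prior, self.mu, self.sd, self.cat = {}, {}, {}, {}
                for c in self.classes:
                    m = (y == c)
                    self.prior[c] = m.mean()
                    self.mu[c] = Xc[m].mean(axis=0)
                    self.sd[c] = Xc[m].std(axis=0) + 1e-9
                    self.cat[c] = [
                        {v: (np.sum(Xd[m, j] == v) + 1.0) / (m.sum() + len(np.unique(Xd[:, j])))
                         for v in np.unique(Xd[:, j])}
                        for j in range(Xd.shape[1])
                    ]
                return self

            def predict(self, xc, xd):
                def log_post(c):
                    gauss = -0.5 * ((xc - self.mu[c]) / self.sd[c]) ** 2 - np.log(self.sd[c])
                    categ = sum(np.log(self.cat[c][j].get(v, 1e-9)) for j, v in enumerate(xd))
                    return np.log(self.prior[c]) + gauss.sum() + categ
                return max(self.classes, key=log_post)

        # Hypothetical trainee assessment: one quantitative score, one qualitative grade
        Xc = np.array([[0.91], [0.84], [0.35], [0.22]])
        Xd = np.array([["good"], ["good"], ["poor"], ["poor"]])
        y = np.array([1, 1, 0, 0])
        print(MixedNaiveBayes().fit(Xc, Xd, y).predict(np.array([0.80]), ["good"]))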

  13. Quantitative Agent Based Model of Opinion Dynamics: Polish Elections of 2015

    PubMed Central

    Sobkowicz, Pawel

    2016-01-01

    We present results of an abstract, agent-based simulation model of opinion dynamics based on the emotion/information/opinion (E/I/O) approach, applied to a strongly polarized society, corresponding to the Polish political scene between 2005 and 2015. Under certain conditions the model leads to metastable coexistence of two subcommunities of comparable size (supporting the corresponding opinions), which corresponds to the bipartisan split found in Poland. Spurred by the recent breakdown of this political duopoly, which occurred in 2015, we present a model extension that describes both the long-term coexistence of the two opposing opinions and a rapid, transitory change due to the appearance of a third-party alternative. We provide a quantitative comparison of the model with the results of polls and elections in Poland, testing the assumptions related to the modeled processes and the parameters used in the simulations. It is shown that when the propaganda messages of the two incumbent parties differ in emotional tone, the political status quo may be unstable. The asymmetry of the emotions within the support bases of the two parties allows one of them to be ‘invaded’ by a newcomer third party very quickly, while the second remains immune to such invasion. PMID:27171226

  14. Extraction efficiency and implications for absolute quantitation of propranolol in mouse brain, liver and kidney thin tissue sections using droplet-based liquid microjunction surface sampling-HPLC ESI-MS/MS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kertesz, Vilmos; Weiskittel, Taylor M.; Vavek, Marissa

    Currently, absolute quantitation aspects of droplet-based surface sampling for thin tissue analysis using a fully automated autosampler/HPLC-ESI-MS/MS system are not fully evaluated. Knowledge of extraction efficiency and its reproducibility is required to judge the potential of the method for absolute quantitation of analytes from thin tissue sections. Methods: Adjacent thin tissue sections of propranolol-dosed mouse brain (10-μm-thick), kidney (10-μm-thick) and liver (8-, 10-, 16-, and 24-μm-thick) were obtained. The absolute concentration of propranolol was determined in tissue punches from serial sections using standard bulk tissue extraction protocols and subsequent HPLC separations and tandem mass spectrometric analysis. These values were used to determine the propranolol extraction efficiency from the tissues with the droplet-based surface sampling approach. Results: Extraction efficiency of propranolol from 10-μm-thick brain, kidney and liver thin tissues using droplet-based surface sampling varied between ~45% and 63%. Extraction efficiency decreased from ~65% to ~36% as liver thickness increased from 8 μm to 24 μm. Randomly selecting half of the samples as standards, the precision and accuracy of the propranolol concentrations obtained for the other half of the samples were determined as quality control metrics. The resulting precision (±15%) and accuracy (±3%) values were within acceptable limits. In conclusion, comparative quantitation of adjacent mouse thin tissue sections of different organs and of various thicknesses by droplet-based surface sampling and by bulk extraction of tissue punches showed that extraction efficiency was incomplete using the former method, and that it depended on the organ and tissue thickness. However, once extraction efficiency was determined and applied, the droplet-based approach provided the required quantitation accuracy and precision for assay validations.
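
    The core comparison above reduces to simple arithmetic: the amount recovered by the droplet method over a known footprint divided by the amount expected from the bulk punch-extraction concentration. A sketch with hypothetical numbers:

        # Hypothetical numbers illustrating the extraction-efficiency comparison described above
        sampled_area_mm2 = 1.0            # footprint of the liquid microjunction droplet
        thickness_um = 10.0
        tissue_density_mg_per_mm3 = 1.05

        # Propranolol per unit tissue from bulk extraction of punches (reference method)
        bulk_conc_ng_per_mg = 2.4
        # Propranolol recovered by the droplet method over the same footprint
        droplet_amount_ng = 0.014

        tissue_mass_mg = sampled_area_mm2 * (thickness_um / 1000.0) * tissue_density_mg_per_mm3
        expected_ng = bulk_conc_ng_per_mg * tissue_mass_mg
        efficiency = droplet_amount_ng / expected_ng
        print(f"extraction efficiency ~ {efficiency:.0%}")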

  16. Quantitative Analysis of Mutant Subclones in Chronic Myeloid Leukemia: Comparison of Different Methodological Approaches

    PubMed Central

    Preuner, Sandra; Barna, Agnes; Frommlet, Florian; Czurda, Stefan; Konstantin, Byrgazov; Alikian, Mary; Machova Polakova, Katerina; Sacha, Tomasz; Richter, Johan; Lion, Thomas; Gabriel, Christian

    2016-01-01

    Identification and quantitative monitoring of mutant BCR-ABL1 subclones displaying resistance to tyrosine kinase inhibitors (TKIs) have become important tasks in patients with Ph-positive leukemias. Different technologies have been established for patient screening. Various next-generation sequencing (NGS) platforms facilitating sensitive detection and quantitative monitoring of mutations in the ABL1-kinase domain (KD) have been introduced recently, and are expected to become the preferred technology in the future. However, broad clinical implementation of NGS methods has been hampered by the limited accessibility at different centers and the current costs of analysis which may not be regarded as readily affordable for routine diagnostic monitoring. It is therefore of interest to determine whether NGS platforms can be adequately substituted by other methodological approaches. We have tested three different techniques including pyrosequencing, LD (ligation-dependent)-PCR and NGS in a series of peripheral blood specimens from chronic myeloid leukemia (CML) patients carrying single or multiple mutations in the BCR-ABL1 KD. The proliferation kinetics of mutant subclones in serial specimens obtained during the course of TKI-treatment revealed similar profiles via all technical approaches, but individual specimens showed statistically significant differences between NGS and the other methods tested. The observations indicate that different approaches to detection and quantification of mutant subclones may be applicable for the monitoring of clonal kinetics, but careful calibration of each method is required for accurate size assessment of mutant subclones at individual time points. PMID:27136541

  18. Prenatal diagnosis of fetuses with increased nuchal translucency using an approach based on quantitative fluorescent polymerase chain reaction and genomic microarray.

    PubMed

    Pan, Min; Han, Jin; Zhen, Li; Yang, Xin; Li, Ru; Liao, Can; Li, Dong-Zhi

    2016-02-01

    To assess the clinical value of prenatal diagnosis of fetuses with increased nuchal translucency (NT) using an approach based on quantitative fluorescent polymerase chain reaction (QF-PCR) and chromosomal microarray (CMA). From January 2013 to October 2014, we included 175 pregnancies with fetal NT ≥ 3.5 mm at 11-13 weeks' gestation in which chorionic villus sampling was performed. QF-PCR was first used to rapidly detect common aneuploidies. The cases with a normal QF-PCR result were analyzed by CMA. Of the 175 cases, common aneuploidies were detected by QF-PCR in 53 (30.2%) cases (30 cases of trisomy 21, 12 cases of monosomy X, 7 cases of trisomy 18, 3 cases of trisomy 13 and 1 case of 47,XXY). Among the 122 cases with a normal QF-PCR result, microarray detected additional pathogenic copy number variants (CNVs) in 5.7% (7/122) of cases. Four cases would have been expected to be detectable by conventional karyotyping because of large deletions/duplications (>10 Mb), leaving three cases (2.5%; 3/118) with pathogenic CNVs only detectable by CMA. It is rational to use a diagnostic strategy in which CMA is preceded by the less expensive, rapid QF-PCR to detect common aneuploidies. CMA allows detection of a number of pathogenic chromosomal aberrations in fetuses with a high NT. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  19. Quantitative real-time PCR approaches for microbial community studies in wastewater treatment systems: applications and considerations.

    PubMed

    Kim, Jaai; Lim, Juntaek; Lee, Changsoo

    2013-12-01

    Quantitative real-time PCR (qPCR) has been widely used in recent environmental microbial ecology studies as a tool for detecting and quantifying microorganisms of interest, which aids in a better understanding of the complexity of wastewater microbial communities. Although qPCR can be used to provide more specific and accurate quantification than other molecular techniques, it does have limitations that must be considered when applying it in practice. This article reviews the principle of qPCR quantification and its applications to microbial ecology studies in various wastewater treatment environments. Here we also address several limitations of qPCR-based approaches that can affect the validity of quantification data: template nucleic acid quality, nucleic acid extraction efficiency, specificity of group-specific primers and probes, amplification of nonviable DNA, gene copy number variation, and the limited number of sequences in the database. Even with such limitations, qPCR is reportedly among the best methods for quantitatively investigating environmental microbial communities. The application of qPCR is and will continue to be increasingly common in studies of wastewater treatment systems. To obtain reliable analyses, however, the limitations that have often been overlooked must be carefully considered when interpreting the results. Copyright © 2013 Elsevier Inc. All rights reserved.

  20. [A novel quantitative approach to study dynamic anaerobic process at micro scale].

    PubMed

    Zhang, Zhong-Liang; Wu, Jing; Jiang, Jian-Kai; Jiang, Jie; Li, Huai-Zhi

    2012-11-01

    Anaerobic digestion is attracting growing interest because of its advantages, such as low cost and the recovery of clean energy. To overcome the drawbacks of existing methods for studying the dynamic anaerobic process, a novel quantitative microscale approach at the granule level was developed, combining microdevice and quantitative image analysis techniques. The experiment displayed the process and characteristics of gas production at static state for the first time, and the results indicated that the method has satisfactory repeatability. The gas production process at static state could be divided into three stages: a rapid linear increasing stage, a decelerated increasing stage, and a slow linear increasing stage. The rapid linear increasing stage was long and the biogas rate was high under a high initial organic loading rate. The results showed that it is feasible to carry out the anaerobic process in the microdevice; furthermore, this novel method is reliable and can clearly display the dynamic process of the anaerobic reaction at the micro scale. The results are helpful for understanding the anaerobic process.

  1. Quantitative biomarkers of colonic dysplasia based on intrinsic second-harmonic generation signal

    NASA Astrophysics Data System (ADS)

    Zhuo, Shuangmu; Zhu, Xiaoqin; Wu, Guizhu; Chen, Jianxin; Xie, Shusen

    2011-12-01

    Most colorectal cancers arise from dysplastic lesions, such as adenomatous polyps, and these lesions are difficult to detect with current endoscopic screening approaches. Here, we present the use of an intrinsic second-harmonic generation (SHG) signal as a novel means to differentiate between normal and dysplastic human colonic tissues. We find that the SHG signal can quantitatively identify collagen change associated with colonic dysplasia that is indiscernible by conventional pathologic techniques. Comparing normal with dysplastic mucosa revealed significant differences in collagen density and collagen fiber direction, which have substantial potential to become quantitative intrinsic biomarkers for the in vivo clinical diagnosis of colonic dysplasia.

  2. Preliminary research on quantitative methods of water resources carrying capacity based on water resources balance sheet

    NASA Astrophysics Data System (ADS)

    Wang, Yanqiu; Huang, Xiaorong; Gao, Linyun; Guo, Biying; Ma, Kai

    2018-06-01

    Water resources are not only basic natural resources but also strategic economic resources and ecological control factors. Water resources carrying capacity constrains the sustainable development of a regional economy and society, and studies of it can provide helpful information about how the socioeconomic system is both supported and restrained by the water resources system. Drawing on the work of different scholars, the major problems in the study of water resources carrying capacity can be summarized as follows: the definition of water resources carrying capacity is not yet unified; quantification methods built on these inconsistent definitions lack operability; current quantitative research methods do not fully reflect the principles of sustainable development; and it is difficult to quantify the relationships among water resources, the economy and society, and the ecological environment. A better quantitative evaluation method for determining regional water resources carrying capacity is therefore needed. This paper proposes a new approach to quantifying water resources carrying capacity, namely the compilation of a water resources balance sheet, in order to capture regional water resources depletion and water environment degradation (as well as regional water resources stock assets and liabilities), to quantify the pressure that socioeconomic activities exert on the environment, and to discuss quantitative calculation methods and a technical route for water resources carrying capacity that embody the substance of sustainable development.

  3. Evaluating Attitudes, Skill, and Performance in a Learning-Enhanced Quantitative Methods Course: A Structural Modeling Approach.

    ERIC Educational Resources Information Center

    Harlow, Lisa L.; Burkholder, Gary J.; Morrow, Jennifer A.

    2002-01-01

    Used a structural modeling approach to evaluate relations among attitudes, initial skills, and performance in a Quantitative Methods course that involved students in active learning. Results largely confirmed hypotheses offering support for educational reform efforts that propose actively involving students in the learning process, especially in…

  4. Integrated genomics and molecular breeding approaches for dissecting the complex quantitative traits in crop plants.

    PubMed

    Kujur, Alice; Saxena, Maneesha S; Bajaj, Deepak; Laxmi; Parida, Swarup K

    2013-12-01

    Enormous population growth, climate change and global warming are now considered major threats to agriculture and the world's food security. To improve the productivity and sustainability of agriculture, the development of high-yielding, durable, abiotic and biotic stress-tolerant cultivars and/or climate-resilient crops is essential. Hence, understanding the molecular mechanisms underlying, and dissecting, complex quantitative yield and stress tolerance traits is the prime objective in current agricultural biotechnology research. In recent years, tremendous progress has been made in plant genomics and molecular breeding research: conventional and next-generation whole-genome, transcriptome and epigenome sequencing efforts, the generation of huge genomic, transcriptomic and epigenomic resources, and the development of modern genomics-assisted breeding approaches in diverse crop genotypes with contrasting yield and abiotic stress tolerance traits. Unfortunately, the detailed molecular mechanisms and gene regulatory networks controlling such complex quantitative traits are not yet well understood in crop plants. Therefore, we propose an integrated strategy involving the enormous and diverse traditional and modern omics (structural, functional, comparative and epigenomic) approaches/resources available and genomics-assisted breeding methods, which agricultural biotechnologists can adopt/utilize to dissect and decode the molecular and gene regulatory networks involved in complex quantitative yield and stress tolerance traits in crop plants. This would provide clues and much-needed inputs for the rapid selection of novel, functionally relevant molecular tags regulating such complex traits, expediting traditional and modern marker-assisted genetic enhancement studies in target crop species for developing high-yielding stress-tolerant varieties.

  5. Quantitative X-ray Map Analyser (Q-XRMA): A new GIS-based statistical approach to Mineral Image Analysis

    NASA Astrophysics Data System (ADS)

    Ortolano, Gaetano; Visalli, Roberto; Godard, Gaston; Cirrincione, Rosolino

    2018-06-01

    We present a new ArcGIS®-based tool developed in the Python programming language for calibrating EDS/WDS X-ray element maps, with the aim of acquiring quantitative information of petrological interest. The calibration procedure is based on a multiple linear regression technique that takes into account interdependence among elements and is constrained by the stoichiometry of minerals. The procedure requires an appropriate number of spot analyses for use as internal standards and provides several test indexes for a rapid check of calibration accuracy. The code is based on an earlier image-processing tool designed primarily for classifying minerals in X-ray element maps; the original Python code has now been enhanced to yield calibrated maps of mineral end-members or the chemical parameters of each classified mineral. The semi-automated procedure can be used to extract a dataset that is automatically stored within queryable tables. As a case study, the software was applied to an amphibolite-facies garnet-bearing micaschist. The calibrated images obtained for both anhydrous (i.e., garnet and plagioclase) and hydrous (i.e., biotite) phases show a good fit with corresponding electron microprobe analyses. This new GIS-based tool package can thus find useful application in petrology and materials science research. Moreover, the huge quantity of data extracted opens new opportunities for the development of a thin-section microchemical database that, using a GIS platform, can be linked with other major global geoscience databases.
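
    A generic sketch of the calibration idea (not the Q-XRMA code itself): a multiple linear regression from raw element count maps to concentration, fitted on internal-standard spot analyses and then applied pixel by pixel; all data below are synthetic:

        import numpy as np

        def calibrate_maps(count_maps, spot_counts, spot_wt_pct):
            # count_maps: (n_elements, H, W) raw X-ray count maps
            # spot_counts: (n_spots, n_elements) counts at the internal-standard spots
            # spot_wt_pct: (n_spots,) measured concentration of the target element at those spots
            # Least-squares fit wt% ~ counts @ beta + b0, capturing inter-element dependence
            A = np.hstack([spot_counts, np.ones((spot_counts.shape[0], 1))])
            coef, *_ = np.linalg.lstsq(A, spot_wt_pct, rcond=None)
            beta, b0 = coef[:-1], coef[-1]
            n, H, W = count_maps.shape
            return (count_maps.reshape(n, -1).T @ beta + b0).reshape(H, W)

        # Hypothetical example: calibrate an Fe map from Fe, Mg, Ca count maps and 12 spot analyses
        rng = np.random.default_rng(1)
        maps = rng.poisson(200, size=(3, 64, 64)).astype(float)
        spots = rng.poisson(200, size=(12, 3)).astype(float)
        fe_wt = 0.05 * spots[:, 0] - 0.003 * spots[:, 1] + 1.0  # synthetic "microprobe" values
        fe_map = calibrate_maps(maps, spots, fe_wt)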

  6. Modular Classification of Endoscopic Endonasal Transsphenoidal Approaches to Sellar Region: Anatomic Quantitative Study.

    PubMed

    Belotti, Francesco; Doglietto, Francesco; Schreiber, Alberto; Ravanelli, Marco; Ferrari, Marco; Lancini, Davide; Rampinelli, Vittorio; Hirtler, Lena; Buffoli, Barbara; Bolzoni Villaret, Andrea; Maroldi, Roberto; Rodella, Luigi Fabrizio; Nicolai, Piero; Fontanella, Marco Maria

    2018-01-01

    Endoscopic visualization does not necessarily correspond to an adequate working space. The need for balancing invasiveness and adequacy of sellar tumor exposure has recently led to the description of multiple endoscopic endonasal transsphenoidal approaches. Comparative anatomic data on these variants are lacking. We sought to quantitatively compare endoscopic endonasal transsphenoidal approaches to the sella and parasellar region, using the concept of a "surgical pyramid." Four endoscopic transsphenoidal approaches were performed in 10 injected specimens: 1) hemisphenoidotomy; 2) transrostral; 3) extended transrostral (with superior turbinectomy); and 4) extended transrostral with posterior ethmoidectomy. ApproachViewer software (part of GTx-Eyes II, University Health Network, Toronto, Canada) with a dedicated navigation system was used to quantify the surgical pyramid volume, as well as exposure of sellar and parasellar areas. Statistical analyses were performed with Friedman's tests and Nemenyi's procedure. Hemisphenoidotomy provided limited exposure of the sellar area and a small working volume. A transrostral approach was necessary to expose the entire sella. Exposure of lateral parasellar areas required superior turbinectomy or posterior ethmoidectomy. The differences between the modules were statistically significant. The present study validates, from an anatomic point of view, a modular classification of endoscopic endonasal transsphenoidal approaches to the sellar region. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. Validation of Greyscale-Based Quantitative Ultrasound in Manual Wheelchair Users

    PubMed Central

    Collinger, Jennifer L.; Fullerton, Bradley; Impink, Bradley G.; Koontz, Alicia M.; Boninger, Michael L.

    2010-01-01

    Objective: The primary aim of this study is to establish the validity of greyscale-based quantitative ultrasound (QUS) measures of the biceps and supraspinatus tendons. Design: Nine QUS measures of the biceps and supraspinatus tendons were computed from ultrasound images collected from sixty-seven manual wheelchair users. Shoulder pathology was measured using questionnaires, physical examination maneuvers, and a clinical ultrasound grading scale. Results: Increased age, duration of wheelchair use, and body mass correlated with a darker, more homogeneous tendon appearance. Subjects with pain during physical examination tests for biceps tenderness and acromioclavicular joint tenderness exhibited significantly different supraspinatus QUS values. Even when controlling for tendon depth, QUS measures of the biceps tendon differed significantly between subjects with healthy tendons, mild tendinosis, and severe tendinosis. Clinical grading of supraspinatus tendon health was correlated with QUS measures of the supraspinatus tendon. Conclusions: Quantitative ultrasound is a valid method to quantify tendinopathy and may allow for early detection of tendinosis. Manual wheelchair users are at high risk for developing shoulder tendon pathology and may benefit from quantitative ultrasound-based research that focuses on identifying interventions designed to reduce this risk. PMID:20407304
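
    Greyscale-based QUS measures of this kind are first-order statistics of the pixels in a tendon region of interest. A minimal sketch with illustrative index names (not the nine measures used in the study):

        import numpy as np
        from scipy import stats

        def qus_measures(roi):
            # First-order greyscale statistics of a tendon region of interest
            px = roi.astype(float).ravel()
            hist = np.histogram(px, bins=32, range=(0, 255))[0] + 1e-12
            return {
                "mean_echogenicity": px.mean(),   # darker tendon -> lower mean
                "variance": px.var(),             # more homogeneous -> lower variance
                "skewness": stats.skew(px),
                "entropy": stats.entropy(hist),
            }

        roi = np.random.default_rng(0).integers(0, 256, (40, 60))
        print(qus_measures(roi))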

  8. Qualitative and Quantitative Approaches to the Study of Poverty: Taming the Tensions and Appreciating the Complementarities

    ERIC Educational Resources Information Center

    Balarabe Kura, Sulaiman Y.

    2012-01-01

    There is a germane relationship between qualitative and quantitative approaches to social science research. The relationship is empirically and theoretically demonstrated by poverty researchers. The study of poverty, as argued in this article, is a study of both numbers and contextualities. This article provides a general overview of qualitative…

  9. Analysis of mathematical literacy ability based on self-efficacy in model eliciting activities using metaphorical thinking approach

    NASA Astrophysics Data System (ADS)

    Setiani, C.; Waluya, S. B.; Wardono

    2018-03-01

    The purposes of this research are: (1) to assess, both qualitatively and quantitatively, the quality of learning in Model Eliciting Activities (MEAs) using a Metaphorical Thinking (MT) approach; and (2) to analyze students' mathematical literacy based on Self-Efficacy (SE). This research is a mixed-method concurrent embedded design with qualitative research as the primary method. The quantitative part used a quasi-experimental, non-equivalent control group design. The population is grade VIII students of SMP Negeri 3 Semarang, Indonesia. Quantitative data are examined with a completeness mean test, a standard completeness test, a mean differentiation test, and a proportional differentiation test; qualitative data are analyzed descriptively. The results show that MEAs learning with the MT approach meets good criteria both quantitatively and qualitatively. Students with low self-efficacy can identify problems but lack the ability to devise problem-solving strategies for mathematical literacy questions. Students with medium self-efficacy can identify the information provided in problems but have difficulty using mathematical symbols to construct representations. Students with high self-efficacy excel at representing problems as mathematical models and figures using appropriate symbols and tools, so they can readily devise strategies to solve mathematical literacy questions.

  10. Liquid chromatography-mass spectrometry-based quantitative proteomics.

    PubMed

    Linscheid, Michael W; Ahrends, Robert; Pieper, Stefan; Kühn, Andreas

    2009-01-01

    During recent decades, the molecular sciences revolutionized biomedical research and gave rise to the biotechnology industry. During the coming decades, the application of the quantitative sciences (informatics, physics, chemistry, and engineering) to biomedical research will bring about the next revolution, improving human healthcare and certainly creating new technologies, since there is no doubt that small changes can have great effects. The question is not "yes" or "no" but "how much" when making the best use of the medical options we will have. In this context, the development of accurate analytical methods must be considered a cornerstone, since understanding biological processes will be impossible without information about the minute changes induced in cells by interactions of cell constituents with all sorts of endogenous and exogenous influences and disturbances. The first quantitative techniques developed allowed monitoring of relative changes only, but they clearly showed the significance of the information obtained. The recent advent of techniques claiming to quantify proteins and peptides not only relative to each other but also in an absolute fashion promises another quantum leap, since knowing absolute amounts allows comparisons even between unrelated species, and the definition of parameters permits modeling biological systems much more accurately than before. To bring these promises to life, several approaches are currently under development, and this review focuses on them.

  11. Confidence estimation for quantitative photoacoustic imaging

    NASA Astrophysics Data System (ADS)

    Gröhl, Janek; Kirchner, Thomas; Maier-Hein, Lena

    2018-02-01

    Quantification of photoacoustic (PA) images is one of the major challenges currently being addressed in PA research. Tissue properties can be quantified by correcting the recorded PA signal with an estimation of the corresponding fluence. Fluence estimation itself, however, is an ill-posed inverse problem which usually needs simplifying assumptions to be solved with state-of-the-art methods. These simplifications, as well as noise and artifacts in PA images reduce the accuracy of quantitative PA imaging (PAI). This reduction in accuracy is often localized to image regions where the assumptions do not hold true. This impedes the reconstruction of functional parameters when averaging over entire regions of interest (ROI). Averaging over a subset of voxels with a high accuracy would lead to an improved estimation of such parameters. To achieve this, we propose a novel approach to the local estimation of confidence in quantitative reconstructions of PA images. It makes use of conditional probability densities to estimate confidence intervals alongside the actual quantification. It encapsulates an estimation of the errors introduced by fluence estimation as well as signal noise. We validate the approach using Monte Carlo generated data in combination with a recently introduced machine learning-based approach to quantitative PAI. Our experiments show at least a two-fold improvement in quantification accuracy when evaluating on voxels with high confidence instead of thresholding signal intensity.
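
    One generic way to realize voxel-wise confidence of this kind, assuming a regressor trained on simulated (Monte Carlo) data, is to learn conditional quantiles and treat a narrow interval as high confidence. The sketch below uses scikit-learn quantile gradient boosting on synthetic stand-in data; it is not the authors' conditional-density model:

        import numpy as np
        from sklearn.ensemble import GradientBoostingRegressor

        # Synthetic stand-in for simulated training data: PA-signal features -> tissue property
        rng = np.random.default_rng(3)
        X = rng.uniform(0, 1, (2000, 5))
        y = X[:, 0] + 0.1 * rng.normal(size=2000)

        # One regressor per quantile; the 5-95% interval width serves as a confidence measure
        q_lo = GradientBoostingRegressor(loss="quantile", alpha=0.05).fit(X, y)
        q_hi = GradientBoostingRegressor(loss="quantile", alpha=0.95).fit(X, y)
        med = GradientBoostingRegressor(loss="quantile", alpha=0.50).fit(X, y)

        X_new = rng.uniform(0, 1, (100, 5))
        width = q_hi.predict(X_new) - q_lo.predict(X_new)
        confident = width < np.quantile(width, 0.5)   # keep only high-confidence voxels
        estimate = med.predict(X_new)[confident]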

  12. Using Integrated Environmental Modeling to Automate a Process-Based Quantitative Microbial Risk Assessment (presentation)

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and...

  13. Quantitative Surface Chirality Detection with Sum Frequency Generation Vibrational Spectroscopy: Twin Polarization Angle Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wei, Feng; Xu, Yanyan; Guo, Yuan

    2009-12-27

    Here we report a novel twin polarization angle (TPA) approach for quantitative chirality detection with surface sum-frequency generation vibrational spectroscopy (SFG-VS). Generally, the achiral contribution dominates the surface SFG-VS signal, and the pure chiral signal is usually two or three orders of magnitude smaller. Therefore, it has been difficult to quantitatively detect and analyze the chiral contributions to the surface SFG-VS signal. In the TPA method, by varying together the polarization angles of the incoming visible light and the sum-frequency signal at fixed s or p polarization of the incoming infrared beam, the polarization-dependent SFG signal can give not only a direct signature of the chiral contribution in the total SFG-VS signal, but also an accurate measurement of the chiral and achiral components in the surface SFG signal. A general description of the TPA method is presented, together with an experimental test of the TPA approach for SFG-VS from the S- and R-limonene chiral liquid surfaces. The most accurate degree-of-chiral-excess values thus obtained for the 2878 cm⁻¹ spectral peak of the S- and R-limonene liquid surfaces are (23.7±0.4)% and (25.4±1.3)%, respectively.

  14. Linking Yeast Gcn5p Catalytic Function and Gene Regulation Using a Quantitative, Graded Dominant Mutant Approach

    PubMed Central

    Lanza, Amanda M.; Blazeck, John J.; Crook, Nathan C.; Alper, Hal S.

    2012-01-01

    Establishing causative links between protein functional domains and global gene regulation is critical for advancements in genetics, biotechnology, disease treatment, and systems biology. This task is challenging for multifunctional proteins when relying on traditional approaches such as gene deletions since they remove all domains simultaneously. Here, we describe a novel approach to extract quantitative, causative links by modulating the expression of a dominant mutant allele to create a function-specific competitive inhibition. Using the yeast histone acetyltransferase Gcn5p as a case study, we demonstrate the utility of this approach and (1) find evidence that Gcn5p is more involved in cell-wide gene repression, instead of the accepted gene activation associated with HATs, (2) identify previously unknown gene targets and interactions for Gcn5p-based acetylation, (3) quantify the strength of some Gcn5p-DNA associations, (4) demonstrate that this approach can be used to correctly identify canonical chromatin modifications, (5) establish the role of acetyltransferase activity on synthetic lethal interactions, and (6) identify new functional classes of genes regulated by Gcn5p acetyltransferase activity—all six of these major conclusions were unattainable by using standard gene knockout studies alone. We recommend that a graded dominant mutant approach be utilized in conjunction with a traditional knockout to study multifunctional proteins and generate higher-resolution data that more accurately probes protein domain function and influence. PMID:22558379

  15. The Advantages and Disadvantages of Using Qualitative and Quantitative Approaches and Methods in Language "Testing and Assessment" Research: A Literature Review

    ERIC Educational Resources Information Center

    Rahman, Md Shidur

    2017-01-01

    The researchers of various disciplines often use qualitative and quantitative research methods and approaches for their studies. Some of these researchers like to be known as qualitative researchers; others like to be regarded as quantitative researchers. Researchers are thus sharply polarised, and they engage in a competition of pointing…

  16. A low cost mobile phone dark-field microscope for nanoparticle-based quantitative studies.

    PubMed

    Sun, Dali; Hu, Tony Y

    2018-01-15

    Dark-field microscope (DFM) analysis of nanoparticle binding signal is highly useful for a variety of research and biomedical applications, but current applications for nanoparticle quantification rely on expensive DFM systems. The cost, size, and limited robustness of these DFMs limit their utility in non-laboratory settings. Most nanoparticle analyses use high-magnification DFM images, which are labor intensive to acquire and subject to operator bias. Low-magnification DFM image capture is faster, but is subject to background from surface artifacts and debris, although image processing can partially compensate for background signal. We thus mated an LED light source, a dark-field condenser and a 20× objective lens with a mobile phone camera to create an inexpensive, portable and robust DFM system suitable for use in non-laboratory conditions. This proof-of-concept mobile DFM device weighs less than 400 g and costs less than $2000, but analysis of images captured with this device reveals nanoparticle quantitation results similar to those acquired with a much larger and more expensive desktop DFM system. Our results suggest that similar devices may be useful for quantification of stable, nanoparticle-based activity and quantitation assays in resource-limited areas where conventional assay approaches are not practical. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Infrared thermography quantitative image processing

    NASA Astrophysics Data System (ADS)

    Skouroliakou, A.; Kalatzis, I.; Kalyvas, N.; Grivas, TB

    2017-11-01

    Infrared thermography is an imaging technique that has the ability to provide a map of the temperature distribution of an object’s surface. It is considered for a wide range of applications in medicine as well as in non-destructive testing procedures. One of its promising medical applications is in orthopaedics and diseases of the musculoskeletal system, where the temperature distribution of the body’s surface can contribute to the diagnosis and follow-up of certain disorders. Although the thermographic image can give a fairly good visual estimation of distribution homogeneity and temperature pattern differences between two symmetric body parts, it is important to extract a quantitative measurement characterising temperature. Certain approaches use the temperature of enantiomorphic anatomical points, or parameters extracted from a Region of Interest (ROI). A number of indices have been developed by researchers to that end. In this study a quantitative approach to thermographic image processing is attempted, based on extracting different indices for symmetric ROIs on thermograms of the lower back area of scoliotic patients. The indices are based on first-order statistical parameters describing temperature distribution. Analysis and comparison of these indices allow the temperature distribution pattern of the back trunk to be evaluated against that expected in subjects who are healthy with respect to spinal problems.
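
    A minimal sketch of such first-order indices, assuming two co-registered symmetric ROIs extracted from the thermogram (index names are illustrative):

        import numpy as np

        def asymmetry_indices(roi_left, roi_right):
            # First-order statistics of two symmetric ROIs and simple asymmetry indices
            l, r = roi_left.astype(float).ravel(), roi_right.astype(float).ravel()
            return {
                "delta_mean_C": abs(l.mean() - r.mean()),   # mean temperature difference
                "delta_std_C": abs(l.std() - r.std()),      # spread (homogeneity) difference
                "delta_max_C": abs(l.max() - r.max()),      # hottest-point difference
            }

        # Hypothetical symmetric lower-back ROIs, in degrees Celsius
        left = np.random.default_rng(1).normal(33.0, 0.4, (50, 50))
        right = np.random.default_rng(2).normal(33.6, 0.5, (50, 50))
        print(asymmetry_indices(left, right))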

  18. Quantitative analysis of the human AKR family members in cancer cell lines using the mTRAQ/MRM approach.

    PubMed

    Zhang, Shenyan; Wen, Bo; Zhou, Baojin; Yang, Lei; Cha, Chao; Xu, Shaoxing; Qiu, Xuemei; Wang, Quanhui; Sun, Haidan; Lou, Xiaomin; Zi, Jin; Zhang, Yong; Lin, Liang; Liu, Siqi

    2013-05-03

    Members of the human aldo-keto reductase (AKR) superfamily have been reported to be involved in cancer progression, but no final conclusion is generally accepted. Herein, we propose a quantitative method to measure human AKR proteins in cells using mTRAQ-based multiple reaction monitoring (MRM). AKR peptides with multiple transitions were carefully selected upon tryptic digestion of the recombinant AKR proteins, while AKR proteins were identified by SDS-PAGE fractionation coupled with LC-MS/MS. Utilizing mTRAQ triplex labeling to produce the derivative peptides, calibration curves were generated with mixed lysate as background, and quantification of AKRs did not differ significantly between the calibration curves prepared with mixed versus single lysate as background. We employed this approach to quantitatively determine the 6 AKR proteins, AKR1A1, AKR1B1, AKR1B10, AKR1C1/C2, AKR1C3, and AKR1C4, in 7 different cancer cell lines and, for the first time, to obtain the absolute quantities of all the AKR proteins in each cell. The cluster plot revealed that AKR1A and AKR1B were widely distributed in most cancer cells with relatively stable abundances, whereas AKR1Cs were unevenly detected among these cells with diverse dynamic abundances. The AKR quantitative distribution in different cancer cells, therefore, may assist further exploration of how AKR proteins are involved in tumorigenesis.

  19. Estimating background-subtracted fluorescence transients in calcium imaging experiments: a quantitative approach.

    PubMed

    Joucla, Sébastien; Franconville, Romain; Pippow, Andreas; Kloppenburg, Peter; Pouzat, Christophe

    2013-08-01

    Calcium imaging has become a routine technique in neuroscience for subcellular to network level investigations. The fast progress in the development of new indicators and imaging techniques calls for dedicated, reliable analysis methods. In particular, efficient and quantitative background fluorescence subtraction routines would benefit most of the calcium imaging research field. A method for estimating background-subtracted fluorescence transients that does not require any independent background measurement is therefore developed. This method is based on a fluorescence model fitted to single-trial data using a classical nonlinear regression approach. The model includes an appropriate probabilistic description of the acquisition system's noise, leading to accurate confidence intervals on all quantities of interest (background fluorescence, normalized background-subtracted fluorescence time course) when background fluorescence is homogeneous. An automatic procedure detecting background inhomogeneities inside the region of interest is also developed and is shown to be efficient on simulated data. The implementation and performance of the proposed method on experimental recordings from the mouse hypothalamus are presented in detail. This method, which applies to recordings of both single cells and bulk-stained tissues, should help improve the statistical comparison of fluorescence calcium signals between experiments and studies. Copyright © 2013 Elsevier Ltd. All rights reserved.
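
    A generic sketch of the fitting step, assuming a hypothetical single-transient model and signal-dependent (photon-like) noise, using scipy's nonlinear least squares; the authors' exact model and noise description differ:

        import numpy as np
        from scipy.optimize import curve_fit

        def model(t, F_bg, dF, tau_on, tau_off, t0):
            # Background fluorescence plus a hypothetical single-transient term
            dt = np.clip(t - t0, 0, None)
            return F_bg + dF * (1.0 - np.exp(-dt / tau_on)) * np.exp(-dt / tau_off)

        t = np.linspace(0, 10, 200)
        rng = np.random.default_rng(2)
        f_true = model(t, 100.0, 30.0, 0.3, 2.0, 2.0)
        # Variance grows with signal, mimicking a photon-counting acquisition system
        f_obs = rng.normal(f_true, np.sqrt(f_true))

        p, cov = curve_fit(model, t, f_obs, p0=[90.0, 20.0, 0.5, 1.5, 1.8],
                           sigma=np.sqrt(np.maximum(f_obs, 1.0)), absolute_sigma=True)
        F_bg = p[0]
        dff = (f_obs - F_bg) / F_bg     # background-subtracted, normalized transient
        se = np.sqrt(np.diag(cov))      # standard errors give the confidence intervals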

  20. Microfluidics-based digital quantitative PCR for single-cell small RNA quantification.

    PubMed

    Yu, Tian; Tang, Chong; Zhang, Ying; Zhang, Ruirui; Yan, Wei

    2017-09-01

    Quantitative analyses of small RNAs at the single-cell level have been challenging because of limited sensitivity and specificity of conventional real-time quantitative PCR methods. A digital quantitative PCR (dqPCR) method for miRNA quantification has been developed, but it requires the use of proprietary stem-loop primers and only applies to miRNA quantification. Here, we report a microfluidics-based dqPCR (mdqPCR) method, which takes advantage of the Fluidigm BioMark HD system for both template partition and the subsequent high-throughput dqPCR. Our mdqPCR method demonstrated excellent sensitivity and reproducibility suitable for quantitative analyses of not only miRNAs but also all other small RNA species at the single-cell level. Using this method, we discovered that each sperm has a unique miRNA profile. © The Authors 2017. Published by Oxford University Press on behalf of Society for the Study of Reproduction. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
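
    Although the abstract does not give the quantification formula, partition-based digital PCR conventionally converts the fraction of positive partitions into an absolute concentration via Poisson statistics; a sketch under that standard assumption, with hypothetical counts:

        import math

        def dqpcr_copies(n_positive, n_total, partition_volume_nl):
            # Poisson estimate of the mean copies per partition from the positive fraction
            p = n_positive / n_total
            lam = -math.log(1.0 - p)
            copies_per_ul = lam / (partition_volume_nl * 1e-3)
            return lam * n_total, copies_per_ul   # total copies loaded, concentration

        # Hypothetical run: 312 of 770 partitions positive, 0.85 nL per partition
        total, conc = dqpcr_copies(n_positive=312, n_total=770, partition_volume_nl=0.85)
        print(f"{total:.0f} copies loaded, {conc:.0f} copies/uL")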

  1. Toward best practices in data processing and analysis for intact biotherapeutics by MS in quantitative bioanalysis.

    PubMed

    Kellie, John F; Kehler, Jonathan R; Karlinsey, Molly Z; Summerfield, Scott G

    2017-12-01

    Typically, quantitation of biotherapeutics from biological matrices by LC-MS is based on a surrogate peptide approach to determine molecule concentration. Recent efforts have focused on quantitation of the intact protein molecules or larger mass subunits of monoclonal antibodies. To date, there has been limited guidance for large or intact protein mass quantitation in quantitative bioanalysis. Intact- and subunit-level analyses of biotherapeutics from biological matrices are performed in the 12-25 kDa mass range, with quantitation data presented. Linearity, bias, and other metrics are presented, along with recommendations on the viability of existing quantitation approaches. This communication is intended to start a discussion around intact protein data analysis and processing, recognizing that other published contributions will be required.

  2. Quantitative Approaches to Group Research: Suggestions for Best Practices

    ERIC Educational Resources Information Center

    McCarthy, Christopher J.; Whittaker, Tiffany A.; Boyle, Lauren H.; Eyal, Maytal

    2017-01-01

    Rigorous scholarship is essential to the continued growth of group work, yet the unique nature of this counseling specialty poses challenges for quantitative researchers. The purpose of this proposal is to provide an overview of the unique challenges of quantitative research with groups in the counseling field, including the difficulty of obtaining large sample sizes…

  3. Design of cinnamaldehyde amino acid Schiff base compounds based on the quantitative structure–activity relationship

    Treesearch

    Hui Wang; Mingyue Jiang; Shujun Li; Chung-Yun Hse; Chunde Jin; Fangli Sun; Zhuo Li

    2017-01-01

    Cinnamaldehyde amino acid Schiff base (CAAS) is a new class of safe, bioactive compounds which could be developed as potential antifungal agents for fungal infections. To design new cinnamaldehyde amino acid Schiff base compounds with high bioactivity, the quantitative structure–activity relationships (QSARs) for CAAS compounds against Aspergillus niger (A. niger) and...

  4. The FAQUIRE Approach: FAst, QUantitative, hIghly Resolved and sEnsitivity Enhanced 1H, 13C Data.

    PubMed

    Farjon, Jonathan; Milande, Clément; Martineau, Estelle; Akoka, Serge; Giraudeau, Patrick

    2018-02-06

    The targeted analysis of metabolites in complex mixtures is a challenging issue. NMR is one of the major tools in this field, but there is a strong need for more sensitive, better-resolved, and faster quantitative methods. In this framework, we introduce the concept of FAst, QUantitative, hIghly Resolved and sEnsitivity enhanced (FAQUIRE) NMR to push forward the limits of metabolite NMR analysis. Quantitative 2D 1H-13C maps are promising alternatives for enhancing spectral resolution but are highly time-consuming because of (i) the intrinsic nature of 2D acquisition, (ii) the longer recycling times required for quantitative conditions, and (iii) the higher number of scans needed to lower the limits of detection/quantification and access low-concentration metabolites. To reach this aim, speeding up the recently developed QUantItative Perfected and pUre shifted HSQC (QUIPU HSQC) is a natural way to realize the FAQUIRE concept. Thanks to the combination of spectral aliasing, nonuniform sampling, and variable repetition times, the acquisition time of 2D quantitative maps is reduced by a factor of 6 to 9, while preserving high spectral resolution through a pure-shift approach. The analytical potential of the new Quick QUIPU HSQC (Q QUIPU HSQC) is evaluated on a model metabolite sample, and its potential is demonstrated on breast-cell extracts containing metabolites at millimolar to submillimolar concentrations.

  5. Mapping Quantitative Traits in Unselected Families: Algorithms and Examples

    PubMed Central

    Dupuis, Josée; Shi, Jianxin; Manning, Alisa K.; Benjamin, Emelia J.; Meigs, James B.; Cupples, L. Adrienne; Siegmund, David

    2009-01-01

    Linkage analysis has been widely used to identify, from family data, genetic variants influencing quantitative traits. Common approaches have both strengths and limitations. Likelihood ratio tests typically computed in variance component analysis can accommodate large families but are highly sensitive to departures from normality assumptions. Regression-based approaches are more robust, but their use has primarily been restricted to nuclear families. In this paper, we develop methods for mapping quantitative traits in moderately large pedigrees. Our methods are based on the score statistic, which, in contrast to the likelihood ratio statistic, can use nonparametric estimators of variability to achieve robustness of the false-positive rate against departures from the hypothesized phenotypic model. Because the score statistic is easier to calculate than the likelihood ratio statistic, our basic mapping methods utilize relatively simple computer code that performs statistical analysis on output from any program that computes estimates of identity-by-descent. This simplicity also permits development and evaluation of methods to deal with multivariate and ordinal phenotypes, and with gene-gene and gene-environment interaction. We demonstrate our methods on simulated data and on fasting insulin, a quantitative trait measured in the Framingham Heart Study. PMID:19278016

  6. Quantitative evaluation methods of skin condition based on texture feature parameters.

    PubMed

    Pang, Hui; Chen, Tianhua; Wang, Xiaoyi; Chang, Zhineng; Shao, Siqi; Zhao, Jing

    2017-03-01

    In order to quantitatively evaluate the improvement of skin condition after the use of skin care products and beauty treatments, a quantitative evaluation method for skin surface state and texture is presented, which is convenient, fast and non-destructive. Human skin images were collected by image sensors. First, a median filter with a 3 × 3 window is applied, and the locations of hair pixels on the skin are accurately detected according to the gray mean value and color information. Bilinear interpolation is then used to modify the gray values of the hair pixels in order to eliminate the negative effect of noise and tiny hairs on the texture. After this pretreatment, the gray level co-occurrence matrix (GLCM) is calculated. On this basis, four characteristic parameters, including the second moment, contrast, entropy and correlation, and their mean values are calculated at 45° intervals. A quantitative evaluation model of skin texture based on the GLCM is established, which can calculate comprehensive parameters of skin condition. Experiments show that evaluations made with this method agree both with biochemical skin-evaluation indicators and with human visual assessment. The method overcomes the shortcomings of biochemical evaluation (skin damage and long waiting times) as well as the subjectivity and fuzziness of visual evaluation, achieving non-destructive, rapid and quantitative evaluation of skin condition. It can be used for health assessment or classification of skin condition, and can quantitatively evaluate subtle improvements after the use of skin care products or beauty treatments.
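
    The texture measures named above are standard GLCM statistics and can be reproduced with scikit-image (`graycomatrix`/`graycoprops`; spelled `greycomatrix` in releases before 0.19). Entropy is not built in, so it is computed from the normalized matrix directly; the "skin" image below is random stand-in data.

    ```python
    # GLCM texture features (second moment, contrast, correlation, entropy)
    # averaged over the four 45-degree orientations.
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    rng = np.random.default_rng(0)
    skin = rng.integers(0, 256, size=(128, 128), dtype=np.uint8)  # placeholder

    angles = [0, np.pi / 4, np.pi / 2, 3 * np.pi / 4]  # 45-degree intervals
    glcm = graycomatrix(skin, distances=[1], angles=angles,
                        levels=256, symmetric=True, normed=True)

    asm = graycoprops(glcm, "ASM").mean()              # angular second moment
    contrast = graycoprops(glcm, "contrast").mean()
    correlation = graycoprops(glcm, "correlation").mean()

    p = glcm[:, :, 0, :]                               # (levels, levels, angles)
    logp = np.log2(p, where=p > 0, out=np.zeros_like(p))
    entropy = (-(p * logp).sum(axis=(0, 1))).mean()

    print(f"ASM={asm:.4f} contrast={contrast:.1f} "
          f"correlation={correlation:.3f} entropy={entropy:.2f}")
    ```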

  7. Quantitative evaluation of the matrix effect in bioanalytical methods based on LC-MS: A comparison of two approaches.

    PubMed

    Rudzki, Piotr J; Gniazdowska, Elżbieta; Buś-Kwaśnik, Katarzyna

    2018-06-05

    Liquid chromatography coupled to mass spectrometry (LC-MS) is a powerful tool for studying pharmacokinetics and toxicokinetics. Reliable bioanalysis requires the characterization of the matrix effect, i.e. the influence of endogenous or exogenous compounds on the analyte signal intensity. We have compared two methods for the quantitation of the matrix effect. The CVs(%) of internal standard normalized matrix factors recommended by the European Medicines Agency were evaluated against internal standard normalized relative matrix effects derived from Matuszewski et al. (2003). Both methods use post-extraction spiked samples, but matrix factors also require neat solutions. We have tested both approaches using analytes of diverse chemical structures. The study did not reveal relevant differences in the results obtained with the two calculation methods. After normalization with the internal standard, the CV(%) of the matrix factor was on average 0.5% higher than the corresponding relative matrix effect. The method adopted by the European Medicines Agency seems to be slightly more conservative in the analyzed datasets. Nine analytes of different structures enabled a general overview of the problem; still, further studies are encouraged to confirm our observations. Copyright © 2018 Elsevier B.V. All rights reserved.
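
    The two quantities being compared reduce to simple ratios of peak areas. A sketch of the EMA-style calculation with illustrative numbers (six plasma lots, one analyte, one internal standard):

    ```python
    # IS-normalized matrix factors and their CV% across plasma lots.
    import numpy as np

    analyte_matrix = np.array([98500, 101200, 95400, 99800, 97300, 102100])
    analyte_neat = 100000.0
    is_matrix = np.array([51200, 52800, 49700, 51900, 50600, 53000])
    is_neat = 52000.0

    mf_analyte = analyte_matrix / analyte_neat     # matrix factor, analyte
    mf_is = is_matrix / is_neat                    # matrix factor, IS
    is_norm_mf = mf_analyte / mf_is                # IS-normalized MF

    cv = is_norm_mf.std(ddof=1) / is_norm_mf.mean() * 100.0
    print(f"IS-normalized MF CV = {cv:.1f}% (EMA acceptance: <= 15%)")
    ```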

  8. Quantitative Outline-based Shape Analysis and Classification of Planetary Craterforms using Supervised Learning Models

    NASA Astrophysics Data System (ADS)

    Slezak, Thomas Joseph; Radebaugh, Jani; Christiansen, Eric

    2017-10-01

    The shapes of craterforms on planetary surfaces provide rich information about their origins and evolution. While morphology offers rich visual clues to geologic processes and properties, this information is harder to communicate quantitatively. This study examines the morphology of craterforms using the quantitative outline-based shape methods of geometric morphometrics, commonly used in biology and paleontology. We examine and compare landforms on planetary surfaces using shape, a property of morphology that is invariant to translation, rotation, and size. We quantify the shapes of paterae on Io, martian calderas, terrestrial basaltic shield calderas, terrestrial ash-flow calderas, and lunar impact craters using elliptic Fourier analysis (EFA) and the Zahn and Roskies (Z-R) shape function, or tangent angle approach, to produce multivariate shape descriptors. These shape descriptors are subjected to multivariate statistical analysis including canonical variate analysis (CVA), a multiple-comparison variant of discriminant analysis, to investigate the link between craterform shape and classification. Paterae on Io are most similar in shape to terrestrial ash-flow calderas, and the shapes of terrestrial basaltic shield volcanoes are most similar to martian calderas. The shapes of lunar impact craters, including simple, transitional, and complex morphologies, are classified with a 100% rate of success in all models. Multiple CVA models effectively predict and classify different craterforms using shape-based identification and demonstrate significant potential for use in the analysis of planetary surfaces.
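
    The idea of translation-, rotation-, and size-invariant outline descriptors can be illustrated with the simplest relative of EFA: complex Fourier descriptors of a closed outline. This is not the paper's EFA or Z-R implementation, just the same principle in miniature.

    ```python
    # Complex Fourier shape descriptors: drop the DC term (translation),
    # scale by the first harmonic (size), keep magnitudes (rotation and
    # starting point). A circle and an ellipse give distinct descriptors.
    import numpy as np

    def shape_descriptor(x, y, k=8):
        z = x + 1j * y                    # outline as a complex signal
        c = np.fft.fft(z) / len(z)
        c[0] = 0.0                        # translation invariance
        c = c / np.abs(c[1])              # scale invariance
        return np.abs(np.concatenate([c[1:k + 1], c[-k:]]))

    theta = np.linspace(0, 2 * np.pi, 256, endpoint=False)
    circle = shape_descriptor(np.cos(theta), np.sin(theta))
    ellipse = shape_descriptor(2 * np.cos(theta), np.sin(theta))
    print(f"circle vs ellipse distance: {np.linalg.norm(circle - ellipse):.3f}")
    ```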

  9. The scientific learning approach using multimedia-based maze game to improve learning outcomes

    NASA Astrophysics Data System (ADS)

    Setiawan, Wawan; Hafitriani, Sarah; Prabawa, Harsa Wara

    2016-02-01

    The objective of Curriculum 2013 is to improve the quality of education in Indonesia, which in turn means improving the quality of learning. The scientific approach supported by empowering media is one of the approaches promoted by Curriculum 2013. This research aims to design maze-game-based multimedia and apply it within the scientific learning approach. The study was conducted in a vocational school, in computer networking subjects, with two classes (experimental and control). The method used was Mixed Methods Research (MMR), combining a qualitative strand for the multimedia design and a quantitative strand for the study of learning impact. A survey showed that vocational students generally like the network topology material (68%), like multimedia (74%), and particularly like interactive multimedia such as games and Flash (84%). The multimedia-based maze game achieved good eligibility ratings on the media and material aspects, with values of 84% and 82%, respectively. Student learning outcomes under the scientific approach with the multimedia-based maze game increased, with an average gain index of 0.58 (58%), higher than conventional multimedia with an average gain index of 0.41 (41%). Based on these results, the scientific learning approach using a multimedia-based maze game can improve the quality of learning and increase students' understanding. The maze-game-based learning multimedia also received a positive response from students, with a good qualification level (75%).
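
    The "gain index" reported above is the normalized (Hake) gain, g = (post - pre) / (max - pre). A two-line illustration with made-up scores on a 100-point test:

    ```python
    # Normalized gain: fraction of the possible improvement actually achieved.
    pre, post, max_score = 55.0, 81.0, 100.0
    g = (post - pre) / (max_score - pre)
    print(f"normalized gain g = {g:.2f}")   # 0.58, i.e. 58% as quoted above
    ```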

  10. Multiplexed MRM-Based Protein Quantitation Using Two Different Stable Isotope-Labeled Peptide Isotopologues for Calibration.

    PubMed

    LeBlanc, André; Michaud, Sarah A; Percy, Andrew J; Hardie, Darryl B; Yang, Juncong; Sinclair, Nicholas J; Proudfoot, Jillaine I; Pistawka, Adam; Smith, Derek S; Borchers, Christoph H

    2017-07-07

    When quantifying endogenous plasma proteins for fundamental and biomedical research - as well as for clinical applications - precise, reproducible, and robust assays are required. Targeted detection of peptides in a bottom-up strategy is the most common and precise mass spectrometry-based quantitation approach when combined with the use of stable isotope-labeled peptides. However, when measuring protein in plasma, the unknown endogenous levels prevent the implementation of the best calibration strategies, since no blank matrix is available. Consequently, several alternative calibration strategies are employed by different laboratories. In this study, these methods were compared to a new approach using two different stable isotope-labeled standard (SIS) peptide isotopologues for each endogenous peptide to be quantified, enabling an external calibration curve as well as the quality control samples to be prepared in pooled human plasma without interference from endogenous peptides. This strategy improves the analytical performance of the assay and enables the accuracy of the assay to be monitored, which can also facilitate method development and validation.

  11. PLS-based quantitative structure-activity relationship for substituted benzamides of clebopride type. Application of experimental design in drug design.

    PubMed

    Norinder, U; Högberg, T

    1992-04-01

    The advantageous approach of using an experimentally designed training set as the basis for establishing a quantitative structure-activity relationship with good predictive capability is described. The training set was selected from a fractional factorial design scheme based on a principal component description of physico-chemical parameters of aromatic substituents. The derived model successfully predicts the activities of additional substituted benzamides of 6-methoxy-N-(4-piperidyl)salicylamide type. The major influence on activity of the 3-substituent is demonstrated.
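
    A minimal sketch of this kind of PLS-based QSAR workflow: rows are compounds described by substituent parameters, y is activity, and a two-component PLS model fitted on a designed training set predicts new candidates. The data here are random placeholders, not the paper's descriptors.

    ```python
    # Two-component PLS regression as used in classical QSAR.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(16, 6))   # 16 designed compounds, 6 descriptors
    w = np.array([0.8, -0.5, 0.3, 0.0, 0.1, -0.2])
    y_train = X_train @ w + rng.normal(0, 0.1, 16)

    pls = PLSRegression(n_components=2).fit(X_train, y_train)
    X_new = rng.normal(size=(3, 6))      # candidate substituted benzamides
    print(pls.predict(X_new).ravel())    # predicted activities
    ```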

  12. Automated model-based quantitative analysis of phantoms with spherical inserts in FDG PET scans.

    PubMed

    Ulrich, Ethan J; Sunderland, John J; Smith, Brian J; Mohiuddin, Imran; Parkhurst, Jessica; Plichta, Kristin A; Buatti, John M; Beichel, Reinhard R

    2018-01-01

    Quality control plays an increasingly important role in quantitative PET imaging and is typically performed using phantoms. The purpose of this work was to develop and validate a fully automated analysis method for two common PET/CT quality assurance phantoms: the NEMA NU-2 IQ and SNMMI/CTN oncology phantom. The algorithm was designed to only utilize the PET scan to enable the analysis of phantoms with thin-walled inserts. We introduce a model-based method for automated analysis of phantoms with spherical inserts. Models are first constructed for each type of phantom to be analyzed. A robust insert detection algorithm uses the model to locate all inserts inside the phantom. First, candidates for inserts are detected using a scale-space detection approach. Second, candidates are given an initial label using a score-based optimization algorithm. Third, a robust model fitting step aligns the phantom model to the initial labeling and fixes incorrect labels. Finally, the detected insert locations are refined and measurements are taken for each insert and several background regions. In addition, an approach for automated selection of NEMA and CTN phantom models is presented. The method was evaluated on a diverse set of 15 NEMA and 20 CTN phantom PET/CT scans. NEMA phantoms were filled with radioactive tracer solution at 9.7:1 activity ratio over background, and CTN phantoms were filled with 4:1 and 2:1 activity ratio over background. For quantitative evaluation, an independent reference standard was generated by two experts using PET/CT scans of the phantoms. In addition, the automated approach was compared against manual analysis, which represents the current clinical standard approach, of the PET phantom scans by four experts. The automated analysis method successfully detected and measured all inserts in all test phantom scans. It is a deterministic algorithm (zero variability), and the insert detection RMS error (i.e., bias) was 0.97, 1.12, and 1.48 mm for phantom
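
    The scale-space candidate-detection step has a standard concrete counterpart: Laplacian-of-Gaussian blob detection. The sketch below runs it in 2D with scikit-image on a synthetic slice containing two bright "inserts"; the actual method works on the 3D PET volume and adds model fitting and labeling on top.

    ```python
    # LoG blob detection: candidates across scales, radius ~ sigma * sqrt(2).
    import numpy as np
    from skimage.feature import blob_log

    img = np.zeros((128, 128))
    yy, xx = np.mgrid[0:128, 0:128]
    img[(yy - 40) ** 2 + (xx - 40) ** 2 < 10 ** 2] = 1.0  # radius-10 insert
    img[(yy - 90) ** 2 + (xx - 80) ** 2 < 5 ** 2] = 1.0   # radius-5 insert

    blobs = blob_log(img, min_sigma=2, max_sigma=12, num_sigma=11,
                     threshold=0.1)
    for y, x, s in blobs:
        print(f"candidate at ({x:.0f}, {y:.0f}), radius ~ {s * 2 ** 0.5:.1f} px")
    ```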

  13. Comparative and Quantitative Global Proteomics Approaches: An Overview

    PubMed Central

    Deracinois, Barbara; Flahaut, Christophe; Duban-Deweer, Sophie; Karamanos, Yannis

    2013-01-01

    Proteomics has become a key tool for the study of biological systems. The comparison between two different physiological states allows unravelling of the cellular and molecular mechanisms involved in a biological process. Proteomics can confirm the presence of proteins suggested by their mRNA content and provides a direct measure of the quantity present in a cell. Global and targeted proteomics strategies can be applied. Targeted proteomics strategies limit the number of features that will be monitored and then optimise the methods to obtain the highest sensitivity and throughput for a huge number of samples. The advantage of global proteomics strategies is that no hypothesis is required, other than a measurable difference in one or more protein species between the samples. Global proteomics methods attempt to separate, quantify and identify all the proteins from a given sample. This review highlights only the different techniques for the separation and quantification of proteins and peptides, in view of comparative and quantitative global proteomics analysis. The in-gel and off-gel quantification of proteins is discussed, as well as the corresponding mass spectrometry technology. The overview focuses on the widespread techniques, while keeping in mind that each approach is modular and often overlaps with the others. PMID:28250403

  14. Quantitative evaluation of muscle synergy models: a single-trial task decoding approach

    PubMed Central

    Delis, Ioannis; Berret, Bastien; Pozzo, Thierry; Panzeri, Stefano

    2013-01-01

    Muscle synergies, i.e., invariant coordinated activations of groups of muscles, have been proposed as building blocks that the central nervous system (CNS) uses to construct the patterns of muscle activity utilized for executing movements. Several efficient dimensionality reduction algorithms that extract putative synergies from electromyographic (EMG) signals have been developed. Typically, the quality of synergy decompositions is assessed by computing the Variance Accounted For (VAF). Yet, little is known about the extent to which the combination of those synergies encodes task-discriminating variations of muscle activity in individual trials. To address this question, here we conceive and develop a novel computational framework to evaluate muscle synergy decompositions in task space. Unlike previous methods considering the total variance of muscle patterns (VAF-based metrics), our approach focuses on variance that discriminates the execution of different tasks. The procedure is based on single-trial task decoding from muscle synergy activation features. The task-decoding-based metric quantitatively evaluates the mapping between synergy recruitment and task identification and automatically determines the minimal number of synergies that captures all the task-discriminating variability in the synergy activations. In this paper, we first validate the method on plausibly simulated EMG datasets. We then show that it can be applied to different types of muscle synergy decomposition and illustrate its applicability to real data by using it for the analysis of EMG recordings during an arm pointing task. We find that time-varying and synchronous synergies with similar numbers of parameters are equally efficient in task decoding, suggesting that in this experimental paradigm they are equally valid representations of muscle synergies. Overall, these findings stress the effectiveness of the decoding metric in systematically assessing muscle synergy decompositions in task space. PMID
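
    The pipeline this framework evaluates can be sketched end to end: extract putative synergies from non-negative EMG envelopes with NMF, decode task identity from single-trial synergy activations, and use decoding accuracy, rather than VAF, to pick the number of synergies. The data shapes and the LDA decoder are assumptions for illustration.

    ```python
    # Synergy extraction (NMF) + single-trial task decoding (LDA).
    import numpy as np
    from sklearn.decomposition import NMF
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_trials, n_muscles, n_tasks = 120, 12, 4
    tasks = np.repeat(np.arange(n_tasks), n_trials // n_tasks)
    task_templates = rng.random((n_tasks, n_muscles))
    emg = rng.random((n_trials, n_muscles)) + task_templates[tasks]  # >= 0

    for n_syn in (2, 3, 4, 5):
        act = NMF(n_components=n_syn, init="nndsvda", max_iter=500,
                  random_state=0).fit_transform(emg)   # trial-wise activations
        acc = cross_val_score(LinearDiscriminantAnalysis(), act, tasks,
                              cv=5).mean()
        print(f"{n_syn} synergies: decoding accuracy {acc:.2f}")
    ```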

  15. Continuous stacking computational approach based automated microscope slide scanner

    NASA Astrophysics Data System (ADS)

    Murali, Swetha; Adhikari, Jayesh Vasudeva; Jagannadh, Veerendra Kalyan; Gorthi, Sai Siva

    2018-02-01

    Cost-effective and automated acquisition of whole slide images is a bottleneck for wide-scale deployment of digital pathology. In this article, a computation augmented approach for the development of an automated microscope slide scanner is presented. The realization of a prototype device built using inexpensive off-the-shelf optical components and motors is detailed. The applicability of the developed prototype to clinical diagnostic testing is demonstrated by generating good quality digital images of malaria-infected blood smears. Further, the acquired slide images have been processed to identify and count the number of malaria-infected red blood cells and thereby perform quantitative parasitemia level estimation. The presented prototype would enable cost-effective deployment of slide-based cyto-diagnostic testing in endemic areas.

  16. A quantitative approach to painting styles

    NASA Astrophysics Data System (ADS)

    Vieira, Vilson; Fabbri, Renato; Sbrissa, David; da Fontoura Costa, Luciano; Travieso, Gonzalo

    2015-01-01

    This research extends a method previously applied to music and philosophy (Vilson Vieira et al., 2012), representing the evolution of art as a time series in which relations like dialectics are measured quantitatively. A corpus of paintings by 12 well-known artists from baroque and modern art is analyzed. A set of 99 features is extracted, and the features that contributed most to the classification of painters are selected. The projection space obtained provides the basis for the analysis of the measurements. These quantitative measures underlie revealing observations about the evolution of painting styles, especially when compared with other fields of the humanities already analyzed: while music evolved along a master-apprentice tradition (high dialectics) and philosophy by opposition, painting presents another pattern: constantly increasing skewness, low opposition between members of the same movement, and opposition peaks in the transitions between movements. Differences between the baroque and modern movements are also observed in the projected "painting space": while the baroque paintings form an overlapped cluster, the modern paintings overlap less and are spread more widely in the projection than their baroque counterparts. This finding suggests that baroque painters shared aesthetics, while modern painters tend to "break rules" and develop their own styles.

  17. Using Psychometric Technology in Educational Assessment: The Case of a Schema-Based Isomorphic Approach to the Automatic Generation of Quantitative Reasoning Items

    ERIC Educational Resources Information Center

    Arendasy, Martin; Sommer, Markus

    2007-01-01

    This article deals with the investigation of the psychometric quality and construct validity of algebra word problems generated by means of a schema-based version of the automatic min-max approach. Based on a review of the research literature on algebra word problem solving and automatic item generation, this new approach is introduced as a…

  18. Quantitative detection of bovine and porcine gelatin difference using surface plasmon resonance based biosensor

    NASA Astrophysics Data System (ADS)

    Wardani, Devy P.; Arifin, Muhammad; Suharyadi, Edi; Abraha, Kamsul

    2015-05-01

    Gelatin is a biopolymer derived from collagen that is widely used in food and pharmaceutical products. Because of religious restrictions and health concerns regarding the consumption of gelatin extracted from certain species, a robust, reliable, sensitive and simple quantitative method to distinguish gelatins from different parent collagen species is needed. To the best of our knowledge, no optical-sensor-based method has previously been able to quantitatively differentiate gelatin from different species. Surface plasmon resonance (SPR)-based biosensing is a sensitive, simple and label-free optical method for detecting biomaterials that is capable of quantitative detection. We therefore used an SPR-based biosensor to differentiate between bovine and porcine gelatin at various concentrations, from 0% to 10% (w/w). Here, we report the sensor's ability to distinguish the two gelatins, its sensitivity to changes in gelatin concentration, its reliability, and its limit of detection (LOD) and limit of quantification (LOQ). The sensor's LOD and LOQ are 0.38% and 1.26% (w/w) for bovine gelatin, and 0.66% and 2.20% (w/w) for porcine gelatin, respectively. The results show that the SPR-based biosensor is a promising tool for quantitatively detecting gelatin from different raw materials.
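
    LOD and LOQ figures like those quoted are commonly derived from the calibration curve as LOD = 3.3σ/S and LOQ = 10σ/S, where S is the slope of the response-versus-concentration line and σ the residual (or blank) standard deviation. A sketch with invented calibration points, not the paper's data:

    ```python
    # LOD/LOQ from a linear SPR calibration curve.
    import numpy as np

    conc = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])       # % (w/w) gelatin
    resp = np.array([0.02, 0.41, 0.83, 1.20, 1.58, 2.01])  # SPR angle shift

    slope, intercept = np.polyfit(conc, resp, 1)
    resid = resp - (slope * conc + intercept)
    sigma = resid.std(ddof=2)              # residual standard deviation

    print(f"LOD = {3.3 * sigma / slope:.2f}%  LOQ = {10 * sigma / slope:.2f}%")
    ```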

  19. A quantitative approach to developing Parkinsonian monkeys (Macaca fascicularis) with intracerebroventricular 1-methyl-4-phenylpyridinium injections.

    PubMed

    Li, Hao; Lei, Xiaoguang; Huang, Baihui; Rizak, Joshua D; Yang, Lichuan; Yang, Shangchuan; Wu, Jing; Lü, Longbao; Wang, Jianhong; Yan, Ting; Li, Hongwei; Wang, Zhengbo; Hu, Yingzhou; Le, Weidong; Deng, Xingli; Li, Jiali; Xu, Lin; Zhang, Baorong; Hu, Xintian

    2015-08-15

    Non-human primate Parkinson's disease (PD) models are essential for PD research. The most extensively used PD monkey models are induced with 1-methyl-4-phenyl-1,2,3,6-tetrahydropyridine (MPTP). However, the modeling process cannot be quantitatively controlled with MPTP. A new approach to quantitatively develop chronic PD monkey models would therefore help advance the goals of "reduction, replacement and refinement" in animal experiments. A novel chronic PD monkey model is reported using the intracerebroventricular administration of 1-methyl-4-phenylpyridinium (MPP(+)) in Cynomolgus monkeys (Macaca fascicularis). This approach produced stable and consistent PD monkeys with typical motor symptoms and pathological changes. More importantly, a sigmoidal relationship (Y = 8.15801·e^(-0.245/X); R = 0.73) was discovered between the PD score (Y) and the cumulative dose of MPP(+) (X). This relationship was then used to develop two additional PD monkeys on a specific time schedule (4 weeks), with a planned PD score of 7, by controlling the dose and frequency of MPP(+) administration, as an independent validation of the formula. Developing Parkinsonian monkeys within controlled time frames by regulating the cumulative dose of intracerebroventricularly administered MPP(+), while limiting the side effects often seen in models based on peripheral MPTP administration, makes this model highly suitable for treatment development. This novel approach provides an edge in evaluating the mechanisms of PD pathology associated with environmental toxins and novel treatment approaches, as the fitted formula provides a "map" to control and predict the modeling process. Copyright © 2015 Elsevier B.V. All rights reserved.
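
    Inverting the reported dose-response relation gives the planning "map" directly: X = -0.245 / ln(Y / 8.15801). A one-function sketch (units follow the paper's dose scale, which is not restated in the abstract):

    ```python
    # Cumulative MPP+ dose required for a planned PD score, from the
    # fitted relation Y = 8.15801 * exp(-0.245 / X).
    import math

    def cumulative_dose(target_score, a=8.15801, b=0.245):
        return -b / math.log(target_score / a)

    print(f"dose for planned PD score 7: {cumulative_dose(7.0):.2f} "
          "(paper's dose units)")
    ```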

  20. Simultaneous fingerprint, quantitative analysis and anti-oxidative based screening of components in Rhizoma Smilacis Glabrae using liquid chromatography coupled with Charged Aerosol and Coulometric array Detection.

    PubMed

    Yang, Guang; Zhao, Xin; Wen, Jun; Zhou, Tingting; Fan, Guorong

    2017-04-01

    An analytical approach including fingerprint analysis, quantitative analysis and rapid screening of anti-oxidative components was established and successfully applied to the comprehensive quality control of Rhizoma Smilacis Glabrae (RSG), a well-known Traditional Chinese Medicine with medicine-food homology. Thirteen components were tentatively identified based on their retention behavior, UV absorption and MS fragmentation patterns. Chemometric analysis based on coulometric array data was performed to evaluate the similarity and variation between fifteen batches. Eight discriminating components were quantified using single-compound calibration. The unit responses of these components in coulometric array detection were calculated and compared with those of several compounds reported to possess antioxidant activity, and four of them were tentatively identified as the main contributors to the total anti-oxidative activity. The main advantage of the proposed approach is that it realizes simultaneous fingerprint, quantitative analysis and screening of anti-oxidative components, providing comprehensive information for the quality assessment of RSG. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Quantitative and qualitative analysis of the working area obtained by endoscope and microscope in pterional and orbitozigomatic approach to the basilar artery bifurcation using computed tomography based frameless stereotaxy: A cadaver study

    PubMed Central

    Filipce, Venko; Ammirati, Mario

    2015-01-01

    Objective: Basilar aneurysms are one of the most complex and challenging pathologies for neurosurgeons to treat. Endoscopy is a recently rediscovered neurosurgical technique that could lend itself well to overcoming some of the vascular visualization challenges associated with this pathology. The purpose of this study was to quantify and compare the basilar artery (BA) bifurcation (tip of the basilar) working area afforded by the microscope and the endoscope using different approaches and image guidance. Materials and Methods: We performed a total of 9 dissections, including pterional (PT) and orbitozygomatic (OZ) approaches bilaterally in five whole, fresh cadaver heads. We used computed tomography-based image guidance for intraoperative navigation as well as for quantitative measurements. We estimated the working area of the tip of the basilar, using both a rigid endoscope and an operating microscope. Operability was qualitatively assessed by the senior authors. Results: In microscopic exposure, the OZ approach provided a greater working area (160 ± 34.3 mm2) compared to the PT approach (129.8 ± 37.6 mm2) (P > 0.05). The working area in both PT and OZ approaches using 0° and 30° endoscopes was larger than the one available using the microscope alone (P < 0.05). In the PT approach, both 0° and 30° endoscopes provided a working area greater than a microscopic OZ approach (P < 0.05) and an area comparable to the OZ endoscopic approach (P > 0.05). Conclusion: Integration of the endoscope and microscope in both PT and OZ approaches can provide significantly greater surgical exposure of the BA bifurcation compared to that afforded by the conventional approaches alone. PMID:25972933

  2. Simulation-based model checking approach to cell fate specification during Caenorhabditis elegans vulval development by hybrid functional Petri net with extension.

    PubMed

    Li, Chen; Nagasaki, Masao; Ueno, Kazuko; Miyano, Satoru

    2009-04-27

    Model checking approaches were first applied to biological pathway validation around 2003. Recently, Fisher et al. demonstrated the importance of the model checking approach by inferring new regulation of signaling crosstalk in C. elegans and confirming the regulation with biological experiments. They took a discrete, state-based approach to explore all possible states of the system underlying vulval precursor cell (VPC) fate specification, checking for desired properties. However, since both discrete and continuous features appear to be indispensable parts of biological processes, it is more appropriate to use quantitative models to capture the dynamics of biological systems. The key motivation of this paper is to establish a quantitative methodology for modeling and analyzing in silico models incorporating the model checking approach. A novel method of modeling and simulating biological systems with model checking is proposed, based on hybrid functional Petri nets with extension (HFPNe) as the framework dealing with both discrete and continuous events. Firstly, we construct a quantitative VPC fate model with 1761 components using HFPNe. Secondly, we apply two major biological fate-determination rules - Rule I and Rule II - to the VPC fate model. We then conduct 10,000 simulations for each of 48 sets of different genotypes, investigate variations of cell fate patterns under each genotype, and validate the two rules by comparing three simulation targets consisting of fate patterns obtained from in silico and in vivo experiments. In particular, an evaluation was successfully performed using our VPC fate model to investigate one target derived from biological experiments involving hybrid lineage observations. Such hybrid lineages are hard to capture with a discrete model because they occur when the system comes close to certain thresholds, as discussed by Sternberg and Horvitz in 1986. Our simulation results suggest that: Rule I

  3. Quantitative Decision Support Requires Quantitative User Guidance

    NASA Astrophysics Data System (ADS)

    Smith, L. A.

    2009-12-01

    Is it conceivable that models run on 2007 computer hardware could provide robust and credible probabilistic information for decision support and user guidance at the ZIP code level for sub-daily meteorological events in 2060? In 2090? Retrospectively, how informative would output from today’s models have proven in 2003? Or in the 1930s? Consultancies in the United Kingdom, including the Met Office, are offering services to “future-proof” their customers from climate change. How is a US or European based user or policy maker to determine the extent to which exciting new Bayesian methods are relevant here? Or when a commercial supplier is vastly overselling the insights of today’s climate science? How are policy makers and academic economists to make the closely related decisions facing them? How can we communicate deep uncertainty in the future at small length scales without undermining the firm foundation established by climate science regarding global trends? Three distinct aspects of the communication of the uses of climate model output, targeting users and policy makers as well as other specialist adaptation scientists, are discussed. First, a brief scientific evaluation of the length and time scales at which climate model output is likely to become uninformative is provided, including a note on the applicability of the latest Bayesian methodology to output from current state-of-the-art general circulation models. Second, a critical evaluation of the language often employed in the communication of climate model output: language which accurately states that models are “better”, have “improved”, and now “include” and “simulate” relevant meteorological processes, without clearly identifying where the current information is thought to be uninformative or misleading, both for the current climate and as a function of the state of each climate simulation. And third, a general approach for evaluating the relevance of quantitative climate model output

  4. Management Approaches to Stomal and Peristomal Complications: A Narrative Descriptive Study.

    PubMed

    Beitz, Janice M; Colwell, Janice C

    2016-01-01

    The purpose of this study was to identify optimal interventions for selected complications based on WOC nurse experts' judgment/expertise. A cross-sectional quantitative descriptive design with qualitative, narrative-type components was used for this study. Following validation rating of appropriateness of interventions and quantitative rankings of first-, second-, and third-line approaches, participants provided substantive handwritten narrative comments about listed interventions. Comments were organized and prioritized using frequency count. Narrative comments reflected the quantitative rankings of efficacy of approaches. Clinicians offered further specific suggestions regarding product use and progression of care for selected complications. Narrative analysis using descriptive quantitative frequency count supported the rankings of most preferred treatments of selected stomal and peristomal complications. Findings add to the previous research on prioritized approaches and evidence-based practice in ostomy care.

  5. A quantitative risk-based model for reasoning over critical system properties

    NASA Technical Reports Server (NTRS)

    Feather, M. S.

    2002-01-01

    This position paper suggests the use of a quantitative risk-based model to help support reasoning and decision making that spans many critical properties, such as security, safety, survivability, fault tolerance, and real-time behavior.

  6. Optimization of Robust HPLC Method for Quantitation of Ambroxol Hydrochloride and Roxithromycin Using a DoE Approach.

    PubMed

    Patel, Rashmin B; Patel, Nilay M; Patel, Mrunali R; Solanki, Ajay B

    2017-03-01

    The aim of this work was to develop and optimize a robust HPLC method for the separation and quantitation of ambroxol hydrochloride and roxithromycin utilizing a Design of Experiments (DoE) approach. A Plackett-Burman design was used to assess the impact of the independent variables (concentration of organic phase, mobile phase pH, flow rate and column temperature) on peak resolution, USP tailing and number of plates. A central composite design was utilized to evaluate the main, interaction, and quadratic effects of the independent variables on the selected dependent variables. The optimized HPLC method was validated based on the ICH Q2(R1) guideline and was used to separate and quantify ambroxol hydrochloride and roxithromycin in tablet formulations. The findings showed that the DoE approach could be effectively applied to optimize a robust HPLC method for quantification of ambroxol hydrochloride and roxithromycin in tablet formulations. Statistical comparison between the results of the proposed and reported HPLC methods revealed no significant difference, indicating the suitability of the proposed HPLC method for the analysis of ambroxol hydrochloride and roxithromycin in pharmaceutical formulations. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
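
    The two designs named above are easy to write down for the four coded factors (organic-phase concentration, pH, flow rate, temperature). The sketch below builds a half-fraction factorial as a stand-in for the Plackett-Burman screening step and a face-centered central composite design (alpha = 1) for response-surface modelling; pyDOE-style libraries generate equivalent matrices.

    ```python
    # Screening and response-surface design matrices in coded units (-1..+1).
    import itertools
    import numpy as np

    k = 4
    corners = np.array(list(itertools.product([-1, 1], repeat=k)))
    half_fraction = corners[corners.prod(axis=1) == 1]  # defining relation I = ABCD

    alpha = 1.0                                         # face-centered CCD
    axial = np.vstack([v * alpha * np.eye(k)[i]
                       for i in range(k) for v in (-1, 1)])
    center = np.zeros((3, k))                           # 3 center replicates
    ccd = np.vstack([corners, axial, center])

    print(half_fraction.shape, ccd.shape)  # (8, 4) screening, (27, 4) CCD runs
    ```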

  7. Primary Prevention With Statins: ACC/AHA Risk-Based Approach Versus Trial-Based Approaches to Guide Statin Therapy.

    PubMed

    Mortensen, Martin B; Afzal, Shoaib; Nordestgaard, Børge G; Falk, Erling

    2015-12-22

    Guidelines recommend initiating primary prevention for atherosclerotic cardiovascular disease (ASCVD) with statins based on absolute ASCVD risk assessment. Recently, alternative trial-based and hybrid approaches were suggested for statin treatment eligibility. This study compared these approaches in a direct head-to-head fashion in a contemporary population. The study used the CGPS (Copenhagen General Population Study) with 37,892 subjects aged 40 to 75 years recruited in 2003 to 2008, all free of ASCVD, diabetes, and statin use at baseline. Among the population studied, 42% were eligible for statin therapy according to the 2013 American College of Cardiology/American Heart Association (ACC/AHA) risk assessment and cholesterol treatment guidelines approach, versus 56% with the trial-based approach and 21% with the hybrid approach. Among these statin-eligible subjects, the ASCVD event rate per 1,000 person-years was 9.8, 6.8, and 11.2, respectively. The ACC/AHA-recommended absolute risk score was well calibrated around the 7.5% 10-year ASCVD risk treatment threshold and discriminated better than the trial-based or hybrid approaches. Compared with the ACC/AHA risk-based approach, the net reclassification index for eligibility for statin therapy among 40- to 75-year-old subjects from the CGPS was -0.21 for the trial-based approach and -0.13 for the hybrid approach. The clinical performance of the ACC/AHA risk-based approach for primary prevention of ASCVD with statins was superior to the trial-based and hybrid approaches. Our results indicate that the ACC/AHA guidelines will prevent more ASCVD events than the trial-based and hybrid approaches, while treating fewer people compared with the trial-based approach. Copyright © 2015 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.

  8. Evaluating the Effectiveness of Remedial Reading Courses at Community Colleges: A Quantitative Study

    ERIC Educational Resources Information Center

    Lavonier, Nicole

    2014-01-01

    The present study evaluated the effectiveness of two instructional approaches for remedial reading courses at a community college. The instructional approaches were strategic reading and traditional, textbook-based instruction. The two research questions that guided the quantitative, quasi-experimental study were: (a) what is the effect of…

  9. A quantitative approach to assessing the efficacy of occupant protection programs: A case study from Montana.

    PubMed

    Manlove, Kezia; Stanley, Laura; Peck, Alyssa

    2015-10-01

    Quantitative evaluation of vehicle occupant protection programs is critical for ensuring efficient government resource allocation, but few methods exist for conducting evaluation across multiple programs simultaneously. Here we present an analysis of occupant protection efficacy in the state of Montana. This approach relies on seat belt compliance rates as measured by the National Occupant Protection Usage Survey (NOPUS). A hierarchical logistic regression model is used to estimate the impacts of four Montana Department of Transportation (MDT)-funded occupant protection programs used in the state of Montana, following adjustment for a suite of potential confounders. Activity from two programs, Buckle Up coalitions and media campaigns, is associated with increased seat belt use in Montana, whereas the impact of another program, Selective Traffic Enforcement, is potentially masked by other program activity. A final program, Driver's Education, is not associated with any shift in seat belt use. This method allows for a preliminary quantitative estimation of program impacts without requiring states to obtain any new seat belt use data, and it provides states with a preliminary look at program impacts and a means for carefully planning future program allocation and investigation. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. A New Approach for Quantitative Evaluation of Ultrasonic Wave Attenuation in Composites

    NASA Astrophysics Data System (ADS)

    Ni, Qing-Qing; Li, Ran; Xia, Hong

    2017-02-01

    When ultrasonic waves propagate in composite materials, the propagation behavior results from the combined effects of various factors, such as material anisotropy and viscoelasticity, internal microstructure and defects, incident-wave characteristics, and the interface conditions between composite components. It is essential to clarify how these factors affect the ultrasonic wave propagation and attenuation characteristics, and how they interact with one another. In the present paper, based on a newly developed time-domain finite element analysis code, PZflex, a unique approach for clarifying the detailed influence of the aforementioned factors is proposed, in which each attenuation component can be extracted from the overall attenuation and analyzed separately. By taking into account the interrelations between the individual attenuation components, the variation of each component and the internal dynamic stress distribution with material anisotropy and matrix viscosity are separately and quantitatively evaluated. The detailed analysis of each attenuation component shows that energy dissipation at the interface is a major contributor to ultrasonic wave attenuation, accounting for up to 68.2% of the overall attenuation, and that each attenuation component is closely related to the material anisotropy and viscoelasticity. The results clarify the correlation between ultrasonic wave propagation characteristics and material viscoelastic properties, which will be useful in the further development of ultrasonic technology for defect detection.

  11. Conditional Toxicity Value (CTV) Predictor: An In Silico Approach for Generating Quantitative Risk Estimates for Chemicals.

    PubMed

    Wignall, Jessica A; Muratov, Eugene; Sedykh, Alexander; Guyton, Kathryn Z; Tropsha, Alexander; Rusyn, Ivan; Chiu, Weihsueh A

    2018-05-01

    Human health assessments synthesize human, animal, and mechanistic data to produce toxicity values that are key inputs to risk-based decision making. Traditional assessments are data-, time-, and resource-intensive, and they cannot be developed for most environmental chemicals owing to a lack of appropriate data. As recommended by the National Research Council, we propose a solution for predicting toxicity values for data-poor chemicals through development of quantitative structure-activity relationship (QSAR) models. We used a comprehensive database of chemicals with existing regulatory toxicity values from U.S. federal and state agencies to develop quantitative QSAR models. We compared QSAR-based model predictions to those based on high-throughput screening (HTS) assays. QSAR models for noncancer threshold-based values and cancer slope factors had cross-validation-based Q² of 0.25-0.45, mean model errors of 0.70-1.11 log10 units, and applicability domains covering >80% of environmental chemicals. Toxicity values predicted from QSAR models developed in this study were more accurate and precise than those based on HTS assays or mean-based predictions. A publicly accessible web interface to make predictions for any chemical of interest is available at http://toxvalue.org. An in silico tool that can predict toxicity values with an uncertainty of an order of magnitude or less can be used to quickly and quantitatively assess risks of environmental chemicals when traditional toxicity data or human health assessments are unavailable. This tool can fill a critical gap in the risk assessment and management of data-poor chemicals. https://doi.org/10.1289/EHP2998.
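
    The Q² quoted above is the cross-validated coefficient of determination, Q² = 1 - PRESS/TSS over held-out predictions. A generic sketch with a random-forest stand-in on placeholder descriptors (the study's actual QSAR machinery is not reproduced here):

    ```python
    # Cross-validated Q^2 for a toy QSAR regressor.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 30))                           # descriptors
    y = X[:, 0] - 0.5 * X[:, 1] + rng.normal(0, 0.5, 200)    # log10 value

    pred = cross_val_predict(RandomForestRegressor(random_state=0), X, y, cv=5)
    q2 = 1 - ((y - pred) ** 2).sum() / ((y - y.mean()) ** 2).sum()
    print(f"Q^2 = {q2:.2f}")
    ```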

  12. Quantitative Residual Strain Analyses on Strain Hardened Nickel Based Alloy

    NASA Astrophysics Data System (ADS)

    Yonezawa, Toshio; Maeguchi, Takaharu; Goto, Toru; Juan, Hou

    Many papers have reported on the effects of strain hardening by cold rolling, grinding, welding, etc. on the stress corrosion cracking susceptibility of nickel-based alloys and austenitic stainless steels for LWR piping and components. However, the residual strain introduced by cold rolling, grinding, welding, etc. has not been quantitatively evaluated.

  13. Approaches to acceptable risk

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whipple, C

    Several alternative approaches to address the question "How safe is safe enough?" are reviewed, and an attempt is made to apply the reasoning behind these approaches to the issue of acceptability of radiation exposures received in space. The approaches to the issue of the acceptability of technological risk described here are primarily analytical, and are drawn from examples in the management of environmental health risks. These include risk-based approaches, in which specific quantitative risk targets determine the acceptability of an activity, and cost-benefit and decision analysis, which generally focus on the estimation and evaluation of risks, benefits and costs, in a framework that balances these factors against each other. These analytical methods tend by their quantitative nature to emphasize the magnitude of risks, costs and alternatives, and to downplay other factors, especially those that are not easily expressed in quantitative terms, that affect acceptance or rejection of risk. Such other factors include the issues of risk perceptions and how and by whom risk decisions are made.

  14. Semi-automatic classification of glaciovolcanic landforms: An object-based mapping approach based on geomorphometry

    NASA Astrophysics Data System (ADS)

    Pedersen, G. B. M.

    2016-02-01

    A new object-oriented approach is developed to classify glaciovolcanic landforms (Procedure A) and their landform elements boundaries (Procedure B). It utilizes the principle that glaciovolcanic edifices are geomorphometrically distinct from lava shields and plains (Pedersen and Grosse, 2014), and the approach is tested on data from Reykjanes Peninsula, Iceland. The outlined procedures utilize slope and profile curvature attribute maps (20 m/pixel) and the classified results are evaluated quantitatively through error matrix maps (Procedure A) and visual inspection (Procedure B). In procedure A, the highest obtained accuracy is 94.1%, but even simple mapping procedures provide good results (> 90% accuracy). Successful classification of glaciovolcanic landform element boundaries (Procedure B) is also achieved and this technique has the potential to delineate the transition from intraglacial to subaerial volcanic activity in orthographic view. This object-oriented approach based on geomorphometry overcomes issues with vegetation cover, which has been typically problematic for classification schemes utilizing spectral data. Furthermore, it handles complex edifice outlines well and is easily incorporated into a GIS environment, where results can be edited or fused with other mapping results. The approach outlined here is designed to map glaciovolcanic edifices within the Icelandic neovolcanic zone but may also be applied to similar subaerial or submarine volcanic settings, where steep volcanic edifices are surrounded by flat plains.

  15. IT: An Effective Pedagogic Tool in the Teaching of Quantitative Methods in Management.

    ERIC Educational Resources Information Center

    Nadkami, Sanjay M.

    1998-01-01

    Examines the possibility of supplementing conventional pedagogic methods with information technology-based teaching aids in the instruction of quantitative methods to undergraduate students. Considers the case for a problem-based learning approach, and discusses the role of information technology. (Author/LRW)

  16. The ACCE method: an approach for obtaining quantitative or qualitative estimates of residual confounding that includes unmeasured confounding

    PubMed Central

    Smith, Eric G.

    2015-01-01

    Background: Nonrandomized studies typically cannot account for confounding from unmeasured factors. Method: A method is presented that exploits the recently identified phenomenon of “confounding amplification” to produce, in principle, a quantitative estimate of total residual confounding resulting from both measured and unmeasured factors. Two nested propensity score models are constructed that differ only in the deliberate introduction of an additional variable(s) that substantially predicts treatment exposure. Residual confounding is then estimated by dividing the change in treatment effect estimate between models by the degree of confounding amplification estimated to occur, adjusting for any association between the additional variable(s) and outcome. Results: Several hypothetical examples are provided to illustrate how the method produces a quantitative estimate of residual confounding if the method’s requirements and assumptions are met. Previously published data are used to illustrate that, whether or not the method routinely provides precise quantitative estimates of residual confounding, it appears to produce a valuable qualitative estimate of the likely direction and general size of residual confounding. Limitations: Uncertainties exist, including identifying the best approaches for: 1) predicting the amount of confounding amplification, 2) minimizing changes between the nested models unrelated to confounding amplification, 3) adjusting for the association of the introduced variable(s) with outcome, and 4) deriving confidence intervals for the method’s estimates (although bootstrapping is one plausible approach). Conclusions: To this author’s knowledge, it has not been previously suggested that the phenomenon of confounding amplification, if such amplification is as predictable as suggested by a recent simulation, provides a logical basis for estimating total residual confounding. The method's basic approach is

  17. A Quantitative and Novel Approach to the Prioritization of Zoonotic Diseases in North America: A Public Perspective

    PubMed Central

    Ng, Victoria; Sargeant, Jan M.

    2012-01-01

    Background Zoonoses account for over half of all communicable diseases causing illness in humans. As there are limited resources available for the control and prevention of zoonotic diseases, a framework for their prioritization is necessary to ensure resources are directed into those of highest importance. Although zoonotic outbreaks are a significant burden of disease in North America, the systematic prioritization of zoonoses in this region has not been previously evaluated. Methodology/Principal Findings This study describes the novel use of a well-established quantitative method, conjoint analysis (CA), to identify the relative importance of 21 key characteristics of zoonotic diseases that can be used for their prioritization in Canada and the US. Relative importance weights from the CA were used to develop a point-scoring system to derive a recommended list of zoonoses for prioritization in Canada and the US. Over 1,500 participants from the general public were recruited to complete the online survey (761 from Canada and 778 from the US). Hierarchical Bayes models were fitted to the survey data to derive CA-weighted scores. Scores were applied to 62 zoonotic diseases of public health importance in Canada and the US to rank diseases in order of priority. Conclusions/Significance This was the first study to describe a systematic and quantitative approach to the prioritization of zoonoses in North America involving public participants. We found individuals with no prior knowledge or experience in prioritizing zoonoses were capable of producing meaningful results using CA as a novel quantitative approach to prioritization. More similarities than differences were observed between countries suggesting general agreement in disease prioritization between Canadians and Americans. We demonstrate CA as a potential tool for the prioritization of zoonoses; other prioritization exercises may also consider this approach. PMID:23133639

  18. Facial asymmetry quantitative evaluation in oculoauriculovertebral spectrum.

    PubMed

    Manara, Renzo; Schifano, Giovanni; Brotto, Davide; Mardari, Rodica; Ghiselli, Sara; Gerunda, Antonio; Ghirotto, Cristina; Fusetti, Stefano; Piacentile, Katherine; Scienza, Renato; Ermani, Mario; Martini, Alessandro

    2016-03-01

    Facial asymmetries in oculoauriculovertebral spectrum (OAVS) patients might require surgical corrections that are mostly based on a qualitative approach and the surgeon's experience. The present study aimed to develop a quantitative 3D CT imaging-based procedure suitable for maxillo-facial surgery planning in OAVS patients. Thirteen OAVS patients (mean age 3.5 ± 4.0 years; range 0.2-14.2, 6 females) and 13 controls (mean age 7.1 ± 5.3 years; range 0.6-15.7, 5 females) who underwent head CT examination were retrospectively enrolled. Eight bilateral anatomical facial landmarks were defined on 3D CT images (porion, orbitale, most anterior point of frontozygomatic suture, most superior point of temporozygomatic suture, most posterior-lateral point of the maxilla, gonion, condylion, mental foramen), and the distance from orthogonal planes (in millimeters) was used to evaluate the asymmetry on each axis and to calculate a global asymmetry index for each anatomical landmark. Mean asymmetry values and the corresponding confidence intervals were obtained from the control group. OAVS patients showed 2.5 ± 1.8 landmarks above the confidence interval when considering the global asymmetry values; 12 patients (92%) showed at least one pathologically asymmetric landmark. Considering each axis, the mean number of pathologically asymmetric landmarks increased to 5.5 ± 2.6 (p = 0.002), and all patients presented at least one significant landmark asymmetry. Modern CT-based 3D reconstructions allow accurate assessment of facial bone asymmetries in patients affected by OAVS. The evaluation as a global score and along different orthogonal axes provides precise quantitative data suitable for maxillo-facial surgical planning. CT-based 3D reconstruction might thus allow a quantitative approach for planning and following up maxillo-facial surgery in OAVS patients.

  19. Biomarkers and Surrogate Endpoints in Uveitis: The Impact of Quantitative Imaging.

    PubMed

    Denniston, Alastair K; Keane, Pearse A; Srivastava, Sunil K

    2017-05-01

    Uveitis is a major cause of sight loss across the world. The reliable assessment of intraocular inflammation in uveitis ('disease activity') is essential in order to score disease severity and response to treatment. In this review, we describe how 'quantitative imaging', the approach of using automated analysis and measurement algorithms across both standard and emerging imaging modalities, can be used to develop objective, instrument-based measures of disease activity. This is a narrative review based on searches of the current world literature using terms related to quantitative imaging techniques in uveitis, supplemented by clinical trial registry data, and expert knowledge of surrogate endpoints and outcome measures in ophthalmology. Current measures of disease activity are largely based on subjective clinical estimation, and are relatively insensitive, with poor discrimination and reliability. The development of quantitative imaging in uveitis is most established in the use of optical coherence tomographic (OCT) measurement of central macular thickness (CMT) to measure severity of macular edema (ME). The transformative effect of CMT in the clinical assessment of patients with ME provides a paradigm for the development and impact of other forms of quantitative imaging. Quantitative imaging approaches are now being developed and validated for other key inflammatory parameters such as anterior chamber cells, vitreous haze, retinovascular leakage, and chorioretinal infiltrates. As new forms of quantitative imaging in uveitis are proposed, the uveitis community will need to evaluate these tools against the current subjective clinical estimates and reach a new consensus for how disease activity in uveitis should be measured. The development, validation, and adoption of sensitive and discriminatory measures of disease activity is an unmet need that has the potential to transform both drug development and routine clinical care for the patient with uveitis.

  20. A Statistical Framework for Protein Quantitation in Bottom-Up MS-Based Proteomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karpievitch, Yuliya; Stanley, Jeffrey R.; Taverner, Thomas

    2009-08-15

    Motivation: Quantitative mass spectrometry-based proteomics requires protein-level estimates and associated confidence measures. Challenges include the presence of low quality or incorrectly identified peptides and informative missingness. Furthermore, models are required for rolling peptide-level information up to the protein level. Results: We present a statistical model that carefully accounts for informative missingness in peak intensities and allows unbiased, model-based, protein-level estimation and inference. The model is applicable to both label-based and label-free quantitation experiments. We also provide automated, model-based algorithms for filtering of proteins and peptides as well as imputation of missing values. Two LC/MS datasets are used to illustrate the methods. In simulation studies, our methods are shown to achieve substantially more discoveries than standard alternatives. Availability: The software has been made available in the open-source proteomics platform DAnTE (http://omics.pnl.gov/software/). Contact: adabney@stat.tamu.edu Supplementary information: Supplementary data are available at Bioinformatics online.
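
    As a point of contrast to the censoring-aware model described above, a deliberately naive peptide-to-protein rollup can be sketched in a few lines; it ignores informative missingness, which is precisely the bias the published model corrects (all values below are hypothetical):

```python
import numpy as np
import pandas as pd

# Minimal baseline sketch: roll peptide log2-intensities up to the
# protein level with a robust median. NaNs (missing peaks) are simply
# skipped, which biases low-abundance proteins upward because peaks
# tend to be missing when intensity falls below the detection limit.
peptides = pd.DataFrame(
    {"protein": ["P1", "P1", "P1", "P2", "P2"],
     "s1": [20.1, 19.8, np.nan, 15.2, 15.0],
     "s2": [21.0, 20.7, 20.9, np.nan, 15.4]})

protein_est = peptides.groupby("protein")[["s1", "s2"]].median()
print(protein_est)
```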

  1. Automated quantitative cytological analysis using portable microfluidic microscopy.

    PubMed

    Jagannadh, Veerendra Kalyan; Murthy, Rashmi Sreeramachandra; Srinivasan, Rajesh; Gorthi, Sai Siva

    2016-06-01

    In this article, a portable microfluidic microscopy-based approach for automated cytological investigations is presented. Inexpensive optical and electronic components have been used to construct a simple microfluidic microscopy system. In contrast to the conventional slide-based methods, the presented method employs microfluidics to enable automated sample handling and image acquisition. The approach involves the use of simple in-suspension staining and automated image acquisition to enable quantitative cytological analysis of samples. The applicability of the presented approach to research in cellular biology is shown by performing an automated cell viability assessment on a given population of yeast cells. Further, the relevance of the presented approach to clinical diagnosis and prognosis has been demonstrated by performing detection and differential assessment of malaria infection in a given sample. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. A Quantitative Systems Pharmacology Approach to Infer Pathways Involved in Complex Disease Phenotypes.

    PubMed

    Schurdak, Mark E; Pei, Fen; Lezon, Timothy R; Carlisle, Diane; Friedlander, Robert; Taylor, D Lansing; Stern, Andrew M

    2018-01-01

    Designing effective therapeutic strategies for complex diseases such as cancer and neurodegeneration that involve tissue context-specific interactions among multiple gene products presents a major challenge for precision medicine. Safe and selective pharmacological modulation of individual molecular entities associated with a disease often fails to provide efficacy in the clinic. Thus, development of optimized therapeutic strategies for individual patients with complex diseases requires a more comprehensive, systems-level understanding of disease progression. Quantitative systems pharmacology (QSP) is an approach to drug discovery that integrates computational and experimental methods to understand the molecular pathogenesis of a disease at the systems level more completely. Described here is the chemogenomic component of QSP for the inference of biological pathways involved in the modulation of the disease phenotype. The approach involves testing sets of compounds of diverse mechanisms of action in a disease-relevant phenotypic assay, and using the mechanistic information known for the active compounds, to infer pathways and networks associated with the phenotype. The example used here is for monogenic Huntington's disease (HD), which due to the pleiotropic nature of the mutant phenotype has a complex pathogenesis. The overall approach, however, is applicable to any complex disease.

  3. Levels of reconstruction as complementarity in mixed methods research: a social theory-based conceptual framework for integrating qualitative and quantitative research.

    PubMed

    Carroll, Linda J; Rothe, J Peter

    2010-09-01

    Like other areas of health research, there has been increasing use of qualitative methods to study public health problems such as injuries and injury prevention. Likewise, the integration of qualitative and quantitative research (mixed methods) is beginning to assume a more prominent role in public health studies. Using mixed methods has great potential for gaining a broad and comprehensive understanding of injuries and their prevention. However, qualitative and quantitative research methods are based on two inherently different paradigms, and their integration requires a conceptual framework that permits the unity of these two methods. We present a theory-driven framework for viewing qualitative and quantitative research, which enables us to integrate them in a conceptually sound and useful manner. This framework has its foundation within the philosophical concept of complementarity, as espoused in the physical and social sciences, and draws on Bergson's metaphysical work on the 'ways of knowing'. Through understanding how data are constructed and reconstructed, and the different levels of meaning that can be ascribed to qualitative and quantitative findings, we can use a mixed-methods approach to gain a conceptually sound, holistic knowledge about injury phenomena that will enhance our development of relevant and successful interventions.

  4. Flexible automated approach for quantitative liquid handling of complex biological samples.

    PubMed

    Palandra, Joe; Weller, David; Hudson, Gary; Li, Jeff; Osgood, Sarah; Hudson, Emily; Zhong, Min; Buchholz, Lisa; Cohen, Lucinda H

    2007-11-01

    A fully automated protein precipitation technique for biological sample preparation has been developed for the quantitation of drugs in various biological matrixes. All liquid handling during sample preparation was automated using a Hamilton MicroLab Star Robotic workstation, which included the preparation of standards and controls from a Watson laboratory information management system generated work list, shaking of 96-well plates, and vacuum application. Processing time is less than 30 s per sample or approximately 45 min per 96-well plate, which is then immediately ready for injection onto an LC-MS/MS system. An overview of the process workflow is discussed, including the software development. Validation data are also provided, including specific liquid class data as well as comparative data of automated vs manual preparation using both quality controls and actual sample data. The efficiencies gained from this automated approach are described.

  5. Reference condition approach to restoration planning

    USGS Publications Warehouse

    Nestler, J.M.; Theiling, C.H.; Lubinski, S.J.; Smith, D.L.

    2010-01-01

    Ecosystem restoration planning requires quantitative rigor to evaluate alternatives, define end states, report progress and perform environmental benefits analysis (EBA). Unfortunately, existing planning frameworks are, at best, semi-quantitative. In this paper, we: (1) describe a quantitative restoration planning approach based on a comprehensive, but simple mathematical framework that can be used to effectively apply knowledge and evaluate alternatives, (2) use the approach to derive a simple but precisely defined lexicon based on the reference condition concept and allied terms and (3) illustrate the approach with an example from the Upper Mississippi River System (UMRS) using hydrologic indicators. The approach supports the development of a scalable restoration strategy that, in theory, can be expanded to ecosystem characteristics such as hydraulics, geomorphology, habitat and biodiversity. We identify three reference condition types, the best achievable condition (A_BAC), the measured magnitude (MM_i, which can be determined at one or many times and places) and the desired future condition (A_DFC), that, when used with the mathematical framework, provide a complete system of accounts useful for goal-oriented system-level management and restoration. Published in 2010 by John Wiley & Sons, Ltd.

  6. Developing patient-centered treatment protocols in brain stimulation: a rationale for combining quantitative and qualitative approaches in persons with HIV.

    PubMed

    Rosedale, Mary; Malaspina, Dolores; Malamud, Daniel; Strauss, Shiela M; Horne, Jaclyn D; Abouzied, Salman; Cruciani, Ricardo A; Knotkova, Helena

    2012-01-01

    This article reports and discusses how quantitative (physiological and behavioral) and qualitative methods are being combined in an open-label pilot feasibility study. The study evaluates safety, tolerability, and acceptability of a protocol to treat depression in HIV-infected individuals, using a 2-week block of transcranial direct current stimulation (tDCS) over the dorsolateral prefrontal cortex. Major depressive disorder (MDD) is the second most prevalent psychiatric disorder after substance abuse among HIV-positive adults, and novel antidepressant treatments are needed for this vulnerable population. The authors describe the challenges and contributions derived from different research perspectives and methodological approaches and provide a philosophical framework for combining quantitative and qualitative measurements for a fuller examination of the disorder. Four methodological points are presented: (1) the value of combining quantitative and qualitative approaches; (2) the need for context-specific measures when studying patients with medical and psychiatric comorbidities; (3) the importance of research designs that integrate physiological, behavioral, and qualitative approaches when evaluating novel treatments; and (4) the need to explore the relationships between biomarkers, clinical symptom assessments, patient self-evaluations, and patient experiences when developing new, patient-centered protocols. The authors conclude that the complexity of studying novel treatments in complex and new patient populations requires complex research designs to capture the richness of data that inform translational research.

  7. New approach in the treatment of data from an acid-base potentiometric titration. I. Monocomponent systems of monofunctional acids and bases.

    PubMed

    Maslarska, Vania; Tencheva, Jasmina; Budevsky, Omortag

    2003-01-01

    Based on precise analysis of the acid-base equilibrium, a new approach in the treatment of experimental data from a potentiometric titration is proposed. A new general formula giving explicitly the relation V = f([H+]) is derived, valid for every acid-base titration, including mono- and polyfunctional protolytes and their mixtures. The present study is the first practical application of this formula for the simplest case, the analysis of one monofunctional protolyte. The mV data collected during the titration are converted into pH values by means of an auto pH-calibration procedure, thus avoiding preliminary preparation of the measuring system. This pH-calibration method is also applicable in water-organic mixtures and allows the quantitative determination of sparingly soluble substances (particularly pharmaceuticals). The treatment of the data is performed by means of ready-to-use software products, which makes the proposed approach accessible for a wide range of applications.
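
    The abstract does not reproduce the general formula, but for the special case of a strong monofunctional acid titrated with a strong base (an assumption; the paper's formula also covers weak and polyfunctional protolytes), the explicit relation V = f([H+]) follows directly from the charge balance. A sketch under that assumption:

```python
KW = 1e-14  # water autoprotolysis constant at 25 degrees C

def titrant_volume(h: float, v0: float, ca: float, cb: float) -> float:
    """Explicit V = f([H+]) for a strong monoprotic acid titrated with a
    strong base, from the charge balance Na+ + H+ = OH- + A-:
        V = V0 * (Ca - h + Kw/h) / (Cb + h - Kw/h)
    h: [H+] in mol/L; v0: initial acid volume (mL);
    ca: acid concentration; cb: titrant base concentration (mol/L)."""
    return v0 * (ca - h + KW / h) / (cb + h - KW / h)

# 25 mL of 0.10 M HCl titrated with 0.10 M NaOH:
for ph in (2.0, 4.0, 7.0, 10.0):
    h = 10.0 ** -ph
    print(f"pH {ph}: V = {titrant_volume(h, 25.0, 0.10, 0.10):.2f} mL")
# pH 7 correctly returns the equivalence volume, 25.00 mL.
```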

  8. Density matters: Review of approaches to setting organism-based ballast water discharge standards

    USGS Publications Warehouse

    Lee II; Frazier; Ruiz

    2010-01-01

    As part of their effort to develop national ballast water discharge standards under NPDES permitting, the Office of Water requested that WED scientists identify and review existing approaches to generating organism-based discharge standards for ballast water. Six potential approaches were identified, and the utility and uncertainties of each approach were evaluated. During the process of reviewing the existing approaches, the WED scientists, in conjunction with scientists at the USGS and Smithsonian Institution, developed a new approach (per capita invasion probability or "PCIP") that addresses many of the limitations of the previous methodologies. The PCIP approach allows risk managers to generate quantitative discharge standards using historical invasion rates, ballast water discharge volumes, and ballast water organism concentrations. The statistical power of sampling ballast water with the existing methods is limited, both for the validation of ballast water treatment systems and for ship-board compliance monitoring, though it should be possible to obtain sufficient samples during treatment validation. The report will go to a National Academy of Sciences expert panel that will use it in their evaluation of approaches to developing ballast water discharge standards for the Office of Water.

  9. Comprehensive evaluation of direct injection mass spectrometry for the quantitative profiling of volatiles in food samples

    PubMed Central

    2016-01-01

    Although qualitative strategies based on direct injection mass spectrometry (DIMS) have recently emerged as an alternative for the rapid classification of food samples, the potential of these approaches in quantitative tasks has scarcely been addressed to date. In this paper, the applicability of different multivariate regression procedures to data collected by DIMS from simulated mixtures has been evaluated. The most relevant factors affecting quantitation, such as random noise, the number of calibration samples, type of validation, mixture complexity and similarity of mass spectra, were also considered and comprehensively discussed. Based on the conclusions drawn from simulated data, and as an example of application, experimental mass spectral fingerprints collected by direct thermal desorption coupled to mass spectrometry were used for the quantitation of major volatiles in Thymus zygis subsp. zygis chemotypes. The results obtained, validated with the direct thermal desorption coupled to gas chromatography–mass spectrometry method here used as a reference, show the potential of DIMS approaches for the fast and precise quantitative profiling of volatiles in foods. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644978
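
    A minimal sketch of the multivariate-regression idea evaluated above, using partial least squares on simulated mixture fingerprints (the simulation parameters and the use of scikit-learn are illustrative assumptions, not the authors' protocol):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Simulated DIMS fingerprints: each mixture spectrum is a noisy linear
# combination of three pure-component mass spectra (200 m/z channels).
pure = rng.random((3, 200))
conc = rng.uniform(0.0, 1.0, size=(120, 3))               # true concentrations
spectra = conc @ pure + rng.normal(0, 0.02, (120, 200))   # plus random noise

X_tr, X_te, y_tr, y_te = train_test_split(spectra, conc, random_state=0)
model = PLSRegression(n_components=5).fit(X_tr, y_tr)
print("R^2 on held-out mixtures:", model.score(X_te, y_te))
```

    Raising the noise level or the spectral similarity of the pure components in such a simulation reproduces the degradation in quantitation that the paper discusses.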

  10. DRIFTSEL: an R package for detecting signals of natural selection in quantitative traits.

    PubMed

    Karhunen, M; Merilä, J; Leinonen, T; Cano, J M; Ovaskainen, O

    2013-07-01

    Approaches and tools to differentiate between natural selection and genetic drift as causes of population differentiation are of frequent demand in evolutionary biology. Based on the approach of Ovaskainen et al. (2011), we have developed an R package (DRIFTSEL) that can be used to differentiate between stabilizing selection, diversifying selection and random genetic drift as causes of population differentiation in quantitative traits when neutral marker and quantitative genetic data are available. Apart from illustrating the use of this method and the interpretation of results using simulated data, we apply the package on data from three-spined sticklebacks (Gasterosteus aculeatus) to highlight its virtues. DRIFTSEL can also be used to perform usual quantitative genetic analyses in common-garden study designs. © 2013 John Wiley & Sons Ltd.

  11. Quantitative microscopy of the lung: a problem-based approach. Part 1: basic principles of lung stereology.

    PubMed

    Ochs, Matthias; Mühlfeld, Christian

    2013-07-01

    The growing awareness of the importance of accurate morphometry in lung research has recently motivated the publication of guidelines set forth by a combined task force of the American Thoracic Society and the European Respiratory Society (20). This official ATS/ERS Research Policy Statement provides general recommendations on which stereological methods are to be used in quantitative microscopy of the lung. However, to integrate stereology into a particular experimental study design, investigators are left with the problem of how to implement this in practice. Specifically, different animal models of human lung disease require the use of different stereological techniques and may determine the mode of lung fixation, tissue processing, preparation of sections, and related procedures. Therefore, the present companion articles were designed to give a short, practically oriented introduction to the concepts of design-based stereology (Part 1) and to provide recommendations for choosing the most appropriate methods to investigate a number of important disease models (Part 2). Worked examples with illustrative images will facilitate the practical performance of equivalent analyses. Study algorithms provide comprehensive surveys to ensure that no essential step is lost during the multistage workflow. Thus, with this review, we hope to close the gap between theory and practice and enhance the use of stereological techniques in pulmonary research.

  12. Impact of implementation choices on quantitative predictions of cell-based computational models

    NASA Astrophysics Data System (ADS)

    Kursawe, Jochen; Baker, Ruth E.; Fletcher, Alexander G.

    2017-09-01

    'Cell-based' models provide a powerful computational tool for studying the mechanisms underlying the growth and dynamics of biological tissues in health and disease. An increasing amount of quantitative data with cellular resolution has paved the way for the quantitative parameterisation and validation of such models. However, the numerical implementation of cell-based models remains challenging, and little work has been done to understand to what extent implementation choices may influence model predictions. Here, we consider the numerical implementation of a popular class of cell-based models called vertex models, which are often used to study epithelial tissues. In two-dimensional vertex models, a tissue is approximated as a tessellation of polygons and the vertices of these polygons move due to mechanical forces originating from the cells. Such models have been used extensively to study the mechanical regulation of tissue topology in the literature. Here, we analyse how the model predictions may be affected by numerical parameters, such as the size of the time step, and non-physical model parameters, such as length thresholds for cell rearrangement. We find that vertex positions and summary statistics are sensitive to several of these implementation parameters. For example, the predicted tissue size decreases with decreasing cell cycle durations, and cell rearrangement may be suppressed by large time steps. These findings are counter-intuitive and illustrate that model predictions need to be thoroughly analysed and implementation details carefully considered when applying cell-based computational models in a quantitative setting.

  13. Diagnosing Conceptions about the Epistemology of Science: Contributions of a Quantitative Assessment Methodology

    ERIC Educational Resources Information Center

    Vázquez-Alonso, Ángel; Manassero-Mas, María-Antonia; García-Carmona, Antonio; Montesano de Talavera, Marisa

    2016-01-01

    This study applies a new quantitative methodological approach to diagnose epistemology conceptions in a large sample. The analyses use seven multiple-rating items on the epistemology of science drawn from the item pool Views on Science-Technology-Society (VOSTS). The bases of the new methodological diagnostic approach are the empirical…

  14. Stable Isotope Quantitative N-Glycan Analysis by Liquid Separation Techniques and Mass Spectrometry.

    PubMed

    Mittermayr, Stefan; Albrecht, Simone; Váradi, Csaba; Millán-Martín, Silvia; Bones, Jonathan

    2017-01-01

    Liquid phase separation analysis and subsequent quantitation remains a challenging task for protein-derived oligosaccharides due to their inherent structural complexity and diversity. Incomplete resolution or co-detection of multiple glycan species complicates peak area-based quantitation and associated statistical analysis when optical detection methods are used. The approach outlined herein describes the utilization of stable isotope variants of commonly used fluorescent tags that allow for mass-based glycan identification and relative quantitation following separation by liquid chromatography (LC) or capillary electrophoresis (CE). Comparability assessment of glycoprotein-derived oligosaccharides is performed by derivatization with commercially available isotope variants of 2-aminobenzoic acid or aniline and analysis by LC- and CE-mass spectrometry. Quantitative information is attained from the extracted ion chromatogram/electropherogram ratios generated from the light and heavy isotope clusters.

  15. Wavelength Selection Method Based on Differential Evolution for Precise Quantitative Analysis Using Terahertz Time-Domain Spectroscopy.

    PubMed

    Li, Zhi; Chen, Weidong; Lian, Feiyu; Ge, Hongyi; Guan, Aihong

    2017-12-01

    Quantitative analysis of component mixtures is an important application of terahertz time-domain spectroscopy (THz-TDS) and has attracted broad interest in recent research. Although the accuracy of quantitative analysis using THz-TDS is affected by a host of factors, wavelength selection from the sample's THz absorption spectrum is the most crucial component. The raw spectrum consists of signals from the sample and scattering and other random disturbances that can critically influence the quantitative accuracy. For precise quantitative analysis using THz-TDS, the signal from the sample needs to be retained while the scattering and other noise sources are eliminated. In this paper, a novel wavelength selection method based on differential evolution (DE) is investigated. By performing quantitative experiments on a series of binary amino acid mixtures using THz-TDS, we demonstrate the efficacy of the DE-based wavelength selection method, which yields an error rate below 5%.
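
    A toy sketch of DE-driven wavelength selection in the spirit described above: a continuous genome is thresholded into a wavelength subset and scored by cross-validated regression accuracy. The synthetic spectra, the thresholding rule, and the scoring model are assumptions for illustration, not the paper's exact procedure:

```python
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Toy stand-in for THz absorption spectra of binary mixtures: 60 samples,
# 40 frequency points, only a few of which carry the analyte signal.
n, p, informative = 60, 40, (5, 6, 18, 19, 30)
y = rng.uniform(0, 1, n)                        # analyte fraction
X = rng.normal(0, 1.0, (n, p))                  # scattering/noise background
for j in informative:
    X[:, j] = 3.0 * y + rng.normal(0, 0.1, n)   # signal-bearing wavelengths

def fitness(mask):
    keep = mask > 0.5                           # continuous genome -> subset
    if not keep.any():
        return 1e6
    score = cross_val_score(LinearRegression(), X[:, keep], y, cv=5).mean()
    return -score                               # DE minimizes

result = differential_evolution(fitness, bounds=[(0, 1)] * p,
                                maxiter=30, seed=1, polish=False)
print("selected wavelengths:", np.flatnonzero(result.x > 0.5))
```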

  16. Quantitative diagnosis of breast tumors by morphometric classification of microenvironmental myoepithelial cells using a machine learning approach

    PubMed Central

    Yamamoto, Yoichiro; Saito, Akira; Tateishi, Ayako; Shimojo, Hisashi; Kanno, Hiroyuki; Tsuchiya, Shinichi; Ito, Ken-ichi; Cosatto, Eric; Graf, Hans Peter; Moraleda, Rodrigo R.; Eils, Roland; Grabe, Niels

    2017-01-01

    Machine learning systems have recently received increased attention for their broad applications in several fields. In this study, we show for the first time that histological types of breast tumors can be classified using subtle morphological differences of microenvironmental myoepithelial cell nuclei without any direct information about neoplastic tumor cells. We quantitatively measured 11661 nuclei on the four histological types: normal cases, usual ductal hyperplasia and low/high grade ductal carcinoma in situ (DCIS). Using a machine learning system, we succeeded in classifying the four histological types with 90.9% accuracy. Electron microscopy observations suggested that the activity of typical myoepithelial cells in DCIS was lowered. Through these observations as well as meta-analytic database analyses, we developed a paracrine cross-talk-based biological mechanism of DCIS progressing to invasive cancer. Our observations support novel approaches in clinical computational diagnostics as well as in therapy development against progression. PMID:28440283

  17. Post-event reviews: Using a quantitative approach for analysing incident response to demonstrate the value of business continuity programmes and increase planning efficiency.

    PubMed

    Vaidyanathan, Karthik

    2017-01-01

    Business continuity management is often thought of as a proactive planning process for minimising impact from large-scale incidents and disasters. While this is true, and it is critical to plan for the worst, consistently validating plan effectiveness against smaller disruptions can enable an organisation to gain key insights about its business continuity readiness, drive programme improvements, reduce costs and provide an opportunity to quantitatively demonstrate the value of the programme to management. This paper describes a post-mortem framework which is used as a continuous improvement mechanism for tracking, reviewing and learning from real-world events at Microsoft Customer Service & Support. This approach was developed and adopted because conducting regular business continuity exercises proved difficult and expensive in a complex and distributed operations environment with high availability requirements. Using a quantitative approach to measure response to incidents, and categorising outcomes based on such responses, enables business continuity teams to provide data-driven insights to leadership, change perceptions of incident root cause, and instil a higher level of confidence towards disaster response readiness and incident management. The scope of the framework discussed here is specific to reviewing and driving improvements from operational incidents. However, the concept can be extended to learning and evolving readiness plans for other types of incidents.

  18. Magnetic Resonance-based Motion Correction for Quantitative PET in Simultaneous PET-MR Imaging.

    PubMed

    Rakvongthai, Yothin; El Fakhri, Georges

    2017-07-01

    Motion degrades image quality and quantitation of PET images, and is an obstacle to quantitative PET imaging. Simultaneous PET-MR offers a tool that can be used for correcting the motion in PET images by using anatomic information from MR imaging acquired concurrently. Motion correction can be performed by transforming a set of reconstructed PET images into the same frame or by incorporating the transformation into the system model and reconstructing the motion-corrected image. Several phantom and patient studies have validated that MR-based motion correction strategies have great promise for quantitative PET imaging in simultaneous PET-MR. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. Audiovisual quality estimation of mobile phone video cameras with interpretation-based quality approach

    NASA Astrophysics Data System (ADS)

    Radun, Jenni E.; Virtanen, Toni; Olives, Jean-Luc; Vaahteranoksa, Mikko; Vuori, Tero; Nyman, Göte

    2007-01-01

    We present an effective method for comparing subjective audiovisual quality and the features related to the quality changes of different video cameras. Both quantitative estimation of overall quality and qualitative description of critical quality features are achieved by the method. The aim was to combine two image quality evaluation methods, the quantitative Absolute Category Rating (ACR) method with hidden reference removal and the qualitative Interpretation-Based Quality (IBQ) method, in order to see how they complement each other in audiovisual quality estimation tasks. Twenty-six observers estimated the audiovisual quality of six different cameras, mainly mobile phone video cameras. In order to achieve an efficient subjective estimation of audiovisual quality, only two contents with different quality requirements were recorded with each camera. The results show that the subjectively important quality features were more related to the overall estimations of the cameras' visual video quality than to the features related to sound. The data demonstrated two significant quality dimensions related to visual quality: darkness and sharpness. We conclude that the qualitative methodology can complement quantitative quality estimations also with audiovisual material. The IBQ approach is especially valuable when the induced quality changes are multidimensional.

  20. Non-invasive tissue temperature measurements based on quantitative diffuse optical spectroscopy (DOS) of water.

    PubMed

    Chung, S H; Cerussi, A E; Merritt, S I; Ruth, J; Tromberg, B J

    2010-07-07

    We describe the development of a non-invasive method for quantitative tissue temperature measurements using broadband diffuse optical spectroscopy (DOS). Our approach is based on well-characterized opposing shifts in near-infrared (NIR) water absorption spectra that appear with temperature and macromolecular binding state. Unlike conventional reflectance methods, DOS is used to generate scattering-corrected tissue water absorption spectra. This allows us to separate the macromolecular bound water contribution from the thermally induced spectral shift using the temperature isosbestic point at 996 nm. The method was validated in intralipid tissue phantoms by correlating DOS with thermistor measurements (R = 0.96), with a difference of 1.1 ± 0.91 °C over a range of 28-48 °C. Once validated, thermal and hemodynamic (i.e. oxy- and deoxy-hemoglobin concentration) changes were measured simultaneously and continuously in human subjects (forearm) during mild cold stress. DOS-measured arm temperatures were consistent with previously reported invasive deep tissue temperature studies. These results suggest that DOS can be used for non-invasive, co-registered measurements of absolute temperature and hemoglobin parameters in thick tissues, a potentially important approach for optimizing thermal diagnostics and therapeutics.

  1. Industrial ecology: Quantitative methods for exploring a lower carbon future

    NASA Astrophysics Data System (ADS)

    Thomas, Valerie M.

    2015-03-01

    Quantitative methods for environmental and cost analyses of energy, industrial, and infrastructure systems are briefly introduced and surveyed, with the aim of encouraging broader utilization and development of quantitative methods in sustainable energy research. Material and energy flow analyses can provide an overall system overview. The methods of engineering economics and cost benefit analysis, such as net present values, are the most straightforward approach for evaluating investment options, with the levelized cost of energy being a widely used metric in electricity analyses. Environmental lifecycle assessment has been extensively developed, with both detailed process-based and comprehensive input-output approaches available. Optimization methods provide an opportunity to go beyond engineering economics to develop detailed least-cost or least-impact combinations of many different choices.
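
    Since the levelized cost of energy is named above as the workhorse metric, a worked sketch of its standard definition may be useful (the plant figures below are invented for illustration):

```python
def lcoe(costs, energy, rate):
    """Levelized cost of energy: discounted lifetime costs divided by
    discounted lifetime energy production.
        LCOE = sum_t C_t/(1+r)^t / sum_t E_t/(1+r)^t
    costs:  per-year cash outflows (capital in year 0, then O&M),
    energy: per-year energy output (e.g. MWh),
    rate:   discount rate per year."""
    pv_cost = sum(c / (1 + rate) ** t for t, c in enumerate(costs))
    pv_energy = sum(e / (1 + rate) ** t for t, e in enumerate(energy))
    return pv_cost / pv_energy

# Illustrative plant: $1.2M capital, $20k/yr O&M, 2,500 MWh/yr for 25 years.
years = 25
costs = [1_200_000] + [20_000] * years
energy = [0] + [2_500] * years
print(f"LCOE = ${lcoe(costs, energy, 0.06):.2f}/MWh")
```

    Discounting the energy term alongside the cost term is what makes LCOE comparable across technologies with different capital/operating profiles.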

  2. Bifurcation-based approach reveals synergism and optimal combinatorial perturbation.

    PubMed

    Liu, Yanwei; Li, Shanshan; Liu, Zengrong; Wang, Ruiqi

    2016-06-01

    Cells accomplish the process of fate decisions and form terminal lineages through a series of binary choices in which cells switch stable states from one branch to another as the interacting strengths of regulatory factors continuously vary. Various combinatorial effects may occur because almost all regulatory processes are managed in a combinatorial fashion. Combinatorial regulation is crucial for cell fate decisions because it may effectively integrate many different signaling pathways to meet the higher regulation demand during cell development. However, whether the contribution of combinatorial regulation to the state transition is better than that of a single regulator and, if so, what the optimal combination strategy is are significant issues from the point of view of both biology and mathematics. Using the approaches of combinatorial perturbations and bifurcation analysis, we provide a general framework for the quantitative analysis of synergism in molecular networks. Different from the known methods, the bifurcation-based approach depends only on stable state responses to stimuli because the state transition induced by combinatorial perturbations occurs between stable states. More importantly, an optimal combinatorial perturbation strategy can be determined by investigating the relationship between the bifurcation curve of a synergistic perturbation pair and the level set of a specific objective function. The approach is applied to two models, i.e., a theoretical multistable decision model and a biologically realistic CREB model, to show its validity, although the approach holds for a general class of biological systems.

  3. Materiality in a Practice-Based Approach

    ERIC Educational Resources Information Center

    Svabo, Connie

    2009-01-01

    Purpose: The paper aims to provide an overview of the vocabulary for materiality which is used by practice-based approaches to organizational knowing. Design/methodology/approach: The overview is theoretically generated and is based on the anthology Knowing in Organizations: A Practice-based Approach edited by Nicolini, Gherardi and Yanow. The…

  4. Sieve-based device for MALDI sample preparation. III. Its power for quantitative measurements.

    PubMed

    Molin, Laura; Cristoni, Simone; Seraglia, Roberta; Traldi, Pietro

    2011-02-01

    The solid sample inhomogeneity is a weak point of traditional MALDI deposition techniques that reflects negatively on quantitative analysis. The recently developed sieve-based device (SBD) sample deposition method, based on the electrospraying of matrix/analyte solutions through a grounded sieve, allows the homogeneous deposition of microcrystals with dimensions smaller than that of the laser spot. In each microcrystal the matrix/analyte molar ratio can be considered constant. Then, by irradiating different portions of the microcrystal distribution an identical response is obtained. This result suggests the employment of SBD in the development of quantitative procedures. For this aim, mixtures of different proteins of known molarity were analyzed, showing a good relationship between molarity and intensity ratios. This behaviour was also observed in the case of proteins with quite different ionic yields. The power of the developed method for quantitative evaluation was also tested by the measurement of the abundance of IGPP[Oxi]GPP[Oxi]GLMGPP (m/z 1219) present in the collagen-α-5(IV) chain precursor, differently expressed in urines from healthy subjects and diabetic-nephropathic patients, confirming its overexpression in the presence of nephropathy. The data obtained indicate that SBD is a particularly effective method for quantitative analysis also in biological fluids of interest. Copyright © 2011 John Wiley & Sons, Ltd.

  5. Natural enamel caries in polarized light microscopy: differences in histopathological features derived from a qualitative versus a quantitative approach to interpret enamel birefringence.

    PubMed

    De Medeiros, R C G; Soares, J D; De Sousa, F B

    2012-05-01

    Lesion area measurement of enamel caries using polarized light microscopy (PLM) is currently performed in a large number of studies, but measurements are based mainly on a misleading qualitative interpretation of enamel birefringence in a single immersion medium. Here, five natural enamel caries lesions are analysed by microradiography and PLM, and the differences in their histopathological features derived from a qualitative versus a quantitative interpretation of enamel birefringence are described. Enamel birefringence in different immersion media (air, water and quinoline) is interpreted by both qualitative and quantitative approaches, the former leading to an underestimation of the depth of enamel caries, mainly when the criterion of validating sound enamel as a negatively birefringent area under immersion in water is used (a current common practice in dental research). Procedures to avoid the shortcomings of a qualitative interpretation of enamel birefringence are presented and discussed. © 2012 The Authors Journal of Microscopy © 2012 Royal Microscopical Society.

  6. Experiencing teaching and learning quantitative reasoning in a project-based context

    NASA Astrophysics Data System (ADS)

    Muir, Tracey; Beswick, Kim; Callingham, Rosemary; Jade, Katara

    2016-12-01

    This paper presents the findings of a small-scale study that investigated the issues and challenges of teaching and learning about quantitative reasoning (QR) within a project-based learning (PjBL) context. Students and teachers were surveyed and interviewed about their experiences of learning and teaching QR in that context in contrast to teaching and learning mathematics in more traditional settings. The grade 9-12 student participants were characterised by a history of disengagement with mathematics and school in general, and the teacher participants were non-mathematics specialist teachers. Both students and teachers were new to the PjBL situation, which resulted in the teaching/learning relationship being a reciprocal one. The findings indicated that students and teachers viewed QR positively, particularly when compared with traditional mathematics teaching, yet tensions were identified for aspects such as implementation of curriculum and integration of relevant mathematics into projects. Both sets of participants identified situations where learning QR was particularly successful, along with concerns or difficulties about integrating QR into project work. The findings have implications for educators, who may need to examine their own approaches to mathematics teaching, particularly in terms of facilitating student engagement with the subject.

  7. Quantitative Analysis of Transnasal Anterior Skull Base Approach: Report of Technology for Intraoperative Assessment of Instrument Motion.

    PubMed

    Berens, Angelique M; Harbison, Richard Alex; Li, Yangming; Bly, Randall A; Aghdasi, Nava; Ferreira, Manuel; Hannaford, Blake; Moe, Kris S

    2017-08-01

    To develop a method to measure intraoperative surgical instrument motion. This model will be applicable to the study of surgical instrument kinematics including surgical training, skill verification, and the development of surgical warning systems that detect aberrant instrument motion that may result in patient injury. We developed an algorithm to automate derivation of surgical instrument kinematics in an endoscopic endonasal skull base surgery model. Surgical instrument motion was recorded during a cadaveric endoscopic transnasal approach to the pituitary using a navigation system modified to record intraoperative time-stamped Euclidean coordinates and Euler angles. Microdebrider tip coordinates and angles were referenced to the cadaver's preoperative computed tomography scan, allowing us to assess surgical instrument kinematics over time. A representative cadaveric endoscopic endonasal approach to the pituitary was performed to demonstrate the feasibility of our algorithm for deriving surgical instrument kinematics. Technical feasibility of automatically measuring intraoperative surgical instrument motion and deriving kinematics measurements was demonstrated using standard navigation equipment.
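
    A minimal sketch of deriving kinematics from the kind of time-stamped coordinates such a navigation system records, via finite differences (the trajectory values are invented; the paper's actual algorithm is not reproduced in the abstract):

```python
import numpy as np

# Hypothetical time-stamped tool-tip positions (seconds, millimeters).
t = np.array([0.00, 0.10, 0.20, 0.30, 0.40])
pos = np.array([[0.0, 0.0, 0.0],
                [1.0, 0.5, 0.1],
                [2.1, 1.1, 0.2],
                [3.0, 1.4, 0.2],
                [3.2, 1.5, 0.2]])

vel = np.diff(pos, axis=0) / np.diff(t)[:, None]   # mm/s per interval
speed = np.linalg.norm(vel, axis=1)                # scalar speed per interval
accel = np.diff(speed) / np.diff(t[1:])            # tangential acceleration
path_length = np.linalg.norm(np.diff(pos, axis=0), axis=1).sum()

print("speeds (mm/s):", np.round(speed, 1))
print("tangential accel (mm/s^2):", np.round(accel, 1))
print("path length (mm):", round(path_length, 2))
```

    Summary statistics of this sort (peak speed, path length, smoothness) are the raw material for the skill-verification and warning-system applications the authors envision.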

  8. Quantitative detection of melamine based on terahertz time-domain spectroscopy

    NASA Astrophysics Data System (ADS)

    Zhao, Xiaojing; Wang, Cuicui; Liu, Shangjian; Zuo, Jian; Zhou, Zihan; Zhang, Cunlin

    2018-01-01

    Melamine is an organic base and a trimer of cyanamide, with a 1,3,5-triazine skeleton. It is usually used for the production of plastics, glue and flame retardants. Melamine combines with acid and related compounds to form melamine cyanurate and related crystal structures, which have been implicated as contaminants or biomarkers in protein adulteration by lawbreakers, especially in milk powder. This paper focuses on developing a practical method for the quantitative detection of melamine in the fields of security inspection and nondestructive testing based on THz-TDS. Terahertz (THz) technology has promising applications for the detection and identification of materials because it combines spectroscopic specificity with good penetration and safety. Terahertz time-domain spectroscopy (THz-TDS) is a key technique for the spectroscopic measurement of materials based on ultrafast femtosecond lasers. In this study, melamine and its mixtures with polyethylene powder at different concentrations were measured using transmission THz-TDS, and the refractive index spectra and absorption spectra of the different melamine concentrations were obtained over 0.2-2.8 THz. The refractive index spectra show a clear decline with decreasing concentration, and the absorption spectra show two melamine peaks, at 1.98 THz and 2.28 THz. Based on the experimental results, the absorption coefficient and the concentration of melamine in the mixture are determined. Finally, methods for the quantitative detection of materials in the fields of nondestructive testing and quality control based on THz-TDS have been studied.
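
    A sketch of the quantitation step this implies: under Beer-Lambert behaviour, the absorption coefficient at a melamine line scales linearly with concentration, so a calibration line can be fitted and inverted. All numbers below are hypothetical:

```python
import numpy as np

# Hypothetical calibration: peak absorption coefficient (cm^-1) at the
# 1.98 THz melamine line versus melamine mass fraction in polyethylene.
fraction = np.array([0.05, 0.10, 0.20, 0.40, 0.80])
alpha_peak = np.array([1.9, 3.8, 7.4, 15.1, 30.2])    # illustrative values

# Beer-Lambert behaviour predicts a straight line; fit alpha = k*f + b
# and invert it to quantify an unknown sample from its measured peak.
k, b = np.polyfit(fraction, alpha_peak, 1)

alpha_unknown = 11.0
print(f"estimated melamine fraction: {(alpha_unknown - b) / k:.3f}")
```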

  9. Will Quantitative Proteomics Redefine Some of the Key Concepts in Skeletal Muscle Physiology?

    PubMed

    Gizak, Agnieszka; Rakus, Dariusz

    2016-01-11

    Molecular and cellular biology methodology is traditionally based on the reasoning called "the mechanistic explanation". In practice, this means identifying and selecting correlations between biological processes which result from our manipulation of a biological system. In theory, a successful application of this approach requires precise knowledge of all parameters of the studied system. In practice, however, due to the systems' complexity, this requirement is rarely, if ever, met. Typically, it is limited to quantitative or semi-quantitative measurement of selected parameters (e.g., concentrations of some metabolites), and a qualitative or semi-quantitative description of expression or post-translational modification changes within selected proteins. A quantitative proteomics approach offers the possibility of quantitatively characterizing the entire proteome of a biological system, in terms of both protein titer and post-translational modifications. This enables not only more accurate testing of novel hypotheses but also provides tools that can be used to verify some of the most fundamental dogmas of modern biology. In this short review, we discuss some of the consequences of using quantitative proteomics to verify several key concepts in skeletal muscle physiology.

  10. Approaches to setting organism-based ballast water discharge standards

    USGS Publications Warehouse

    Lee, Henry; Reusser, Deborah A.; Frazier, Melanie

    2013-01-01

    As a vector by which foreign species invade coastal and freshwater waterbodies, ballast water discharge from ships is recognized as a major environmental threat. The International Maritime Organization (IMO) drafted an international treaty establishing ballast water discharge standards based on the number of viable organisms per volume of ballast discharge for different organism size classes. Concerns that the IMO standards are not sufficiently protective have initiated several state and national efforts in the United States to develop more stringent standards. We evaluated seven approaches to establishing discharge standards for the >50-μm size class: (1) expert opinion/management consensus, (2) zero detectable living organisms, (3) natural invasion rates, (4) reaction–diffusion models, (5) population viability analysis (PVA) models, (6) per capita invasion probabilities (PCIP), and (7) experimental studies. Because of the difficulty in synthesizing scientific knowledge in an unbiased and transparent fashion, we recommend the use of quantitative models instead of expert opinion. The actual organism concentration associated with a “zero detectable organisms” standard is defined by the statistical rigor of its monitoring program; thus it is not clear whether such a standard is as stringent as other standards. For several reasons, the natural invasion rate, reaction–diffusion, and experimental approaches are not considered suitable for generating discharge standards. PVA models can be used to predict the likelihood of establishment of introduced species but are limited by a lack of population vital rates for species characteristic of ballast water discharges. Until such rates become available, PVA models are better suited to evaluate relative efficiency of proposed standards rather than predicting probabilities of invasion. The PCIP approach, which is based on historical invasion rates at a regional scale, appears to circumvent many of the indicated problems
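
    A back-of-the-envelope sketch of the PCIP logic this record and record 8 describe: divide historical invasions by the estimated number of organisms discharged, then invert that rate to obtain an allowable concentration for a target invasion rate. All figures below are hypothetical:

```python
# Per capita invasion probability (PCIP) sketch; every number is invented.
historical_invasions = 20        # invasions attributed to ballast water
discharge_volume_m3 = 2.0e8      # total ballast discharged over the record
mean_concentration = 100.0       # organisms/m^3 (>50 um size class)

pcip = historical_invasions / (discharge_volume_m3 * mean_concentration)

target_invasions = 0.1           # acceptable invasions per future period
future_volume_m3 = 2.0e8         # projected discharge volume
allowed_concentration = target_invasions / (pcip * future_volume_m3)

print(f"PCIP = {pcip:.2e} invasions per discharged organism")
print(f"allowed concentration = {allowed_concentration:.3f} organisms/m^3")
```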

  11. Quantitative data standardization of X-ray based densitometry methods

    NASA Astrophysics Data System (ADS)

    Sergunova, K. A.; Petraikin, A. V.; Petrjajkin, F. A.; Akhmad, K. S.; Semenov, D. S.; Potrakhov, N. N.

    2018-02-01

    In the present work, the design of a special liquid phantom is proposed for assessing the accuracy of quantitative densitometric data. The dependencies between measured bone mineral density (BMD) values and the nominal values are also presented for different X-ray based densitometry techniques. The resulting linear relationships make it possible to introduce correction factors that increase the accuracy of BMD measurement by QCT, DXA and DECT methods, and to use them for standardization and comparison of measurements.
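
    A minimal sketch of the correction-factor idea: fit the linear relation between the phantom's nominal BMD values and one scanner's measured values, then invert it to standardize that scanner's readings (all values are invented for illustration):

```python
import numpy as np

# Hypothetical phantom inserts (nominal) and one scanner's readings.
nominal = np.array([50.0, 100.0, 200.0, 400.0, 800.0])    # mg/cm^3
measured = np.array([58.0, 112.0, 215.0, 423.0, 838.0])   # mg/cm^3

slope, intercept = np.polyfit(nominal, measured, 1)

def corrected(bmd_measured: float) -> float:
    """Map a scanner reading back onto the phantom's nominal scale."""
    return (bmd_measured - intercept) / slope

print(f"measured 300 -> corrected {corrected(300.0):.1f} mg/cm^3")
```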

  12. A bottom-up approach in estimating the measurement uncertainty and other important considerations for quantitative analyses in drug testing for horses.

    PubMed

    Leung, Gary N W; Ho, Emmie N M; Kwok, W Him; Leung, David K K; Tang, Francis P W; Wan, Terence S M; Wong, April S Y; Wong, Colton H F; Wong, Jenny K Y; Yu, Nola H

    2007-09-07

    Quantitative determination, particularly for threshold substances in biological samples, is much more demanding than qualitative identification. A proper assessment of any quantitative determination is the measurement uncertainty (MU) associated with the determined value. The International Standard ISO/IEC 17025, "General requirements for the competence of testing and calibration laboratories", has more prescriptive requirements on the MU than its superseded document, ISO/IEC Guide 25. Under the 2005 or 1999 versions of the new standard, an estimation of the MU is mandatory for all quantitative determinations. To comply with the new requirement, a protocol was established in the authors' laboratory in 2001. The protocol has since evolved based on our practical experience, and a refined version was adopted in 2004. This paper describes our approach in establishing the MU, as well as some other important considerations, for the quantification of threshold substances in biological samples as applied in the area of doping control for horses. The testing of threshold substances can be viewed as a compliance test (or testing to a specified limit). As such, it should only be necessary to establish the MU at the threshold level. The steps in a "Bottom-Up" approach adopted by us are similar to those described in the EURACHEM/CITAC guide, "Quantifying Uncertainty in Analytical Measurement". They involve first specifying the measurand, including the relationship between the measurand and the input quantities upon which it depends. This is followed by identifying all applicable uncertainty contributions using a "cause and effect" diagram. The magnitude of each uncertainty component is then calculated and converted to a standard uncertainty. A recovery study is also conducted to determine if the method bias is significant and whether a recovery (or correction) factor needs to be applied. All standard uncertainties with values greater than 30% of the largest one are then used to calculate the combined standard uncertainty, which is multiplied by a coverage factor to give the expanded measurement uncertainty at the threshold.
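
    A sketch of that closing combination step, assuming the usual EURACHEM-style treatment of combining relative standard uncertainties in quadrature with a coverage factor of 2 for roughly 95% confidence (the component names and magnitudes below are hypothetical):

```python
import math

# Illustrative relative standard uncertainties at the threshold level.
components = {
    "calibration standard": 0.010,
    "repeatability":        0.025,
    "matrix effect":        0.018,
    "recovery":             0.012,
}

u_combined = math.sqrt(sum(u ** 2 for u in components.values()))
U_expanded = 2.0 * u_combined          # coverage factor k = 2 (~95%)

threshold = 100.0                      # hypothetical threshold, ng/mL
print(f"u_c = {u_combined:.3f} (relative), U = {U_expanded:.3f}")
print(f"expanded uncertainty at threshold: {U_expanded * threshold:.1f} ng/mL")
```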

  13. Quantitative Peptidomics with Five-plex Reductive Methylation labels

    NASA Astrophysics Data System (ADS)

    Tashima, Alexandre K.; Fricker, Lloyd D.

    2017-12-01

    Quantitative peptidomics and proteomics often use chemical tags to covalently modify peptides with reagents that differ in the number of stable isotopes, allowing for quantitation of the relative peptide levels in the original sample based on the peak height of each isotopic form. Different chemical reagents have been used as tags for quantitative peptidomics and proteomics, and all have strengths and weaknesses. One of the simplest approaches uses formaldehyde and sodium cyanoborohydride to methylate amines, converting primary and secondary amines into tertiary amines. Up to five different isotopic forms can be generated, depending on the isotopic forms of formaldehyde and cyanoborohydride reagents, allowing for five-plex quantitation. However, the mass difference between each of these forms is only 1 Da per methyl group incorporated into the peptide, and for many peptides there is substantial overlap from the natural abundance of 13C and other isotopes. In this study, we calculated the contribution from the natural isotopes for 26 native peptides and derived equations to correct the peak intensities. These equations were applied to data from a study using human embryonic kidney HEK293T cells in which five replicates were treated with 100 nM vinblastine for 3 h and compared with five replicates of cells treated with control medium. The correction equations brought the replicates to the expected 1:1 ratios and revealed significant decreases in levels of 21 peptides upon vinblastine treatment. These equations enable accurate quantitation of small changes in peptide levels using the reductive methylation labeling approach.
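
    A sketch of where the natural-isotope overlap comes from, using a binomial model of 13C incorporation only (the published correction equations also account for other elements' isotopes; this simplification is an assumption for illustration):

```python
from math import comb

P13C = 0.0107  # natural abundance of carbon-13

def carbon_isotope_envelope(n_carbons: int, n_peaks: int = 5):
    """Probability that a peptide with n_carbons carbon atoms carries
    0, 1, 2, ... extra mass units from natural 13C (binomial model;
    isotopes of other elements are ignored in this sketch)."""
    return [comb(n_carbons, k) * P13C ** k * (1 - P13C) ** (n_carbons - k)
            for k in range(n_peaks)]

# For a 50-carbon peptide, the natural +1 Da peak is roughly half the
# monoisotopic peak, so it overlaps the next 1-Da-spaced methylation
# channel and must be subtracted before channels can be compared.
env = carbon_isotope_envelope(50)
print([round(p, 3) for p in env])
print("(+1)/(mono) ratio:", round(env[1] / env[0], 3))
```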

  15. [Rating scales based on the phenomenological and structural approach].

    PubMed

    Schiltz, L

    2006-01-01

    A current trend in clinical psychology research is the use of an integrated quantitative and qualitative methodology. This approach is especially suited to the study of therapeutic intervention, where the researcher is himself part of the situation he is investigating. As for research tools, the combination of the semi-structured clinical interview, psychometric scales and projective tests has proven pertinent for describing the multidimensional and fluctuating reality of the therapeutic relationship and the changes it induces in the two partners. In arts-therapies research, investigation of the artistic production or of people's free expression may complement the psychometric and projective tools; the concept of an "expressive test" is currently used to characterise this method. In this context, the development of rating scales based on the phenomenological and structural, or holistic, approach allows us to link qualitative analysis and quantification, leading to the use of inferential statistics, provided that we remain at the nominal or ordinal level of measurement. We explain the principle of construction of these rating scales and illustrate our practice with examples drawn from studies we conducted in clinical psychology.

  16. Quantitative verification of the keyhole concept: a comparison of area of exposure in the parasellar region via supraorbital keyhole, frontotemporal pterional, and supraorbital approaches.

    PubMed

    Cheng, Cheng-Mao; Noguchi, Akio; Dogan, Aclan; Anderson, Gregory J; Hsu, Frank P K; McMenomey, Sean O; Delashaw, Johnny B

    2013-02-01

    This study was designed to determine if the "keyhole concept," proposed by Perneczky's group, can be verified quantitatively. Fourteen sides of embalmed, latex-injected cadaveric heads (3 heads dissected bilaterally and 8 unilaterally) were dissected via 3 sequential craniotomy approaches: supraorbital keyhole, frontotemporal pterional, and supraorbital. Three-dimensional Cartesian coordinates were recorded using a stereotactic localizer. The orthocenter of the ipsilateral anterior clinoid process, the posterior clinoid process, and the contralateral anterior clinoid process is expressed as a center point (the apex). Seven vectors project from the apex to their corresponding target points in a radiating manner on the parasellar skull base. Each 2 neighboring vectors border what could be considered a triangle, and the total area of the 7 triangles sharing the same apex was geometrically expressed as the area of exposure in the parasellar region. Values are expressed as the mean ± SD (mm^2). The total area of exposure was as follows: supraorbital keyhole 1733.1 ± 336.0, pterional 1699.3 ± 361.9, and supraorbital 1691.4 ± 342.4. The area of exposure on the contralateral side was as follows: supraorbital keyhole 602.2 ± 194.7, pterional 595.2 ± 228.0, and supraorbital 553.3 ± 227.2. The supraorbital keyhole skull flap was 2.0 cm^2, and the skull flap size ratio was 1:5:6.5 (supraorbital keyhole/pterional/supraorbital). The area of exposure of the parasellar region through the smaller supraorbital keyhole approach is as adequate as the larger pterional and supraorbital approaches. The keyhole concept can be verified quantitatively as follows: 1) a wide area of exposure on the skull base can be obtained through a small keyhole skull opening, and 2) the side opposite the opening can also be visualized.
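
    A geometric sketch of the exposure-area computation described above: consecutive radiating vectors from the apex bound triangles whose areas follow from cross products. The coordinates are invented, and this open-fan sketch forms six triangles from seven target points (the paper counts seven triangles, presumably by closing or extending the fan):

```python
import numpy as np

def exposure_area(apex: np.ndarray, targets: np.ndarray) -> float:
    """Total area (mm^2) of the triangles bordered by consecutive vectors
    radiating from a common apex to skull-base target points. Each
    triangle (apex, t_i, t_{i+1}) contributes half the norm of the cross
    product of its two edge vectors."""
    area = 0.0
    for a, b in zip(targets[:-1], targets[1:]):
        area += 0.5 * np.linalg.norm(np.cross(a - apex, b - apex))
    return area

# Hypothetical apex (clinoid orthocenter) and 7 parasellar targets, in mm.
apex = np.array([0.0, 0.0, 0.0])
targets = np.array([[30.0, 5.0 * i, 10.0 + 2.0 * i] for i in range(7)])
print(f"area of exposure: {exposure_area(apex, targets):.1f} mm^2")
```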

  17. Emerging systems biology approaches in nanotoxicology: Towards a mechanism-based understanding of nanomaterial hazard and risk

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Costa, Pedro M.; Fadeel, Bengt, E-mail: Bengt.Fade

    Engineered nanomaterials are being developed for a variety of technological applications. However, the increasing use of nanomaterials in society has led to concerns about their potential adverse effects on human health and the environment. During the first decade of nanotoxicological research, the realization has emerged that effective risk assessment of the multitudes of new nanomaterials would benefit from a comprehensive understanding of their toxicological mechanisms, which is difficult to achieve with traditional, low-throughput, single end-point oriented approaches. Therefore, systems biology approaches are being progressively applied within the nano(eco)toxicological sciences. This novel paradigm implies that the study of biological systems should be integrative, resulting in quantitative and predictive models of nanomaterial behaviour in a biological system. To this end, global ‘omics' approaches with which to assess changes in genes, proteins, metabolites, etc. are deployed, allowing for computational modelling of the biological effects of nanomaterials. Here, we highlight omics and systems biology studies in nanotoxicology, aiming towards the implementation of a systems nanotoxicology and mechanism-based risk assessment of nanomaterials. - Highlights: • Systems nanotoxicology is a multi-disciplinary approach to quantitative modelling. • Transcriptomics, proteomics and metabolomics remain the most common methods. • Global "omics" techniques should be coupled to computational modelling approaches. • The discovery of nano-specific toxicity pathways and biomarkers is a prioritized goal. • Overall, experimental nanosafety research must endeavour reproducibility and relevance.

  18. AUTOMATED ANALYSIS OF QUANTITATIVE IMAGE DATA USING ISOMORPHIC FUNCTIONAL MIXED MODELS, WITH APPLICATION TO PROTEOMICS DATA.

    PubMed

    Morris, Jeffrey S; Baladandayuthapani, Veerabhadran; Herrick, Richard C; Sanna, Pietro; Gutstein, Howard

    2011-01-01

    Image data are increasingly encountered and are of growing importance in many areas of science. A large share of these data are quantitative image data, characterized by intensities that represent some measurement of interest in the scanned images. The data typically consist of multiple images on the same domain, and the goal of the research is to combine the quantitative information across images to make inference about populations or interventions. In this paper, we present a unified analysis framework for the analysis of quantitative image data using a Bayesian functional mixed model approach. This framework is flexible enough to handle complex, irregular images with many local features, and can model the simultaneous effects of multiple factors on the image intensities and account for the correlation between images induced by the design. We introduce a general isomorphic modeling approach to fitting the functional mixed model, of which the wavelet-based functional mixed model is one special case. With suitable modeling choices, this approach leads to efficient calculations and can result in flexible modeling and adaptive smoothing of the salient features in the data. The proposed method has the following advantages: it can be run automatically, it produces inferential plots indicating which regions of the image are associated with each factor, it simultaneously considers the practical and statistical significance of findings, and it controls the false discovery rate. Although the method we present is general and can be applied to quantitative image data from any application, in this paper we focus on image-based proteomic data. We apply our method to an animal study investigating the effects of opiate addiction on the brain proteome. Our image-based functional mixed model approach finds results that are missed with conventional spot-based analysis approaches. In particular, we find that the significant regions of the image identified by the proposed method

  19. Cartilage Repair Surgery: Outcome Evaluation by Using Noninvasive Cartilage Biomarkers Based on Quantitative MRI Techniques?

    PubMed Central

    Jungmann, Pia M.; Baum, Thomas; Bauer, Jan S.; Karampinos, Dimitrios C.; Link, Thomas M.; Li, Xiaojuan; Trattnig, Siegfried; Rummeny, Ernst J.; Woertler, Klaus; Welsch, Goetz H.

    2014-01-01

    Background. New quantitative magnetic resonance imaging (MRI) techniques are increasingly applied as outcome measures after cartilage repair. Objective. To review the current literature on the use of quantitative MRI biomarkers for evaluation of cartilage repair at the knee and ankle. Methods. Using PubMed literature research, studies on biochemical, quantitative MR imaging of cartilage repair were identified and reviewed. Results. Quantitative MR biomarkers detect early degeneration of articular cartilage, mainly represented by an increasing water content, collagen disruption, and proteoglycan loss. Recently, feasibility of biochemical MR imaging of cartilage repair tissue and surrounding cartilage was demonstrated. Ultrastructural properties of the tissue after different repair procedures resulted in differences in imaging characteristics. T2 mapping, T1rho mapping, delayed gadolinium-enhanced MRI of cartilage (dGEMRIC), and diffusion weighted imaging (DWI) are applicable on most clinical 1.5 T and 3 T MR scanners. Currently, a standard of reference is difficult to define and knowledge is limited concerning correlation of clinical and MR findings. The lack of histological correlations complicates the identification of the exact tissue composition. Conclusions. A multimodal approach combining several quantitative MRI techniques in addition to morphological and clinical evaluation might be promising. Further investigations are required to demonstrate the potential for outcome evaluation after cartilage repair. PMID:24877139

  20. Effectiveness of a systematic approach to promote intersectoral collaboration in comprehensive school health promotion-a multiple-case study using quantitative and qualitative data.

    PubMed

    Pucher, Katharina K; Candel, Math J J M; Krumeich, Anja; Boot, Nicole M W M; De Vries, Nanne K

    2015-07-05

    We report on the longitudinal quantitative and qualitative data resulting from a two-year trajectory (2008-2011) based on the DIagnosis of Sustainable Collaboration (DISC) model. This trajectory aimed to support regional coordinators of comprehensive school health promotion (CSHP) in systematically developing change management and project management to establish intersectoral collaboration. Multilevel analyses of quantitative data on the determinants of collaboration according to the DISC model were done, with 90 respondents (57 % response) at pretest and 69 respondents (52 %) at posttest. NVivo analyses of the qualitative data collected during the trajectory included minutes of monthly/bimonthly personal/telephone interviews (N = 65) with regional coordinators, and documents they produced about their activities. Quantitative data showed major improvements in change management and project management. There were also improvements in consensus development, commitment formation, formalization of the CSHP, and alignment of policies, although organizational problems within the collaboration increased. Content analyses of qualitative data identified five main management styles, including (1) facilitating active involvement of relevant parties; (2) informing collaborating parties; (3) controlling and (4) supporting their task accomplishment; and (5) coordinating the collaborative processes. We have contributed to the fundamental understanding of the development of intersectoral collaboration by combining qualitative and quantitative data. Our results support a systematic approach to intersectoral collaboration using the DISC model. They also suggest five main management styles to improve intersectoral collaboration in the initial stage. The outcomes are useful for health professionals involved in similar ventures.

  1. Fluorescent quenching-based quantitative detection of specific DNA/RNA using a BODIPY® FL-labeled probe or primer

    PubMed Central

    Kurata, Shinya; Kanagawa, Takahiro; Yamada, Kazutaka; Torimura, Masaki; Yokomaku, Toyokazu; Kamagata, Yoichi; Kurane, Ryuichiro

    2001-01-01

    We have developed a simple method for the quantitative detection of specific DNA or RNA molecules, based on the finding that BODIPY® FL fluorescence is quenched by its interaction with a uniquely positioned guanine. This approach makes use of an oligonucleotide probe or primer containing a BODIPY® FL-modified cytosine at its 5′-end. When such a probe was hybridized with a target DNA, its fluorescence was quenched by the guanine in the target complementary to the modified cytosine, and the quench rate was proportional to the amount of target DNA. This widely applicable technique can be used directly with larger samples or in conjunction with the polymerase chain reaction to quantify small DNA samples. PMID:11239011

  2. Towards quantitative assessment of calciphylaxis

    NASA Astrophysics Data System (ADS)

    Deserno, Thomas M.; Sárándi, István.; Jose, Abin; Haak, Daniel; Jonas, Stephan; Specht, Paula; Brandenburg, Vincent

    2014-03-01

    Calciphylaxis is a rare disease with devastating complications and high morbidity and mortality. It is characterized by systemic medial calcification of the arteries, yielding necrotic skin ulcerations. In this paper, we aim at supporting the installation of multi-center registries for calciphylaxis, which include a photographic documentation of skin necrosis. However, photographs acquired in different centers under different conditions, using different equipment and photographers, cannot be compared quantitatively. For normalization, we use a simple color pad that is placed into the field of view and segmented from the image, and its color fields are analyzed. In total, 24 colors are printed on the pad. A least-squares approach is used to determine the affine color transform. Furthermore, the pad allows scale normalization. We provide a case study for qualitative assessment. In addition, the method is evaluated quantitatively using 10 images of two sets of different captures of the same necrosis. The variability of quantitative measurements based on free-hand photography is assessed with respect to geometric and color distortions before and after our simple calibration procedure. Using automated image processing, the standard deviation of measurements is significantly reduced. The coefficients of variation are 5-20% and 2-10% for geometry and color, respectively. Hence, quantitative assessment of calciphylaxis becomes practicable and will contribute to a better understanding of this rare but fatal disease.
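
    The color-normalization step lends itself to a compact implementation. The sketch below fits the affine transform (reference ~ A * measured + b) to the 24 color-pad fields by ordinary least squares and applies it to a photograph; the function names, array shapes and RGB convention are illustrative assumptions, not the authors' code.

    ```python
    import numpy as np

    def fit_affine_color_transform(measured, reference):
        """Least-squares affine map so that reference ~= measured @ A.T + b,
        from the (24, 3) RGB values of the color-pad fields."""
        measured = np.asarray(measured, float)
        reference = np.asarray(reference, float)
        X = np.hstack([measured, np.ones((len(measured), 1))])  # augmented (24, 4)
        M, *_ = np.linalg.lstsq(X, reference, rcond=None)       # (4, 3) solution
        return M[:3].T, M[3]                                    # A (3, 3), b (3,)

    def normalize_image(img, A, b):
        """Apply the fitted transform to an (H, W, 3) photograph."""
        return img @ A.T + b
    ```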

  3. Quantitative Ultrasound for Nondestructive Characterization of Engineered Tissues and Biomaterials

    PubMed Central

    Dalecki, Diane; Mercado, Karla P.; Hocking, Denise C.

    2015-01-01

    Non-invasive, non-destructive technologies for imaging and quantitatively monitoring the development of artificial tissues are critical for the advancement of tissue engineering. Current standard techniques for evaluating engineered tissues, including histology, biochemical assays and mechanical testing, are destructive approaches. Ultrasound is emerging as a valuable tool for imaging and quantitatively monitoring the properties of engineered tissues and biomaterials longitudinally during fabrication and post-implantation. Ultrasound techniques are rapid, non-invasive, non-destructive and can be easily integrated into sterile environments necessary for tissue engineering. Furthermore, high-frequency quantitative ultrasound techniques can enable volumetric characterization of the structural, biological, and mechanical properties of engineered tissues during fabrication and post-implantation. This review provides an overview of ultrasound imaging, quantitative ultrasound techniques, and elastography, with representative examples of applications of these ultrasound-based techniques to the field of tissue engineering. PMID:26581347

  4. Quantitative Approach to Failure Mode and Effect Analysis for Linear Accelerator Quality Assurance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Daniel, Jennifer C., E-mail: jennifer.odaniel@duke.edu; Yin, Fang-Fang

    Purpose: To determine clinic-specific linear accelerator quality assurance (QA) TG-142 test frequencies, to maximize physicist time efficiency and patient treatment quality. Methods and Materials: A novel quantitative approach to failure mode and effect analysis is proposed. Nine linear accelerator-years of QA records provided data on failure occurrence rates. The severity of test failure was modeled by introducing corresponding errors into head and neck intensity modulated radiation therapy treatment plans. The relative risk of daily linear accelerator QA was calculated as a function of frequency of test performance. Results: Although the failure severity was greatest for daily imaging QA (imaging vs treatment isocenter and imaging positioning/repositioning), the failure occurrence rate was greatest for output and laser testing. The composite ranking results suggest that performing output and lasers tests daily, imaging versus treatment isocenter and imaging positioning/repositioning tests weekly, and optical distance indicator and jaws versus light field tests biweekly would be acceptable for non-stereotactic radiosurgery/stereotactic body radiation therapy linear accelerators. Conclusions: Failure mode and effect analysis is a useful tool to determine the relative importance of QA tests from TG-142. Because there are practical time limitations on how many QA tests can be performed, this analysis highlights which tests are the most important and suggests the frequency of testing based on each test's risk priority number.
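
    For readers unfamiliar with FMEA scoring, the sketch below ranks failure modes by the classic risk priority number (occurrence x severity x detectability). The test names and scores are illustrative placeholders, not the paper's measured failure data.

    ```python
    # Illustrative TG-142-style failure modes: (occurrence, severity, detectability),
    # each on a 1-10 scale. Values are placeholders, not measured data.
    failure_modes = {
        "output":                         (7, 5, 3),
        "lasers":                         (7, 4, 3),
        "imaging vs treatment isocenter": (3, 8, 4),
        "optical distance indicator":     (2, 3, 5),
    }

    def rpn(occurrence, severity, detectability):
        """Classic FMEA risk priority number."""
        return occurrence * severity * detectability

    for name, scores in sorted(failure_modes.items(), key=lambda kv: -rpn(*kv[1])):
        print(f"{name:32s} RPN = {rpn(*scores)}")
    ```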

  5. Real time quantitative phase microscopy based on single-shot transport of intensity equation (ssTIE) method

    NASA Astrophysics Data System (ADS)

    Yu, Wei; Tian, Xiaolin; He, Xiaoliang; Song, Xiaojun; Xue, Liang; Liu, Cheng; Wang, Shouyu

    2016-08-01

    Microscopy based on the transport of intensity equation provides quantitative phase distributions, which opens another perspective for cellular observation. However, it requires multi-focal image capture, and the mechanical and electrical scanning involved limits its real-time capability in sample detection. Here, in order to break through this restriction, real-time quantitative phase microscopy based on the single-shot transport of intensity equation (ssTIE) method is proposed. A programmed phase mask is designed to realize simultaneous multi-focal image recording without any scanning; thus, phase distributions can be quantitatively retrieved in real time. It is believed the proposed method can be potentially applied in various biological and medical applications, especially for live cell imaging.
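
    Once the simultaneous multi-focal images are available, the phase is recovered by solving the TIE. Below is a minimal sketch of the standard FFT-based Poisson solver under the uniform-intensity approximation; it shows the generic TIE step only, not the authors' ssTIE pipeline, and all parameter names are assumptions.

    ```python
    import numpy as np

    def tie_phase(i_minus, i_focus, i_plus, dz, wavelength, pixel):
        """Solve laplacian(phi) = -(k / I) * dI/dz by FFT (uniform-intensity TIE)."""
        k = 2 * np.pi / wavelength
        didz = (i_plus - i_minus) / (2 * dz)             # axial intensity derivative
        rhs = -k * didz / np.clip(i_focus, 1e-9, None)
        ny, nx = rhs.shape
        fx = 2 * np.pi * np.fft.fftfreq(nx, pixel)
        fy = 2 * np.pi * np.fft.fftfreq(ny, pixel)
        k2 = fx[None, :] ** 2 + fy[:, None] ** 2
        k2[0, 0] = np.inf                                # suppress the DC term
        return np.real(np.fft.ifft2(np.fft.fft2(rhs) / (-k2)))
    ```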

  6. Quantitative glycomics.

    PubMed

    Orlando, Ron

    2010-01-01

    The ability to quantitatively determine changes is an essential component of comparative glycomics. Multiple strategies are available by which this can be accomplished. These include label-free approaches and strategies where an isotopic label is incorporated into the glycans prior to analysis. The focus of this chapter is to describe each of these approaches while providing insight into their strengths and weaknesses, so that glycomic investigators can make an educated choice of the strategy that is best suited for their particular application.

  7. The influence of HOPE VI neighborhood revitalization on neighborhood-based physical activity: A mixed-methods approach.

    PubMed

    Dulin-Keita, Akilah; Clay, Olivio; Whittaker, Shannon; Hannon, Lonnie; Adams, Ingrid K; Rogers, Michelle; Gans, Kim

    2015-08-01

    This study uses a mixed-methods approach to 1) identify surrounding residents' perceived expectations for the effects of Housing Opportunities for People Everywhere (HOPE VI) policy on physical activity outcomes and 2) quantitatively examine the odds of neighborhood-based physical activity pre-/post-HOPE VI in a low socioeconomic status, predominantly African American community in Birmingham, Alabama. To address aim one, we used group concept mapping, a structured approach to data collection and analysis that produces pictures/maps of ideas. Fifty-eight residents developed statements about potential influences of HOPE VI on neighborhood-based physical activity. In the quantitative study, we examined whether these potential influences increased the odds of neighborhood walking/jogging. We computed block-entry logistic regression models with a larger cohort of residents at baseline (n = 184) and six months (n = 142, 77% retention; n = 120 for all informative variables). We examined perceived neighborhood disorder (perceived neighborhood disorder scale), walkability and aesthetics (Neighborhood Environment Walkability Scale), and HOPE VI-related community safety and safety for physical activity as predictors. During concept mapping, residents generated statements that clustered into three distinct concepts: "Increased Leisure Physical Activity," "Safe Play Areas," and "Generating Health Promoting Resources." The quantitative analyses indicated that changes in neighborhood walkability increased the odds of neighborhood-based physical activity (p = 0.04). When HOPE VI-related safety for physical activity was entered into the model, it was associated with increased odds of physical activity (p = 0.04), and walkability was no longer statistically significant. These results suggest that housing policies that create walkable neighborhoods and that improve perceptions of safety for physical activity may increase neighborhood-based physical activity. However, the longer

  8. Olfaction preservation after removal of large tuberculum sellae meningiomas via a superior interhemispheric approach. A quantitative and qualitative study.

    PubMed

    Hannequin, P; Paviot, A; Chaussy, O; Gilard, V; Cébula, H; Marie, J-P; Proust, F

    2015-10-01

    We present a prospective series of tuberculum sellae meningiomas (TSM) resected via a superior interhemispheric (IH) approach in 10 patients who underwent extensive pre- and postoperative olfaction testing using a standardised test battery. This prospective longitudinal study evaluated olfactory function after TSM resection performed via a superior IH approach. The quantitative and qualitative analyses of olfactory function were assessed with the Biolfa(®) olfactory test (at 6 months). Between November 2009 and April 2012, 10 consecutive patients with symptomatic TSM and preserved olfactory function were operated on via a superior IH approach. For the self-evaluation criteria of olfactory function, the mean preoperative visual analog scale score was 8.8. The mean preoperative total quantitative score (/27) was 18.2 ± 6.3 for this cohort of 10 patients. In the postoperative period, the mean total score decreased non-significantly to 15.8 ± 8.8 (Wilcoxon test, P = 0.085). The mean preoperative qualitative score (/8) was 5.5 ± 1.7 and decreased non-significantly in the postoperative period to 4.7 ± 2.6 (Wilcoxon test, P = 0.12). Olfactory function was quantitatively and qualitatively preserved in 6 patients (60%), whereas a postoperative deterioration occurred in 2 (20%) and anosmia in 2 (20%). The size and invasive characteristics of the meningioma determined the post-surgical deterioration. Olfaction is an important factor in emotional and social life, which needs to be integrated into the surgical decision-making regarding the resection of TSM. The risks of nerve damage are reviewed. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  9. Quantitative diagnostics of soft tissue through viscoelastic characterization using time-based instrumented palpation.

    PubMed

    Palacio-Torralba, Javier; Hammer, Steven; Good, Daniel W; Alan McNeill, S; Stewart, Grant D; Reuben, Robert L; Chen, Yuhang

    2015-01-01

    Although palpation has been successfully employed for centuries to assess soft tissue quality, it is a subjective test and is therefore qualitative, depending on the experience of the practitioner. Reproducing what the medical practitioner feels requires more than a simple quasi-static stiffness measurement. This paper assesses the capacity of dynamic mechanical palpation to measure the changes in viscoelastic properties that soft tissue can exhibit under certain pathological conditions. A diagnostic framework is proposed to measure elastic and viscous behaviors simultaneously using a reduced set of viscoelastic parameters, giving a reliable index for quantitative assessment of tissue quality. The approach is illustrated on prostate models reconstructed from prostate MRI scans. The examples show that the change in viscoelastic time constant between healthy and cancerous tissue is a key index for quantitative diagnostics using point probing. The method is not limited to any particular tissue or material and is therefore useful for tissue where defining a unique time constant is not trivial. The proposed framework of quantitative assessment could become a useful tool in clinical diagnostics for soft tissue. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
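
    A viscoelastic time constant of the kind used as the diagnostic index can be estimated by fitting a relaxation model to a probing record. The sketch below fits a single-exponential stress-relaxation model to synthetic force data with SciPy; the model form and all numbers are illustrative assumptions, not the authors' reduced parameter set.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def relaxation(t, f_inf, f0, tau):
        """Single-exponential stress relaxation: F(t) = F_inf + (F0 - F_inf) e^(-t/tau)."""
        return f_inf + (f0 - f_inf) * np.exp(-t / tau)

    # Synthetic palpation record (force in N over 5 s); a real probe supplies these.
    t = np.linspace(0, 5, 200)
    force = relaxation(t, 0.8, 2.0, 1.2) + np.random.normal(0, 0.02, t.size)

    (f_inf, f0, tau), _ = curve_fit(relaxation, t, force, p0=[1.0, 2.0, 1.0])
    print(f"estimated time constant tau = {tau:.2f} s")
    ```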

  10. Quantitative design of emergency monitoring network for river chemical spills based on discrete entropy theory.

    PubMed

    Shi, Bin; Jiang, Jiping; Sivakumar, Bellie; Zheng, Yi; Wang, Peng

    2018-05-01

    Field monitoring strategy is critical for disaster preparedness and watershed emergency environmental management. However, developing such a strategy is also highly challenging. Despite the efforts and progress thus far, no definitive guidelines or solutions are available worldwide for quantitatively designing a monitoring network in response to river chemical spill incidents, except general rules based on administrative divisions or arbitrary interpolation on routine monitoring sections. To address this gap, a novel framework for spatial-temporal network design was proposed in this study. The framework combines contaminant transport modelling with discrete entropy theory and spectral analysis. The water quality model was applied to forecast the spatio-temporal distribution of contaminant after spills, and the corresponding information transfer indexes (ITIs) and Fourier approximation periodic functions were then estimated as critical measures for setting sampling locations and times. The results indicate that the framework can produce science-based preparedness plans for emergency monitoring from scenario analysis of spill risks, as well as rapid designs when an unanticipated incident occurs. The framework was applied to a hypothetical spill case based on a tracer experiment and to a real nitrobenzene spill incident to demonstrate its suitability and effectiveness. The newly designed temporal-spatial monitoring network captured the major pollution information at relatively low cost. It showed obvious benefits for follow-up early warning and treatment as well as for aftermath recovery and assessment. The underlying drivers of ITIs as well as the limitations and uncertainty of the approach were analyzed based on the case studies. Comparison with existing monitoring network design approaches, management implications, and generalized applicability were also discussed. Copyright © 2018 Elsevier Ltd. All rights reserved.
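
    The abstract does not spell out how the information transfer indexes are computed, so the sketch below shows one generic possibility only: a binned mutual-information score between modelled concentration series at two candidate sections, assuming the ITIs are built from Shannon entropies of the forecast field. All data are synthetic.

    ```python
    import numpy as np

    def mutual_information(x, y, bins=16):
        """Binned mutual information I(X;Y) in bits."""
        pxy, _, _ = np.histogram2d(x, y, bins=bins)
        pxy /= pxy.sum()
        px, py = pxy.sum(axis=1), pxy.sum(axis=0)
        nz = pxy > 0
        return float((pxy[nz] * np.log2(pxy[nz] / np.outer(px, py)[nz])).sum())

    # Hypothetical modelled concentrations at an upstream and a candidate section;
    # a high score suggests the candidate adds little independent information.
    rng = np.random.default_rng(0)
    upstream = rng.lognormal(size=500)
    candidate = 0.8 * upstream + 0.2 * rng.lognormal(size=500)
    print(f"ITI-style score: {mutual_information(upstream, candidate):.2f} bits")
    ```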

  11. Quantitative Analysis of Endocytic Recycling of Membrane Proteins by Monoclonal Antibody-Based Recycling Assays.

    PubMed

    Blagojević Zagorac, Gordana; Mahmutefendić, Hana; Maćešić, Senka; Karleuša, Ljerka; Lučin, Pero

    2017-03-01

    In this report, we present an analysis of several recycling protocols based on labeling of membrane proteins with specific monoclonal antibodies (mAbs). We analyzed the recycling of membrane proteins that are internalized by clathrin-dependent endocytosis, represented by the transferrin receptor, and by clathrin-independent endocytosis, represented by the Major Histocompatibility Class I molecules. Cell surface membrane proteins were labeled with mAbs, and recycling of mAb:protein complexes was determined by several approaches. Our study demonstrates that direct and indirect detection of recycled mAb:protein complexes at the cell surface underestimates the recycling pool, especially for clathrin-dependent membrane proteins that are rapidly reinternalized after recycling. Recycling protocols based on the capture of recycled mAb:protein complexes require the use of Alexa Fluor 488-conjugated or FITC-conjugated secondary antibodies in combination with inhibitors of endosomal acidification and degradation. Finally, protocols based on the capture of recycled proteins that are labeled with Alexa Fluor 488-conjugated primary antibodies and quenching of fluorescence by the anti-Alexa Fluor 488 antibody displayed the same quantitative assessment of recycling as the antibody-capture protocols. J. Cell. Physiol. 232: 463-476, 2017. © 2016 Wiley Periodicals, Inc.

  12. The mathematics of cancer: integrating quantitative models.

    PubMed

    Altrock, Philipp M; Liu, Lin L; Michor, Franziska

    2015-12-01

    Mathematical modelling approaches have become increasingly abundant in cancer research. The complexity of cancer is well suited to quantitative approaches as it provides challenges and opportunities for new developments. In turn, mathematical modelling contributes to cancer research by helping to elucidate mechanisms and by providing quantitative predictions that can be validated. The recent expansion of quantitative models addresses many questions regarding tumour initiation, progression and metastases as well as intra-tumour heterogeneity, treatment responses and resistance. Mathematical models can complement experimental and clinical studies, but also challenge current paradigms, redefine our understanding of mechanisms driving tumorigenesis and shape future research in cancer biology.

  13. Experiencing Teaching and Learning Quantitative Reasoning in a Project-Based Context

    ERIC Educational Resources Information Center

    Muir, Tracey; Beswick, Kim; Callingham, Rosemary; Jade, Katara

    2016-01-01

    This paper presents the findings of a small-scale study that investigated the issues and challenges of teaching and learning about quantitative reasoning (QR) within a project-based learning (PjBL) context. Students and teachers were surveyed and interviewed about their experiences of learning and teaching QR in that context in contrast to…

  14. Quantitative maps of genetic interactions in yeast - comparative evaluation and integrative analysis.

    PubMed

    Lindén, Rolf O; Eronen, Ville-Pekka; Aittokallio, Tero

    2011-03-24

    High-throughput genetic screening approaches have enabled systematic means to study how interactions among gene mutations contribute to quantitative fitness phenotypes, with the aim of providing insights into the functional wiring diagrams of genetic interaction networks on a global scale. However, it is poorly known how well these quantitative interaction measurements agree across the screening approaches, which hinders their integrated use toward improving the coverage and quality of the genetic interaction maps in yeast and other organisms. Using large-scale data matrices from epistatic miniarray profiling (E-MAP), genetic interaction mapping (GIM), and synthetic genetic array (SGA) approaches, we carried out here a systematic comparative evaluation among these quantitative maps of genetic interactions in yeast. The relatively low association between the original interaction measurements or their customized scores could be improved using a matrix-based modelling framework, which enables the use of single- and double-mutant fitness estimates and measurements, respectively, when scoring genetic interactions. Toward an integrative analysis, we show how the detections from the different screening approaches can be combined to suggest novel positive and negative interactions which are complementary to those obtained using any single screening approach alone. The matrix approximation procedure has been made available to support the design and analysis of the future screening studies. We have shown here that even if the correlation between the currently available quantitative genetic interaction maps in yeast is relatively low, their comparability can be improved by means of our computational matrix approximation procedure, which will enable integrative analysis and detection of a wider spectrum of genetic interactions using data from the complementary screening approaches.

  15. Quantitative habitability.

    PubMed

    Shock, Everett L; Holland, Melanie E

    2007-12-01

    A framework is proposed for a quantitative approach to studying habitability. Considerations of environmental supply and organismal demand of energy lead to the conclusions that power units are most appropriate and that the units for habitability become watts per organism. Extreme and plush environments are revealed to be on a habitability continuum, and extreme environments can be quantified as those where power supply only barely exceeds demand. Strategies for laboratory and field experiments are outlined that would quantify power supplies, power demands, and habitability. An example involving a comparison of various metabolisms pursued by halophiles is shown to be well on the way to a quantitative habitability analysis.

  16. Quantitative analysis of drug distribution by ambient mass spectrometry imaging method with signal extinction normalization strategy and inkjet-printing technology.

    PubMed

    Luo, Zhigang; He, Jingjing; He, Jiuming; Huang, Lan; Song, Xiaowei; Li, Xin; Abliz, Zeper

    2018-03-01

    Quantitative mass spectrometry imaging (MSI) is a robust approach that provides both quantitative and spatial information for drug candidate research. However, because of complicated signal suppression and interference, acquiring accurate quantitative information from MSI data remains a challenge, especially for whole-body tissue samples. Ambient MSI techniques using spray-based ionization appear to be ideal for pharmaceutical quantitative MSI analysis. However, this is more challenging still, as it involves almost no sample preparation and is more susceptible to ion suppression/enhancement. Herein, based on our previously developed air flow-assisted desorption electrospray ionization (AFADESI)-MSI technology, an ambient quantitative MSI method was introduced by integrating inkjet-printing technology with normalization of the signal extinction coefficient (SEC) using the target compound itself. The method utilizes a single calibration curve to quantify multiple tissue types. Basic blue 7 and an antitumor drug candidate (S-(+)-deoxytylophorinidine, CAT) were chosen to initially validate the feasibility and reliability of the quantitative MSI method. Rat tissue sections (heart, kidney, and brain) from animals administered CAT were then analyzed. The quantitative MSI results were cross-validated against LC-MS/MS data from the same tissues. The consistency suggests that the approach can rapidly obtain quantitative MSI data without introducing interference into the in-situ environment of the tissue sample, and has the potential to provide a high-throughput, economical and reliable approach for drug discovery and development. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. [A quantitative approach to sports training-adapted social determinants concerning sport].

    PubMed

    Alvis-Gómez, Martina K; Neira-Tolosa, Nury A

    2013-01-01

    Identifying and quantitatively analysing social determinants affecting disabled teenagers' inclusion/exclusion in high-performance sports. This was a descriptive cross-sectional study involving 19 athletes aged 12 to 19 years with physical and sensory disabilities and 17 staff from the District Institute of Recreation and Sport. Likert-type rating scales were used, based on four analysis categories: social structure, socio-economic, educational and living-condition determinants. Social inequity pervades the national paralympic sports system: 74 % of individuals only become recognised as sportspeople once they have obtained meritorious results in formal competition, without the paralympic sports institutions having previously provided the conditions needed to overcome structural and intermediate barriers. The social structure imposed on district-based paralympic sport stigmatises individuals with regard to their individual abilities and affects their empowerment and freedom, owing to the discrimination disabled teenagers experience in relation to their competitive achievements.

  18. The Vinyl Acetate Content of Packaging Film: A Quantitative Infrared Experiment.

    ERIC Educational Resources Information Center

    Allpress, K. N.; And Others

    1981-01-01

    Presents an experiment used in laboratory technician training courses to illustrate the quantitative use of infrared spectroscopy which is based on industrial and laboratory procedures for the determination of vinyl acetate levels in ethylene vinyl acetate packaging films. Includes three approaches to allow for varying path lengths (film…

  19. Quantitative Detection of Cracks in Steel Using Eddy Current Pulsed Thermography.

    PubMed

    Shi, Zhanqun; Xu, Xiaoyu; Ma, Jiaojiao; Zhen, Dong; Zhang, Hao

    2018-04-02

    Small cracks are common defects in steel and often lead to catastrophic accidents in industrial applications. Various nondestructive testing methods have been investigated for crack detection; however, most current methods focus on qualitative crack identification and image processing. In this study, eddy current pulsed thermography (ECPT) was applied for quantitative crack detection based on derivative analysis of the temperature variation. The effects of the excitation parameters on the temperature variation were analyzed in a simulation study. The crack profile and position are identified in the thermal image using the Canny edge detection algorithm. One or more trajectories are then drawn across the crack profile so that the crack boundary can be determined from its temperature distribution, and the slope curve along each trajectory is obtained. Finally, quantitative analysis of the crack sizes was performed by analyzing the features of the slope curves. Experimental verification showed that the crack sizes could be quantitatively detected with errors of less than 1%. Therefore, the proposed ECPT method was demonstrated to be a feasible and effective nondestructive approach for quantitative crack detection.
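
    The processing chain (edge detection, trajectory, slope curve) maps naturally onto a few OpenCV/NumPy calls. The sketch below is a generic rendering of that chain; the input (a floating-point thermal frame plus a row index for the trajectory) and the Canny thresholds are assumptions, not the paper's calibrated values.

    ```python
    import cv2
    import numpy as np

    def crack_slope_curve(thermal_img, row):
        """Locate the crack profile (Canny) and return the temperature-slope
        curve along a horizontal trajectory crossing the crack."""
        img8 = cv2.normalize(thermal_img, None, 0, 255,
                             cv2.NORM_MINMAX).astype(np.uint8)
        edges = cv2.Canny(img8, 50, 150)             # crack profile and position
        profile = thermal_img[row, :].astype(float)  # temperature along trajectory
        slope = np.gradient(profile)                 # derivative of temperature
        return edges, slope

    # Sizing heuristic (illustrative): the distance between the extrema of the
    # slope curve approximates the crack width along the trajectory.
    ```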

  20. Distinguishing nanomaterial particles from background airborne particulate matter for quantitative exposure assessment

    NASA Astrophysics Data System (ADS)

    Ono-Ogasawara, Mariko; Serita, Fumio; Takaya, Mitsutoshi

    2009-10-01

    As the production of engineered nanomaterials expands, the chance that workers involved in the manufacturing process will be exposed to nanoparticles also increases. A risk management system based on the precautionary principle is needed for workplaces in the nanomaterial industry. One of the problems in such a risk management system is the difficulty of exposure assessment. In this article, examples of exposure assessment in nanomaterial industries are reviewed, with a focus on distinguishing engineered nanomaterial particles from background nanoparticles (NPs) in the workplace atmosphere. An approach by JNIOSH (Japan National Institute of Occupational Safety and Health) to quantitatively measure exposure to carbonaceous nanomaterials is also introduced. In addition to real-time measurements and qualitative analysis by electron microscopy, quantitative chemical analysis is necessary for quantitatively assessing exposure to nanomaterials. Chemical analysis is suitable for quantitative exposure measurement, especially at facilities with high levels of background NPs.

  1. A statistical framework for protein quantitation in bottom-up MS-based proteomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karpievitch, Yuliya; Stanley, Jeffrey R.; Taverner, Thomas

    2009-08-15

    Motivation: Quantitative mass spectrometry-based proteomics requires protein-level estimates and confidence measures. Challenges include the presence of low-quality or incorrectly identified peptides and widespread, informative, missing data. Furthermore, models are required for rolling peptide-level information up to the protein level. Results: We present a statistical model for protein abundance in terms of peptide peak intensities, applicable to both label-based and label-free quantitation experiments. The model allows for both random and censoring missingness mechanisms and provides naturally for protein-level estimates and confidence measures. The model is also used to derive automated filtering and imputation routines. Three LC-MS datasets are used to illustrate the methods. Availability: The software has been made available in the open-source proteomics platform DAnTE (Polpitiya et al. (2008)) (http://omics.pnl.gov/software/). Contact: adabney@stat.tamu.edu

  2. Evaluation of Quantitative Performance of Sequential Immobilized Metal Affinity Chromatographic Enrichment for Phosphopeptides

    PubMed Central

    Sun, Zeyu; Hamilton, Karyn L.; Reardon, Kenneth F.

    2014-01-01

    We evaluated a sequential elution protocol from immobilized metal affinity chromatography (SIMAC) employing gallium-based immobilized metal affinity chromatography (IMAC) in conjunction with titanium-dioxide-based metal oxide affinity chromatography (MOAC). The quantitative performance of this SIMAC enrichment approach, assessed in terms of repeatability, dynamic range, and linearity, was evaluated using a mixture composed of tryptic peptides from caseins, bovine serum albumin, and phosphopeptide standards. While our data demonstrate the overall consistent performance of the SIMAC approach under various loading conditions, the results also revealed that the method had limited repeatability and linearity for most phosphopeptides tested, and different phosphopeptides were found to have different linear ranges. These data suggest that, unless additional strategies are used, SIMAC should be regarded as a semi-quantitative method when used in large-scale phosphoproteomics studies in complex backgrounds. PMID:24096195

  3. Identification and quantitation of semi-crystalline microplastics using image analysis and differential scanning calorimetry.

    PubMed

    Rodríguez Chialanza, Mauricio; Sierra, Ignacio; Pérez Parada, Andrés; Fornaro, Laura

    2018-06-01

    Several techniques are used to analyze microplastics, often based on a combination of visual and spectroscopic methods. Here we introduce an alternative workflow for identification and mass quantitation through a combination of optical microscopy with image analysis (IA) and differential scanning calorimetry (DSC). We studied four synthetic polymers of environmental concern: low- and high-density polyethylene (LDPE and HDPE, respectively), polypropylene (PP), and polyethylene terephthalate (PET). Selected experiments were conducted to investigate (i) particle characterization and counting procedures based on image analysis with open-source software, (ii) chemical identification of microplastics based on DSC signal processing, (iii) the dependence of the DSC signal on particle size, and (iv) quantitation of microplastic mass based on the DSC signal. We describe the potential and limitations of these techniques for increasing the reliability of microplastic analysis. Particle size proved to have a particular influence on the qualitative and quantitative performance of the DSC signals: both identification (based on the characteristic onset temperature) and mass quantitation (based on heat flow) were affected by particle size. As a result, a proper sample treatment that includes sieving of suspended particles is required for this analytical approach.
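
    The mass-quantitation step reduces to dividing the integrated melting endotherm by a polymer-specific enthalpy of fusion. A minimal sketch follows; the enthalpy values are rough placeholders for semi-crystalline material, not the calibration constants determined in the paper.

    ```python
    # Placeholder enthalpies of fusion for semi-crystalline polymers (J/g);
    # a real workflow would use calibrated, polymer-specific values.
    ENTHALPY_J_PER_G = {"LDPE": 140.0, "HDPE": 210.0, "PP": 100.0, "PET": 60.0}

    def microplastic_mass_mg(peak_heat_mJ, polymer):
        """Mass from a DSC melting peak: m = integrated heat / enthalpy of fusion."""
        return peak_heat_mJ / ENTHALPY_J_PER_G[polymer]   # mJ / (mJ/mg) == mg

    print(f"{microplastic_mass_mg(8.4, 'PP'):.3f} mg of PP")
    ```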

  4. Establishment of the purity values of carbohydrate certified reference materials using quantitative nuclear magnetic resonance and mass balance approach.

    PubMed

    Quan, Can

    2014-06-15

    This work describes the assignment of purity values to six carbohydrate certified reference materials (CRMs), namely glucose, fructose, galactose, lactose, xylose and sucrose, according to ISO Guides 34 and 35. The purity values were assigned based on the weighted average of a quantitative nuclear magnetic resonance method and a mass balance approach with high-resolution liquid chromatography-evaporative light scattering detection. All six CRMs were certified, with the following amount fractions (certified purity P ± U, k = 2): glucose (GBW10062) at (0.99 ± 0.005)%; fructose (GBW10063) at (0.99 ± 0.005)%; galactose (GBW10064) at (0.99 ± 0.007)%; lactose (GBW10065) at (0.99 ± 0.008)%; xylose (GBW10066) at (0.99 ± 0.007)%; and sucrose (GBW10067) at (0.99 ± 0.008)%. The homogeneity of the CRMs was determined by an in-house validated liquid chromatographic method. Potential degradation during storage was also investigated and a shelf life was established on this basis. Copyright © 2013 Elsevier Ltd. All rights reserved.
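
    Combining two independent purity determinations into a weighted average is commonly done with inverse-variance weights. The sketch below illustrates that computation with hypothetical qNMR and mass-balance values; the paper's actual weighting scheme and figures are not given in the abstract.

    ```python
    import numpy as np

    def weighted_purity(values, uncertainties):
        """Inverse-variance weighted mean of independent purity determinations
        (e.g., qNMR and mass balance) and its combined standard uncertainty."""
        w = 1.0 / np.asarray(uncertainties, float) ** 2
        return np.average(values, weights=w), np.sqrt(1.0 / w.sum())

    # Hypothetical inputs: qNMR 99.92 +/- 0.10 %, mass balance 99.88 +/- 0.06 %
    p, u = weighted_purity([99.92, 99.88], [0.10, 0.06])
    print(f"certified purity = ({p:.2f} +/- {2 * u:.2f}) %  (k = 2)")
    ```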

  5. Rapid and Quantitative Detection of Vibrio parahemolyticus by the Mixed-Dye-Based Loop-Mediated Isothermal Amplification Assay on a Self-Priming Compartmentalization Microfluidic Chip.

    PubMed

    Pang, Bo; Ding, Xiong; Wang, Guoping; Zhao, Chao; Xu, Yanan; Fu, Kaiyue; Sun, Jingjing; Song, Xiuling; Wu, Wenshuai; Liu, Yushen; Song, Qi; Hu, Jiumei; Li, Juan; Mu, Ying

    2017-12-27

    Vibrio parahemolyticus (VP), mostly isolated from aquatic products, is one of the major causes of bacterial food-poisoning events worldwide, which could be reduced by a practical on-site detection method. Herein, a rapid and quantitative method for VP detection was developed by applying a mixed-dye-loaded loop-mediated isothermal amplification (LAMP) assay on a self-priming compartmentalization (SPC) microfluidic chip, termed on-chip mixed-dye-based LAMP (CMD-LAMP). In comparison to conventional approaches, CMD-LAMP offered an advantageous limit of detection, reaching down to 1 × 10(3) CFU/mL in food-contaminated samples without pre-enrichment of bacteria. Additionally, as a result of the use of the mixed dye and the SPC chip, the quantitative result could be easily acquired, avoiding the need for sophisticated instruments and tedious operation. CMD-LAMP is also rapid and cost-effective. In conclusion, CMD-LAMP has great potential for realizing on-site quantitative analysis of VP for food safety.

  6. Quantitative consensus of supervised learners for diffuse lung parenchymal HRCT patterns

    NASA Astrophysics Data System (ADS)

    Raghunath, Sushravya; Rajagopalan, Srinivasan; Karwoski, Ronald A.; Bartholmai, Brian J.; Robb, Richard A.

    2013-03-01

    Automated lung parenchymal classification usually relies on supervised learning of expert-chosen regions representative of the visually differentiable HRCT patterns specific to different pathologies (e.g., emphysema, ground glass, honeycombing, reticular and normal). Considering the elusiveness of a single most discriminating similarity measure, a plurality of weak learners can be combined to improve machine learnability. Though a number of quantitative combination strategies exist, their efficacy is data and domain dependent. In this paper, we investigate multiple (N=12) quantitative consensus approaches to combine the clusters obtained with multiple (n=33) probability density-based similarity measures. Our study shows that hypergraph-based meta-clustering and probabilistic clustering provide optimal expert-metric agreement.

  7. iTRAQ-Based Quantitative Proteomic Analysis of Spirulina platensis in Response to Low Temperature Stress

    PubMed Central

    Li, Qingye; Chang, Rong; Sun, Yijun; Li, Bosheng

    2016-01-01

    Low temperature (LT) is one of the most important abiotic stresses that can significantly reduce crop yield. To gain insight into how Spirulina responds to LT stress, comprehensive physiological and proteomic analyses were conducted in this study. Significant decreases in growth and pigment levels as well as excessive accumulation of compatible osmolytes were observed in response to LT stress. An isobaric tag for relative and absolute quantitation (iTRAQ)-based quantitative proteomics approach was used to identify changes in protein abundance in Spirulina under LT. A total of 3,782 proteins were identified, of which 1,062 showed differential expression. Bioinformatics analysis indicated that differentially expressed proteins that were enriched in photosynthesis, carbohydrate metabolism, amino acid biosynthesis, and translation are important for the maintenance of cellular homeostasis and metabolic balance in Spirulina when subjected to LT stress. The up-regulation of proteins involved in gluconeogenesis, starch and sucrose metabolism, and amino acid biosynthesis served as coping mechanisms of Spirulina in response to LT stress. Moreover, the down-regulated expression of proteins involved in glycolysis, TCA cycle, pentose phosphate pathway, photosynthesis, and translation were associated with reduced energy consumption. The findings of the present study allow a better understanding of the response of Spirulina to LT stress and may facilitate in the elucidation of mechanisms underlying LT tolerance. PMID:27902743

  9. Ultra-fast quantitative imaging using ptychographic iterative engine based digital micro-mirror device

    NASA Astrophysics Data System (ADS)

    Sun, Aihui; Tian, Xiaolin; Kong, Yan; Jiang, Zhilong; Liu, Fei; Xue, Liang; Wang, Shouyu; Liu, Cheng

    2018-01-01

    As a lensfree imaging technique, the ptychographic iterative engine (PIE) method can provide both quantitative sample amplitude and phase distributions while avoiding aberration. However, it requires field-of-view (FoV) scanning, often relying on mechanical translation, which not only slows down measurement but also introduces mechanical errors that decrease both the resolution and the accuracy of the retrieved information. To achieve highly accurate quantitative imaging at high speed, a digital micromirror device (DMD) is adopted in PIE, with large-FoV scanning controlled by DMD on/off state coding. Measurements on biological samples as well as a USAF resolution target demonstrated the high resolution of quantitative imaging with the proposed system. Considering its fast and accurate imaging capability, it is believed that the DMD-based PIE technique provides a potential solution for medical observation and measurement.

  10. A new systematic and quantitative approach to characterization of surface nanostructures using fuzzy logic

    NASA Astrophysics Data System (ADS)

    Al-Mousa, Amjed A.

    Thin films are essential constituents of modern electronic devices and have a multitude of applications in such devices. The impact of the surface morphology of thin films on the device characteristics where these films are used has generated substantial attention to advanced film characterization techniques. In this work, we present a new approach to characterize surface nanostructures of thin films by focusing on isolating nanostructures and extracting quantitative information, such as the shape and size of the structures. This methodology is applicable to any Scanning Probe Microscopy (SPM) data, such as Atomic Force Microscopy (AFM) data which we are presenting here. The methodology starts by compensating the AFM data for some specific classes of measurement artifacts. After that, the methodology employs two distinct techniques. The first, which we call the overlay technique, proceeds by systematically processing the raster data that constitute the scanning probe image in both vertical and horizontal directions. It then proceeds by classifying points in each direction separately. Finally, the results from both the horizontal and the vertical subsets are overlaid, where a final decision on each surface point is made. The second technique, based on fuzzy logic, relies on a Fuzzy Inference Engine (FIE) to classify the surface points. Once classified, these points are clustered into surface structures. The latter technique also includes a mechanism which can consistently distinguish crowded surfaces from those with sparsely distributed structures and then tune the fuzzy technique system uniquely for that surface. Both techniques have been applied to characterize organic semiconductor thin films of pentacene on different substrates. Also, we present a case study to demonstrate the effectiveness of our methodology to identify quantitatively particle sizes of two specimens of gold nanoparticles of different nominal dimensions dispersed on a mica surface. A comparison

  11. Advances in multiplexed MRM-based protein biomarker quantitation toward clinical utility.

    PubMed

    Percy, Andrew J; Chambers, Andrew G; Yang, Juncong; Hardie, Darryl B; Borchers, Christoph H

    2014-05-01

    Accurate and rapid protein quantitation is essential for screening biomarkers for disease stratification and monitoring, and for validating the hundreds of putative markers in human biofluids, including blood plasma. An analytical method that utilizes stable isotope-labeled standard (SIS) peptides and selected/multiple reaction monitoring-mass spectrometry (SRM/MRM-MS) has emerged as a promising technique for determining protein concentrations. This targeted approach has analytical merit, but its true potential (in terms of sensitivity and multiplexing) has yet to be realized. Described herein is a method that extends the multiplexing ability of the MRM method to enable the quantitation of 142 high-to-moderate abundance proteins (from 31 mg/mL to 44 ng/mL) in undepleted and non-enriched human plasma in a single run. The proteins have been reported to be associated with a wide variety of non-communicable diseases (NCDs), from cardiovascular disease (CVD) to diabetes. The concentrations of these proteins in human plasma are inferred from interference-free peptides functioning as molecular surrogates (2 peptides per protein, on average). A revised data analysis strategy, involving the linear regression equation of normal control plasma, has been instituted to enable facile application to patient samples, as demonstrated in separate nutrigenomics and CVD studies. The exceptional robustness of the LC/MS platform and the quantitative method, as well as its high throughput, makes the assay suitable for application to patient samples for the verification of a condensed or complete protein panel. This article is part of a Special Issue entitled: Biomarkers: A Proteomic Challenge. © 2013.
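
    Back-calculating a plasma concentration from a measured endogenous/SIS peak-area ratio via a standard-curve regression takes only a few lines. The curve points below are hypothetical, and a simple unweighted fit is assumed (targeted assays often use 1/x or 1/x(2) weighting in practice).

    ```python
    import numpy as np

    # Hypothetical standard curve for one surrogate peptide: area ratio vs
    # spiked concentration (ng/mL) in control plasma.
    conc = np.array([5.0, 10.0, 50.0, 100.0, 500.0, 1000.0])
    ratio = np.array([0.011, 0.021, 0.10, 0.20, 1.02, 1.98])

    slope, intercept = np.polyfit(conc, ratio, 1)     # linear regression

    def quantify(sample_ratio):
        """Back-calculate concentration from a measured area ratio."""
        return (sample_ratio - intercept) / slope

    print(f"{quantify(0.45):.0f} ng/mL")
    ```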

  12. Climate change and dengue: a critical and systematic review of quantitative modelling approaches

    PubMed Central

    2014-01-01

    Background Many studies have found associations between climatic conditions and dengue transmission. However, there is a debate about the future impacts of climate change on dengue transmission. This paper reviewed epidemiological evidence on the relationship between climate and dengue with a focus on quantitative methods for assessing the potential impacts of climate change on global dengue transmission. Methods A literature search was conducted in October 2012, using the electronic databases PubMed, Scopus, ScienceDirect, ProQuest, and Web of Science. The search focused on peer-reviewed journal articles published in English from January 1991 through October 2012. Results Sixteen studies met the inclusion criteria and most studies showed that the transmission of dengue is highly sensitive to climatic conditions, especially temperature, rainfall and relative humidity. Studies on the potential impacts of climate change on dengue indicate increased climatic suitability for transmission and an expansion of the geographic regions at risk during this century. A variety of quantitative modelling approaches were used in the studies. Several key methodological issues and current knowledge gaps were identified through this review. Conclusions It is important to assemble spatio-temporal patterns of dengue transmission compatible with long-term data on climate and other socio-ecological changes and this would advance projections of dengue risks associated with climate change. PMID:24669859

  13. FRET-based genetically-encoded sensors for quantitative monitoring of metabolites.

    PubMed

    Mohsin, Mohd; Ahmad, Altaf; Iqbal, Muhammad

    2015-10-01

    Neighboring cells in the same tissue can exist in different states of dynamic activity. After genomics, proteomics and metabolomics, fluxomics is now equally important for generating accurate quantitative information on the cellular and sub-cellular dynamics of ions and metabolites, which is critical for a functional understanding of organisms. Various spectrometry techniques are used for monitoring ions and metabolites, although their temporal and spatial resolutions are limited. The discovery of fluorescent proteins and their variants has revolutionized cell biology. Novel tools therefore need to be deployed in specific cells and targeted to sub-cellular compartments in order to quantify target-molecule dynamics directly; we require tools that can measure cellular activities and protein dynamics with sub-cellular resolution. Biosensors based on fluorescence resonance energy transfer (FRET) are genetically encoded and hence can specifically target sub-cellular organelles by fusion to proteins or targeting sequences. Over the last decade, FRET-based genetically encoded sensors for molecules involved in energy production, reactive oxygen species and secondary messengers have helped to unravel key aspects of cellular physiology. This review describes the design and principles of such sensors, presents a database of sensors for different analytes/processes, and illustrates examples of their application in quantitative live-cell imaging.

  14. Smartphone based visual and quantitative assays on upconversional paper sensor.

    PubMed

    Mei, Qingsong; Jing, Huarong; Li, You; Yisibashaer, Wuerzha; Chen, Jian; Nan Li, Bing; Zhang, Yong

    2016-01-15

    The integration of smartphones with paper sensors has recently gained increasing attention because it enables quantitative and rapid analysis. However, smartphone-based upconversional paper sensors have been restricted by the lack of effective methods to acquire luminescence signals on test paper. Herein, by virtue of 3D printing technology, we developed an auxiliary reusable device, which assembles a 980 nm mini-laser, an optical filter and a mini-cavity together, for digitally imaging the luminescence variations on test paper and quantitatively analyzing the pesticide thiram by smartphone. In detail, copper ion-decorated NaYF4:Yb/Tm upconversion nanoparticles were fixed onto filter paper to form the test paper, and the blue luminescence on it is quenched after addition of thiram through a luminescence resonance energy transfer mechanism. These variations can be monitored by the smartphone camera, and the blue-channel intensities of the resulting colour images are then calculated to quantify the amount of thiram through a self-written Android program installed on the smartphone, offering a reliable and accurate detection limit of 0.1 μM for the system. This work provides an initial demonstration of integrating upconversion nanosensors with smartphone digital imaging for point-of-care analysis on a paper-based platform. Copyright © 2015 Elsevier B.V. All rights reserved.
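
    A rough sketch of the image-analysis step follows: the mean blue-channel value of a fixed region of interest is read from the photographs and mapped to concentration through a Stern-Volmer-like quenching calibration. The file names, ROI and calibration constant are invented for illustration; the authors' Android implementation is not reproduced here.

      # Sketch (hypothetical files and constants): blue-channel quenching read-out.
      import numpy as np
      from PIL import Image

      def blue_intensity(path, box):
          """Mean blue-channel value inside a (left, top, right, bottom) ROI."""
          img = Image.open(path).convert("RGB").crop(box)
          return np.asarray(img)[:, :, 2].mean()

      # Assumed calibration of the form I0/I = 1 + k*[thiram]
      roi = (100, 100, 200, 200)
      I0 = blue_intensity("blank.jpg", roi)     # blank test spot
      I = blue_intensity("sample.jpg", roi)     # spot after thiram addition
      k = 0.85                                  # per uM, assumed from standards
      print("thiram ~ %.2f uM" % ((I0 / I - 1) / k))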

  15. An Efficient Audio Coding Scheme for Quantitative and Qualitative Large Scale Acoustic Monitoring Using the Sensor Grid Approach

    PubMed Central

    Gontier, Félix; Lagrange, Mathieu; Can, Arnaud; Lavandier, Catherine

    2017-01-01

    The spreading of urban areas and the growth of human population worldwide raise societal and environmental concerns. To better address these concerns, the monitoring of the acoustic environment in urban as well as rural or wilderness areas is an important matter. Building on the recent development of low cost hardware acoustic sensors, we propose in this paper to consider a sensor grid approach to tackle this issue. In this kind of approach, the crucial question is the nature of the data that are transmitted from the sensors to the processing and archival servers. To this end, we propose an efficient audio coding scheme based on a third-octave-band spectral representation that allows: (1) the estimation of standard acoustic indicators; and (2) the recognition of acoustic events at state-of-the-art performance rates. The former is useful to provide quantitative information about the acoustic environment, while the latter is useful to gather qualitative information and build perceptually motivated indicators using, for example, the emergence of a given sound source. The coding scheme is also demonstrated to transmit spectrally encoded data that, reverted to the time domain using state-of-the-art techniques, are not intelligible, thus protecting the privacy of citizens. PMID:29186021
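
    The band representation at the heart of the scheme can be sketched as follows: power in standard third-octave bands is summed from a windowed FFT of each audio frame. Parameter choices below (frame length, band range) are illustrative assumptions, not the paper's specification.

      # Sketch: third-octave band levels (dB) from one PCM frame.
      import numpy as np

      def third_octave_levels(x, fs, f_min=25.0, n_bands=29):
          """Band levels on standard third-octave center frequencies."""
          spec = np.abs(np.fft.rfft(x * np.hanning(len(x)))) ** 2
          freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
          centers = f_min * (2.0 ** (np.arange(n_bands) / 3.0))
          levels = []
          for fc in centers:
              lo, hi = fc / 2 ** (1 / 6), fc * 2 ** (1 / 6)   # band edges
              band = spec[(freqs >= lo) & (freqs < hi)]
              levels.append(10 * np.log10(band.sum() + 1e-12))
          return centers, np.array(levels)

      fs = 44100
      t = np.arange(fs) / fs                                   # 1 s test tone
      centers, levels = third_octave_levels(np.sin(2 * np.pi * 1000 * t), fs)
      print(centers[np.argmax(levels)])                        # -> band nearest 1 kHz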

  16. Three-dimensional modeling and quantitative analysis of gap junction distributions in cardiac tissue.

    PubMed

    Lackey, Daniel P; Carruth, Eric D; Lasher, Richard A; Boenisch, Jan; Sachse, Frank B; Hitchcock, Robert W

    2011-11-01

    Gap junctions play a fundamental role in intercellular communication in cardiac tissue. Various types of heart disease, including hypertrophy and ischemia, are associated with alterations of the spatial arrangement of gap junctions. Previous studies applied two-dimensional optical and electron microscopy to visualize gap junction arrangements. In normal cardiomyocytes, gap junctions were primarily found at cell ends, but were also found in more central regions. In this study, we extended these approaches toward three-dimensional reconstruction of gap junction distributions based on high-resolution scanning confocal microscopy and image processing. We developed methods for quantitative characterization of gap junction distributions based on analysis of intensity profiles along the principal axes of myocytes. The analyses characterized gap junction polarization at cell ends and higher-order statistical image moments of intensity profiles. The methodology was tested in rat ventricular myocardium. Our analysis yielded novel quantitative data on gap junction distributions. In particular, the analysis demonstrated that the distributions exhibit significant variability with respect to polarization, skewness, and kurtosis. We suggest that this methodology provides a quantitative alternative to current approaches based on visual inspection, with applications in particular in characterization of engineered and diseased myocardium. Furthermore, we propose that these data provide improved input for computational modeling of cardiac conduction.
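
    The profile statistics named above (end polarization, skewness, kurtosis) are standard quantities; a minimal sketch on a stand-in intensity profile might look like this, with the 10% end-fraction chosen arbitrarily for illustration.

      # Sketch: profile statistics on synthetic stand-in data.
      import numpy as np
      from scipy.stats import skew, kurtosis

      profile = np.random.default_rng(0).random(200)    # stand-in intensity profile
      ends = np.r_[profile[:20], profile[-20:]]         # 10% of samples at each cell end
      polarization = ends.sum() / profile.sum()         # fraction of signal at the ends
      print(polarization, skew(profile), kurtosis(profile))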

  17. Fiber optic based multiparametric spectroscopy in vivo: Toward a new quantitative tissue vitality index

    NASA Astrophysics Data System (ADS)

    Kutai-Asis, Hofit; Barbiro-Michaely, Efrat; Deutsch, Assaf; Mayevsky, Avraham

    2006-02-01

    In our previous publication (Mayevsky et al., SPIE 5326: 98-105, 2004) we described a multiparametric fiber optic system enabling the evaluation of 4 physiological parameters as indicators of tissue vitality. Since the correlation between the various parameters may differ under various pathophysiological conditions, there is a need for an objective quantitative index that integrates the relative changes measured in real time by the multiparametric monitoring system into a single number, the vitality index. Such an approach to calculating a tissue vitality index is critical if such an instrument is to be used in clinical environments. In the current presentation we report preliminary results indicating that calculation of an objective tissue vitality index is feasible. We used an intuitive empirical approach based on the comparison between the index calculated by the computer and the subjective evaluation made by an expert in the field of physiological monitoring. We used the in vivo brain of rats as an animal model in our current studies. The rats were exposed to anoxia, ischemia and cortical spreading depression, and the responses were recorded in real time. At the end of the monitoring session the results were analyzed and the tissue vitality index was calculated offline. Mitochondrial NADH, tissue blood flow and oxy-hemoglobin were used to calculate the vitality index of the brain in vivo, with each parameter receiving a different weight in each experiment type, based on its significance. It was found that the mitochondrial NADH response was the main factor affecting the calculated vitality index.
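
    In its simplest form, such an index is a weighted sum of the relative changes of the monitored parameters. The sketch below shows only that arithmetic; the weights and input values are hypothetical, and the study's actual weighting was tuned per experiment type.

      # Sketch (hypothetical weights): weighted combination of relative changes.
      import numpy as np

      def vitality_index(nadh, flow, hbo2, weights=(0.5, 0.3, 0.2)):
          """Combine relative changes from baseline into a single number."""
          return float(np.dot(weights, [nadh, flow, hbo2]))

      print(vitality_index(nadh=-0.4, flow=-0.6, hbo2=-0.5))  # illustrative inputs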

  18. Evaluation of empirical rule of linearly correlated peptide selection (ERLPS) for proteotypic peptide-based quantitative proteomics.

    PubMed

    Liu, Kehui; Zhang, Jiyang; Fu, Bin; Xie, Hongwei; Wang, Yingchun; Qian, Xiaohong

    2014-07-01

    Precise protein quantification is essential in comparative proteomics. Currently, quantification bias is inevitable when using a proteotypic peptide-based quantitative proteomics strategy, owing to differences in peptide measurability. To improve quantification accuracy, we proposed an "empirical rule for linearly correlated peptide selection (ERLPS)" in quantitative proteomics in our previous work. However, a systematic evaluation of the general application of ERLPS in quantitative proteomics under diverse experimental conditions needed to be conducted. In this study, the practical workflow of ERLPS is explicitly illustrated; different experimental variables, such as different MS systems, sample complexities, sample preparations, elution gradients, matrix effects, loading amounts, and other factors, were comprehensively investigated to evaluate the applicability, reproducibility, and transferability of ERLPS. The results demonstrated that ERLPS was highly reproducible and transferable within appropriate loading amounts and that linearly correlated response peptides should be selected for each specific experiment. ERLPS was applied to proteome samples from yeast to mouse and human, and to quantitative methods from label-free to 18O/16O-labeled and SILAC analysis, and enabled accurate measurements for all proteotypic peptide-based quantitative proteomics over a large dynamic range. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Applying Quantitative Approaches to the Formative Evaluation of Antismoking Campaign Messages

    PubMed Central

    Parvanta, Sarah; Gibson, Laura; Forquer, Heather; Shapiro-Luft, Dina; Dean, Lorraine; Freres, Derek; Lerman, Caryn; Mallya, Giridhar; Moldovan-Johnson, Mihaela; Tan, Andy; Cappella, Joseph; Hornik, Robert

    2014-01-01

    This article shares an in-depth summary of a formative evaluation that used quantitative data to inform the development and selection of promotional ads for the antismoking communication component of a social marketing campaign. A foundational survey provided cross-sectional data to identify beliefs about quitting smoking that campaign messages should target, as well as beliefs to avoid. Pretesting draft ads against quantitative indicators of message effectiveness further facilitated the selection and rejection of final campaign ads. Finally, we consider lessons learned from the process of balancing quantitative methods and judgment to make formative decisions about more and less promising persuasive messages for campaigns. PMID:24817829

  20. Quantitative structure-property relationship (correlation analysis) of phosphonic acid-based chelates in design of MRI contrast agent.

    PubMed

    Tiwari, Anjani K; Ojha, Himanshu; Kaul, Ankur; Dutta, Anupama; Srivastava, Pooja; Shukla, Gauri; Srivastava, Rakesh; Mishra, Anil K

    2009-07-01

    Nuclear magnetic resonance imaging is a very useful tool in modern medical diagnostics, especially when gadolinium(III)-based contrast agents are administered to the patient with the aim of increasing the image contrast between normal and diseased tissues. Using soft modelling techniques such as quantitative structure-activity relationship/quantitative structure-property relationship analysis after a suitable description of their molecular structure, we have studied a series of phosphonic acids for designing new MRI contrast agents. Quantitative structure-property relationship studies with multiple linear regression analysis were applied to find correlations between different calculated molecular descriptors of the phosphonic acid-based chelating agents and their stability constants. The final quantitative structure-property relationship mathematical models for the phosphonic acid series were: Model 1, log K(ML) = 5.00243(±0.7102) − 0.0263(±0.540) MR, with n = 12, |r| = 0.942, s = 0.183, F = 99.165; and Model 2, log K(ML) = 5.06280(±0.3418) − 0.0252(±0.198) MR, with n = 12, |r| = 0.956, s = 0.186, F = 99.256.
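
    Once fitted, either model is just a linear map from molar refractivity (MR) to the stability constant; for example, Model 1 evaluates as below (the MR value shown is hypothetical).

      # Evaluating the reported Model 1 as a plain linear relation
      # between molar refractivity (MR) and log K(ML).
      def log_kml_model1(mr):
          return 5.00243 - 0.0263 * mr

      print(log_kml_model1(40.0))  # hypothetical MR value -> predicted log K(ML)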

  1. Model-based approach for quantitative estimates of skin, heart, and lung toxicity risk for left-side photon and proton irradiation after breast-conserving surgery.

    PubMed

    Tommasino, Francesco; Durante, Marco; D'Avino, Vittoria; Liuzzi, Raffaele; Conson, Manuel; Farace, Paolo; Palma, Giuseppe; Schwarz, Marco; Cella, Laura; Pacelli, Roberto

    2017-05-01

    Proton beam therapy represents a promising modality for left-side breast cancer (BC) treatment, but concerns have been raised about skin toxicity and poor cosmesis. The aim of this study is to apply a skin normal tissue complication probability (NTCP) model to intensity modulated proton therapy (IMPT) optimization in left-side BC. Ten left-side BC patients undergoing photon irradiation after breast-conserving surgery were randomly selected from our clinical database. Intensity modulated photon (IMRT) and IMPT plans were calculated with iso-tumor-coverage criteria and according to RTOG 1005 guidelines. Proton plans were computed with and without skin optimization. Published NTCP models were employed to estimate the risk of different toxicity endpoints for skin, lung, heart and its substructures. Acute skin NTCP evaluation suggests a lower toxicity level with IMPT compared to IMRT when the skin is included in the proton optimization strategy (0.1% versus 1.7%, p < 0.001). Dosimetric results show that, with the same level of tumor coverage, IMPT attains significant heart and lung dose sparing compared with IMRT. By NTCP model-based analysis, an overall reduction in the cardiopulmonary toxicity risk prediction can be observed for all IMPT compared to IMRT plans: the relative risk reduction from protons varies between 0.1 and 0.7 depending on the considered toxicity endpoint. Our analysis suggests that IMPT might be safely applied without increasing the risk of severe acute radiation-induced skin toxicity. The quantitative risk estimates also support the potential clinical benefits of IMPT for left-side BC irradiation due to lower risk of cardiac and pulmonary morbidity. The applied approach might be relevant in the long term for the setup of cost-effectiveness evaluation strategies based on NTCP predictions.

  2. Quantitative characterization of turbidity by radiative transfer based reflectance imaging

    PubMed Central

    Tian, Peng; Chen, Cheng; Jin, Jiahong; Hong, Heng; Lu, Jun Q.; Hu, Xin-Hua

    2018-01-01

    A new, noncontact approach based on multispectral reflectance imaging has been developed to inversely determine the absorption coefficient μa, the scattering coefficient μs and the anisotropy factor g of a turbid target from one measured reflectance image. The incident beam was profiled with a diffuse reflectance standard for deriving both measured and calculated reflectance images. A GPU-implemented Monte Carlo code was developed to determine the parameters with a conjugate gradient descent algorithm, and the existence of unique solutions was shown. We noninvasively determined embedded-region thickness in heterogeneous targets and estimated in vivo optical parameters of nevi from 4 patients between 500 and 950 nm for melanoma diagnosis to demonstrate the potential of quantitative reflectance imaging. PMID:29760971

  3. Portable paper-based device for quantitative colorimetric assays relying on light reflectance principle.

    PubMed

    Li, Bowei; Fu, Longwen; Zhang, Wei; Feng, Weiwei; Chen, Lingxin

    2014-04-01

    This paper presents a novel paper-based analytical device that reads colorimetric paper assays through light reflectance. The device is portable, low cost (<20 dollars), and lightweight (only 176 g), making it suitable for primary health care assessment and on-site detection. Based on the light reflectance principle, the signal can be obtained directly, stably, and in a user-friendly manner with our device. We demonstrated the utility and broad applicability of this technique with measurements of different biological and pollution target samples (BSA, glucose, Fe, and nitrite). Moreover, real samples of Fe(II) and nitrite in the local tap water were successfully analyzed, and compared with the standard UV absorption method, the quantitative results showed good performance, reproducibility, and reliability. This device can provide quantitative information very conveniently and shows great potential for the broad fields of resource-limited analysis, medical diagnostics, and on-site environmental detection. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Multiple-Strain Approach and Probabilistic Modeling of Consumer Habits in Quantitative Microbial Risk Assessment: A Quantitative Assessment of Exposure to Staphylococcal Enterotoxin A in Raw Milk.

    PubMed

    Crotta, Matteo; Rizzi, Rita; Varisco, Giorgio; Daminelli, Paolo; Cunico, Elena Cosciani; Luini, Mario; Graber, Hans Ulrich; Paterlini, Franco; Guitian, Javier

    2016-03-01

    Quantitative microbial risk assessment (QMRA) models are extensively applied to inform management of a broad range of food safety risks. Inevitably, QMRA modeling involves an element of simplification of the biological process of interest. Two features that are frequently simplified or disregarded are the pathogenicity of multiple strains of a single pathogen and consumer behavior at the household level. In this study, we developed a QMRA model with a multiple-strain approach and a consumer phase module (CPM) based on uncertainty distributions fitted from field data. We modeled exposure to staphylococcal enterotoxin A in raw milk in Lombardy; a specific enterotoxin production module was thus included. The model is adaptable and could be used to assess the risk related to other pathogens in raw milk as well as other staphylococcal enterotoxins. The multiple-strain approach, implemented as a multinomial process, allowed the inclusion of variability and uncertainty with regard to pathogenicity at the bacterial level. Data from 301 questionnaires submitted to raw milk consumers were used to obtain uncertainty distributions for the CPM. The distributions were modeled to be easily updatable with further data or evidence. The sources of uncertainty due to the multiple-strain approach and the CPM were identified, and their impact on the output was assessed by comparing specific scenarios to the baseline. When the distributions reflecting the uncertainty in consumer behavior were fixed to the 95th percentile, the risk of exposure increased up to 160 times. This reflects the importance of taking into consideration the diversity of consumers' habits at the household level and the impact that the lack of knowledge about variables in the CPM can have on the final QMRA estimates. The multiple-strain approach lends itself to use in other food matrices besides raw milk and allows the model to better capture the complexity of the real world and to be capable of geographical
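
    The multinomial treatment of strain pathogenicity can be sketched in a few lines: a simulated bacterial load is partitioned among strain genotypes, each with its own probability of carrying the enterotoxin gene. All prevalences and probabilities below are invented for illustration and are not the model's fitted values.

      # Sketch: one multinomial multiple-strain step with invented parameters.
      import numpy as np

      rng = np.random.default_rng(42)
      n_cells = 10_000                         # simulated contamination level
      strain_prev = [0.6, 0.3, 0.1]            # hypothetical strain prevalences
      sea_positive = [1.0, 0.5, 0.0]           # P(strain carries the enterotoxin gene)
      counts = rng.multinomial(n_cells, strain_prev)
      toxigenic = sum(c * p for c, p in zip(counts, sea_positive))
      print("expected toxigenic cells:", toxigenic)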

  5. Tool use and affordance: Manipulation-based versus reasoning-based approaches.

    PubMed

    Osiurak, François; Badets, Arnaud

    2016-10-01

    Tool use is a defining feature of the human species. Therefore, a fundamental issue is to understand the cognitive bases of human tool use. Given that people cannot use tools without manipulating them, proponents of the manipulation-based approach have argued that tool use might be supported by the simulation of past sensorimotor experiences, also sometimes called affordances. However, in the meanwhile, evidence has accumulated demonstrating the critical role of mechanical knowledge in tool use (i.e., the reasoning-based approach). The major goal of the present article is to examine the validity of the assumptions derived from the manipulation-based versus the reasoning-based approach. To do so, we identified 3 key issues on which the 2 approaches differ, namely, (a) the reference frame issue, (b) the intention issue, and (c) the action domain issue. These issues are addressed in light of studies in experimental psychology and neuropsychology that have provided valuable contributions to the topic (i.e., tool-use interaction, orientation effect, object-size effect, utilization behavior and anarchic hand, tool use and perception, apraxia of tool use, transport vs. use actions). To anticipate our conclusions, the reasoning-based approach seems promising for understanding the current literature, even if it is not fully satisfactory because a certain number of findings are easier to interpret under the manipulation-based approach. A new avenue for future research might be to develop a framework accommodating both approaches, thereby shedding new light on the cognitive bases of human tool use and affordances. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  6. A virtual maintenance-based approach for satellite assembling and troubleshooting assessment

    NASA Astrophysics Data System (ADS)

    Geng, Jie; Li, Ying; Wang, Ranran; Wang, Zili; Lv, Chuan; Zhou, Dong

    2017-09-01

    In this study, a Virtual Maintenance (VM)-based approach for satellite troubleshooting assessment is proposed. By focusing on various elements in satellite assembly and troubleshooting, such as accessibility, ergonomics, wiring, and extent of damage, a systematic, quantitative, and objective assessment model is established to decrease subjectivity in satellite assembling and troubleshooting assessment. Afterwards, based on the established assessment model and a satellite virtual prototype, an application process of this model suitable for a virtual environment is presented. Finally, according to the application process, all the elements in satellite troubleshooting are analyzed and assessed. The corresponding improvements, which realize the transformation from a conventional way of working to virtual simulation and assessment, are suggested, and the flaws in assembling and troubleshooting are revealed. Assembling or troubleshooting schemes can be improved in the early stage of satellite design with the help of a virtual prototype. Repeated practice in the virtual environment is beneficial to companies, as risk and cost are effectively reduced.

  7. RECENT ADVANCES IN QUANTITATIVE NEUROPROTEOMICS

    PubMed Central

    Craft, George E; Chen, Anshu; Nairn, Angus C

    2014-01-01

    The field of proteomics is undergoing rapid development in a number of different areas including improvements in mass spectrometric platforms, peptide identification algorithms and bioinformatics. In particular, new and/or improved approaches have established robust methods that not only allow for in-depth and accurate peptide and protein identification and modification analysis, but also allow for sensitive measurement of relative or absolute quantitation. These methods are beginning to be applied to the area of neuroproteomics, but the central nervous system poses many specific challenges in terms of quantitative proteomics, given the large number of different neuronal cell types that are intermixed and that exhibit distinct patterns of gene and protein expression. This review highlights the recent advances that have been made in quantitative neuroproteomics, with a focus on work published over the last five years that applies emerging methods to normal brain function as well as to various neuropsychiatric disorders including schizophrenia and drug addiction, and to neurodegenerative diseases including Parkinson’s disease and Alzheimer’s disease. While older methods such as two-dimensional polyacrylamide electrophoresis continue to be used, a variety of more in-depth MS-based approaches, including label-based (ICAT, iTRAQ, TMT, SILAC, SILAM), label-free (label-free, MRM, SWATH) and absolute quantification methods, are rapidly being applied to neurobiological investigations of normal and diseased brain tissue as well as of cerebrospinal fluid (CSF). While the biological implications of many of these studies remain to be clearly established, there is a clear need for standardization of experimental design and data analysis, and the analysis of protein changes in specific neuronal cell types in the central nervous system remains a serious challenge; nevertheless, it appears that the quality and depth of the more recent quantitative proteomics studies is beginning to

  8. Recent advances in quantitative neuroproteomics.

    PubMed

    Craft, George E; Chen, Anshu; Nairn, Angus C

    2013-06-15

    The field of proteomics is undergoing rapid development in a number of different areas including improvements in mass spectrometric platforms, peptide identification algorithms and bioinformatics. In particular, new and/or improved approaches have established robust methods that not only allow for in-depth and accurate peptide and protein identification and modification analysis, but also allow for sensitive measurement of relative or absolute quantitation. These methods are beginning to be applied to the area of neuroproteomics, but the central nervous system poses many specific challenges in terms of quantitative proteomics, given the large number of different neuronal cell types that are intermixed and that exhibit distinct patterns of gene and protein expression. This review highlights the recent advances that have been made in quantitative neuroproteomics, with a focus on work published over the last five years that applies emerging methods to normal brain function as well as to various neuropsychiatric disorders including schizophrenia and drug addiction, and to neurodegenerative diseases including Parkinson's disease and Alzheimer's disease. While older methods such as two-dimensional polyacrylamide electrophoresis continue to be used, a variety of more in-depth MS-based approaches, including label-based (ICAT, iTRAQ, TMT, SILAC, SILAM), label-free (label-free, MRM, SWATH) and absolute quantification methods, are rapidly being applied to neurobiological investigations of normal and diseased brain tissue as well as of cerebrospinal fluid (CSF). While the biological implications of many of these studies remain to be clearly established, there is a clear need for standardization of experimental design and data analysis, and the analysis of protein changes in specific neuronal cell types in the central nervous system remains a serious challenge; nevertheless, it appears that the quality and depth of the more recent quantitative proteomics studies is beginning to shed

  9. Insights from quantitative metaproteomics and protein-stable isotope probing into microbial ecology.

    PubMed

    von Bergen, Martin; Jehmlich, Nico; Taubert, Martin; Vogt, Carsten; Bastida, Felipe; Herbst, Florian-Alexander; Schmidt, Frank; Richnow, Hans-Hermann; Seifert, Jana

    2013-10-01

    The recent development of metaproteomics has enabled the direct identification and quantification of expressed proteins from microbial communities in situ, without the need for microbial enrichment. This became possible by (1) significant increases in quality and quantity of metagenome data and by improvements of (2) accuracy and (3) sensitivity of modern mass spectrometers (MS). The identification of physiologically relevant enzymes can help to understand the role of specific species within a community or an ecological niche. Beside identification, relative and absolute quantitation is also crucial. We will review label-free and label-based methods of quantitation in MS-based proteome analysis and the contribution of quantitative proteome data to microbial ecology. Additionally, approaches of protein-based stable isotope probing (protein-SIP) for deciphering community structures are reviewed. Information on the species-specific metabolic activity can be obtained when substrates or nutrients are labeled with stable isotopes in a protein-SIP approach. The stable isotopes ((13)C, (15)N, (36)S) are incorporated into proteins and the rate of incorporation can be used for assessing the metabolic activity of the corresponding species. We will focus on the relevance of the metabolic and phylogenetic information retrieved with protein-SIP studies and for detecting and quantifying the carbon flux within microbial consortia. Furthermore, the combination of protein-SIP with established tools in microbial ecology such as other stable isotope probing techniques are discussed.

  10. Smartphone-based quantitative measurements on holographic sensors.

    PubMed

    Khalili Moghaddam, Gita; Lowe, Christopher Robin

    2017-01-01

    The research reported herein integrates a generic holographic sensor platform and a smartphone-based colour quantification algorithm in order to standardise and improve the determination of the concentration of analytes of interest. The utility of this approach has been exemplified by analysing the replay colour of the captured image of a holographic pH sensor in near real-time. Personalised image encryption followed by a wavelet-based image compression method were applied to secure the image transfer across a bandwidth-limited network to the cloud. The decrypted and decompressed image was processed through four principal steps: Recognition of the hologram in the image with a complex background using a template-based approach, conversion of device-dependent RGB values to device-independent CIEXYZ values using a polynomial model of the camera and computation of the CIEL*a*b* values, use of the colour coordinates of the captured image to segment the image, select the appropriate colour descriptors and, ultimately, locate the region of interest (ROI), i.e. the hologram in this case, and finally, application of a machine learning-based algorithm to correlate the colour coordinates of the ROI to the analyte concentration. Integrating holographic sensors and the colour image processing algorithm potentially offers a cost-effective platform for the remote monitoring of analytes in real time in readily accessible body fluids by minimally trained individuals.

  11. Smartphone-based quantitative measurements on holographic sensors

    PubMed Central

    Khalili Moghaddam, Gita

    2017-01-01

    The research reported herein integrates a generic holographic sensor platform and a smartphone-based colour quantification algorithm in order to standardise and improve the determination of the concentration of analytes of interest. The utility of this approach has been exemplified by analysing the replay colour of the captured image of a holographic pH sensor in near real-time. Personalised image encryption followed by a wavelet-based image compression method were applied to secure the image transfer across a bandwidth-limited network to the cloud. The decrypted and decompressed image was processed through four principal steps: Recognition of the hologram in the image with a complex background using a template-based approach, conversion of device-dependent RGB values to device-independent CIEXYZ values using a polynomial model of the camera and computation of the CIEL*a*b* values, use of the colour coordinates of the captured image to segment the image, select the appropriate colour descriptors and, ultimately, locate the region of interest (ROI), i.e. the hologram in this case, and finally, application of a machine learning-based algorithm to correlate the colour coordinates of the ROI to the analyte concentration. Integrating holographic sensors and the colour image processing algorithm potentially offers a cost-effective platform for the remote monitoring of analytes in real time in readily accessible body fluids by minimally trained individuals. PMID:29141008
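
    The device-independent colour step common to both records above can be sketched as follows; the standard sRGB matrix and D65 white point stand in for the authors' camera-specific polynomial model, so the numbers are illustrative only.

      # Sketch: sRGB -> CIEXYZ -> CIE L*a*b* (standard matrices, not the
      # camera-specific polynomial model described in the abstract).
      import numpy as np

      def srgb_to_lab(rgb):
          """rgb in [0, 1] -> CIE L*a*b* under D65."""
          c = np.where(rgb > 0.04045, ((rgb + 0.055) / 1.055) ** 2.4, rgb / 12.92)
          M = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])
          xyz = M @ c / np.array([0.9505, 1.0, 1.089])   # normalize by white point
          f = np.where(xyz > 0.008856, np.cbrt(xyz), 7.787 * xyz + 16 / 116)
          L = 116 * f[1] - 16
          a = 500 * (f[0] - f[1])
          b = 200 * (f[1] - f[2])
          return L, a, b

      print(srgb_to_lab(np.array([0.2, 0.5, 0.8])))      # replay colour of a hologram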

  12. Modeling the reactivities of hydroxyl radical and ozone towards atmospheric organic chemicals using quantitative structure-reactivity relationship approaches.

    PubMed

    Gupta, Shikha; Basant, Nikita; Mohan, Dinesh; Singh, Kunwar P

    2016-07-01

    The persistence and the removal of organic chemicals from the atmosphere are largely determined by their reactions with the OH radical and O3. Experimental determination of the kinetic rate constants of OH and O3 with a large number of chemicals is tedious and resource intensive, and the development of computational approaches has been widely advocated. Recently, ensemble machine learning (EML) methods have emerged as unbiased tools to establish relationships between independent and dependent variables having a nonlinear dependence. In this study, EML-based, temperature-dependent quantitative structure-reactivity relationship (QSRR) models have been developed for predicting the kinetic rate constants for OH (kOH) and O3 (kO3) reactions with diverse chemicals. The structural diversity of the chemicals was evaluated using a Tanimoto similarity index. The generalization and prediction abilities of the constructed models were established through rigorous internal and external validation performed employing statistical checks. In test data, the EML QSRR models yielded a correlation (R2) of ≥0.91 between the measured and the predicted reactivities. The applicability domains of the constructed models were determined using methods based on descriptor range, Euclidean distance, leverage, and standardization approaches. The prediction accuracies for the higher-reactivity compounds were relatively better than those for the low-reactivity compounds. The proposed EML QSRR models performed well and outperformed previous reports. The proposed QSRR models can make predictions of rate constants at different temperatures and can be useful tools in predicting the reactivities of chemicals towards the OH radical and O3 in the atmosphere.
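
    As a hedged illustration of the EML QSRR idea, the sketch below fits a gradient-boosting ensemble to random stand-in descriptors and scores it on held-out data; the paper's actual learners, descriptors and data are not reproduced.

      # Sketch: ensemble regression on stand-in molecular descriptors.
      import numpy as np
      from sklearn.ensemble import GradientBoostingRegressor
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import r2_score

      rng = np.random.default_rng(0)
      X = rng.normal(size=(500, 12))                        # descriptors + temperature
      y = 2 * X[:, 0] - X[:, 1] + rng.normal(0, 0.1, 500)   # stand-in log k(OH)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      model = GradientBoostingRegressor().fit(X_tr, y_tr)
      print("external R2:", r2_score(y_te, model.predict(X_te)))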

  13. A quantitative framework for assessing ecological resilience

    EPA Science Inventory

    Quantitative approaches to measure and assess resilience are needed to bridge gaps between science, policy, and management. In this paper, we suggest a quantitative framework for assessing ecological resilience. Ecological resilience as an emergent ecosystem phenomenon can be de...

  14. Semi-quantitative visual detection of loop mediated isothermal amplification (LAMP)-generated DNA by distance-based measurement on a paper device.

    PubMed

    Hongwarittorrn, Irin; Chaichanawongsaroj, Nuntaree; Laiwattanapaisal, Wanida

    2017-12-01

    A distance-based paper analytical device (dPAD) for loop-mediated isothermal amplification (LAMP) detection is proposed. The approach relies on visual read-out of the length of colour developed on the dPAD, providing semi-quantitative determination of the initial amount of genomic DNA. In this communication, E. coli DNA was chosen as the template DNA for the LAMP reaction. The hydrophilic channel of the paper device was immobilized with polyethylenimine (PEI), a strong cationic polymer. Hydroxynaphthol blue (HNB), a colourimetric indicator for monitoring the change in magnesium ion concentration in the LAMP reaction, was used to react with the immobilized PEI. The positive charges of PEI react with the negative charges of free HNB in the LAMP reaction, producing a blue colour deposit on the paper device. Consequently, a clearly visible distance appeared within 5 min, and its length correlated with the amount of DNA in the sample. The distance-based PAD for visual detection of the LAMP reaction could quantify initial concentrations of genomic DNA as low as 4.14 × 10^3 copies µL^-1. This distance-based visual semi-quantitative platform is a suitable choice for LAMP detection, particularly in resource-limited settings, because of its low cost, simple fabrication and operation, disposability and portability. Copyright © 2017 Elsevier B.V. All rights reserved.
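
    The semi-quantitative read-out amounts to a linear fit of colour-bar length against the logarithm of the initial copy number. The sketch below shows that calibration arithmetic; the standard concentrations and distances are invented, and only the 4.14 × 10^3 copies/µL detection limit comes from the abstract.

      # Sketch: distance-vs-log(copies) calibration with invented standards.
      import numpy as np

      log_copies = np.log10([4.14e3, 4.14e4, 4.14e5, 4.14e6])   # copies/uL standards
      distance_mm = np.array([3.0, 7.5, 12.0, 16.4])            # assumed read-outs
      slope, intercept = np.polyfit(log_copies, distance_mm, 1)

      measured = 10.0                                           # mm, unknown sample
      print("~%.2e copies/uL" % 10 ** ((measured - intercept) / slope))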

  15. Plasmodium knowlesi transmission: integrating quantitative approaches from epidemiology and ecology to understand malaria as a zoonosis.

    PubMed

    Brock, P M; Fornace, K M; Parmiter, M; Cox, J; Drakeley, C J; Ferguson, H M; Kao, R R

    2016-04-01

    The public health threat posed by zoonotic Plasmodium knowlesi appears to be growing: it is increasingly reported across South East Asia, and is the leading cause of malaria in Malaysian Borneo. Plasmodium knowlesi threatens progress towards malaria elimination as aspects of its transmission, such as spillover from wildlife reservoirs and reliance on outdoor-biting vectors, may limit the effectiveness of conventional methods of malaria control. The development of new quantitative approaches that address the ecological complexity of P. knowlesi, particularly through a focus on its primary reservoir hosts, will be required to control it. Here, we review what is known about P. knowlesi transmission, identify key knowledge gaps in the context of current approaches to transmission modelling, and discuss the integration of these approaches with clinical parasitology and geostatistical analysis. We highlight the need to incorporate the influences of fine-scale spatial variation, rapid changes to the landscape, and reservoir population and transmission dynamics. The proposed integrated approach would address the unique challenges posed by malaria as a zoonosis, aid the identification of transmission hotspots, provide insight into the mechanistic links between incidence and land use change and support the design of appropriate interventions.

  16. Towards quantitative condition assessment of biodiversity outcomes: Insights from Australian marine protected areas.

    PubMed

    Addison, Prue F E; Flander, Louisa B; Cook, Carly N

    2017-08-01

    Protected area management effectiveness (PAME) evaluation is increasingly undertaken to evaluate governance, assess conservation outcomes and inform evidence-based management of protected areas (PAs). Within PAME, quantitative approaches to assess biodiversity outcomes are now emerging, where biological monitoring data are directly assessed against quantitative (numerically defined) condition categories (termed quantitative condition assessments). However, more commonly qualitative condition assessments are employed in PAME, which use descriptive condition categories and are evaluated largely with expert judgement that can be subject to a range of biases, such as linguistic uncertainty and overconfidence. Despite the benefits of increased transparency and repeatability of evaluations, quantitative condition assessments are rarely used in PAME. To understand why, we interviewed practitioners from all Australian marine protected area (MPA) networks, which have access to long-term biological monitoring data and are developing or conducting PAME evaluations. Our research revealed that there is a desire within management agencies to implement quantitative condition assessment of biodiversity outcomes in Australian MPAs. However, practitioners report many challenges in transitioning from undertaking qualitative to quantitative condition assessments of biodiversity outcomes, which are hampering progress. Challenges include a lack of agency capacity (staff numbers and money), knowledge gaps, and diminishing public and political support for PAs. We point to opportunities to target strategies that will assist agencies overcome these challenges, including new decision support tools, approaches to better finance conservation efforts, and to promote more management relevant science. While a single solution is unlikely to achieve full evidence-based conservation, we suggest ways for agencies to target strategies and advance PAME evaluations toward best practice. Copyright

  17. Targeted quantitative analysis of Streptococcus pyogenes virulence factors by multiple reaction monitoring.

    PubMed

    Lange, Vinzenz; Malmström, Johan A; Didion, John; King, Nichole L; Johansson, Björn P; Schäfer, Juliane; Rameseder, Jonathan; Wong, Chee-Hong; Deutsch, Eric W; Brusniak, Mi-Youn; Bühlmann, Peter; Björck, Lars; Domon, Bruno; Aebersold, Ruedi

    2008-08-01

    In many studies, particularly in the field of systems biology, it is essential that identical protein sets are precisely quantified in multiple samples such as those representing differentially perturbed cell states. The high degree of reproducibility required for such experiments has not been achieved by classical mass spectrometry-based proteomics methods. In this study we describe the implementation of a targeted quantitative approach by which predetermined protein sets are first identified and subsequently quantified at high sensitivity reliably in multiple samples. This approach consists of three steps. First, the proteome is extensively mapped out by multidimensional fractionation and tandem mass spectrometry, and the data generated are assembled in the PeptideAtlas database. Second, based on this proteome map, peptides uniquely identifying the proteins of interest, proteotypic peptides, are selected, and multiple reaction monitoring (MRM) transitions are established and validated by MS2 spectrum acquisition. This process of peptide selection, transition selection, and validation is supported by a suite of software tools, TIQAM (Targeted Identification for Quantitative Analysis by MRM), described in this study. Third, the selected target protein set is quantified in multiple samples by MRM. Applying this approach we were able to reliably quantify low abundance virulence factors from cultures of the human pathogen Streptococcus pyogenes exposed to increasing amounts of plasma. The resulting quantitative protein patterns enabled us to clearly define the subset of virulence proteins that is regulated upon plasma exposure.

  18. Investigating the Pedagogical Approaches Related to Changes in Attitudes toward Statistics in a Quantitative Methods Course for Psychology Undergraduate Students

    ERIC Educational Resources Information Center

    Liau, Albert K.; Kiat, John E.; Nie, Youyan

    2015-01-01

    The purpose of this study was to examine the extent to which the pedagogical approaches used in the course were related to improvements in students' attitudes toward statistics in a Quantitative Methods course for psychology undergraduate students in a Malaysian University. The study examined whether increasing availability of the instructor and…

  19. Quantitative phase microscopy for cellular dynamics based on transport of intensity equation.

    PubMed

    Li, Ying; Di, Jianglei; Ma, Chaojie; Zhang, Jiwei; Zhong, Jinzhan; Wang, Kaiqiang; Xi, Teli; Zhao, Jianlin

    2018-01-08

    We demonstrate a simple method for quantitative phase imaging of tiny transparent objects such as living cells based on the transport of intensity equation. The experiments are performed using an inverted bright-field microscope upgraded with a flipping imaging module, which enables the simultaneous creation of two laterally separated images with unequal defocus distances. This add-on module does not include any lenses or gratings and is cost-effective and easy to align. The validity of this method is confirmed by measurements of a microlens array and of human osteoblastic cells in culture, indicating its potential for dynamically measuring living cells and other transparent specimens in a quantitative, non-invasive and label-free manner.
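
    Under the common simplifying assumption of uniform intensity, the transport of intensity equation reduces to a Poisson equation for the phase, which can be solved with FFTs. The sketch below is such a generic solver, not the authors' implementation; the defocus distance, wavelength and pixel size in the demo call are arbitrary.

      # Sketch: FFT-based TIE phase recovery assuming uniform intensity I0 = 1,
      # so that laplacian(phase) = -k * dI/dz.
      import numpy as np

      def tie_phase(I_minus, I_plus, dz, wavelength, pixel):
          """Recover phase from two images at defocus -dz and +dz."""
          k = 2 * np.pi / wavelength
          dIdz = (I_plus - I_minus) / (2 * dz)          # axial intensity derivative
          ny, nx = dIdz.shape
          fx = np.fft.fftfreq(nx, pixel)
          fy = np.fft.fftfreq(ny, pixel)
          q2 = (2 * np.pi) ** 2 * (fx[None, :] ** 2 + fy[:, None] ** 2)
          q2[0, 0] = np.inf                             # suppress the DC singularity
          return np.real(np.fft.ifft2(np.fft.fft2(k * dIdz) / q2))

      rng = np.random.default_rng(0)                    # synthetic demo inputs
      phi = tie_phase(rng.random((64, 64)), rng.random((64, 64)),
                      dz=2e-6, wavelength=633e-9, pixel=0.5e-6)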

  20. A CZT-based blood counter for quantitative molecular imaging.

    PubMed

    Espagnet, Romain; Frezza, Andrea; Martin, Jean-Pierre; Hamel, Louis-André; Lechippey, Laëtitia; Beauregard, Jean-Mathieu; Després, Philippe

    2017-12-01

    Robust quantitative analysis in positron emission tomography (PET) and in single-photon emission computed tomography (SPECT) typically requires the time-activity curve as an input function for the pharmacokinetic modeling of tracer uptake. For this purpose, a new automated tool for the determination of blood activity as a function of time is presented. The device, compact enough to be used on the patient bed, relies on a peristaltic pump for continuous blood withdrawal at user-defined rates. Gamma detection is based on a 20 × 20 × 15 mm^3 cadmium zinc telluride (CZT) detector, read by custom-made electronics and a field-programmable gate array-based signal processing unit. A graphical user interface (GUI) allows users to select parameters and easily perform acquisitions. This paper presents the overall design of the device as well as results related to the detector performance in terms of stability, sensitivity and energy resolution. Results from a patient study are also reported. The device achieved a sensitivity of 7.1 cps/(kBq/mL) and a minimum detectable activity of 2.5 kBq/mL for 18F. The gamma counter also demonstrated excellent stability, with a deviation in count rates below 0.05% over 6 h. An energy resolution of 8% was achieved at 662 keV. The patient study was conclusive and demonstrated that the compact gamma blood counter developed has the sensitivity and the stability required to conduct quantitative molecular imaging studies in PET and SPECT.

  1. A risk based approach for SSTO/TSTO comparisons

    NASA Astrophysics Data System (ADS)

    Greenberg, Joel S.

    1996-03-01

    An approach has been developed for performing early comparisons of transportation architectures explicitly taking into account quantitative measures of uncertainty and resulting risk. Risk considerations are necessary since the transportation systems are likely to have significantly different levels of risk, both because of differing degrees of freedom in achieving desired performance levels and their different states of development and utilization. The approach considers the uncertainty of achievement of technology goals, effect that the achieved technology level will have on transportation system performance and the relationship between system performance/capability and the ability to accommodate variations in payload mass. The consequences of system performance are developed in terms of nonrecurring, recurring, and the present value of transportation system life cycle costs.

  2. Strengths-based approach for mental health recovery.

    PubMed

    Xie, Huiting

    2013-01-01

    Many health systems have traditionally adopted a view of mental disorders based on pathologies and the risk individuals have of developing mental disorders. However, with this approach, mental disorders continue to cost billions a year for the healthcare system. This paper aimed to introduce and explore the strengths-based approach in the psychiatric arena. The strengths-based approach moves the focus away from the deficits of people with mental illnesses (consumers) and focuses on their strengths and resources. The paper also aligns the strengths-based approach with mental health nursing and its contribution to mental health recovery.

  3. Quantitative Susceptibility Mapping of the Midbrain in Parkinson’s Disease

    PubMed Central

    Du, Guangwei; Liu, Tian; Lewis, Mechelle M.; Kong, Lan; Wang, Yi; Connor, James; Mailman, Richard B.; Huang, Xuemei

    2017-01-01

    Background Parkinson’s disease (PD) is marked pathologically by dopamine neuron loss and iron overload in the substantia nigra pars compacta. Midbrain iron content is reported to be increased in PD based on magnetic resonance imaging (MRI) R2* changes. Because quantitative susceptibility mapping is a novel MRI approach to measure iron content, we compared it with R2* for assessing midbrain changes in PD. Methods Quantitative susceptibility mapping and R2* maps were obtained from 47 PD patients and 47 healthy controls. Midbrain susceptibility and R2* values were analyzed by using both voxel-based and region-of-interest approaches in normalized space, and analyzed along with clinical data, including disease duration, Unified Parkinson’s Disease Rating Scale (UPDRS) I, II, and III sub-scores, and levodopa-equivalent daily dosage. All studies were done while PD patients were “on drug.” Results Compared with controls, PD patients showed significantly increased susceptibility values in both right (cluster size = 106 mm3) and left (164 mm3) midbrain, located ventrolateral to the red nucleus that corresponded to the substantia nigra pars compacta. Susceptibility values in this region were correlated significantly with disease duration, UPDRS II, and levodopa-equivalent daily dosage. Conversely, R2* was increased significantly only in a much smaller region (62 mm3) of the left lateral substantia nigra pars compacta and was not significantly correlated with clinical parameters. Conclusion The use of quantitative susceptibility mapping demonstrated marked nigral changes that correlated with clinical PD status more sensitively than R2*. These data suggest that quantitative susceptibility mapping may be a superior imaging biomarker to R2* for estimating brain iron levels in PD. PMID:26362242

  4. 1, 2, 3, 4: infusing quantitative literacy into introductory biology.

    PubMed

    Speth, Elena Bray; Momsen, Jennifer L; Moyerbrailean, Gregory A; Ebert-May, Diane; Long, Tammy M; Wyse, Sara; Linton, Debra

    2010-01-01

    Biology of the twenty-first century is an increasingly quantitative science. Undergraduate biology education therefore needs to provide opportunities for students to develop fluency in the tools and language of quantitative disciplines. Quantitative literacy (QL) is important for future scientists as well as for citizens, who need to interpret numeric information and data-based claims regarding nearly every aspect of daily life. To address the need for QL in biology education, we incorporated quantitative concepts throughout a semester-long introductory biology course at a large research university. Early in the course, we assessed the quantitative skills that students bring to the introductory biology classroom and found that students had difficulties in performing simple calculations, representing data graphically, and articulating data-driven arguments. In response to students' learning needs, we infused the course with quantitative concepts aligned with the existing course content and learning objectives. The effectiveness of this approach is demonstrated by significant improvement in the quality of students' graphical representations of biological data. Infusing QL in introductory biology presents challenges. Our study, however, supports the conclusion that it is feasible in the context of an existing course, consistent with the goals of college biology education, and promotes students' development of important quantitative skills.

  5. 1, 2, 3, 4: Infusing Quantitative Literacy into Introductory Biology

    PubMed Central

    Momsen, Jennifer L.; Moyerbrailean, Gregory A.; Ebert-May, Diane; Long, Tammy M.; Wyse, Sara; Linton, Debra

    2010-01-01

    Biology of the twenty-first century is an increasingly quantitative science. Undergraduate biology education therefore needs to provide opportunities for students to develop fluency in the tools and language of quantitative disciplines. Quantitative literacy (QL) is important for future scientists as well as for citizens, who need to interpret numeric information and data-based claims regarding nearly every aspect of daily life. To address the need for QL in biology education, we incorporated quantitative concepts throughout a semester-long introductory biology course at a large research university. Early in the course, we assessed the quantitative skills that students bring to the introductory biology classroom and found that students had difficulties in performing simple calculations, representing data graphically, and articulating data-driven arguments. In response to students' learning needs, we infused the course with quantitative concepts aligned with the existing course content and learning objectives. The effectiveness of this approach is demonstrated by significant improvement in the quality of students' graphical representations of biological data. Infusing QL in introductory biology presents challenges. Our study, however, supports the conclusion that it is feasible in the context of an existing course, consistent with the goals of college biology education, and promotes students' development of important quantitative skills. PMID:20810965

  6. Problem-based learning on quantitative analytical chemistry course

    NASA Astrophysics Data System (ADS)

    Fitri, Noor

    2017-12-01

    This research applies the problem-based learning method to quantitative analytical chemistry, the so-called "Analytical Chemistry II" course, especially in relation to essential oil analysis. The learning outcomes of this course include aspects of understanding of lectures, the skill of applying course materials, and the ability to identify, formulate and solve chemical analysis problems. The role of study groups is quite important in improving students' learning ability and in completing independent and group tasks. Thus, students are not only aware of the basic concepts of Analytical Chemistry II, but are also able to understand and apply the analytical concepts they have studied to solve given analytical chemistry problems, and have the attitude and ability to work together to solve those problems. Based on the learning outcomes, it can be concluded that the problem-based learning method in the Analytical Chemistry II course improves students' knowledge, skills, ability and attitude. Students are not only skilled at solving problems in analytical chemistry, especially in essential oil analysis in accordance with the local genius of the Chemistry Department, Universitas Islam Indonesia, but are also skilled in working with computer programs and able to understand materials and problems in English.

  7. DISQOVER the Landcover - R based tools for quantitative vegetation reconstruction

    NASA Astrophysics Data System (ADS)

    Theuerkauf, Martin; Couwenberg, John; Kuparinen, Anna; Liebscher, Volkmar

    2016-04-01

    Quantitative methods have gained increasing attention in the field of vegetation reconstruction over the past decade. The DISQOVER package implements key tools in the R programming environment for statistical computing. This implementation has three main goals: 1) Provide a user-friendly, transparent, and open implementation of the methods 2) Provide full flexibility in all parameters (including the underlying pollen dispersal model) 3) Provide a sandbox for testing the sensitivity of the methods. We illustrate the possibilities of the package with tests of the REVEALS model and of the extended downscaling approach (EDA). REVEALS (Sugita 2007) is designed to translate pollen data from large lakes into regional vegetation composition. We applied REVEALSinR on pollen data from Lake Tiefer See (NE-Germany) and validated the results with historic landcover data. The results clearly show that REVEALS is sensitive to the underlying pollen dispersal model; REVEALS performs best when applied with the state-of-the-art Lagrangian stochastic dispersal model. REVEALS applications with the conventional Gauss model can produce realistic results, but only if unrealistic pollen productivity estimates are used. The EDA (Theuerkauf et al. 2014) employs pollen data from many sites across a landscape to explore whether species distributions in the past were related to known stable patterns in the landscape, e.g. the distribution of soil types. The approach had so far only been implemented in simple settings with few taxa. Tests with EDAinR show that it produces sharp results in complex settings with many taxa as well. The DISQOVER package is open source software, available from disqover.uni-greifswald.de. This website can be used as a platform to discuss and improve quantitative methods in vegetation reconstruction. To introduce the tool we plan a short course in autumn of this year. This study is a contribution to the Virtual Institute of Integrated Climate and Landscape Evolution
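
    The deterministic core of REVEALS can be written in a few lines: pollen counts are corrected by taxon-specific pollen productivity estimates and dispersal-deposition factors and then normalized to cover fractions (after Sugita 2007). The numbers below are invented, and DISQOVER of course adds the error modelling and dispersal models discussed above; this is a sketch, not the package's code.

      # Sketch of the core REVEALS correction with invented inputs.
      import numpy as np

      pollen = np.array([300.0, 150.0, 50.0])   # pollen counts per taxon
      ppe = np.array([1.0, 4.0, 0.5])           # relative pollen productivity estimates
      K = np.array([0.8, 1.2, 0.6])             # dispersal-deposition factors
      adj = pollen / (ppe * K)
      cover = adj / adj.sum()                    # estimated regional cover fractions
      print(cover)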

  8. A Quantitative Comparative Study Measuring Consumer Satisfaction Based on Health Record Format

    ERIC Educational Resources Information Center

    Moore, Vivianne E.

    2013-01-01

    This research study used a quantitative comparative method to investigate the relationship between consumer satisfaction and communication based on the format of health record. The central problem investigated in this research study related to the format of health record used and consumer satisfaction with care provided and effect on communication…

  9. A Strategy for Identifying Quantitative Trait Genes Using Gene Expression Analysis and Causal Analysis.

    PubMed

    Ishikawa, Akira

    2017-11-27

    Large numbers of quantitative trait loci (QTL) affecting complex diseases and other quantitative traits have been reported in humans and model animals. However, the genetic architecture of these traits remains elusive due to the difficulty in identifying causal quantitative trait genes (QTGs) for common QTL with relatively small phenotypic effects. A traditional strategy based on techniques such as positional cloning does not always enable identification of a single candidate gene for a QTL of interest because it is difficult to narrow down a target genomic interval of the QTL to a very small interval harboring only one gene. A combination of gene expression analysis and statistical causal analysis can greatly reduce the number of candidate genes. This integrated approach provides causal evidence that one of the candidate genes is a putative QTG for the QTL. Using this approach, I have recently succeeded in identifying a single putative QTG for resistance to obesity in mice. Here, I outline the integration approach and discuss its usefulness using my studies as an example.

  10. Fluvial drainage networks: the fractal approach as an improvement of quantitative geomorphic analyses

    NASA Astrophysics Data System (ADS)

    Melelli, Laura; Liucci, Luisa; Vergari, Francesca; Ciccacci, Sirio; Del Monte, Maurizio

    2014-05-01

    Drainage basins are primary landscape units for geomorphological investigations. Both hillslopes and the river drainage system are fundamental components in drainage basin analysis. Like other geomorphological systems, drainage basins tend toward an equilibrium condition in which the sequence of erosion, transport and sedimentation approaches a state of minimum energy expenditure. This state is revealed by a typical geometry of landforms and of the drainage net. Several morphometric indexes can measure how far a drainage basin is from the theoretical equilibrium configuration, revealing possible external disturbance. In tectonically active areas, drainage basins are of primary importance for highlighting the style, amount and rate of tectonic impulses, and morphometric indexes allow the tectonic activity classes of different sectors in a study area to be estimated. Moreover, drainage networks are characterized by a self-similar structure, which promotes the use of fractal theory to investigate the system. In this study, fractal techniques are employed together with quantitative geomorphological analysis to study the Upper Tiber Valley (UTV), a tectonic intermontane basin located in the northern Apennines (Umbria, central Italy). The area is the result of different tectonic phases. From the Late Pliocene until the present, the UTV has been strongly controlled by regional uplift and by an extensional phase, with different sets of normal faults playing a fundamental role in basin morphology. Thirty-four basins are taken into account for the quantitative analysis, twenty on the left side of the basin and the others on the right side. Using the fractal dimension of drainage networks, the results of Horton's laws, concavity and steepness indexes, and hypsometric curves, this study aims to obtain an evolutionary model of the UTV in which uplift is compared to local subsidence induced by normal fault activity. The results highlight a well-defined difference between the western and eastern tributary basins
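
    As an illustration of the fractal component of such analyses, the following is a minimal sketch of box counting, one standard way to estimate the fractal dimension of a rasterized channel network; the grid here is a toy example, whereas real studies operate on channel networks extracted from topographic data.

    ```python
    import numpy as np

    def box_counting_dimension(mask, sizes=(2, 4, 8, 16, 32)):
        """Estimate the fractal dimension of a binary raster (True = channel)
        by box counting: D is the slope of log N(s) versus log(1/s)."""
        counts = []
        for s in sizes:
            n = 0  # number of s-by-s boxes containing at least one channel pixel
            for i in range(0, mask.shape[0], s):
                for j in range(0, mask.shape[1], s):
                    if mask[i:i + s, j:j + s].any():
                        n += 1
            counts.append(n)
        slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
        return slope

    # Toy check: a straight diagonal "channel" has dimension close to 1.
    grid = np.zeros((64, 64), dtype=bool)
    np.fill_diagonal(grid, True)
    print(box_counting_dimension(grid))
    ```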

  11. [Application of risk-based approach for determination of critical factors in technology transfer of production of medicinal products].

    PubMed

    Beregovykh, V V; Spitskiy, O R

    2014-01-01

    A risk-based approach is used to examine the impact of different factors on the quality of medicinal products in technology transfer. A general diagram is offered for executing risk analysis in technology transfer from pharmaceutical development to production. When transferring technology to full-scale commercial production, it is necessary to investigate and simulate the application of the production process beforehand under the new, real conditions. The manufacturing process is the core factor for risk analysis, having the greatest impact on the quality attributes of a medicinal product. Further important factors are linked to the materials and products to be handled and to the manufacturing environment, such as premises, equipment and personnel. The use of the risk-based approach is shown in the design of a multipurpose facility for the production of medicinal products, where the quantitative risk analysis tool RAMM (Risk Analysis and Mitigation Matrix) was applied.

  12. Quantitative mouse brain phenotyping based on single and multispectral MR protocols

    PubMed Central

    Badea, Alexandra; Gewalt, Sally; Avants, Brian B.; Cook, James J.; Johnson, G. Allan

    2013-01-01

    Sophisticated image analysis methods have been developed for the human brain, but such tools still need to be adapted and optimized for quantitative small animal imaging. We propose a framework for quantitative anatomical phenotyping in mouse models of neurological and psychiatric conditions. The framework encompasses an atlas space, image acquisition protocols, and software tools to register images into this space. We show that a suite of segmentation tools (Avants, Epstein et al., 2008) designed for human neuroimaging can be incorporated into a pipeline for segmenting mouse brain images acquired with multispectral magnetic resonance imaging (MR) protocols. We present a flexible approach for segmenting such hyperimages, optimizing registration, and identifying optimal combinations of image channels for particular structures. Brain imaging with T1, T2* and T2 contrasts yielded accuracy in the range of 83% for hippocampus and caudate putamen (Hc and CPu), but only 54% in white matter tracts, and 44% for the ventricles. The addition of diffusion tensor parameter images improved accuracy for large gray matter structures (by >5%), white matter (10%), and ventricles (15%). The use of Markov random field segmentation further improved overall accuracy in the C57BL/6 strain by 6%; so Dice coefficients for Hc and CPu reached 93%, for white matter 79%, for ventricles 68%, and for substantia nigra 80%. We demonstrate the segmentation pipeline for the widely used C57BL/6 strain, and two test strains (BXD29, APP/TTA). This approach appears promising for characterizing temporal changes in mouse models of human neurological and psychiatric conditions, and may provide anatomical constraints for other preclinical imaging, e.g. fMRI and molecular imaging. This is the first demonstration that multiple MR imaging modalities combined with multivariate segmentation methods lead to significant improvements in anatomical segmentation in the mouse brain. PMID:22836174
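
    The accuracy figures quoted above are overlap scores; as a reference point, here is a minimal sketch of the Dice coefficient on which they are based, with toy masks standing in for the automated and manual segmentations.

    ```python
    import numpy as np

    def dice(a, b):
        """Dice overlap between two binary label masks (1 = structure)."""
        a, b = np.asarray(a, bool), np.asarray(b, bool)
        intersection = np.logical_and(a, b).sum()
        return 2.0 * intersection / (a.sum() + b.sum())

    auto   = np.array([[0, 1, 1], [0, 1, 0]])  # automated segmentation (toy)
    manual = np.array([[0, 1, 1], [1, 1, 0]])  # manual reference (toy)
    print(dice(auto, manual))  # 0.857...
    ```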

  13. Comparing the MRI-based Goutallier Classification to an experimental quantitative MR spectroscopic fat measurement of the supraspinatus muscle.

    PubMed

    Gilbert, Fabian; Böhm, Dirk; Eden, Lars; Schmalzl, Jonas; Meffert, Rainer H; Köstler, Herbert; Weng, Andreas M; Ziegler, Dirk

    2016-08-22

    The Goutallier Classification is a semi-quantitative classification system to determine the amount of fatty degeneration in rotator cuff muscles. Although initially proposed for axial computed tomography scans, it is currently applied to magnetic resonance imaging (MRI) scans. Its role in clinical use is controversial, as the reliability of the classification has been shown to be inconsistent. The purpose of this study was to compare the semi-quantitative MRI-based Goutallier Classification applied by 5 different raters to an experimental MR spectroscopic quantitative fat measurement, in order to determine the correlation between this classification system and the true extent of fatty degeneration shown by spectroscopy. MRI scans of 42 patients with rotator cuff tears were examined by 5 shoulder surgeons and were graded according to the MRI-based Goutallier Classification proposed by Fuchs et al. Additionally, the fat/water ratio was measured with MR spectroscopy using the experimental SPLASH technique. The semi-quantitative grading according to the Goutallier Classification was statistically correlated with the quantitatively measured fat/water ratio using Spearman's rank correlation. Statistical analysis of the data revealed only fair correlation between the Goutallier Classification system and the quantitative fat/water ratio, with R = 0.35 (p < 0.05). Dichotomizing the scale raised the correlation to 0.72. The interobserver and intraobserver reliabilities were substantial, with R = 0.62 and R = 0.74 (p < 0.01). The correlation between the semi-quantitative MRI-based Goutallier Classification system and MR spectroscopic fat measurement is weak. As an adequate estimation of fatty degeneration based on standard MRI may not be possible, quantitative methods need to be considered in order to increase diagnostic safety and thus provide patients with ideal care in regard to the amount of fatty degeneration. Spectroscopic MR measurement may increase the accuracy of
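
    A minimal sketch of the statistical comparison described, using SciPy's spearmanr on invented grade/ratio pairs (the study's actual measurements are not reproduced here):

    ```python
    from scipy.stats import spearmanr

    # Hypothetical data: Goutallier grades (0-4) from one rater and the
    # spectroscopic fat/water ratio for the same shoulders.
    grades    = [0, 1, 1, 2, 2, 3, 3, 4]
    fat_water = [0.05, 0.08, 0.15, 0.12, 0.30, 0.28, 0.45, 0.60]

    rho, p = spearmanr(grades, fat_water)
    print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")
    ```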

  14. Inductively Coupled Plasma Mass Spectrometry (ICP-MS) Applications in Quantitative Proteomics.

    PubMed

    Chahrour, Osama; Malone, John

    2017-01-01

    Recent advances in inductively coupled plasma mass spectrometry (ICP-MS) hyphenated to different separation techniques have promoted it as a valuable tool in protein/peptide quantification. These emerging ICP-MS applications allow absolute quantification by measuring specific elemental responses. One approach quantifies elements already present in the structure of the target peptide (e.g. phosphorus and sulphur) as natural tags. Quantification of these natural tags allows the elucidation of the degree of protein phosphorylation in addition to absolute protein quantification. A separate approach is based on utilising bi-functional labelling substances (those containing ICP-MS detectable elements), that form a covalent chemical bond with the protein thus creating analogs which are detectable by ICP-MS. Based on the previously established stoichiometries of the labelling reagents, quantification can be achieved. This technique is very useful for the design of precise multiplexed quantitation schemes to address the challenges of biomarker screening and discovery. This review discusses the capabilities and different strategies to implement ICP-MS in the field of quantitative proteomics. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
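
    The natural-tag approach reduces to simple stoichiometric arithmetic; here is a minimal sketch with an invented peptide (the sulfur count, molecular weight and measured sulfur mass are all assumptions).

    ```python
    def protein_from_sulfur(s_ng, n_sulfur, protein_mw, s_mw=32.06):
        """Sketch of natural-tag ICP-MS quantification: moles of protein
        follow from the measured sulfur mass and the number of sulfur
        atoms (Cys + Met) per molecule."""
        mol_s = s_ng * 1e-9 / s_mw                   # mol of sulfur measured
        return mol_s / n_sulfur * protein_mw * 1e9   # ng of protein

    # Hypothetical peptide with 2 sulfur atoms and MW 2500 g/mol:
    print(protein_from_sulfur(s_ng=12.0, n_sulfur=2, protein_mw=2500.0))
    ```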

  15. Old and new approaches to the interpretation of acid-base metabolism, starting from historical data applied to diabetic acidosis.

    PubMed

    Mioni, Roberto; Marega, Alessandra; Lo Cicero, Marco; Montanaro, Domenico

    2016-11-01

    The approach to acid-base chemistry in medicine includes several methods. Currently, the two most popular procedures are derived from Stewart's studies and from the bicarbonate/BE-based classical formulation. Another method, unfortunately little known, follows the Kildeberg theory applied to acid-base titration. Using the data produced by Dana Atchley in 1933, regarding electrolytes and blood gas analysis applied to diabetes, we compared the three aforementioned methods in order to highlight their strengths and weaknesses. The results obtained by reprocessing Atchley's data show that Kildeberg's approach, unlike the other two methods, is consistent, rational and complete in describing the organ-physiological behavior of hydrogen ion turnover in the human organism. In contrast, the data obtained using the Stewart approach and the bicarbonate-based classical formulation are misleading and fail to specify which organs or systems are involved in causing or maintaining diabetic acidosis. Stewart's approach, despite being considered 'quantitative', never actually proposes the concept of 'an amount of acid' and becomes even more confusing because it is not clear how to distinguish between 'strong' and 'weak' ions. Like Stewart's approach, the classical method makes no distinction between hydrogen ions managed by the intermediate metabolism and hydroxyl ions handled by the kidney, but at least it is based on the concept of titration (base excess) and indirectly defines the concept of 'an amount of acid'. In conclusion, only Kildeberg's approach offers a complete understanding of the causes of, and remedies for, any type of acid-base disturbance.

  16. A gold nanoparticle-based semi-quantitative and quantitative ultrasensitive paper sensor for the detection of twenty mycotoxins

    NASA Astrophysics Data System (ADS)

    Kong, Dezhao; Liu, Liqiang; Song, Shanshan; Suryoprabowo, Steven; Li, Aike; Kuang, Hua; Wang, Libing; Xu, Chuanlai

    2016-02-01

    A semi-quantitative and quantitative multi-immunochromatographic (ICA) strip detection assay was developed for the simultaneous detection of twenty types of mycotoxins from five classes, including zearalenones (ZEAs), deoxynivalenols (DONs), T-2 toxins (T-2s), aflatoxins (AFs), and fumonisins (FBs), in cereal food samples. Sensitive and specific monoclonal antibodies were selected for this assay. The semi-quantitative results were obtained within 20 min by the naked eye, with visual limits of detection for ZEAs, DONs, T-2s, AFs and FBs of 0.1-0.5, 2.5-250, 0.5-1, 0.25-1 and 2.5-10 μg kg-1, and cut-off values of 0.25-1, 5-500, 1-10, 0.5-2.5 and 5-25 μg kg-1, respectively. The quantitative results were obtained using a hand-held strip scan reader, with the calculated limits of detection for ZEAs, DONs, T-2s, AFs and FBs of 0.04-0.17, 0.06-49, 0.15-0.22, 0.056-0.49 and 0.53-1.05 μg kg-1, respectively. The analytical results of spiked samples were in accordance with the accurate content in the simultaneous detection analysis. This newly developed ICA strip assay is suitable for the on-site detection and rapid initial screening of mycotoxins in cereal samples, facilitating both semi-quantitative and quantitative determination.

  17. Optical properties of acute kidney injury measured by quantitative phase imaging

    PubMed Central

    Ban, Sungbea; Min, Eunjung; Baek, Songyee; Kwon, Hyug Moo; Popescu, Gabriel

    2018-01-01

    The diagnosis of acute kidney injury (AKI) has been assessed mainly by histology, immunohistochemistry and western blot. Though these approaches are widely accepted in the field, they have an inherent limitation due to the lack of high-throughput, quantitative information. For a better understanding of prognosis in AKI, we present a new approach using quantitative phase imaging combined with a wide-field scanning platform. Through the phase-delay information from the tissue, we were able to predict the stage of AKI based on optical properties such as the light scattering coefficient and anisotropy. These optical parameters quantify the deterioration process in the AKI tissue model. Our device could be a very useful tool where fast feedback on tissue pathology is required or where diseases are related to mechanical properties such as fibrosis. PMID:29541494

  18. AN APPROACH TO THE DEVELOPMENT OF MODELS TO QUANTITATIVELY ASSESS THE EFFECTS OF EXPOSURE TO ENVIRONMENTALLY RELEVANT LEVELS OF ENDOCRINE DISRUPTORS

    EPA Science Inventory

    An approach to the development of quantitative models to assess the effects of exposure to environmentally relevant levels of endocrine disruptors on homeostasis in adults.

    Ben-Jonathan N, Cooper RL, Foster P, Hughes CL, Hoyer PB, Klotz D, Kohn M, Lamb DJ, Stancel GM.

  19. DMD-based quantitative phase microscopy and optical diffraction tomography

    NASA Astrophysics Data System (ADS)

    Zhou, Renjie

    2018-02-01

    Digital micromirror devices (DMDs), which offer high speed and a high degree of freedom in steering illumination, have been increasingly applied to optical microscopy systems in recent years. Lately, we introduced DMDs into digital holography to enable new imaging modalities and break existing imaging limitations. In this paper, we first present our progress in using DMDs to demonstrate laser-illumination Fourier ptychographic microscopy (FPM) with shot-noise-limited detection. We then present a novel common-path quantitative phase microscopy (QPM) system based on a DMD. Building on those early developments, a DMD-based high-speed optical diffraction tomography (ODT) system has recently been demonstrated, and its results are also presented. This ODT system achieves video-rate 3D refractive-index imaging, which can potentially enable observation of high-speed 3D structural changes in samples.

  20. A simulation-based approach for estimating premining water quality: Red Mountain Creek, Colorado

    USGS Publications Warehouse

    Runkel, Robert L.; Kimball, Briant A; Walton-Day, Katherine; Verplanck, Philip L.

    2007-01-01

    Regulatory agencies are often charged with the task of setting site-specific numeric water quality standards for impaired streams. This task is particularly difficult for streams draining highly mineralized watersheds with past mining activity. Baseline water quality data obtained prior to mining are often non-existent and application of generic water quality standards developed for unmineralized watersheds is suspect given the geology of most watersheds affected by mining. Various approaches have been used to estimate premining conditions, but none of the existing approaches rigorously consider the physical and geochemical processes that ultimately determine instream water quality. An approach based on simulation modeling is therefore proposed herein. The approach utilizes synoptic data that provide spatially-detailed profiles of concentration, streamflow, and constituent load along the study reach. This field data set is used to calibrate a reactive stream transport model that considers the suite of physical and geochemical processes that affect constituent concentrations during instream transport. A key input to the model is the quality and quantity of waters entering the study reach. This input is based on chemical analyses available from synoptic sampling and observed increases in streamflow along the study reach. Given the calibrated model, additional simulations are conducted to estimate premining conditions. In these simulations, the chemistry of mining-affected sources is replaced with the chemistry of waters that are thought to be unaffected by mining (proximal, premining analogues). The resultant simulations provide estimates of premining water quality that reflect both the reduced loads that were present prior to mining and the processes that affect these loads as they are transported downstream. This simulation-based approach is demonstrated using data from Red Mountain Creek, Colorado, a small stream draining a heavily-mined watershed. Model
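
    A minimal sketch of the synoptic mass-balance bookkeeping on which such calibrations rest: the load (streamflow times concentration) at each station equals the upstream load plus the lateral inflow load, and a premining scenario replaces mining-affected inflow chemistry with that of an analogue. All numbers are invented.

    ```python
    # (streamflow L/s, zinc concentration mg/L) at successive synoptic sites
    stations = [(50.0, 0.40), (65.0, 0.90), (80.0, 0.85)]

    loads = [q * c for q, c in stations]  # instream loads, mg/s
    inflow_loads = [loads[i + 1] - loads[i] for i in range(len(loads) - 1)]

    # Premining scenario: the first lateral inflow is mining-affected; give
    # it the concentration of a premining analogue spring (hypothetical).
    q_in = stations[1][0] - stations[0][0]     # lateral inflow, L/s
    premining_load_2 = loads[0] + q_in * 0.05  # analogue water at 0.05 mg/L
    print(inflow_loads, premining_load_2)
    ```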

  1. The Quantitative Preparation of Future Geoscience Graduate Students

    NASA Astrophysics Data System (ADS)

    Manduca, C. A.; Hancock, G. S.

    2006-12-01

    Modern geoscience is a highly quantitative science. In February, a small group of faculty and graduate students from across the country met to discuss the quantitative preparation of geoscience majors for graduate school. The group included ten faculty supervising graduate students in quantitative areas spanning the earth, atmosphere, and ocean sciences; five current graduate students in these areas; and five faculty teaching undergraduate students in the spectrum of institutions preparing students for graduate work. Discussion focused on four key areas: Are incoming graduate students adequately prepared for the quantitative aspects of graduate geoscience programs? What are the essential quantitative skills required for success in graduate school? What are perceived as the important courses to prepare students for the quantitative aspects of graduate school? What programs/resources would be valuable in helping faculty/departments improve the quantitative preparation of students? The participants concluded that strengthening the quantitative preparation of undergraduate geoscience majors would increase their opportunities in graduate school. While specifics differed amongst disciplines, special importance was placed on developing the ability to use quantitative skills to solve geoscience problems. This requires the ability to pose problems so they can be addressed quantitatively, understand the relationship between quantitative concepts and physical representations, visualize mathematics, test the reasonableness of quantitative results, creatively move forward from existing models/techniques/approaches, and move between quantitative and verbal descriptions. A list of important quantitative competencies desirable in incoming graduate students includes mechanical skills in basic mathematics, functions, multi-variate analysis, statistics and calculus, as well as skills in logical analysis and the ability to learn independently in quantitative ways

  2. A versatile quantitation platform based on platinum nanoparticles incorporated volumetric bar-chart chip for highly sensitive assays.

    PubMed

    Wang, Yuzhen; Zhu, Guixian; Qi, Wenjin; Li, Ying; Song, Yujun

    2016-11-15

    The platinum nanoparticle-incorporated volumetric bar-chart chip (PtNPs-V-Chip) can be used for point-of-care testing, providing a quantitative, visual readout without any assistance from instruments, data processing, or graphic plotting. To improve the sensitivity of the PtNPs-V-Chip, hybridization chain reaction was employed in this quantitation platform, yielding highly sensitive assays that can detect as little as 16 pM Ebola virus DNA, 0.01 ng/mL carcinoembryonic antigen (CEA), and as few as 10 HER2-expressing cancer cells. With this amplification strategy, a 100-fold decrease in the detection limit was achieved for DNA by increasing the number of platinum nanoparticle catalysts per captured analyte. The platform can also distinguish a single-base mismatch in DNA hybridization and resolve the concentration threshold of CEA. This new strategy lays the foundation for the platform to be applied in forensic analysis, biothreat detection, clinical diagnostics and drug screening. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. NNAlign: A Web-Based Prediction Method Allowing Non-Expert End-User Discovery of Sequence Motifs in Quantitative Peptide Data

    PubMed Central

    Andreatta, Massimo; Schafer-Nielsen, Claus; Lund, Ole; Buus, Søren; Nielsen, Morten

    2011-01-01

    Recent advances in high-throughput technologies have made it possible to generate both gene and protein sequence data at an unprecedented rate and scale thereby enabling entirely new “omics”-based approaches towards the analysis of complex biological processes. However, the amount and complexity of data that even a single experiment can produce seriously challenges researchers with limited bioinformatics expertise, who need to handle, analyze and interpret the data before it can be understood in a biological context. Thus, there is an unmet need for tools allowing non-bioinformatics users to interpret large data sets. We have recently developed a method, NNAlign, which is generally applicable to any biological problem where quantitative peptide data is available. This method efficiently identifies underlying sequence patterns by simultaneously aligning peptide sequences and identifying motifs associated with quantitative readouts. Here, we provide a web-based implementation of NNAlign allowing non-expert end-users to submit their data (optionally adjusting method parameters), and in return receive a trained method (including a visual representation of the identified motif) that subsequently can be used as prediction method and applied to unknown proteins/peptides. We have successfully applied this method to several different data sets including peptide microarray-derived sets containing more than 100,000 data points. NNAlign is available online at http://www.cbs.dtu.dk/services/NNAlign. PMID:22073191

  4. NNAlign: a web-based prediction method allowing non-expert end-user discovery of sequence motifs in quantitative peptide data.

    PubMed

    Andreatta, Massimo; Schafer-Nielsen, Claus; Lund, Ole; Buus, Søren; Nielsen, Morten

    2011-01-01

    Recent advances in high-throughput technologies have made it possible to generate both gene and protein sequence data at an unprecedented rate and scale thereby enabling entirely new "omics"-based approaches towards the analysis of complex biological processes. However, the amount and complexity of data that even a single experiment can produce seriously challenges researchers with limited bioinformatics expertise, who need to handle, analyze and interpret the data before it can be understood in a biological context. Thus, there is an unmet need for tools allowing non-bioinformatics users to interpret large data sets. We have recently developed a method, NNAlign, which is generally applicable to any biological problem where quantitative peptide data is available. This method efficiently identifies underlying sequence patterns by simultaneously aligning peptide sequences and identifying motifs associated with quantitative readouts. Here, we provide a web-based implementation of NNAlign allowing non-expert end-users to submit their data (optionally adjusting method parameters), and in return receive a trained method (including a visual representation of the identified motif) that subsequently can be used as prediction method and applied to unknown proteins/peptides. We have successfully applied this method to several different data sets including peptide microarray-derived sets containing more than 100,000 data points. NNAlign is available online at http://www.cbs.dtu.dk/services/NNAlign.

  5. To label or not to label: applications of quantitative proteomics in neuroscience research.

    PubMed

    Filiou, Michaela D; Martins-de-Souza, Daniel; Guest, Paul C; Bahn, Sabine; Turck, Christoph W

    2012-02-01

    Proteomics has provided researchers with a sophisticated toolbox of labeling-based and label-free quantitative methods. These are now being applied in neuroscience research where they have already contributed to the elucidation of fundamental mechanisms and the discovery of candidate biomarkers. In this review, we evaluate and compare labeling-based and label-free quantitative proteomic techniques for applications in neuroscience research. We discuss the considerations required for the analysis of brain and central nervous system specimens, the experimental design of quantitative proteomic workflows as well as the feasibility, advantages, and disadvantages of the available techniques for neuroscience-oriented questions. Furthermore, we assess the use of labeled standards as internal controls for comparative studies in humans and review applications of labeling-based and label-free mass spectrometry approaches in relevant model organisms and human subjects. Providing a comprehensive guide of feasible and meaningful quantitative proteomic methodologies for neuroscience research is crucial not only for overcoming current limitations but also for gaining useful insights into brain function and translating proteomics from bench to bedside. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. QTest: Quantitative Testing of Theories of Binary Choice.

    PubMed

    Regenwetter, Michel; Davis-Stober, Clintin P; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William

    2014-01-01

    The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of "Random Cumulative Prospect Theory." A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences.

  7. QTest: Quantitative Testing of Theories of Binary Choice

    PubMed Central

    Regenwetter, Michel; Davis-Stober, Clintin P.; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William

    2014-01-01

    The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of “Random Cumulative Prospect Theory.” A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences. PMID:24999495

  8. Concepts and challenges in quantitative pharmacology and model-based drug development.

    PubMed

    Zhang, Liping; Pfister, Marc; Meibohm, Bernd

    2008-12-01

    Model-based drug development (MBDD) has been recognized as a concept to improve the efficiency of drug development. The acceptance of MBDD from regulatory agencies, industry, and academia has been growing, yet today's drug development practice is still distinctly distant from MBDD. This manuscript is aimed at clarifying the concept of MBDD and proposing practical approaches for implementing MBDD in the pharmaceutical industry. The following concepts are defined and distinguished: PK-PD modeling, exposure-response modeling, pharmacometrics, quantitative pharmacology, and MBDD. MBDD is viewed as a paradigm and a mindset in which models constitute the instruments and aims of drug development efforts. MBDD covers the whole spectrum of the drug development process instead of being limited to a certain type of modeling technique or application area. The implementation of MBDD requires pharmaceutical companies to foster innovation and make changes at three levels: (1) to establish mindsets that are willing to get acquainted with MBDD, (2) to align processes that are adaptive to the requirements of MBDD, and (3) to create a closely collaborating organization in which all members play a role in MBDD. Pharmaceutical companies that are able to embrace the changes MBDD poses will likely be able to improve their success rate in drug development, and the beneficiaries will ultimately be the patients in need.

  9. Methods for Quantitative Creatinine Determination.

    PubMed

    Moore, John F; Sharer, J Daniel

    2017-04-06

    Reliable measurement of creatinine is necessary to assess kidney function, and also to quantitate drug levels and diagnostic compounds in urine samples. The most commonly used methods are based on the Jaffe principle of alkaline creatinine-picric acid complex color formation. However, other compounds commonly found in serum and urine may interfere with Jaffe creatinine measurements. Therefore, many laboratories have made modifications to the basic method to remove or account for these interfering substances. This appendix summarizes the basic Jaffe method, as well as a modified, automated version. Also described is a high-performance liquid chromatography (HPLC) method that separates creatinine from contaminants prior to direct quantification by UV absorption. Lastly, a liquid chromatography-tandem mass spectrometry (LC-MS/MS) method is described that uses stable isotope dilution to reliably quantify creatinine in any sample. This last approach has been recommended by experts in the field as a means to standardize all quantitative creatinine methods against an accepted reference. Copyright © 2017 John Wiley & Sons, Inc.
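
    A minimal sketch of the stable-isotope-dilution arithmetic behind the LC-MS/MS approach, with invented peak areas and a hypothetical d3-creatinine internal standard; real assays use a multi-point calibration rather than the single-point ratio shown here.

    ```python
    def creatinine_by_isotope_dilution(area_analyte, area_is, conc_is, rf=1.0):
        """Single-point isotope-dilution estimate: analyte concentration from
        the analyte/internal-standard area ratio, the known spiked internal
        standard concentration, and a response factor rf (1.0 assumed here;
        in practice taken from a calibration curve)."""
        return (area_analyte / area_is) * conc_is / rf

    # Hypothetical peak areas for creatinine and a d3-creatinine internal
    # standard spiked at 50 umol/L:
    print(creatinine_by_isotope_dilution(84000, 91000, 50.0))  # ~46 umol/L
    ```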

  10. Practical Framework for an Electron Beam Induced Current Technique Based on a Numerical Optimization Approach

    NASA Astrophysics Data System (ADS)

    Yamaguchi, Hideshi; Soeda, Takeshi

    2015-03-01

    A practical framework for the electron beam induced current (EBIC) technique has been established for conductive materials based on a numerical optimization approach. Although the conventional EBIC technique is useful for evaluating the distributions of dopants or crystal defects in semiconductor transistors, issues with the reproducibility and quantitative capability of the measurements persist. For instance, it is difficult to acquire consistently high-quality EBIC images across continuous tests because of variation in operator skill or test environment. Recently, by benchmarking EBIC equipment performance and numerically optimizing the equipment settings, constant acquisition of high-contrast images has become possible, improving reproducibility as well as yield regardless of operator skill or test environment. The technique proposed herein is even more sensitive and quantitative than scanning probe microscopy, an imaging technique that can damage the sample. The new technique is expected to benefit the electrical evaluation of fragile or soft materials along with LSI materials.

  11. New optomechanical approach to quantitative characterization of fatigue behavior of dynamically loaded structures

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Pryputniewicz, Ryszard J.

    1995-06-01

    The basic relationships between stress and strain under cyclic conditions of loading are not at present well understood. It would seem that information of this type is vital for a fundamental approach to understand the fatigue behavior of dynamically loaded structures. In this paper, experimental and computational methods are utilized to study the fatigue behavior of a thin aluminum cantilever plate subjected to dynamic loading. The studies are performed by combining optomechanical and finite element methods. The cantilever plate is loaded periodically by excitation set at a fixed amplitude and at a specific resonance frequency of the plate. By continuously applying this type of loading and using holographic interferometry, the behavior of the plate during a specific period of time is investigated. Quantitative information is obtained from laser vibrometry data which are utilized by a finite element program to calculate strains and stresses assuming a homogeneous and isotropic material and constant strain elements. It is shown that the use of experimental and computational hybrid methodologies allows identification of different zones of the plate that are fatigue critical. This optomechanical approach proves to be a viable tool for understanding of fatigue behavior of mechanical components and for performing optimization of structures subjected to fatigue conditions.

  12. Simultaneous quantitation of sphingoid bases by UPLC-ESI-MS/MS with identical 13C-encoded internal standards.

    PubMed

    Mirzaian, M; Wisse, P; Ferraz, M J; Marques, A R A; Gaspar, P; Oussoren, S V; Kytidou, K; Codée, J D C; van der Marel, G; Overkleeft, H S; Aerts, J M

    2017-03-01

    Free sphingoid bases (lysosphingolipids) of primary storage sphingolipids are increased in tissues and plasma in several sphingolipidoses. As we have shown earlier, sphingoid bases can be accurately quantified using UPLC-ESI-MS/MS, particularly in combination with identical 13C-encoded internal standards. The feasibility of simultaneous quantitation of sphingoid bases in plasma specimens spiked with a mixture of such standards is described here. The sensitivity and linearity of detection are excellent for all examined sphingoid bases (sphingosine, sphinganine, hexosyl-sphingosine (glucosylsphingosine), hexosyl2-sphingosine (lactosylsphingosine), hexosyl3-sphingosine (globotriaosylsphingosine), phosphorylcholine-sphingosine) in the relevant concentration range, and the measurements show very acceptable intra- and inter-assay variation (<10% on average). Plasma samples from a series of male and female Gaucher disease and Fabry disease patients were analyzed with the multiplex assay. The data compare well with those determined earlier for plasma globotriaosylsphingosine and glucosylsphingosine in GD and FD patients. The same approach can also be applied to measure sphingolipids in the same sample: following extraction, sphingolipids can be converted to sphingoid bases by microwave exposure and subsequently quantified using 13C-encoded internal standards. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Quantitative multiplex detection of pathogen biomarkers

    DOEpatents

    Mukundan, Harshini; Xie, Hongzhi; Swanson, Basil I.; Martinez, Jennifer; Grace, Wynne K.

    2016-02-09

    The present invention addresses the simultaneous detection and quantitative measurement of multiple biomolecules, e.g., pathogen biomarkers, through either a sandwich assay approach or a lipid insertion approach. The invention can further employ a multichannel structure with multi-sensor elements per channel.

  14. Quantitative multiplex detection of pathogen biomarkers

    DOEpatents

    Mukundan, Harshini; Xie, Hongzhi; Swanson, Basil I; Martinez, Jennifer; Grace, Wynne K

    2014-10-14

    The present invention addresses the simultaneous detection and quantitative measurement of multiple biomolecules, e.g., pathogen biomarkers, through either a sandwich assay approach or a lipid insertion approach. The invention can further employ a multichannel structure with multi-sensor elements per channel.

  15. Quantitative and qualitative methods in medical education research: AMEE Guide No 90: Part II.

    PubMed

    Tavakol, Mohsen; Sandars, John

    2014-10-01

    Medical educators need to understand and conduct medical education research in order to make informed decisions based on the best evidence, rather than rely on their own hunches. The purpose of this Guide is to provide medical educators, especially those who are new to medical education research, with a basic understanding of how quantitative and qualitative methods contribute to the medical education evidence base through their different inquiry approaches and also how to select the most appropriate inquiry approach to answer their research questions.

  16. Quantitative and qualitative methods in medical education research: AMEE Guide No 90: Part I.

    PubMed

    Tavakol, Mohsen; Sandars, John

    2014-09-01

    Medical educators need to understand and conduct medical education research in order to make informed decisions based on the best evidence, rather than rely on their own hunches. The purpose of this Guide is to provide medical educators, especially those who are new to medical education research, with a basic understanding of how quantitative and qualitative methods contribute to the medical education evidence base through their different inquiry approaches and also how to select the most appropriate inquiry approach to answer their research questions.

  17. Resolving the Quantitative-Qualitative Dilemma: A Critical Realist Approach

    ERIC Educational Resources Information Center

    Scott, David

    2007-01-01

    The philosophical issues underpinning the quantitative-qualitative divide in educational research are examined. Three types of argument which support a resolution are considered: pragmatism, false duality and warranty through triangulation. In addition a number of proposed strategies--alignment, sequencing, translation and triangulation--are…

  18. An Alu-based, MGB Eclipse real-time PCR method for quantitation of human DNA in forensic samples.

    PubMed

    Nicklas, Janice A; Buel, Eric

    2005-09-01

    The forensic community needs quick, reliable methods to quantitate human DNA in crime scene samples to replace the laborious and imprecise slot blot method. A real-time PCR-based method opens the way to a faster and more quantitative assay. Alu sequences are primate-specific and are found in many copies in the human genome, making these sequences an excellent target or marker for human DNA. This paper describes the development of a real-time Alu sequence-based assay using MGB Eclipse primers and probes. The advantages of this assay are simplicity, speed, less hands-on time and automated quantitation, as well as a large dynamic range (128 ng/microL to 0.5 pg/microL).
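
    A minimal sketch of the standard-curve arithmetic underlying such real-time PCR quantitation, with invented Ct values spanning a dynamic range like the one stated; the assay's actual calibrants and efficiency are assumptions here.

    ```python
    import numpy as np

    # Hypothetical standard curve: Ct values for known human DNA dilutions.
    conc = np.array([128.0, 12.8, 1.28, 0.128, 0.0128])  # ng/microL
    ct   = np.array([18.1, 21.5, 24.9, 28.2, 31.6])

    m, b = np.polyfit(np.log10(conc), ct, 1)   # Ct = m*log10(conc) + b
    efficiency = 10 ** (-1.0 / m) - 1          # ~1.0 means 100% efficient

    def quantify(ct_unknown):
        """Invert the standard curve to estimate template concentration."""
        return 10 ** ((ct_unknown - b) / m)

    print(f"E = {efficiency:.2f}; unknown at Ct 26 = {quantify(26.0):.3f} ng/microL")
    ```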

  19. Improving membrane based multiplex immunoassays for semi-quantitative detection of multiple cytokines in a single sample

    PubMed Central

    2014-01-01

    Background Inflammatory mediators can serve as biomarkers for monitoring disease progression or prognosis in many conditions. In the present study we introduce an adaptation of a membrane-based technique in which the levels of up to 40 cytokines and chemokines can be determined in both human and rodent blood in a semi-quantitative way. The planar assay was modified using the LI-COR detection system (fluorescence based) rather than chemiluminescence, and semi-quantitative outcomes were achieved by normalizing the outcomes using the automated exposure settings of the Odyssey readout device. The results were compared to the gold standard assay, namely ELISA. Results The improved planar assay allowed the detection of a considerably higher number of analytes (n = 30 and n = 5 for fluorescent and chemiluminescent detection, respectively). The improved planar method showed high sensitivity, up to 17 pg/ml, and a linear correlation of the normalized fluorescence intensity with the results from the ELISA (r = 0.91). Conclusions The results show that the membrane-based technique is a semi-quantitative assay that correlates satisfactorily with the gold standard when enhanced by the use of fluorescence and subsequent semi-quantitative analysis. This promising technique can be used to investigate inflammatory profiles in multiple conditions, particularly in studies with constraints on sample sizes and/or budget. PMID:25022797

  20. "Standards"-Based Mathematics Curricula and the Promotion of Quantitative Literacy in Elementary School

    ERIC Educational Resources Information Center

    Wilkins, Jesse L. M.

    2015-01-01

    Background: Prior research has shown that students taught using "Standards"-based mathematics curricula tend to outperform students on measures of mathematics achievement. However, little research has focused particularly on the promotion of student quantitative literacy (QLT). In this study, the potential influence of the…

  1. A computational approach to predicting ligand selectivity for the size-based separation of trivalent lanthanides

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ivanov, Alexander S.; Bryantsev, Vyacheslav S.

    An accurate description of solvation effects for trivalent lanthanide ions is a main stumbling block to the qualitative prediction of selectivity trends along the lanthanide series. In this work, we propose a simple model to describe the differential effect of solvation in the competitive binding of a ligand by lanthanide ions by including weakly co-ordinated counterions in the complexes of more than a +1 charge. The success of the approach to quantitatively reproduce selectivities obtained from aqueous phase complexation studies demonstrates its potential for the design and screening of new ligands for efficient size-based separation.

  2. A computational approach to predicting ligand selectivity for the size-based separation of trivalent lanthanides

    DOE PAGES

    Ivanov, Alexander S.; Bryantsev, Vyacheslav S.

    2016-06-20

    An accurate description of solvation effects for trivalent lanthanide ions is a main stumbling block to the qualitative prediction of selectivity trends along the lanthanide series. In this work, we propose a simple model to describe the differential effect of solvation in the competitive binding of a ligand by lanthanide ions by including weakly co-ordinated counterions in the complexes of more than a +1 charge. The success of the approach to quantitatively reproduce selectivities obtained from aqueous phase complexation studies demonstrates its potential for the design and screening of new ligands for efficient size-based separation.

  3. A rapid, automated approach to optimisation of multiple reaction monitoring conditions for quantitative bioanalytical mass spectrometry.

    PubMed

    Higton, D M

    2001-01-01

    An improvement to the procedure for the rapid optimisation of mass spectrometry (PROMS), for the development of multiple reaction monitoring (MRM) methods for quantitative bioanalytical liquid chromatography/tandem mass spectrometry (LC/MS/MS), is presented. PROMS is an automated protocol that uses flow-injection analysis (FIA) and AppleScripts to create methods and acquire the data for optimisation. The protocol determines the optimum orifice potential and the MRM conditions for each compound, and finally creates the MRM methods needed for sample analysis. The sensitivities of the MRM methods created by PROMS approach those created manually. MRM method development using PROMS currently takes less than three minutes per compound, compared to at least fifteen minutes manually. To further enhance throughput, approaches to MRM optimisation using one injection per compound, two injections per pool of five compounds and one injection per pool of five compounds have been investigated. No significant differences in the optimised instrumental parameters for MRM methods were found between the original PROMS approach and these new methods, which are up to ten times faster. The time taken for an AppleScript to determine the optimum conditions and build the MRM methods is the same with all approaches. Copyright 2001 John Wiley & Sons, Ltd.

  4. A Mechanism-based 3D-QSAR Approach for Classification ...

    EPA Pesticide Factsheets

    Organophosphate (OP) and carbamate esters can inhibit acetylcholinesterase (AChE) by binding covalently to a serine residue in the enzyme active site, and their inhibitory potency depends largely on affinity for the enzyme and the reactivity of the ester. Despite this understanding, there has been no mechanism-based in silico approach for classification and prediction of the inhibitory potency of either OPs or carbamates. This prompted us to develop a three-dimensional prediction framework for OPs, carbamates, and their analogs. Inhibitory structures of a compound that can form the covalent bond were identified through analysis of docked conformations of the compound and its metabolites. Inhibitory potencies of the selected structures were then predicted using a previously developed three-dimensional quantitative structure-activity relationship. This approach was validated with a large number of structurally diverse OP and carbamate compounds encompassing widely used insecticides and structural analogs, including OP flame retardants and thio- and dithiocarbamate pesticides. The modeling revealed that: (1) in addition to classical OP metabolic activation, the toxicity of carbamate compounds can depend on biotransformation, (2) OP and carbamate analogs such as OP flame retardants and thiocarbamate herbicides can act as AChE inhibitors, (3) hydrogen bonding at the oxyanion hole is critical for AChE inhibition through the covalent bond, and (4) π–π interaction with Trp86

  5. A Quantitative ADME-based Tool for Exploring Human ...

    EPA Pesticide Factsheets

    Exposure to a wide range of chemicals through our daily habits and routines is ubiquitous and largely unavoidable within modern society. The potential for human exposure, however, has not been quantified for the vast majority of chemicals with wide commercial use. Creative advances in exposure science are needed to support efficient and effective evaluation and management of chemical risks, particularly for chemicals in consumer products. The U.S. Environmental Protection Agency Office of Research and Development is developing, or collaborating in the development of, scientifically-defensible methods for making quantitative or semi-quantitative exposure predictions. The Exposure Prioritization (Ex Priori) model is a simplified, quantitative visual dashboard that provides a rank-ordered internalized dose metric to simultaneously explore exposures across chemical space (not chemical by chemical). Diverse data streams are integrated within the interface such that different exposure scenarios for “individual,” “population,” or “professional” time-use profiles can be interchanged to tailor exposure and quantitatively explore multi-chemical signatures of exposure, internalized dose (uptake), body burden, and elimination. Ex Priori has been designed as an adaptable systems framework that synthesizes knowledge from various domains and is amenable to new knowledge/information. As such, it algorithmically captures the totality of exposure across pathways. It

  6. The Significance of Temperature Based Approach Over the Energy Based Approaches in the Buildings Thermal Assessment

    NASA Astrophysics Data System (ADS)

    Albatayneh, Aiman; Alterman, Dariusz; Page, Adrian; Moghtaderi, Behdad

    2017-05-01

    The design of low energy buildings requires accurate thermal simulation software to assess heating and cooling loads. Such designs should sustain thermal comfort for occupants and promote lower energy usage over the lifetime of the building. One of the house energy rating tools used in Australia is AccuRate, a star-rating tool that assesses and compares the thermal performance of buildings, in which heating and cooling loads are calculated based on fixed operational temperatures between 20 °C and 25 °C to sustain occupant thermal comfort. However, these fixed time and temperature settings considerably increase the calculated heating and cooling loads. The adaptive thermal comfort model, by contrast, admits a broader range of weather conditions, interacts with the occupants and promotes low energy solutions for maintaining thermal comfort. This can be achieved by natural ventilation (opening windows/doors), suitable clothing, shading and low energy heating/cooling solutions for the occupied spaces (rooms). These activities can save a significant amount of operating energy, which should be taken into account when predicting a building's energy consumption. Most building thermal assessment tools rely on energy-based approaches to predict thermal performance, e.g. AccuRate in Australia; this approach encourages the use of energy to maintain thermal comfort. This paper describes the advantages of a temperature-based approach for assessing a building's thermal performance (using an adaptive thermal comfort model) over an energy-based approach (the AccuRate software used in Australia). The temperature-based approach was validated and compared with the energy-based approach using four full-scale housing test modules located in Newcastle, Australia (Cavity Brick (CB), Insulated Cavity Brick (InsCB), Insulated Brick Veneer (InsBV) and Insulated Reverse Brick Veneer (InsRBV)) subjected to a range of seasonal conditions in a moderate climate. The time required for
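
    For concreteness, here is a minimal sketch of an adaptive comfort calculation, using the ASHRAE 55 adaptive model as a stand-in for the adaptive thermal comfort model referenced above (the paper's exact model and coefficients are not specified here).

    ```python
    def adaptive_comfort_band(t_out_mean):
        """ASHRAE 55 adaptive model (stand-in assumption): neutral indoor
        temperature and the 80%-acceptability band, both derived from the
        prevailing mean outdoor air temperature in degrees C."""
        t_comf = 0.31 * t_out_mean + 17.8
        return t_comf - 3.5, t_comf, t_comf + 3.5

    low, neutral, high = adaptive_comfort_band(22.0)  # a mild Newcastle day
    print(f"acceptable {low:.1f}-{high:.1f} C (neutral {neutral:.1f} C)")
    ```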

  7. An information communication technology based approach for the acquisition of critical thinking skills.

    PubMed

    Pucer, Patrik; Trobec, Irena; Žvanut, Boštjan

    2014-06-01

    Both academics and practitioners agree that critical thinking skills are necessary to provide safe and comprehensive nursing care. In order to promote the development of critical thinking, nurse educators need to keep the teaching/learning process captivating and interesting using active learning environments. These can be implemented by using modern information and communication technologies that are simple, fun, and time and cost effective. The goal of our study was to design and test an approach, which allows individual and fast acquisition of critical thinking skills with the use of information and communication technology. A combination of qualitative and quantitative research design was implemented. The study consisted of a quasi-experiment (phases 1-3): (1) pre-test discussion board, (2) use of e-contents based on the presented approach, and (3) post-test discussion board. The participants' opinion about the presented approach was identified in phase 4. The study was performed in May 2012 during the course "Ethics and Philosophy in Nursing" at the Faculty of Health Sciences, University of Primorska, Slovenia. Forty first-year undergraduate nursing students. Qualitative analysis of the discussion boards (phases 1, 3) and an anonymous survey with open- and closed-ended questions (phase 4). Qualitative analysis of the discussion boards showed a significant (p<0.001) improvement in the percentage of posts (12.2%) for which the opinions and conclusions of the participants were justified with valid arguments. The survey results indicated that participants perceived the e-contents based on the presented approach as useful, and that they improved their critical thinking skills. Repeated confirmation of the validity of the presented approach through methodological triangulation represents a strong indication that the presented approach is a valuable tool to develop nursing students' critical thinking skills. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. Approaches to quantitating the results of differentially dyed cottons

    USDA-ARS?s Scientific Manuscript database

    The differential dyeing (DD) method has served as a subjective method for visually determining immature cotton fibers. In an attempt to quantitate the results of the differential dyeing method, and thus offer an efficient means of elucidating cotton maturity without visual discretion, image analysi...

  9. Development and application of absolute quantitative detection by duplex chamber-based digital PCR of genetically modified maize events without pretreatment steps.

    PubMed

    Zhu, Pengyu; Fu, Wei; Wang, Chenguang; Du, Zhixin; Huang, Kunlun; Zhu, Shuifang; Xu, Wentao

    2016-04-15

    The possibility of absolute quantitation of GMO events by digital PCR was recently reported. However, most absolute quantitation methods based on digital PCR require pretreatment steps. Meanwhile, singleplex detection cannot meet the demand for absolute quantitation of GMO events, which is based on the ratio of foreign fragments to reference genes. Thus, to promote the absolute quantitative detection of different GMO events by digital PCR, we developed a quantitative detection method based on duplex digital PCR without pretreatment. We tested 7 GMO events in our study to evaluate the fitness of our method. The optimized combination of foreign and reference primers, the limit of quantitation (LOQ), the limit of detection (LOD) and the specificity were validated. The results showed that the LOQ of our method for the different GMO events was 0.5%, while the LOD was 0.1%. Additionally, we found that duplex digital PCR achieved detection results with lower RSD than singleplex digital PCR. In summary, the duplex digital PCR detection system is a simple and stable way to achieve absolute quantitation of different GMO events, and the LOQ and LOD indicate that this method is suitable for routine detection and quantitation of GMO events. Copyright © 2016 Elsevier B.V. All rights reserved.
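
    The core arithmetic of chamber-based digital PCR is a Poisson correction on the fraction of positive partitions; here is a minimal sketch with invented partition counts and an assumed partition volume (the chip geometry used in the study is not given here).

    ```python
    import math

    def dpcr_copies_per_ul(positive, total, partition_nl=0.85):
        """Absolute concentration from a digital PCR run: the Poisson
        correction lambda = -ln(1 - k/n) gives mean copies per partition,
        then divide by the partition volume (assumed 0.85 nL here)."""
        lam = -math.log(1.0 - positive / total)
        return lam / (partition_nl * 1e-3)   # copies per microliter

    # Hypothetical duplex run read in two channels: event-specific target
    # and endogenous reference gene from the same partitions.
    event = dpcr_copies_per_ul(positive=310, total=20000)
    ref   = dpcr_copies_per_ul(positive=6100, total=20000)
    print(f"GM content = {100 * event / ref:.2f} %")
    ```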

  10. Isotropic differential phase contrast microscopy for quantitative phase bio-imaging.

    PubMed

    Chen, Hsi-Hsun; Lin, Yu-Zi; Luo, Yuan

    2018-05-16

    Quantitative phase imaging (QPI) retrieves the optical phase information of an object and has been applied to biological microscopy and related medical studies. In a recent example, differential phase contrast (DPC) microscopy can recover the phase image of a thin sample from multi-axis intensity measurements in a wide-field scheme. Going beyond conventional DPC, and based on a theoretical treatment of partially coherent illumination, we propose a new method that achieves isotropic differential phase contrast (iDPC) with high accuracy and stability for phase recovery in a simple, high-speed fashion. The iDPC system is implemented simply with a partially coherent microscope and a programmable thin-film transistor (TFT) shield that digitally modulates structured illumination patterns for QPI. In this article, simulation results show the consistency of our theoretical approach to iDPC under partial coherence. In addition, we demonstrate quantitative phase images of a standard micro-lens array, as well as label-free live human cell samples. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Exploring the bases for a mixed reality stroke rehabilitation system, Part I: A unified approach for representing action, quantitative evaluation, and interactive feedback

    PubMed Central

    2011-01-01

    Background: Although principles grounded in motor learning, rehabilitation, and human-computer interfaces can guide the design of effective interactive systems for rehabilitation, a unified approach that connects these key principles into an integrated design, and that can form a methodology generalizable to interactive stroke rehabilitation, is presently unavailable. Results: This paper integrates phenomenological approaches to interaction and embodied knowledge with rehabilitation practices and theories to achieve the basis for a methodology that can support effective adaptive, interactive rehabilitation. The resulting methodology provides guidelines for the development of an action representation, quantification of action, and the design of interactive feedback. As Part I of a two-part series, this paper presents the key principles of the unified approach. Part II then describes the application of this approach within the implementation of the Adaptive Mixed Reality Rehabilitation (AMRR) system for stroke rehabilitation. Conclusions: The accompanying principles for composing novel mixed reality environments for stroke rehabilitation can advance the design and implementation of effective mixed reality systems for the clinical setting, and can ultimately be adapted for home-based application. They can furthermore be applied to rehabilitation needs beyond stroke. PMID:21875441

  12. Quantitative magnetic resonance (QMR) measurement of changes in body composition of neonatal pigs

    USDA-ARS?s Scientific Manuscript database

    The survival of low birth weight pigs in particular may depend on energy stores in the body. QMR (quantitative magnetic resonance) is a new approach to measuring total body fat, lean and water. These measurements are based on quantifying protons associated with lipid and water molecules in the body...

  13. An interaural-correlation-based approach that accounts for a wide variety of binaural detection data.

    PubMed

    Bernstein, Leslie R; Trahiotis, Constantine

    2017-02-01

    Interaural cross-correlation-based models of binaural processing have accounted successfully for a wide variety of binaural phenomena, including binaural detection, binaural discrimination, and measures of extents of laterality based on interaural temporal disparities, interaural intensitive disparities, and their combination. This report focuses on quantitative accounts of data obtained from binaural detection experiments published over five decades. Particular emphasis is placed on stimulus contexts for which commonly used correlation-based approaches fail to provide adequate explanations of the data. One such context concerns binaural detection of signals masked by certain noises that are narrow-band and/or interaurally partially correlated. It is shown that a cross-correlation-based model that includes stages of peripheral auditory processing can, when coupled with an appropriate decision variable, account well for a wide variety of classic and recently published binaural detection data including those that have, heretofore, proven to be problematic.
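
    At the core of such models is the normalized interaural correlation of the (peripherally processed) left- and right-ear waveforms. A minimal sketch of that decision variable follows, leaving out the peripheral filtering and envelope-compression stages the paper adds:

    ```python
    import numpy as np

    def interaural_correlation(left, right):
        """Normalized interaural cross-correlation coefficient of the
        zero-mean left- and right-ear waveforms."""
        l = left - left.mean()
        r = right - right.mean()
        return float(np.dot(l, r) / np.sqrt(np.dot(l, l) * np.dot(r, r)))

    # A diotic noise (identical at both ears) has correlation 1; adding
    # an antiphasic signal (the NoSpi condition) lowers it, which is the
    # cue such models exploit for binaural detection.
    rng = np.random.default_rng(0)
    noise = rng.standard_normal(48000)
    signal = 0.1 * np.sin(2 * np.pi * 500 * np.arange(48000) / 48000)
    print(interaural_correlation(noise + signal, noise - signal))
    ```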

  14. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    NASA Technical Reports Server (NTRS)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  15. Comparison of culture-based, vital stain and PMA-qPCR methods for the quantitative detection of viable hookworm ova.

    PubMed

    Gyawali, P; Sidhu, J P S; Ahmed, W; Jagals, P; Toze, S

    2017-06-01

    Accurate quantitative measurement of viable hookworm ova from environmental samples is the key to controlling hookworm re-infections in the endemic regions. In this study, the accuracy of three quantitative detection methods [culture-based, vital stain and propidium monoazide-quantitative polymerase chain reaction (PMA-qPCR)] was evaluated by enumerating 1,000 ± 50 Ancylostoma caninum ova in the laboratory. The culture-based method was able to quantify an average of 397 ± 59 viable hookworm ova. Similarly, vital stain and PMA-qPCR methods quantified 644 ± 87 and 587 ± 91 viable ova, respectively. The numbers of viable ova estimated by the culture-based method were significantly (P < 0.05) lower than vital stain and PMA-qPCR methods. Therefore, both PMA-qPCR and vital stain methods appear to be suitable for the quantitative detection of viable hookworm ova. However, PMA-qPCR would be preferable over the vital stain method in scenarios where ova speciation is needed.

  16. Design of cinnamaldehyde amino acid Schiff base compounds based on the quantitative structure-activity relationship.

    PubMed

    Wang, Hui; Jiang, Mingyue; Li, Shujun; Hse, Chung-Yun; Jin, Chunde; Sun, Fangli; Li, Zhuo

    2017-09-01

    Cinnamaldehyde amino acid Schiff base (CAAS) is a new class of safe, bioactive compounds which could be developed as potential antifungal agents for fungal infections. To design new cinnamaldehyde amino acid Schiff base compounds with high bioactivity, the quantitative structure-activity relationships (QSARs) for CAAS compounds against Aspergillus niger (A. niger) and Penicillium citrinum (P. citrinum) were analysed. The QSAR models (R² = 0.9346 for A. niger, R² = 0.9590 for P. citrinum) were constructed and validated. The models indicated that the molecular polarity and the maximum atomic orbital electronic population had a significant effect on antifungal activity. Based on the best QSAR models, two new compounds were designed and synthesized. Antifungal activity tests proved that both of them have great bioactivity against the selected fungi.
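
    A QSAR model of this kind is, at its core, a multiple linear regression from molecular descriptors to activity. The sketch below fits such a model with scikit-learn; the descriptor names follow the abstract, but all numeric values are placeholders, not the authors' data.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.metrics import r2_score

    # Hypothetical descriptor matrix: each row is one CAAS compound, the
    # columns are molecular polarity and the maximum atomic orbital
    # electronic population (placeholder values for illustration only).
    X = np.array([[3.1, 0.42], [2.7, 0.39], [3.8, 0.47],
                  [2.2, 0.35], [3.5, 0.44], [2.9, 0.41]])
    y = np.array([1.52, 1.31, 1.88, 1.10, 1.74, 1.45])  # e.g. log(1/EC50)

    model = LinearRegression().fit(X, y)
    print("coefficients:", model.coef_,
          "R2:", r2_score(y, model.predict(X)))
    ```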

  18. Quantitative micro-CT based coronary artery profiling using interactive local thresholding and cylindrical coordinates.

    PubMed

    Panetta, Daniele; Pelosi, Gualtiero; Viglione, Federica; Kusmic, Claudia; Terreni, Marianna; Belcari, Nicola; Guerra, Alberto Del; Athanasiou, Lambros; Exarchos, Themistoklis; Fotiadis, Dimitrios I; Filipovic, Nenad; Trivella, Maria Giovanna; Salvadori, Piero A; Parodi, Oberdan

    2015-01-01

    Micro-CT is an established imaging technique for high-resolution non-destructive assessment of vascular samples, which is gaining growing interest for investigations of atherosclerotic arteries both in humans and in animal models. However, well-defined micro-CT image metrics suitable for comprehensive evaluation and quantification of features of interest in experimental atherosclerosis (ATS) are still lacking. A novel approach to micro-CT image processing for profiling of coronary ATS is described, providing comprehensive visualization and quantification of contrast-agent-free 3D high-resolution reconstructions of full-length artery walls. Accelerated coronary ATS was induced by a high-fat, cholesterol-enriched diet in swine, and the left coronary artery (LCA) was harvested en bloc for micro-CT scanning and histologic processing. A cylindrical coordinate system was defined on the image space after curved multiplanar reformation of the coronary vessel for comprehensive visualization of the main vessel features, such as wall thickening and calcium content. A novel semi-automatic segmentation procedure based on 2D histograms was implemented and the quantitative results validated by histology. The capability of attenuation-based micro-CT at low kV to reliably separate arterial wall layers from adjacent tissue, as well as to identify wall and plaque contours and major tissue components, was validated by histology. Morphometric indexes were derived from histological data corresponding to several micro-CT slices (double-observer evaluation at different coronary ATS stages) and highly significant correlations (R² > 0.90) were found. Semi-automatic morphometry was validated by double-observer manual morphometry of micro-CT slices, and highly significant correlations were found (R² > 0.92). The micro-CT methodology described represents a handy and reliable tool for quantitative high-resolution and contrast-agent-free full-length

  19. Traditional surgical planning of liver surgery is modified by 3D interactive quantitative surgical planning approach: a single-center experience with 305 patients.

    PubMed

    Wang, Xue-Dong; Wang, Hong-Guang; Shi, Jun; Duan, Wei-Dong; Luo, Ying; Ji, Wen-Bin; Zhang, Ning; Dong, Jia-Hong

    2017-06-01

    Decision making and surgical planning in liver surgery must achieve a precise balance among maximal removal of the target lesion, maximal sparing of the functional liver remnant, and minimal surgical invasiveness. The aim of this prospective study was to validate the accuracy and predictability of a 3D interactive quantitative surgical planning approach (IQSP) and to evaluate its impact on traditional surgical plans based on 2D images. A total of 305 consecutive patients undergoing hepatectomy were included in this study. Surgical plans were created by the traditional 2D approach using a picture archiving and communication system (PACS) and by the 3D approach using IQSP, respectively, by two groups of physicians blinded to each other's plans. The two surgical plans were submitted to the chief surgeon for selection before the operation, and the resected specimens were weighed. The two surgical plans were compared and analyzed retrospectively based on the operation results. The two surgical plans were successfully developed for all 305 patients, and the 3D IQSP plans were selected as the final decision in all cases. In total, 278 patients underwent surgery, comprising 147 uncomplex and 131 complex hepatectomies; 27 patients were withdrawn from hepatectomy. In the uncomplex group, the two surgical plans were the same for all 147 patients and no statistically significant difference was found among the 2D calculated resection volume (2D-RV), the 3D IQSP calculated resection volume (IQSP-RV) and the specimen volume. In the complex group, the two surgical plans differed in 49 patients (49/131, 37.4%). According to the significance of the differences, these 49 patients were classified into three grades. No statistically significant difference was found between IQSP-RV and specimen volume. The coincidence rate of IQSP territory analysis with the operation was 92.1% (93/101) for the 101 patients undergoing anatomic hepatectomy.

  20. Marker-based quantitative genetics in the wild?: the heritability and genetic correlation of chemical defenses in eucalyptus.

    PubMed

    Andrew, R L; Peakall, R; Wallis, I R; Wood, J T; Knight, E J; Foley, W J

    2005-12-01

    Marker-based methods for estimating heritability and genetic correlation in the wild have attracted interest because traditional methods may be impractical or introduce bias via G x E effects, mating system variation, and sampling effects. However, they have not been widely used, especially in plants. A regression-based approach, which uses a continuous measure of genetic relatedness, promises to be particularly appropriate for use in plants with mixed-mating systems and overlapping generations. Using this method, we found significant narrow-sense heritability of foliar defense chemicals in a natural population of Eucalyptus melliodora. We also demonstrated a genetic basis for the phenotypic correlation underlying an ecological example of conditioned flavor aversion involving different biosynthetic pathways. Our results revealed that heritability estimates depend on the spatial scale of the analysis in a way that offers insight into the distribution of genetic and environmental variance. This study is the first to successfully use a marker-based method to measure quantitative genetic parameters in a tree. We suggest that this method will prove to be a useful tool in other studies and offer some recommendations for future applications of the method.

  1. Contrast-enhanced spectral mammography based on a photon-counting detector: quantitative accuracy and radiation dose

    NASA Astrophysics Data System (ADS)

    Lee, Seungwan; Kang, Sooncheol; Eom, Jisoo

    2017-03-01

    Contrast-enhanced mammography has been used to obtain functional information about breast tumors by injecting contrast agents. However, the conventional single-exposure technique degrades the efficiency of tumor detection due to structure overlapping. Dual-energy techniques with energy-integrating detectors (EIDs) also increase radiation dose and reduce the accuracy of material decomposition due to the limitations of EIDs. On the other hand, spectral mammography with photon-counting detectors (PCDs) can resolve the issues of the conventional technique and of EIDs by using their energy-discrimination capabilities. In this study, contrast-enhanced spectral mammography based on a PCD was implemented using a polychromatic dual-energy model, and the proposed technique was compared with a dual-energy technique using an EID in terms of quantitative accuracy and radiation dose. The results showed that the proposed technique improved quantitative accuracy and reduced radiation dose compared with the EID-based dual-energy technique. The quantitative accuracy of the PCD-based contrast-enhanced spectral mammography improved slightly as a function of radiation dose. Therefore, contrast-enhanced spectral mammography based on a PCD can provide useful information for detecting breast tumors and improving diagnostic accuracy.
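
    The decomposition underlying dual-energy imaging can be illustrated with a linearized two-material model: two energy-bin measurements give a 2x2 linear system for the two basis-material thicknesses. The sketch below uses hypothetical effective attenuation coefficients; the paper's polychromatic model replaces this with a calibrated nonlinear inversion.

    ```python
    import numpy as np

    def decompose_two_materials(log_atten_low, log_atten_high, mu):
        """Solve [L_low, L_high]^T = mu @ [t_tissue, t_iodine]^T per
        pixel, where mu holds effective linear attenuation coefficients
        (rows: energy bins, columns: basis materials)."""
        L = np.stack([log_atten_low, log_atten_high])   # shape (2, npix)
        return np.linalg.inv(mu) @ L                    # thicknesses (cm)

    # Hypothetical effective coefficients (1/cm) for breast tissue and
    # iodinated contrast in the low/high bins of a photon-counting
    # detector; one example pixel.
    mu = np.array([[0.60, 4.0],
                   [0.45, 1.5]])
    print(decompose_two_materials(np.array([1.7]), np.array([1.2]), mu))
    # -> about 2.5 cm tissue and 0.05 cm iodine path length
    ```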

  2. Quantitative multiplex detection of biomarkers on a waveguide-based biosensor using quantum dots

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, Hongzhi; Mukundan, Harshini; Martinez, Jennifer S

    2009-01-01

    multiple color imaging of live cells using QD-bioconjugates [Jaiswal 2003]. Gao [Gao 2004] and So [So 2006] have used QDs as probes for in-vivo cancer targeting and imaging. Medintz et al. reported self-assembled QD-based biosensors for detection of analytes based on energy transfer [Medintz 2003]. Others have developed an approach for multiplex optical encoding of biomolecules using QDs [Han 2001]. Immunoassays have also benefited from the advantages of QDs. Recently, dihydrolipoic acid (DHLA) capped-QDs have been attached to antibodies and used as fluorescence reporters in plate-based multiplex immunoassays [Goodman 2004]. However, DHLA-QDs are associated with low quantum efficiency and are unstable at neutral pH. These problems limit the application of this technology to the sensitive detection of biomolecules, especially in complex biological samples. Thus, the development of a rapid, sensitive, quantitative, and specific multiplex platform for the detection of biomarkers in difficult samples remains an elusive target. The goal stated above has applications in many fields including medical diagnostics, biological research, and threat reduction. The current decade alone has seen the development of a need to rapidly and accurately detect potential biological warfare agents. For example, current methods for the detection of anthrax are grossly inadequate for a variety of reasons including long incubation time (5 days from time of exposure to onset of symptoms) and non-specific ('flu-like') symptoms. When five employees of the United States Senate were exposed to B. anthracis in the mail (2001), only one patient had a confirmed diagnosis before death. Since then, sandwich immunoassays using both colorimetric and fluorescence detectors have been developed for key components of the anthrax lethal toxin, namely protective antigen (PA), lethal factor (LF), and the edema factor [Mourez 2001]. While these platforms were successful in assays against anthrax toxins, the sensitivity

  3. Cleavage Entropy as Quantitative Measure of Protease Specificity

    PubMed Central

    Fuchs, Julian E.; von Grafenstein, Susanne; Huber, Roland G.; Margreiter, Michael A.; Spitzer, Gudrun M.; Wallnoefer, Hannes G.; Liedl, Klaus R.

    2013-01-01

    A purely information theory-guided approach to quantitatively characterize protease specificity is established. We calculate an entropy value for each protease subpocket based on sequences of cleaved substrates extracted from the MEROPS database. We compare our results with known subpocket specificity profiles for individual proteases and protease groups (e.g. serine proteases, metallo proteases) and reflect them quantitatively. Summation of subpocket-wise cleavage entropy contributions yields a measure for overall protease substrate specificity. This total cleavage entropy allows ranking of different proteases with respect to their specificity, separating unspecific digestive enzymes showing high total cleavage entropy from specific proteases involved in signaling cascades. The development of a quantitative cleavage entropy score allows an unbiased comparison of subpocket-wise and overall protease specificity. Thus, it enables assessment of relative importance of physicochemical and structural descriptors in protease recognition. We present an exemplary application of cleavage entropy in tracing substrate specificity in protease evolution. This highlights the wide range of substrate promiscuity within homologue proteases and hence the heavy impact of a limited number of mutations on individual substrate specificity. PMID:23637583
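
    The central quantity is a per-subpocket Shannon entropy over the amino acids observed at one substrate position across cleaved substrates, normalized so that 1.0 means no preference; summing over subpockets gives the total cleavage entropy. A minimal sketch with toy residue lists (not MEROPS data):

    ```python
    import math
    from collections import Counter

    N_AMINO_ACIDS = 20

    def subpocket_entropy(residues):
        """Shannon entropy (bits) of the amino-acid distribution seen in
        one subpocket across cleaved substrates, normalized by log2(20)
        so that 1.0 corresponds to no specificity at all."""
        counts = Counter(residues)
        n = sum(counts.values())
        s = -sum((c / n) * math.log2(c / n) for c in counts.values())
        return s / math.log2(N_AMINO_ACIDS)

    # A specific pocket (mostly K/R) scores low; a promiscuous pocket
    # sampling many residues scores high. Summing over P4-P4' would give
    # the total cleavage entropy of the protease.
    print(subpocket_entropy(list("KKKRKKRK")))      # ~0.19
    print(subpocket_entropy(list("ACDEFGHIKLMN")))  # ~0.83
    ```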

  4. Network-based Approaches in Pharmacology.

    PubMed

    Boezio, Baptiste; Audouze, Karine; Ducrot, Pierre; Taboureau, Olivier

    2017-10-01

    In drug discovery, network-based approaches are expected to spotlight our understanding of drug action across multiple layers of information. On one hand, network pharmacology considers the drug response in the context of a cellular or phenotypic network. On the other hand, a chemical-based network is a promising alternative for characterizing the chemical space. Both can provide complementary support for the development of rational drug design and better knowledge of the mechanisms underlying the multiple actions of drugs. Recent progress in both concepts is discussed here. In addition, a network-based approach using drug-target-therapy data is introduced as an example. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. An evidential reasoning-based AHP approach for the selection of environmentally-friendly designs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NG, C.Y., E-mail: ng.cy@cityu.edu.hk

    Due to the stringent environmental regulatory requirements imposed by cross-national bodies in recent years, manufacturers have to minimize the environmental impact of their products. Among the available environmental impact evaluation tools, Life Cycle Assessment (LCA) is often employed to quantify a product's environmental impact throughout its entire life cycle. However, owing to the expert knowledge of environmental science and the vast data-collection effort required to carry out LCA, as well as the common absence of complete product information during product development, there is a need for a tool better suited to product designers. An evidential reasoning-based approach, which aims at providing a fast-track method for design alternative evaluation by non-LCA experts, is therefore introduced as a new initiative to deal with incomplete or uncertain information. The proposed approach also enables decision makers to quantitatively assess life cycle phases and design alternatives by comparing their potential environmental impacts, thus effectively and efficiently facilitating the identification of greener designs. A case application is carried out to demonstrate the applicability of the proposed approach.
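
    For the AHP part of such an approach, criteria weights are conventionally taken as the normalized principal eigenvector of a pairwise-comparison matrix (Saaty's eigenvector method). A sketch of that step with a hypothetical 3-criteria matrix follows; the evidential-reasoning aggregation of belief degrees is a separate step not shown here.

    ```python
    import numpy as np

    def ahp_weights(pairwise):
        """Criteria weights as the normalized principal eigenvector of
        an AHP pairwise-comparison matrix."""
        vals, vecs = np.linalg.eig(np.asarray(pairwise, dtype=float))
        w = np.real(vecs[:, np.argmax(np.real(vals))])
        return w / w.sum()

    # Hypothetical comparison of three design criteria (e.g. material
    # use, energy use, end-of-life) on Saaty's 1-9 scale.
    A = [[1,   3,   5],
         [1/3, 1,   2],
         [1/5, 1/2, 1]]
    print(ahp_weights(A))  # -> roughly [0.65, 0.23, 0.12]
    ```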

  6. A Quantitative Study of Empathy in Pakistani Medical Students: A Multicentered Approach.

    PubMed

    Tariq, Nabia; Rasheed, Tariq; Tavakol, Mohsen

    2017-10-01

    To examine the empathy level of undergraduate medical students in Pakistan, three hypotheses were developed based on the literature review: (1) female medical students have a higher level of empathy than male students; (2) empathy scores vary across the medical school years in Pakistani students; and (3) medical students interested in people-oriented specialties score higher than students interested in technology-oriented specialties. This quantitative study used a cross-sectional design with 1453 students from 8 Pakistani medical schools, both private and state. The sample consisted of 41.1% (n = 597) male students and 58.9% (n = 856) female students. Empirical data were collected using the Jefferson Scale of Physician Empathy (JSPE), a well-validated self-administered questionnaire. The mean empathy score among students was 4.77, with a standard deviation of 0.72. The results show no statistically significant association between empathy scores and gender, t(1342.36) = -0.053, P = .95. There is a statistically significant difference in empathy scores across the years of medical school, F(14, 1448) = 4.95, P = .01. Concerning specialty interests, there is no statistically significant difference in empathy scores. The findings of this study showed that medical students in Western countries performed better on the empathy scale than Pakistani medical students. This finding has important implications for Pakistani medical educators seeking to improve the interpersonal skills of medical students in the context of patient care. Contrary to our expectations and experience, our findings do not support that female medical students scored better than their male counterparts on the empathy scale. Because of the nature of a cross-sectional study, it is impossible to argue for a decline of empathy during medical school training.

  7. Registration of 3D spectral OCT volumes combining ICP with a graph-based approach

    NASA Astrophysics Data System (ADS)

    Niemeijer, Meindert; Lee, Kyungmoo; Garvin, Mona K.; Abràmoff, Michael D.; Sonka, Milan

    2012-02-01

    The introduction of spectral Optical Coherence Tomography (OCT) scanners has enabled acquisition of high resolution, 3D cross-sectional volumetric images of the retina. 3D-OCT is used to detect and manage eye diseases such as glaucoma and age-related macular degeneration. To follow up patients over time, image registration is a vital tool to enable more precise, quantitative comparison of disease states. In this work we present a 3D registration method based on a two-step approach. In the first step we register both scans in the XY domain using an Iterative Closest Point (ICP) based algorithm. This algorithm is applied to vessel segmentations obtained from the projection image of each scan. The distance minimized in the ICP algorithm includes measurements of the vessel orientation and vessel width to allow for a more robust match. In the second step, a graph-based method is applied to find the optimal translation along the depth axis of the individual A-scans in the volume to match both scans. The cost image used to construct the graph is based on the mean squared error (MSE) between matching A-scans in both images at different translations. We have applied this method to the registration of Optic Nerve Head (ONH) centered 3D-OCT scans of the same patient. First, 10 3D-OCT scans of 5 eyes with glaucoma imaged in vivo were registered for a qualitative evaluation of the algorithm performance. Then, 17 OCT data set pairs of 17 eyes with known deformation were used for quantitative assessment of the method's robustness.
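
    The first step rests on the classic point-to-point ICP loop: match each source point to its nearest destination point, estimate the rigid transform by SVD, apply it, and repeat. A plain 2D sketch follows; the paper's variant additionally weights matches by vessel orientation and width, which this sketch omits.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    def icp_2d(src, dst, iters=50):
        """Point-to-point rigid ICP in 2D on, e.g., vessel-centerline
        points from the two projection images."""
        src = np.asarray(src, dtype=float).copy()
        dst = np.asarray(dst, dtype=float)
        tree = cKDTree(dst)
        R_total, t_total = np.eye(2), np.zeros(2)
        for _ in range(iters):
            _, idx = tree.query(src)               # nearest-neighbor match
            matched = dst[idx]
            mu_s, mu_d = src.mean(axis=0), matched.mean(axis=0)
            H = (src - mu_s).T @ (matched - mu_d)  # cross-covariance
            U, _, Vt = np.linalg.svd(H)
            R = Vt.T @ U.T
            if np.linalg.det(R) < 0:               # guard against reflection
                Vt[-1] *= -1
                R = Vt.T @ U.T
            t = mu_d - R @ mu_s
            src = src @ R.T + t                    # apply this iteration
            R_total, t_total = R @ R_total, R @ t_total + t
        return R_total, t_total                    # maps original src to dst
    ```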

  8. Multi-atlas and label fusion approach for patient-specific MRI based skull estimation.

    PubMed

    Torrado-Carvajal, Angel; Herraiz, Joaquin L; Hernandez-Tamames, Juan A; San Jose-Estepar, Raul; Eryaman, Yigitcan; Rozenholc, Yves; Adalsteinsson, Elfar; Wald, Lawrence L; Malpica, Norberto

    2016-04-01

    MRI-based skull segmentation is a useful procedure for many imaging applications. This study describes a methodology for automatic segmentation of the complete skull from a single T1-weighted volume. The skull is estimated using a multi-atlas segmentation approach. Using a whole head computed tomography (CT) scan database, the skull in a new MRI volume is detected by nonrigid image registration of the volume to every CT, and combination of the individual segmentations by label-fusion. We have compared Majority Voting, Simultaneous Truth and Performance Level Estimation (STAPLE), Shape Based Averaging (SBA), and the Selective and Iterative Method for Performance Level Estimation (SIMPLE) algorithms. The pipeline has been evaluated quantitatively using images from the Retrospective Image Registration Evaluation database (reaching an overlap of 72.46 ± 6.99%), a clinical CT-MR dataset (maximum overlap of 78.31 ± 6.97%), and a whole head CT-MRI pair (maximum overlap 78.68%). A qualitative evaluation has also been performed on MRI acquisition of volunteers. It is possible to automatically segment the complete skull from MRI data using a multi-atlas and label fusion approach. This will allow the creation of complete MRI-based tissue models that can be used in electromagnetic dosimetry applications and attenuation correction in PET/MR. © 2015 Wiley Periodicals, Inc.
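
    Of the label-fusion rules compared, majority voting is the simplest and illustrates the idea: a voxel is labeled skull if more than half of the atlas-propagated segmentations agree. A toy sketch follows (STAPLE, SBA and SIMPLE weight the atlases instead of counting votes equally):

    ```python
    import numpy as np

    def majority_vote(label_maps):
        """Fuse binary segmentations propagated from each atlas by
        voxel-wise majority voting."""
        stack = np.stack(label_maps)  # shape: (n_atlases, *volume_shape)
        return (stack.sum(axis=0) > stack.shape[0] / 2).astype(np.uint8)

    # Three toy 2x2 "segmentations": the fused mask keeps voxels labeled
    # skull by at least two of the three atlases.
    a = np.array([[1, 0], [1, 1]])
    b = np.array([[1, 0], [0, 1]])
    c = np.array([[0, 0], [1, 1]])
    print(majority_vote([a, b, c]))  # -> [[1 0] [1 1]]
    ```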

  9. [Quantitative classification-based occupational health management for electroplating enterprises in Baoan District of Shenzhen, China].

    PubMed

    Zhang, Sheng; Huang, Jinsheng; Yang, Baigbing; Lin, Binjie; Xu, Xinyun; Chen, Jinru; Zhao, Zhuandi; Tu, Xiaozhi; Bin, Haihua

    2014-04-01

    To improve the occupational health management levels in electroplating enterprises with quantitative classification measures and to provide a scientific basis for the prevention and control of occupational hazards in electroplating enterprises and the protection of workers' health. A quantitative classification table was created for the occupational health management in electroplating enterprises. The evaluation indicators included 6 items and 27 sub-items, with a total score of 100 points. Forty electroplating enterprises were selected and scored according to the quantitative classification table. These electroplating enterprises were classified into grades A, B, and C based on the scores. Among 40 electroplating enterprises, 11 (27.5%) had scores of >85 points (grade A), 23 (57.5%) had scores of 60∼85 points (grade B), and 6 (15.0%) had scores of <60 points (grade C). Quantitative classification management for electroplating enterprises is a valuable attempt, which is helpful for the supervision and management by the health department and provides an effective method for the self-management of enterprises.

  10. Quantitative aspects of inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Bulska, Ewa; Wagner, Barbara

    2016-10-01

    Accurate determination of elements in various kinds of samples is essential for many areas, including environmental science, medicine, as well as industry. Inductively coupled plasma mass spectrometry (ICP-MS) is a powerful tool enabling multi-elemental analysis of numerous matrices with high sensitivity and good precision. Various calibration approaches can be used to perform accurate quantitative measurements by ICP-MS. They include the use of pure standards, matrix-matched standards, or relevant certified reference materials, assuring traceability of the reported results. This review critically evaluates the advantages and limitations of different calibration approaches, which are used in quantitative analyses by ICP-MS. Examples of such analyses are provided. This article is part of the themed issue 'Quantitative mass spectrometry'.

  11. Quantitative segmentation of fluorescence microscopy images of heterogeneous tissue: Approach for tuning algorithm parameters

    NASA Astrophysics Data System (ADS)

    Mueller, Jenna L.; Harmany, Zachary T.; Mito, Jeffrey K.; Kennedy, Stephanie A.; Kim, Yongbaek; Dodd, Leslie; Geradts, Joseph; Kirsch, David G.; Willett, Rebecca M.; Brown, J. Quincy; Ramanujam, Nimmi

    2013-02-01

    The combination of fluorescent contrast agents with microscopy is a powerful technique to obtain real time images of tissue histology without the need for fixing, sectioning, and staining. The potential of this technology lies in the identification of robust methods for image segmentation and quantitation, particularly in heterogeneous tissues. Our solution is to apply sparse decomposition (SD) to monochrome images of fluorescently-stained microanatomy to segment and quantify distinct tissue types. The clinical utility of our approach is demonstrated by imaging excised margins in a cohort of mice after surgical resection of a sarcoma. Representative images of excised margins were used to optimize the formulation of SD and tune parameters associated with the algorithm. Our results demonstrate that SD is a robust solution that can advance vital fluorescence microscopy as a clinically significant technology.

  12. Artistic image analysis using graph-based learning approaches.

    PubMed

    Carneiro, Gustavo

    2013-08-01

    We introduce a new methodology for the problem of artistic image analysis, which among other tasks, involves the automatic identification of visual classes present in an art work. In this paper, we advocate the idea that artistic image analysis must explore a graph that captures the network of artistic influences by computing the similarities in terms of appearance and manual annotation. One of the novelties of our methodology is the proposed formulation that is a principled way of combining these two similarities in a single graph. Using this graph, we show that an efficient random walk algorithm based on an inverted label propagation formulation produces more accurate annotation and retrieval results compared with the following baseline algorithms: bag of visual words, label propagation, matrix completion, and structural learning. We also show that the proposed approach leads to a more efficient inference and training procedures. This experiment is run on a database containing 988 artistic images (with 49 visual classification problems divided into a multiclass problem with 27 classes and 48 binary problems), where we show the inference and training running times, and quantitative comparisons with respect to several retrieval and annotation performance measures.

  13. Quantitative systems toxicology

    PubMed Central

    Bloomingdale, Peter; Housand, Conrad; Apgar, Joshua F.; Millard, Bjorn L.; Mager, Donald E.; Burke, John M.; Shah, Dhaval K.

    2017-01-01

    The overarching goal of modern drug development is to optimize therapeutic benefits while minimizing adverse effects. However, inadequate efficacy and safety concerns remain the major causes of drug attrition in clinical development. For the past 80 years, toxicity testing has consisted of evaluating the adverse effects of drugs in animals to predict human health risks. The U.S. Environmental Protection Agency recognized the need to develop innovative toxicity testing strategies and asked the National Research Council to develop a long-range vision and strategy for toxicity testing in the 21st century. The vision aims to reduce the use of animals and drug development costs through the integration of computational modeling and in vitro experimental methods that evaluate the perturbation of toxicity-related pathways. Towards this vision, collaborative quantitative systems pharmacology and toxicology (QSP/QST) modeling endeavors have been initiated amongst numerous organizations worldwide. In this article, we discuss how quantitative structure-activity relationship (QSAR), network-based, and pharmacokinetic/pharmacodynamic modeling approaches can be integrated into the framework of QST models. Additionally, we review the application of QST models to predict the cardiotoxicity and hepatotoxicity of drugs throughout their development. Cell- and organ-specific QST models are likely to become an essential component of modern toxicity testing, and they provide a solid foundation for determining individualized therapeutic windows to improve patient safety. PMID:29308440

  14. Dynamic phase differences based on quantitative phase imaging for the objective evaluation of cell behavior.

    PubMed

    Krizova, Aneta; Collakova, Jana; Dostal, Zbynek; Kvasnica, Lukas; Uhlirova, Hana; Zikmund, Tomas; Vesely, Pavel; Chmelik, Radim

    2015-01-01

    Quantitative phase imaging (QPI) brought innovation to noninvasive observation of live cell dynamics seen as cell behavior. Unlike the Zernike phase contrast or differential interference contrast, QPI provides quantitative information about cell dry mass distribution. We used such data for objective evaluation of live cell behavioral dynamics by the advanced method of dynamic phase differences (DPDs). The DPDs method is considered a rational instrument offered by QPI. By subtracting the antecedent from the subsequent image in a time-lapse series, only the changes in mass distribution in the cell are detected. The result is either visualized as a two-dimensional color-coded projection of these two states of the cell or as a time dependence of changes quantified in picograms. Then in a series of time-lapse recordings, the chain of cell mass distribution changes that would otherwise escape attention is revealed. Consequently, new salient features of live cell behavior should emerge. Construction of the DPDs method and results exhibiting the approach are presented. Advantage of the DPDs application is demonstrated on cells exposed to an osmotic challenge. For time-lapse acquisition of quantitative phase images, the recently developed coherence-controlled holographic microscope was employed.
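
    A minimal sketch of the two ingredients, assuming phase maps in radians and the usual Barer relation between phase and dry-mass surface density; the wavelength and specific refraction increment below are typical assumed values, not instrument constants from the paper:

    ```python
    import numpy as np

    LAMBDA_UM = 0.65  # illumination wavelength, um (assumed)
    ALPHA = 0.18      # specific refraction increment, um^3/pg (typ. 0.18-0.21)

    def dry_mass_pg(phase, pixel_area_um2):
        """Cell dry mass from a quantitative phase image via the Barer
        relation: surface density = phase * lambda / (2*pi*alpha)."""
        density = phase * LAMBDA_UM / (2 * np.pi * ALPHA)  # pg / um^2
        return float(density.sum() * pixel_area_um2)

    def dynamic_phase_difference(phase_t0, phase_t1, pixel_area_um2):
        """DPD: subtract the antecedent from the subsequent frame so only
        redistributed mass remains; returns the difference map and the
        net mass change in picograms."""
        diff = phase_t1 - phase_t0
        return diff, dry_mass_pg(diff, pixel_area_um2)
    ```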

  16. Quantitative prediction of drug side effects based on drug-related features.

    PubMed

    Niu, Yanqing; Zhang, Wen

    2017-09-01

    Unexpected side effects of drugs are a great concern in drug development, and the identification of side effects is an important task. Recently, machine learning methods have been proposed to predict the presence or absence of side effects of interest for drugs, but it is difficult to make accurate predictions for all of them. In this paper, we transform the side effect profiles of drugs into quantitative scores by summing up their side effects with weights. The quantitative scores may measure the dangers of drugs and thus help to compare the risk of different drugs. Here, we attempt to predict the quantitative scores of drugs, namely quantitative prediction. Specifically, we explore a variety of drug-related features and evaluate their discriminative power for quantitative prediction. We then consider several feature combination strategies (direct combination, average scoring ensemble combination) to integrate three informative features: chemical substructures, targets, and treatment indications. Finally, the average scoring ensemble model, which produces the better performance, is used as the final quantitative prediction model. Since the weights for side effects are empirical values, we randomly generated different weights in the simulation experiments. The experimental results show that the quantitative method is robust to different weights and produces satisfying results. Although other state-of-the-art methods cannot make quantitative predictions directly, their prediction results can be transformed into quantitative scores. By indirect comparison, the proposed method produces much better results than benchmark methods in quantitative prediction. In conclusion, the proposed method is promising for the quantitative prediction of side effects and may work cooperatively with existing state-of-the-art methods to reveal the dangers of drugs.

  17. Precise Quantitation of MicroRNA in a Single Cell with Droplet Digital PCR Based on Ligation Reaction.

    PubMed

    Tian, Hui; Sun, Yuanyuan; Liu, Chenghui; Duan, Xinrui; Tang, Wei; Li, Zhengping

    2016-12-06

    MicroRNA (miRNA) analysis in a single cell is extremely important because it allows deep understanding of the exact correlation between the miRNAs and cell functions. Herein, we wish to report a highly sensitive and precisely quantitative assay for miRNA detection based on ligation-based droplet digital polymerase chain reaction (ddPCR), which permits the quantitation of miRNA in a single cell. In this ligation-based ddPCR assay, two target-specific oligonucleotide probes can be simply designed to be complementary to the two half-sequences of the target miRNA, respectively, which avoids the sophisticated design of reverse transcription and provides high specificity to discriminate a single-base difference among miRNAs with simple operations. After the miRNA-templated ligation, the ddPCR partitions individual ligated products into water-in-oil droplets and digitally counts the fluorescence-positive and negative droplets after PCR amplification for quantification of the target molecules, which possesses the power of precise quantitation and robustness to variation in PCR efficiency. By integrating the advantages of the precise quantification of ddPCR and the simplicity of the ligation-based PCR, the proposed method can sensitively measure let-7a miRNA with a detection limit of 20 aM (12 copies per microliter), and even a single-base difference can be discriminated in let-7 family members. More importantly, due to its high selectivity and sensitivity, the proposed method can achieve precise quantitation of miRNAs in single-cell lysate. Therefore, the ligation-based ddPCR assay may serve as a useful tool to exactly reveal the miRNAs' actions in a single cell, which is of great importance for the study of miRNAs' biofunction as well as for the related biomedical studies.

  18. A web-based quantitative signal detection system on adverse drug reaction in China.

    PubMed

    Li, Chanjuan; Xia, Jielai; Deng, Jianxiong; Chen, Wenge; Wang, Suzhen; Jiang, Jing; Chen, Guanquan

    2009-07-01

    To establish a web-based quantitative signal detection system for adverse drug reactions (ADRs) based on spontaneous reports to the Guangdong province drug-monitoring database in China. Using the Microsoft Visual Basic and Active Server Pages programming languages with SQL Server 2000, a web-based system with three software modules was programmed to perform data preparation, association detection, and report generation. The information component (IC), an internationally recognized disproportionality measure for quantitative signal detection, was integrated into the system, and its capacity for signal detection was tested with ADR reports collected from 1 January 2002 to 30 June 2007 in Guangdong. A total of 2,496 associations, including known signals, were mined from the test database. Signals (e.g., cefradine-induced hematuria) were found early by using the IC analysis. In addition, 291 drug-ADR associations were alerted for the first time in the second quarter of 2007. The system, the first of its kind in China, can be used to detect significant associations in the Guangdong drug-monitoring database and could be an extremely useful adjunct to the expert assessment of very large numbers of spontaneously reported ADRs.
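
    For illustration, one common shrinkage form of the IC compares observed and expected report counts for a drug-ADR pair; positive values flag disproportionate reporting. The counts below are invented, and the exact BCPNN formula in the deployed system may differ:

    ```python
    import math

    def information_component(n_ij, n_i, n_j, n):
        """IC = log2((n_ij + 0.5) / (E_ij + 0.5)), where E_ij = n_i*n_j/n
        is the count expected if drug and ADR were independent; the 0.5
        terms shrink estimates for rare pairs toward zero."""
        e_ij = n_i * n_j / n
        return math.log2((n_ij + 0.5) / (e_ij + 0.5))

    # Hypothetical counts: 40 cefradine-hematuria reports, 2000 cefradine
    # reports, 500 hematuria reports, 100000 reports in total -> IC ~ 1.9,
    # i.e. observed co-reporting well above the independence expectation.
    print(information_component(40, 2000, 500, 100000))
    ```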

  19. Multigrid-based reconstruction algorithm for quantitative photoacoustic tomography

    PubMed Central

    Li, Shengfu; Montcel, Bruno; Yuan, Zhen; Liu, Wanyu; Vray, Didier

    2015-01-01

    This paper proposes a multigrid inversion framework for quantitative photoacoustic tomography reconstruction. The forward model of optical fluence distribution and the inverse problem are solved at multiple resolutions. A fixed-point iteration scheme is formulated for each resolution and used as a cost function. The simulated and experimental results for quantitative photoacoustic tomography reconstruction show that the proposed multigrid inversion can dramatically reduce the required number of iterations for the optimization process without loss of reliability in the results. PMID:26203371

  20. Polymer on Top: Current Limits and Future Perspectives of Quantitatively Evaluating Surface Grafting.

    PubMed

    Michalek, Lukas; Barner, Leonie; Barner-Kowollik, Christopher

    2018-03-07

    Well-defined polymer strands covalently tethered onto solid substrates determine the properties of the resulting functional interface. Herein, the current approaches to determining quantitative grafting densities are assessed. Based on a brief introduction to the key theories describing polymer brush regimes, a user's guide is provided to estimating maximum chain coverage and, importantly, to examining the most frequently employed approaches for determining grafting densities, i.e., dry thickness measurements, gravimetric assessment, and swelling experiments. The reliability of these determination methods is estimated by carefully evaluating their assumptions and assessing the stability of the underpinning equations. A practical guide for comparatively and quantitatively evaluating the reliability of a given approach is thus provided, enabling the field to critically judge experimentally determined grafting densities and to avoid reporting grafting densities that fall outside the physically realistic parameter space. The assessment concludes with a perspective on the development of advanced approaches for the determination of grafting density, in particular single-chain methodologies. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
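
    As a pointer to the arithmetic of the dry-thickness method discussed here: grafting density follows from the dry layer thickness, bulk density, and number-average molar mass as sigma = h * rho * N_A / Mn. A sketch with assumed polystyrene-like values:

    ```python
    N_A = 6.022e23  # Avogadro's number, 1/mol

    def grafting_density(h_nm, rho_g_cm3, mn_g_mol):
        """Grafting density (chains/nm^2) from the dry-thickness method,
        sigma = h * rho * N_A / Mn, with unit conversions folded in
        (nm -> cm for thickness, cm^2 -> nm^2 for the final area)."""
        return h_nm * 1e-7 * rho_g_cm3 * N_A / mn_g_mol / 1e14

    # Example (assumed values): a 10 nm dry brush of rho ~ 1.05 g/cm^3
    # and Mn = 50 kg/mol gives ~0.13 chains/nm^2.
    print(grafting_density(10, 1.05, 50_000))
    ```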

  1. Coupled-cluster based approach for core-level states in condensed phase: Theory and application to different protonated forms of aqueous glycine

    DOE PAGES

    Sadybekov, Arman; Krylov, Anna I.

    2017-07-07

    A theoretical approach for calculating core-level states in condensed phase is presented. The approach is based on equation-of-motion coupled-cluster theory (EOM-CC) and the effective fragment potential (EFP) method. By introducing an approximate treatment of double excitations in the EOM-CCSD (EOM-CC with single and double substitutions) ansatz, we address poor convergence issues that are encountered for the core-level states and significantly reduce computational costs. While the approximations introduce relatively large errors in the absolute values of transition energies, the errors are systematic. Consequently, chemical shifts, changes in ionization energies relative to reference systems, are reproduced reasonably well. By using different protonation forms of solvated glycine as a benchmark system, we show that our protocol is capable of reproducing the experimental chemical shifts with a quantitative accuracy. The results demonstrate that chemical shifts are very sensitive to the solvent interactions and that explicit treatment of solvent, such as EFP, is essential for achieving quantitative accuracy.

  3. Label-Free, LC-MS-Based Assays to Quantitate Small-Molecule Antagonist Binding to the Mammalian BLT1 Receptor.

    PubMed

    Chen, Xun; Stout, Steven; Mueller, Uwe; Boykow, George; Visconti, Richard; Siliphaivanh, Phieng; Spencer, Kerrie; Presland, Jeremy; Kavana, Michael; Basso, Andrea D; McLaren, David G; Myers, Robert W

    2017-08-01

    We have developed and validated label-free, liquid chromatography-mass spectrometry (LC-MS)-based equilibrium direct and competition binding assays to quantitate small-molecule antagonist binding to recombinant human and mouse BLT1 receptors expressed in HEK 293 cell membranes. Procedurally, these binding assays involve (1) equilibration of the BLT1 receptor and probe ligand, with or without a competitor; (2) vacuum filtration through cationic glass fiber filters to separate receptor-bound from free probe ligand; and (3) LC-MS analysis in selected reaction monitoring mode for bound probe ligand quantitation. Two novel, optimized probe ligands, compounds 1 and 2, were identified by screening 20 unlabeled BLT1 antagonists for direct binding. Saturation direct binding studies confirmed the high affinity, and dissociation studies established the rapid binding kinetics of probe ligands 1 and 2. Competition binding assays were established using both probe ligands, and the affinities of structurally diverse BLT1 antagonists were measured. Both binding assay formats can be executed with high specificity and sensitivity and moderate throughput (96-well plate format) using these approaches. This highly versatile, label-free method for studying ligand binding to membrane-associated receptors should find broad application as an alternative to traditional methods using labeled ligands.

  4. Development and Validation of a Quantitative Framework and Management Expectation Tool for the Selection of Bioremediation Approaches at Chlorinated Ethene Sites

    DTIC Science & Technology

    2015-12-01

    Final report. The objective of project ER-201129 was to develop and validate a framework used to make bioremediation decisions based on site-specific physical and biogeochemical ...

  5. A new approach to hand-based authentication

    NASA Astrophysics Data System (ADS)

    Amayeh, G.; Bebis, G.; Erol, A.; Nicolescu, M.

    2007-04-01

    Hand-based authentication is a key biometric technology with a wide range of potential applications both in industry and government. Traditionally, hand-based authentication is performed by extracting information from the whole hand. To account for hand and finger motion, guidance pegs are employed to fix the position and orientation of the hand. In this paper, we consider a component-based approach to hand-based verification. Our objective is to investigate the discrimination power of different parts of the hand in order to develop a simpler, faster, and possibly more accurate and robust verification system. Specifically, we propose a new approach which decomposes the hand into different regions, corresponding to the fingers and the back of the palm, and performs verification using information from certain parts of the hand only. Our approach operates on 2D images acquired by placing the hand on a flat lighting table. Using a part-based representation of the hand allows the system to compensate for hand and finger motion without using any guidance pegs. To decompose the hand into different regions, we use a robust methodology based on morphological operators which does not require detecting any landmark points on the hand. To capture the geometry of the back of the palm and the fingers in sufficient detail, we employ high-order Zernike moments which are computed using an efficient methodology. The proposed approach has been evaluated on a database of 100 subjects with 10 images per subject, illustrating promising performance. Comparisons with related approaches using the whole hand for verification illustrate the superiority of the proposed approach. Moreover, qualitative comparisons with state-of-the-art approaches indicate that the proposed approach has comparable or better performance.

  6. A peptide-retrieval strategy enables significant improvement of quantitative performance without compromising confidence of identification.

    PubMed

    Tu, Chengjian; Shen, Shichen; Sheng, Quanhu; Shyr, Yu; Qu, Jun

    2017-01-30

    Reliable quantification of low-abundance proteins in complex proteomes is challenging, largely owing to the limited number of spectra/peptides identified. In this study we developed a straightforward method to improve the quantitative accuracy and precision of protein measurements by strategically retrieving the less confident peptides that were previously filtered out using the standard target-decoy search strategy. The filtered-out MS/MS spectra matched to confidently identified proteins were recovered, and the peptide-spectrum-match FDR was recalculated and controlled at a confident level of FDR≤1%, while protein FDR was maintained at ~1%. We evaluated the performance of this strategy in both spectral count- and ion current-based methods. A >60% increase in total quantified spectra/peptides was achieved for both a spike-in sample set and a public dataset from CPTAC. Incorporating the peptide retrieval strategy significantly improved quantitative accuracy and precision, especially for low-abundance proteins (e.g. one-hit proteins). Moreover, the capacity to confidently discover significantly altered proteins was also enhanced substantially, as demonstrated with two spike-in datasets. In summary, improved quantitative performance was achieved by this peptide recovery strategy without compromising confidence of protein identification, and the strategy can be readily implemented in a broad range of quantitative proteomics techniques including label-free or labeling approaches. We hypothesize that more quantifiable spectra and peptides in a protein, even including less confident peptides, could help reduce variations and improve protein quantification. Hence the peptide retrieval strategy was developed and evaluated in two spike-in sample sets with different LC-MS/MS variations using both MS1- and MS2-based quantitative approaches. The list of confidently identified proteins using the standard target-decoy search strategy was fixed and more spectra/peptides with less
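
    The FDR control referred to throughout is the standard target-decoy estimate: decoy hits above a score threshold approximate the number of false target hits. A minimal sketch with invented scores:

    ```python
    def psm_fdr(target_scores, decoy_scores, threshold):
        """Peptide-spectrum-match FDR at a score threshold under the
        target-decoy assumption: the number of decoys scoring at or above
        the threshold estimates the number of false targets there."""
        t = sum(s >= threshold for s in target_scores)
        d = sum(s >= threshold for s in decoy_scores)
        return d / t if t else 0.0

    # The retrieval strategy re-applies this estimate to the recovered
    # spectra that match already-confident proteins, keeping FDR <= 1%.
    print(psm_fdr([9.1, 8.4, 7.7, 3.2, 2.9], [4.0, 2.5], 3.0))  # -> 0.25
    ```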

  7. Quantitative measures of healthy aging and biological age

    PubMed Central

    Kim, Sangkyu; Jazwinski, S. Michal

    2015-01-01

    Numerous genetic and non-genetic factors contribute to aging. To facilitate the study of these factors, various descriptors of biological aging, including ‘successful aging’ and ‘frailty’, have been put forth as integrative functional measures of aging. A separate but related quantitative approach is the ‘frailty index’, which has been operationalized and frequently used. Various frailty indices have been constructed. Although based on different numbers and types of health variables, frailty indices possess several common properties that make them useful across different studies. We have been using a frailty index termed FI34 based on 34 health variables. Like other frailty indices, FI34 increases non-linearly with advancing age and is a better indicator of biological aging than chronological age. FI34 has a substantial genetic basis. Using FI34, we found elevated levels of resting metabolic rate linked to declining health in nonagenarians. Using FI34 as a quantitative phenotype, we have also found a genomic region on chromosome 12 that is associated with healthy aging and longevity. PMID:26005669
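
    The frailty-index construction referred to here is simple enough to state in code: the index is the fraction of measured health deficits present, each scored in [0, 1]. A toy sketch with invented items (FI34 uses 34 such variables):

    ```python
    def frailty_index(deficits):
        """Frailty index: sum of deficit scores divided by the number of
        deficits measured. 'deficits' maps each health variable to a
        score in [0, 1]; unmeasured items (None) are excluded from the
        denominator."""
        scored = [v for v in deficits.values() if v is not None]
        return sum(scored) / len(scored)

    # Toy example with 4 of 34 variables -> FI = 1.75 / 4 = 0.4375.
    print(frailty_index({"hypertension": 1, "diabetes": 0,
                         "adl_difficulty": 0.5, "self_rated_health": 0.25}))
    ```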

  8. Retinal status analysis method based on feature extraction and quantitative grading in OCT images.

    PubMed

    Fu, Dongmei; Tong, Hejun; Zheng, Shuang; Luo, Ling; Gao, Fulin; Minar, Jiri

    2016-07-22

    Optical coherence tomography (OCT) is widely used in ophthalmology for viewing the morphology of the retina, which is important for disease detection and for assessing therapeutic effect. The diagnosis of retinal diseases is based primarily on the subjective analysis of OCT images by trained ophthalmologists. This paper describes an automatic analysis method for OCT images intended for computer-aided disease diagnosis, a critical part of eye fundus diagnosis. This study analyzed 300 OCT images acquired by an Optovue Avanti RTVue XR (Optovue Corp., Fremont, CA). First, a normal retinal reference model based on retinal boundaries was constructed. Subsequently, two kinds of quantitative methods based on geometric features and morphological features were proposed. A retinal abnormality grading decision-making method is put forward and applied in the analysis and evaluation of multiple OCT images. The analysis process is shown in detail for four retinal OCT images with different degrees of abnormality, and the final grading results verified that the analysis method can distinguish abnormality severity and lesion regions. In a simulation on 150 test images, the analysis of retinal status achieved a sensitivity of 0.94 and a specificity of 0.92. The proposed method can speed up the diagnostic process and objectively evaluate retinal status. This paper thus presents a retinal status analysis method based on feature extraction and quantitative grading in OCT images; the method obtains the parameters and features associated with retinal morphology, and their quantitative analysis and evaluation, combined with the reference model, enable abnormality judgment of the target image and provide a reference for disease diagnosis.

  9. A disciplined approach to capital: today's healthcare imperative.

    PubMed

    Dupuis, Patrick J; Kaufman, Kenneth

    2007-07-01

    BJC HealthCare's experience exemplifies several basic principles of a finance-based approach to capital. Organizations that adopt this approach look to improve processes first, remove costs second, and spend capital last. Multiyear planning is required to quantitatively identify the profitability and liquidity requirements of strategic initiatives and address essential funding and financing issues.

  10. Quantitative breast tissue characterization using grating-based x-ray phase-contrast imaging

    NASA Astrophysics Data System (ADS)

    Willner, M.; Herzen, J.; Grandl, S.; Auweter, S.; Mayr, D.; Hipp, A.; Chabior, M.; Sarapata, A.; Achterhold, K.; Zanette, I.; Weitkamp, T.; Sztrókay, A.; Hellerhoff, K.; Reiser, M.; Pfeiffer, F.

    2014-04-01

    X-ray phase-contrast imaging has received growing interest in recent years due to its high capability in visualizing soft tissue. Breast imaging has become the focus of particular attention, as it is considered the most promising candidate for a first clinical application of this contrast modality. In this study, we investigate quantitative breast tissue characterization using grating-based phase-contrast computed tomography (CT) at conventional polychromatic x-ray sources. Different breast specimens were scanned at a laboratory phase-contrast imaging setup and correlated with histopathology. Ascertained tumor types include phylloides tumor, fibroadenoma and infiltrating lobular carcinoma. The identified tissue types, comprising adipose, fibroglandular and tumor tissue, have been analyzed in terms of phase-contrast Hounsfield units and are compared to high-quality, high-resolution data obtained with monochromatic synchrotron radiation, as well as to calculated values based on tabulated tissue properties. The results give a good impression of the method’s prospects and limitations for potential tumor detection and of the associated demands on such a phase-contrast breast CT system. Furthermore, the evaluated quantitative tissue values serve as a reference for simulations and for the design of dedicated phantoms for phase-contrast mammography.
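
    In the grating-based phase-contrast literature, phase-contrast Hounsfield units are typically defined by analogy with conventional HU, with the refractive index decrement δ taking the place of the attenuation coefficient. A small sketch of that convention; the value used for δ of water is an energy-dependent, illustrative placeholder rather than a figure from this paper.

        def hu_phase(delta, delta_water=2.6e-7):
            """HU_p = 1000 * (delta - delta_water) / delta_water, the
            phase-contrast analogue of conventional Hounsfield units.
            delta_water here is an illustrative placeholder; its actual
            value depends on the x-ray energy."""
            return 1000.0 * (delta - delta_water) / delta_water

        print(hu_phase(2.8e-7))  # tissue slightly denser than water -> ~77 HU_p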

  11. Experiences and expectations of women with urogenital prolapse: a quantitative and qualitative exploration.

    PubMed

    Srikrishna, S; Robinson, D; Cardozo, L; Cartwright, R

    2008-10-01

    To explore the expectations and goals of women undergoing surgery for urogenital prolapse, using both a quantitative quality-of-life approach exploring symptom bother and a qualitative interview-based approach exploring patient goals and expectations. Prospective observational study. Tertiary referral centre for urogynaecology. Forty-three women with symptomatic pelvic organ prolapse were recruited from the waiting list for pelvic floor reconstructive surgery. All women were assessed with a structured clinical interview on an individual basis. The data obtained were transcribed verbatim and then analysed thematically based on grounded theory; individual codes and subcodes were identified to develop a coding framework. The prolapse quality-of-life (pQoL) questionnaire was used to determine the impact of pelvic organ prolapse on each woman's daily life. We arbitrarily classified 'bother' as minimal, mild, moderate or marked for scores of 0-25, 25-50, 50-75 and 75-100, respectively. The degree of prolapse was objectively quantified using the pelvic organ prolapse quantification (POP-Q) system. Quantitative data were analysed using SPSS. Ethical approval was obtained from the Kings College Hospital Ethics Committee. Outcome measures were quantitative data from POP-Q, subjective data from pQoL and qualitative data from the structured clinical interview. Forty-three women were recruited over the first year of the study. Their mean age was 56 years (range 36-78) and mean parity was 2 (range 0-6). The mean ordinal stage of prolapse was 2 (range stages 1-4). Quantitative analysis of the pQoL data suggested that the main domains affected were prolapse impact on life (mean score 74.71) and personal relationships (mean score 46.66). Qualitative analysis based on the clinical interview suggested that these women were most affected by the actual physical symptoms of prolapse (bulge, pain and bowel problems) as well as by the impact prolapse has on their sexual function. While
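
    The arbitrary bother bands translate directly into a small classification rule. In the Python sketch below, the shared boundary values (25, 50, 75) are assigned to the higher band; that choice is an assumption, since the quoted ranges overlap at their endpoints.

        def bother_category(score):
            """Map a 0-100 pQoL domain score to the study's bother bands."""
            if not 0 <= score <= 100:
                raise ValueError("pQoL scores run from 0 to 100")
            if score < 25:
                return "minimal"
            if score < 50:
                return "mild"
            if score < 75:
                return "moderate"
            return "marked"

        print(bother_category(74.71))  # impact-on-life mean score -> 'moderate'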

  12. PCR-free quantitative detection of genetically modified organism from raw materials. An electrochemiluminescence-based bio bar code method.

    PubMed

    Zhu, Debin; Tang, Yabing; Xing, Da; Chen, Wei R

    2008-05-15

    A bio bar code assay based on oligonucleotide-modified gold nanoparticles (Au-NPs) provides a PCR-free method for quantitative detection of nucleic acid targets. However, the current bio bar code assay requires lengthy experimental procedures, including the preparation and release of bar code DNA probes from the target-nanoparticle complex and the immobilization and hybridization of the probes for quantification. Herein, we report a novel PCR-free electrochemiluminescence (ECL)-based bio bar code assay for the quantitative detection of genetically modified organisms (GMOs) in raw materials. It consists of tris-(2,2'-bipyridyl) ruthenium (TBR)-labeled bar code DNA, nucleic acid hybridization using Au-NPs and biotin-labeled probes, and selective capture of the hybridization complex by streptavidin-coated paramagnetic beads. The detection of target DNA is realized by direct measurement of the ECL emission of TBR. The assay can quantitatively detect target nucleic acids with high speed and sensitivity, and can be used to quantitatively detect GMO fragments in real GMO products.
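
    The abstract does not spell out the calibration step, but quantitative readout from an ECL signal typically runs through a standard curve. A hypothetical Python sketch of that generic step follows; all standards and signal values are invented for illustration.

        import numpy as np

        # Hypothetical standards: known target amounts vs. measured ECL counts.
        standards_amol = np.array([10.0, 50.0, 100.0, 500.0, 1000.0])
        ecl_counts = np.array([210.0, 980.0, 1950.0, 9800.0, 19500.0])

        slope, intercept = np.polyfit(standards_amol, ecl_counts, 1)  # linear fit
        unknown_counts = 4200.0
        print((unknown_counts - intercept) / slope)  # estimated target amount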

  13. A qualitative and quantitative laser-based computer-aided flow visualization method. M.S. Thesis, 1992 Final Report

    NASA Technical Reports Server (NTRS)

    Canacci, Victor A.; Braun, M. Jack

    1994-01-01

    The experimental approach presented here offers a nonintrusive, qualitative and quantitative evaluation of full-field flow patterns, applicable in various geometries and in a variety of fluids. This Full Flow Field Tracking (FFFT) Particle Image Velocimetry (PIV) technique, by means of particle tracers illuminated by a laser light sheet, offers an alternative to Laser Doppler Velocimetry (LDV) and to intrusive systems such as hot wire/film anemometry. The method makes the flow patterns obtainable and allows quantitative determination of the velocities, accelerations, and mass flows of an entire flow field. It uses a computer-based digitizing system attached through an imaging board to a low-luminosity camera. A customized optical train turns the system into a long-distance microscope (LDM), permitting magnifications of areas of interest of up to 100 times. Presented in addition to the method itself are studies in which the flow patterns and velocities were observed and evaluated in three distinct geometries, with three different working fluids. The first study involved pressure and flow analysis of a brush seal in oil. The next application involved studying the velocity and flow patterns in a cowl lip cooling passage of an air-breathing aircraft engine, using water as the working fluid. Finally, the method was extended to a study in air to examine the flows in a staggered pin arrangement located on one side of a branched duct.
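
    The quantitative core of PIV is cross-correlation of paired interrogation windows: the correlation peak gives the particle displacement, which, divided by the frame interval and scaled by the magnification, yields velocity. A minimal Python sketch of that principle; windowing, sub-pixel peak fitting and outlier validation used in production PIV are omitted.

        import numpy as np

        def piv_velocity(win_a, win_b, dt_s, m_per_px):
            """One velocity vector from two interrogation windows (2-D arrays)
            recorded dt_s seconds apart, via FFT cross-correlation."""
            a = win_a - win_a.mean()
            b = win_b - win_b.mean()
            corr = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real
            dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
            ny, nx = corr.shape
            if dy > ny // 2:
                dy -= ny  # unwrap circular shifts larger than half the window
            if dx > nx // 2:
                dx -= nx
            return dx * m_per_px / dt_s, dy * m_per_px / dt_s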

  14. Professional Learning: A Fuzzy Logic-Based Modelling Approach

    ERIC Educational Resources Information Center

    Gravani, M. N.; Hadjileontiadou, S. J.; Nikolaidou, G. N.; Hadjileontiadis, L. J.

    2007-01-01

    Studies have suggested that professional learning is influenced by two key parameters, i.e., climate and planning, and their associated variables (mutual respect, collaboration, mutual trust, supportiveness, openness). In this paper, we analyse these relationships using the proposed quantitative, fuzzy logic-based model and a series of…
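
    The abstract is truncated, but the flavour of fuzzy logic-based modelling can be shown with a membership function mapping a crisp rating of a variable such as 'collaboration' to a degree of membership. The breakpoints in this Python sketch are purely illustrative, not taken from the paper.

        def tri(x, a, b, c):
            """Triangular fuzzy membership: 0 outside (a, c), peaking at b."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

        # degree to which a collaboration rating of 7.5/10 counts as 'high'
        print(tri(7.5, 5.0, 10.0, 15.0))  # -> 0.5 with these breakpoints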

  15. Stable isotopic labeling-based quantitative targeted glycomics (i-QTaG).

    PubMed

    Kim, Kyoung-Jin; Kim, Yoon-Woo; Kim, Yun-Gon; Park, Hae-Min; Jin, Jang Mi; Hwan Kim, Young; Yang, Yung-Hun; Kyu Lee, Jun; Chung, Junho; Lee, Sun-Gu; Saghatelian, Alan

    2015-01-01

    Mass spectrometry (MS) analysis combined with stable isotopic labeling is a promising method for the relative quantification of aberrant glycosylation in diseases and disorders. We developed a stable isotopic labeling-based quantitative targeted glycomics (i-QTaG) technique for the comparative and quantitative analysis of total N-glycans using matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS). We established the analytical procedure, with chemical derivatizations (i.e., sialic acid neutralization and stable isotopic labeling) of N-glycans, using a model glycoprotein (bovine fetuin). Moreover, i-QTaG with MALDI-TOF MS was evaluated at various molar ratios (1:1, 1:2, 1:5) of (13)C6/(12)C6-2-aminobenzoic acid-labeled glycans from normal human serum. Finally, the method was applied to a direct comparison of total N-glycan profiles between sera from normal humans (n = 8) and from prostate cancer patients (n = 17). The intensities of the N-glycan peaks from the i-QTaG method showed good linearity (R(2) > 0.99) with the amount of bovine fetuin glycoprotein. The relative intensity ratios between the isotopically 2-AA-labeled N-glycans were close to the theoretical molar ratios (1:1, 1:2, 1:5). We also demonstrated up-regulation of the Lewis antigen (~82%) in sera from prostate cancer patients. In this proof-of-concept study, we showed that the i-QTaG method, which enables reliable comparative quantitation of total N-glycans via MALDI-TOF MS analysis, has the potential to diagnose and monitor alterations in glycosylation associated with disease states or biotherapeutics. © 2015 American Institute of Chemical Engineers.
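
    Relative quantification in this scheme reduces to heavy/light peak-area ratios for each (13)C6/(12)C6-2-AA-labelled glycan pair, which should track the mixed molar ratios. A minimal Python sketch with invented peak areas:

        import numpy as np

        def isotope_ratios(light_areas, heavy_areas):
            """Per-glycan heavy/light peak-area ratios from MALDI-TOF spectra."""
            light = np.asarray(light_areas, dtype=float)
            heavy = np.asarray(heavy_areas, dtype=float)
            return heavy / light

        # hypothetical areas for a 1:2 (12)C6:(13)C6 mix -> ratios near 2.0
        print(isotope_ratios([1.0e4, 2.1e4], [2.0e4, 4.0e4]))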

  16. Quantitative nanohistological investigation of scleroderma: an atomic force microscopy-based approach to disease characterization

    PubMed Central

    Strange, Adam P; Aguayo, Sebastian; Ahmed, Tarek; Mordan, Nicola; Stratton, Richard; Porter, Stephen R; Parekh, Susan; Bozec, Laurent

    2017-01-01

    Scleroderma (or systemic sclerosis, SSc) is a disease caused by excess crosslinking of collagen. The skin stiffens and becomes painful, while internally, organ function can be compromised by the less elastic collagen. Diagnosis of SSc is often only possible in advanced cases, by which time the window for treatment is limited. A more detailed analysis of SSc may provide better future treatment options and information on disease progression. Recently, the histological stain picrosirius red, which highlights collagen, has been combined with atomic force microscopy (AFM) to study SSc. Skin from healthy individuals and SSc patients was biopsied, stained and studied using AFM. By investigating the crosslinking of collagen at a smaller hierarchical stage, the effects of SSc were more pronounced. Changes in morphology and Young’s elastic modulus were observed and quantified, giving rise to a novel technique we have termed “quantitative nanohistology”. An increase in nanoscale collagen stiffness in SSc compared with healthy individuals was seen as a significant increase in the Young’s modulus profile of the collagen. These markers of stiffer collagen in SSc are consistent with the symptoms experienced by patients, giving additional hope that in the future, nanohistology using AFM can be readily applied as a clinical tool, providing detailed information on the state of collagen. PMID:28138238
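
    AFM stiffness values of this kind are conventionally extracted by fitting force-indentation data to a contact model such as Hertz. The Python sketch below inverts the Hertz relation for a spherical tip at a single point; the paper does not state its exact fitting procedure, so this is a generic illustration with placeholder numbers.

        import math

        def hertz_modulus(force_n, indent_m, tip_radius_m, poisson=0.5):
            """Young's modulus from one force-indentation point, Hertz model
            for a spherical tip: F = (4/3) * E/(1 - nu^2) * sqrt(R) * d**1.5."""
            return (3 * force_n * (1 - poisson ** 2)
                    / (4 * math.sqrt(tip_radius_m) * indent_m ** 1.5))

        # e.g. 2 nN at 50 nm indentation with a 20 nm tip radius -> ~0.7 MPa
        print(hertz_modulus(2e-9, 50e-9, 20e-9) / 1e6, "MPa")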

  17. Application of stakeholder-based and modelling approaches for supporting robust adaptation decision making under future climatic uncertainty and changing urban-agricultural water demand

    NASA Astrophysics Data System (ADS)

    Bhave, Ajay; Dessai, Suraje; Conway, Declan; Stainforth, David

    2016-04-01

    Deep uncertainty in future climate change and socio-economic conditions necessitates the use of assess-risk-of-policy approaches, rather than predict-then-act approaches, for adaptation decision making. Robust Decision Making (RDM) approaches embody this principle and help evaluate the ability of adaptation options to satisfy stakeholder preferences under wide-ranging future conditions. This study involves the simultaneous application of two RDM approaches, one qualitative and one quantitative, in the Cauvery River Basin in Karnataka (population ~23 million), India. The study aims to (a) determine robust water resources adaptation options for the 2030s and 2050s and (b) compare the usefulness of a qualitative stakeholder-driven approach with a quantitative modelling approach. To develop a large set of future scenarios, a combination of climatic and socio-economic narratives was used. Climatic narratives were developed through structured elicitation with a group of experts on the Indian Summer Monsoon, and socio-economic narratives were developed to reflect potential future urban and agricultural water demand. In the qualitative RDM approach, a stakeholder workshop helped elicit key vulnerabilities, water resources adaptation options and performance criteria for evaluating options. During a second workshop, stakeholders discussed and evaluated adaptation options against the performance criteria for a large number of scenarios of climatic and socio-economic change in the basin. In the quantitative RDM approach, a Water Evaluation And Planning (WEAP) model was forced with precipitation and evapotranspiration data consistent with the climatic narratives, together with water demand data based on the socio-economic narratives. We find that, compared to business-as-usual conditions, options addressing urban water demand satisfy performance criteria across scenarios and provide co-benefits such as energy savings and reduced groundwater depletion, while options reducing
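
    One common quantitative summary in the RDM spirit is a satisficing score: the fraction of scenarios in which an option meets every performance criterion. The study's own criteria were elicited from stakeholders; the Python sketch below is a generic illustration with invented criteria and thresholds.

        def robustness(option_scores, thresholds):
            """option_scores: one dict of criterion -> value per scenario
            (higher = better); thresholds: criterion -> minimum acceptable."""
            ok = sum(all(s[c] >= t for c, t in thresholds.items())
                     for s in option_scores)
            return ok / len(option_scores)

        scenarios = [{"reliability": 0.9, "co_benefit": 0.7},
                     {"reliability": 0.6, "co_benefit": 0.8}]
        print(robustness(scenarios, {"reliability": 0.7, "co_benefit": 0.5}))  # 0.5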

  18. Quantitation of 47 human tear proteins using high resolution multiple reaction monitoring (HR-MRM) based-mass spectrometry.

    PubMed

    Tong, Louis; Zhou, Xi Yuan; Jylha, Antti; Aapola, Ulla; Liu, Dan Ning; Koh, Siew Kwan; Tian, Dechao; Quah, Joanne; Uusitalo, Hannu; Beuerman, Roger W; Zhou, Lei

    2015-02-06

    Tear proteins are intimately related to the pathophysiology of the ocular surface. Many recent studies have demonstrated that the tear film is an accessible fluid for studying eye diseases and for biomarker discovery. This study describes a high resolution multiple reaction monitoring (HR-MRM) approach for developing assays to quantify biologically important tear proteins. Human tear samples were collected from 1000 subjects with no eye complaints (411 male, 589 female; average age 55.5 ± 14.5 years) after obtaining informed consent. Tear samples were collected using Schirmer's strips and pooled into a single global control sample. Quantification of proteins was carried out by selecting "signature" peptides derived by trypsin digestion. A 1-h nanoLC-MS/MS run was used to quantify the tear proteins in HR-MRM mode. Good reproducibility of signal intensity (using peak areas) was demonstrated for all 47 HR-MRM assays, with an average coefficient of variation (CV%) of 4.82% (range: 1.52-10.30%). All assays showed consistent retention times, with a CV of less than 0.80% (average: 0.57%). HR-MRM absolute quantitation of eight tear proteins was demonstrated using stable isotope-labeled peptides. In this study, we demonstrated for the first time a technique to quantify 47 human tear proteins in HR-MRM mode using approximately 1 μl of human tear sample. These multiplexed HR-MRM-based assays show great promise for further development toward biomarker validation in human tear samples. Both discovery-based and targeted quantitative proteomics can be achieved on a single quadrupole time-of-flight mass spectrometer platform (TripleTOF 5600 system). Copyright © 2015 Elsevier B.V. All rights reserved.
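
    The two quantitative summaries reported here, reproducibility as CV% and absolute amounts against stable isotope-labelled standards, reduce to short calculations; in the Python sketch below, the peak areas and spiked amount are placeholders rather than values from the study.

        import statistics

        def cv_percent(peak_areas):
            """Coefficient of variation (%) across replicate peak areas."""
            return 100 * statistics.stdev(peak_areas) / statistics.mean(peak_areas)

        def endogenous_fmol(light_area, heavy_area, spiked_fmol):
            """Absolute quantitation: light/heavy area ratio times the known
            amount of spiked stable isotope-labelled peptide."""
            return light_area / heavy_area * spiked_fmol

        print(cv_percent([980.0, 1015.0, 1002.0]))  # ~1.8% CV
        print(endogenous_fmol(5.2e5, 2.6e5, 10.0))  # -> 20.0 fmol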

  19. Monitoring with Trackers Based on Semi-Quantitative Models

    NASA Technical Reports Server (NTRS)

    Kuipers, Benjamin

    1997-01-01

    In three years of NASA-sponsored research preceding this project, we successfully developed a technology for: (1) building qualitative and semi-quantitative models from libraries of model fragments; (2) simulating these models to predict future behaviors, with the guarantee that all possible behaviors are covered; and (3) assimilating observations into behaviors, shrinking uncertainty so that incorrect models are eventually refuted and correct models make stronger predictions for the future. In our object-oriented framework, a tracker is an object which embodies the hypothesis that the available observation stream is consistent with a particular behavior of a particular model. The tracker maintains its own status (consistent, superseded, or refuted), and answers questions about its explanation for past observations and its predictions for the future. In the MIMIC approach to monitoring of continuous systems, a number of trackers are active in parallel, representing alternate hypotheses about the behavior of a system. This approach is motivated by the need to avoid 'system accidents' [Perrow, 1985] due to operator fixation on a single hypothesis, as for example at Three Mile Island. As we began to address these issues, we focused on three major research directions that we planned to pursue over a three-year project: (1) tractable qualitative simulation, (2) semi-quantitative inference, and (3) tracking-set management. Unfortunately, funding limitations made it impossible to continue past year one. Nonetheless, we made major progress in the first two of these areas. Progress in the third area was slower because the graduate student working on that aspect of the project decided to leave school and take a job in industry. I have enclosed a set of abstracts of selected papers on the work described below. Several papers that draw on the research supported during this period appeared in print after the grant period ended.
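
    The tracker abstraction translates naturally into an object with a status and an assimilation step. The Python sketch below simplifies the semi-quantitative predictions to numeric intervals and omits the 'superseded' status, so it is an illustrative reduction of the MIMIC idea rather than the project's actual code.

        class Tracker:
            """One hypothesis: the observation stream is consistent with a
            particular behavior of a particular model."""

            def __init__(self, name, predict):
                self.name = name        # which model/behavior is hypothesized
                self.predict = predict  # time -> (low, high) predicted interval
                self.status = "consistent"

            def assimilate(self, t, obs):
                low, high = self.predict(t)
                if not low <= obs <= high:
                    self.status = "refuted"  # observation outside prediction
                return self.status

        # several trackers run in parallel, one per hypothesis
        trackers = [Tracker("nominal", lambda t: (0.9, 1.1)),
                    Tracker("leak", lambda t: (0.5, 0.8))]
        for tr in trackers:
            tr.assimilate(1.0, 0.72)
        print({tr.name: tr.status for tr in trackers})  # only 'leak' survives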

  20. Quantification of fossil organic matter in contaminated sediments from an industrial watershed: validation of the quantitative multimolecular approach by radiocarbon analysis.

    PubMed

    Jeanneau, Laurent; Faure, Pierre

    2010-09-01

    The quantitative multimolecular approach (QMA), based on exhaustive identification and quantification of molecules from the extractable organic matter (EOM), has recently been developed in order to investigate organic contamination in sediments by a more complete method than the restrictive quantification of target contaminants. Such an approach allows (i) comparison between natural and anthropogenic inputs, (ii) comparison between modern and fossil organic matter and (iii) differentiation between several anthropogenic sources. However, QMA is based on the quantification of molecules recovered by organic solvent and then analyzed by gas chromatography-mass spectrometry, which represent only a small fraction of sedimentary organic matter (SOM). In order to extend the conclusions of QMA to SOM, radiocarbon analyses have been performed on organic extracts and decarbonated sediments. This analysis allows (i) differentiation between modern biomass (contemporary (14)C) and fossil organic matter ((14)C-free) and (ii) calculation of the percentage of modern carbon (PMC). At the confluence of the Fensch and Moselle Rivers, a catchment highly contaminated by both industrial activities and urbanization, PMC values in decarbonated sediments are well correlated with the percentage of natural molecular markers determined by QMA. This highlights that, for this type of contamination by fossil organic matter inputs, the conclusions of QMA can be scaled up to SOM. QMA is thus an efficient environmental diagnostic tool that leads to a more realistic quantification of fossil organic matter in sediments. Copyright 2010 Elsevier B.V. All rights reserved.
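
    Because fossil organic matter is (14)C-free (0 pMC) and modern biomass sits near 100 pMC, the percentage of modern carbon converts directly into a fossil-carbon fraction under a two-end-member mixing assumption, as in this short Python sketch.

        def fossil_fraction(pmc, modern_end_member=100.0):
            """Two-end-member mixing: fossil OM contributes 0 pMC, modern
            biomass ~100 pMC (the exact modern value is an assumption)."""
            return 1.0 - pmc / modern_end_member

        print(fossil_fraction(35.0))  # a sediment at 35 pMC is ~65% fossil carbon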