DOE Office of Scientific and Technical Information (OSTI.GOV)
Gritsenko, Marina A.; Xu, Zhe; Liu, Tao
Comprehensive, quantitative information on abundances of proteins and their post-translational modifications (PTMs) can potentially provide novel biological insights into disease pathogenesis and therapeutic intervention. Herein, we introduce a quantitative strategy utilizing isobaric stable isotope-labeling techniques combined with two-dimensional liquid chromatography-tandem mass spectrometry (2D-LC-MS/MS) for large-scale, deep quantitative proteome profiling of biological samples or clinical specimens such as tumor tissues. The workflow includes isobaric labeling of tryptic peptides for multiplexed and accurate quantitative analysis, basic reversed-phase LC fractionation and concatenation for reduced sample complexity, and nano-LC coupled to high resolution and high mass accuracy MS analysis for high confidence identification and quantification of proteins. This proteomic analysis strategy has been successfully applied for in-depth quantitative proteomic analysis of tumor samples, and can also be used for integrated proteome and PTM characterization, as well as comprehensive quantitative proteomic analysis across samples from large clinical cohorts.
Gritsenko, Marina A; Xu, Zhe; Liu, Tao; Smith, Richard D
2016-01-01
Comprehensive, quantitative information on abundances of proteins and their posttranslational modifications (PTMs) can potentially provide novel biological insights into disease pathogenesis and therapeutic intervention. Herein, we introduce a quantitative strategy utilizing isobaric stable isotope-labeling techniques combined with two-dimensional liquid chromatography-tandem mass spectrometry (2D-LC-MS/MS) for large-scale, deep quantitative proteome profiling of biological samples or clinical specimens such as tumor tissues. The workflow includes isobaric labeling of tryptic peptides for multiplexed and accurate quantitative analysis, basic reversed-phase LC fractionation and concatenation for reduced sample complexity, and nano-LC coupled to high resolution and high mass accuracy MS analysis for high confidence identification and quantification of proteins. This proteomic analysis strategy has been successfully applied for in-depth quantitative proteomic analysis of tumor samples and can also be used for integrated proteome and PTM characterization, as well as comprehensive quantitative proteomic analysis across samples from large clinical cohorts.
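To make the fractionation-and-concatenation step above concrete: basic reversed-phase fractions are commonly pooled at a fixed stride so that each concatenated sample mixes early-, mid- and late-eluting peptides. The Python sketch below illustrates one generic concatenation scheme with hypothetical fraction counts; it is not the authors' exact protocol.

```python
def concatenate_fractions(n_collected=96, n_pooled=24):
    """Pool basic reversed-phase LC fractions at a fixed stride so each
    pooled sample combines early-, mid-, and late-eluting peptides,
    reducing complexity while keeping orthogonality to the nano-LC step.
    Returns one list of collected-fraction indices per pooled sample."""
    pools = [[] for _ in range(n_pooled)]
    for fraction in range(n_collected):
        pools[fraction % n_pooled].append(fraction)
    return pools

# Example: collected fractions 0, 24, 48 and 72 end up in pooled sample 0.
print(concatenate_fractions()[0])
```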
Jiang, Shun-Yuan; Sun, Hong-Bing; Sun, Hui; Ma, Yu-Ying; Chen, Hong-Yu; Zhu, Wen-Tao; Zhou, Yi
2016-03-01
This paper aims to explore a comprehensive assessment method combining traditional Chinese medicinal material specifications with quantitative quality indicators. Seventy-six samples of Notopterygii Rhizoma et Radix were collected from markets and producing areas. Traditional commercial specifications were described and assigned, and 10 chemical components and the volatile oil content were determined for each sample. Cluster analysis, Fisher discriminant analysis and correspondence analysis were used to establish the relationship between the traditional qualitative commercial specifications and the quantitative chemical indices, to comprehensively evaluate the quality of the medicinal materials, and to classify commercial and quality grades quantitatively. A herb quality index (HQI) combining traditional commercial specifications and chemical components for quantitative grade classification was established, and the corresponding discriminant functions were derived for precise determination of the quality grade and sub-grade of Notopterygii Rhizoma et Radix. The results showed that notopterol, isoimperatorin and volatile oil were the major components for determination of chemical quality, and their dividing values were specified for every grade and sub-grade of the commercial materials of Notopterygii Rhizoma et Radix. According to these results, the essential relationship between traditional medicinal indicators, qualitative commercial specifications, and quantitative chemical composition indicators can be examined by K-means clustering, Fisher discriminant analysis and correspondence analysis, providing a new method for comprehensive quantitative evaluation of traditional Chinese medicine quality that integrates traditional commodity specifications with modern quantitative chemical indices. Copyright© by the Chinese Pharmaceutical Association.
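As a rough illustration of the statistical workflow named above (cluster analysis followed by Fisher discriminant functions on the chemical indices), the sketch below applies scikit-learn to hypothetical notopterol, isoimperatorin and volatile-oil values; the data, cluster count and resulting functions are illustrative assumptions, not the published HQI thresholds.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
# Hypothetical contents of notopterol, isoimperatorin and volatile oil
# (columns) for 76 market/producing-area samples.
X = rng.gamma(shape=2.0, scale=1.0, size=(76, 3))

# Step 1: unsupervised grouping into three candidate quality grades.
grades = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# Step 2: Fisher discriminant functions for assigning new samples to a grade.
lda = LinearDiscriminantAnalysis().fit(X, grades)
new_sample = [[2.1, 1.3, 0.8]]          # hypothetical measurements
print(lda.predict(new_sample))          # predicted quality grade
print(lda.coef_, lda.intercept_)        # the fitted discriminant functions
```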
Examining the Effects of Classroom Discussion on Students' Comprehension of Text: A Meta-Analysis
ERIC Educational Resources Information Center
Murphy, P. Karen; Wilkinson, Ian A. G.; Soter, Anna O.; Hennessey, Maeghan N.; Alexander, John F.
2009-01-01
The role of classroom discussions in comprehension and learning has been the focus of investigations since the early 1960s. Despite this long history, no syntheses have quantitatively reviewed the vast body of literature on classroom discussions for their effects on students' comprehension and learning. This comprehensive meta-analysis of…
ERIC Educational Resources Information Center
Lan, Yi-Chin; Lo, Yu-Ling; Hsu, Ying-Shao
2014-01-01
Comprehension is the essence of reading. Finding appropriate and effective reading strategies to support students' reading comprehension has always been a critical issue for educators. This article presents findings from a meta-analysis of 17 studies of metacognitive strategy instruction on students' reading comprehension in computerized…
ERIC Educational Resources Information Center
Currier, Joseph M.; Neimeyer, Robert A.; Berman, Jeffrey S.
2008-01-01
Previous quantitative reviews of research on psychotherapeutic interventions for bereaved persons have yielded divergent findings and have not included many of the available controlled outcome studies. This meta-analysis summarizes results from 61 controlled studies to offer a more comprehensive integration of this literature. This review examined…
Nicolotti, Luca; Cordero, Chiara; Cagliero, Cecilia; Liberto, Erica; Sgorbini, Barbara; Rubiolo, Patrizia; Bicchi, Carlo
2013-10-10
The study proposes an investigation strategy that simultaneously provides detailed profiling and quantitative fingerprinting of food volatiles, through a "comprehensive" analytical platform that includes sample preparation by Headspace Solid Phase Microextraction (HS-SPME), separation by two-dimensional comprehensive gas chromatography coupled with mass spectrometry detection (GC×GC-MS) and data processing using advanced fingerprinting approaches. Experiments were carried out on roasted hazelnuts and on Gianduja pastes (sugar, vegetable oil, hazelnuts, cocoa, nonfat dried milk, vanilla flavorings) and demonstrated that the information potential of each analysis can better be exploited if suitable quantitation methods are applied. Quantitation approaches through Multiple Headspace Extraction and Standard Addition were compared in terms of performance parameters (linearity, precision, accuracy, Limit of Detection and Limit of Quantitation) under headspace linearity conditions. The results on 19 key analytes, potent odorants, and technological markers, and more than 300 fingerprint components, were used for further processing to obtain information concerning the effect of the matrix on volatile release, and to produce an informative chemical blueprint for use in sensomics and flavoromics. The importance of quantitation approaches in headspace analysis of solid matrices of complex composition, and the advantages of MHE, are also critically discussed. Copyright © 2013 Elsevier B.V. All rights reserved.
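Multiple Headspace Extraction (MHE) quantitation relies on the analyte being depleted exponentially over successive extractions of the same vial, so ln(peak area) decreases linearly with extraction number and the total area follows from a geometric-series sum. A minimal Python sketch of that standard calculation, with made-up peak areas:

```python
import numpy as np

def mhe_total_area(areas):
    """Estimate the total analyte peak area from consecutive MHE extractions.
    Assumes exponential depletion: A_i = A_1 * exp(-q*(i-1)), so the sum over
    all extractions is A_1 / (1 - exp(-q)).  The decay constant q comes from
    a linear fit of ln(A_i) versus extraction index."""
    areas = np.asarray(areas, dtype=float)
    i = np.arange(len(areas))
    slope, intercept = np.polyfit(i, np.log(areas), 1)   # slope = -q
    a1 = np.exp(intercept)
    return a1 / (1.0 - np.exp(slope))

# Hypothetical areas of one odorant across four successive extractions:
print(round(mhe_total_area([1000.0, 680.0, 460.0, 315.0]), 1))
```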
Takeda, Hiroaki; Izumi, Yoshihiro; Takahashi, Masatomo; Paxton, Thanai; Tamura, Shohei; Koike, Tomonari; Yu, Ying; Kato, Noriko; Nagase, Katsutoshi; Shiomi, Masashi; Bamba, Takeshi
2018-05-03
Lipidomics, the mass spectrometry-based comprehensive analysis of lipids, has attracted attention as an analytical approach to provide novel insight into lipid metabolism and to search for biomarkers. However, an ideal method for both comprehensive and quantitative analysis of lipids has not been fully developed. Herein, we have proposed a practical methodology for widely-targeted quantitative lipidome analysis using supercritical fluid chromatography fast-scanning triple-quadrupole mass spectrometry (SFC/QqQMS) and theoretically calculated a comprehensive lipid multiple reaction monitoring (MRM) library. Lipid classes can be separated by SFC with a normal phase diethylamine-bonded silica column with high-resolution, high-throughput, and good repeatability. Structural isomers of phospholipids can be monitored by mass spectrometric separation with fatty acyl-based MRM transitions. SFC/QqQMS analysis with an internal standard-dilution method offers quantitative information for both lipid class and individual lipid molecular species in the same lipid class. Additionally, data acquired using this method has advantages including reduction of misidentification and acceleration of data analysis. Using the SFC/QqQMS system, alteration of plasma lipid levels in myocardial infarction-prone rabbits to the supplementation of eicosapentaenoic acid was first observed. Our developed SFC/QqQMS method represents a potentially useful tool for in-depth studies focused on complex lipid metabolism and biomarker discovery. Published under license by The American Society for Biochemistry and Molecular Biology, Inc.
Alleviating Comprehension Problems in Movies. Working Paper.
ERIC Educational Resources Information Center
Tatsuki, Donna
This paper describes the various barriers to comprehension that learners may encounter when viewing feature films in a second language. Two clusters of interfacing factors that may contribute to comprehension hot spots emerged from a quantitative analysis of problems noted in student logbooks. One cluster had a strong acoustic basis, whereas the…
Comprehensive Quantitative Analysis on Privacy Leak Behavior
Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan
2013-01-01
Privacy information is prone to be leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue of cyber security. However, existing approaches can only qualitatively analyze privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely, possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on Privacy Petri Net (PPN). In order to compare the privacy leak behavior among different software, we further propose a comprehensive metric, namely, overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects. PMID:24066046
Comprehensive quantitative analysis on privacy leak behavior.
Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan
2013-01-01
Privacy information is prone to be leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue of cyber security. However, existing approaches can only qualitatively analyze privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely, possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on Privacy Petri Net (PPN). In order to compare the privacy leak behavior among different software, we further propose a comprehensive metric, namely, overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects.
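The abstract does not give the Privacy Petri Net formulas, so the sketch below only illustrates the general idea of an overall leak degree: normalizing the four per-application metrics and combining them into one comparable score. The equal weights and linear aggregation are assumptions for illustration, not the authors' definition.

```python
def overall_leak_degree(possibility, severity, crypticity, manipulability,
                        weights=(0.25, 0.25, 0.25, 0.25)):
    """Combine four per-application metrics (each assumed pre-scaled to
    [0, 1]) into a single comparable score.  Equal weights are an
    illustrative assumption, not the PPN-derived definition."""
    metrics = (possibility, severity, crypticity, manipulability)
    return sum(w * m for w, m in zip(weights, metrics))

# Hypothetical comparison of two applications:
print(overall_leak_degree(0.8, 0.6, 0.4, 0.7))   # application A
print(overall_leak_degree(0.3, 0.9, 0.2, 0.1))   # application B
```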
Transitioning from Targeted to Comprehensive Mass Spectrometry Using Genetic Algorithms.
Jaffe, Jacob D; Feeney, Caitlin M; Patel, Jinal; Lu, Xiaodong; Mani, D R
2016-11-01
Targeted proteomic assays are becoming increasingly popular because of their robust quantitative applications enabled by internal standardization, and they can be routinely executed on high performance mass spectrometry instrumentation. However, these assays are typically limited to 100s of analytes per experiment. Considerable time and effort are often expended in obtaining and preparing samples prior to targeted analyses. It would be highly desirable to detect and quantify 1000s of analytes in such samples using comprehensive mass spectrometry techniques (e.g., SWATH and DIA) while retaining a high degree of quantitative rigor for analytes with matched internal standards. Experimentally, it is facile to port a targeted assay to a comprehensive data acquisition technique. However, data analysis challenges arise from this strategy concerning agreement of results from the targeted and comprehensive approaches. Here, we present the use of genetic algorithms to overcome these challenges in order to configure hybrid targeted/comprehensive MS assays. The genetic algorithms are used to select precursor-to-fragment transitions that maximize the agreement in quantification between the targeted and the comprehensive methods. We find that the algorithm we used provided across-the-board improvement in the quantitative agreement between the targeted assay data and the hybrid comprehensive/targeted assay that we developed, as measured by parameters of linear models fitted to the results. We also found that the algorithm could perform at least as well as an independently-trained mass spectrometrist in accomplishing this task. We hope that this approach will be a useful tool in the development of quantitative approaches for comprehensive proteomics techniques.
Transitioning from Targeted to Comprehensive Mass Spectrometry Using Genetic Algorithms
NASA Astrophysics Data System (ADS)
Jaffe, Jacob D.; Feeney, Caitlin M.; Patel, Jinal; Lu, Xiaodong; Mani, D. R.
2016-11-01
Targeted proteomic assays are becoming increasingly popular because of their robust quantitative applications enabled by internal standardization, and they can be routinely executed on high performance mass spectrometry instrumentation. However, these assays are typically limited to 100s of analytes per experiment. Considerable time and effort are often expended in obtaining and preparing samples prior to targeted analyses. It would be highly desirable to detect and quantify 1000s of analytes in such samples using comprehensive mass spectrometry techniques (e.g., SWATH and DIA) while retaining a high degree of quantitative rigor for analytes with matched internal standards. Experimentally, it is facile to port a targeted assay to a comprehensive data acquisition technique. However, data analysis challenges arise from this strategy concerning agreement of results from the targeted and comprehensive approaches. Here, we present the use of genetic algorithms to overcome these challenges in order to configure hybrid targeted/comprehensive MS assays. The genetic algorithms are used to select precursor-to-fragment transitions that maximize the agreement in quantification between the targeted and the comprehensive methods. We find that the algorithm we used provided across-the-board improvement in the quantitative agreement between the targeted assay data and the hybrid comprehensive/targeted assay that we developed, as measured by parameters of linear models fitted to the results. We also found that the algorithm could perform at least as well as an independently-trained mass spectrometrist in accomplishing this task. We hope that this approach will be a useful tool in the development of quantitative approaches for comprehensive proteomics techniques.
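A minimal sketch of the kind of genetic algorithm described, for a single peptide: individuals are bit masks over candidate fragment transitions, and fitness is the agreement (here, R² of a linear relationship) between the targeted measurement and the summed DIA signal of the selected transitions across samples. The data are simulated and the GA operators are deliberately simple; this is not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(1)
N_SAMPLES, N_TRANSITIONS = 24, 10

# Hypothetical data for one peptide: targeted quantities per sample, and
# per-transition DIA areas; some transitions are assumed interference-prone.
targeted = rng.lognormal(mean=2.0, sigma=0.5, size=N_SAMPLES)
clean = np.outer(targeted, rng.uniform(0.5, 1.5, N_TRANSITIONS))
noise = rng.lognormal(1.0, 1.0, size=(N_SAMPLES, N_TRANSITIONS)) * (rng.random(N_TRANSITIONS) > 0.6)
dia = clean + noise

def fitness(mask):
    """R^2 between targeted values and the summed DIA signal of the
    selected transitions (higher is better)."""
    if mask.sum() == 0:
        return -1.0
    summed = dia[:, mask.astype(bool)].sum(axis=1)
    r = np.corrcoef(targeted, summed)[0, 1]
    return r * r

def evolve(pop_size=40, generations=60, p_mut=0.1):
    pop = rng.integers(0, 2, size=(pop_size, N_TRANSITIONS))
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        order = np.argsort(scores)[::-1]
        parents = pop[order[: pop_size // 2]]             # truncation selection
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, N_TRANSITIONS)          # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            flip = rng.random(N_TRANSITIONS) < p_mut      # bit-flip mutation
            child = np.where(flip, 1 - child, child)
            children.append(child)
        pop = np.vstack([parents, children])
    best = max(pop, key=fitness)
    return best, fitness(best)

best_mask, best_r2 = evolve()
print("selected transitions:", np.flatnonzero(best_mask), "R^2 =", round(best_r2, 3))
```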
Wu, Yi-Hsuan; Hu, Chia-Wei; Chien, Chih-Wei; Chen, Yu-Ju; Huang, Hsuan-Cheng; Juan, Hsueh-Fen
2013-01-01
ATP synthase is present on the plasma membrane of several types of cancer cells. Citreoviridin, an ATP synthase inhibitor, selectively suppresses the proliferation and growth of lung cancer without affecting normal cells. However, the global effects of targeting ectopic ATP synthase in vivo have not been well defined. In this study, we performed quantitative proteomic analysis using isobaric tags for relative and absolute quantitation (iTRAQ) and provided a comprehensive insight into the complicated regulation by citreoviridin in a lung cancer xenograft model. With high reproducibility of the quantitation, we obtained quantitative proteomic profiling with 2,659 proteins identified. Bioinformatics analysis of the 141 differentially expressed proteins selected by their relative abundance revealed that citreoviridin induces alterations in the expression of glucose metabolism-related enzymes in lung cancer. The up-regulation of enzymes involved in gluconeogenesis and storage of glucose indicated that citreoviridin may reduce the glycolytic intermediates for macromolecule synthesis and inhibit cell proliferation. Using comprehensive proteomics, the results identify metabolic aspects that help explain the antitumorigenic effect of citreoviridin in lung cancer, which may lead to a better understanding of the links between metabolism and tumorigenesis in cancer therapy.
Wu, Yi-Hsuan; Hu, Chia-Wei; Chien, Chih-Wei; Chen, Yu-Ju; Huang, Hsuan-Cheng; Juan, Hsueh-Fen
2013-01-01
ATP synthase is present on the plasma membrane of several types of cancer cells. Citreoviridin, an ATP synthase inhibitor, selectively suppresses the proliferation and growth of lung cancer without affecting normal cells. However, the global effects of targeting ectopic ATP synthase in vivo have not been well defined. In this study, we performed quantitative proteomic analysis using isobaric tags for relative and absolute quantitation (iTRAQ) and provided a comprehensive insight into the complicated regulation by citreoviridin in a lung cancer xenograft model. With high reproducibility of the quantitation, we obtained quantitative proteomic profiling with 2,659 proteins identified. Bioinformatics analysis of the 141 differentially expressed proteins selected by their relative abundance revealed that citreoviridin induces alterations in the expression of glucose metabolism-related enzymes in lung cancer. The up-regulation of enzymes involved in gluconeogenesis and storage of glucose indicated that citreoviridin may reduce the glycolytic intermediates for macromolecule synthesis and inhibit cell proliferation. Using comprehensive proteomics, the results identify metabolic aspects that help explain the antitumorigenic effect of citreoviridin in lung cancer, which may lead to a better understanding of the links between metabolism and tumorigenesis in cancer therapy. PMID:23990911
Li, Weizhe; Germain, Ronald N.
2017-01-01
Organ homeostasis, cellular differentiation, signal relay, and in situ function all depend on the spatial organization of cells in complex tissues. For this reason, comprehensive, high-resolution mapping of cell positioning, phenotypic identity, and functional state in the context of macroscale tissue structure is critical to a deeper understanding of diverse biological processes. Here we report an easy to use method, clearing-enhanced 3D (Ce3D), which generates excellent tissue transparency for most organs, preserves cellular morphology and protein fluorescence, and is robustly compatible with antibody-based immunolabeling. This enhanced signal quality and capacity for extensive probe multiplexing permits quantitative analysis of distinct, highly intermixed cell populations in intact Ce3D-treated tissues via 3D histo-cytometry. We use this technology to demonstrate large-volume, high-resolution microscopy of diverse cell types in lymphoid and nonlymphoid organs, as well as to perform quantitative analysis of the composition and tissue distribution of multiple cell populations in lymphoid tissues. Combined with histo-cytometry, Ce3D provides a comprehensive strategy for volumetric quantitative imaging and analysis that bridges the gap between conventional section imaging and disassociation-based techniques. PMID:28808033
Berendsen, Bjorn J A; Gerritsen, Henk W; Wegh, Robin S; Lameris, Steven; van Sebille, Ralph; Stolker, Alida A M; Nielen, Michel W F
2013-09-01
A comprehensive method for the quantitative residue analysis of trace levels of 22 β-lactam antibiotics, including penicillins, cephalosporins, and carbapenems, in poultry muscle by liquid chromatography in combination with tandem mass spectrometric detection is reported. The samples analyzed for β-lactam residues are hydrolyzed using piperidine in order to improve compound stability and to include the total residue content of the cephalosporin ceftiofur. The reaction procedure was optimized using a full experimental design. Following detailed isotope-labeling and tandem mass spectrometry studies, together with exact mass measurements using high-resolution mass spectrometry, reaction schemes could be proposed for all β-lactams studied. The main reaction occurring is hydrolysis of the β-lactam ring with formation of the piperidine-substituted amide. For some β-lactams, multiple isobaric hydrolysis reaction products are obtained, in accordance with expectations, but this did not hamper quantitative analysis. The final method was fully validated as a quantitative confirmatory residue analysis method according to Commission Decision 2002/657/EC and showed satisfactory quantitative performance for all compounds, with trueness between 80 and 110% and within-laboratory reproducibility below 22% at target level, except for biapenem. For biapenem, the method proved to be suitable for qualitative analysis only.
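Trueness and within-laboratory reproducibility, the two performance figures quoted above, reduce to simple statistics on replicate results at the fortified level. A generic sketch with hypothetical values (the compound, level and numbers are invented):

```python
import numpy as np

def trueness_percent(measured, nominal):
    """Mean measured concentration expressed as a percentage of the fortified level."""
    return 100.0 * np.mean(measured) / nominal

def within_lab_reproducibility(measured):
    """Relative standard deviation (%) across replicates acquired under
    within-laboratory reproducibility conditions (different days/analysts)."""
    measured = np.asarray(measured, dtype=float)
    return 100.0 * measured.std(ddof=1) / measured.mean()

# Hypothetical results (ug/kg) for one beta-lactam fortified at 50 ug/kg:
replicates = [47.1, 52.3, 49.8, 55.0, 44.9, 51.2]
print(round(trueness_percent(replicates, 50.0), 1))        # should fall within 80-110 %
print(round(within_lab_reproducibility(replicates), 1))    # should stay below ~22 %
```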
Haiyang, Yu; Tian, Luo
2016-06-01
Target restoration space (TRS) is the most precise space required for designing an optimal prosthesis. TRS consists of an internal or external tooth space that determines the esthetics and function of the final restoration. Therefore, assisted by quantitative analysis and transfer, TRS quantitative analysis is a significant improvement for minimal tooth preparation. This article presents TRS quantity-related measurement, analysis, and transfer, and the internal relevance of the three TRS classifications. The results reveal the close bond between precision and minimally invasive treatment. This study can be used to improve the comprehension and execution of precise esthetic prosthodontics.
Claus, Rainer; Lucas, David M.; Stilgenbauer, Stephan; Ruppert, Amy S.; Yu, Lianbo; Zucknick, Manuela; Mertens, Daniel; Bühler, Andreas; Oakes, Christopher C.; Larson, Richard A.; Kay, Neil E.; Jelinek, Diane F.; Kipps, Thomas J.; Rassenti, Laura Z.; Gribben, John G.; Döhner, Hartmut; Heerema, Nyla A.; Marcucci, Guido; Plass, Christoph; Byrd, John C.
2012-01-01
Purpose Increased ZAP-70 expression predicts poor prognosis in chronic lymphocytic leukemia (CLL). Current methods for accurately measuring ZAP-70 expression are problematic, preventing widespread application of these tests in clinical decision making. We therefore used comprehensive DNA methylation profiling of the ZAP-70 regulatory region to identify sites important for transcriptional control. Patients and Methods High-resolution quantitative DNA methylation analysis of the entire ZAP-70 gene regulatory regions was conducted on 247 samples from patients with CLL from four independent clinical studies. Results Through this comprehensive analysis, we identified a small area in the 5′ regulatory region of ZAP-70 that showed large variability in methylation in CLL samples but was universally methylated in normal B cells. High correlation with mRNA and protein expression, as well as activity in promoter reporter assays, revealed that within this differentially methylated region, a single CpG dinucleotide and neighboring nucleotides are particularly important in ZAP-70 transcriptional regulation. Furthermore, by using clustering approaches, we identified a prognostic role for this site in four independent data sets of patients with CLL using time to treatment, progression-free survival, and overall survival as clinical end points. Conclusion Comprehensive quantitative DNA methylation analysis of the ZAP-70 gene in CLL identified important regions responsible for transcriptional regulation. In addition, loss of methylation at a specific single CpG dinucleotide in the ZAP-70 5′ regulatory sequence is a highly predictive and reproducible biomarker of poor prognosis in this disease. This work demonstrates the feasibility of using quantitative specific ZAP-70 methylation analysis as a relevant clinically applicable prognostic test in CLL. PMID:22564988
Yang, Guang; Zhao, Xin; Wen, Jun; Zhou, Tingting; Fan, Guorong
2017-04-01
An analytical approach including fingerprinting, quantitative analysis and rapid screening of anti-oxidative components was established and successfully applied for the comprehensive quality control of Rhizoma Smilacis Glabrae (RSG), a well-known Traditional Chinese Medicine used as both medicine and food. Thirteen components were tentatively identified based on their retention behavior, UV absorption and MS fragmentation patterns. Chemometric analysis based on coulometric array data was performed to evaluate the similarity and variation between fifteen batches. Eight discriminating components were quantified using single-compound calibration. The unit responses of those components in coulometric array detection were calculated and compared with those of several compounds reported to possess antioxidant activity, and four of them were tentatively identified as main contributors to the total anti-oxidative activity. The main advantage of the proposed approach is that it realizes simultaneous fingerprinting, quantitative analysis and screening of anti-oxidative components, providing comprehensive information for quality assessment of RSG. Copyright © 2017 Elsevier B.V. All rights reserved.
Comprehending News Videotexts: The Influence of the Visual Content
ERIC Educational Resources Information Center
Cross, Jeremy
2011-01-01
Informed by dual coding theory, this study explores the role of the visual content in L2 listeners' comprehension of news videotexts. L1 research into the visual characteristics and comprehension of news videotexts is outlined, subsequently informing the quantitative analysis of audiovisual correspondence in the news videotexts used. In each of…
An economic analysis methodology for project evaluation and programming.
DOT National Transportation Integrated Search
2013-08-01
Economic analysis is a critical component of a comprehensive project or program evaluation methodology that considers all key : quantitative and qualitative impacts of highway investments. It allows highway agencies to identify, quantify, and value t...
West, Aaron C; Schmidt, Michael W; Gordon, Mark S; Ruedenberg, Klaus
2015-10-15
The analysis of molecular electron density matrices in terms of quasi-atomic orbitals, which was developed in previous investigations, is quantitatively exemplified by a detailed application to the urea molecule. The analysis is found to identify strong and weak covalent bonding interactions as well as intramolecular charge transfers. It yields a qualitative as well as quantitative ab initio description of the bonding structure of this molecule, which raises questions regarding some traditional rationalizations.
ERIC Educational Resources Information Center
Grinias, James P.
2017-01-01
Online student-response systems provide instructors with an easy-to-use tool to instantly evaluate student comprehension. For comprehensive content review, turning this evaluation into a competitive game where students can compete against each other was found to be helpful and enjoyable for participating students. One specific online resource,…
A Comprehensive Guide for Performing Sample Preparation and Top-Down Protein Analysis
Padula, Matthew P.; Berry, Iain J.; O'Rourke, Matthew B.; Raymond, Benjamin B.A.; Santos, Jerran; Djordjevic, Steven P.
2017-01-01
Methodologies for the global analysis of proteins in a sample, or proteome analysis, have been available since 1975 when Patrick O'Farrell published the first paper describing two-dimensional gel electrophoresis (2D-PAGE). This technique allowed the resolution of single protein isoforms, or proteoforms, into single 'spots' in a polyacrylamide gel, allowing the quantitation of changes in a proteoform's abundance to ascertain changes in an organism's phenotype when conditions change. In pursuit of the comprehensive profiling of the proteome, significant advances in technology have made the identification and quantitation of intact proteoforms from complex mixtures of proteins more routine, allowing analysis of the proteome from the 'Top-Down'. However, the number of proteoforms detected by Top-Down methodologies such as 2D-PAGE or mass spectrometry has not significantly increased since O'Farrell's paper when compared to Bottom-Up, peptide-centric techniques. This article explores and explains the numerous methodologies and technologies available to analyse the proteome from the Top-Down with a strong emphasis on the necessity to analyse intact proteoforms as a better indicator of changes in biology and phenotype. We arrive at the conclusion that the complete and comprehensive profiling of an organism's proteome is still, at present, beyond our reach but the continuing evolution of protein fractionation techniques and mass spectrometry brings comprehensive Top-Down proteome profiling closer. PMID:28387712
A Comprehensive Guide for Performing Sample Preparation and Top-Down Protein Analysis.
Padula, Matthew P; Berry, Iain J; O'Rourke, Matthew B; Raymond, Benjamin B A; Santos, Jerran; Djordjevic, Steven P
2017-04-07
Methodologies for the global analysis of proteins in a sample, or proteome analysis, have been available since 1975 when Patrick O'Farrell published the first paper describing two-dimensional gel electrophoresis (2D-PAGE). This technique allowed the resolution of single protein isoforms, or proteoforms, into single 'spots' in a polyacrylamide gel, allowing the quantitation of changes in a proteoform's abundance to ascertain changes in an organism's phenotype when conditions change. In pursuit of the comprehensive profiling of the proteome, significant advances in technology have made the identification and quantitation of intact proteoforms from complex mixtures of proteins more routine, allowing analysis of the proteome from the 'Top-Down'. However, the number of proteoforms detected by Top-Down methodologies such as 2D-PAGE or mass spectrometry has not significantly increased since O'Farrell's paper when compared to Bottom-Up, peptide-centric techniques. This article explores and explains the numerous methodologies and technologies available to analyse the proteome from the Top-Down with a strong emphasis on the necessity to analyse intact proteoforms as a better indicator of changes in biology and phenotype. We arrive at the conclusion that the complete and comprehensive profiling of an organism's proteome is still, at present, beyond our reach but the continuing evolution of protein fractionation techniques and mass spectrometry brings comprehensive Top-Down proteome profiling closer.
Comprehensive, Quantitative Risk Assessment of CO{sub 2} Geologic Sequestration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lepinski, James
2013-09-30
A Quantitative Failure Modes and Effects Analysis (QFMEA) was developed to conduct comprehensive, quantitative risk assessments on CO{sub 2} capture, transportation, and sequestration or use in deep saline aquifers, enhanced oil recovery operations, or enhanced coal bed methane operations. The model identifies and characterizes potential risks; identifies the likely failure modes, causes, effects and methods of detection; lists possible risk prevention and risk mitigation steps; estimates potential damage recovery costs, mitigation costs and cost savings resulting from mitigation; and ranks (prioritizes) risks according to the probability of failure, the severity of failure, the difficulty of early failure detection and the potential for fatalities. The QFMEA model generates the information needed for effective project risk management. Diverse project information can be integrated into a concise, common format that allows comprehensive, quantitative analysis, by a cross-functional team of experts, to determine: What can possibly go wrong? How much will damage recovery cost? How can it be prevented or mitigated? What is the cost savings or benefit of prevention or mitigation? Which risks should be given highest priority for resolution? The QFMEA model can be tailored to specific projects and is applicable to new projects as well as mature projects. The model can be revised and updated as new information becomes available. It accepts input from multiple sources, such as literature searches, site characterization, field data, computer simulations, analogues, process influence diagrams, probability density functions, financial analysis models, cost factors, and heuristic best practices manuals, and converts the information into a standardized format in an Excel spreadsheet. Process influence diagrams, geologic models, financial models, cost factors and an insurance schedule were developed to support the QFMEA model. Comprehensive, quantitative risk assessments were conducted on three sites using the QFMEA model: (1) SACROC Northern Platform CO{sub 2}-EOR Site in the Permian Basin, Scurry County, TX, (2) Pump Canyon CO{sub 2}-ECBM Site in the San Juan Basin, San Juan County, NM, and (3) Farnsworth Unit CO{sub 2}-EOR Site in the Anadarko Basin, Ochiltree County, TX. The sites were sufficiently different from each other to test the robustness of the QFMEA model.
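The ranking step of a QFMEA can be illustrated with a conventional risk priority number (probability × severity × detection difficulty) and a mitigation benefit/cost ratio. The scales, entries and scoring below are generic FMEA conventions and invented numbers, not the actual model described in the report:

```python
# Minimal sketch of QFMEA-style risk ranking.  The 1-10 scales and the
# multiplicative risk priority number are generic FMEA conventions and
# illustrative assumptions; the model's actual scoring may differ.
failure_modes = [
    # (name, probability, severity, detection difficulty,
    #  damage recovery cost $, mitigation cost $)
    ("Wellbore casing leak",      3,  9, 6,  5_000_000,   800_000),
    ("Caprock fracture",          2, 10, 8, 20_000_000, 2_500_000),
    ("Surface pipeline rupture",  4,  6, 3,  1_200_000,   300_000),
]

def rpn(prob, sev, det):
    """Risk priority number: probability x severity x detection difficulty."""
    return prob * sev * det

ranked = sorted(failure_modes, key=lambda fm: rpn(fm[1], fm[2], fm[3]), reverse=True)
for name, p, s, d, damage, mitigation in ranked:
    benefit_ratio = damage / mitigation   # expected savings per mitigation dollar
    print(f"{name:25s} RPN={rpn(p, s, d):3d}  benefit/cost={benefit_ratio:4.1f}")
```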
This report presents a comprehensive spectral analysis of common bacterial phospholipids using electrospray/mass spectrometry (ESI/MS) under both negative and positive ionization conditions. Phospholipids under positive ionization yield sodium-adduct molecular ions which are mos...
Pesavento, James J; Bullock, Courtney R; LeDuc, Richard D; Mizzen, Craig A; Kelleher, Neil L
2008-05-30
Quantitative proteomics has focused heavily on correlating protein abundances, ratios, and dynamics by developing methods that are protein expression-centric (e.g. isotope coded affinity tag, isobaric tag for relative and absolute quantification, etc.). These methods effectively detect changes in protein abundance but fail to provide a comprehensive perspective of the diversity of proteins such as histones, which are regulated by post-translational modifications. Here, we report the characterization of modified forms of HeLa cell histone H4 with a dynamic range >10(4) using a strictly Top Down mass spectrometric approach coupled with two dimensions of liquid chromatography. This enhanced dynamic range enabled the precise characterization and quantitation of 42 forms uniquely modified by combinations of methylation and acetylation, including those with trimethylated Lys-20, monomethylated Arg-3, and the novel dimethylated Arg-3 (each <1% of all H4 forms). Quantitative analyses revealed distinct trends in acetylation site occupancy depending on Lys-20 methylation state. Because both modifications are dynamically regulated through the cell cycle, we simultaneously investigated acetylation and methylation kinetics through three cell cycle phases and used these data to statistically assess the robustness of our quantitative analysis. This work represents the most comprehensive analysis of histone H4 forms present in human cells reported to date.
APPLICATION OF THE MASTER ANALYTICAL SCHEME TO POLAR ORGANICS IN DRINKING WATER
EPA's Master Analytical Scheme (MAS) for Organic Compounds in Water provides for comprehensive qualitative-quantitative analysis of gas chromatographable organics in many types of water. The paper emphasizes the analysis of polar and ionic organics, the more water soluble compoun...
Quantitative Proteomics Identifies Activation of Hallmark Pathways of Cancer in Patient Melanoma.
Byrum, Stephanie D; Larson, Signe K; Avaritt, Nathan L; Moreland, Linley E; Mackintosh, Samuel G; Cheung, Wang L; Tackett, Alan J
2013-03-01
Molecular pathways regulating melanoma initiation and progression are potential targets of therapeutic development for this aggressive cancer. Identification and molecular analysis of these pathways in patients has been primarily restricted to targeted studies on individual proteins. Here, we report the most comprehensive analysis of formalin-fixed paraffin-embedded human melanoma tissues using quantitative proteomics. From 61 patient samples, we identified 171 proteins varying in abundance among benign nevi, primary melanoma, and metastatic melanoma. Seventy-three percent of these proteins were validated by immunohistochemistry staining of malignant melanoma tissues from the Human Protein Atlas database. Our results reveal that molecular pathways involved with tumor cell proliferation, motility, and apoptosis are mis-regulated in melanoma. These data provide the most comprehensive proteome resource on patient melanoma and reveal insight into the molecular mechanisms driving melanoma progression.
ERIC Educational Resources Information Center
Crismore, Avon, Ed.
The 594 studies referenced in this volume, the first of a two-volume set, synthesize reading comprehension research conducted from 1974 to 1984. The 12 chapters cover the following topics: (1) the necessity for both quantitative and qualitative meta-analysis for a complete state-of-the-art understanding of any domain, (2) the background of the…
Schilling, Birgit; Gibson, Bradford W.; Hunter, Christie L.
2017-01-01
Data-independent acquisition is a powerful mass spectrometry technique that enables comprehensive MS and MS/MS analysis of all detectable species, providing an information rich data file that can be mined deeply. Here, we describe how to acquire high-quality SWATH® Acquisition data to be used for large quantitative proteomic studies. We specifically focus on using variable sized Q1 windows for acquisition of MS/MS data for generating higher specificity quantitative data. PMID:28188533
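Variable-width Q1 windows are typically sized so that each window covers roughly the same number of observed precursors, making windows narrower where the precursor density is highest. The sketch below derives window boundaries from a hypothetical precursor m/z distribution using quantiles; it illustrates the principle and is not the acquisition software's algorithm.

```python
import numpy as np

def variable_swath_windows(precursor_mz, n_windows=25, overlap=1.0):
    """Place Q1 window boundaries so each window holds roughly equal numbers
    of observed precursors (denser m/z regions get narrower windows).
    'overlap' widens each window symmetrically by the stated m/z."""
    edges = np.quantile(np.asarray(precursor_mz, float),
                        np.linspace(0.0, 1.0, n_windows + 1))
    return [(lo - overlap / 2, hi + overlap / 2)
            for lo, hi in zip(edges[:-1], edges[1:])]

# Hypothetical precursor density peaking around 500-700 m/z:
rng = np.random.default_rng(2)
mz = np.clip(rng.normal(620, 120, size=5000), 400, 1250)
for lo, hi in variable_swath_windows(mz, n_windows=10)[:3]:
    print(f"{lo:.1f}-{hi:.1f}")
```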
Quantitative image processing in fluid mechanics
NASA Technical Reports Server (NTRS)
Hesselink, Lambertus; Helman, James; Ning, Paul
1992-01-01
The current status of digital image processing in fluid flow research is reviewed. In particular, attention is given to a comprehensive approach to the extraction of quantitative data from multivariate databases and examples of recent developments. The discussion covers numerical simulations and experiments, data processing, generation and dissemination of knowledge, traditional image processing, hybrid processing, fluid flow vector field topology, and isosurface analysis using Marching Cubes.
75 FR 9488 - Basel Comprehensive Quantitative Impact Study
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-02
... DEPARTMENT OF THE TREASURY Office of Thrift Supervision Basel Comprehensive Quantitative Impact... Quantitative Impact Study. OMB Number: 1550-0NEW. Form Numbers: N/A. Regulation requirement: 12 CFR Part 567... Basel II Capital Accord, the Basel Committee will conduct a quantitative impact study (QIS) to assess...
A Method for Comprehensive Glycosite-Mapping and Direct Quantitation of Serum Glycoproteins.
Hong, Qiuting; Ruhaak, L Renee; Stroble, Carol; Parker, Evan; Huang, Jincui; Maverakis, Emanual; Lebrilla, Carlito B
2015-12-04
A comprehensive glycan map was constructed for the top eight abundant glycoproteins in plasma using both specific and nonspecific enzyme digestions followed by nano liquid chromatography (LC)-chip/quadrupole time-of-flight mass spectrometry (MS) analysis. Glycopeptides were identified using an in-house software tool, GPFinder. A sensitive and reproducible multiple reaction monitoring (MRM) technique on a triple quadrupole MS was developed and applied to quantify immunoglobulins G, A, M, and their site-specific glycans simultaneously and directly from human serum/plasma without protein enrichments. A total of 64 glycopeptides and 15 peptides were monitored for IgG, IgA, and IgM in a 20 min ultra high performance (UP)LC gradient. The absolute protein contents were quantified using peptide calibration curves. The glycopeptide ion abundances were normalized to the respective protein abundances to separate protein glycosylation from protein expression. This technique yields higher method reproducibility and less sample loss when compared with the quantitation method that involves protein enrichments. The absolute protein quantitation has a wide linear range (3-4 orders of magnitude) and low limit of quantitation (femtomole level). This rapid and robust quantitation technique, which provides quantitative information for both proteins and glycosylation, will further facilitate disease biomarker discoveries.
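The quantitation logic described above (absolute protein amounts from peptide calibration curves, then glycopeptide abundances normalized to the protein abundance to separate glycosylation from expression) is simple arithmetic; a sketch with hypothetical peak areas and a hypothetical IgG1 surrogate peptide:

```python
import numpy as np

# Hypothetical calibration curve for an IgG1-specific surrogate peptide:
# MRM peak areas measured for known peptide amounts (fmol on column).
cal_fmol = np.array([5, 10, 50, 100, 500], dtype=float)
cal_area = np.array([1.1e4, 2.3e4, 1.1e5, 2.2e5, 1.1e6])
slope, intercept = np.polyfit(cal_fmol, cal_area, 1)

def protein_fmol(peptide_area):
    """Back-calculate protein amount from the surrogate peptide's peak area."""
    return (peptide_area - intercept) / slope

# Hypothetical site-specific glycopeptide areas from the same serum digest:
glycopeptide_areas = {"IgG1_H3N4F1": 4.0e4, "IgG1_H4N4F1": 6.5e4}
igg1 = protein_fmol(1.8e5)
normalized = {g: a / igg1 for g, a in glycopeptide_areas.items()}
print(round(igg1, 1), normalized)   # glycoform signal per unit of protein
```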
An Analysis of Corporal Punishment Practices in the State of Mississippi
ERIC Educational Resources Information Center
Williams-Damond, Twyla A.
2014-01-01
The focus of this research mainly centers on a quantitative descriptive overview of corporal punishment practices in the state of Mississippi, but this study also includes a legal document analysis component. This study forms the Mississippi portion of a comprehensive analysis of the demographics of corporal punishment in the public schools of the…
Haramija, Marko; Peter-Katalinić, Jasna
2017-10-30
Affinity mass spectrometry (AMS) is an emerging tool in the field of the study of protein•carbohydrate complexes. However, experimental obstacles and data analysis are preventing faster integration of AMS methods into the glycoscience field. Here we show how analysis of direct electrospray ionization mass spectrometry (ESI-MS) AMS data can be simplified for screening purposes, even for complex AMS spectra. A direct ESI-MS assay was tested in this study and binding data for the galectin-3C•lactose complex were analyzed using a comprehensive and simplified data analysis approach. In the comprehensive data analysis approach, noise, all protein charge states, alkali ion adducts and signal overlap were taken into account. In a simplified approach, only the intensities of the fully protonated free protein and the protein•carbohydrate complex for the main protein charge state were taken into account. In our study, for high intensity signals, noise was negligible, sodiated protein and sodiated complex signals cancelled each other out when calculating the Kd value, and signal overlap influenced the Kd value only to a minor extent. Influence of these parameters on low intensity signals was much higher. However, low intensity protein charge states should be avoided in quantitative AMS analyses due to poor ion statistics. The results indicate that noise, alkali ion adducts, signal overlap, as well as low intensity protein charge states, can be neglected for preliminary experiments, as well as in screening assays. One comprehensive data analysis performed as a control should be sufficient to validate this hypothesis for other binding systems as well. Copyright © 2017 John Wiley & Sons, Ltd.
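In the simplified direct ESI-MS treatment, the ratio R of complex to free-protein intensity for the main charge state is taken as the solution ratio [PL]/[P], and a 1:1 dissociation constant follows from mass balance: Kd = L0/R − P0/(1+R). A worked sketch under that assumption, with hypothetical concentrations (not the paper's measured values):

```python
def kd_from_intensity_ratio(R, P0, L0):
    """1:1 binding P + L <-> PL.  Assuming R = I(PL)/I(P) = [PL]/[P]:
       [PL] = R*P0/(1+R),  [L]free = L0 - [PL],  Kd = [P][L]free/[PL],
       which simplifies to Kd = L0/R - P0/(1+R)."""
    return L0 / R - P0 / (1.0 + R)

# Hypothetical galectin-3C / lactose titration point:
P0 = 10e-6     # total protein, mol/L
L0 = 200e-6    # total lactose, mol/L
R = 0.85       # complex-to-free-protein intensity ratio (main charge state)
print(f"Kd ~ {kd_from_intensity_ratio(R, P0, L0) * 1e6:.0f} uM")
```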
Developing a performance measurement approach to benefit/cost freight project prioritization.
DOT National Transportation Integrated Search
2014-10-01
Future reauthorizations of the federal transportation bill will require a comprehensive and quantitative analysis of the freight benefits : of proposed freight system projects. To prioritize public investments in freight systems and to insure conside...
Song, Hao; Wang, Hai-Yan; Zhang, Tao
2016-06-15
Larval metamorphosis of the veined rapa whelk (Rapana venosa) is a pelagic to benthic transition that involves considerable structural and physiological changes. Because metamorphosis plays a pivotal role in R. venosa commercial breeding and natural populations, the endogenous proteins that drive this transition attract considerable interest. This study is the first to perform a comprehensive and quantitative proteomic analysis related to metamorphosis in a marine gastropod. We analyzed the proteomes of competent R. venosa larvae and post-larvae, resulting in the identification of 5312 proteins, including 470 that were downregulated and 668 that were upregulated after metamorphosis. The differentially expressed proteins reflected multiple processes involved in metamorphosis, including cytoskeleton and cell adhesion, ingestion and digestion, stress response and immunity, as well as specific tissue development. Our data improve understanding of the physiological traits controlling R. venosa metamorphosis and provide a solid basis for further study.
MASH Suite Pro: A Comprehensive Software Tool for Top-Down Proteomics*
Cai, Wenxuan; Guner, Huseyin; Gregorich, Zachery R.; Chen, Albert J.; Ayaz-Guner, Serife; Peng, Ying; Valeja, Santosh G.; Liu, Xiaowen; Ge, Ying
2016-01-01
Top-down mass spectrometry (MS)-based proteomics is arguably a disruptive technology for the comprehensive analysis of all proteoforms arising from genetic variation, alternative splicing, and posttranslational modifications (PTMs). However, the complexity of top-down high-resolution mass spectra presents a significant challenge for data analysis. In contrast to the well-developed software packages available for data analysis in bottom-up proteomics, the data analysis tools in top-down proteomics remain underdeveloped. Moreover, despite recent efforts to develop algorithms and tools for the deconvolution of top-down high-resolution mass spectra and the identification of proteins from complex mixtures, a multifunctional software platform, which allows for the identification, quantitation, and characterization of proteoforms with visual validation, is still lacking. Herein, we have developed MASH Suite Pro, a comprehensive software tool for top-down proteomics with multifaceted functionality. MASH Suite Pro is capable of processing high-resolution MS and tandem MS (MS/MS) data using two deconvolution algorithms to optimize protein identification results. In addition, MASH Suite Pro allows for the characterization of PTMs and sequence variations, as well as the relative quantitation of multiple proteoforms in different experimental conditions. The program also provides visualization components for validation and correction of the computational outputs. Furthermore, MASH Suite Pro facilitates data reporting and presentation via direct output of the graphics. Thus, MASH Suite Pro significantly simplifies and speeds up the interpretation of high-resolution top-down proteomics data by integrating tools for protein identification, quantitation, characterization, and visual validation into a customizable and user-friendly interface. We envision that MASH Suite Pro will play an integral role in advancing the burgeoning field of top-down proteomics. PMID:26598644
DAnTE: a statistical tool for quantitative analysis of –omics data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Polpitiya, Ashoka D.; Qian, Weijun; Jaitly, Navdeep
2008-05-03
DAnTE (Data Analysis Tool Extension) is a statistical tool designed to address challenges unique to quantitative bottom-up, shotgun proteomics data. This tool has also been demonstrated for microarray data and can easily be extended to other high-throughput data types. DAnTE features selected normalization methods, missing value imputation algorithms, peptide to protein rollup methods, an extensive array of plotting functions, and a comprehensive ANOVA scheme that can handle unbalanced data and random effects. The Graphical User Interface (GUI) is designed to be very intuitive and user friendly.
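Two of the listed steps, normalization and peptide-to-protein rollup, can be sketched with pandas on hypothetical log2 abundances. DAnTE offers several alternatives for each step; median centering and a median rollup are shown here purely as an assumed, minimal example.

```python
import numpy as np
import pandas as pd

# Hypothetical log2 peptide abundances (rows: peptides, columns: LC-MS runs).
data = pd.DataFrame(
    np.random.default_rng(3).normal(20, 2, size=(6, 4)),
    index=pd.MultiIndex.from_tuples(
        [("ProtA", f"pepA{i}") for i in range(3)] +
        [("ProtB", f"pepB{i}") for i in range(3)],
        names=["protein", "peptide"]),
    columns=["run1", "run2", "run3", "run4"])

# Median centering: remove run-to-run systematic offsets.
normalized = data - data.median(axis=0)

# Simple peptide-to-protein rollup: median of a protein's peptides per run
# (DAnTE also offers reference-based and other rollup schemes).
protein_level = normalized.groupby(level="protein").median()
print(protein_level)
```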
A Re-Examination of the Education Production Function Using Individual Participant Data
ERIC Educational Resources Information Center
Pigott, Therese D.; Williams, Ryan T.; Polanin, Joshua R.
2011-01-01
The focus and purpose of this research is to examine the benefits, limitations, and implications of Individual Participant Data (IPD) meta-analysis in education. Comprehensive research reviews in education have been limited to the use of aggregated data (AD) meta- analysis, techniques based on quantitatively combining information from studies on…
A Meta-Analysis of Predictors of Offender Treatment Attrition and Its Relationship to Recidivism
ERIC Educational Resources Information Center
Olver, Mark E.; Stockdale, Keira C.; Wormith, J. Stephen
2011-01-01
Objective: The failure of offenders to complete psychological treatment can pose significant concerns, including increased risk for recidivism. Although a large literature identifying predictors of offender treatment attrition has accumulated, there has yet to be a comprehensive quantitative review. Method: A meta-analysis of the offender…
Yang, Jie; Andric, Michael; Mathew, Mili M
2015-10-01
Gestures play an important role in face-to-face communication and have been increasingly studied via functional magnetic resonance imaging. Although a large amount of data has been provided to describe the neural substrates of gesture comprehension, these findings have never been quantitatively summarized and the conclusion is still unclear. This activation likelihood estimation meta-analysis investigated the brain networks underpinning gesture comprehension while considering the impact of gesture type (co-speech gestures vs. speech-independent gestures) and task demand (implicit vs. explicit) on the brain activation of gesture comprehension. The meta-analysis of 31 papers showed that as hand actions, gestures involve a perceptual-motor network important for action recognition. As meaningful symbols, gestures involve a semantic network for conceptual processing. Finally, during face-to-face interactions, gestures involve a network for social emotive processes. Our finding also indicated that gesture type and task demand influence the involvement of the brain networks during gesture comprehension. The results highlight the complexity of gesture comprehension, and suggest that future research is necessary to clarify the dynamic interactions among these networks. Copyright © 2015 Elsevier Ltd. All rights reserved.
CASTIN: a system for comprehensive analysis of cancer-stromal interactome.
Komura, Daisuke; Isagawa, Takayuki; Kishi, Kazuki; Suzuki, Ryohei; Sato, Reiko; Tanaka, Mariko; Katoh, Hiroto; Yamamoto, Shogo; Tatsuno, Kenji; Fukayama, Masashi; Aburatani, Hiroyuki; Ishikawa, Shumpei
2016-11-09
Cancer microenvironment plays a vital role in cancer development and progression, and cancer-stromal interactions have been recognized as important targets for cancer therapy. However, identifying relevant and druggable cancer-stromal interactions is challenging due to the lack of quantitative methods for analyzing the whole cancer-stromal interactome. We present CASTIN (CAncer-STromal INteractome analysis), a novel framework for the evaluation of the cancer-stromal interactome from RNA-Seq data using cancer xenograft models. For each ligand-receptor interaction derived from a curated protein-protein interaction database, CASTIN summarizes the gene expression profiles of cancer and stroma into three evaluation indices. These indices provide quantitative evaluation and comprehensive visualization of the interactome, and thus enable the identification of critical cancer-microenvironment interactions, which would be potential drug targets. We applied CASTIN to a pancreatic ductal adenocarcinoma dataset, and successfully characterized individual cancers in terms of cancer-stromal relationships and identified both well-known and less-characterized druggable interactions. CASTIN provides a comprehensive view of the cancer-stromal interactome and is useful for identifying critical interactions that may serve as potential drug targets in the cancer microenvironment. CASTIN is available at: http://github.com/tmd-gpat/CASTIN .
Sridharan, Sanjeev; Go, Sodam; Zinzow, Heidi; Gray, Aracelis; Barrett, Melissa Gutierrez
2007-02-01
In order to achieve the intended impact on a community, comprehensive community initiatives must sustain programs once they have been implemented. However, planning for sustainability is challenging and is rarely incorporated in the planning process of an initiative. The current study examined 19 5-year plans developed during the planning phase of the Comprehensive Strategy for Serious, Violent and Chronic Juvenile Offenders. Quantitative and qualitative methods were employed to assess the extent to which the construct of sustainability was incorporated. The plan analysis was supplemented with results from other components of the complex evaluation design implemented as part of the process evaluation of Comprehensive Strategy. Results suggested that sustainability was not accounted for during the planning phase of this initiative. The implications of these findings, including the importance of planning for sustainability in order to achieve sustainability, are discussed.
ERIC Educational Resources Information Center
Van Iddekinge, Chad H.; Roth, Philip L.; Putka, Dan J.; Lanivich, Stephen E.
2011-01-01
A common belief among researchers is that vocational interests have limited value for personnel selection. However, no comprehensive quantitative summaries of interests validity research have been conducted to substantiate claims for or against the use of interests. To help address this gap, we conducted a meta-analysis of relations between…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-25
... Comprehensive Quantitative Impact Study.'' DATES: You should submit comments by March 26, 2010. ADDRESSES... requesting approval of the following new information collection: Title: Basel Comprehensive Quantitative... quantitative impact study (QIS) to assess the impact of the proposed revisions that were published by the Basel...
O'Maille, Grace; Go, Eden P.; Hoang, Linh; ...
2008-01-01
Comprehensive detection and quantitation of metabolites from a biological source constitute the major challenges of current metabolomics research. Two chemical derivatization methodologies, butylation and amination, were applied to human serum for ionization enhancement of a broad spectrum of metabolite classes, including steroids and amino acids. LC-ESI-MS analysis of the derivatized serum samples provided a significant signal elevation across the total ion chromatogram, with up to an over 100-fold increase in ionization efficiency. It was also demonstrated that derivatization combined with isotopically labeled reagents facilitated the relative quantitation of derivatized metabolites from individual as well as pooled samples.
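Relative quantitation with isotope-coded derivatization comes down to the ratio of co-eluting light- and heavy-labeled peak areas; a minimal sketch with hypothetical areas (the d0/d9 labels are an assumed example, not necessarily the reagents used in this work):

```python
def relative_quantity(light_area, heavy_area):
    """Relative quantitation with isotope-coded derivatization: the analyte
    derivatized with the light reagent (sample A) is mixed with the same
    analyte derivatized with the heavy reagent (sample B); the ratio of the
    two co-eluting peak areas gives the fold change A/B."""
    return light_area / heavy_area

# Hypothetical d0-/d9-derivatized metabolite peak areas from a mixed run:
print(round(relative_quantity(3.2e5, 1.6e5), 2))   # ~2-fold higher in sample A
```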
75 FR 18571 - Agency Information Collection Activities: Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-12
... Comprehensive Quantitative Impact Study.'' The OCC has also given notice that it has sent this collection to OMB... following new information collection: Title: Basel Comprehensive Quantitative Impact Study. OMB Control No... the Basel II Capital Accord, the Basel Committee will conduct a quantitative impact study (QIS) to...
Cordero, Chiara; Kiefl, Johannes; Schieberle, Peter; Reichenbach, Stephen E; Bicchi, Carlo
2015-01-01
Modern omics disciplines dealing with food flavor focus the analytical efforts on the elucidation of sensory-active compounds, including all possible stimuli of multimodal perception (aroma, taste, texture, etc.) by means of a comprehensive, integrated treatment of sample constituents, such as physicochemical properties, concentration in the matrix, and sensory properties (odor/taste quality, perception threshold). Such analyses require detailed profiling of known bioactive components as well as advanced fingerprinting techniques to catalog sample constituents comprehensively, quantitatively, and comparably across samples. Multidimensional analytical platforms support comprehensive investigations required for flavor analysis by combining information on analytes' identities, physicochemical behaviors (volatility, polarity, partition coefficient, and solubility), concentration, and odor quality. Unlike other omics, flavor metabolomics and sensomics include the final output of the biological phenomenon (i.e., sensory perceptions) as an additional analytical dimension, which is specifically and exclusively triggered by the chemicals analyzed. However, advanced omics platforms, which are multidimensional by definition, pose challenging issues not only in terms of coupling with detection systems and sample preparation, but also in terms of data elaboration and processing. The large number of variables collected during each analytical run provides a high level of information, but requires appropriate strategies to exploit fully this potential. This review focuses on advances in comprehensive two-dimensional gas chromatography and analytical platforms combining two-dimensional gas chromatography with olfactometry, chemometrics, and quantitative assays for food sensory analysis to assess the quality of a given product. We review instrumental advances and couplings, automation in sample preparation, data elaboration, and a selection of applications.
Quantitative Appearance Inspection for Film Coated Tablets.
Yoshino, Hiroyuki; Yamashita, Kazunari; Iwao, Yasunori; Noguchi, Shuji; Itai, Shigeru
2016-01-01
The decision criteria for the physical appearance of pharmaceutical products are subjective and qualitative means of evaluation that are based entirely on human interpretation. In this study, we have developed a comprehensive method for the quantitative analysis of the physical appearance of film coated tablets. Three different kinds of film coated tablets with considerable differences in their physical appearances were manufactured as models, and their surface roughness, contact angle, color measurements and physicochemical properties were investigated as potential characteristics for the quantitative analysis of their physical appearance. All of these characteristics were useful for the quantitative evaluation of the physical appearances of the tablets, and could potentially be used to establish decision criteria to assess the quality of tablets. In particular, the analysis of the surface roughness and film coating properties of the tablets by terahertz spectroscopy allowed for an effective evaluation of the tablets' properties. These results indicated the possibility of inspecting the appearance of tablets during the film coating process.
Xu, Ning; Zhou, Guofu; Li, Xiaojuan; Lu, Heng; Meng, Fanyun; Zhai, Huaqiang
2017-05-01
A reliable and comprehensive method for identifying the origin and assessing the quality of Epimedium has been developed. The method is based on analysis of HPLC fingerprints, combined with similarity analysis, hierarchical cluster analysis (HCA), principal component analysis (PCA) and multi-ingredient quantitative analysis. Nineteen batches of Epimedium, collected from different areas in the western regions of China, were used to establish the fingerprints and 18 peaks were selected for the analysis. Similarity analysis, HCA and PCA all classified the 19 areas into three groups. Simultaneous quantification of the five major bioactive ingredients in the Epimedium samples was also carried out to confirm the consistency of the quality tests. These methods were successfully used to identify the geographical origin of the Epimedium samples and to evaluate their quality. Copyright © 2016 John Wiley & Sons, Ltd.
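As a rough illustration of the chemometric steps described in this entry (similarity analysis, HCA and PCA applied to a peak-area matrix), the following Python sketch uses a hypothetical 19-batch by 18-peak matrix; it is not the authors' actual workflow or data, and the group count and library choices are assumptions.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import PCA

# Hypothetical matrix: rows = 19 Epimedium batches, columns = 18 common fingerprint peaks
rng = np.random.default_rng(0)
peak_areas = rng.random((19, 18))

# Similarity analysis: cosine similarity of each batch to the mean ("reference") fingerprint
reference = peak_areas.mean(axis=0)
cosine_sim = peak_areas @ reference / (
    np.linalg.norm(peak_areas, axis=1) * np.linalg.norm(reference))

# Hierarchical cluster analysis (HCA) on the autoscaled peak areas, cut into 3 groups
scaled = (peak_areas - peak_areas.mean(axis=0)) / peak_areas.std(axis=0)
groups = fcluster(linkage(scaled, method="ward"), t=3, criterion="maxclust")

# Principal component analysis (PCA) for a 2-D overview of batch groupings
scores = PCA(n_components=2).fit_transform(scaled)
print(cosine_sim.round(3), groups, scores.shape)
```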
Mannetje, Andrea 't; Steenland, Kyle; Checkoway, Harvey; Koskela, Riitta-Sisko; Koponen, Matti; Attfield, Michael; Chen, Jingqiong; Hnizdo, Eva; DeKlerk, Nicholas; Dosemeci, Mustafa
2002-08-01
Comprehensive quantitative silica exposure estimates over time, measured in the same units across a number of cohorts, would make possible a pooled exposure-response analysis for lung cancer. Such an analysis would help clarify the continuing controversy regarding whether silica causes lung cancer. Existing quantitative exposure data for 10 silica-exposed cohorts were retrieved from the original investigators. Occupation- and time-specific exposure estimates were either adopted/adapted or developed for each cohort, and converted to milligram per cubic meter (mg/m(3)) respirable crystalline silica. Quantitative exposure assignments were typically based on a large number (thousands) of raw measurements, or otherwise consisted of exposure estimates by experts (for two cohorts). Median exposure level of the cohorts ranged between 0.04 and 0.59 mg/m(3) respirable crystalline silica. Exposure estimates were partially validated via their successful prediction of silicosis in these cohorts. Existing data were successfully adopted or modified to create comparable quantitative exposure estimates over time for 10 silica-exposed cohorts, permitting a pooled exposure-response analysis. The difficulties encountered in deriving common exposure estimates across cohorts are discussed. Copyright 2002 Wiley-Liss, Inc.
ERIC Educational Resources Information Center
Qian, Gaoyin
Some methodological issues in the study of levels of knowledge are reviewed, and needs for further research are explored, drawing on an analysis of 12 studies reported since the late 1970s. In the 12 studies, 16 quantitative experiments were conducted. These were assessed for internal and external validity. Analysis revealed some shortcomings in…
Selecting Cases for Intensive Analysis: A Diversity of Goals and Methods
ERIC Educational Resources Information Center
Gerring, John; Cojocaru, Lee
2016-01-01
This study revisits the task of case selection in case study research, proposing a new typology of strategies that is explicit, disaggregated, and relatively comprehensive. A secondary goal is to explore the prospects for case selection by "algorithm," aka "ex ante," "automatic," "quantitative,"…
A Meta-Analysis of the Taped Problems Intervention
ERIC Educational Resources Information Center
Kleinert, Whitney L.; Codding, Robin S.; Minami, Takuya; Gould, Kaitlin
2018-01-01
Taped problems is an intervention strategy for addressing mathematics fluency that has been evaluated in multiple single-case design studies. Although its efficacy has been supported in individual studies, no comprehensive quantitative synthesis has been conducted on taped problems. The purpose of this study was to synthesize the literature that…
USDA-ARS's Scientific Manuscript database
Oxylipins, including eicosanoids, affect a broad range of biological processes, such as the initiation and resolution of inflammation. These compounds, also referred to as lipid mediators, are (non-) enzymatically generated by oxidation of polyunsaturated fatty acids such as arachidonic acid (AA). A...
USDA-ARS's Scientific Manuscript database
Atlantic killifish (Fundulus heteroclitus) resident to some US urban and industrialized estuaries demonstrate recently evolved and extreme tolerance to toxic dioxin-like compounds (DLCs). Here we provide an unusually comprehensive accounting (69%) through Quantitative Trait Locus (QTL) analysis of ...
Automated quantitative assessment of proteins' biological function in protein knowledge bases.
Mayr, Gabriele; Lepperdinger, Günter; Lackner, Peter
2008-01-01
Primary protein sequence data are archived in databases together with information regarding corresponding biological functions. In this respect, UniProt/Swiss-Prot is currently the most comprehensive collection and it is routinely cross-examined when trying to unravel the biological role of hypothetical proteins. Bioscientists frequently extract single entries and further evaluate those on a subjective basis. In lieu of a standardized procedure for scoring the existing knowledge regarding individual proteins, we here report a computer-assisted method, which we applied to score the present knowledge about any given Swiss-Prot entry. Applying this quantitative score allows proteins to be compared with respect to their sequence while also highlighting the extent of available functional data. pfs analysis may also be applied for quality control of individual entries or for database management in order to rank entry listings.
Ramifications of increased training in quantitative methodology.
Zimiles, Herbert
2009-01-01
Comments on the article "Doctoral training in statistics, measurement, and methodology in psychology: Replication and extension of Aiken, West, Sechrest, and Reno's (1990) survey of PhD programs in North America" by Aiken, West, and Millsap. The current author asks three questions that are provoked by the comprehensive identification of gaps and deficiencies in the training of quantitative methodology that led Aiken, West, and Millsap to call for expanded graduate instruction resources and programs. This comment calls for greater attention to how advances and expansion in the training of quantitative analysis are influencing who chooses to study psychology and how and what will be studied. PsycINFO Database Record 2009 APA.
Quantitative Analysis of the Efficiency of OLEDs.
Sim, Bomi; Moon, Chang-Ki; Kim, Kwon-Hyeon; Kim, Jang-Joo
2016-12-07
We present a comprehensive model for the quantitative analysis of factors influencing the efficiency of organic light-emitting diodes (OLEDs) as a function of the current density. The model takes into account the contribution made by the charge carrier imbalance, quenching processes, and optical design loss of the device arising from various optical effects including the cavity structure, location and profile of the excitons, effective radiative quantum efficiency, and out-coupling efficiency. Quantitative analysis of the efficiency can be performed with an optical simulation using material parameters and experimental measurements of the exciton profile in the emission layer and the lifetime of the exciton as a function of the current density. This method was applied to three phosphorescent OLEDs based on a single host, mixed host, and exciplex-forming cohost. The three factors (charge carrier imbalance, quenching processes, and optical design loss) were influential in different ways, depending on the device. The proposed model can potentially be used to optimize OLED configurations on the basis of an analysis of the underlying physical processes.
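The factorization described in this abstract can be summarized schematically with the standard external quantum efficiency decomposition below; the symbols are generic OLED notation rather than the authors' exact formulation.

```latex
\eta_{\mathrm{EQE}}(J) \;=\; \gamma(J)\,\times\,\eta_{S/T}\,\times\,q_{\mathrm{eff}}(J)\,\times\,\eta_{\mathrm{out}}
```

Here γ(J) is the charge-carrier balance, η_{S/T} the fraction of excitons allowed to emit by spin statistics, q_eff(J) the effective radiative quantum efficiency (reduced by current-density-dependent quenching), and η_out the optical out-coupling efficiency set by the cavity design and exciton location and profile.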
Rigbolt, Kristoffer T G; Vanselow, Jens T; Blagoev, Blagoy
2011-08-01
Recent technological advances have made it possible to identify and quantify thousands of proteins in a single proteomics experiment. As a result of these developments, data analysis has become the bottleneck of proteomics experiments. To provide the proteomics community with a user-friendly platform for comprehensive analysis, inspection and visualization of quantitative proteomics data we developed the Graphical Proteomics Data Explorer (GProX)(1). The program requires no special bioinformatics training, as all functions of GProX are accessible within its graphical user-friendly interface which will be intuitive to most users. Basic features facilitate the uncomplicated management and organization of large data sets and complex experimental setups as well as the inspection and graphical plotting of quantitative data. These are complemented by readily available high-level analysis options such as database querying, clustering based on abundance ratios, feature enrichment tests for e.g. GO terms and pathway analysis tools. A number of plotting options for visualization of quantitative proteomics data are available and most analysis functions in GProX create customizable high quality graphical displays in both vector and bitmap formats. The generic import requirements allow data originating from essentially all mass spectrometry platforms, quantitation strategies and software to be analyzed in the program. GProX represents a powerful approach to proteomics data analysis providing proteomics experimenters with a toolbox for bioinformatics analysis of quantitative proteomics data. The program is released as open-source and can be freely downloaded from the project webpage at http://gprox.sourceforge.net.
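GProX itself is a stand-alone graphical tool, but the abundance-ratio clustering it offers can be illustrated with a generic Python sketch on hypothetical log2-ratio data; the data, cluster count, and library choice below are illustrative assumptions, not GProX internals.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical quantitative proteomics data: log2 abundance ratios
# (treated vs. control) for 1,000 proteins across 4 time points.
rng = np.random.default_rng(1)
log2_ratios = rng.normal(0, 1, size=(1000, 4))

# Unsupervised clustering of ratio profiles, analogous in spirit to the
# abundance-ratio clustering offered by tools such as GProX.
kmeans = KMeans(n_clusters=6, n_init=10, random_state=1).fit(log2_ratios)
for c in range(6):
    members = (kmeans.labels_ == c).sum()
    print(f"cluster {c}: {members} proteins, mean profile "
          f"{kmeans.cluster_centers_[c].round(2)}")
```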
Rigbolt, Kristoffer T. G.; Vanselow, Jens T.; Blagoev, Blagoy
2011-01-01
Recent technological advances have made it possible to identify and quantify thousands of proteins in a single proteomics experiment. As a result of these developments, data analysis has become the bottleneck of proteomics experiments. To provide the proteomics community with a user-friendly platform for comprehensive analysis, inspection and visualization of quantitative proteomics data we developed the Graphical Proteomics Data Explorer (GProX)1. The program requires no special bioinformatics training, as all functions of GProX are accessible within its graphical user-friendly interface which will be intuitive to most users. Basic features facilitate the uncomplicated management and organization of large data sets and complex experimental setups as well as the inspection and graphical plotting of quantitative data. These are complemented by readily available high-level analysis options such as database querying, clustering based on abundance ratios, feature enrichment tests for e.g. GO terms and pathway analysis tools. A number of plotting options for visualization of quantitative proteomics data are available and most analysis functions in GProX create customizable high quality graphical displays in both vector and bitmap formats. The generic import requirements allow data originating from essentially all mass spectrometry platforms, quantitation strategies and software to be analyzed in the program. GProX represents a powerful approach to proteomics data analysis providing proteomics experimenters with a toolbox for bioinformatics analysis of quantitative proteomics data. The program is released as open-source and can be freely downloaded from the project webpage at http://gprox.sourceforge.net. PMID:21602510
Chen, Lixun; Jiang, Ling; Shen, Aizong; Wei, Wei
2016-09-01
The frequently low quality of submitted spontaneous reports is of increasing concern; to our knowledge, no validated instrument exists for comprehensively assessing the quality of such case reports. This work was conducted to develop such a quality instrument for assessing spontaneous reports of adverse drug reaction (ADR)/adverse drug event (ADE) in China. Initial evaluation indicators were generated using systematic and literature data analysis. Final indicators and their weights were identified using the Delphi method. The final quality instrument was developed by adopting the synthetic scoring method. A consensus was reached after four rounds of Delphi survey. The developed quality instrument consisted of 6 first-rank indicators, 18 second-rank indicators, and 115 third-rank indicators, and each rank of indicator was weighted. It evaluates the quality of spontaneous reports of ADR/ADE comprehensively and quantitatively on six parameters: authenticity, duplication, regulatory, completeness, vigilance level, and reporting time frame. The developed instrument was tested with good reliability and validity, and can be used to comprehensively and quantitatively assess submitted spontaneous reports of ADR/ADE in China.
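A minimal sketch of the synthetic (weighted-sum) scoring idea behind such an instrument is shown below; the indicator names mirror the six first-rank parameters, but the weights, the 0-1 scoring scale, and the single-level structure are purely hypothetical simplifications of the real 6/18/115-indicator hierarchy.

```python
# Minimal sketch of synthetic (weighted-sum) scoring for one spontaneous
# ADR/ADE report; weights and scores are hypothetical illustrations only.
weights = {
    "authenticity": 0.20, "duplication": 0.10, "regulatory": 0.15,
    "completeness": 0.30, "vigilance_level": 0.15, "reporting_time_frame": 0.10,
}
# Each first-rank indicator scored on a 0-1 scale for one report
scores = {
    "authenticity": 1.0, "duplication": 1.0, "regulatory": 0.8,
    "completeness": 0.6, "vigilance_level": 0.5, "reporting_time_frame": 1.0,
}
quality_index = sum(weights[k] * scores[k] for k in weights)
print(f"report quality index: {quality_index:.2f}")  # 0-1, higher is better
```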
Comprehensive Quantitative Analysis of Ovarian and Breast Cancer Tumor Peptidomes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Zhe; Wu, Chaochao; Xie, Fang
Aberrant degradation of proteins is associated with many pathological states, including cancers. Mass spectrometric analysis of tumor peptidomes, the intracellular and intercellular products of protein degradation, has the potential to provide biological insights on proteolytic processing in cancer. However, attempts to use the information on these smaller protein degradation products from tumors for biomarker discovery and cancer biology studies have been fairly limited to date, largely due to the lack of effective approaches for robust peptidomics identification and quantification, and the prevalence of confounding factors and biases associated with sample handling and processing. Herein, we have developed an effective and robust analytical platform for comprehensive analyses of tissue peptidomes, which is suitable for high-throughput quantitative studies. The reproducibility and coverage of the platform, as well as the suitability of clinical ovarian tumor and patient-derived breast tumor xenograft samples with post-excision delay of up to 60 min before freezing for peptidomics analysis, have been demonstrated. Additionally, our data also show that the peptidomics profiles can effectively separate breast cancer subtypes, reflecting tumor-associated protease activities. In conclusion, peptidomics complements results obtainable from conventional bottom-up proteomics, and provides insights not readily obtainable from such approaches.
Comprehensive Quantitative Analysis of Ovarian and Breast Cancer Tumor Peptidomes
Xu, Zhe; Wu, Chaochao; Xie, Fang; ...
2014-10-28
Aberrant degradation of proteins is associated with many pathological states, including cancers. Mass spectrometric analysis of tumor peptidomes, the intracellular and intercellular products of protein degradation, has the potential to provide biological insights on proteolytic processing in cancer. However, attempts to use the information on these smaller protein degradation products from tumors for biomarker discovery and cancer biology studies have been fairly limited to date, largely due to the lack of effective approaches for robust peptidomics identification and quantification, and the prevalence of confounding factors and biases associated with sample handling and processing. Herein, we have developed an effective and robust analytical platform for comprehensive analyses of tissue peptidomes, which is suitable for high-throughput quantitative studies. The reproducibility and coverage of the platform, as well as the suitability of clinical ovarian tumor and patient-derived breast tumor xenograft samples with post-excision delay of up to 60 min before freezing for peptidomics analysis, have been demonstrated. Additionally, our data also show that the peptidomics profiles can effectively separate breast cancer subtypes, reflecting tumor-associated protease activities. In conclusion, peptidomics complements results obtainable from conventional bottom-up proteomics, and provides insights not readily obtainable from such approaches.
Sadeghi, N.; Namjoshi, D.; Irfanoglu, M. O.; Wellington, C.; Diaz-Arrastia, R.
2017-01-01
Diffuse axonal injury (DAI) is a hallmark of traumatic brain injury (TBI) pathology. Recently, the Closed Head Injury Model of Engineered Rotational Acceleration (CHIMERA) was developed to generate an experimental model of DAI in a mouse. The characterization of DAI using diffusion tensor magnetic resonance imaging (MRI; diffusion tensor imaging, DTI) may provide a useful set of outcome measures for preclinical and clinical studies. The objective of this study was to identify the complex neurobiological underpinnings of DTI features following DAI using a comprehensive and quantitative evaluation of DTI and histopathology in the CHIMERA mouse model. A consistent neuroanatomical pattern of pathology in specific white matter tracts was identified across ex vivo DTI maps and photomicrographs of histology. These observations were confirmed by voxelwise and regional analysis of DTI maps, demonstrating reduced fractional anisotropy (FA) in distinct regions such as the optic tract. Similar regions were identified by quantitative histology and exhibited axonal damage as well as robust gliosis. Additional analysis using a machine-learning algorithm was performed to identify regions and metrics important for injury classification in a manner free from potential user bias. This analysis found that diffusion metrics were able to identify injured brains almost with the same degree of accuracy as the histology metrics. Good agreement between regions detected as abnormal by histology and MRI was also found. The findings of this work elucidate the complexity of cellular changes that give rise to imaging abnormalities and provide a comprehensive and quantitative evaluation of the relative importance of DTI and histological measures to detect brain injury. PMID:28966972
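For reference, the fractional anisotropy (FA) reported in this study is the standard scalar computed from the three eigenvalues of the diffusion tensor:

```latex
\mathrm{FA} \;=\; \sqrt{\tfrac{3}{2}}\;
\frac{\sqrt{(\lambda_1-\bar{\lambda})^2+(\lambda_2-\bar{\lambda})^2+(\lambda_3-\bar{\lambda})^2}}
     {\sqrt{\lambda_1^{2}+\lambda_2^{2}+\lambda_3^{2}}},
\qquad \bar{\lambda}=\tfrac{1}{3}(\lambda_1+\lambda_2+\lambda_3)
```

FA ranges from 0 (isotropic diffusion) to 1 (diffusion restricted to a single axis), which is why axonal damage in coherent white matter tracts such as the optic tract appears as reduced FA.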
Liu, Sifei; Zhang, Guangrui; Qiu, Ying; Wang, Xiaobo; Guo, Lihan; Zhao, Yanxin; Tong, Meng; Wei, Lan; Sun, Lixin
2016-12-01
In this study, we aimed to establish a comprehensive and practical quality evaluation system for Shenmaidihuang pills. A simple and reliable high-performance liquid chromatography method coupled with photodiode array detection was developed for both fingerprint analysis and quantitative determination. In the fingerprint analysis, relative retention time and relative peak area were used to identify the common peaks in the 18 samples under investigation. Twenty-one peaks were selected as the common peaks to evaluate the similarities of 18 Shenmaidihuang pills samples with different manufacture dates. Furthermore, similarity analysis was applied to evaluate the similarity of the samples. Hierarchical cluster analysis and principal component analysis were also performed to evaluate the variation of Shenmaidihuang pills. In the quantitative analysis, linear regression, injection precision, recovery, repeatability and sample stability were all tested, and good results were obtained for the simultaneous determination of the seven identified compounds, namely 5-hydroxymethylfurfural, morroniside, loganin, paeonol, paeoniflorin, psoralen and isopsoralen, in Shenmaidihuang pills. The contents of some analytes differed significantly between batches, especially 5-hydroxymethylfurfural. It was therefore concluded that the chromatographic fingerprint method obtained by high-performance liquid chromatography coupled with photodiode array detection, associated with multiple-compound determination, is a powerful and meaningful tool to comprehensively conduct the quality control of Shenmaidihuang pills. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NCI's Proteome Characterization Centers Announced | Office of Cancer Clinical Proteomics Research
The National Cancer Institute (NCI), part of the National Institutes of Health, announces the launch of a Clinical Proteomic Tumor Analysis Consortium (CPTAC). CPTAC is a comprehensive, coordinated team effort to accelerate the understanding of the molecular basis of cancer through the application of robust, quantitative, proteomic technologies and workflows.
Albach, Daniel; Chanteloup, Jean-Christophe
2015-01-12
A comprehensive experimental benchmarking of Yb(3+):YAG crystalline and co-sintered ceramic disks of similar thickness and doping level is presented in the context of high average power laser amplifier operation. Comparison is performed considering gain, depolarization and wave front deformation quantitative measurements and analysis.
USDA-ARS's Scientific Manuscript database
More knowledge about diversity of Quantitative Trait Loci (QTL) controlling polygenic disease resistance in natural genetic variation of crop species is required for durably improving plant genetic resistances to pathogens. Polygenic partial resistance to Aphanomyces root rot, due to Aphanomcyces eu...
ERIC Educational Resources Information Center
Krausert, Christopher R.; Ying, Di; Zhang, Yu; Jiang, Jack J.
2011-01-01
Purpose: Digital kymography and vocal fold curve fitting are blended with detailed symmetry analysis of kymograms to provide a comprehensive characterization of the vibratory properties of injured vocal folds. Method: Vocal fold vibration of 12 excised canine larynges was recorded under uninjured, unilaterally injured, and bilaterally injured…
Factors Facilitating Implicit Learning: The Case of the Sesotho Passive
ERIC Educational Resources Information Center
Kline, Melissa; Demuth, Katherine
2010-01-01
Researchers have long debated the mechanisms underlying the learning of syntactic structure. Of significant interest has been the fact that passive constructions appear to be learned earlier in Sesotho than English. This paper provides a comprehensive, quantitative analysis of the passive input Sesotho-speaking children hear, how it differs from…
How to Perform a Systematic Review and Meta-analysis of Diagnostic Imaging Studies.
Cronin, Paul; Kelly, Aine Marie; Altaee, Duaa; Foerster, Bradley; Petrou, Myria; Dwamena, Ben A
2018-05-01
A systematic review is a comprehensive search, critical evaluation, and synthesis of all the relevant studies on a specific (clinical) topic that can be applied to the evaluation of diagnostic and screening imaging studies. It can be a qualitative or a quantitative (meta-analysis) review of available literature. A meta-analysis uses statistical methods to combine and summarize the results of several studies. In this review, a 12-step approach to performing a systematic review (and meta-analysis) is outlined under the four domains: (1) Problem Formulation and Data Acquisition, (2) Quality Appraisal of Eligible Studies, (3) Statistical Analysis of Quantitative Data, and (4) Clinical Interpretation of the Evidence. This review is specifically geared toward the performance of a systematic review and meta-analysis of diagnostic test accuracy (imaging) studies. Copyright © 2018 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
Safety evaluation methodology for advanced coal extraction systems
NASA Technical Reports Server (NTRS)
Zimmerman, W. F.
1981-01-01
Qualitative and quantitative evaluation methods for coal extraction systems were developed. The analysis examines the soundness of the design, whether or not the major hazards have been eliminated or reduced, and how the reduction would be accomplished. The quantitative methodology establishes the approximate impact of hazards on injury levels. The results are weighted by peculiar geological elements, specialized safety training, peculiar mine environmental aspects, and reductions in labor force. The outcome is compared with injury level requirements based on similar, safer industries to get a measure of the new system's success in reducing injuries. This approach provides a more detailed and comprehensive analysis of hazards and their effects than existing safety analyses.
Buenrostro, Jason D.; Chircus, Lauren M.; Araya, Carlos L.; Layton, Curtis J.; Chang, Howard Y.; Snyder, Michael P.; Greenleaf, William J.
2015-01-01
RNA-protein interactions drive fundamental biological processes and are targets for molecular engineering, yet quantitative and comprehensive understanding of the sequence determinants of affinity remains limited. Here we repurpose a high-throughput sequencing instrument to quantitatively measure binding and dissociation of MS2 coat protein to >10^7 RNA targets generated on a flow-cell surface by in situ transcription and inter-molecular tethering of RNA to DNA. We decompose the binding energy contributions from primary and secondary RNA structure, finding that differences in affinity are often driven by sequence-specific changes in association rates. By analyzing the biophysical constraints and modeling mutational paths describing the molecular evolution of MS2 from low- to high-affinity hairpins, we quantify widespread molecular epistasis, and a long-hypothesized structure-dependent preference for G:U base pairs over C:A intermediates in evolutionary trajectories. Our results suggest that quantitative analysis of RNA on a massively parallel array (RNA-MaP) can reveal sequence-affinity relationships across molecular variants. PMID:24727714
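The per-variant affinities underlying such an analysis are typically obtained by fitting a single-site binding isotherm to the measured fraction bound. The sketch below uses hypothetical concentration-response data and a generic Langmuir fit; the actual RNA-MaP analysis also models association and dissociation kinetics, so this is only an illustrative simplification.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical equilibrium binding data for one RNA variant:
# protein concentration (nM) vs. fraction of RNA bound.
conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])
frac_bound = np.array([0.02, 0.07, 0.20, 0.45, 0.73, 0.90, 0.96])

def langmuir(c, kd, fmax):
    """Single-site binding isotherm: fraction bound vs. concentration."""
    return fmax * c / (kd + c)

(kd, fmax), _ = curve_fit(langmuir, conc, frac_bound, p0=[5.0, 1.0])
print(f"fitted Kd ~ {kd:.1f} nM, saturation ~ {fmax:.2f}")
```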
Shellie, Robert; Marriott, Philip; Morrison, Paul
2004-09-01
The use of gas chromatography (GC)-mass spectrometry (MS), GC-time-of-flight MS (TOFMS), comprehensive two-dimensional GC (GCxGC)-flame ionization detection (FID), and GCxGC-TOFMS is discussed for the characterization of the eight important representative components, including Z-alpha-santalol, epi-alpha-bisabolol, Z-alpha-trans-bergamotol, epi-beta-santalol, Z-beta-santalol, E,E-farnesol, Z-nuciferol, and Z-lanceol, in the oil of west Australian sandalwood (Santalum spicatum). Single-column GC-MS lacks the resolving power to separate all of the listed components as pure peaks and allow precise analytical measurement of individual component abundances. With enhanced peak resolution capabilities in GCxGC, these components are sufficiently well resolved to be quantitated using flame ionization detection, following initial characterization of components by using GCxGC-TOFMS.
Mass spectrometry as a quantitative tool in plant metabolomics
Jorge, Tiago F.; Mata, Ana T.
2016-01-01
Metabolomics is a research field used to acquire comprehensive information on the composition of a metabolite pool to provide a functional screen of the cellular state. Studies of the plant metabolome include the analysis of a wide range of chemical species with very diverse physico-chemical properties, and therefore powerful analytical tools are required for the separation, characterization and quantification of this vast compound diversity present in plant matrices. In this review, challenges in the use of mass spectrometry (MS) as a quantitative tool in plant metabolomics experiments are discussed, and important criteria for the development and validation of MS-based analytical methods provided. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644967
Egorov, Evgeny S; Merzlyak, Ekaterina M; Shelenkov, Andrew A; Britanova, Olga V; Sharonov, George V; Staroverov, Dmitriy B; Bolotin, Dmitriy A; Davydov, Alexey N; Barsova, Ekaterina; Lebedev, Yuriy B; Shugay, Mikhail; Chudakov, Dmitriy M
2015-06-15
Emerging high-throughput sequencing methods for the analyses of complex structure of TCR and BCR repertoires give a powerful impulse to adaptive immunity studies. However, there are still essential technical obstacles for performing a truly quantitative analysis. Specifically, it remains challenging to obtain comprehensive information on the clonal composition of small lymphocyte populations, such as Ag-specific, functional, or tissue-resident cell subsets isolated by sorting, microdissection, or fine needle aspirates. In this study, we report a robust approach based on unique molecular identifiers that allows profiling Ag receptors for several hundred to thousand lymphocytes while preserving qualitative and quantitative information on clonal composition of the sample. We also describe several general features regarding the data analysis with unique molecular identifiers that are critical for accurate counting of starting molecules in high-throughput sequencing applications. Copyright © 2015 by The American Association of Immunologists, Inc.
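The core idea of unique molecular identifier (UMI) counting can be sketched as grouping reads by UMI so that each starting cDNA molecule is counted once; the reads and clonotype sequences below are hypothetical, and real pipelines additionally correct UMI sequencing errors before counting.

```python
from collections import defaultdict

# Hypothetical sequencing reads: (UMI, clonotype/CDR3 sequence) pairs.
reads = [
    ("ACGTGTAC", "CASSLGQETQYF"),
    ("ACGTGTAC", "CASSLGQETQYF"),   # PCR duplicate of the same molecule
    ("TTGACCGA", "CASSLGQETQYF"),
    ("GGCATACT", "CASSIRSSYEQYF"),
]

# Collapse reads sharing the same UMI so each starting cDNA molecule
# is counted once, preserving quantitative clonal composition.
molecules = defaultdict(set)
for umi, clonotype in reads:
    molecules[clonotype].add(umi)

for clonotype, umis in molecules.items():
    print(f"{clonotype}: {len(umis)} molecules (from UMIs {sorted(umis)})")
```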
NASA Astrophysics Data System (ADS)
Zhao, Minzhi; Li, Haiyun; Liu, Xiaochen; Wei, Jie; Ji, Jianguo; Yang, Shu; Hu, Zhiyuan; Wei, Shicheng
2016-03-01
Nano-sized hydroxyapatite (n-HA) is considered a bioactive material and is often mixed into the bone implant material polyetheretherketone (PEEK). To reveal the global protein expression changes of osteoblasts in direct contact with a PEEK composite containing a high level (40%) of nano-sized hydroxyapatite (n-HA/PEEK), and to explain its comprehensive bio-effects, quantitative proteomic analysis was conducted on human osteoblast-like MG-63 cells cultured on n-HA/PEEK in comparison with pure PEEK. The quantitative proteomic analysis showed that the most enriched categories among the up-regulated proteins were related to calcium ion processes and associated functions, while the most enriched categories among the down-regulated proteins were related to RNA processing. This enhanced our understanding of the molecular mechanism behind the promotion of cell adhesion and differentiation and the inhibition of cell proliferation on the n-HA/PEEK composite. It also showed that, although the calcium ion level of the culture environment had not increased, the calcium fixed on the surface of the material alone influenced intracellular calcium-related processes, which was also reflected by the higher intracellular Ca2+ concentration on n-HA/PEEK. This study could lead to a more comprehensive understanding of the versatile biocompatibility of composite materials. It further demonstrates that proteomics is useful for discovering new bio-effects.
Zhao, Minzhi; Li, Haiyun; Liu, Xiaochen; Wei, Jie; Ji, Jianguo; Yang, Shu; Hu, Zhiyuan; Wei, Shicheng
2016-03-09
Nano-sized hydroxyapatite (n-HA) is considered a bioactive material and is often mixed into the bone implant material polyetheretherketone (PEEK). To reveal the global protein expression changes of osteoblasts in direct contact with a PEEK composite containing a high level (40%) of nano-sized hydroxyapatite (n-HA/PEEK), and to explain its comprehensive bio-effects, quantitative proteomic analysis was conducted on human osteoblast-like MG-63 cells cultured on n-HA/PEEK in comparison with pure PEEK. The quantitative proteomic analysis showed that the most enriched categories among the up-regulated proteins were related to calcium ion processes and associated functions, while the most enriched categories among the down-regulated proteins were related to RNA processing. This enhanced our understanding of the molecular mechanism behind the promotion of cell adhesion and differentiation and the inhibition of cell proliferation on the n-HA/PEEK composite. It also showed that, although the calcium ion level of the culture environment had not increased, the calcium fixed on the surface of the material alone influenced intracellular calcium-related processes, which was also reflected by the higher intracellular Ca(2+) concentration on n-HA/PEEK. This study could lead to a more comprehensive understanding of the versatile biocompatibility of composite materials. It further demonstrates that proteomics is useful for discovering new bio-effects.
Wong, Tin-Long; An, Ya-Qi; Yan, Bing-Chao; Yue, Rui-Qi; Zhang, Tian-Bo; Ho, Hing-Man; Ren, Tian-Jing; Fung, Hau-Yee; Ma, Dik-Lung; Leung, Chung-Hang; Liu, Zhong-Liang; Pu, Jian-Xin; Han, Quan-Bin; Sun, Han-Dong
2016-06-05
YinHuang drop pill (YHDP) is a new preparation, derived from the traditional YinHuang (YH) decoction. Since drop pills are one of the newly developed forms of Chinese patent drugs, not much research has been done regarding their quality and efficacy. This study aims to establish a comprehensive quantitative analysis of the chemical profile of YHDP. Ultra-high performance liquid chromatography quadrupole time-of-flight mass spectrometry (UHPLC-Q-TOF-MS/MS) was used to identify 34 non-sugar small molecules, including 15 flavonoids, 9 phenolic acids, 5 saponins, 1 iridoid, and 4 iridoid glycosides, in YHDP samples, and 26 of them were quantitatively determined. The sugar composition of YHDP in terms of fructose, glucose and sucrose was examined via high performance liquid chromatography with an evaporative light scattering detector on an amide column (HPLC-NH2P-ELSD). Macromolecules were examined by high performance gel permeation chromatography coupled with ELSD (HPGPC-ELSD). The content of the drop pill's skeleton component PEG-4000 was also quantified via ultra-high performance liquid chromatography coupled with a charged aerosol detector (UHPLC-CAD). The results showed that up to 73% (w/w) of YHDP could be quantitatively determined. Small molecules accounted for approximately 5%, PEG-4000 represented 68%, while no sugars or macromolecules were found. Furthermore, YHDP showed no significant differences in terms of daily dosage, compared to YinHuang granules and YinHuang oral liquid; however, it has a higher small-molecule content compared to YinHuang lozenge. Copyright © 2016 Elsevier B.V. All rights reserved.
On aerodynamic wake analysis and its relation to total aerodynamic drag in a wind tunnel environment
NASA Astrophysics Data System (ADS)
Guterres, Rui M.
The present work was developed with the goal of advancing the state of the art in the application of three-dimensional wake data analysis to the quantification of aerodynamic drag on a body in a low speed wind tunnel environment. Analysis of the existing tools, their strengths and limitations is presented. Improvements to the existing analysis approaches were made. Software tools were developed to integrate the analysis into a practical tool. A comprehensive derivation of the equations needed for drag computations based on three-dimensional separated wake data is developed. A set of complete steps ranging from the basic mathematical concept to the applicable engineering equations is presented. An extensive experimental study was conducted. Three representative body types were studied in varying ground effect conditions. A detailed qualitative wake analysis using wake imaging and two- and three-dimensional flow visualization was performed. Several significant features of the flow were identified and their relation to the total aerodynamic drag established. A comprehensive wake study of this type is shown to be in itself a powerful tool for the analysis of the wake aerodynamics and its relation to body drag. Quantitative wake analysis techniques were developed. Significant post-processing and data-conditioning tools and precision analysis were developed. The quality of the data is shown to be in direct correlation with the accuracy of the computed aerodynamic drag. Steps are taken to identify the sources of uncertainty. These are quantified when possible and the accuracy of the computed results is seen to significantly improve. When post-processing alone does not resolve issues related to precision and accuracy, solutions are proposed. The improved quantitative wake analysis is applied to the wake data obtained. Guidelines are established that will lead to more successful implementation of these tools in future research programs. Close attention is paid to implementation issues that are of crucial importance for the accuracy of the results and that are not detailed in the literature. The impact of ground effect on the flows at hand is qualitatively and quantitatively studied. Its impact on the accuracy of the computations, as well as the wall-drag incompatibility with the theoretical model followed, are discussed. The newly developed quantitative analysis provides significantly increased accuracy. The aerodynamic drag coefficient is computed within one percent of the balance-measured value for the best cases.
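For context, the classical incompressible wake-survey result that underlies such drag computations expresses the drag as a momentum-deficit plus pressure-deficit integral over the wake plane; this is the textbook form, not necessarily the exact set of equations derived in the thesis.

```latex
D \;=\; \rho \iint_{\mathrm{wake}} u\,\bigl(U_\infty - u\bigr)\,\mathrm{d}A
\;+\; \iint_{\mathrm{wake}} \bigl(p_\infty - p\bigr)\,\mathrm{d}A
```

Here u and p are the measured streamwise velocity and static pressure in the wake plane, and U∞ and p∞ are the corresponding freestream values; the measurement precision of u and p feeds directly into the drag accuracy, which is why the data-conditioning and uncertainty steps above matter.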
Study on the Application of TOPSIS Method to the Introduction of Foreign Players in CBA Games
NASA Astrophysics Data System (ADS)
Zhongyou, Xing
The TOPSIS method is a multiple attribute decision-making method. This paper introduces the current situation of the introduction of foreign players in CBA games, presents the principles and calculation steps of the TOPSIS method in detail, and applies it to the quantitative evaluation of the comprehensive competitive ability of introduced foreign players. Through analysis of a practical application, we found that the TOPSIS method has relatively high rationality and applicability when used to evaluate the comprehensive competitive ability of introduced foreign players.
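The calculation steps referred to above follow the standard TOPSIS recipe (normalize, weight, find the ideal and anti-ideal solutions, rank by relative closeness). The Python sketch below illustrates them with hypothetical player statistics, weights, and criterion directions; none of these numbers come from the paper.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Minimal TOPSIS ranking sketch.
    matrix:  alternatives x criteria decision matrix
    weights: criterion weights summing to 1
    benefit: True for larger-is-better criteria, False for cost criteria
    """
    m = np.asarray(matrix, dtype=float)
    # 1. Vector-normalize each criterion column, then apply the weights.
    v = (m / np.linalg.norm(m, axis=0)) * np.asarray(weights)
    # 2. Positive/negative ideal solutions, respecting each criterion's direction.
    ideal_best = np.where(benefit, v.max(axis=0), v.min(axis=0))
    ideal_worst = np.where(benefit, v.min(axis=0), v.max(axis=0))
    # 3. Euclidean distances to the ideals and relative closeness.
    d_best = np.linalg.norm(v - ideal_best, axis=1)
    d_worst = np.linalg.norm(v - ideal_worst, axis=1)
    return d_worst / (d_best + d_worst)  # higher = closer to the ideal

# Hypothetical foreign players scored on points, rebounds, assists, turnovers.
players = [[22, 9, 4, 3.1], [18, 11, 6, 2.4], [25, 6, 3, 3.8]]
closeness = topsis(players, [0.4, 0.25, 0.25, 0.1],
                   benefit=[True, True, True, False])
print(closeness.round(3))
```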
Student Learning with Performance-Based, In-Class and Learner-Centered, Online Exams
ERIC Educational Resources Information Center
Greenberg, Katherine; Lester, Jessica N.; Evans, Kathy; Williams, Michele; Hacker, Carolyn; Halic, Olivia
2008-01-01
The purpose of this study was to explore the experience of students with performance-based, in-class and learner-centered, online assessment and the effects of these formats on comprehensive exam scores in an educational psychology course required of participants in a teacher education program. In our quantitative analysis, we investigated the…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gutmacher, R.; Crawford, R.
This comprehensive guide to the analytical capabilities of Lawrence Livermore Laboratory's General Chemistry Division describes each analytical method in terms of its principle, field of application, and qualitative and quantitative uses. Also described are the state and quantity of sample required for analysis, processing time, available instrumentation, and responsible personnel.
ERIC Educational Resources Information Center
Nishiyama, Takeshi; Suzuki, Masako; Adachi, Katsunori; Sumi, Satoshi; Okada, Kensuke; Kishino, Hirohisa; Sakai, Saeko; Kamio, Yoko; Kojima, Masayo; Suzuki, Sadao; Kanne, Stephen M.
2014-01-01
We comprehensively compared all available questionnaires for measuring quantitative autistic traits (QATs) in terms of reliability and construct validity in 3,147 non-clinical and 60 clinical subjects with normal intelligence. We examined four full-length forms, the Subthreshold Autism Trait Questionnaire (SATQ), the Broader Autism Phenotype…
DOE Office of Scientific and Technical Information (OSTI.GOV)
George A. Beitel
2004-02-01
In support of a national need to improve the current state-of-the-art in alerting decision makers to the risk of terrorist attack, a quantitative approach employing scientific and engineering concepts to develop a threat-risk index was undertaken at the Idaho National Engineering and Environmental Laboratory (INEEL). As a result of this effort, a set of models has been successfully integrated into a single comprehensive model known as Quantitative Threat-Risk Index Model (QTRIM), with the capability of computing a quantitative threat-risk index on a system level, as well as for the major components of the system. Such a threat-risk index could provide a quantitative variant or basis for either prioritizing security upgrades or updating the current qualitative national color-coded terrorist threat alert.
Nonlinear optical microscopy and ultrasound imaging of human cervical structure
NASA Astrophysics Data System (ADS)
Reusch, Lisa M.; Feltovich, Helen; Carlson, Lindsey C.; Hall, Gunnsteinn; Campagnola, Paul J.; Eliceiri, Kevin W.; Hall, Timothy J.
2013-03-01
The cervix softens and shortens as its collagen microstructure rearranges in preparation for birth, but premature change may lead to premature birth. The global preterm birth rate has not decreased despite decades of research, likely because cervical microstructure is poorly understood. Our group has developed a multilevel approach to evaluating the human cervix. We are developing quantitative ultrasound (QUS) techniques for noninvasive interrogation of cervical microstructure and corroborating those results with high-resolution images of microstructure from second harmonic generation imaging (SHG) microscopy. We obtain ultrasound measurements from hysterectomy specimens, prepare the tissue for SHG, and stitch together several hundred images to create a comprehensive view of large areas of cervix. The images are analyzed for collagen orientation and alignment with curvelet transform, and registered with QUS data, facilitating multiscale analysis in which the micron-scale SHG images and millimeter-scale ultrasound data interpretation inform each other. This novel combination of modalities allows comprehensive characterization of cervical microstructure in high resolution. Through a detailed comparative study, we demonstrate that SHG imaging both corroborates the quantitative ultrasound measurements and provides further insight. Ultimately, a comprehensive understanding of specific microstructural cervical change in pregnancy should lead to novel approaches to the prevention of preterm birth.
Structure and Function of Iron-Loaded Synthetic Melanin
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Yiwen; Xie, Yijun; Wang, Zhao
We describe a synthetic method for increasing and controlling the iron loading of synthetic melanin nanoparticles and use the resulting materials to perform a systematic quantitative investigation on their structure-property relationship. A comprehensive analysis by magnetometry, electron paramagnetic resonance, and nuclear magnetic relaxation dispersion reveals the complexities of their magnetic behavior and how these intraparticle magnetic interactions manifest in useful material properties such as their performance as MRI contrast agents. This analysis allows predictions of the optimal iron loading through a quantitative modeling of antiferromagnetic coupling that arises from proximal iron ions. This study provides a detailed understanding of this complex class of synthetic biomaterials and gives insight into interactions and structures prevalent in naturally occurring melanins.
Validation of a Three-Dimensional Method for Counting and Sizing Podocytes in Whole Glomeruli
van der Wolde, James W.; Schulze, Keith E.; Short, Kieran M.; Wong, Milagros N.; Bensley, Jonathan G.; Cullen-McEwen, Luise A.; Caruana, Georgina; Hokke, Stacey N.; Li, Jinhua; Firth, Stephen D.; Harper, Ian S.; Nikolic-Paterson, David J.; Bertram, John F.
2016-01-01
Podocyte depletion is sufficient for the development of numerous glomerular diseases and can be absolute (loss of podocytes) or relative (reduced number of podocytes per volume of glomerulus). Commonly used methods to quantify podocyte depletion introduce bias, whereas gold standard stereologic methodologies are time consuming and impractical. We developed a novel approach for assessing podocyte depletion in whole glomeruli that combines immunofluorescence, optical clearing, confocal microscopy, and three-dimensional analysis. We validated this method in a transgenic mouse model of selective podocyte depletion, in which we determined dose-dependent alterations in several quantitative indices of podocyte depletion. This new approach provides a quantitative tool for the comprehensive and time-efficient analysis of podocyte depletion in whole glomeruli. PMID:26975438
Tebani, Abdellah; Afonso, Carlos; Bekri, Soumeya
2018-05-01
Metabolites are small molecules produced by enzymatic reactions in a given organism. Metabolomics or metabolic phenotyping is a well-established omics aimed at comprehensively assessing metabolites in biological systems. These comprehensive analyses use analytical platforms, mainly nuclear magnetic resonance spectroscopy and mass spectrometry, along with associated separation methods to gather qualitative and quantitative data. Metabolomics holistically evaluates biological systems in an unbiased, data-driven approach that may ultimately support generation of hypotheses. The approach inherently allows the molecular characterization of a biological sample with regard to both internal (genetics) and environmental (exosome, microbiome) influences. Metabolomics workflows are based on whether the investigator knows a priori what kind of metabolites to assess. Thus, a targeted metabolomics approach is defined as a quantitative analysis (absolute concentrations are determined) or a semiquantitative analysis (relative intensities are determined) of a set of metabolites that are possibly linked to common chemical classes or a selected metabolic pathway. An untargeted metabolomics approach is a semiquantitative analysis of the largest possible number of metabolites contained in a biological sample. This is part I of a review intending to give an overview of the state of the art of major metabolic phenotyping technologies. Furthermore, their inherent analytical advantages and limits regarding experimental design, sample handling, standardization and workflow challenges are discussed.
The Perseus computational platform for comprehensive analysis of (prote)omics data.
Tyanova, Stefka; Temu, Tikira; Sinitcyn, Pavel; Carlson, Arthur; Hein, Marco Y; Geiger, Tamar; Mann, Matthias; Cox, Jürgen
2016-09-01
A main bottleneck in proteomics is the downstream biological analysis of highly multivariate quantitative protein abundance data generated using mass-spectrometry-based analysis. We developed the Perseus software platform (http://www.perseus-framework.org) to support biological and biomedical researchers in interpreting protein quantification, interaction and post-translational modification data. Perseus contains a comprehensive portfolio of statistical tools for high-dimensional omics data analysis covering normalization, pattern recognition, time-series analysis, cross-omics comparisons and multiple-hypothesis testing. A machine learning module supports the classification and validation of patient groups for diagnosis and prognosis, and it also detects predictive protein signatures. Central to Perseus is a user-friendly, interactive workflow environment that provides complete documentation of computational methods used in a publication. All activities in Perseus are realized as plugins, and users can extend the software by programming their own, which can be shared through a plugin store. We anticipate that Perseus's arsenal of algorithms and its intuitive usability will empower interdisciplinary analysis of complex large data sets.
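As one concrete example of the multiple-hypothesis testing such platforms provide, the sketch below implements the Benjamini-Hochberg step-up procedure on a list of p-values; it is a generic illustration in Python, not Perseus code.

```python
import numpy as np

def benjamini_hochberg(pvalues, alpha=0.05):
    """Return a boolean mask of p-values significant at FDR <= alpha
    using the Benjamini-Hochberg step-up procedure."""
    p = np.asarray(pvalues)
    order = np.argsort(p)
    ranked = p[order]
    n = len(p)
    thresholds = alpha * (np.arange(1, n + 1) / n)   # i/n * alpha for rank i
    below = ranked <= thresholds
    significant = np.zeros(n, dtype=bool)
    if below.any():
        cutoff = np.nonzero(below)[0].max()          # largest rank k meeting the criterion
        significant[order[:cutoff + 1]] = True       # all ranks up to k are significant
    return significant

pvals = [0.001, 0.008, 0.039, 0.041, 0.27, 0.76]
print(benjamini_hochberg(pvals))
```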
Industrial ecology: Quantitative methods for exploring a lower carbon future
NASA Astrophysics Data System (ADS)
Thomas, Valerie M.
2015-03-01
Quantitative methods for environmental and cost analyses of energy, industrial, and infrastructure systems are briefly introduced and surveyed, with the aim of encouraging broader utilization and development of quantitative methods in sustainable energy research. Material and energy flow analyses can provide an overall system overview. The methods of engineering economics and cost benefit analysis, such as net present values, are the most straightforward approach for evaluating investment options, with the levelized cost of energy being a widely used metric in electricity analyses. Environmental lifecycle assessment has been extensively developed, with both detailed process-based and comprehensive input-output approaches available. Optimization methods provide an opportunity to go beyond engineering economics to develop detailed least-cost or least-impact combinations of many different choices.
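The engineering-economics metrics mentioned here reduce to short formulas; the sketch below computes a net present value and a levelized cost of energy for a hypothetical project, with all project numbers being illustrative assumptions rather than data from the article.

```python
def npv(rate, cashflows):
    """Net present value of a cash-flow series; cashflows[0] occurs in year 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def lcoe(capex, annual_opex, annual_energy_kwh, rate, years):
    """Levelized cost of energy: discounted lifetime costs / discounted lifetime energy."""
    cost = capex + sum(annual_opex / (1 + rate) ** t for t in range(1, years + 1))
    energy = sum(annual_energy_kwh / (1 + rate) ** t for t in range(1, years + 1))
    return cost / energy

# Hypothetical solar project: $1.2M capex, $15k/yr O&M, 1.8 GWh/yr, 25 years, 6% discount rate.
print(f"LCOE ~ ${lcoe(1.2e6, 15e3, 1.8e6, 0.06, 25):.3f}/kWh")
print(f"NPV of [-100, 40, 40, 40] at 8%: {npv(0.08, [-100, 40, 40, 40]):.1f}")
```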
Tan, Peng; Zhang, Hai-Zhu; Zhang, Ding-Kun; Wu, Shan-Na; Niu, Ming; Wang, Jia-Bo; Xiao, Xiao-He
2017-07-01
This study attempts to evaluate the quality of Chinese formula granules by the combined use of multi-component simultaneous quantitative analysis and bioassay. Rhubarb dispensing granules were used as the model drug for this demonstrative study. An ultra-high performance liquid chromatography (UPLC) method was adopted for the simultaneous quantitative determination of 10 anthraquinone derivatives (such as aloe emodin-8-O-β-D-glucoside) in rhubarb dispensing granules; the purgative biopotency of different batches of rhubarb dispensing granules was determined based on a compound diphenoxylate tablets-induced mouse constipation model; the blood-activating biopotency of different batches of rhubarb dispensing granules was determined based on an in vitro rat antiplatelet aggregation model; SPSS 22.0 statistical software was used for correlation analysis between the 10 anthraquinone derivatives and the purgative and blood-activating biopotencies. The results of the multi-component simultaneous quantitative analysis showed that there was a great difference in chemical characterization and certain differences in purgative biopotency and blood-activating biopotency among the 10 batches of rhubarb dispensing granules. The correlation analysis showed that the intensity of purgative biopotency was significantly correlated with the content of conjugated anthraquinone glycosides (P<0.01), and the intensity of blood-activating biopotency was significantly correlated with the content of free anthraquinone (P<0.01). In summary, the combined use of multi-component simultaneous quantitative analysis and bioassay can achieve objective quantification and a more comprehensive reflection of the overall quality difference among different batches of rhubarb dispensing granules. Copyright© by the Chinese Pharmaceutical Association.
A methodological analysis of chaplaincy research: 2000-2009.
Galek, Kathleen; Flannelly, Kevin J; Jankowski, Katherine R B; Handzo, George F
2011-01-01
The present article presents a comprehensive review and analysis of quantitative research conducted in the United States on chaplaincy and closely related topics published between 2000 and 2009. A combined search strategy identified 49 quantitative studies in 13 journals. The analysis focuses on the methodological sophistication of the studies, compared to earlier research on chaplaincy and pastoral care. Cross-sectional surveys of convenience samples still dominate the field, but sample sizes have increased somewhat over the past three decades. Reporting of the validity and reliability of measures continues to be low, although reporting of response rates has improved. Improvements in the use of inferential statistics and statistical controls were also observed, compared to previous research. The authors conclude that more experimental research is needed on chaplaincy, along with an increased use of hypothesis testing, regardless of the research designs that are used.
NASA Astrophysics Data System (ADS)
Neiles, Kelly Y.
There is great concern in the scientific community that students in the United States, when compared with other countries, are falling behind in their scientific achievement. Increasing students' reading comprehension of scientific text may be one of the components involved in students' science achievement. To investigate students' reading comprehension, this quantitative study examined the effects of different reader characteristics, namely, students' logical reasoning ability, factual chemistry knowledge, working memory capacity, and schema of the chemistry concepts, on reading comprehension of a chemistry text. Students' reading comprehension was measured through their ability to encode the text, access the meanings of words (lexical access), make bridging and elaborative inferences, and integrate the text with their existing schemas to make a lasting mental representation of the text (situational model). Students completed a series of tasks that measured the reader characteristic and reading comprehension variables. Some of the variables were measured using new technologies and software to investigate different cognitive processes. These technologies and software included eye tracking to investigate students' lexical access and a Pathfinder program to investigate students' schema of the chemistry concepts. The results from this study were analyzed using canonical correlation and regression analysis. The canonical correlation analysis allows for the ten variables described previously to be included in one multivariate analysis. Results indicate that the relationship between the reader characteristic variables and the reading comprehension variables is significant. The resulting canonical function accounts for a greater amount of variance in students' responses than any individual variable. Regression analysis was used to further investigate which reader characteristic variables accounted for the differences in students' responses for each reading comprehension variable. The results from this regression analysis indicated that the two schema measures (measured by the Pathfinder program) accounted for the greatest amount of variance in four of the reading comprehension variables (encoding the text, bridging and elaborative inferences, and delayed recall of a general summary). This research suggests that providing students with background information on chemistry concepts prior to having them read the text may result in better understanding and more effective incorporation of the chemistry concepts into their schema.
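The canonical correlation step described in this abstract can be illustrated generically: given a reader-characteristics block X and a reading-comprehension block Y, paired canonical variates are extracted and their correlations reported. The sketch below uses simulated data and scikit-learn's CCA; it is not the study's actual analysis, and the variable counts are assumptions.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

# Simulated data for 60 students:
# X = reader characteristics (reasoning, chemistry knowledge, working memory, 2 schema measures)
# Y = reading-comprehension measures (encoding, lexical access, inferences, situational model, recall)
rng = np.random.default_rng(2)
X = rng.normal(size=(60, 5))
Y = 0.6 * X @ rng.normal(size=(5, 5)) + rng.normal(size=(60, 5))

cca = CCA(n_components=2).fit(X, Y)
Xc, Yc = cca.transform(X, Y)
# Canonical correlations = correlations between the paired canonical variates.
canonical_r = [np.corrcoef(Xc[:, i], Yc[:, i])[0, 1] for i in range(2)]
print(np.round(canonical_r, 3))
```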
NASA Astrophysics Data System (ADS)
Clevers, Jan G. P. W.
2018-05-01
This book provides a comprehensive and timely overview on all aspects of hyperspectral remote sensing combined with various applications. As such, it is an excellent book of reference for both students and professionals active in the field of optical remote sensing. It deals with all aspects of retrieving quantitative information on biophysical properties of the Earth's surface, the data corrections needed and the range of analysis approaches available.
Zhu, Ying; Zhang, Yun-Xia; Liu, Wen-Wen; Ma, Yan; Fang, Qun; Yao, Bo
2015-04-01
This paper describes a nanoliter droplet array-based single-cell reverse transcription quantitative PCR (RT-qPCR) assay method for quantifying gene expression in individual cells. By sequentially printing nanoliter-scale droplets on a microchip using a microfluidic robot, all liquid-handling operations, including cell encapsulation, lysis, reverse transcription, and quantitative PCR with real-time fluorescence detection, can be automatically achieved. The inhibition effect of the cell suspension buffer on the RT-PCR assay was comprehensively studied to achieve high-sensitivity gene quantification. The present system was applied to the quantitative measurement of the expression level of mir-122 in single Huh-7 cells. A wide distribution of mir-122 expression in single cells, from 3061 copies/cell to 79998 copies/cell, was observed, showing a high level of cell heterogeneity. With the advantages of full automation in liquid handling, a simple system structure, and flexibility in achieving multi-step operations, the present method provides a novel liquid-handling mode for single-cell gene expression analysis, and has significant potential in transcriptional identification and rare cell analysis.
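Absolute copy numbers such as the mir-122 values reported here are commonly derived from a standard curve relating Ct to log10 copy number; the sketch below shows that conversion with hypothetical calibration data, and the authors' exact calibration procedure may differ.

```python
import numpy as np

# Hypothetical standard curve: Ct values measured for known mir-122 copy numbers.
std_copies = np.array([1e3, 1e4, 1e5, 1e6, 1e7])
std_ct = np.array([31.2, 27.9, 24.5, 21.1, 17.8])

# Fit Ct = slope * log10(copies) + intercept (slope near -3.3 implies ~100% efficiency).
slope, intercept = np.polyfit(np.log10(std_copies), std_ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0

def copies_from_ct(ct):
    """Convert a measured Ct back to absolute copy number via the standard curve."""
    return 10 ** ((ct - intercept) / slope)

print(f"amplification efficiency ~ {efficiency:.0%}")
print(f"Ct 25.3 ~ {copies_from_ct(25.3):,.0f} copies (per cell, assuming one cell per droplet reaction)")
```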
Zhu, Ying; Zhang, Yun-Xia; Liu, Wen-Wen; Ma, Yan; Fang, Qun; Yao, Bo
2015-01-01
This paper describes a nanoliter droplet array-based single-cell reverse transcription quantitative PCR (RT-qPCR) assay method for quantifying gene expression in individual cells. By sequentially printing nanoliter-scale droplets on a microchip using a microfluidic robot, all liquid-handling operations, including cell encapsulation, lysis, reverse transcription, and quantitative PCR with real-time fluorescence detection, can be automatically achieved. The inhibition effect of the cell suspension buffer on the RT-PCR assay was comprehensively studied to achieve high-sensitivity gene quantification. The present system was applied to the quantitative measurement of the expression level of mir-122 in single Huh-7 cells. A wide distribution of mir-122 expression in single cells, from 3061 copies/cell to 79998 copies/cell, was observed, showing a high level of cell heterogeneity. With the advantages of full automation in liquid handling, a simple system structure, and flexibility in achieving multi-step operations, the present method provides a novel liquid-handling mode for single-cell gene expression analysis, and has significant potential in transcriptional identification and rare cell analysis. PMID:25828383
Using MetaboAnalyst 3.0 for Comprehensive Metabolomics Data Analysis.
Xia, Jianguo; Wishart, David S
2016-09-07
MetaboAnalyst (http://www.metaboanalyst.ca) is a comprehensive Web application for metabolomic data analysis and interpretation. MetaboAnalyst handles most of the common metabolomic data types from most kinds of metabolomics platforms (MS and NMR) for most kinds of metabolomics experiments (targeted, untargeted, quantitative). In addition to providing a variety of data processing and normalization procedures, MetaboAnalyst also supports a number of data analysis and data visualization tasks using a range of univariate and multivariate methods such as PCA (principal component analysis), PLS-DA (partial least squares discriminant analysis), heatmap clustering and machine learning methods. MetaboAnalyst also offers a variety of tools for metabolomic data interpretation including MSEA (metabolite set enrichment analysis), MetPA (metabolite pathway analysis), and biomarker selection via ROC (receiver operating characteristic) curve analysis, as well as time series and power analysis. This unit provides an overview of the main functional modules and the general workflow of the latest version of MetaboAnalyst (MetaboAnalyst 3.0), followed by eight detailed protocols. Copyright © 2016 John Wiley & Sons, Inc.
MASS SPECTROMETRY-BASED METABOLOMICS
Dettmer, Katja; Aronov, Pavel A.; Hammock, Bruce D.
2007-01-01
This review presents an overview of the dynamically developing field of mass spectrometry-based metabolomics. Metabolomics aims at the comprehensive and quantitative analysis of wide arrays of metabolites in biological samples. These numerous analytes have very diverse physico-chemical properties and occur at different abundance levels. Consequently, comprehensive metabolomics investigations are primarily a challenge for analytical chemistry, and mass spectrometry in particular has vast potential as a tool for this type of investigation. Metabolomics requires special approaches for sample preparation, separation, and mass spectrometric analysis. Current examples of those approaches are described in this review. It primarily focuses on metabolic fingerprinting, a technique that analyzes all detectable analytes in a given sample with subsequent classification of samples and identification of differentially expressed metabolites, which define the sample classes. To perform this complex task, data analysis tools, metabolite libraries, and databases are required. Therefore, recent advances in metabolomics bioinformatics are also discussed. PMID:16921475
The Comprehensive Benefit Evaluation of Take Shared Bicycles as Connecting to Public Transit
NASA Astrophysics Data System (ADS)
Zhang, J. Y.; Sun, H.; Li, P. F.; Li, C. C.
2017-10-01
Shared bicycles are an important way of connecting to public transport, yet few studies have evaluated their effectiveness. Taking Beijing as an example, this paper evaluates six travel-mode combinations commonly used by residents. The authors select four quantitative indicators (fuel consumption, economic cost, total time spent, and CO2 emission) and two qualitative indicators (degree of comfort and convenience). The relative efficiency of the quantitative indicators is obtained by data envelopment analysis (DEA) and fuzzification, and a fuzzy synthetic evaluation is then performed together with the qualitative indicators. It was found that the combinations shared bicycles + subway + shared bicycles and shared bicycles alone have good comprehensive benefits for medium-distance travel. The findings also suggest that shared bicycles + subway + shared bicycles is the best choice for utilitarian trips. The conclusions not only provide suggestions for travellers selecting travel modes, but also allow the relevant factors to be adjusted to increase the proportion of green travel.
A comprehensive and quantitative exploration of thousands of viral genomes
Mahmoudabadi, Gita
2018-01-01
The complete assembly of viral genomes from metagenomic datasets (short genomic sequences gathered from environmental samples) has proven to be challenging, so there are significant blind spots when we view viral genomes through the lens of metagenomics. One approach to overcoming this problem is to leverage the thousands of complete viral genomes that are publicly available. Here we describe our efforts to assemble a comprehensive resource that provides a quantitative snapshot of viral genomic trends – such as gene density, noncoding percentage, and abundances of functional gene categories – across thousands of viral genomes. We have also developed a coarse-grained method for visualizing viral genome organization for hundreds of genomes at once, and have explored the extent of the overlap between bacterial and bacteriophage gene pools. Existing viral classification systems were developed prior to the sequencing era, so we present our analysis in a way that allows us to assess the utility of the different classification systems for capturing genomic trends. PMID:29624169
A comprehensive and quantitative exploration of thousands of viral genomes.
Mahmoudabadi, Gita; Phillips, Rob
2018-04-19
The complete assembly of viral genomes from metagenomic datasets (short genomic sequences gathered from environmental samples) has proven to be challenging, so there are significant blind spots when we view viral genomes through the lens of metagenomics. One approach to overcoming this problem is to leverage the thousands of complete viral genomes that are publicly available. Here we describe our efforts to assemble a comprehensive resource that provides a quantitative snapshot of viral genomic trends - such as gene density, noncoding percentage, and abundances of functional gene categories - across thousands of viral genomes. We have also developed a coarse-grained method for visualizing viral genome organization for hundreds of genomes at once, and have explored the extent of the overlap between bacterial and bacteriophage gene pools. Existing viral classification systems were developed prior to the sequencing era, so we present our analysis in a way that allows us to assess the utility of the different classification systems for capturing genomic trends. © 2018, Mahmoudabadi et al.
Li, Wei; Zhang, Min; Wang, Mingyu; Han, Zhantao; Liu, Jiankai; Chen, Zhezhou; Liu, Bo; Yan, Yan; Liu, Zhu
2018-06-01
Brownfield site pollution and remediation is an urgent environmental issue worldwide. The screening and assessment of remedial alternatives is especially complex owing to multiple criteria involving technique, economy, and policy. To help decision-makers select remedial alternatives efficiently, the criteria framework developed by the U.S. EPA is improved and a comprehensive method that integrates multiple criteria decision analysis (MCDA) with numerical simulation is presented in this paper. The criteria framework is modified and classified into three categories: qualitative, semi-quantitative, and quantitative criteria. The MCDA method AHP-PROMETHEE (analytical hierarchy process-preference ranking organization method for enrichment evaluation) is used to determine the priority ranking of the remedial alternatives, and solute transport simulation is conducted to assess the remedial efficiency. A case study is presented to demonstrate the screening method at a brownfield site in Cangzhou, northern China. The results show that the systematic method provides a reliable way to quantify the priority of the remedial alternatives.
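The abstract names AHP-PROMETHEE as the ranking engine. The following is a minimal sketch of the PROMETHEE II net-flow step only, using a usual (step) preference function; the alternatives, criteria, and AHP-style weights are hypothetical placeholders, not values from the paper.

```python
# A minimal PROMETHEE II sketch for ranking remedial alternatives.
# All scores, criteria, and weights below are hypothetical.
import numpy as np

def promethee_ii(scores, weights, maximize):
    """Rank alternatives by net outranking flow (usual preference function)."""
    n, _ = scores.shape
    # Flip the sign of criteria to be minimized so "larger is better" everywhere.
    adjusted = scores * np.where(maximize, 1.0, -1.0)
    net_flow = np.zeros(n)
    for a in range(n):
        for b in range(n):
            if a == b:
                continue
            diff = adjusted[a] - adjusted[b]
            pref_ab = np.sum(weights * (diff > 0))   # pi(a, b)
            pref_ba = np.sum(weights * (diff < 0))   # pi(b, a)
            net_flow[a] += (pref_ab - pref_ba) / (n - 1)
    return net_flow

# Hypothetical alternatives evaluated on cost, duration, and removal efficiency.
scores = np.array([[1.2, 18, 0.90],    # alternative A
                   [0.8, 30, 0.75],    # alternative B
                   [1.5, 12, 0.95]])   # alternative C
weights = np.array([0.4, 0.2, 0.4])          # assumed AHP-derived weights
maximize = np.array([False, False, True])    # cost and duration are minimized
print(promethee_ii(scores, weights, maximize))  # larger net flow = higher priority
```

A larger net flow marks a more preferred alternative; in the paper this ranking would be combined with the solute transport simulation results rather than used alone.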
NASA Astrophysics Data System (ADS)
Li, Qinghao; Qiao, Ruimin; Wray, L. Andrew; Chen, Jun; Zhuo, Zengqing; Chen, Yanxue; Yan, Shishen; Pan, Feng; Hussain, Zahid; Yang, Wanli
2016-10-01
Most battery positive electrodes operate with a 3d transition-metal (TM) reaction centre. A direct and quantitative probe of the TM states upon electrochemical cycling is valuable for understanding the detailed cycling mechanism and charge diffusion in the electrodes, which is related to many practical parameters of a battery. This review includes a comprehensive summary of our recent demonstrations of five different types of quantitative analysis of the TM states in battery electrodes based on soft x-ray absorption spectroscopy and multiplet calculations. In LiFePO4, a system of the well-known two-phase transformation type, the TM redox could be strictly determined through a simple linear combination of the two end-members. In Mn-based compounds, the Mn states could also be quantitatively evaluated, but a set of reference spectra with all three possible Mn valences needs to be deliberately selected and considered in the fitting. Although the fluorescence signals suffer from self-absorption distortion, the multiplet calculations could account for the distortion effect, which allows a quantitative determination of the overall Ni oxidation state in the bulk. With the aid of multiplet calculations, one could also achieve a quasi-quantitative analysis of the Co redox evolution in LiCoO2 based on the energy position of the spectroscopic peak. The benefit of multiplet calculations is more important for studying electrode materials with TMs of mixed spin states, as exemplified by the quantitative analysis of the mixed-spin Na2-xFe2(CN)6 system. Finally, we showcase that such quantitative analysis could provide valuable information for optimizing the electrochemical performance of Na0.44MnO2 electrodes for Na-ion batteries. The methodology summarized in this review could be extended to other energy application systems with TM redox centres for detailed analysis, for example, fuel cell and catalytic materials.
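For the two-end-member case described for LiFePO4, the fitting step amounts to a constrained linear combination of two reference spectra. The sketch below illustrates that idea with synthetic Gaussian "spectra" standing in for Fe2+ and Fe3+ references; the energies, line shapes, and mixing fraction are invented for illustration only.

```python
# Sketch of two-end-member linear-combination fitting of a soft XAS spectrum.
# All spectra here are synthetic Gaussians, not measured data.
import numpy as np
from scipy.optimize import nnls

energy = np.linspace(700, 730, 300)
gauss = lambda c, w: np.exp(-((energy - c) / w) ** 2)

ref_fe2 = gauss(708.0, 1.2)          # stand-in Fe2+ (LiFePO4) reference
ref_fe3 = gauss(710.5, 1.2)          # stand-in Fe3+ (FePO4) reference
measured = 0.35 * ref_fe2 + 0.65 * ref_fe3 + 0.01 * np.random.randn(energy.size)

# Non-negative least squares keeps the fractions physical; normalize to sum to 1.
coeffs, _ = nnls(np.column_stack([ref_fe2, ref_fe3]), measured)
fractions = coeffs / coeffs.sum()
print(f"Fe2+ fraction ~ {fractions[0]:.2f}, Fe3+ fraction ~ {fractions[1]:.2f}")
```

The same non-negative least-squares idea extends to the three-reference Mn fitting mentioned in the abstract by adding a third column of reference intensities.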
High-Throughput Quantitative Lipidomics Analysis of Nonesterified Fatty Acids in Plasma by LC-MS.
Christinat, Nicolas; Morin-Rivron, Delphine; Masoodi, Mojgan
2017-01-01
Nonesterified fatty acids are important biological molecules which have multiple functions such as energy storage, gene regulation, or cell signaling. Comprehensive profiling of nonesterified fatty acids in biofluids can facilitate studying and understanding their roles in biological systems. For these reasons, we have developed and validated a high-throughput, nontargeted lipidomics method coupling liquid chromatography to high-resolution mass spectrometry for quantitative analysis of nonesterified fatty acids. Sufficient chromatographic separation is achieved to separate positional isomers such as polyunsaturated and branched-chain species and quantify a wide range of nonesterified fatty acids in human plasma samples. However, this method is not limited only to these fatty acid species and offers the possibility to perform untargeted screening of additional nonesterified fatty acid species.
Comprehensive Evaluation and Analysis of China's Mainstream Online Map Service Websites
NASA Astrophysics Data System (ADS)
Zhang, H.; Jiang, J.; Huang, W.; Wang, Q.; Gu, X.
2012-08-01
With the flourishing development of China's Internet market, user demand for all kinds of map services is rising continually and carries tremendous commercial interest. Many Internet giants have become involved in the field of online map services and have defined it as an important strategic product of the company. The main purpose of this research is to evaluate these online map service websites comprehensively with a model and to analyse the problems according to the evaluation results. Corresponding measures are then proposed, providing theoretical and practical guidance for the future development of fiercely competitive online map websites. The research consists of three stages: (a) the mainstream online map service websites in China are introduced and their present situation is analysed through visits, investigation, consultation, analysis and research; (b) a comprehensive evaluation index system for online map service websites is built covering functions, layout, interaction design, color, positioning and so on, combined with data indexes such as timeliness, accuracy, objectivity and authority; (c) a comprehensive evaluation of these online map service websites is carried out based on a fuzzy evaluation mathematical model, which solves the difficulty of measuring map websites quantitatively.
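The core of a fuzzy comprehensive evaluation model of the kind mentioned in stage (c) is the composition of an indicator weight vector with a membership matrix. The sketch below shows that composition in Python; the indicator names, weights, and membership degrees are hypothetical, not the paper's evaluation data.

```python
# A minimal fuzzy comprehensive evaluation sketch; all numbers are hypothetical.
import numpy as np

weights = np.array([0.30, 0.25, 0.25, 0.20])   # e.g. functions, layout, interaction, timeliness

# Membership of each indicator in the grades (excellent, good, fair, poor)
# for one map website, e.g. from expert scoring; each row sums to 1.
membership = np.array([[0.5, 0.3, 0.2, 0.0],
                       [0.3, 0.4, 0.2, 0.1],
                       [0.2, 0.5, 0.2, 0.1],
                       [0.4, 0.4, 0.1, 0.1]])

evaluation = weights @ membership          # fuzzy composition (weighted-average operator)
grades = ["excellent", "good", "fair", "poor"]
print(dict(zip(grades, evaluation.round(3))))
print("overall grade:", grades[int(np.argmax(evaluation))])  # maximum-membership rule
```

Different websites are then compared by repeating the composition with their own membership matrices and ranking the resulting grade vectors.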
A preliminary study of DTI Fingerprinting on stroke analysis.
Ma, Heather T; Ye, Chenfei; Wu, Jun; Yang, Pengfei; Chen, Xuhui; Yang, Zhengyi; Ma, Jingbo
2014-01-01
DTI (Diffusion Tensor Imaging) is a well-known MRI (Magnetic Resonance Imaging) technique which provides useful structural information about the human brain. However, quantitative measurement of the physiological variation among subtypes of ischemic stroke is not available. An automatic quantitative method for DTI analysis would enhance the application of DTI in clinics. In this study, we proposed a DTI Fingerprinting technology to quantitatively analyze white matter tissue, which was applied to stroke classification. The TBSS (Tract Based Spatial Statistics) method was employed to generate masks automatically. To evaluate the clustering performance of the automatic method, lesion ROIs (Regions of Interest) were manually drawn on the DWI images as a reference. The results from DTI Fingerprinting were compared with those obtained from the reference ROIs. They indicate that DTI Fingerprinting could identify different states of ischemic stroke and has promising potential to provide a more comprehensive measure of the DTI data. Further development should be carried out to improve DTI Fingerprinting technology for clinical use.
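The abstract does not specify the DTI Fingerprinting algorithm, so the snippet below is only a generic illustration of clustering voxel-wise diffusion metrics (FA, MD) within a mask to separate tissue states; the data are synthetic and the approach is an assumption, not the authors' method.

```python
# Illustrative only: cluster synthetic voxel-wise FA/MD values into two tissue states.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Synthetic stand-ins for masked voxels: columns are FA and MD (x1e-3 mm^2/s).
normal = np.column_stack([rng.normal(0.45, 0.05, 500), rng.normal(0.75, 0.05, 500)])
lesion = np.column_stack([rng.normal(0.25, 0.05, 120), rng.normal(1.10, 0.10, 120)])
voxels = np.vstack([normal, lesion])

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(voxels)
for k in range(2):
    fa_mean, md_mean = voxels[labels == k].mean(axis=0)
    print(f"cluster {k}: mean FA {fa_mean:.2f}, mean MD {md_mean:.2f}")
```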
Comprehensive lipid analysis: a powerful metanomic tool for predictive and diagnostic medicine.
Watkins, S M
2000-09-01
The power and accuracy of predictive diagnostics stand to improve dramatically as a result of lipid metanomics. The high definition of data obtained with this approach allows multiple rather than single metabolites to be used in markers for a group. Since as many as 40 fatty acids are quantified from each lipid class, and up to 15 lipid classes can be quantified easily, more than 600 individual lipid metabolites can be measured routinely for each sample. Because these analyses are comprehensive, only the most appropriate and unique metabolites are selected for their predictive value. Thus, comprehensive lipid analysis promises to greatly improve predictive diagnostics for phenotypes that directly or peripherally involve lipids. A broader and possibly more exciting aspect of this technology is the generation of metabolic profiles that are not simply markers for disease, but metabolic maps that can be used to identify specific genes or activities that cause or influence the disease state. Metanomics is, in essence, functional genomics from metabolite analysis. By defining the metabolic basis for phenotype, researchers and clinicians will have an extraordinary opportunity to understand and treat disease. Much in the same way that gene chips allow researchers to observe the complex expression response to a stimulus, metanomics will enable researchers to observe the complex metabolic interplay responsible for defining phenotype. By extending this approach beyond the observation of individual dysregulations, medicine will begin to profile not single diseases, but health. As health is the proper balance of all vital metabolic pathways, comprehensive or metanomic analysis lends itself very well to identifying the metabolite distributions necessary for optimum health. Comprehensive and quantitative analysis of lipids would provide this degree of diagnostic power to researchers and clinicians interested in mining metabolic profiles for biological meaning.
ERIC Educational Resources Information Center
Incik, Eylem Yalcin; Akay, Cenk
2017-01-01
The main purpose of this research is to identify the competency and the perceptions of pre-service teachers on technopedagogical education. While study group of quantitative dimension is composed of 626 pre-service teachers; study group of qualitative extent is composed of 67 pre-service teachers. The sample represents 35% of the population of…
Sauer, Eva; Reinke, Ann-Kathrin; Courts, Cornelius
2016-05-01
Applying molecular genetic approaches for the identification of forensically relevant body fluids, which often yield crucial information for the reconstruction of a potential crime, is a current topic of forensic research. Due to their body fluid-specific expression patterns and stability against degradation, microRNAs (miRNA) emerged as a promising molecular species, with a range of candidate markers published. The analysis of miRNA via quantitative real-time PCR, however, should be based on a relevant strategy for normalization of non-biological variance to deliver reliable and biologically meaningful results. The work presented here is the most comprehensive study to date of forensic body fluid identification via miRNA expression analysis, based on a thoroughly validated qPCR procedure and unbiased statistical decision making to identify single-source samples. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Comprehensive evaluation of impacts of distributed generation integration in distribution network
NASA Astrophysics Data System (ADS)
Peng, Sujiang; Zhou, Erbiao; Ji, Fengkun; Cao, Xinhui; Liu, Lingshuang; Liu, Zifa; Wang, Xuyang; Cai, Xiaoyu
2018-04-01
Distributed generation (DG), as a supplement to the centralized utilization of renewable energy, is becoming the focus of renewable energy development. With the increasing proportion of DG in the distribution network, the network power structure, power flow distribution, operation plans and protection are affected to some extent. According to the main impacts of DG, a comprehensive evaluation model of a distribution network with DG is proposed in this paper. A comprehensive evaluation index system covering 7 aspects, along with the corresponding index calculation methods, is established for quantitative analysis. The indices under different DG access capacities in the distribution network are calculated based on the IEEE RBTS-Bus 6 system, and the evaluation result is obtained by the analytic hierarchy process (AHP). The proposed model and method are verified as effective and valid through a case study.
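The AHP step referred to above derives index weights from a pairwise comparison matrix via its principal eigenvector and a consistency check. Below is a minimal sketch of that calculation; the 3x3 judgment matrix is a hypothetical comparison of three evaluation aspects, not the paper's 7-aspect system.

```python
# Minimal AHP sketch: eigenvector weights and consistency ratio for a judgment matrix.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])      # reciprocal pairwise comparison matrix (hypothetical)

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()             # normalized priority weights

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)     # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]      # random index from the standard AHP table
print("weights:", weights.round(3), "CR:", round(ci / ri, 3))  # CR < 0.1 is acceptable
```

The resulting weights are then applied to the calculated indices to aggregate them into the overall evaluation result.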
Sachse, F. B.
2015-01-01
Microstructural characterization of cardiac tissue and its remodeling in disease is a crucial step in many basic research projects. We present a comprehensive approach for three-dimensional characterization of cardiac tissue at the submicrometer scale. We developed a compression-free mounting method as well as labeling and imaging protocols that facilitate acquisition of three-dimensional image stacks with scanning confocal microscopy. We evaluated the approach with normal and infarcted ventricular tissue. We used the acquired image stacks for segmentation, quantitative analysis and visualization of important tissue components. In contrast to conventional mounting, compression-free mounting preserved cell shapes, capillary lumens and extracellular laminas. Furthermore, the new approach and imaging protocols resulted in high signal-to-noise ratios at depths up to 60 μm. This allowed extensive analyses revealing major differences in volume fractions and distribution of cardiomyocytes, blood vessels, fibroblasts, myofibroblasts and extracellular space in control versus infarct border zone. Our results show that the developed approach yields comprehensive data on microstructure of cardiac tissue and its remodeling in disease. In contrast to other approaches, it allows quantitative assessment of all major tissue components. Furthermore, we suggest that the approach will provide important data for physiological models of cardiac tissue at the submicrometer scale. PMID:26399990
Nonlinear optical microscopy and ultrasound imaging of human cervical structure
Reusch, Lisa M.; Feltovich, Helen; Carlson, Lindsey C.; Hall, Gunnsteinn; Campagnola, Paul J.; Eliceiri, Kevin W.
2013-01-01
Abstract. The cervix softens and shortens as its collagen microstructure rearranges in preparation for birth, but premature change may lead to premature birth. The global preterm birth rate has not decreased despite decades of research, likely because cervical microstructure is poorly understood. Our group has developed a multilevel approach to evaluating the human cervix. We are developing quantitative ultrasound (QUS) techniques for noninvasive interrogation of cervical microstructure and corroborating those results with high-resolution images of microstructure from second harmonic generation imaging (SHG) microscopy. We obtain ultrasound measurements from hysterectomy specimens, prepare the tissue for SHG, and stitch together several hundred images to create a comprehensive view of large areas of cervix. The images are analyzed for collagen orientation and alignment with curvelet transform, and registered with QUS data, facilitating multiscale analysis in which the micron-scale SHG images and millimeter-scale ultrasound data interpretation inform each other. This novel combination of modalities allows comprehensive characterization of cervical microstructure in high resolution. Through a detailed comparative study, we demonstrate that SHG imaging both corroborates the quantitative ultrasound measurements and provides further insight. Ultimately, a comprehensive understanding of specific microstructural cervical change in pregnancy should lead to novel approaches to the prevention of preterm birth. PMID:23412434
Knudsen, Anders Dahl; Bennike, Tue; Kjeldal, Henrik; Birkelund, Svend; Otzen, Daniel Erik; Stensballe, Allan
2014-05-30
We describe Condenser, a freely available, comprehensive open-source tool for merging multidimensional quantitative proteomics data from the Matrix Science Mascot Distiller Quantitation Toolbox into a common format ready for subsequent bioinformatic analysis. A number of different relative quantitation technologies, such as metabolic (15)N and amino acid stable isotope incorporation, label-free and chemical-label quantitation are supported. The program features multiple options for curative filtering of the quantified peptides, allowing the user to choose data quality thresholds appropriate for the current dataset, and ensure the quality of the calculated relative protein abundances. Condenser also features optional global normalization, peptide outlier removal, multiple testing and calculation of t-test statistics for highlighting and evaluating proteins with significantly altered relative protein abundances. Condenser provides an attractive addition to the gold-standard quantitative workflow of Mascot Distiller, allowing easy handling of larger multi-dimensional experiments. Source code, binaries, test data set and documentation are available at http://condenser.googlecode.com/. Copyright © 2014 Elsevier B.V. All rights reserved.
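As a rough illustration of the kind of post-processing described for Condenser (global normalization followed by t-test statistics on relative abundances), the sketch below median-normalizes log2 peptide ratios and tests one protein at a time. The column names and data layout are assumptions for illustration, not Condenser's actual input or output format.

```python
# Sketch: median-normalize log2 peptide ratios per sample, then per-protein t-tests.
# Data frame layout and values are hypothetical.
import pandas as pd
from scipy import stats

peptides = pd.DataFrame({
    "protein": ["P1"] * 6 + ["P2"] * 6,
    "sample":  ["ctrl", "ctrl", "ctrl", "case", "case", "case"] * 2,
    "log2_ratio": [0.1, -0.2, 0.0, 1.1, 0.9, 1.3, 0.2, 0.1, -0.1, 0.0, 0.2, -0.2],
})

# Global normalization: subtract each sample's median log2 ratio.
peptides["norm"] = peptides.groupby("sample")["log2_ratio"].transform(lambda x: x - x.median())

for prot, grp in peptides.groupby("protein"):
    ctrl = grp.loc[grp["sample"] == "ctrl", "norm"]
    case = grp.loc[grp["sample"] == "case", "norm"]
    t, p = stats.ttest_ind(case, ctrl)
    print(f"{prot}: mean log2 change {case.mean() - ctrl.mean():+.2f}, p = {p:.3f}")
```

In a real workflow the p-values would additionally be corrected for multiple testing, as the tool's description notes.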
[Recent advances in metabonomics].
Xu, Guo-Wang; Lu, Xin; Yang, Sheng-Li
2007-12-01
Metabonomics (or metabolomics) aims at the comprehensive and quantitative analysis of the wide arrays of metabolites in biological samples. Metabonomics has been labeled as one of the new" -omics" joining genomics, transcriptomics, and proteomics as a science employed toward the understanding of global systems biology. It has been widely applied in many research areas including drug toxicology, biomarker discovery, functional genomics, and molecular pathology etc. The comprehensive analysis of the metabonome is particularly challenging due to the diverse chemical natures of metabolites. Metabonomics investigations require special approaches for sample preparation, data-rich analytical chemical measurements, and information mining. The outputs from a metabonomics study allow sample classification, biomarker discovery, and interpretation of the reasons for classification information. This review focuses on the currently new advances in various technical platforms of metabonomics and its applications in drug discovery and development, disease biomarker identification, plant and microbe related fields.
Comprehensive proteomic analysis of Penicillium verrucosum.
Nöbauer, Katharina; Hummel, Karin; Mayrhofer, Corina; Ahrens, Maike; Setyabudi, Francis M C; Schmidt-Heydt, Markus; Eisenacher, Martin; Razzazi-Fazeli, Ebrahim
2017-05-01
Mass spectrometric identification of proteins in species lacking validated sequence information is a major problem in veterinary science. In the present study, we used ochratoxin A producing Penicillium verrucosum to identify and quantitatively analyze proteins of an organism with yet no protein information available. The work presented here aimed to provide a comprehensive protein identification of P. verrucosum using shotgun proteomics. We were able to identify 3631 proteins in an "ab initio" translated database from DNA sequences of P. verrucosum. Additionally, a sequential window acquisition of all theoretical fragment-ion spectra analysis was done to find differentially regulated proteins at two different time points of the growth curve. We compared the proteins at the beginning (day 3) and at the end of the log phase (day 12). © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Ma, Xiaoli; Guo, Xiaoyu; Song, Yuelin; Qiao, Lirui; Wang, Wenguang; Zhao, Mingbo; Tu, Pengfei; Jiang, Yong
2016-12-01
Clarification of the chemical composition of traditional Chinese medicine formulas (TCMFs) is a challenge due to the variety of structures and the complexity of plant matrices. Herein, an integrated strategy was developed by hyphenating ultra-performance liquid chromatography (UPLC), quadrupole time-of-flight (Q-TOF), hybrid triple quadrupole-linear ion trap mass spectrometry (Qtrap-MS), and the novel post-acquisition data processing software UNIFI to achieve automatic, rapid, accurate, and comprehensive qualitative and quantitative analysis of the chemical components in TCMFs. As a proof-of-concept, the chemical profiling of Baoyuan decoction (BYD), which is an ancient TCMF that is clinically used for the treatment of coronary heart disease that consists of Ginseng Radix et Rhizoma, Astragali Radix, Glycyrrhizae Radix et Rhizoma Praeparata Cum Melle, and Cinnamomi Cortex, was performed. As many as 236 compounds were plausibly or unambiguously identified, and 175 compounds were quantified or relatively quantified by the scheduled multiple reaction monitoring (sMRM) method. The findings demonstrate that the strategy integrating the rapidity of UNIFI software, the efficiency of UPLC, the accuracy of Q-TOF-MS, and the sensitivity and quantitation ability of Qtrap-MS provides a method for the efficient and comprehensive chemome characterization and quality control of complex TCMFs.
Li, Haijing; Zhang, Xiangwen
2017-08-08
As coking precursors, aromatic hydrocarbons have an effect on the cracking stability of fuels. A method for identifying and quantitating aromatics in the supercritical cracking products of jet fuel was established by comprehensive two-dimensional gas chromatography coupled with mass spectrometry (GC×GC-MS). The effects of main chromatographic conditions such as initial oven temperature and modulation period on the separation of supercritical cracking products were studied. The method has good separation ability for polycyclic aromatic hydrocarbons (PAH) isomers. A total of 27 aromatics, including monocyclic aromatic hydrocarbons, bicyclic aromatic hydrocarbons, tricyclic aromatic hydrocarbons, tetracyclic aromatic hydrocarbons, etc., were identified based on standard mass spectra, the retention times of standards and literature reports. Moreover, the corresponding quantitative determination was achieved by external standard method of GC×GC-FID. The results showed that the contents of aromatics increased with the increase of gas yield. When gas yield reached 22%, the bicyclic aromatic hydrocarbons began to produce, and their contents increased exponentially with the increase of gas yield. Compared with the traditional GC-MS, the method has better separation and qualitative ability, and can be applied to the separation of complex samples and qualitative and quantitative analyses of cracking products.
Davatzikos, Christos; Rathore, Saima; Bakas, Spyridon; Pati, Sarthak; Bergman, Mark; Kalarot, Ratheesh; Sridharan, Patmaa; Gastounioti, Aimilia; Jahani, Nariman; Cohen, Eric; Akbari, Hamed; Tunc, Birkan; Doshi, Jimit; Parker, Drew; Hsieh, Michael; Sotiras, Aristeidis; Li, Hongming; Ou, Yangming; Doot, Robert K; Bilello, Michel; Fan, Yong; Shinohara, Russell T; Yushkevich, Paul; Verma, Ragini; Kontos, Despina
2018-01-01
The growth of multiparametric imaging protocols has paved the way for quantitative imaging phenotypes that predict treatment response and clinical outcome, reflect underlying cancer molecular characteristics and spatiotemporal heterogeneity, and can guide personalized treatment planning. This growth has underlined the need for efficient quantitative analytics to derive high-dimensional imaging signatures of diagnostic and predictive value in this emerging era of integrated precision diagnostics. This paper presents cancer imaging phenomics toolkit (CaPTk), a new and dynamically growing software platform for analysis of radiographic images of cancer, currently focusing on brain, breast, and lung cancer. CaPTk leverages the value of quantitative imaging analytics along with machine learning to derive phenotypic imaging signatures, based on two-level functionality. First, image analysis algorithms are used to extract comprehensive panels of diverse and complementary features, such as multiparametric intensity histogram distributions, texture, shape, kinetics, connectomics, and spatial patterns. At the second level, these quantitative imaging signatures are fed into multivariate machine learning models to produce diagnostic, prognostic, and predictive biomarkers. Results from clinical studies in three areas are shown: (i) computational neuro-oncology of brain gliomas for precision diagnostics, prediction of outcome, and treatment planning; (ii) prediction of treatment response for breast and lung cancer, and (iii) risk assessment for breast cancer.
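The two-level idea described above (feature panels fed into multivariate machine learning) can be illustrated with a toy pipeline. The snippet below does not use CaPTk itself; the images, feature choices, and model are synthetic placeholders meant only to show the structure of such an analysis.

```python
# Toy two-level pipeline: first-order image features -> machine-learning classifier.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

def skewness(x):
    x = x.ravel()
    return float(((x - x.mean()) ** 3).mean() / (x.std() ** 3 + 1e-9))

def first_order_features(image):
    """Level 1: simple intensity-histogram descriptors of one synthetic image."""
    return [image.mean(), image.std(), np.percentile(image, 90), skewness(image)]

# 40 synthetic "patients": responders have slightly brighter lesions on average.
images = [rng.normal(loc=100 + 10 * (i % 2), scale=20, size=(32, 32)) for i in range(40)]
labels = np.array([i % 2 for i in range(40)])          # 0 = non-responder, 1 = responder
X = np.array([first_order_features(img) for img in images])

# Level 2: multivariate model producing a predictive signature.
scores = cross_val_score(RandomForestClassifier(random_state=0), X, labels, cv=5)
print("cross-validated accuracy:", scores.mean().round(2))
```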
A Comparative Study of Hawaii Middle School Science Student Academic Achievement
NASA Astrophysics Data System (ADS)
Askew Cain, Peggy
The problem was that middle-grade students with specific learning disabilities (SWDs) in reading comprehension perform less well than their peers on standardized assessments. The purpose of this quantitative comparative study was to examine the effect of electronic concept maps on the reading comprehension of Grade 8 students with SWDs in a science class at a Hawaii middle school on the island of Oahu. The target population consisted of Grade 8 science students for school year 2015-2016. The sampling method was purposeful sampling with a final sample size of 338 Grade 8 science students. De-identified archival records of Grade 8 Hawaii standardized science test scores were analyzed using a one-way analysis of variance (ANOVA) in SPSS. The finding for hypothesis 1 indicated a significant difference in student achievement between SWDs and SWODs as measured by Hawaii State Assessment (HSA) science scores (p < 0.05), and for hypothesis 2, a significant difference in instructional modality between SWDs who used concept maps and those who did not, as measured by the Hawaii State Assessment in science (p < 0.05). The implications of the findings were that (a) SWDs performed less well in science achievement than their peers, and (b) SWODs appeared to retain a greater degree of science knowledge and answered more questions correctly than SWDs as a result of reading comprehension. Recommendations for practice, directed at educational leadership, were that (a) teachers should practice using concept maps with SWDs as a specific reading strategy to support reading comprehension in science classes, (b) teachers should place a strong focus on vocabulary building and concept building during concept map construction because the construction of concept maps sometimes requires frontloading of vocabulary, and (c) leaders should model for teachers how concept maps are created and explain their educational purpose as a tool for learning. Recommendations for future research were to conduct (a) a quantitative comparative study between groups for academic achievement on subtest mean scores of SWDs and SWODs in physical science, earth science, and space science, and (b) a quantitative correlation study to examine relationships and predictive values for academic achievement of SWDs and concept map integration on standardized science assessments.
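The study ran its one-way ANOVA in SPSS; the following is only a hedged sketch of the same comparison in Python with simulated scores, since the Hawaii State Assessment data are not reproduced here.

```python
# Sketch of a one-way ANOVA comparing two groups; scores are simulated, not HSA data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
swod_scores = rng.normal(loc=300, scale=25, size=280)   # students without disabilities
swd_scores = rng.normal(loc=285, scale=25, size=58)     # students with disabilities

f_stat, p_value = stats.f_oneway(swod_scores, swd_scores)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")           # p < 0.05 -> significant difference
```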
Comprehensive analysis of "bath salts" purchased from California stores and the internet.
Schneir, A; Ly, B T; Casagrande, K; Darracq, M; Offerman, S R; Thornton, S; Smollin, C; Vohra, R; Rangun, C; Tomaszewski, C; Gerona, R R
2014-08-01
To analyze the contents of "bath salt" products purchased from California stores and the Internet qualitatively and quantitatively in a comprehensive manner. A convenience sample of "bath salt" products was purchased in person by multiple authors at retail stores in six California cities and over the Internet (U.S. sites only) between August 11, 2011 and December 15, 2011. Liquid chromatography-time-of-flight mass spectrometry was utilized to identify and quantify all substances in the purchased products. Thirty-five "bath salt" products were purchased and analyzed. Prices ranged from $9.95 to $49.99 (U.S. dollars). Most products had a warning against use. The majority (32/35, 91%) had one (n = 15) or multiple cathinones (n = 17) present. Fourteen different cathinones were identified, 3,4-methylenedioxypyrovalerone (MDPV) being the most common. Multiple drugs found, including cathinones (buphedrone, ethcathinone, ethylone, MDPBP, and PBP), other designer amines (ethylamphetamine, fluoramphetamine, and 5-IAI), and the antihistamine doxylamine, had not been previously identified in U.S. "bath salt" products. Quantification revealed high stimulant content and, in some cases, dramatic differences in either total cathinone or synthetic stimulant content between products with the same declared weight, and even between identically named and outwardly identical products. Comprehensive analysis of "bath salts" purchased from California stores and the Internet revealed the products to consistently contain cathinones, alone or in different combinations, sometimes in high quantity. Multiple cathinones and other drugs found had not been previously identified in U.S. "bath salt" products. High total stimulant content in some products and variable qualitative and quantitative composition among products were demonstrated.
Osago, Harumi; Shibata, Tomoko; Hara, Nobumasa; Kuwata, Suguru; Kono, Michihaya; Uchio, Yuji; Tsuchiya, Mikako
2014-12-15
We developed a method using liquid chromatography-electrospray ionization-tandem mass spectrometry (LC-ESI-MS/MS) with a selected reaction monitoring (SRM) mode for simultaneous quantitative analysis of glycosaminoglycans (GAGs). Using one-shot analysis with our MS/MS method, we demonstrated the simultaneous quantification of a total of 23 variously sulfated disaccharides of four GAG classes (8 chondroitin/dermatan sulfates, 1 hyaluronic acid, 12 heparan sulfates, and 2 keratan sulfates) with a sensitivity of less than 0.5 pmol within 20 min. We showed the differences in the composition of GAG classes and the sulfation patterns between porcine articular cartilage and yellow ligament. In addition to the internal disaccharides described above, some saccharides derived from the nonreducing terminal were detected simultaneously. The simultaneous quantification of both internal and nonreducing terminal saccharides could be useful to estimate the chain length of GAGs. This method would help to establish comprehensive "GAGomic" analysis of biological tissues. Copyright © 2014 Elsevier Inc. All rights reserved.
2014-05-13
the information needed to effectively (1) manage its assets, (2) assess program performance and make budget decisions, (3) make cost-effective ... decision making ... incorporating key elements of a comprehensive management approach, such as a complete analysis of the return on investment, quantitatively defined goals
Building Capacity in the Public Utility Sectors of Basra, Iraq
2007-10-01
has a broad range of academic and personal interests in fields that use the creative application of quantitative analysis and systems thinking to solve ... comprehensive study by Tokyo Engineering Consultants Co., Ltd. [5] references a joint United Nations Development Program (UNDP) Iraqi Central Organization ... creative solutions to solve this problem. We then develop a means to evaluate and rank these potential solutions using the stakeholder values as the
Defence Reporter. Summer/Autumn 2011
2011-01-01
of protection afforded by this material was determined against a liquid challenge. R000259AA Cermet Materials for Armour Applications Dstl Porton ... initial research, however, areas of focus have been identified and a comprehensive plan of study has been developed. R000261FE Force Multipliers in ... study was not intended to be a rigorous quantitative analysis of COIN. It has identified six factors that have had an overall positive effect upon the
Integration of PKPD relationships into benefit–risk analysis
Bellanti, Francesco; van Wijk, Rob C; Danhof, Meindert; Della Pasqua, Oscar
2015-01-01
Aim: Despite the continuous endeavour to achieve high standards in medical care through effectiveness measures, a quantitative framework for the assessment of the benefit–risk balance of new medicines is lacking prior to regulatory approval. The aim of this short review is to summarise the approaches currently available for benefit–risk assessment. In addition, we propose the use of pharmacokinetic–pharmacodynamic (PKPD) modelling as the pharmacological basis for evidence synthesis and evaluation of novel therapeutic agents. Methods: A comprehensive literature search has been performed using MeSH terms in PubMed, in which articles describing benefit–risk assessment and modelling and simulation were identified. In parallel, a critical review of multi-criteria decision analysis (MCDA) is presented as a tool for characterising a drug's safety and efficacy profile. Results: A definition of benefits and risks has been proposed by the European Medicines Agency (EMA), in which qualitative and quantitative elements are included. However, in spite of the value of MCDA as a quantitative method, decisions about benefit–risk balance continue to rely on subjective expert opinion. By contrast, a model-informed approach offers the opportunity for a more comprehensive evaluation of benefit–risk balance before extensive evidence is generated in clinical practice. Conclusions: Benefit–risk balance should be an integral part of the risk management plan and as such should be considered before marketing authorisation. Modelling and simulation can be incorporated into MCDA to support evidence synthesis as well as evidence generation, taking into account the underlying correlations between favourable and unfavourable effects. In addition, it represents a valuable tool for the optimization of protocol design in effectiveness trials. PMID:25940398
Integration of PKPD relationships into benefit-risk analysis.
Bellanti, Francesco; van Wijk, Rob C; Danhof, Meindert; Della Pasqua, Oscar
2015-11-01
Despite the continuous endeavour to achieve high standards in medical care through effectiveness measures, a quantitative framework for the assessment of the benefit-risk balance of new medicines is lacking prior to regulatory approval. The aim of this short review is to summarise the approaches currently available for benefit-risk assessment. In addition, we propose the use of pharmacokinetic-pharmacodynamic (PKPD) modelling as the pharmacological basis for evidence synthesis and evaluation of novel therapeutic agents. A comprehensive literature search has been performed using MeSH terms in PubMed, in which articles describing benefit-risk assessment and modelling and simulation were identified. In parallel, a critical review of multi-criteria decision analysis (MCDA) is presented as a tool for characterising a drug's safety and efficacy profile. A definition of benefits and risks has been proposed by the European Medicines Agency (EMA), in which qualitative and quantitative elements are included. However, in spite of the value of MCDA as a quantitative method, decisions about benefit-risk balance continue to rely on subjective expert opinion. By contrast, a model-informed approach offers the opportunity for a more comprehensive evaluation of benefit-risk balance before extensive evidence is generated in clinical practice. Benefit-risk balance should be an integral part of the risk management plan and as such should be considered before marketing authorisation. Modelling and simulation can be incorporated into MCDA to support evidence synthesis as well as evidence generation, taking into account the underlying correlations between favourable and unfavourable effects. In addition, it represents a valuable tool for the optimization of protocol design in effectiveness trials. © 2015 The British Pharmacological Society.
Systems-Level Analysis of Innate Immunity
Zak, Daniel E.; Tam, Vincent C.; Aderem, Alan
2014-01-01
Systems-level analysis of biological processes strives to comprehensively and quantitatively evaluate the interactions between the relevant molecular components over time, thereby enabling development of models that can be employed to ultimately predict behavior. Rapid development in measurement technologies (omics), when combined with the accessible nature of the cellular constituents themselves, is allowing the field of innate immunity to take significant strides toward this lofty goal. In this review, we survey exciting results derived from systems biology analyses of the immune system, ranging from gene regulatory networks to influenza pathogenesis and systems vaccinology. PMID:24655298
Comprehensive Analysis of CBFβ-MYH11 Fusion Transcripts in Acute Myeloid Leukemia by RT-PCR Analysis
Kadkol, ShriHari S.; Bruno, Annette; Dodge, Carol; Lindgren, Valerie; Ravandi, Farhad
2004-01-01
CBFβ-MYH11 fusion transcripts are expressed in acute myeloid leukemias of the M4Eo subtype. Patients who express CBFβ-MYH11 fusion transcripts respond favorably to high-dose chemotherapy and are generally spared allogeneic bone marrow transplantation. Hence it is important to identify this fusion in all patients with acute myeloid leukemia of the M4Eo subtype. The fusion can be detected by cytogenetics, fluorescence in-situ hybridization (FISH), or by molecular analysis with RT-PCR. Multiple fusion transcripts arising from various breakpoints in the CBFβ and MYH11 genes have been identified. In this report we describe a comprehensive RT-PCR assay to identify all known fusion transcripts and provide an algorithm for molecular analysis of CBFβ-MYH11 fusions from patient specimens. Further, identification of the fusion transcript by such an assay would help in the diagnosis and follow-up of patients with cryptic inversion 16 translocations (such as patient 2 in this report) not detected by standard cytogenetics or FISH, and in the rational design of probes for quantitative analysis by real-time PCR. PMID:14736823
Smile line assessment comparing quantitative measurement and visual estimation.
Van der Geld, Pieter; Oosterveld, Paul; Schols, Jan; Kuijpers-Jagtman, Anne Marie
2011-02-01
Esthetic analysis of dynamic functions such as spontaneous smiling is feasible by using digital videography and computer measurement for lip line height and tooth display. Because quantitative measurements are time-consuming, digital videography and semiquantitative (visual) estimation according to a standard categorization are more practical for regular diagnostics. Our objective in this study was to compare 2 semiquantitative methods with quantitative measurements for reliability and agreement. The faces of 122 male participants were individually registered by using digital videography. Spontaneous and posed smiles were captured. On the records, maxillary lip line heights and tooth display were digitally measured on each tooth and also visually estimated according to 3-grade and 4-grade scales. Two raters were involved. An error analysis was performed. Reliability was established with kappa statistics. Interexaminer and intraexaminer reliability values were high, with median kappa values from 0.79 to 0.88. Agreement of the 3-grade scale estimation with quantitative measurement showed higher median kappa values (0.76) than the 4-grade scale estimation (0.66). Differentiating high and gummy smile lines (4-grade scale) resulted in greater inaccuracies. The estimation of a high, average, or low smile line for each tooth showed high reliability close to quantitative measurements. Smile line analysis can be performed reliably with a 3-grade scale (visual) semiquantitative estimation. For a more comprehensive diagnosis, additional measuring is proposed, especially in patients with disproportional gingival display. Copyright © 2011 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
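The reliability figures above are kappa statistics. As a small illustration, the sketch below computes Cohen's kappa for two raters grading a 3-grade smile line scale; the ratings are made up and do not come from the study.

```python
# Sketch: inter-rater agreement (Cohen's kappa) on a 3-grade smile line scale.
# The two raters' scores below are hypothetical.
from sklearn.metrics import cohen_kappa_score

rater_1 = ["low", "avg", "high", "avg", "avg", "high", "low", "avg", "high", "avg"]
rater_2 = ["low", "avg", "high", "avg", "high", "high", "low", "avg", "high", "avg"]

print("kappa:", round(cohen_kappa_score(rater_1, rater_2), 2))  # ~0.8 indicates strong agreement
```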
Language Skills in Classical Chinese Text Comprehension
ERIC Educational Resources Information Center
Lau, Kit-ling
2018-01-01
This study used both quantitative and qualitative methods to explore the role of lower- and higher-level language skills in classical Chinese (CC) text comprehension. A CC word and sentence translation test, text comprehension test, and questionnaire were administered to 393 Secondary Four students; and 12 of these were randomly selected to…
Wei, Fang; Hu, Na; Lv, Xin; Dong, Xu-Yan; Chen, Hong
2015-07-24
In this investigation, off-line comprehensive two-dimensional liquid chromatography-atmospheric pressure chemical ionization mass spectrometry using a single column has been applied for the identification and quantification of triacylglycerols in edible oils. A novel mixed-mode phenyl-hexyl chromatographic column was employed in this off-line two-dimensional separation system. The phenyl-hexyl column combined the features of traditional C18 and silver-ion columns, which could provide hydrophobic interactions with triacylglycerols under acetonitrile conditions and offer π-π interactions with triacylglycerols under methanol conditions. When compared with traditional off-line comprehensive two-dimensional liquid chromatography employing two different chromatographic columns (C18 and silver-ion columns) and elution solvents comprising two phases (reversed-phase/normal-phase) for triacylglycerol separation, the novel off-line comprehensive two-dimensional liquid chromatography using a single column can be achieved by simply altering the mobile phase between acetonitrile and methanol, which exhibited much higher selectivity for the separation of triacylglycerols with great efficiency and rapid speed. In addition, an approach based on the use of response factors with atmospheric pressure chemical ionization mass spectrometry has been developed for triacylglycerol quantification. Due to the differences between saturated and unsaturated acyl chains, the use of response factors significantly improves the quantitation of triacylglycerols. This two-dimensional liquid chromatography-mass spectrometry system was successfully applied to the profiling of triacylglycerols in soybean oils, peanut oils and lard oils. A total of 68 triacylglycerols, including 40 triacylglycerols in soybean oils, 50 triacylglycerols in peanut oils and 44 triacylglycerols in lard oils, have been identified and quantified. The liquid chromatography-mass spectrometry data were analyzed using principal component analysis. The results of the principal component analysis enabled a clear identification of different plant oils. By using this two-dimensional liquid chromatography-mass spectrometry system coupled with principal component analysis, adulterated soybean oils with 5% added lard oil and peanut oils with 5% added soybean oil can be clearly identified. Copyright © 2015 Elsevier B.V. All rights reserved.
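The PCA classification step can be sketched schematically as below; the triacylglycerol (TAG) profiles are invented for illustration and are not the paper's measured abundances.

```python
# Schematic PCA of hypothetical TAG profiles to separate oil types.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Rows: oil samples; columns: relative abundances of four hypothetical TAG species.
profiles = np.array([[52, 20, 15, 13],   # soybean-like
                     [50, 22, 14, 14],
                     [30, 40, 20, 10],   # peanut-like
                     [28, 42, 19, 11],
                     [10, 15, 45, 30],   # lard-like
                     [12, 14, 47, 27]], dtype=float)

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(profiles))
for label, pc in zip(["soy", "soy", "peanut", "peanut", "lard", "lard"], scores.round(2)):
    print(label, pc)   # samples of the same oil type cluster together in PC space
```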
NASA Astrophysics Data System (ADS)
He, Honghui; Dong, Yang; Zhou, Jialing; Ma, Hui
2017-03-01
As one of the salient features of light, polarization contains abundant structural and optical information about media. Recently, as a comprehensive description of polarization properties, Mueller matrix polarimetry has been applied to various biomedical studies such as the detection of cancerous tissues. In previous works, it has been found that the structural information encoded in the 2D Mueller matrix images can be presented by transformed parameters with a more explicit relationship to certain microstructural features. In this paper, we present a statistical analysis method to transform the 2D Mueller matrix images into frequency distribution histograms (FDHs) and their central moments to reveal the dominant structural features of samples quantitatively. The experimental results for porcine heart, intestine, stomach, and liver tissues demonstrate that the transformation parameters and central moments based on the statistical analysis of Mueller matrix elements have simple relationships to the dominant microstructural properties of biomedical samples, including the density and orientation of fibrous structures and the depolarization power, diattenuation and absorption abilities. The results show that statistical analysis of the 2D images of Mueller matrix elements may provide quantitative or semi-quantitative criteria for biomedical diagnosis.
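The transformation described (image to frequency distribution histogram plus central moments) is straightforward to sketch; the snippet below uses a random array as a stand-in for one Mueller matrix element image, so the numbers carry no physical meaning.

```python
# Sketch: turn one Mueller matrix element image into an FDH and its central moments.
import numpy as np

rng = np.random.default_rng(3)
m22 = rng.normal(loc=0.6, scale=0.1, size=(256, 256))   # stand-in for an m22 image

values = m22.ravel()
hist, edges = np.histogram(values, bins=50, density=True)   # the FDH

mean = values.mean()
central = {k: np.mean((values - mean) ** k) for k in (2, 3, 4)}   # 2nd-4th central moments
print("mean:", round(mean, 3), {f"mu{k}": round(v, 5) for k, v in central.items()})
print("FDH mode bin centre:", round((edges[hist.argmax()] + edges[hist.argmax() + 1]) / 2, 3))
```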
An Overview of State Policies Supporting Worksite Health Promotion Programs.
VanderVeur, Jennifer; Gilchrist, Siobhan; Matson-Koffman, Dyann
2017-05-01
Worksite health promotion (WHP) programs can reduce the occurrence of cardiovascular disease risk factors. State law can encourage employers and employer-provided insurance companies to offer comprehensive WHP programs. This research examines state law authorizing WHP programs. Quantitative content analysis. Worksites or workplaces. United States (and the District of Columbia). State law in effect in 2013 authorizing WHP programs. Frequency and distribution of states with WHP laws. To determine the content of the laws for analysis and coding, we identified 18 policy elements, 12 from the Centers for Disease Control and Prevention's Worksite Health ScoreCard (HSC) and 6 additional supportive WHP strategies. We used these strategies as key words to search for laws authorizing WHP programs or select WHP elements. We calculated the number and type of WHP elements for each state with WHP laws and selected two case examples from states with comprehensive WHP laws. Twenty-four states authorized onsite WHP programs, 29 authorized WHP through employer-provided insurance plans, and 18 authorized both. Seven states had a comprehensive WHP strategy, addressing 8 or more of 12 HSC elements. The most common HSC elements were weight management, tobacco cessation, and physical activity. Most states had laws encouraging the adoption of WHP programs. Massachusetts and Maine are implementing comprehensive WHP laws but studies evaluating their health impact are needed.
Wang, Wenguang; Ma, Xiaoli; Guo, Xiaoyu; Zhao, Mingbo; Tu, Pengfei; Jiang, Yong
2015-09-18
To address the bottleneck of reference standard shortages in the comprehensive quality control of traditional Chinese medicines (TCMs), a series of strategies, including one single reference standard to determine multi-compounds (SSDMC), quantitative analysis by standardized reference extract (QASRE), and quantitative nuclear magnetic resonance spectroscopy (qNMR), were proposed, and Mahoniae Caulis was selected as an example to develop and validate these methods for the simultaneous determination of four alkaloids: columbamine, jatrorrhizine, palmatine, and berberine. Comprehensive comparisons among these methods and with the conventional external standard method (ESM) were carried out. The relative expanded uncertainty of measurement was used for the first time to compare their credibility. The results showed that all three newly developed methods can accurately accomplish the quantification using only one purified reference standard, but each has its own advantages and disadvantages as well as a specific application scope, which are also discussed in detail in this paper. Copyright © 2015 Elsevier B.V. All rights reserved.
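Single-standard strategies of this kind are commonly implemented with relative correction factors; the sketch below shows that general formulation, assuming berberine as the calibrated reference. The factor values and peak areas are hypothetical and the exact formulation used in the paper may differ.

```python
# Hedged sketch of single-standard, multi-compound quantification via relative
# correction factors. All numbers are hypothetical.

# Calibration of the single reference standard (berberine): area per unit concentration.
berberine_area, berberine_conc = 1.50e6, 50.0          # arbitrary area units, ug/mL
response_berberine = berberine_area / berberine_conc

# Relative correction factors f_i = response(berberine) / response(analyte_i),
# assumed to be established beforehand with authentic standards during method development.
correction = {"columbamine": 1.12, "jatrorrhizine": 1.05, "palmatine": 0.97}

sample_areas = {"columbamine": 2.1e5, "jatrorrhizine": 3.4e5, "palmatine": 6.0e5}
for name, area in sample_areas.items():
    conc = area * correction[name] / response_berberine   # ug/mL in the test solution
    print(f"{name}: {conc:.1f} ug/mL")
```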
NASA Astrophysics Data System (ADS)
Ignatov, D.; Zhurbina, N.; Gerasimenko, A.
2017-01-01
3-D composites are widely used in tissue engineering. A comprehensive analysis by X-ray microtomography was conducted to study the structure of the 3-D composites. The comprehensive analysis consisted of scanning, image reconstruction from shadow projections, two-dimensional and three-dimensional visualization of the reconstructed images, and quantitative analysis of the samples. Experimental samples of the composites were formed by laser vaporization of an aqueous dispersion of BSA with single-walled (SWCNTs) and multi-walled (MWCNTs) carbon nanotubes. The samples have a homogeneous structure over the entire volume; the porosity of the 3-D composites based on SWCNTs and MWCNTs was 16.44% and 28.31%, respectively, and the average pore diameters were 45 μm and 93 μm, respectively. 3-D composites based on carbon nanotubes in a bovine serum albumin matrix can be used in tissue engineering of bone and cartilage, providing cell proliferation and blood vessel sprouting.
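The quantitative step of such a microtomography analysis typically reduces to computing the pore fraction and a pore-size estimate from a binarized volume. The sketch below uses a synthetic random volume and an assumed voxel size, so it only illustrates the calculation, not the reported values.

```python
# Sketch: porosity and equivalent pore diameter from a binarized (synthetic) microCT volume.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(7)
volume = rng.random((64, 64, 64)) > 0.85         # True = pore voxel (synthetic volume)

porosity = volume.mean() * 100                   # percentage of pore voxels
labels, n_pores = ndimage.label(volume)          # connected pore regions
sizes = ndimage.sum(volume, labels, index=range(1, n_pores + 1))   # voxels per pore

voxel_um = 5.0                                   # assumed isotropic voxel edge length
d_eq = 2 * (3 * sizes.mean() * voxel_um ** 3 / (4 * np.pi)) ** (1 / 3)
print(f"porosity ~ {porosity:.1f}%, mean equivalent pore diameter ~ {d_eq:.0f} um")
```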
Tan, M L H; Kok, K; Ganesh, V; Thomas, S S
2014-02-01
Breast cancer patients' expectations and choice of reconstruction are increasing, and patients often satisfy their information needs outside clinic time by searching the world wide web. The aim of our study was to analyse the quality of content and extent of information regarding breast reconstruction available in YouTube videos and whether this is an appropriate additional source of information for patients. A snapshot qualitative and quantitative analysis of the first 100 videos was performed after the term 'breast reconstruction' was input into the search window of the video sharing website www.youtube.com on the 1st of September 2011. Qualitative categorical analysis included patient, oncological and reconstruction factors. It was concluded that although videos uploaded onto YouTube do not provide comprehensive information, it is a useful resource that can be utilised in patient education provided comprehensive and validated videos are made available. Copyright © 2013 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Ardiansah; Masykuri, M.; Rahardjo, S. B.
2018-05-01
Students’ conceptual understanding is the most important foundation for building further related understanding; however, students often hold their own conceptions. With this needs analysis, we elicit the need for a 3TMC diagnostic test to measure students’ conceptions of acid-base chemistry and solubility equilibrium. The research was carried out as a mixed-methods study using questionnaire analysis based on quantitative and qualitative description. The research subjects were 96 students from 4 senior high schools and 4 chemistry teachers chosen by a random sampling technique. Data were gathered using a questionnaire with 10 questions for students and 28 questions for teachers. The results showed that 97% of students stated that the development of this instrument is needed. In addition, several problems emerged from the questionnaire, including learning activity, teachers’ tests, and guessing. In conclusion, it is necessary to develop the 3TMC instrument, which can diagnose and measure students’ conceptions of acid-base chemistry and solubility equilibrium.
Methods for heat transfer and temperature field analysis of the insulated diesel
NASA Technical Reports Server (NTRS)
Morel, T.; Blumberg, P. N.; Fort, E. F.; Keribar, R.
1984-01-01
Work done during phase 1 of a three-year program aimed at developing a comprehensive heat transfer and thermal analysis methodology oriented specifically to the design requirements of insulated diesel engines is reported. The technology developed in this program makes possible a quantitative analysis of the low heat rejection concept. The program is comprehensive in that it addresses all the heat transfer issues that are critical to the successful development of the low heat rejection diesel engine: (1) in-cylinder convective and radiative heat transfer; (2) cyclic transient heat transfer in thin solid layers at component surfaces adjacent to the combustion chamber; and (3) steady-state heat conduction in the overall engine structure. The Integral Technologies, Inc. (ITI) program comprises a set of integrated analytical and experimental tasks. A detailed review of the ITI program approach is provided, including the technical issues which underlie it and a summary of the methods that were developed.
A multi-method approach toward de novo glycan characterization: a Man-5 case study.
Prien, Justin M; Prater, Bradley D; Cockrill, Steven L
2010-05-01
Regulatory agencies' expectations for biotherapeutic approval are becoming more stringent with regard to product characterization, where minor species as low as 0.1% of a given profile are typically identified. The mission of this manuscript is to demonstrate a multi-method approach toward de novo glycan characterization and quantitation, including minor species at or approaching the 0.1% benchmark. Recently, unexpected isomers of the Man(5)GlcNAc(2) (M(5)) were reported (Prien JM, Ashline DJ, Lapadula AJ, Zhang H, Reinhold VN. 2009. The high mannose glycans from bovine ribonuclease B isomer characterization by ion trap mass spectrometry (MS). J Am Soc Mass Spectrom. 20:539-556). In the current study, quantitative analysis of these isomers found in commercial M(5) standard demonstrated that they are in low abundance (<1% of the total) and therefore an exemplary "litmus test" for minor species characterization. A simple workflow devised around three core well-established analytical procedures: (1) fluorescence derivatization; (2) online rapid resolution reversed-phase separation coupled with negative-mode sequential mass spectrometry (RRRP-(-)-MS(n)); and (3) permethylation derivatization with nanospray sequential mass spectrometry (NSI-MS(n)) provides comprehensive glycan structural determination. All methods have limitations; however, a multi-method workflow is an at-line stopgap/solution which mitigates each method's individual shortcoming(s) providing greater opportunity for more comprehensive characterization. This manuscript is the first to demonstrate quantitative chromatographic separation of the M(5) isomers and the use of a commercially available stable isotope variant of 2-aminobenzoic acid to detect and chromatographically resolve multiple M(5) isomers in bovine ribonuclease B. With this multi-method approach, we have the capabilities to comprehensively characterize a biotherapeutic's glycan array in a de novo manner, including structural isomers at ≥0.1% of the total chromatographic peak area.
To label or not to label: applications of quantitative proteomics in neuroscience research.
Filiou, Michaela D; Martins-de-Souza, Daniel; Guest, Paul C; Bahn, Sabine; Turck, Christoph W
2012-02-01
Proteomics has provided researchers with a sophisticated toolbox of labeling-based and label-free quantitative methods. These are now being applied in neuroscience research where they have already contributed to the elucidation of fundamental mechanisms and the discovery of candidate biomarkers. In this review, we evaluate and compare labeling-based and label-free quantitative proteomic techniques for applications in neuroscience research. We discuss the considerations required for the analysis of brain and central nervous system specimens, the experimental design of quantitative proteomic workflows as well as the feasibility, advantages, and disadvantages of the available techniques for neuroscience-oriented questions. Furthermore, we assess the use of labeled standards as internal controls for comparative studies in humans and review applications of labeling-based and label-free mass spectrometry approaches in relevant model organisms and human subjects. Providing a comprehensive guide of feasible and meaningful quantitative proteomic methodologies for neuroscience research is crucial not only for overcoming current limitations but also for gaining useful insights into brain function and translating proteomics from bench to bedside. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Photometric Determination of Ammonium and Phosphate in Seawater Medium Using a Microplate Reader.
Ruppersberg, Hanna S; Goebel, Maren R; Kleinert, Svea I; Wünsch, Daniel; Trautwein, Kathleen; Rabus, Ralf
2017-01-01
To more efficiently process the large sample numbers for quantitative determination of ammonium (NH4+) and phosphate (orthophosphate, PO43-) generated during comprehensive growth experiments with the marine Roseobacter group member Phaeobacter inhibens DSM 17395, specific colorimetric assays employing a microplate reader (MPR) were established. The NH4+ assay is based on the reaction of NH4+ with hypochlorite and salicylate, yielding a limit of detection of 14 µM, a limit of quantitation of 36 µM, and a linear range for quantitative determination up to 200 µM. The PO43- assay is based on the complex formation of PO43- with ammonium molybdate in the presence of ascorbate and zinc acetate, yielding a limit of detection of 13 µM, a limit of quantitation of 50 µM, and a linear range for quantitative determination up to 1 mM. Both MPR-based assays allowed for fast (significantly less than 1 h) analysis of 21 samples plus standards for calibration (all measured in triplicate) and showed only low variation across a large collection of biological samples. © 2017 S. Karger AG, Basel.
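The abstract above reports detection and quantitation limits but not the convention used to derive them; as a hedged illustration, one common convention (an assumption here, not necessarily the authors' procedure) computes them from the calibration slope S and the standard deviation of blank or low-level replicates:

    \mathrm{LOD} \approx \frac{3.3\,\sigma_{\mathrm{blank}}}{S},
    \qquad
    \mathrm{LOQ} \approx \frac{10\,\sigma_{\mathrm{blank}}}{S}

Under this convention the LOQ is roughly three times the LOD, broadly in line with the 14/36 µM and 13/50 µM pairs reported above.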
Ouyang, Yilan; Zeng, Yangyang; Yi, Lin; Tang, Hong; Li, Duxin; Linhardt, Robert J; Zhang, Zhenqing
2017-11-03
Heparin, a highly sulfated glycosaminoglycan, has been used as a clinical anticoagulant for over 80 years. Low molecular weight heparins (LMWHs), heparins partially depolymerized using different processes, are widely used as clinical anticoagulants. Qualitative molecular weight (MW) analysis and quantitative mass content analysis are two important factors that contribute to LMWH quality control. Size exclusion chromatography (SEC), relying on multiple angle laser scattering (MALS)/refractive index (RI) detectors, has been developed for accurate analysis of heparin MW in the absence of standards. However, the cations, which ion-pair with the anionic polysaccharide chains of heparin and LMWHs, had not been considered in previous reports. In this study, SEC with MALS/RI and inductively coupled plasma/mass spectrometry detectors was used in a comprehensive analytical approach that takes into account both the anionic polysaccharide chains and the ion-paired cations of heparin products. This approach was also applied to quantitative analysis of heparin and LMWHs. Full profiles of MWs and mass recoveries for three commercial heparin/LMWH products, heparin sodium, enoxaparin sodium and nadroparin calcium, were obtained and all showed higher MWs than previously reported. This important improvement more precisely characterized the MW properties of heparin/LMWHs and potentially many other anionic polysaccharides. Copyright © 2017 Elsevier B.V. All rights reserved.
Ristov, Strahil; Brajkovic, Vladimir; Cubric-Curik, Vlatka; Michieli, Ivan; Curik, Ino
2016-09-10
Identification of genes or even nucleotides that are responsible for quantitative and adaptive trait variation is a difficult task due to the complex interdependence between a large number of genetic and environmental factors. The polymorphism of the mitogenome is one of the factors that can contribute to quantitative trait variation. However, the effects of the mitogenome have not been comprehensively studied, since large numbers of mitogenome sequences and recorded phenotypes are required to reach the adequate power of analysis. Current research in our group focuses on acquiring the necessary mitochondria sequence information and analysing its influence on the phenotype of a quantitative trait. To facilitate these tasks we have produced software for processing pedigrees that is optimised for maternal lineage analysis. We present MaGelLAn 1.0 (maternal genealogy lineage analyser), a suite of four Python scripts (modules) that is designed to facilitate the analysis of the impact of mitogenome polymorphism on quantitative trait variation by combining molecular and pedigree information. MaGelLAn 1.0 is primarily used to: (1) optimise the sampling strategy for molecular analyses; (2) identify and correct pedigree inconsistencies; and (3) identify maternal lineages and assign the corresponding mitogenome sequences to all individuals in the pedigree, this information being used as input to any of the standard software for quantitative genetic (association) analysis. In addition, MaGelLAn 1.0 allows computing the mitogenome (maternal) effective population sizes and probability of mitogenome (maternal) identity that are useful for conservation management of small populations. MaGelLAn is the first tool for pedigree analysis that focuses on quantitative genetic analyses of mitogenome data. It is conceived with the purpose to significantly reduce the effort in handling and preparing large pedigrees for processing the information linked to maternal lines. The software source code, along with the manual and the example files can be downloaded at http://lissp.irb.hr/software/magellan-1-0/ and https://github.com/sristov/magellan .
Liu, Kehui; Zhang, Jiyang; Fu, Bin; Xie, Hongwei; Wang, Yingchun; Qian, Xiaohong
2014-07-01
Precise protein quantification is essential in comparative proteomics. Currently, quantification bias is inevitable when using a proteotypic peptide-based quantitative proteomics strategy because of differences in peptide measurability. To improve quantification accuracy, we proposed an "empirical rule for linearly correlated peptide selection (ERLPS)" in quantitative proteomics in our previous work. However, a systematic evaluation of the general application of ERLPS in quantitative proteomics under diverse experimental conditions needed to be conducted. In this study, the practical workflow of ERLPS was explicitly illustrated; different experimental variables, such as different MS systems, sample complexities, sample preparations, elution gradients, matrix effects, loading amounts, and other factors were comprehensively investigated to evaluate the applicability, reproducibility, and transferability of ERLPS. The results demonstrated that ERLPS was highly reproducible and transferable within appropriate loading amounts and that linearly correlated response peptides should be selected for each specific experiment. ERLPS was applied to proteome samples from yeast to mouse and human, and to quantitative methods from label-free to O18/O16-labeled and SILAC analysis, and enabled accurate measurements for all proteotypic peptide-based quantitative proteomics over a large dynamic range. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
ERIC Educational Resources Information Center
Accardo, Amy L.; Finnegan, Elizabeth G.; Gulkus, Steven P.; Papay, Clare K.
2017-01-01
Learners with autism spectrum disorder (ASD) often exhibit difficulty in the area of reading comprehension. Research connecting the learning needs of individuals with ASD, existing effective practices, teacher training, and teacher perceptions of their own ability to teach reading comprehension is scarce. Quantitative survey methodology and…
Supporting Social Studies Reading Comprehension with an Electronic Pop-Up Dictionary
ERIC Educational Resources Information Center
Fry, Sara Winstead; Gosky, Ross
2008-01-01
This study investigated how middle school students' comprehension was impacted by reading social studies texts online with a pop-up dictionary function for every word in the text. A quantitative counterbalance design was used to determine how 129 middle school students' reading comprehension test scores for the pop-up dictionary reading differed…
Van Oudenhove, Laurence; Devreese, Bart
2013-06-01
Proteomics has evolved substantially since its early days, some 20 years ago. In this mini-review, we aim to provide an overview of general methodologies and more recent developments in mass spectrometric approaches used for relative and absolute quantitation of proteins. Enhancement of sensitivity of the mass spectrometers as well as improved sample preparation and protein fractionation methods are resulting in a more comprehensive analysis of proteomes. We also document some upcoming trends for quantitative proteomics such as the use of label-free quantification methods. Hopefully, microbiologists will continue to explore proteomics as a tool in their research to understand the adaptation of microorganisms to their ever changing environment. We encourage them to incorporate some of the described new developments in mass spectrometry to facilitate their analyses and improve the general knowledge of the fascinating world of microorganisms.
Global, quantitative and dynamic mapping of protein subcellular localization.
Itzhak, Daniel N; Tyanova, Stefka; Cox, Jürgen; Borner, Georg Hh
2016-06-09
Subcellular localization critically influences protein function, and cells control protein localization to regulate biological processes. We have developed and applied Dynamic Organellar Maps, a proteomic method that allows global mapping of protein translocation events. We initially used maps statically to generate a database with localization and absolute copy number information for over 8700 proteins from HeLa cells, approaching comprehensive coverage. All major organelles were resolved, with exceptional prediction accuracy (estimated at >92%). Combining spatial and abundance information yielded an unprecedented quantitative view of HeLa cell anatomy and organellar composition, at the protein level. We subsequently demonstrated the dynamic capabilities of the approach by capturing translocation events following EGF stimulation, which we integrated into a quantitative model. Dynamic Organellar Maps enable the proteome-wide analysis of physiological protein movements, without requiring any reagents specific to the investigated process, and will thus be widely applicable in cell biology.
Geospatial Resource Access Analysis In Hedaru, Tanzania
NASA Astrophysics Data System (ADS)
Clark, Dylan G.; Premkumar, Deepak; Mazur, Robert; Kisimbo, Elibariki
2013-12-01
Populations around the world are facing increased impacts of anthropogenic environmental change and rapid population movements. These environmental and social shifts are having an elevated impact on the livelihoods of agriculturalists and pastoralists in developing countries. This appraisal integrates various tools, usually used independently, to gain a comprehensive understanding of the regional livelihood constraints in the rural Hedaru Valley of northeastern Tanzania. The study was conducted in three villages with different natural resources, using three primary methods: (1) participatory mapping of infrastructure; (2) administration of quantitative, spatially tied surveys (n=80) and focus groups (n=14) that examined land use, household health, education, and demographics; and (3) quantitative time-series analysis of Landsat-based Normalized Difference Vegetation Index (NDVI) images. Through various geospatial and multivariate linear regression analyses, significant geospatial trends emerged. This research added to the academic understanding of the region while establishing pathways for climate change adaptation strategies.
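For reference, the vegetation index used in the time-series analysis above is computed per pixel from the near-infrared and red surface reflectances (which Landsat bands these correspond to depends on the sensor and is not specified here):

    \mathrm{NDVI} = \frac{\rho_{\mathrm{NIR}} - \rho_{\mathrm{Red}}}{\rho_{\mathrm{NIR}} + \rho_{\mathrm{Red}}}

Values close to 1 indicate dense green vegetation, values near 0 bare soil, and negative values typically water or clouds.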
NBSGSC - a FORTRAN program for quantitative x-ray fluorescence analysis. Technical note (final)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tao, G.Y.; Pella, P.A.; Rousseau, R.M.
1985-04-01
A FORTRAN program (NBSGSC) was developed for performing quantitative analysis of bulk specimens by x-ray fluorescence spectrometry. This program corrects for x-ray absorption/enhancement phenomena using the comprehensive alpha coefficient algorithm proposed by Lachance (COLA). NBSGSC is a revision of the programs ALPHA and CARECAL originally developed by R.M. Rousseau of the Geological Survey of Canada. Part one of the program (CALCO) performs the calculation of theoretical alpha coefficients, and part two (CALCOMP) computes the composition of the analyte specimens. The analysis of alloys, pressed minerals, and fused specimens can currently be treated by the program. In addition to using measured x-ray tube spectral distributions, spectra from seven commonly used x-ray tube targets could also be calculated with an NBS algorithm included in the program. NBSGSC is written in FORTRAN IV for a Digital Equipment Corporation (DEC PDP-11/23) minicomputer using RLO2 firm disks and an RSX-11M operating system.
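As a hedged sketch of the correction such a program implements: in the basic Lachance-Traill formulation on which the COLA algorithm builds (the exact composition-dependent form of the coefficients used by NBSGSC is not reproduced here), the concentration C_i of analyte i is obtained from its measured relative intensity R_i and the concentrations C_j of the other elements as

    C_i = R_i \left( 1 + \sum_{j \neq i} \alpha_{ij}\, C_j \right)

where the alpha coefficients quantify the absorption/enhancement effect of element j on the analyte line of element i; CALCO computes such coefficients theoretically, and CALCOMP then uses them to solve for the specimen composition (in practice iteratively, since the C_j are initially unknown).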
[Urban ecological land in Changsha City: its quantitative analysis and optimization].
Li, Xiao-Li; Zeng, Guang-Ming; Shi, Lin; Liang, Jie; Cai, Qing
2010-02-01
In this paper, a hierarchy index system suitable for the catastrophe progression method was constructed to comprehensively analyze and evaluate the status of ecological land construction in Changsha City in 2007. Based on the evaluation results, the irrationalities of the distribution pattern of Changsha urban ecological land were discussed. With the support of a geographic information system (GIS), ecological corridors for the urban ecological land were constructed using 'least-cost' modeling, and, in combination with conflict analysis, an optimization scheme for the urban ecological land was put forward, forming an integrated evaluation system. The results indicated that the ecological efficiency of urban ecological land in Changsha in 2007 was at a medium level, with an evaluation value of 0.9416; the quantitative index was relatively high but the coordination index relatively low. Analysis and verification with the software Fragstats showed that the ecological efficiency of the urban ecological land after optimization was higher, with an evaluation value of 0.9618, and the SHDI, CONTAG, and other indices also improved.
Improved FTA methodology and application to subsea pipeline reliability design.
Lin, Jing; Yuan, Yongbo; Zhang, Mingyuan
2014-01-01
An innovative logic tree, Failure Expansion Tree (FET), is proposed in this paper, which improves on traditional Fault Tree Analysis (FTA). It describes a different thinking approach for risk factor identification and reliability risk assessment. By providing a more comprehensive and objective methodology, the rather subjective nature of FTA node discovery is significantly reduced and the resulting mathematical calculations for quantitative analysis are greatly simplified. Applied to the Useful Life phase of a subsea pipeline engineering project, the approach provides a more structured analysis by constructing a tree following the laws of physics and geometry. Resulting improvements are summarized in comparison table form.
Improved FTA Methodology and Application to Subsea Pipeline Reliability Design
Lin, Jing; Yuan, Yongbo; Zhang, Mingyuan
2014-01-01
An innovative logic tree, Failure Expansion Tree (FET), is proposed in this paper, which improves on traditional Fault Tree Analysis (FTA). It describes a different thinking approach for risk factor identification and reliability risk assessment. By providing a more comprehensive and objective methodology, the rather subjective nature of FTA node discovery is significantly reduced and the resulting mathematical calculations for quantitative analysis are greatly simplified. Applied to the Useful Life phase of a subsea pipeline engineering project, the approach provides a more structured analysis by constructing a tree following the laws of physics and geometry. Resulting improvements are summarized in comparison table form. PMID:24667681
Complex and dynamic landscape of RNA polyadenylation revealed by PAS-Seq
Shepard, Peter J.; Choi, Eun-A; Lu, Jente; Flanagan, Lisa A.; Hertel, Klemens J.; Shi, Yongsheng
2011-01-01
Alternative polyadenylation (APA) of mRNAs has emerged as an important mechanism for post-transcriptional gene regulation in higher eukaryotes. Although microarrays have recently been used to characterize APA globally, they have a number of serious limitations that prevent comprehensive and highly quantitative analysis. To better characterize APA and its regulation, we have developed a deep sequencing-based method called Poly(A) Site Sequencing (PAS-Seq) for quantitatively profiling RNA polyadenylation at the transcriptome level. PAS-Seq not only accurately and comprehensively identifies poly(A) junctions in mRNAs and noncoding RNAs, but also provides quantitative information on the relative abundance of polyadenylated RNAs. PAS-Seq analyses of human and mouse transcriptomes showed that 40%–50% of all expressed genes produce alternatively polyadenylated mRNAs. Furthermore, our study detected evolutionarily conserved polyadenylation of histone mRNAs and revealed novel features of mitochondrial RNA polyadenylation. Finally, PAS-Seq analyses of mouse embryonic stem (ES) cells, neural stem/progenitor (NSP) cells, and neurons not only identified more poly(A) sites than what was found in the entire mouse EST database, but also detected significant changes in the global APA profile that lead to lengthening of 3′ untranslated regions (UTR) in many mRNAs during stem cell differentiation. Together, our PAS-Seq analyses revealed a complex landscape of RNA polyadenylation in mammalian cells and the dynamic regulation of APA during stem cell differentiation. PMID:21343387
Radiomics-based Prognosis Analysis for Non-Small Cell Lung Cancer
NASA Astrophysics Data System (ADS)
Zhang, Yucheng; Oikonomou, Anastasia; Wong, Alexander; Haider, Masoom A.; Khalvati, Farzad
2017-04-01
Radiomics characterizes tumor phenotypes by extracting large numbers of quantitative features from radiological images. Radiomic features have been shown to provide prognostic value in predicting clinical outcomes in several studies. However, several challenges including feature redundancy, unbalanced data, and small sample sizes have led to relatively low predictive accuracy. In this study, we explore different strategies for overcoming these challenges and improving predictive performance of radiomics-based prognosis for non-small cell lung cancer (NSCLC). CT images of 112 patients (mean age 75 years) with NSCLC who underwent stereotactic body radiotherapy were used to predict recurrence, death, and recurrence-free survival using a comprehensive radiomics analysis. Different feature selection and predictive modeling techniques were used to determine the optimal configuration of prognosis analysis. To address feature redundancy, comprehensive analysis indicated that Random Forest models and Principal Component Analysis were optimum predictive modeling and feature selection methods, respectively, for achieving high prognosis performance. To address unbalanced data, Synthetic Minority Over-sampling technique was found to significantly increase predictive accuracy. A full analysis of variance showed that data endpoints, feature selection techniques, and classifiers were significant factors in affecting predictive accuracy, suggesting that these factors must be investigated when building radiomics-based predictive models for cancer prognosis.
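A minimal sketch of the reported optimal configuration (PCA for feature reduction, SMOTE for class imbalance, a Random Forest classifier), assuming scikit-learn and imbalanced-learn; the feature matrix, outcome and all hyperparameters below are illustrative placeholders rather than the study's settings:

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import StratifiedKFold, cross_val_score
    from imblearn.over_sampling import SMOTE
    from imblearn.pipeline import Pipeline  # keeps SMOTE inside each CV training fold

    # Placeholder radiomic feature matrix (patients x features) and binary endpoint.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(112, 400))           # e.g., 400 radiomic features
    y = (rng.random(112) < 0.25).astype(int)  # unbalanced outcome, e.g., recurrence

    pipe = Pipeline([
        ("pca", PCA(n_components=10)),        # address feature redundancy
        ("smote", SMOTE(random_state=0)),     # address class imbalance
        ("rf", RandomForestClassifier(n_estimators=500, random_state=0)),
    ])

    cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
    auc = cross_val_score(pipe, X, y, cv=cv, scoring="roc_auc")
    print(auc.mean())

Placing SMOTE inside the imbalanced-learn pipeline confines the oversampling to each training fold, so the cross-validated AUC is not inflated by synthetic samples leaking into the test folds.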
Fan, Yannan; Siklenka, Keith; Arora, Simran K.; Ribeiro, Paula; Kimmins, Sarah; Xia, Jianguo
2016-01-01
MicroRNAs (miRNAs) can regulate nearly all biological processes and their dysregulation is implicated in various complex diseases and pathological conditions. Recent years have seen a growing number of functional studies of miRNAs using high-throughput experimental technologies, which have produced a large amount of high-quality data regarding miRNA target genes and their interactions with small molecules, long non-coding RNAs, epigenetic modifiers, disease associations, etc. These rich sets of information have enabled the creation of comprehensive networks linking miRNAs with various biologically important entities to shed light on their collective functions and regulatory mechanisms. Here, we introduce miRNet, an easy-to-use web-based tool that offers statistical, visual and network-based approaches to help researchers understand miRNA functions and regulatory mechanisms. The key features of miRNet include: (i) a comprehensive knowledge base integrating high-quality miRNA-target interaction data from 11 databases; (ii) support for differential expression analysis of data from microarray, RNA-seq and quantitative PCR; (iii) implementation of a flexible interface for data filtering, refinement and customization during network creation; (iv) a powerful fully featured network visualization system coupled with enrichment analysis. miRNet offers a comprehensive tool suite to enable statistical analysis and functional interpretation of various data generated from current miRNA studies. miRNet is freely available at http://www.mirnet.ca. PMID:27105848
Liu, Wanting; Xiang, Lunping; Zheng, Tingkai; Jin, Jingjie
2018-01-01
Translation is a key regulatory step, linking transcriptome and proteome. Two major methods of translatome investigation are RNC-seq (sequencing of translating mRNA) and Ribo-seq (ribosome profiling). To facilitate the investigation of translation, we built a comprehensive database, TranslatomeDB (http://www.translatomedb.net/), which provides collection and integrated analysis of published and user-generated translatome sequencing data. The current version includes 2453 Ribo-seq, 10 RNC-seq and their 1394 corresponding mRNA-seq datasets in 13 species. The database emphasizes analysis functions in addition to the dataset collections. Differential gene expression (DGE) analysis can be performed between any two datasets of the same species and type, on both the transcriptome and translatome levels. The translation indices (translation ratio, elongation velocity index and translational efficiency) can be calculated to quantitatively evaluate translational initiation efficiency and elongation velocity. All datasets were analyzed using a unified, robust, accurate and experimentally verifiable pipeline based on the FANSe3 mapping algorithm and edgeR for DGE analyses. TranslatomeDB also allows users to upload their own datasets and utilize the identical unified pipeline to analyze their data. We believe that TranslatomeDB is a comprehensive platform and knowledgebase for translatome and proteome research, freeing biologists from complex searching, analysis and comparison of huge sequencing datasets without needing local computational power. PMID:29106630
Wen, Meiling; Jin, Ya; Manabe, Takashi; Chen, Shumin; Tan, Wen
2017-12-01
MS identification has long been used for PAGE-separated protein bands, but global and systematic quantitation utilizing MS after PAGE has remained rare and not been reported for native PAGE. Here we reported on a new method combining native PAGE, whole-gel slicing and quantitative LC-MS/MS, aiming at comparative analysis on not only abundance, but also structures and interactions of proteins. A pair of human plasma and serum samples were used as test samples and separated on a native PAGE gel. Six lanes of each sample were cut, each lane was further sliced into thirty-five 1.1 mm × 1.1 mm squares and all the squares were subjected to standardized procedures of in-gel digestion and quantitative LC-MS/MS. The results comprised 958 data rows that each contained abundance values of a protein detected in one square in eleven gel lanes (one plasma lane excluded). The data were evaluated to have satisfactory reproducibility of assignment and quantitation. Totally 315 proteins were assigned, with each protein assigned in 1-28 squares. The abundance distributions in the plasma and serum gel lanes were reconstructed for each protein, named as "native MS-electropherograms". Comparison of the electropherograms revealed significant plasma-versus-serum differences on 33 proteins in 87 squares (fold difference > 2 or < 0.5, p < 0.05). Many of the differences matched with accumulated knowledge on protein interactions and proteolysis involved in blood coagulation, complement and wound healing processes. We expect this method would be useful to provide more comprehensive information in comparative proteomic analysis, on both quantities and structures/interactions. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Qiu, Zeyang; Liang, Wei; Wang, Xue; Lin, Yang; Zhang, Meng
2017-05-01
As an important part of the national energy supply system, transmission pipelines for natural gas can cause serious environmental pollution and loss of life and property in case of accident. Third party damage is one of the most significant causes of natural gas pipeline system accidents, and it is very important to establish an effective quantitative risk assessment model of third party damage to reduce the number of gas pipeline operation accidents. Because third party damage accidents have characteristics such as diversity, complexity and uncertainty, this paper establishes a quantitative risk assessment model of third party damage based on the Analytic Hierarchy Process (AHP) and Fuzzy Comprehensive Evaluation (FCE). First, the risk sources of third party damage are identified exactly; the weights of the factors are then determined via an improved AHP; finally, the importance of each factor is calculated with the fuzzy comprehensive evaluation model. The results show that the quantitative risk assessment model is suitable for third party damage to natural gas pipelines and that improvement measures can be put forward to avoid accidents based on the importance of each factor.
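A minimal sketch of the generic AHP-plus-FCE calculation described above, assuming the standard eigenvector weighting and a weighted-average fuzzy operator; the pairwise comparison matrix, membership matrix and factor grades below are illustrative placeholders, not values from the paper:

    import numpy as np

    # Placeholder pairwise comparison matrix for three third-party-damage risk factors.
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    # AHP weights: normalized principal eigenvector of the comparison matrix.
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w /= w.sum()

    # Consistency check (random index RI = 0.58 for a 3x3 matrix); CR < 0.1 is
    # conventionally acceptable.
    lam_max = vals.real[k]
    CR = ((lam_max - 3) / 2) / 0.58

    # Fuzzy comprehensive evaluation: R maps each factor to membership degrees over
    # risk grades (e.g., low / medium / high); B = w . R is the overall grade vector.
    R = np.array([[0.1, 0.3, 0.6],
                  [0.4, 0.4, 0.2],
                  [0.6, 0.3, 0.1]])
    B = w @ R
    print(w, CR, B)

The grade with the largest entry of B would then be read as the overall third-party-damage risk level.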
Geerts, Hugo; Hofmann-Apitius, Martin; Anastasio, Thomas J
2017-11-01
Neurodegenerative diseases such as Alzheimer's disease (AD) follow a slowly progressing dysfunctional trajectory, with a large presymptomatic component and many comorbidities. Using preclinical models and large-scale omics studies ranging from genetics to imaging, a large number of processes that might be involved in AD pathology at different stages and levels have been identified. The sheer number of putative hypotheses makes it almost impossible to estimate their contribution to the clinical outcome and to develop a comprehensive view on the pathological processes driving the clinical phenotype. Traditionally, bioinformatics approaches have provided correlations and associations between processes and phenotypes. Focusing on causality, a new breed of advanced and more quantitative modeling approaches that use formalized domain expertise offer new opportunities to integrate these different modalities and outline possible paths toward new therapeutic interventions. This article reviews three different computational approaches and their possible complementarities. Process algebras, implemented using declarative programming languages such as Maude, facilitate simulation and analysis of complicated biological processes on a comprehensive but coarse-grained level. A model-driven Integration of Data and Knowledge, based on the OpenBEL platform and using reverse causative reasoning and network jump analysis, can generate mechanistic knowledge and a new, mechanism-based taxonomy of disease. Finally, Quantitative Systems Pharmacology is based on formalized implementation of domain expertise in a more fine-grained, mechanism-driven, quantitative, and predictive humanized computer model. We propose a strategy to combine the strengths of these individual approaches for developing powerful modeling methodologies that can provide actionable knowledge for rational development of preventive and therapeutic interventions. Development of these computational approaches is likely to be required for further progress in understanding and treating AD. Copyright © 2017 the Alzheimer's Association. Published by Elsevier Inc. All rights reserved.
Cao, Di; Wang, Qing; Jin, Jing; Qiu, Maosong; Zhou, Lian; Zhou, Xinghong; Li, Hui; Zhao, Zhongxiang
2018-03-01
Ilex pubescens Hook et Arn mainly contains triterpenoids that possess antithrombotic, anti-inflammatory and analgesic effects. Quantitative and qualitative analyses of the triterpenoids in I. pubescens can be useful for determining the authenticity and quality of raw materials and guiding its clinical preparation. To establish a method for rapid and comprehensive analysis of triterpenoids in I. pubescens using ultra-high-performance liquid chromatography coupled to electrospray ionisation and quadrupole time-of-flight-mass spectrometry (UPLC-ESI-QTOF-MS), which will also be applied to evaluate the contents of nine triterpenoids among root, root heartwood and root bark of I. pubescens to judge the value of the root bark to avoid wastage. UPLC-ESI-QTOF-MS data from the extracts of I. pubescens in negative mode were analysed using Peakview and Masterview software that provided molecular weight, mass errors, isotope pattern fit and MS/MS fragments for the identification of triterpenoids. The quantification of nine investigated compounds of I. pubescens was accomplished using MultiQuant software. A total of 33 triterpenoids, five phenolic acids, two lignans and a flavonol were characterised in only 14 min. The total content of the nine compounds in the root bark was generally slightly higher than that of the root and root heartwood, which has not been reported before. The developed UPLC-ESI-QTOF-MS method was proven to be rapid and comprehensive for simultaneous qualitative and quantitative analyses of the characteristic triterpenoids in I. pubescens. The results may provide a basis for holistic quality control and metabolic studies of I. pubescens, as well as serve as a reference for the analysis of other Ilex plants. Copyright © 2017 John Wiley & Sons, Ltd.
Mast, Fred D.; Ratushny, Alexander V.
2014-01-01
Systems cell biology melds high-throughput experimentation with quantitative analysis and modeling to understand many critical processes that contribute to cellular organization and dynamics. Recently, there have been several advances in technology and in the application of modeling approaches that enable the exploration of the dynamic properties of cells. Merging technology and computation offers an opportunity to objectively address unsolved cellular mechanisms, and has revealed emergent properties and helped to gain a more comprehensive and fundamental understanding of cell biology. PMID:25225336
Celińska, Ewelina; Olkowicz, Mariola; Grajek, Włodzimierz
2015-08-01
A world-wide effort is now being pursued towards producing flavors and fragrances (F&F) independently of traditional sources, as well as of depleting fossil fuel supplies. Biotechnological production of F&F by microbes has emerged as a vivid solution to the current market limitations. Amongst a wide variety of fragrant chemicals, 2-PE is of significant interest to both the scientific and industrial community. Although the general overview of the 2-PE synthesis pathway is commonly known, the involvement of particular molecular identities in this pathway has not been elucidated in Yarrowia lipolytica to date. The aim of this study was mapping the molecular identities involved in 2-PE synthesis in Y. lipolytica. To acquire a comprehensive landscape of the proteins that are directly and indirectly involved in L-Phe degradation and 2-PE synthesis, we took advantage of the comprehensiveness and sensitivity of high-throughput quantitative LC-MS/MS analysis. Amongst a number of proteins involved in amino acid turnover and central carbon metabolism, enzymes involved in L-Phe conversion to 2-PE have been identified. Results on the yeast-to-hyphae transition in relation to the character of the provided nitrogen source are also presented. © FEMS 2015. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
NASA Astrophysics Data System (ADS)
Flyvbjerg, Henrik; Mortensen, Kim I.
2015-06-01
With each new aspect of nature that becomes accessible to quantitative science, new needs arise for data analysis and mathematical modeling. The classical example is Tycho Brahe's accurate and comprehensive observations of planets, which made him hire Kepler for his mathematical skills to assist with the data analysis. We all learned what that led to: Kepler's three laws of planetary motion, phenomenology in purely mathematical form. Newton built on this, and the scientific revolution was over, completed.
New EVSE Analytical Tools/Models: Electric Vehicle Infrastructure Projection Tool (EVI-Pro)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wood, Eric W; Rames, Clement L; Muratori, Matteo
This presentation addresses the fundamental question of how much charging infrastructure is needed in the United States to support PEVs. It complements ongoing EVSE initiatives by providing a comprehensive analysis of national PEV charging infrastructure requirements. The result is a quantitative estimate for a U.S. network of non-residential (public and workplace) EVSE that would be needed to support broader PEV adoption. The analysis provides guidance to public and private stakeholders who are seeking to provide nationwide charging coverage, improve the EVSE business case by maximizing station utilization, and promote effective use of private/public infrastructure investments.
Vester, Diana; Rapp, Erdmann; Gade, Dörte; Genzel, Yvonne; Reichl, Udo
2009-06-01
Over the last years, virus-host cell interactions have been investigated in numerous studies. Viral strategies for evading the innate immune response, inhibiting cellular protein synthesis and permitting viral RNA and protein production have been disclosed. With quantitative proteome technology, comprehensive studies concerning the impact of viruses on the cellular machinery of their host cells at the protein level are possible. Therefore, 2-D DIGE and nanoHPLC-nanoESI-MS/MS analysis were used to qualitatively and quantitatively determine the dynamic cellular proteome responses of two mammalian cell lines to human influenza A virus infection. A cell line used for vaccine production (MDCK) was compared with a human lung carcinoma cell line (A549) as a reference model. By analyzing 2-D gels of the proteomes of uninfected and influenza-infected host cells, 16 quantitatively altered protein spots (at least +/-1.7-fold change in relative abundance, p<0.001) were identified for both cell lines. The most significant changes were found for keratins, major components of the cytoskeleton system, and for Mx proteins, interferon-induced key components of the host cell defense. Time series analysis of infection processes allowed the identification of further proteins that are described to be involved in protein synthesis, signal transduction and apoptosis events. Most likely, these proteins are required for supporting functions during the influenza viral life cycle or the host cell stress response. Quantitative proteome-wide profiling of virus infection can provide insights into the complexity and dynamics of virus-host cell interactions and may accelerate antiviral research and support optimization of vaccine manufacturing processes.
van Rossum, Peter S N; Fried, David V; Zhang, Lifei; Hofstetter, Wayne L; van Vulpen, Marco; Meijer, Gert J; Court, Laurence E; Lin, Steven H
2016-05-01
A reliable prediction of a pathologic complete response (pathCR) to chemoradiotherapy before surgery for esophageal cancer would enable investigators to study the feasibility and outcome of an organ-preserving strategy after chemoradiotherapy. So far no clinical parameters or diagnostic studies are able to accurately predict which patients will achieve a pathCR. The aim of this study was to determine whether subjective and quantitative assessment of baseline and postchemoradiation (18)F-FDG PET can improve the accuracy of predicting pathCR to preoperative chemoradiotherapy in esophageal cancer beyond clinical predictors. This retrospective study was approved by the institutional review board, and the need for written informed consent was waived. Clinical parameters along with subjective and quantitative parameters from baseline and postchemoradiation (18)F-FDG PET were derived from 217 esophageal adenocarcinoma patients who underwent chemoradiotherapy followed by surgery. The associations between these parameters and pathCR were studied in univariable and multivariable logistic regression analysis. Four prediction models were constructed and internally validated using bootstrapping to study the incremental predictive values of subjective assessment of (18)F-FDG PET, conventional quantitative metabolic features, and comprehensive (18)F-FDG PET texture/geometry features, respectively. The clinical benefit of (18)F-FDG PET was determined using decision-curve analysis. A pathCR was found in 59 (27%) patients. A clinical prediction model (corrected c-index, 0.67) was improved by adding (18)F-FDG PET-based subjective assessment of response (corrected c-index, 0.72). This latter model was slightly improved by the addition of 1 conventional quantitative metabolic feature only (i.e., postchemoradiation total lesion glycolysis; corrected c-index, 0.73), and even more by subsequently adding 4 comprehensive (18)F-FDG PET texture/geometry features (corrected c-index, 0.77). However, at a decision threshold of 0.9 or higher, representing a clinically relevant predictive value for pathCR at which one may be willing to omit surgery, there was no clear incremental value. Subjective and quantitative assessment of (18)F-FDG PET provides statistical incremental value for predicting pathCR after preoperative chemoradiotherapy in esophageal cancer. However, the discriminatory improvement beyond clinical predictors does not translate into a clinically relevant benefit that could change decision making. © 2016 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
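A minimal sketch of the kind of internal validation reported above (a logistic-regression prediction model whose c-index is optimism-corrected by bootstrapping); the predictor matrix, outcome prevalence and number of resamples are placeholders, not the study's data:

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(217, 6))              # placeholder clinical + PET predictors
    y = (rng.random(217) < 0.27).astype(int)   # placeholder pathCR indicator (~27%)

    def fit_and_cindex(X_fit, y_fit, X_eval, y_eval):
        """Fit on one set, return the c-index (AUC) on another."""
        m = LogisticRegression(max_iter=1000).fit(X_fit, y_fit)
        return roc_auc_score(y_eval, m.predict_proba(X_eval)[:, 1])

    c_apparent = fit_and_cindex(X, y, X, y)

    # Harrell-style optimism correction: refit on bootstrap samples and measure how
    # much the bootstrap-apparent c-index exceeds performance on the original data.
    optimism = []
    for _ in range(200):
        idx = rng.integers(0, len(y), len(y))
        c_boot = fit_and_cindex(X[idx], y[idx], X[idx], y[idx])
        c_orig = fit_and_cindex(X[idx], y[idx], X, y)
        optimism.append(c_boot - c_orig)

    c_corrected = c_apparent - np.mean(optimism)
    print(round(c_corrected, 3))

For a binary endpoint the c-index coincides with the area under the ROC curve, which is why roc_auc_score is used here.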
ERIC Educational Resources Information Center
Gonzales, Wilkinson Daniel Wong; Torres, Patrisha Lliane
2015-01-01
Several pressing issues in reading instruction have prompted educators world-wide to conduct researches on how to best improve reading comprehension skills. Although there is a wealth of reading comprehension researches done with EFL learners and native English speakers, there seems to be limited published studies conducted in the Philippine ESL…
Effect of Music on Reading Comprehension of Junior High School Students
ERIC Educational Resources Information Center
Anderson, Stacey A.; Fuller, Gerald B.
2010-01-01
This quantitative study was an investigation of the effect of lyrical music on reading comprehension by adolescents. Existing research has produced results that range from concluding such distraction may be detrimental to finding it could be helpful. The reading comprehension subtest of the Gates-MacGinitie Reading Tests, 4th edition (MacGinitie,…
2016-01-01
Although qualitative strategies based on direct injection mass spectrometry (DIMS) have recently emerged as an alternative for the rapid classification of food samples, the potential of these approaches in quantitative tasks has scarcely been addressed to date. In this paper, the applicability of different multivariate regression procedures to data collected by DIMS from simulated mixtures has been evaluated. The most relevant factors affecting quantitation, such as random noise, the number of calibration samples, type of validation, mixture complexity and similarity of mass spectra, were also considered and comprehensively discussed. Based on the conclusions drawn from simulated data, and as an example of application, experimental mass spectral fingerprints collected by direct thermal desorption coupled to mass spectrometry were used for the quantitation of major volatiles in Thymus zygis subsp. zygis chemotypes. The results obtained, validated with the direct thermal desorption coupled to gas chromatography–mass spectrometry method here used as a reference, show the potential of DIMS approaches for the fast and precise quantitative profiling of volatiles in foods. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644978
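The abstract above evaluates multivariate regression on direct-injection mass-spectral fingerprints without naming a specific procedure here; as one hedged example, a partial least squares calibration on placeholder fingerprints could look like this:

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(0)
    X = rng.random((60, 300))     # placeholder fingerprints: samples x m/z bins
    y = rng.random(60) * 100      # placeholder concentration of one major volatile

    pls = PLSRegression(n_components=5)           # number of latent variables to tune
    y_cv = cross_val_predict(pls, X, y, cv=5)     # cross-validated predictions
    rmsecv = float(np.sqrt(np.mean((y - y_cv.ravel()) ** 2)))
    print(rmsecv)

In practice the number of latent variables and the type of cross-validation are among the factors varied, alongside noise level, calibration-set size, mixture complexity and spectral similarity.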
Syntactic processing as a marker for cognitive impairment in amyotrophic lateral sclerosis
Tsermentseli, Stella; Leigh, P. Nigel; Taylor, Lorna J.; Radunovic, Aleksandar; Catani, Marco; Goldstein, Laura H.
2016-01-01
Despite recent interest in cognitive changes in patients with amyotrophic lateral sclerosis (ALS), investigations of language function looking at the level of word, sentence and discourse processing are relatively scarce. Data were obtained from 26 patients with sporadic ALS and 26 healthy controls matched for age, education, gender, anxiety, depression and executive function performance. Standardized language tasks included confrontation naming, semantic access, and syntactic comprehension. Quantitative production analysis (QPA) was used to analyse connected speech samples of the Cookie Theft picture description task. Results showed that the ALS patients were impaired on standardized measures of grammatical comprehension and action/verb semantics. At the level of discourse, ALS patients were impaired on measures of syntactic complexity and fluency; however, the latter could be better explained by disease related factors. Discriminant analysis revealed that syntactic measures differentiated ALS patients from controls. In conclusion, patients with ALS exhibit deficits in receptive and expressive language on tasks of comprehension and connected speech production, respectively. Our findings suggest that syntactic processing deficits seem to be the predominant feature of language impairment in ALS and that these deficits can be detected by relatively simple language tests. PMID:26312952
Syntactic processing as a marker for cognitive impairment in amyotrophic lateral sclerosis.
Tsermentseli, Stella; Leigh, P Nigel; Taylor, Lorna J; Radunovic, Aleksandar; Catani, Marco; Goldstein, Laura H
2015-01-01
Despite recent interest in cognitive changes in patients with amyotrophic lateral sclerosis (ALS), investigations of language function looking at the level of word, sentence and discourse processing are relatively scarce. Data were obtained from 26 patients with sporadic ALS and 26 healthy controls matched for age, education, gender, anxiety, depression and executive function performance. Standardized language tasks included confrontation naming, semantic access, and syntactic comprehension. Quantitative production analysis (QPA) was used to analyse connected speech samples of the Cookie Theft picture description task. Results showed that the ALS patients were impaired on standardized measures of grammatical comprehension and action/verb semantics. At the level of discourse, ALS patients were impaired on measures of syntactic complexity and fluency; however, the latter could be better explained by disease related factors. Discriminant analysis revealed that syntactic measures differentiated ALS patients from controls. In conclusion, patients with ALS exhibit deficits in receptive and expressive language on tasks of comprehension and connected speech production, respectively. Our findings suggest that syntactic processing deficits seem to be the predominant feature of language impairment in ALS and that these deficits can be detected by relatively simple language tests.
Ghasemi Damavandi, Hamidreza; Sen Gupta, Ananya; Nelson, Robert K; Reddy, Christopher M
2016-01-01
Comprehensive two-dimensional gas chromatography (GC×GC) provides high-resolution separations across hundreds of compounds in a complex mixture, thus unlocking unprecedented information for intricate quantitative interpretation. We exploit this compound diversity across the GC×GC topography to provide quantitative compound-cognizant interpretation beyond target compound analysis with petroleum forensics as a practical application. We focus on the GC×GC topography of biomarker hydrocarbons, hopanes and steranes, as they are generally recalcitrant to weathering. We introduce peak topography maps (PTM) and topography partitioning techniques that consider a notably broader and more diverse range of target and non-target biomarker compounds compared to traditional approaches that consider approximately 20 biomarker ratios. Specifically, we consider a range of 33-154 target and non-target biomarkers with highest-to-lowest peak ratio within an injection ranging from 4.86 to 19.6 (precise numbers depend on biomarker diversity of individual injections). We also provide a robust quantitative measure for directly determining "match" between samples, without necessitating training data sets. We validate our methods across 34 GC×GC injections from a diverse portfolio of petroleum sources, and provide quantitative comparison of performance against established statistical methods such as principal components analysis (PCA). Our data set includes a wide range of samples collected following the 2010 Deepwater Horizon disaster that released approximately 160 million gallons of crude oil from the Macondo well (MW). Samples that were clearly collected following this disaster exhibit statistically significant match [Formula: see text] using PTM-based interpretation against other closely related sources. PTM-based interpretation also provides higher differentiation between closely correlated but distinct sources than obtained using PCA-based statistical comparisons. In addition to results based on this experimental field data, we also provide extensive perturbation analysis of the PTM method over numerical simulations that introduce random variability of peak locations over the GC×GC biomarker ROI image of the MW pre-spill sample (sample [Formula: see text] in Additional file 4: Table S1). We compare the robustness of the cross-PTM score against peak location variability in both dimensions and compare the results against PCA analysis over the same set of simulated images. Detailed description of the simulation experiment and discussion of results are provided in Additional file 1: Section S8. We provide a peak-cognizant informational framework for quantitative interpretation of GC×GC topography. Proposed topographic analysis enables GC×GC forensic interpretation across target petroleum biomarkers, while including the nuances of lesser-known non-target biomarkers clustered around the target peaks. This allows potential discovery of hitherto unknown connections between target and non-target biomarkers.
NASA Astrophysics Data System (ADS)
Escuder-Bueno, I.; Castillo-Rodríguez, J. T.; Zechner, S.; Jöbstl, C.; Perales-Momparler, S.; Petaccia, G.
2012-09-01
Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces have been developed in the period 2009-2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis to estimate current risk from a social perspective and identify tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of this developed approach. The main advantage of applying the methodology herein presented consists in providing a quantitative estimation of flooding risk before and after investing in non-structural risk mitigation measures. It can be of great interest for decision makers as it provides rational and solid information.
Chen, Jin-Qiu; Wakefield, Lalage M; Goldstein, David J
2015-06-06
There is an emerging demand for the use of molecular profiling to facilitate biomarker identification and development, and to stratify patients for more efficient treatment decisions with reduced adverse effects. In the past decade, great strides have been made to advance genomic, transcriptomic and proteomic approaches to address these demands. While there has been much progress with these large scale approaches, profiling at the protein level still faces challenges due to limitations in clinical sample size, poor reproducibility, unreliable quantitation, and lack of assay robustness. A novel automated capillary nano-immunoassay (CNIA) technology has been developed. This technology offers precise and accurate measurement of proteins and their post-translational modifications using either charge-based or size-based separation formats. The system not only uses ultralow nanogram levels of protein but also allows multi-analyte analysis using a parallel single-analyte format for increased sensitivity and specificity. The high sensitivity and excellent reproducibility of this technology make it particularly powerful for analysis of clinical samples. Furthermore, the system can distinguish and detect specific protein post-translational modifications that conventional Western blot and other immunoassays cannot easily capture. This review will summarize and evaluate the latest progress to optimize the CNIA system for comprehensive, quantitative protein and signaling event characterization. It will also discuss how the technology has been successfully applied in both discovery research and clinical studies, for signaling pathway dissection, proteomic biomarker assessment, targeted treatment evaluation and quantitative proteomic analysis. Lastly, a comparison of this novel system with other conventional immuno-assay platforms is performed.
ERIC Educational Resources Information Center
Reutzel, D. Ray; Smith, John A.; Fawson, Parker C.
2005-01-01
There are few research studies on the effects of teaching comprehension strategies to young children in the primary grades. Using a Dominant-Less Dominant Mixed Model design employing both qualitative and quantitative data collection, we evaluated two approaches for teaching comprehension strategies to 7- and 8-year-old children in four…
Machine Learning Meta-analysis of Large Metagenomic Datasets: Tools and Biological Insights.
Pasolli, Edoardo; Truong, Duy Tin; Malik, Faizan; Waldron, Levi; Segata, Nicola
2016-07-01
Shotgun metagenomic analysis of the human associated microbiome provides a rich set of microbial features for prediction and biomarker discovery in the context of human diseases and health conditions. However, the use of such high-resolution microbial features presents new challenges, and validated computational tools for learning tasks are lacking. Moreover, classification rules have scarcely been validated in independent studies, posing questions about the generality and generalization of disease-predictive models across cohorts. In this paper, we comprehensively assess approaches to metagenomics-based prediction tasks and for quantitative assessment of the strength of potential microbiome-phenotype associations. We develop a computational framework for prediction tasks using quantitative microbiome profiles, including species-level relative abundances and presence of strain-specific markers. A comprehensive meta-analysis, with particular emphasis on generalization across cohorts, was performed in a collection of 2424 publicly available metagenomic samples from eight large-scale studies. Cross-validation revealed good disease-prediction capabilities, which were in general improved by feature selection and use of strain-specific markers instead of species-level taxonomic abundance. In cross-study analysis, models transferred between studies were in some cases less accurate than models tested by within-study cross-validation. Interestingly, the addition of healthy (control) samples from other studies to training sets improved disease prediction capabilities. Some microbial species (most notably Streptococcus anginosus) seem to characterize general dysbiotic states of the microbiome rather than connections with a specific disease. Our results in modelling features of the "healthy" microbiome can be considered a first step toward defining general microbial dysbiosis. The software framework, microbiome profiles, and metadata for thousands of samples are publicly available at http://segatalab.cibio.unitn.it/tools/metaml.
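A minimal sketch of the cross-study (leave-one-dataset-out) evaluation idea discussed above, using scikit-learn; the species-abundance matrix, disease labels and the assignment of samples to eight cohorts are placeholders:

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

    rng = np.random.default_rng(0)
    X = rng.dirichlet(np.ones(200), size=600)   # placeholder species-level relative abundances
    y = rng.integers(0, 2, size=600)            # placeholder disease vs. control labels
    study = rng.integers(0, 8, size=600)        # placeholder cohort id for each sample

    clf = RandomForestClassifier(n_estimators=500, random_state=0)
    logo = LeaveOneGroupOut()                   # train on seven cohorts, test on the held-out one
    auc = cross_val_score(clf, X, y, groups=study, cv=logo, scoring="roc_auc")
    print(auc.mean())

Comparing these leave-one-dataset-out scores with ordinary within-study cross-validation scores is one way to quantify the drop in accuracy when models are transferred between cohorts, as reported above.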
Sanchis, Yovana; Coscollà, Clara; Roca, Marta; Yusà, Vicent
2015-06-01
An analytical strategy including both the quantitative target analysis of 8 regulated primary aromatic amines (PAAs), as well as a comprehensive post-run target screening of 77 migrating substances, was developed for nylon utensils, using liquid chromatography-orbitrap-high resolution mass spectrometry (LC-HRMS) operating in full scan mode. The accurate mass data were acquired with a resolving power of 50,000 FWHM (scan speed, 2 Hz), and by alternating two acquisition events, ESI+ with and without fragmentation. The target method was validated after statistical optimization of the main ionization and fragmentation parameters. The quantitative method presented appropriate performance to be used in official monitoring with recoveries ranging from 78% to 112%, precision in terms of Relative Standard Deviation (RSD) was less than 15%, and the limits of quantification were between 2 and 2.5 µg kg(-1). For post-target screening, a customized theoretical database was built for food contact material migrants, including bisphenols, phthalates, and other amines. For identification purposes, accurate exact mass (<5 ppm) and some diagnostic ions including fragments were used. The strategy was applied to 10 real samples collected from different retailers in the Valencian Region (Spain) during 2014. Six out of eight target PAAs were detected in at least one sample in the target analysis. The most frequently detected compounds were 4,4'-methylenedianiline and aniline, with concentrations ranging from 2.4 to 19,715 µg kg(-1) and 2.5 to 283 µg kg(-1), respectively. Two phthalates were identified and confirmed in the post-run target screening analysis. Copyright © 2015 Elsevier B.V. All rights reserved.
Ranacher, Peter; Tzavella, Katerina
2014-05-27
In geographic information science, a plethora of different approaches and methods is used to assess the similarity of movement. Some of these approaches term two moving objects similar if they share akin paths. Others require objects to move at similar speed and yet others consider movement similar if it occurs at the same time. We believe that a structured and comprehensive classification of movement comparison measures is missing. We argue that such a classification not only depicts the status quo of qualitative and quantitative movement analysis, but also allows for identifying those aspects of movement for which similarity measures are scarce or entirely missing. In this review paper we, first, decompose movement into its spatial, temporal, and spatiotemporal movement parameters. A movement parameter is a physical quantity of movement, such as speed, spatial path, or temporal duration. For each of these parameters we then review qualitative and quantitative methods of how to compare movement. Thus, we provide a systematic and comprehensive classification of different movement similarity measures used in geographic information science. This classification is a valuable first step toward a GIS toolbox comprising all relevant movement comparison methods.
Wang, Ting-Ting; Li, Jin-Mei; Zhou, Dong
2016-01-01
With great interest, we read the paper "Polymorphisms in IL-4/IL-13 pathway genes and glioma risk: an updated meta-analysis" (by Chen PQ et al.) [1], which has reached important conclusions about the relationship between polymorphisms in interleukin (IL)-4/IL-13 pathway genes and glioma risk. Through quantitative analysis, the meta-analysis found no association between IL-4/IL-13 pathway genetic polymorphisms and glioma risk (Chen et al. in Tumor Biol 36:121-127, 2015). The meta-analysis is the most comprehensive study of polymorphisms in the IL-4/IL-13 pathway and glioma risk. Nevertheless, some deficiencies still exist in this meta-analysis that we would like to raise.
Zhang, Ying; Tobias, Herbert J.; Brenna, J. Thomas
2014-01-01
Comprehensive two dimensional gas chromatography (GC×GC) provides greater separation space than conventional GC. Because of fast peak elution, a time of flight mass spectrometer (TOFMS) is the usual structure-specific detector of choice. The quantitative capabilities of a novel GC×GC fast quadrupole MS were investigated with electron ionization (EI), and CH4 or NH3 positive chemical ionization (PCI) for analysis of endogenous urinary steroids targeted in anti-doping tests. Average precisions for steroid quantitative analysis from replicate urine extractions were 6% (RSD) for EI and 8% for PCI-NH3. The average limits of detection (LOD) calculated by quantification ions for 12 target steroids spiked into steroid-free urine matrix (SFUM) were 2.6 ng mL−1 for EI, 1.3 ng mL−1 for PCI-CH4, and 0.3 ng mL−1 for PCI-NH3, all in mass scanning mode. The measured limits of quantification (LOQ) with full mass scan GC×GC-qMS were comparable with the LOQ values measured by one-dimensional GC-MS in single ion monitoring (SIM) mode. PCI-NH3 yields fewer fragments and greater (pseudo)molecular ion abundances than EI or PCI-CH4. These data show a benchtop GC×GC-qMS system has the sensitivity, specificity, and resolution to analyze urinary steroids at normal urine concentrations, and that PCI-NH3, not currently available on most GC×GC-TOFMS instruments, is of particular value for generation of structure-specific ions. PMID:22606686
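As a hedged illustration of one common way to estimate detection and quantification limits from calibration data (the 3.3·σ/slope and 10·σ/slope convention), which may differ from how the spiked-matrix LOD/LOQ values above were obtained, a short sketch with invented calibration points:

```python
# Sketch of a calibration-based LOD/LOQ estimate (3.3*sigma/slope and
# 10*sigma/slope). This illustrates the concept only; the paper reports
# measured values from spiked steroid-free urine matrix, which need not be
# derived this way.
import numpy as np

conc = np.array([1, 2, 5, 10, 20, 50], dtype=float)            # ng/mL, illustrative
signal = np.array([105, 198, 510, 1010, 1985, 5020], dtype=float)  # peak areas, illustrative

slope, intercept = np.polyfit(conc, signal, 1)
residuals = signal - (slope * conc + intercept)
sigma = residuals.std(ddof=2)                                   # SD of regression residuals

print(f"LOD ~ {3.3 * sigma / slope:.2f} ng/mL, LOQ ~ {10 * sigma / slope:.2f} ng/mL")
```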
Mast, Fred D; Ratushny, Alexander V; Aitchison, John D
2014-09-15
Systems cell biology melds high-throughput experimentation with quantitative analysis and modeling to understand many critical processes that contribute to cellular organization and dynamics. Recently, there have been several advances in technology and in the application of modeling approaches that enable the exploration of the dynamic properties of cells. Merging technology and computation offers an opportunity to objectively address unsolved cellular mechanisms, and has revealed emergent properties and helped to gain a more comprehensive and fundamental understanding of cell biology. © 2014 Mast et al.
The detection of large deletions or duplications in genomic DNA.
Armour, J A L; Barton, D E; Cockburn, D J; Taylor, G R
2002-11-01
While methods for the detection of point mutations and small insertions or deletions in genomic DNA are well established, the detection of larger (>100 bp) genomic duplications or deletions can be more difficult. Most mutation scanning methods use PCR as a first step, but the subsequent analyses are usually qualitative rather than quantitative. Gene dosage methods based on PCR need to be quantitative (i.e., they should report molar quantities of starting material) or semi-quantitative (i.e., they should report gene dosage relative to an internal standard). Without some sort of quantitation, heterozygous deletions and duplications may be overlooked and therefore be under-ascertained. Gene dosage methods provide the additional benefit of reporting allele drop-out in the PCR. This could impact on SNP surveys, where large-scale genotyping may miss null alleles. Here we review recent developments in techniques for the detection of this type of mutation and compare their relative strengths and weaknesses. We emphasize that comprehensive mutation analysis should include scanning for large insertions and deletions and duplications. Copyright 2002 Wiley-Liss, Inc.
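A minimal sketch of the semi-quantitative dosage idea described above, assuming target and reference signals from a test sample and a normal control; the numbers are illustrative only.

```python
# Illustrative dosage-quotient calculation for a semi-quantitative PCR assay:
# the target/reference signal ratio in a test sample is normalised to the same
# ratio in a control. Values near 0.5 suggest a heterozygous deletion, near 1.0
# a normal two-copy state, and near 1.5 a heterozygous duplication.
def dosage_quotient(target_test, reference_test, target_control, reference_control):
    return (target_test / reference_test) / (target_control / reference_control)

dq = dosage_quotient(target_test=480, reference_test=1000,
                     target_control=950, reference_control=1000)
print(f"dosage quotient = {dq:.2f}")   # ~0.5 -> consistent with a heterozygous deletion
```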
2017-01-01
Chemical standardization, along with morphological and DNA analysis ensures the authenticity and advances the integrity evaluation of botanical preparations. Achievement of a more comprehensive, metabolomic standardization requires simultaneous quantitation of multiple marker compounds. Employing quantitative 1H NMR (qHNMR), this study determined the total isoflavone content (TIfCo; 34.5–36.5% w/w) via multimarker standardization and assessed the stability of a 10-year-old isoflavone-enriched red clover extract (RCE). Eleven markers (nine isoflavones, two flavonols) were targeted simultaneously, and outcomes were compared with LC-based standardization. Two advanced quantitative measures in qHNMR were applied to derive quantities from complex and/or overlapping resonances: a quantum mechanical (QM) method (QM-qHNMR) that employs 1H iterative full spin analysis, and a non-QM method that uses linear peak fitting algorithms (PF-qHNMR). A 10 min UHPLC-UV method provided auxiliary orthogonal quantitation. This is the first systematic evaluation of QM and non-QM deconvolution as qHNMR quantitation measures. It demonstrates that QM-qHNMR can account successfully for the complexity of 1H NMR spectra of individual analytes and how QM-qHNMR can be built for mixtures such as botanical extracts. The contents of the main bioactive markers were in good agreement with earlier HPLC-UV results, demonstrating the chemical stability of the RCE. QM-qHNMR advances chemical standardization by its inherent QM accuracy and the use of universal calibrants, avoiding the impractical need for identical reference materials. PMID:28067513
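The basic internal-calibrant qHNMR relation that underlies such quantitation can be sketched as follows; this is a simplified illustration with invented integrals, masses and weights, not the quantum mechanical or peak-fitting deconvolution used in the study.

```python
# Basic internal-calibrant qHNMR relation (a simplified sketch; the study's
# QM-qHNMR and PF-qHNMR approaches add deconvolution on top of this idea).
# All numeric values below are placeholders.
def qhnmr_content(I_analyte, N_analyte, M_analyte,
                  I_cal, N_cal, M_cal, m_cal, m_sample, purity_cal=1.0):
    """Analyte content (% w/w) from integrals relative to an internal calibrant."""
    return (I_analyte / I_cal) * (N_cal / N_analyte) * (M_analyte / M_cal) \
           * (m_cal / m_sample) * purity_cal * 100.0

content = qhnmr_content(I_analyte=0.12, N_analyte=1, M_analyte=284.26,  # e.g. one isoflavone H
                        I_cal=1.00, N_cal=2, M_cal=172.18,              # internal calibrant (placeholder)
                        m_cal=2.0, m_sample=20.0, purity_cal=0.999)
print(f"~{content:.1f}% w/w")   # ~4% w/w for this single placeholder marker
```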
Chan, Wei Ping; Wang, Jing-Yuan
2016-08-01
Recently, sludge has attracted great interest as a potential feedstock for thermochemical conversion processes. However, the composition and thermal degradation behaviour of sludge are highly complex and distinctive compared with those of traditional feedstocks, creating a need for fundamental research on sludge. Comprehensive characterisation of sludge specifically for thermochemical conversion was carried out for all existing Water Reclamation Plants in Singapore. In total, 14 sludge samples were collected based on type, plant, and batch categorisation. Existing characterisation methods for physical and chemical properties were analysed and reviewed using the collected samples. Qualitative similarities and quantitative variations among the sludge samples were identified and discussed. Oxidation of inorganics in sludge during ash-forming analysis was found to cause significant deviations in proximate and ultimate analysis. Therefore, alternative parameters and comparison bases, including Fixed Residues (FR), Inorganic Matters (IM) and Total Inorganics (TI), were proposed for a better understanding of the thermochemical characteristics of sludge. Copyright © 2016 Elsevier Ltd. All rights reserved.
Zhang, Qibin; Monroe, Matthew E.; Schepmoes, Athena A.; Clauss, Therese R. W.; Gritsenko, Marina A.; Meng, Da; Petyuk, Vladislav A.; Smith, Richard D.; Metz, Thomas O.
2011-01-01
Non-enzymatic glycation of proteins sets the stage for formation of advanced glycation end-products and development of chronic complications of diabetes. In this report, we extended our previous methods on proteomics analysis of glycated proteins to comprehensively identify glycated proteins in control and diabetic human plasma and erythrocytes. Using immunodepletion, enrichment, and fractionation strategies, we identified 7749 unique glycated peptides, corresponding to 3742 unique glycated proteins. Semi-quantitative comparisons showed that glycation levels of a number of proteins were significantly increased in diabetes and that erythrocyte proteins were more extensively glycated than plasma proteins. A glycation motif analysis revealed that some amino acids were favored more than others in the protein primary structures in the vicinity of the glycation sites in both sample types. The glycated peptides and corresponding proteins reported here provide a foundation for potential identification of novel markers for diabetes, hyperglycemia, and diabetic complications in future studies. PMID:21612289
Clinical Pedodontics: An Approach Based on Comprehensive Care.
ERIC Educational Resources Information Center
Bennett, Carroll G.; And Others
1981-01-01
The University of Florida uses a comprehensive care system to teach clinical pedodontics. Several block clinics permit further experience with children. Details of the program are described, and quantitative results of patient treatment are compared with those of other clinical pedodontics programs. (MSE)
Ionas, Alin C; Ballesteros Gómez, Ana; Uchida, Natsuyo; Suzuki, Go; Kajiwara, Natsuko; Takata, Kyoko; Takigami, Hidetaka; Leonards, Pim E G; Covaci, Adrian
2015-10-01
The presence and levels of flame retardants (FRs), such as polybrominated diphenyl ethers (PBDEs) and organophosphate flame retardants (PFRs), were determined in textile home furnishings, such as carpets and curtains, from stores in Belgium. A comprehensive characterisation of FRs in textile was done by ambient high resolution mass spectrometry (qualitative screening), gas chromatography-mass spectrometry (GC-MS) (quantitation), and environmental forensic microscopy (surface distribution). Ambient ionisation coupled to a time-of-flight (TOF) high resolution mass spectrometer (direct probe-TOF-MS) was investigated for the rapid screening of FRs. Direct probe-TOF-MS proved useful as a first screening step to detect FRs in textiles below the levels required to impart flame retardancy, thereby reducing the number of samples requiring further quantitative analysis. Samples were analysed by GC-MS to confirm the results obtained by ambient mass spectrometry and to obtain quantitative information. The levels of PBDEs and PFRs were typically too low to impart flame retardancy. Only high levels of BDE-209 (11-18% by weight) were discovered and investigated in localised hotspots by employing forensic microscopy techniques. Most of the samples were made of polymeric materials known to be inherently flame retarded to some extent, so it is likely that other alternative and halogen-free FR treatments/solutions are preferred for the textiles on the Belgian market. Copyright © 2015 Elsevier Inc. All rights reserved.
Liu, Wanting; Xiang, Lunping; Zheng, Tingkai; Jin, Jingjie; Zhang, Gong
2018-01-04
Translation is a key regulatory step linking the transcriptome and the proteome. Two major methods of translatome investigation are RNC-seq (sequencing of translating mRNA) and Ribo-seq (ribosome profiling). To facilitate the investigation of translation, we built a comprehensive database, TranslatomeDB (http://www.translatomedb.net/), which provides collection and integrated analysis of published and user-generated translatome sequencing data. The current version includes 2453 Ribo-seq, 10 RNC-seq and their 1394 corresponding mRNA-seq datasets in 13 species. The database emphasizes analysis functions in addition to the dataset collections. Differential gene expression (DGE) analysis can be performed between any two datasets of the same species and type, on both the transcriptome and translatome levels. The translation indices (translation ratio, elongation velocity index and translational efficiency) can be calculated to quantitatively evaluate translation initiation efficiency and elongation velocity. All datasets were analyzed using a unified, robust, accurate and experimentally verifiable pipeline based on the FANSe3 mapping algorithm and edgeR for DGE analysis. TranslatomeDB also allows users to upload their own datasets and utilize the identical unified pipeline to analyze their data. We believe that TranslatomeDB is a comprehensive platform and knowledgebase for translatome and proteome research, freeing biologists from the complexity of searching, analyzing and comparing huge sequencing datasets and from the need for local computational power. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
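As a conceptual sketch only (TranslatomeDB's exact formulas may differ), translational efficiency is often taken as the ratio of translatome (Ribo-seq or RNC-seq) abundance to mRNA-seq abundance for a gene, for example:

```python
# A commonly used definition of translational efficiency (TE): translatome
# abundance divided by mRNA abundance, both expressed as RPKM. The database's
# exact formulas may differ; this is only a conceptual sketch with placeholders.
def rpkm(counts, gene_length_bp, total_mapped_reads):
    return counts * 1e9 / (gene_length_bp * total_mapped_reads)

ribo_rpkm = rpkm(counts=1200, gene_length_bp=2400, total_mapped_reads=25_000_000)
mrna_rpkm = rpkm(counts=3000, gene_length_bp=2400, total_mapped_reads=30_000_000)
te = ribo_rpkm / mrna_rpkm
print(f"TE = {te:.2f}")   # >1 suggests relatively efficient initiation, <1 the opposite
```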
Impact of specific language impairment and type of school on different language subsystems.
Puglisi, Marina Leite; Befi-Lopes, Debora Maria
2016-01-01
This study aimed to explore quantitative and qualitative effects of type of school and specific language impairment (SLI) on different language abilities. A total of 204 Brazilian children aged 4 to 6 years participated in the study. Children were selected to form three groups: 1) 63 typically developing children studying in private schools (TDPri); 2) 102 typically developing children studying in state schools (TDSta); and 3) 39 children with SLI studying in state schools (SLISta). All individuals were assessed regarding expressive vocabulary, number morphology and morphosyntactic comprehension. All language subsystems were vulnerable to both environmental (type of school) and biological (SLI) effects. The relationship between the three language measures was exactly the same in all groups: vocabulary growth correlated with age and with the development of morphological abilities and morphosyntactic comprehension. Children with SLI showed atypical errors in the comprehension test at the age of 4, but presented a pattern of errors that gradually resembled typical development. The effect of type of school was marked by quantitative differences, while the effect of SLI was characterised by both quantitative and qualitative differences.
Olokundun, Maxwell; Iyiola, Oluwole; Ibidunni, Stephen; Ogbari, Mercy; Falola, Hezekiah; Salau, Odunayo; Peter, Fred; Borishade, Taiye
2018-06-01
The article presents data on the effectiveness of entrepreneurship curriculum contents in developing university students' entrepreneurial interest and knowledge. The study focused on the perceptions of Nigerian university students, with emphasis on the first four universities in Nigeria to offer a degree programme in entrepreneurship. The study adopted a quantitative approach with a descriptive research design to establish trends related to the objective of the study; a survey was used as the quantitative research method. The population of the study included all students in the selected universities. Data were analyzed using the Statistical Package for the Social Sciences (SPSS), with the mean score as the statistical tool of analysis. The field dataset is made widely accessible to enable critical or more comprehensive investigation.
Pahn, Gregor; Skornitzke, Stephan; Schlemmer, Hans-Peter; Kauczor, Hans-Ulrich; Stiller, Wolfram
2016-01-01
Based on the guidelines from "Report 87: Radiation Dose and Image-quality Assessment in Computed Tomography" of the International Commission on Radiation Units and Measurements (ICRU), a software framework for automated quantitative image quality analysis was developed and its usability for a variety of scientific questions demonstrated. The extendable framework currently implements the calculation of the recommended Fourier image quality (IQ) metrics modulation transfer function (MTF) and noise-power spectrum (NPS), and additional IQ quantities such as noise magnitude, CT number accuracy, uniformity across the field-of-view, contrast-to-noise ratio (CNR) and signal-to-noise ratio (SNR) of simulated lesions for a commercially available cone-beam phantom. Sample image data were acquired with different scan and reconstruction settings on CT systems from different manufacturers. Spatial resolution is analyzed in terms of edge-spread function, line-spread-function, and MTF. 3D NPS is calculated according to ICRU Report 87, and condensed to 2D and radially averaged 1D representations. Noise magnitude, CT numbers, and uniformity of these quantities are assessed on large samples of ROIs. Low-contrast resolution (CNR, SNR) is quantitatively evaluated as a function of lesion contrast and diameter. Simultaneous automated processing of several image datasets allows for straightforward comparative assessment. The presented framework enables systematic, reproducible, automated and time-efficient quantitative IQ analysis. Consistent application of the ICRU guidelines facilitates standardization of quantitative assessment not only for routine quality assurance, but for a number of research questions, e.g. the comparison of different scanner models or acquisition protocols, and the evaluation of new technology or reconstruction methods. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
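A hedged sketch of one of the Fourier metrics mentioned above: an MTF estimate obtained by differentiating an oversampled edge-spread function (ESF) into a line-spread function (LSF) and taking its normalised Fourier transform. The synthetic edge profile below stands in for measured phantom data, and this is only an illustration of the concept, not the framework's implementation.

```python
# MTF from an edge-spread function: ESF -> LSF (derivative) -> |FFT|, normalised
# so that MTF(0) = 1. The sigmoid ESF is a synthetic stand-in for phantom data.
import numpy as np

dx = 0.1                                   # sample spacing in mm (placeholder)
x = np.arange(-10, 10, dx)
esf = 1.0 / (1.0 + np.exp(-x / 0.4))       # synthetic smoothed edge profile

lsf = np.gradient(esf, dx)                 # LSF = derivative of the ESF
lsf /= lsf.sum()                           # normalise area so MTF(0) = 1

mtf = np.abs(np.fft.rfft(lsf))
freq = np.fft.rfftfreq(lsf.size, d=dx)     # spatial frequency in cycles/mm

mtf50 = freq[np.argmax(mtf < 0.5)]         # frequency where the MTF first drops below 50%
print(f"MTF50 ~ {mtf50:.2f} cycles/mm")
```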
Dynamic safety assessment of natural gas stations using Bayesian network.
Zarei, Esmaeil; Azadeh, Ali; Khakzad, Nima; Aliabadi, Mostafa Mirzaei; Mohammadfam, Iraj
2017-01-05
Pipelines are one of the most popular and effective ways of transporting hazardous materials, especially natural gas. However, the rapid development of gas pipelines and stations in urban areas has introduced a serious threat to public safety and assets. Although different methods have been developed for risk analysis of gas transportation systems, a comprehensive methodology for risk analysis is still lacking, especially for natural gas stations. The present work is aimed at developing a dynamic and comprehensive quantitative risk analysis (DCQRA) approach for accident scenario and risk modeling of natural gas stations. In this approach, an FMEA is used for hazard analysis, while a Bow-tie diagram and Bayesian network are employed to model the worst-case accident scenario and to assess the risks. The results indicated that failure of the regulator system was the worst-case accident scenario, with human error as the largest contributing factor. Thus, in the risk management plan of natural gas stations, priority should be given to the most probable root events and main contributing factors identified in the present study, in order to reduce the occurrence probability of the accident scenarios and thus alleviate the risks. Copyright © 2016 Elsevier B.V. All rights reserved.
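A toy illustration (not the authors' Bow-tie/Bayesian-network model) of how root causes of a top event can be ranked with Bayes' rule; all priors and likelihoods below are invented placeholders.

```python
# Toy illustration: rank root causes of a top event (regulator system failure)
# by their posterior probability given that the failure occurred. All numbers
# are made-up placeholders, not values from the paper.
p_root = {"human error": 0.020, "mechanical fault": 0.010, "corrosion": 0.005}    # prior P(cause)
p_fail_given = {"human error": 0.30, "mechanical fault": 0.25, "corrosion": 0.10}  # P(failure | cause)

# law of total probability (causes treated as the dominant, near-exclusive paths)
p_fail = sum(p_root[c] * p_fail_given[c] for c in p_root)

# posterior P(cause | failure), used to prioritise risk-reduction measures
posterior = {c: p_root[c] * p_fail_given[c] / p_fail for c in p_root}
for cause, p in sorted(posterior.items(), key=lambda kv: -kv[1]):
    print(f"{cause}: {p:.2f}")
```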
Characterization of Low-Molecular-Weight Heparins by Strong Anion-Exchange Chromatography.
Sadowski, Radosław; Gadzała-Kopciuch, Renata; Kowalkowski, Tomasz; Widomski, Paweł; Jujeczka, Ludwik; Buszewski, Bogusław
2017-11-01
Currently, detailed structural characterization of low-molecular-weight heparin (LMWH) products is an analytical subject of great interest. In this work, we carried out a comprehensive structural analysis of LMWHs and applied a modified pharmacopeial method, as well as methods developed by other researchers, to the analysis of novel biosimilar LMWH products; and, for the first time, compared the qualitative and quantitative composition of commercially available drugs (enoxaparin, nadroparin, and dalteparin). For this purpose, we used strong anion-exchange (SAX) chromatography with spectrophotometric detection because this method is more helpful, easier, and faster than other separation techniques for the detailed disaccharide analysis of new LMWH drugs. In addition, we subjected the obtained results to statistical analysis (factor analysis, t-test, and Newman-Keuls post hoc test).
Investigation of variety resources and quantitative analysis on Heyin pomegranate in Xingyang City
NASA Astrophysics Data System (ADS)
Li, Wenzeng; Wang, Zhihong
2018-04-01
Considering the various factors involved in variety breeding of Heyin pomegranate, a hierarchical analysis was carried out using the analytic hierarchy process (AHP); its results can help fruit farmers make scientific decisions on pomegranate variety breeding. Among the six main Heyin pomegranate varieties, the Tunisian soft-seeded pomegranate has the highest ranking weight (0.3105), placing it first among all varieties and clearly ahead of the others in overall characteristics. This indicates that, for pomegranate cultivation in Xingyang, the Tunisian soft-seeded pomegranate is the preferred variety for fruit farmers.
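A minimal AHP sketch, assuming an invented 3x3 pairwise comparison matrix rather than the study's actual judgement matrices: priority weights are taken from the principal eigenvector and checked with Saaty's consistency ratio.

```python
# Minimal AHP sketch (not the authors' matrices): derive criterion weights from
# a pairwise comparison matrix via the principal eigenvector and check the
# consistency ratio. The 3x3 judgement matrix below is invented.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])                 # pairwise judgements for 3 criteria

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                        # priority weights, sum to 1

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)            # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}[n]    # Saaty's random index
print("weights:", np.round(weights, 3), "CR =", round(ci / ri, 3))
```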
Classifying oxidative stress by F2-isoprostane levels across human diseases: A meta-analysis.
van 't Erve, Thomas J; Kadiiska, Maria B; London, Stephanie J; Mason, Ronald P
2017-08-01
The notion that oxidative stress plays a role in virtually every human disease and environmental exposure has become ingrained in everyday knowledge. However, mounting evidence regarding the lack of specificity of biomarkers traditionally used as indicators of oxidative stress in human disease and exposures now necessitates re-evaluation. To prioritize these re-evaluations, published literature was comprehensively analyzed in a meta-analysis to quantitatively classify the levels of systemic oxidative damage across human disease and in response to environmental exposures. In this meta-analysis, the F2-isoprostane 8-iso-PGF2α was specifically chosen as the representative marker of oxidative damage. To combine published values across measurement methods and specimens, the standardized mean differences (Hedges' g) in 8-iso-PGF2α levels between affected and control populations were calculated. The meta-analysis resulted in a classification of oxidative damage levels as measured by 8-iso-PGF2α across 50 human health outcomes and exposures from 242 distinct publications. Relatively small increases in 8-iso-PGF2α levels (g<0.8) were found in the following conditions: hypertension (g=0.4), metabolic syndrome (g=0.5), asthma (g=0.4), and tobacco smoking (g=0.7). In contrast, large increases in 8-iso-PGF2α levels were observed in pathologies of the kidney, e.g., chronic renal insufficiency (g=1.9), obstructive sleep apnoea (g=1.1), and pre-eclampsia (g=1.1), as well as respiratory tract disorders, e.g., cystic fibrosis (g=2.3). In conclusion, we have established a quantitative classification for the level of 8-iso-PGF2α generation in different human pathologies and exposures based on a comprehensive meta-analysis of published data. This analysis provides knowledge on the true involvement of oxidative damage across human health outcomes as well as utilizes past research to prioritize those conditions requiring further scrutiny on the mechanisms of biomarker generation. Copyright © 2017. Published by Elsevier B.V.
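The effect-size calculation underlying such a meta-analysis, Hedges' g with the small-sample correction, can be sketched as follows; the group means, standard deviations and sizes are invented for illustration.

```python
# Sketch of the standardized mean difference (Hedges' g) between an affected
# and a control group, with the small-sample correction J. Numbers are invented.
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    s_pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / s_pooled
    j = 1.0 - 3.0 / (4.0 * (n1 + n2 - 2) - 1.0)   # small-sample bias correction
    return j * d

# e.g. disease group vs. controls (arbitrary units)
print(f"g = {hedges_g(m1=310, sd1=120, n1=40, m2=220, sd2=100, n2=45):.2f}")
```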
Readability Approaches: Implications for Turkey
ERIC Educational Resources Information Center
Ulusoy, Mustafa
2006-01-01
Finding the right fit between students' reading ability and textbooks is very important for comprehension. Readability studies aim to analyse texts to find the right fit between students and texts. In this literature review, readability studies are classified under quantitative, qualitative and combined quantitative-qualitative readability…
NASA Astrophysics Data System (ADS)
Chen, Shichao; Zhu, Yizheng
2017-02-01
Sensitivity is a critical index measuring the temporal fluctuation of the retrieved optical pathlength in a quantitative phase imaging system. However, an accurate and comprehensive analysis for sensitivity evaluation is still lacking in the current literature. In particular, previous theoretical studies of fundamental sensitivity based on Gaussian noise models are not applicable to modern cameras and detectors, which are dominated by shot noise. In this paper, we derive two shot noise-limited theoretical sensitivities, the Cramér-Rao bound and the algorithmic sensitivity, for wavelength shifting interferometry, which is a major category of on-axis interferometry techniques in quantitative phase imaging. Based on the derivations, we show that the shot noise-limited model permits accurate estimation of theoretical sensitivities directly from measured data. These results can provide important insights into fundamental constraints on system performance and can be used to guide system design and optimization. The same concepts can be generalized to other quantitative phase imaging techniques as well.
Global, quantitative and dynamic mapping of protein subcellular localization
Itzhak, Daniel N; Tyanova, Stefka; Cox, Jürgen; Borner, Georg HH
2016-01-01
Subcellular localization critically influences protein function, and cells control protein localization to regulate biological processes. We have developed and applied Dynamic Organellar Maps, a proteomic method that allows global mapping of protein translocation events. We initially used maps statically to generate a database with localization and absolute copy number information for over 8700 proteins from HeLa cells, approaching comprehensive coverage. All major organelles were resolved, with exceptional prediction accuracy (estimated at >92%). Combining spatial and abundance information yielded an unprecedented quantitative view of HeLa cell anatomy and organellar composition, at the protein level. We subsequently demonstrated the dynamic capabilities of the approach by capturing translocation events following EGF stimulation, which we integrated into a quantitative model. Dynamic Organellar Maps enable the proteome-wide analysis of physiological protein movements, without requiring any reagents specific to the investigated process, and will thus be widely applicable in cell biology. DOI: http://dx.doi.org/10.7554/eLife.16950.001 PMID:27278775
Jiménez-Moreno, Ester; Jiménez-Osés, Gonzalo; Gómez, Ana M; Santana, Andrés G; Corzana, Francisco; Bastida, Agatha; Jiménez-Barbero, Jesus; Asensio, Juan Luis
2015-11-13
CH/π interactions play a key role in a large variety of molecular recognition processes of biological relevance. However, their origins and structural determinants in water remain poorly understood. In order to improve our comprehension of these important interaction modes, we have performed a quantitative experimental analysis of a large data set comprising 117 chemically diverse carbohydrate/aromatic stacking complexes, prepared through a dynamic combinatorial approach recently developed by our group. The obtained free energies provide a detailed picture of the structure-stability relationships that govern the association process, opening the door to the rational design of improved carbohydrate-based ligands or carbohydrate receptors. Moreover, this experimental data set, supported by quantum mechanical calculations, has contributed to the understanding of the main driving forces that promote complex formation, underlining the key role played by coulombic and solvophobic forces on the stabilization of these complexes. This represents the most quantitative and extensive experimental study reported so far for CH/π complexes in water.
van der Westhuizen, Rina; Ajam, Mariam; De Coning, Piet; Beens, Jan; de Villiers, André; Sandra, Pat
2011-07-15
Fully synthetic jet fuel (FSJF) produced via Fischer-Tropsch (FT) technology was recently approved by the international aviation fuel authorities. To receive approval, comparison of the qualitative and quantitative hydrocarbon composition of FSJF with that of crude-derived fuel and blends was of utmost importance. This was performed by comprehensive two-dimensional gas chromatography (GC×GC) in the reversed phase mode. The hydrocarbon composition of synthetic and crude-derived jet fuels is very similar, and all compounds detected in the synthetic product are also present in crude-derived fuels. Quantitatively, the synthetic fuel shows a higher degree of aliphatic branching and less than half the aromatic content of the crude-derived fuel. GC×GC analyses also indicated the presence of trace levels of hetero-atomic impurities in the crude-derived product that were absent in the synthetic product. While clay-treatment removed some of the impurities and improved the fuel stability, the crude-derived product still contained traces of cyclic and aromatic S-containing compounds afterwards. The lower level of aromatics and the absence of sulphur are some of the factors that contribute to the better fuel stability and environmental properties of the synthetic fuel. GC×GC was further applied for the analysis of products during Jet Fuel Thermal Oxidation Testing (JFTOT), which measures deposit formation of a fuel under simulated engine conditions. JFTOT showed the synthetic fuel to be much more stable than the crude-derived fuel. Copyright © 2011 Elsevier B.V. All rights reserved.
Cornwell, MacIntosh; Vangala, Mahesh; Taing, Len; Herbert, Zachary; Köster, Johannes; Li, Bo; Sun, Hanfei; Li, Taiwen; Zhang, Jian; Qiu, Xintao; Pun, Matthew; Jeselsohn, Rinath; Brown, Myles; Liu, X Shirley; Long, Henry W
2018-04-12
RNA sequencing has become a ubiquitous technology used throughout life sciences as an effective method of measuring RNA abundance quantitatively in tissues and cells. The increase in use of RNA-seq technology has led to the continuous development of new tools for every step of analysis from alignment to downstream pathway analysis. However, effectively using these analysis tools in a scalable and reproducible way can be challenging, especially for non-experts. Using the workflow management system Snakemake we have developed a user friendly, fast, efficient, and comprehensive pipeline for RNA-seq analysis. VIPER (Visualization Pipeline for RNA-seq analysis) is an analysis workflow that combines some of the most popular tools to take RNA-seq analysis from raw sequencing data, through alignment and quality control, into downstream differential expression and pathway analysis. VIPER has been created in a modular fashion to allow for the rapid incorporation of new tools to expand the capabilities. This capacity has already been exploited to include very recently developed tools that explore immune infiltrate and T-cell CDR (Complementarity-Determining Regions) reconstruction abilities. The pipeline has been conveniently packaged such that minimal computational skills are required to download and install the dozens of software packages that VIPER uses. VIPER is a comprehensive solution that performs most standard RNA-seq analyses quickly and effectively with a built-in capacity for customization and expansion.
Simulation of UV atomic radiation for application in exhaust plume spectrometry
NASA Astrophysics Data System (ADS)
Wallace, T. L.; Powers, W. T.; Cooper, A. E.
1993-06-01
Quantitative analysis of exhaust plume spectral data has long been a goal of developers of advanced engine health monitoring systems which incorporate optical measurements of rocket exhaust constituents. Discussed herein is the status of present efforts to model and predict atomic radiation spectra and infer free-atom densities from emission/absorption measurements as part of the Optical Plume Anomaly Detection (OPAD) program at Marshall Space Flight Center (MSFC). A brief examination of the mathematical formalism is provided in the context of predicting radiation from the Mach disk region of the SSME exhaust flow at nominal conditions during ground level testing at MSFC. Computational results are provided for Chromium and Copper at selected transitions which indicate a strong dependence upon broadening parameter values determining the absorption-emission line shape. Representative plots of recent spectral data from the Stennis Space Center (SSC) Diagnostic Test Facility (DTF) rocket engine are presented and compared to numerical results from the present self-absorbing model; a comprehensive quantitative analysis will be reported at a later date.
Decoding tumour phenotype by noninvasive imaging using a quantitative radiomics approach
Aerts, Hugo J. W. L.; Velazquez, Emmanuel Rios; Leijenaar, Ralph T. H.; Parmar, Chintan; Grossmann, Patrick; Cavalho, Sara; Bussink, Johan; Monshouwer, René; Haibe-Kains, Benjamin; Rietveld, Derek; Hoebers, Frank; Rietbergen, Michelle M.; Leemans, C. René; Dekker, Andre; Quackenbush, John; Gillies, Robert J.; Lambin, Philippe
2014-01-01
Human cancers exhibit strong phenotypic differences that can be visualized noninvasively by medical imaging. Radiomics refers to the comprehensive quantification of tumour phenotypes by applying a large number of quantitative image features. Here we present a radiomic analysis of 440 features quantifying tumour image intensity, shape and texture, which are extracted from computed tomography data of 1,019 patients with lung or head-and-neck cancer. We find that a large number of radiomic features have prognostic power in independent data sets of lung and head-and-neck cancer patients, many of which were not identified as significant before. Radiogenomics analysis reveals that a prognostic radiomic signature, capturing intratumour heterogeneity, is associated with underlying gene-expression patterns. These data suggest that radiomics identifies a general prognostic phenotype existing in both lung and head-and-neck cancer. This may have a clinical impact as imaging is routinely used in clinical practice, providing an unprecedented opportunity to improve decision-support in cancer treatment at low cost. PMID:24892406
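For flavour only, a few first-order intensity features of the kind aggregated in radiomic signatures can be computed inside a tumour mask as below; the volume and segmentation are synthetic placeholders, and the study's actual 440-feature pipeline is far more extensive.

```python
# Illustrative first-order "radiomic" intensity features inside a tumour mask.
# The CT volume and mask are synthetic stand-ins for real segmented data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
ct_volume = rng.normal(40, 15, size=(32, 32, 32))      # placeholder HU values
mask = np.zeros_like(ct_volume, dtype=bool)
mask[8:24, 8:24, 8:24] = True                          # placeholder tumour segmentation

voxels = ct_volume[mask]
features = {
    "mean": voxels.mean(),
    "std": voxels.std(ddof=1),
    "skewness": stats.skew(voxels),
    "kurtosis": stats.kurtosis(voxels),
    "energy": np.sum(voxels.astype(float) ** 2),
}
print({k: round(float(v), 2) for k, v in features.items()})
```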
Statistical image quantification toward optimal scan fusion and change quantification
NASA Astrophysics Data System (ADS)
Potesil, Vaclav; Zhou, Xiang Sean
2007-03-01
Recent advance of imaging technology has brought new challenges and opportunities for automatic and quantitative analysis of medical images. With broader accessibility of more imaging modalities for more patients, fusion of modalities/scans from one time point and longitudinal analysis of changes across time points have become the two most critical differentiators to support more informed, more reliable and more reproducible diagnosis and therapy decisions. Unfortunately, scan fusion and longitudinal analysis are both inherently plagued with increased levels of statistical errors. A lack of comprehensive analysis by imaging scientists and a lack of full awareness by physicians pose potential risks in clinical practice. In this paper, we discuss several key error factors affecting imaging quantification, studying their interactions, and introducing a simulation strategy to establish general error bounds for change quantification across time. We quantitatively show that image resolution, voxel anisotropy, lesion size, eccentricity, and orientation are all contributing factors to quantification error; and there is an intricate relationship between voxel anisotropy and lesion shape in affecting quantification error. Specifically, when two or more scans are to be fused at feature level, optimal linear fusion analysis reveals that scans with voxel anisotropy aligned with lesion elongation should receive a higher weight than other scans. As a result of such optimal linear fusion, we will achieve a lower variance than naïve averaging. Simulated experiments are used to validate theoretical predictions. Future work based on the proposed simulation methods may lead to general guidelines and error lower bounds for quantitative image analysis and change detection.
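A small numerical illustration of the fusion argument above: inverse-variance (optimal linear) weighting of two scans with unequal noise yields a lower-variance estimate than naive averaging. The per-scan standard deviations are placeholders.

```python
# When two scans measure the same quantity with different error variances,
# inverse-variance weighting gives a lower-variance estimate than plain averaging.
import numpy as np

sigma = np.array([1.0, 3.0])                      # per-scan measurement SDs (placeholder)

var_naive = np.sum((0.5 * sigma) ** 2)            # variance of the plain average
w = (1.0 / sigma**2) / np.sum(1.0 / sigma**2)     # optimal linear weights
var_optimal = 1.0 / np.sum(1.0 / sigma**2)

print(f"weights = {np.round(w, 2)}, var(naive) = {var_naive:.2f}, var(optimal) = {var_optimal:.2f}")
```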
Li, Maozhong; Du, Yunai; Wang, Qiyue; Sun, Chunmeng; Ling, Xiang; Yu, Boyang; Tu, Jiasheng; Xiong, Yerong
2016-01-01
As essential components in formulations, pharmaceutical excipients directly affect the safety, efficacy, and stability of drugs. Recently, safety incidents involving pharmaceutical excipients that posed serious threats to patients have highlighted the necessity of controlling the potential risks. Hence, it is indispensable for the industry to establish an effective risk assessment system for the supply chain. In this study, an AHP-fuzzy comprehensive evaluation model was developed based on the analytic hierarchy process and fuzzy mathematical theory to quantitatively assess the risks of the supply chain. Taking polysorbate 80 as the example for model analysis, it was concluded that polysorbate 80 for injection is a higher-risk ingredient in the supply chain than that for oral use; to ensure safe clinical application, measures should therefore be taken to control and minimize those risks.
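A minimal sketch of the fuzzy comprehensive evaluation step in such AHP-fuzzy models, assuming invented AHP-derived weights and an invented membership matrix; it is not the authors' actual model.

```python
# Fuzzy comprehensive evaluation sketch: combine criterion weights W (e.g. from
# AHP) with a fuzzy membership matrix R (criteria x risk grades) and pick the
# grade with maximum composite membership. All numbers are invented.
import numpy as np

W = np.array([0.5, 0.3, 0.2])                 # criterion weights, invented
R = np.array([[0.1, 0.3, 0.6],                # memberships of criterion 1 in (low, medium, high)
              [0.2, 0.5, 0.3],
              [0.4, 0.4, 0.2]])

B = W @ R                                     # composite membership over risk grades
grades = ["low", "medium", "high"]
print(dict(zip(grades, np.round(B, 2))), "->", grades[int(np.argmax(B))])
```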
Mixed methods research - the best of both worlds?
van Griensven, Hubert; Moore, Ann P; Hall, Valerie
2014-10-01
There has been a bias towards quantitative research approaches within manual therapy, which may have resulted in a narrow understanding of manual therapy practice. The aim of this Masterclass is to make a contribution to the expansion of methodologies used in manual therapy enquiry by discussing mixed methods research (MMR), a methodology which utilises both qualitative and quantitative methods within a single study in order to provide more comprehensive insights. To review rationales for MMR, as well as some of the common design options and potential difficulties. The paper also discusses theoretical frameworks that have been used to underpin qualitative and quantitative research, and ongoing debates about the possibility of combining them. Complexities associated with health and manual therapy cannot always be investigated satisfactorily by using a single research method. Some issues require a more comprehensive understanding, which may be provided by combining the strengths of quantitative and qualitative methods in a mixed methods study. Copyright © 2014 Elsevier Ltd. All rights reserved.
Taoka, Masato; Nobe, Yuko; Hori, Masayuki; Takeuchi, Aiko; Masaki, Shunpei; Yamauchi, Yoshio; Nakayama, Hiroshi; Takahashi, Nobuhiro; Isobe, Toshiaki
2015-01-01
We present a liquid chromatography–mass spectrometry (LC-MS)-based method for comprehensive quantitative identification of post-transcriptional modifications (PTMs) of RNA. We incorporated an in vitro-transcribed, heavy isotope-labeled reference RNA into a sample RNA solution, digested the mixture with a number of RNases and detected the post-transcriptionally modified oligonucleotides quantitatively based on shifts in retention time and the MS signal in subsequent LC-MS. This allowed the determination and quantitation of all PTMs in Schizosaccharomyces pombe ribosomal (r)RNAs and generated the first complete PTM maps of eukaryotic rRNAs at single-nucleotide resolution. There were 122 modified sites, most of which appear to locate at the interface of ribosomal subunits where translation takes place. We also identified PTMs at specific locations in rRNAs that were altered in response to growth conditions of yeast cells, suggesting that the cells coordinately regulate the modification levels of RNA. PMID:26013808
Saravia-Pinilla, Martha H; Daza-Beltrán, Carolina; García-Acosta, Gabriel
2016-11-01
This article presents the results of a documentary-exploratory review of design methods and concepts associated with human and environmental factors, based on a qualitative-quantitative analysis of coincidences with the fundamentals of ergoecology and in line with sustainable dynamics, with a view to putting the principles of ergoecology into practice in product/service design and development. 61.6% of 696 documents found represent work on conceptual developments, while the remaining 38.4% refer to design methods. Searches were refined using Nvivo-10 software, and 101 documents were obtained about theoretical aspects while 17 focused on the application of methods, and these formed the analysis universe. The results show how little concern there is for working comprehensively on human and environmental aspects, and a trend toward segmentation of human and environmental aspects in the field of product/service design and development can be seen, at both concept and application/methodology levels. It was concluded from the above that comprehensive, simultaneous work is needed on human and environmental aspects, clarity and conceptual unity, in order to achieve sustainability in practical matters and ensure that ergoecology-compatible design methods are applied. Copyright © 2015 Elsevier Ltd. All rights reserved.
Yoshida, Yoko; Miyata, Toshiyuki; Matsumoto, Masanori; Shirotani-Ikejima, Hiroko; Uchida, Yumiko; Ohyama, Yoshifumi; Kokubo, Tetsuro; Fujimura, Yoshihiro
2015-01-01
For thrombotic microangiopathies (TMAs), the diagnosis of atypical hemolytic uremic syndrome (aHUS) is made by ruling out Shiga toxin-producing Escherichia coli (STEC)-associated HUS and ADAMTS13 activity-deficient thrombotic thrombocytopenic purpura (TTP), often using the exclusion criteria for secondary TMAs. Nowadays, assays for ADAMTS13 activity and evaluation for STEC infection can be performed within a few hours. However, a confident diagnosis of aHUS often requires comprehensive gene analysis of the alternative complement activation pathway, which usually takes at least several weeks. However, predisposing genetic abnormalities are only identified in approximately 70% of aHUS. To facilitate the diagnosis of complement-mediated aHUS, we describe a quantitative hemolytic assay using sheep red blood cells (RBCs) and human citrated plasma, spiked with or without a novel inhibitory anti-complement factor H (CFH) monoclonal antibody. Among 45 aHUS patients in Japan, 24% (11/45) had moderate-to-severe (≥50%) hemolysis, whereas the remaining 76% (34/45) patients had mild or no hemolysis (<50%). The former group is largely attributed to CFH-related abnormalities, and the latter group has C3-p.I1157T mutations (16/34), which were identified by restriction fragment length polymorphism (RFLP) analysis. Thus, a quantitative hemolytic assay coupled with RFLP analysis enabled the early diagnosis of complement-mediated aHUS in 60% (27/45) of patients in Japan within a week of presentation. We hypothesize that this novel quantitative hemolytic assay would be more useful in a Caucasian population, who may have a higher proportion of CFH mutations than Japanese patients. PMID:25951460
Barlow, Pepita; McKee, Martin; Basu, Sanjay; Stuckler, David
2017-03-08
Regional trade agreements are major international policy instruments that shape macro-economic and political systems. There is widespread debate as to whether and how these agreements pose risks to public health. Here we perform a comprehensive systematic review of quantitative studies of the health impact of trade and investment agreements. We identified studies from searches in PubMed, Web of Science, EMBASE, and Global Health Online. Research articles were eligible for inclusion if they were quantitative studies of the health impacts of trade and investment agreements or policy. We systematically reviewed study findings, evaluated quality using the Quality Assessment Tool from the Effective Public Health Practice Project, and performed network citation analysis to study disciplinary siloes. Seventeen quantitative studies met our inclusion criteria. There was consistent evidence that implementing trade agreements was associated with increased consumption of processed foods and sugar-sweetened beverages. Granting import licenses for patented drugs was associated with increased access to pharmaceuticals. Implementing trade agreements and associated policies was also correlated with higher cardiovascular disease incidence and higher Body Mass Index (BMI), whilst correlations with tobacco consumption, under-five mortality, maternal mortality, and life expectancy were inconclusive. Overall, the quality of studies is weak or moderately weak, and co-citation analysis revealed a relative isolation of public health from economics. We identified limitations in existing studies which preclude definitive conclusions of the health impacts of regional trade and investment agreements. Few address unobserved confounding, and many possible consequences and mechanisms linking trade and investment agreements to health remain poorly understood. Results from our co-citation analysis suggest scope for greater interdisciplinary collaboration. Notwithstanding these limitations, our results find evidence that trade agreements pose some significant health risks. Health protections in trade and investment treaties may mitigate these impacts.
Fu, Wei; Xie, Wen; Zhang, Zhuo; Wang, Shaoli; Wu, Qingjun; Liu, Yong; Zhou, Xiaomao; Zhou, Xuguo; Zhang, Youjun
2013-01-01
Quantitative real-time PCR (qRT-PCR), a primary tool in gene expression analysis, requires an appropriate normalization strategy to control for variation among samples. The best option is to compare the mRNA level of a target gene with that of reference gene(s) whose expression level is stable across various experimental conditions. In this study, expression profiles of eight candidate reference genes from the diamondback moth, Plutella xylostella, were evaluated under diverse experimental conditions. RefFinder, a web-based analysis tool, integrates four major computational programs including geNorm, Normfinder, BestKeeper, and the comparative ΔCt method to comprehensively rank the tested candidate genes. Elongation factor 1 (EF1) was the most suited reference gene for the biotic factors (development stage, tissue, and strain). In contrast, although appropriate reference gene(s) do exist for several abiotic factors (temperature, photoperiod, insecticide, and mechanical injury), we were not able to identify a single universal reference gene. Nevertheless, a suite of candidate reference genes were specifically recommended for selected experimental conditions. Our finding is the first step toward establishing a standardized qRT-PCR analysis of this agriculturally important insect pest. PMID:23983612
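A sketch of the comparative ΔCt idea used (alongside geNorm, NormFinder and BestKeeper) for ranking candidate reference genes; the Ct values below are invented and chosen only to illustrate the calculation, not taken from the study.

```python
# Comparative delta-Ct sketch: for each candidate reference gene, average the
# standard deviation of its delta-Ct against every other candidate across
# samples; lower values indicate more stable expression. Ct values are invented.
import numpy as np

ct = {                                         # Ct values per candidate gene across 5 samples
    "EF1":   np.array([18.1, 18.3, 18.0, 18.2, 18.4]),
    "ACTB":  np.array([19.0, 19.9, 18.6, 19.8, 18.8]),
    "GAPDH": np.array([20.9, 20.1, 21.2, 20.0, 20.8]),
}

def stability(gene, table):
    others = [g for g in table if g != gene]
    return np.mean([np.std(table[gene] - table[g], ddof=1) for g in others])

for g in sorted(ct, key=lambda g: stability(g, ct)):
    print(f"{g}: mean SD of delta-Ct = {stability(g, ct):.2f}")
```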
Glosses, Comprehension, and Strategy Use
ERIC Educational Resources Information Center
Ko, Myong Hee
2005-01-01
This study, using both qualitative and quantitative measures, investigates how different types of gloss conditions affect Korean college students' reading comprehension. One hundred and six undergraduates at a university in Korea participated in the study. Twelve were assigned to think aloud, and the rest (ninety-four) took part in the main study…
The Effect of Electronic Storybooks on Struggling Fourth-Graders' Reading Comprehension
ERIC Educational Resources Information Center
Ertem, Ihsan Seyit
2010-01-01
This quantitative research examined the differences in struggling readers' comprehension of storybooks according to the medium of presentation. Each student was randomly assigned with one of three conditions: (1) computer presentation of storybooks with animation; (2) computer presentation of storybooks without animation; and (3) traditional print…
Barthélémy, Daniel; Caraglio, Yves
2007-01-01
Background and Aims The architecture of a plant depends on the nature and relative arrangement of each of its parts; it is, at any given time, the expression of an equilibrium between endogenous growth processes and exogenous constraints exerted by the environment. The aim of architectural analysis is, by means of observation and sometimes experimentation, to identify and understand these endogenous processes and to separate them from the plasticity of their expression resulting from external influences. Scope Using the identification of several morphological criteria and considering the plant as a whole, from germination to death, architectural analysis is essentially a detailed, multilevel, comprehensive and dynamic approach to plant development. Despite their recent origin, architectural concepts and analysis methods provide a powerful tool for studying plant form and ontogeny. Complemented by precise morphological observations and appropriate quantitative methods of analysis, recent research in this field has greatly increased our understanding of plant structure and development and has led to the establishment of a real conceptual and methodological framework for plant form and structure analysis and representation. This paper is a summarized update of current knowledge on plant architecture and morphology; its implications and possible role in various aspects of modern plant biology are also discussed. PMID:17218346
Multifractal spectrum and lacunarity as measures of complexity of osseointegration.
de Souza Santos, Daniel; Dos Santos, Leonardo Cavalcanti Bezerra; de Albuquerque Tavares Carvalho, Alessandra; Leão, Jair Carneiro; Delrieux, Claudio; Stosic, Tatijana; Stosic, Borko
2016-07-01
The goal of this study is to contribute to a better quantitative description of the early stages of osseointegration, by application of fractal, multifractal, and lacunarity analysis. Fractal, multifractal, and lacunarity analysis are performed on scanning electron microscopy (SEM) images of titanium implants that were first subjected to different treatment combinations of i) sand blasting, ii) acid etching, and iii) exposition to calcium phosphate, and were then submersed in a simulated body fluid (SBF) for 30 days. All three numerical techniques are applied to the implant SEM images before and after SBF immersion, in order to provide a comprehensive set of common quantitative descriptors. It is found that implants subjected to different physicochemical treatments before submersion in SBF exhibit a rather similar level of complexity, while the great variety of crystal forms after SBF submersion yields rather different quantitative measures of complexity for the different treatments. In particular, it is found that acid treatment, in most combinations with the other considered treatments, leads to a higher fractal dimension (more uniform distribution of crystals), lower lacunarity (lesser variation in gap sizes), and narrowing of the multifractal spectrum (smaller fluctuations on different scales). The current quantitative description has shown the capacity to capture the main features of complex images of implant surfaces, for several different treatments. Such quantitative description should provide a fundamental tool for future large scale systematic studies, considering the large variety of possible implant treatments and their combinations. Quantitative description of early stages of osseointegration on titanium implants with different treatments should help develop a better understanding of this phenomenon, in general, and provide basis for further systematic experimental studies. Clinical practice should benefit from such studies in the long term, by more ready access to implants of higher quality.
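A hedged sketch of the box-counting estimate of fractal dimension on a binary (thresholded) image, one of the three measures used above; the random test image is a stand-in for real SEM data, and the study's actual pipeline is not reproduced here.

```python
# Box-counting fractal dimension sketch: count occupied boxes at several box
# sizes and fit the slope of log N(s) versus log(1/s). The random binary image
# is a placeholder for a thresholded SEM micrograph.
import numpy as np

rng = np.random.default_rng(1)
img = rng.random((256, 256)) < 0.3            # placeholder binary image

def box_count(binary, size):
    h, w = binary.shape
    trimmed = binary[:h - h % size, :w - w % size]
    blocks = trimmed.reshape(h // size, size, w // size, size)
    return np.count_nonzero(blocks.any(axis=(1, 3)))

sizes = np.array([2, 4, 8, 16, 32])
counts = np.array([box_count(img, s) for s in sizes])
slope, _ = np.polyfit(np.log(1.0 / sizes), np.log(counts), 1)
print(f"box-counting dimension ~ {slope:.2f}")
```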
A Simple and Rapid UPLC-PDA Method for Quality Control of Nardostachys jatamansi.
Zhang, Weize; Nan, Guo; Wu, Hong-Hua; Jiang, Miaomiao; Li, Tian-Xiang; Wang, Meng; Gao, Xiu-Mei; Zhu, Yan; Song, Yun Seon; Wang, Jiaming; Xu, Yan-Tong
2018-05-01
Nardostachys jatamansi is a well-documented herbal agent used to treat digestive and neuropsychiatric disorders in oriental medicinal systems. However, few simple, rapid, and comprehensive methods were reported for quality assessment and control of N. jatamansi. Herein, a UPLC with photodiode array detection method was developed for both fingerprint investigation of N. jatamansi and simultaneous quantitative analysis of the six serotonin transporter modulatory constituents in N. jatamansi. For chromatographic fingerprinting, 24 common peaks were selected as characteristic peaks to assess the consistency of N. jatamansi samples from different retail sources. Six of the common peaks (5, 7, 12, and 16-18) were identified as desoxo-narchinol A, buddleoside, isonardosinone, nardosinone, kanshone H, and (-)-aristolone, respectively, by phytochemical investigation. Five of the six compounds significantly either enhanced or inhibited serotonin transporter activity, while (-)-aristolone (18) didn't show any serotonin transporter activity. In quantitative analysis, the six compounds showed good linearity (r > 0.999) within test ranges. The precision, expressed as relative standard deviation, was in the range of 0.25-2.77%, and the recovery of the method was in the range of 92-105%. The UPLC-photodiode array detection-based fingerprint analysis and quantitative methods reported here could be used for routine quality control of N. jatamansi. Georg Thieme Verlag KG Stuttgart · New York.
Banerjee, Imon; Malladi, Sadhika; Lee, Daniela; Depeursinge, Adrien; Telli, Melinda; Lipson, Jafi; Golden, Daniel; Rubin, Daniel L
2018-01-01
Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is sensitive but not specific to determining treatment response in early stage triple-negative breast cancer (TNBC) patients. We propose an efficient computerized technique for assessing treatment response, specifically the residual tumor (RT) status and pathological complete response (pCR), in response to neoadjuvant chemotherapy. The proposed approach is based on Riesz wavelet analysis of pharmacokinetic maps derived from noninvasive DCE-MRI scans, obtained before and after treatment. We compared the performance of Riesz features with the traditional gray level co-occurrence matrices and a comprehensive characterization of the lesion that includes a wide range of quantitative features (e.g., shape and boundary). We investigated a set of predictive models incorporating distinct combinations of quantitative characterizations and statistical models at different time points of the treatment, and some of the reported area under the receiver operating characteristic curve (AUC) values are above 0.8. The most efficient models are based on first-order statistics and Riesz wavelets, which predicted RT with an AUC value of 0.85 and pCR with an AUC value of 0.83, improving on results reported in a previous study. Our findings suggest that Riesz texture analysis of TNBC lesions can be considered a potential framework for optimizing TNBC patient care.
Reddy, Panga Jaipal; Sinha, Sneha; Ray, Sandipan; Sathe, Gajanan J.; Chatterjee, Aditi; Prasad, T. S. Keshava; Dhali, Snigdha; Srikanth, Rapole; Panda, Dulal; Srivastava, Sanjeeva
2015-01-01
Curcumin is a natural dietary compound with antimicrobial activity against various gram positive and negative bacteria. This study aims to investigate the proteome level alterations in Bacillus subtilis due to curcumin treatment and identification of its molecular/cellular targets to understand the mechanism of action. We have performed a comprehensive proteomic analysis of B. subtilis AH75 strain at different time intervals of curcumin treatment (20, 60 and 120 min after the drug exposure, three replicates) to compare the protein expression profiles using two complementary quantitative proteomic techniques, 2D-DIGE and iTRAQ. To the best of our knowledge, this is the first comprehensive longitudinal investigation describing the effect of curcumin treatment on the B. subtilis proteome. The proteomics analysis revealed several interesting targets such as UDP-N-acetylglucosamine 1-carboxyvinyltransferase 1, putative septation protein SpoVG and ATP-dependent Clp protease proteolytic subunit. Further, in silico pathway analysis using DAVID and KOBAS revealed modulation of pathways related to fatty acid metabolism and cell wall synthesis, which are crucial for cell viability. Our findings revealed that curcumin treatment led to inhibition of cell wall and fatty acid synthesis in addition to differential expression of many crucial proteins involved in modulation of bacterial metabolism. Findings obtained from the proteomics analysis were further validated using a 5-cyano-2,3-ditolyl tetrazolium chloride (CTC) assay for respiratory activity, a resazurin assay for metabolic activity and a membrane integrity assay by potassium and inorganic phosphate leakage measurement. The gene expression analysis of selected cell wall biosynthesis enzymes strengthened the proteomics findings and indicated the major effect of curcumin on cell division. PMID:25874956
Reddy, Panga Jaipal; Sinha, Sneha; Ray, Sandipan; Sathe, Gajanan J; Chatterjee, Aditi; Prasad, T S Keshava; Dhali, Snigdha; Srikanth, Rapole; Panda, Dulal; Srivastava, Sanjeeva
2015-01-01
Curcumin is a natural dietary compound with antimicrobial activity against various gram positive and negative bacteria. This study aims to investigate the proteome level alterations in Bacillus subtilis due to curcumin treatment and identification of its molecular/cellular targets to understand the mechanism of action. We have performed a comprehensive proteomic analysis of B. subtilis AH75 strain at different time intervals of curcumin treatment (20, 60 and 120 min after the drug exposure, three replicates) to compare the protein expression profiles using two complementary quantitative proteomic techniques, 2D-DIGE and iTRAQ. To the best of our knowledge, this is the first comprehensive longitudinal investigation describing the effect of curcumin treatment on the B. subtilis proteome. The proteomics analysis revealed several interesting targets such as UDP-N-acetylglucosamine 1-carboxyvinyltransferase 1, putative septation protein SpoVG and ATP-dependent Clp protease proteolytic subunit. Further, in silico pathway analysis using DAVID and KOBAS revealed modulation of pathways related to fatty acid metabolism and cell wall synthesis, which are crucial for cell viability. Our findings revealed that curcumin treatment led to inhibition of cell wall and fatty acid synthesis in addition to differential expression of many crucial proteins involved in modulation of bacterial metabolism. Findings obtained from the proteomics analysis were further validated using a 5-cyano-2,3-ditolyl tetrazolium chloride (CTC) assay for respiratory activity, a resazurin assay for metabolic activity and a membrane integrity assay by potassium and inorganic phosphate leakage measurement. The gene expression analysis of selected cell wall biosynthesis enzymes strengthened the proteomics findings and indicated the major effect of curcumin on cell division.
Quantitative Assessment of Cancer Risk from Exposure to Diesel Engine Emissions
Quantitative estimates of lung cancer risk from exposure to diesel engine emissions were developed using data from three chronic bioassays with Fischer 344 rats. Human target organ dose was estimated with the aid of a comprehensive dosimetry model. This model accounted for rat-hum...
Developmental toxicity is a relevant endpoint for the comprehensive assessment of human health risk from chemical exposure. However, animal developmental toxicity studies remain unavailable for many environmental contaminants due to the complexity and cost of these types of analy...
Schenker, Victoria J.; Petrill, Stephen A.
2015-01-01
This study investigated the genetic and environmental influences on observed associations between listening comprehension, reading motivation, and reading comprehension. Univariate and multivariate quantitative genetic models were conducted in a sample of 284 pairs of twins at a mean age of 9.81 years. Genetic and nonshared environmental factors accounted for statistically significant variance in listening and reading comprehension, and nonshared environmental factors accounted for variance in reading motivation. Furthermore, listening comprehension demonstrated unique genetic and nonshared environmental influences but also had overlapping genetic influences with reading comprehension. Reading motivation and reading comprehension each had unique and overlapping nonshared environmental contributions. Therefore, listening comprehension appears to be related to reading primarily due to genetic factors whereas motivation appears to affect reading via child-specific, nonshared environmental effects. PMID:26321677
Schenker, Victoria J; Petrill, Stephen A
2015-01-01
This study investigated the genetic and environmental influences on observed associations between listening comprehension, reading motivation, and reading comprehension. Univariate and multivariate quantitative genetic models were conducted in a sample of 284 pairs of twins at a mean age of 9.81 years. Genetic and nonshared environmental factors accounted for statistically significant variance in listening and reading comprehension, and nonshared environmental factors accounted for variance in reading motivation. Furthermore, listening comprehension demonstrated unique genetic and nonshared environmental influences but also had overlapping genetic influences with reading comprehension. Reading motivation and reading comprehension each had unique and overlapping nonshared environmental contributions. Therefore, listening comprehension appears to be related to reading primarily due to genetic factors whereas motivation appears to affect reading via child-specific, nonshared environmental effects. Copyright © 2015 Elsevier Inc. All rights reserved.
Ash, Tayla; Agaronov, Alen; Young, Ta'Loria; Aftosmes-Tobio, Alyssa; Davison, Kirsten K
2017-08-24
A wide range of interventions has been implemented and tested to prevent obesity in children. Given parents' influence and control over children's energy-balance behaviors, including diet, physical activity, media use, and sleep, family interventions are a key strategy in this effort. The objective of this study was to profile the field of recent family-based childhood obesity prevention interventions by employing systematic review and quantitative content analysis methods to identify gaps in the knowledge base. Using a comprehensive search strategy, we searched the PubMed, PsycINFO, and CINAHL databases to identify eligible interventions aimed at preventing childhood obesity with an active family component published between 2008 and 2015. Characteristics of study design, behavioral domains targeted, and sample demographics were extracted from eligible articles using a comprehensive codebook. More than 90% of the 119 eligible interventions were based in the United States, Europe, or Australia. Most interventions targeted children 2-5 years of age (43%) or 6-10 years of age (35%), with few studies targeting the prenatal period (8%) or children 14-17 years of age (7%). The home (28%), primary health care (27%), and community (33%) were the most common intervention settings. Diet (90%) and physical activity (82%) were more frequently targeted in interventions than media use (55%) and sleep (20%). Only 16% of interventions targeted all four behavioral domains. In addition to studies in developing countries, racial minorities and non-traditional families were also underrepresented; Hispanic/Latino families and families of low socioeconomic status, however, were highly represented. The limited number of interventions targeting diverse populations and obesity risk behaviors beyond diet and physical activity inhibits the development of comprehensive, tailored interventions. To ensure a broad evidence base, more interventions implemented in developing countries and targeting racial minorities, children at both ends of the age spectrum, and media and sleep behaviors would be beneficial. This study can help inform future decision-making around the design and funding of family-based interventions to prevent childhood obesity.
Roca, M; Leon, N; Pastor, A; Yusà, V
2014-12-29
In this study we propose an analytical strategy that combines a target approach for the quantitative analysis of contemporary pesticide metabolites with a comprehensive post-target screening for the identification of biomarkers of exposure to environmental contaminants in urine using liquid chromatography coupled to high-resolution mass spectrometry (LC–HRMS). The quantitative method for the target analysis of 29 urinary metabolites of organophosphate (OP) insecticides, synthetic pyrethroids, herbicides and fungicides was validated after a previous statistical optimization of the main factors governing ion source ionization and a fragmentation study using the high energy collision dissociation (HCD) cell. The full scan accurate mass data were acquired with a resolving power of 50,000 FWHM (scan speed, 2 Hz), in both ESI+ and ESI− modes, and with and without HCD fragmentation. The method LOQs were lower than 3.2 μg L−1 for the majority of the analytes. For post-target screening, a customized theoretical database was built for the identification of 60 metabolites, including pesticides, PAHs, phenols, and other metabolites of environmental pollutants. For identification purposes, accurate mass with an error of less than 5 ppm, and diagnostic ions including isotopes and/or fragments, were used. The analytical strategy was applied to 20 urine samples collected from children living in the Valencia Region. Eleven target metabolites were detected with concentrations ranging from 1.18 to 131 μg L−1. Likewise, several compounds belonging to the families of phthalates, phenols and parabens were tentatively identified in the post-target analysis. The proposed strategy is suitable for the determination of target pesticide biomarkers in urine in the framework of biomonitoring studies, and appropriate for the identification of other non-target metabolites.
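The identification criterion of less than 5 ppm mass error used in such accurate-mass screening is a simple relative error calculation; a minimal sketch with placeholder m/z values follows.

```python
# Relative mass error (ppm) used as an identification criterion in accurate-mass
# screening; the measured and theoretical m/z values here are placeholders.
def ppm_error(measured_mz: float, theoretical_mz: float) -> float:
    return (measured_mz - theoretical_mz) / theoretical_mz * 1e6

print(round(ppm_error(230.0815, 230.0806), 1), "ppm")  # accepted if |error| < 5 ppm
```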
Yang, Yan-Mei; Lin, Li; Lu, You-Yuan; Ma, Xiao-Hui; Jin, Ling; Zhu, Tian-Tian
2016-03-01
The study aimed to analyze the commercial specifications and grades of wild and cultivated Gentianae Macrophyllae Radix based on multiple indicative constituents. Seven main chemical components of Gentianae Macrophyllae Radix were determined by UPLC, and the chemical quality levels of Gentianae Macrophyllae Radix were then clustered and classified by modern statistical methods (canonical correspondence analysis, Fisher discriminant analysis and so on). The quality indices were selected and their correlations were analyzed. Lastly, comprehensive quantitative quality grades were established across the different commercial specifications, and across grades within the same specification, for both wild and cultivated material. The results provide a basis for a reasonable division of the commercial specifications and grades of Gentianae Macrophyllae Radix. The range for quality evaluation of the main index components (gentiopicrin, loganic acid and swertiamarin) was proposed, and the Herbal Quality Index (HQI) was introduced. The grade discriminant function was established on the basis of quality by Fisher discriminant analysis. According to the analysis, the quality of wild and cultivated Luobojiao, one of the commercial specifications of Gentianae Macrophyllae Radix, was the best; Mahuajiao, another commercial specification, was average; and Xiaoqinjiao was inferior. Among grades, the quality of first-class cultivated Luobojiao was the worst, the second class was intermediate, and the third class was the best; the quality of first-class wild Luobojiao was intermediate, and the second class was the best; the quality of second-class Mahuajiao was intermediate, and the first class was the best; the quality of first-class Xiaoqinjiao was intermediate, and the second class was the better of the two grades, although the difference was not significant. The method provides a new approach for comprehensive quantitative evaluation of the quality of Gentianae Macrophyllae Radix. Copyright© by the Chinese Pharmaceutical Association.
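As a minimal sketch of the kind of grade assignment described above, the snippet below fits a Fisher linear discriminant to a few invented chemical-index measurements and predicts the grade of a new sample; the feature values, grade labels, and two-component feature set are illustrative assumptions and do not reproduce the study's discriminant functions.

```python
# Sketch of grade classification by Fisher discriminant analysis from chemical
# indices (e.g., gentiopicrin and loganic acid contents); all numbers are invented.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X = np.array([[5.2, 2.1], [4.8, 1.9],    # grade 1 training samples
              [3.1, 1.0], [2.9, 1.2],    # grade 2 training samples
              [1.5, 0.4], [1.7, 0.5]])   # grade 3 training samples
y = np.array([1, 1, 2, 2, 3, 3])         # known quality grades

lda = LinearDiscriminantAnalysis().fit(X, y)
print("predicted grade:", lda.predict([[3.0, 1.1]])[0])
```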
Buckler, Andrew J; Liu, Tiffany Ting; Savig, Erica; Suzek, Baris E; Ouellette, M; Danagoulian, J; Wernsing, G; Rubin, Daniel L; Paik, David
2013-08-01
A widening array of novel imaging biomarkers is being developed using ever more powerful clinical and preclinical imaging modalities. These biomarkers have demonstrated effectiveness in quantifying biological processes as they occur in vivo and in the early prediction of therapeutic outcomes. However, quantitative imaging biomarker data and knowledge are not standardized, representing a critical barrier to accumulating medical knowledge based on quantitative imaging data. We use an ontology to represent, integrate, and harmonize heterogeneous knowledge across the domain of imaging biomarkers. This advances the goal of developing applications to (1) improve precision and recall of storage and retrieval of quantitative imaging-related data using standardized terminology; (2) streamline the discovery and development of novel imaging biomarkers by normalizing knowledge across heterogeneous resources; (3) effectively annotate imaging experiments thus aiding comprehension, re-use, and reproducibility; and (4) provide validation frameworks through rigorous specification as a basis for testable hypotheses and compliance tests. We have developed the Quantitative Imaging Biomarker Ontology (QIBO), which currently consists of 488 terms spanning the following upper classes: experimental subject, biological intervention, imaging agent, imaging instrument, image post-processing algorithm, biological target, indicated biology, and biomarker application. We have demonstrated that QIBO can be used to annotate imaging experiments with standardized terms in the ontology and to generate hypotheses for novel imaging biomarker-disease associations. Our results established the utility of QIBO in enabling integrated analysis of quantitative imaging data.
Li, Caixia; Tan, Xing Fei; Lim, Teck Kwang; Lin, Qingsong; Gong, Zhiyuan
2016-04-13
Omic approaches have been increasingly used in the zebrafish model for holistic understanding of molecular events and mechanisms of tissue functions. However, plasma is rarely used for omic profiling because of the technical challenges in collecting sufficient blood. In this study, we employed two mass spectrometric (MS) approaches for a comprehensive characterization of the zebrafish plasma proteome, i.e. conventional shotgun liquid chromatography-tandem mass spectrometry (LC-MS/MS) for an overview study and quantitative SWATH (Sequential Window Acquisition of all THeoretical fragment-ion spectra) for comparison between genders. A total of 959 proteins were identified in the shotgun profiling, with estimated concentrations spanning almost five orders of magnitude. Other than the presence of a few highly abundant female egg yolk precursor proteins (vitellogenins), the proteomic profiles of male and female plasmas were very similar in both number and abundance, and there were basically no other highly gender-biased proteins. The types of plasma proteins based on IPA (Ingenuity Pathway Analysis) classification and tissue sources of production were also very similar. Furthermore, the zebrafish plasma proteome shares significant similarities with the human plasma proteome, in particular among the most abundant proteins, including apolipoproteins and complements. Thus, the current study provided a valuable dataset for future evaluation of plasma proteins in zebrafish.
Li, Caixia; Tan, Xing Fei; Lim, Teck Kwang; Lin, Qingsong; Gong, Zhiyuan
2016-01-01
Omic approaches have been increasingly used in the zebrafish model for holistic understanding of molecular events and mechanisms of tissue functions. However, plasma is rarely used for omic profiling because of the technical challenges in collecting sufficient blood. In this study, we employed two mass spectrometric (MS) approaches for a comprehensive characterization of the zebrafish plasma proteome, i.e. conventional shotgun liquid chromatography-tandem mass spectrometry (LC-MS/MS) for an overview study and quantitative SWATH (Sequential Window Acquisition of all THeoretical fragment-ion spectra) for comparison between genders. A total of 959 proteins were identified in the shotgun profiling, with estimated concentrations spanning almost five orders of magnitude. Other than the presence of a few highly abundant female egg yolk precursor proteins (vitellogenins), the proteomic profiles of male and female plasmas were very similar in both number and abundance, and there were basically no other highly gender-biased proteins. The types of plasma proteins based on IPA (Ingenuity Pathway Analysis) classification and tissue sources of production were also very similar. Furthermore, the zebrafish plasma proteome shares significant similarities with the human plasma proteome, in particular among the most abundant proteins, including apolipoproteins and complements. Thus, the current study provided a valuable dataset for future evaluation of plasma proteins in zebrafish. PMID:27071722
Late paleozoic fusulinoidean gigantism driven by atmospheric hyperoxia.
Payne, Jonathan L; Groves, John R; Jost, Adam B; Nguyen, Thienan; Moffitt, Sarah E; Hill, Tessa M; Skotheim, Jan M
2012-09-01
Atmospheric hyperoxia, with pO(2) in excess of 30%, has long been hypothesized to account for late Paleozoic (360-250 million years ago) gigantism in numerous higher taxa. However, this hypothesis has not been evaluated statistically because comprehensive size data have not been compiled previously at sufficient temporal resolution to permit quantitative analysis. In this study, we test the hyperoxia-gigantism hypothesis by examining the fossil record of fusulinoidean foraminifers, a dramatic example of protistan gigantism with some individuals exceeding 10 cm in length and exceeding their relatives by six orders of magnitude in biovolume. We assembled and examined comprehensive regional and global, species-level datasets containing 270 and 1823 species, respectively. A statistical model of size evolution forced by atmospheric pO(2) is conclusively favored over alternative models based on random walks or a constant tendency toward size increase. Moreover, the ratios of volume to surface area in the largest fusulinoideans are consistent in magnitude and trend with a mathematical model based on oxygen transport limitation. We further validate the hyperoxia-gigantism model through an examination of modern foraminiferal species living along a measured gradient in oxygen concentration. These findings provide the first quantitative confirmation of a direct connection between Paleozoic gigantism and atmospheric hyperoxia. © 2012 The Author(s). Evolution© 2012 The Society for the Study of Evolution.
Hansmeier, Nicole; Buttigieg, Josef; Kumar, Pankaj; Pelle, Shaneen; Choi, Kyoo Yoon; Kopriva, David; Chao, Tzu-Chiao
2018-01-05
Atherosclerosis is a chronic inflammatory disease with complex pathobiology and one of the most common causes of cardiovascular events. The process is characterized by complex vascular remodeling processes that require the actions of numerous proteins. The composition of atherosclerotic plaque is increasingly recognized as a major factor governing the occurrence of cardiovascular or neurological symptoms. To gain deeper insights into the composition of atherosclerotic plaques, we created quantitative proteome profiles of advanced plaque tissues of six male patients undergoing carotid endarterectomy for stroke prevention. Using a quantitative, data-independent proteome approach, we identified 4181 proteins with an average protein coverage of 45%. An analysis of the quantitative composition of the tissue revealed key players of vascular remodeling processes. Moreover, compared with proximal arterial tissue, 20 proteins in mature plaques were enriched, whereas 52 proteins were found in lower quantities. Among the proteins with increased abundance were prominent extracellular matrix proteins such as biglycan and lumican, whereas cytoskeletal markers for contractile smooth muscle cells (SMCs) were decreased. Taken together, this study provides the most comprehensive quantitative assessment of mature human plaque tissue to date, which indicates a central role of SMCs in the structure of advanced atherosclerotic plaques.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-15
... research to explore issues of quantitative benefit information. They all described the collection of data... research will involve quantitative assessment of the comprehension of important information in the document... of experiences and varying degrees of satisfaction with information currently provided at the time...
da Silva, Simone Albino; Baitelo, Tamara Cristina; Fracolli, Lislaine Aparecida
2015-01-01
To evaluate the attributes of primary health care (access; longitudinality; comprehensiveness; coordination; family counseling; and community counseling) in the Family Health Strategy, triangulating and comparing the views of the stakeholders involved in the care process. Evaluative research with a quantitative approach and cross-sectional design. Data were collected using the Primary Care Assessment Tool through interviews with 527 adult clients, 34 health professionals, and 330 parents of children up to two years old, related to 33 family health teams in eleven municipalities. Analysis was conducted in the Statistical Package for the Social Sciences software, with a confidence interval of 95% and an error of 0.1. The three groups assessed first contact access - accessibility with low scores. Professionals evaluated the other attributes with high scores. Clients assigned low scores to the attributes: community counseling; family counseling; comprehensiveness - services rendered; comprehensiveness - available services. The quality of performance self-reported by the professionals of the Family Health Strategy is not perceived or valued by clients, and the actions and services may have been developed inappropriately or insufficiently to be apprehended through the experience of clients.
Shilling, Val; Morris, Christopher; Thompson-Coon, Jo; Ukoumunne, Obioha; Rogers, Morwenna; Logan, Stuart
2013-07-01
To review the qualitative and quantitative evidence of the benefits of peer support for parents of children with disabling conditions in the context of health, well-being, impact on family, and economic and service implications. We comprehensively searched multiple databases. Eligible studies evaluated parent-to-parent support and reported on the psychological health and experience of giving or receiving support. There were no limits on the child's condition, study design, language, date, or setting. We sought to aggregate quantitative data; findings of qualitative studies were combined using thematic analysis. Qualitative and quantitative data were brought together in a narrative synthesis. Seventeen papers were included: nine qualitative studies, seven quantitative studies, and one mixed-methods evaluation. Four themes were identified from qualitative studies: (1) shared social identity, (2) learning from the experiences of others, (3) personal growth, and (4) supporting others. Some quantitative studies reported a positive effect of peer support on psychological health and other outcomes; however, this was not consistently confirmed. It was not possible to aggregate data across studies. No costing data were identified. Qualitative studies strongly suggest that parents perceive benefit from peer support programmes, an effect seen across different types of support and conditions. However, quantitative studies provide inconsistent evidence of positive effects. Further research should explore whether this dissonance is substantive or an artefact of how outcomes have been measured. © The Authors. Developmental Medicine & Child Neurology © 2013 Mac Keith Press.
Gentili, Alessandra; Caretti, Fulvia
2011-02-04
The feasibility of using reversed-phase liquid chromatography/diode array/tandem mass spectrometry (LC-DAD-MS/MS) for rapid and comprehensive profiling of fat-soluble vitamins and pigments in some foods of plant origin (maize flour, green and golden kiwi) was evaluated. The instrumental approach was planned to obtain two main outcomes within the same chromatographic run: (i) the quantitative analysis of ten target analytes, whose standards are commercially available; (ii) the screening of pigments occurring in the selected matrices. The quantitative analysis was performed simultaneously for four carotenoids (lutein, zeaxanthin, β-cryptoxanthin, and β-carotene) and six compounds with fat-soluble vitamin activity (α-tocopherol, δ-tocopherol, γ-tocopherol, ergocalciferol, phylloquinone and menaquinone-4), separated on a C30 reversed-phase column and detected by atmospheric pressure chemical ionization (APCI) tandem mass spectrometry operating in Selected Reaction Monitoring (SRM) mode. The extraction procedure was based on matrix solid-phase dispersion, with recoveries of all compounds under study exceeding 78% and 60% from maize flour and kiwi, respectively. The method's intra-day precision ranged between 3 and 7%, while the inter-day precision was below 12%. The mild isolation conditions precluded the creation of artefacts, such as cis-isomerization of carotenoids. During the quantitative LC-SRM determination of the ten target analytes, the identification power of the diode array detector, combined with that of the triple quadrupole (QqQ), allowed the tentative identification of several pigments (chlorophylls and carotenoids), without the aid of standards, on the basis of: (i) the UV-vis spectra recorded in the range of 200-700 nm; (ii) the expected retention times; (iii) the two SRM transitions, chosen for the target carotenoids but also common to many of the isomeric carotenoids occurring in the selected foods. Copyright © 2010 Elsevier B.V. All rights reserved.
Proteomic analysis of ligamentum flavum from patients with lumbar spinal stenosis.
Kamita, Masahiro; Mori, Taiki; Sakai, Yoshihito; Ito, Sadayuki; Gomi, Masahiro; Miyamoto, Yuko; Harada, Atsushi; Niida, Shumpei; Yamada, Tesshi; Watanabe, Ken; Ono, Masaya
2015-05-01
Lumbar spinal stenosis (LSS) is a syndromic degenerative spinal disease and is characterized by spinal canal narrowing with subsequent neural compression causing gait disturbances. Although LSS is a major age-related musculoskeletal disease that causes large decreases in the daily living activities of the elderly, its molecular pathology has not been investigated using proteomics. Thus, we used several proteomic technologies to analyze the ligamentum flavum (LF) of individuals with LSS. Using comprehensive proteomics with strong cation exchange fractionation, we detected 1288 proteins in these LF samples. A GO analysis of the comprehensive proteome revealed that more than 30% of the identified proteins were extracellular. Next, we used 2D image converted analysis of LC/MS to compare LF obtained from individuals with LSS to that obtained from individuals with disc herniation (nondegenerative control). We detected 64 781 MS peaks and identified 1675 differentially expressed peptides derived from 286 proteins. We verified four differentially expressed proteins (fibronectin, serine protease HTRA1, tenascin, and asporin) by quantitative proteomics using SRM/MRM. The present proteomic study is the first to identify proteins from degenerated and hypertrophied LF in LSS, which will help in studying LSS. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
HIV Rapid Testing in a VA Emergency Department Setting: Cost Analysis at 5 Years.
Knapp, Herschel; Chan, Kee
2015-07-01
To conduct a comprehensive cost-minimization analysis to comprehend the financial attributes of the first 5 years of an implementation wherein emergency department (ED) registered nurses administered HIV oral rapid tests to patients. A health science research implementation team coordinated with ED stakeholders and staff to provide training, implementation guidelines, and support to launch ED registered nurse-administered HIV oral rapid testing. Deidentified quantitative data were gathered from the electronic medical records detailing quarterly HIV rapid test rates in the ED setting spanning the first 5 years. Comprehensive cost analyses were conducted to evaluate the financial impact of this implementation. At 5 years, a total of 2,620 tests were conducted with a quarterly mean of 131 ± 81. Despite quarterly variability in testing rates, regression analysis revealed an average increase of 3.58 tests per quarter. Over the course of this implementation, Veterans Health Administration policy transitioned from written to verbal consent for HIV testing, serving to reduce the time and cost(s) associated with the testing process. Our data indicated salient health outcome benefits for patients with respect to the potential for earlier detection, and associated long-run cost savings. Copyright © 2015. Published by Elsevier Inc.
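For reference, the reported average increase of 3.58 tests per quarter corresponds to the slope of a simple linear trend fitted to the quarterly test counts; the sketch below uses invented counts to illustrate the calculation, not the study data.

```python
# Slope of a linear trend in quarterly HIV rapid-test counts; the counts are
# invented for illustration and do not reproduce the study data.
import numpy as np

quarters = np.arange(1, 21)                                  # first 20 quarters (5 years)
tests = 60 + 3.6 * quarters + np.random.default_rng(1).normal(0, 15, 20)
slope, intercept = np.polyfit(quarters, tests, 1)
print(f"average change: {slope:.2f} tests per quarter")
```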
Optimizing separations in online comprehensive two‐dimensional liquid chromatography
Gargano, Andrea F.G.; Schoenmakers, Peter J.
2017-01-01
Online comprehensive two‐dimensional liquid chromatography has become an attractive option for the analysis of complex nonvolatile samples found in various fields (e.g. environmental studies, food, life, and polymer sciences). Two‐dimensional liquid chromatography complements the highly popular hyphenated systems that combine liquid chromatography with mass spectrometry. Two‐dimensional liquid chromatography is also applied to the analysis of samples that are not compatible with mass spectrometry (e.g. high‐molecular‐weight polymers), providing important information on the distribution of the sample components along chemical dimensions (molecular weight, charge, lipophilicity, stereochemistry, etc.). Also, in comparison with conventional one‐dimensional liquid chromatography, two‐dimensional liquid chromatography provides a greater separation power (peak capacity). Because of the additional selectivity and higher peak capacity, the combination of two‐dimensional liquid chromatography with mass spectrometry allows for simpler mixtures of compounds to be introduced in the ion source at any given time, improving quantitative analysis by reducing matrix effects. In this review, we summarize the rationale and principles of two‐dimensional liquid chromatography experiments, describe advantages and disadvantages of combining different selectivities and discuss strategies to improve the quality of two‐dimensional liquid chromatography separations. PMID:29027363
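The separation-power argument rests on the multiplicative peak-capacity rule for comprehensive two-dimensional separations; in idealized form (fully orthogonal dimensions, no undersampling correction) it reads:

```latex
% Idealized peak capacity of an online comprehensive 2D-LC separation
n_{c,\mathrm{2D}} \;\approx\; n_{c,1} \times n_{c,2},
\qquad \text{e.g. } n_{c,1} = 100,\; n_{c,2} = 30 \;\Rightarrow\; n_{c,\mathrm{2D}} \approx 3000 .
```

In practice, undersampling of the first dimension and partial correlation between the two retention mechanisms reduce the effective peak capacity well below this product.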
ERIC Educational Resources Information Center
Berkeley, Sheri; Larsen, Anna
2018-01-01
Thirty years of intervention research on the effects of reading comprehension strategies for students with learning disabilities was reviewed in this quantitative synthesis. Specifically, researchers targeted studies that contained self-regulated learning components. A systematic search yielded 18 studies from nine research journals that met…
ERIC Educational Resources Information Center
Aladjem, Daniel K.; LeFloch, Kerstin Carlson; Zhang, Yu; Kurki, Anja; Boyle, Andrea; Taylor, James E.; Herrmann, Suzannah; Uekawa, Kazuaki; Thomsen, Kerri; Fashola, Olatokunbo
2006-01-01
The National Longitudinal Evaluation of Comprehensive School Reform (NLECSR) is a quantitative and qualitative study of behavior, decisions, processes, and outcomes. It employs a quasi-experimental design with matched treatment and comparison schools. NLECSR seeks to determine the effects of CSR models on student achievement in about 650…
ERIC Educational Resources Information Center
Tetlan, W. Lou
2009-01-01
This study examined whether the design of textbook material affects comprehension and memory of textbook material under certain cognitive conditions for proficient and remedial readers. Using quantitative and qualitative research methods, format was found to significantly affect comprehension and memory. Proficient Male scored significantly…
The Effects of Audiobooks on EFL Students' Listening Comprehension
ERIC Educational Resources Information Center
Kartal, Galip; Simsek, Harun
2017-01-01
This study investigates the effects of audiobooks on listening comprehension skills of EFL Students, and their attitudes towards using audiobooks in a foreign language classroom. The participants are 66 first-year students of a state university in Turkey. The research follows a pre- post-test control group research design using quantitative and…
Exploring Techniques for Vision Based Human Activity Recognition: Methods, Systems, and Evaluation
Xu, Xin; Tang, Jinshan; Zhang, Xiaolong; Liu, Xiaoming; Zhang, Hong; Qiu, Yimin
2013-01-01
With the wide applications of vision based intelligent systems, image and video analysis technologies have attracted the attention of researchers in the computer vision field. In image and video analysis, human activity recognition is an important research direction. By interpreting and understanding human activities, we can recognize and predict the occurrence of crimes and help the police or other agencies react immediately. In the past, a large number of papers have been published on human activity recognition in video and image sequences. In this paper, we provide a comprehensive survey of the recent development of the techniques, including methods, systems, and quantitative evaluation of the performance of human activity recognition. PMID:23353144
Find Pairs: The Module for Protein Quantification of the PeakQuant Software Suite
Eisenacher, Martin; Kohl, Michael; Wiese, Sebastian; Hebeler, Romano; Meyer, Helmut E.
2012-01-01
Accurate quantification of proteins is one of the major tasks in current proteomics research. To address this issue, a wide range of stable isotope labeling techniques have been developed, allowing one to quantitatively study thousands of proteins by means of mass spectrometry. In this article, the FindPairs module of the PeakQuant software suite is detailed. It facilitates the automatic determination of protein abundance ratios based on the automated analysis of stable isotope-coded mass spectrometric data. Furthermore, it implements statistical methods to determine outliers due to biological as well as technical variance of proteome data obtained in replicate experiments. This provides an important means to evaluate the significance of obtained protein expression data. For demonstrating the high applicability of FindPairs, we focused on the quantitative analysis of proteome data acquired in 14N/15N labeling experiments. We further provide a comprehensive overview of the features of the FindPairs software, and compare these with existing quantification packages. The software presented here supports a wide range of proteomics applications, allowing one to quantitatively assess data derived from different stable isotope labeling approaches, such as 14N/15N labeling, SILAC, and iTRAQ. The software is publicly available at http://www.medizinisches-proteom-center.de/software and free for academic use. PMID:22909347
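The core computation that FindPairs automates, forming abundance ratios from light/heavy peak-intensity pairs and flagging outlier ratios across replicates, can be pictured with the minimal sketch below; the intensities and the median/MAD outlier rule are illustrative assumptions, not the tool's actual algorithm.

```python
# Sketch of protein abundance ratios from 14N/15N peak-intensity pairs and a
# simple outlier flag across replicates; the values and the MAD rule are
# illustrative and do not reproduce the FindPairs implementation.
import numpy as np

light = np.array([1.2e6, 8.5e5, 2.3e6, 4.1e5])   # 14N peak intensities (replicates)
heavy = np.array([6.1e5, 4.0e5, 1.2e6, 4.2e5])   # 15N peak intensities (replicates)

log2_ratios = np.log2(light / heavy)
median = np.median(log2_ratios)
mad = np.median(np.abs(log2_ratios - median))
outliers = np.abs(log2_ratios - median) > 3 * mad

print("log2 ratios:", np.round(log2_ratios, 2))
print("outlier replicates:", np.where(outliers)[0])
```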
Presley, Tennille; Kuppusamy, Periannan; Zweier, Jay L; Ilangovan, Govindasamy
2006-12-15
Electron paramagnetic resonance (EPR) oximetry is widely used to measure the oxygen consumption of cells, mitochondria, and submitochondrial particles. However, further improvement of this technique, in terms of data analysis, is required to use it as a quantitative tool. Here, we present a new approach for quantitative analysis of cellular respiration using EPR oximetry. The course of oxygen consumption by cells in suspension has been observed to have three distinct zones: pO(2)-independent respiration at higher pO(2) ranges, pO(2)-dependent respiration at low pO(2) ranges, and a static equilibrium with no change in pO(2) at very low pO(2) values. The approach presented here enables one to comprehensively analyze all three zones together, where the progression of O(2) diffusion zones around each cell, their overlap over time, and their potential impact on the measured pO(2) data are considered. The obtained results agree with previously established methods such as high-resolution respirometry measurements. Additionally, it is also demonstrated how the diffusion limitations can depend on cell density and consumption rate. In conclusion, the new approach establishes a more accurate and meaningful model for evaluating EPR oximetry data on cellular respiration and quantifying the related parameters.
Becker, M; Zweckmair, T; Forneck, A; Rosenau, T; Potthast, A; Liebner, F
2013-03-15
Gas chromatographic analysis of complex carbohydrate mixtures requires highly effective and reliable derivatisation strategies for successful separation, identification, and quantitation of all constituents. Different single-step (per-trimethylsilylation, isopropylidenation) and two-step approaches (ethoximation-trimethylsilylation, ethoximation-trifluoroacetylation, benzoximation-trimethylsilylation, benzoximation-trifluoroacetylation) have been comprehensively studied with regard to chromatographic characteristics, informational value of mass spectra, ease of peak assignment, robustness toward matrix effects, and quantitation using a set of reference compounds that comprise eight monosaccharides (C(5)-C(6)), glycolaldehyde, and dihydroxyacetone. It has been shown that isopropylidenation and the two oximation-trifluoroacetylation approaches are least suitable for complex carbohydrate matrices. Whereas the former is limited to compounds that contain vicinal dihydroxy moieties in cis configuration, the latter two methods are sensitive to traces of trifluoroacetic acid which strongly supports decomposition of ketohexoses. It has been demonstrated for two "real" carbohydrate-rich matrices of biological and synthetic origin, respectively, that two-step ethoximation-trimethylsilylation is superior to other approaches due to the low number of peaks obtained per carbohydrate, good peak separation performance, structural information of mass spectra, low limits of detection and quantitation, minor relative standard deviations, and low sensitivity toward matrix effects. Copyright © 2013 Elsevier B.V. All rights reserved.
Byrnes, Michelle; Beilby, Janet; Ray, Patricia; McLennan, Renee; Ker, John; Schug, Stephan
2012-12-01
To evaluate the process and outcome of a multidisciplinary inpatient goal planning rehabilitation programme on physical, social and psychological functioning for patients with spinal cord injury. Clinical audit: quantitative and qualitative analyses. Specialist spinal injury unit, Perth, Australia. Consecutive series of 100 newly injured spinal cord injury inpatients. MAIN MEASURE(S): The Needs Assessment Checklist (NAC), patient-focused goal planning questionnaire and goal planning progress form. The clinical audit of 100 spinal cord injured patients revealed that 547 goal planning meetings were held with 8531 goals stipulated in total. Seventy-five per cent of the goals set at the first goal planning meeting were achieved by the second meeting and the rate of goal achievements at subsequent goal planning meetings dropped to 56%. Based on quantitative analysis of physical, social and psychological functioning, the 100 spinal cord injury patients improved significantly from baseline to discharge. Furthermore, qualitative analysis revealed benefits consistently reported by spinal cord injury patients of the goal planning rehabilitation programme in improvements to their physical, social and psychological adjustment to injury. The findings of this clinical audit underpin the need for patient-focused goal planning rehabilitation programmes which are tailored to the individual's needs and involve a comprehensive multidisciplinary team.
Julka, Samir; Cortes, Hernan; Harfmann, Robert; Bell, Bruce; Schweizer-Theobaldt, Andreas; Pursch, Matthias; Mondello, Luigi; Maynard, Shawn; West, David
2009-06-01
A comprehensive multidimensional liquid chromatography system coupled to Electrospray Ionization-Mass Spectrometry (LCxLC-ESI-MS) was developed for detailed characterization and quantitation of solid epoxy resin components. The two orthogonal modes of separation selected were size exclusion chromatography (SEC) in the first dimension and liquid chromatography at critical conditions (LCCC) in the second dimension. Different components present in the solid epoxy resins were separated and quantitated for the first time based on the functional groups and molecular weight heterogeneity. Coupling LCxLC separations with mass spectrometry enabled the identification of components resolved in the two-dimensional space. Several different functional group families of compounds were separated and identified, including epoxy-epoxy and epoxy-alpha-glycol functional oligomers, and their individual molecular weight ranges were determined. Repeatability obtained ranged from 0.5% for the main product to 21% for oligomers at the 0.4% concentration level.
Systemic Analysis Approaches for Air Transportation
NASA Technical Reports Server (NTRS)
Conway, Sheila
2005-01-01
Air transportation system designers have had only limited success using traditional operations research and parametric modeling approaches in their analyses of innovations. They need a systemic methodology for modeling safety-critical infrastructure that is comprehensive, objective, and sufficiently concrete, yet simple enough to be used with reasonable investment. The methodology must also be amenable to quantitative analysis so issues of system safety and stability can be rigorously addressed. However, air transportation has proven itself an extensive, complex system whose behavior is difficult to describe, let alone predict. There is a wide range of system analysis techniques available, but some are more appropriate for certain applications than others. Specifically in the area of complex system analysis, the literature suggests that both agent-based models and network analysis techniques may be useful. This paper discusses the theoretical basis for each approach in these applications, and explores their historic and potential further use for air transportation analysis.
Claus, Maren; Dychus, Nicole; Ebel, Melanie; Damaschke, Jürgen; Maydych, Viktoriya; Wolf, Oliver T; Kleinsorge, Thomas; Watzl, Carsten
2016-10-01
The immune system is essential to provide protection from infections and cancer. Disturbances in immune function can therefore directly affect the health of the affected individual. Many extrinsic and intrinsic factors such as exposure to chemicals, stress, nutrition and age have been reported to influence the immune system. These influences can affect various components of the immune system, and we are just beginning to understand the causalities of these changes. To investigate such disturbances, it is therefore essential to analyze the different components of the immune system in a comprehensive fashion. Here, we demonstrate such an approach which provides information about total number of leukocytes, detailed quantitative and qualitative changes in the composition of lymphocyte subsets, cytokine levels in serum and functional properties of T cells, NK cells and monocytes. Using samples from a cohort of 24 healthy volunteers, we demonstrate the feasibility of our approach to detect changes in immune functions.
Comprehensive assay of kinase catalytic activity reveals features of kinase inhibitor selectivity
Anastassiadis, Theonie; Deacon, Sean W.; Devarajan, Karthik; Ma, Haiching; Peterson, Jeffrey R.
2011-01-01
Small-molecule protein kinase inhibitors are central tools for elucidating cellular signaling pathways and are promising therapeutic agents. Due to evolutionary conservation of the ATP-binding site, most kinase inhibitors that target this site promiscuously inhibit multiple kinases. Interpretation of experiments utilizing these compounds is confounded by a lack of data on the comprehensive kinase selectivity of most inhibitors. Here we profiled the activity of 178 commercially available kinase inhibitors against a panel of 300 recombinant protein kinases using a functional assay. Quantitative analysis revealed complex and often unexpected kinase-inhibitor interactions, with a wide spectrum of promiscuity. Many off-target interactions occur with seemingly unrelated kinases, revealing how large-scale profiling can be used to identify multi-targeted inhibitors of specific, diverse kinases. The results have significant implications for drug development and provide a resource for selecting compounds to elucidate kinase function and for interpreting the results of experiments that use them. PMID:22037377
Weidner, Christopher; Fischer, Cornelius; Sauer, Sascha
2014-12-01
We introduce PHOXTRACK (PHOsphosite-X-TRacing Analysis of Causal Kinases), a user-friendly freely available software tool for analyzing large datasets of post-translational modifications of proteins, such as phosphorylation, which are commonly gained by mass spectrometry detection. In contrast to other currently applied data analysis approaches, PHOXTRACK uses full sets of quantitative proteomics data and applies non-parametric statistics to calculate whether defined kinase-specific sets of phosphosite sequences indicate statistically significant concordant differences between various biological conditions. PHOXTRACK is an efficient tool for extracting post-translational information of comprehensive proteomics datasets to decipher key regulatory proteins and to infer biologically relevant molecular pathways. PHOXTRACK will be maintained over the next years and is freely available as an online tool for non-commercial use at http://phoxtrack.molgen.mpg.de. Users will also find a tutorial at this Web site and can additionally give feedback at https://groups.google.com/d/forum/phoxtrack-discuss. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
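PHOXTRACK's underlying idea, testing whether phosphosites attributed to one kinase shift concordantly relative to all other quantified sites, resembles a rank-based two-sample comparison; the sketch below uses a Mann-Whitney U test on invented fold changes as an analogy and is not the tool's actual statistic.

```python
# Analogy to a kinase-set concordance test: compare fold changes of phosphosites
# attributed to one kinase against all remaining quantified sites with a
# rank-based test. Fold changes are invented; this is not PHOXTRACK's statistic.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(42)
kinase_set = rng.normal(loc=0.8, scale=0.5, size=25)    # log2 fold changes, kinase X substrates
background = rng.normal(loc=0.0, scale=0.5, size=500)   # all other quantified phosphosites

stat, p = mannwhitneyu(kinase_set, background, alternative="two-sided")
print(f"U = {stat:.0f}, p = {p:.2e}")
```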
Mondello, Luigi; Casillia, Alessandro; Tranchida, Peter Quinto; Dugo, Giovanni; Dugo, Paola
2005-03-04
Single column gas chromatography (GC) in combination with a flame ionization detector (FID) and/or a mass spectrometer is routinely employed in the determination of perfume profiles. The latter are to be considered medium to highly complex matrices and, as such, can only be partially separated even on long capillaries. Inevitably, several monodimensional peaks are the result of two or more overlapping components, often hindering reliable identification and quantitation. The present investigation is based on the use of a comprehensive GC (GC x GC) method, in vacuum outlet conditions, for the near to complete resolution of a complex perfume sample. A rapid scanning quadrupole mass spectrometry (qMS) system, employed for the assignment of GC x GC peaks, supplied high quality mass spectra. The validity of the three-dimensional (3D) GC x GC-qMS application was measured and compared to that of GC-qMS analysis on the same matrix. Peak identification, in all applications, was achieved through MS spectra library matching and the interactive use of linear retention indices (LRI).
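Peak assignment via linear retention indices relies on interpolation between the retention times of the n-alkanes bracketing the analyte; under temperature-programmed conditions, the linear (van den Dool and Kratz) form is commonly written as:

```latex
% Linear retention index of analyte x bracketed by n-alkanes of carbon numbers n and n+1
LRI_x = 100\,\Big[\, n + \frac{t_{R}(x) - t_{R}(n)}{t_{R}(n+1) - t_{R}(n)} \,\Big]
```

where t_R denotes retention time and n is the carbon number of the alkane eluting immediately before the analyte.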
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patriarca, Riccardo, E-mail: riccardo.patriarca@uniroma1.it; Di Gravio, Giulio; Costantino, Francesco
Environmental auditing is a main issue for any production plant, and assessing environmental performance is crucial to identify risk factors. The complexity of current plants arises from interactions among technological, human and organizational system components, which are often transient and not easily detectable. The auditing thus requires a systemic perspective, rather than a focus on individual behaviors, as has emerged in recent research in the safety domain for socio-technical systems. We explore the significance of modeling the interactions of system components in everyday work by applying a recent systemic method, the Functional Resonance Analysis Method (FRAM), in order to define the system structure dynamically. We also present an innovative evolution of traditional FRAM following a semi-quantitative approach based on Monte Carlo simulation. This paper represents the first contribution related to the application of FRAM in the environmental context, further extended with the Monte Carlo-based semi-quantitative approach. The case study of an environmental risk audit in a sinter plant validates the research, showing the benefits in terms of identifying potential critical activities, related mitigating actions and comprehensive environmental monitoring indicators. - Highlights: • We discuss the relevance of a systemic risk based environmental audit. • We present FRAM to represent functional interactions of the system. • We develop a semi-quantitative FRAM framework to assess environmental risks. • We apply the semi-quantitative FRAM framework to build a model for a sinter plant.
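The semi-quantitative extension can be pictured as sampling variability levels for upstream functions and propagating them to a downstream function over many Monte Carlo trials; the three-level variability scale, coupling weights, function names, and aggregation threshold in the sketch below are invented assumptions and do not reproduce the authors' FRAM formulation.

```python
# Toy Monte Carlo propagation of function variability, loosely inspired by a
# semi-quantitative FRAM: variability levels, coupling weights, function names,
# and the aggregation/threshold rule are invented, not the paper's model.
import numpy as np

rng = np.random.default_rng(7)
N = 100_000

# Probability of each upstream function exhibiting low/medium/high variability (scores 0/1/2).
upstream_profiles = {
    "sinter feed preparation": [0.7, 0.25, 0.05],
    "off-gas treatment":       [0.6, 0.30, 0.10],
    "emission monitoring":     [0.8, 0.15, 0.05],
}
weights = {"sinter feed preparation": 0.5, "off-gas treatment": 0.3, "emission monitoring": 0.2}

total = np.zeros(N)
for name, probs in upstream_profiles.items():
    scores = rng.choice([0, 1, 2], size=N, p=probs)
    total += weights[name] * scores

# Downstream "environmental release" function flagged critical when the weighted
# upstream variability exceeds an arbitrary threshold.
critical = (total >= 1.0).mean()
print(f"estimated probability of critical variability: {critical:.3f}")
```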
Lerch, Oliver; Temme, Oliver; Daldrup, Thomas
2014-07-01
The analysis of opioids, cocaine, and their metabolites in blood serum is a routine task in forensic laboratories. Commonly, the employed methods include many manual or partly automated steps such as protein precipitation, dilution, solid phase extraction, evaporation, and derivatization preceding a gas chromatography (GC)/mass spectrometry (MS) or liquid chromatography (LC)/MS analysis. In this study, a comprehensively automated method was developed from a validated, partly automated routine method. This was possible by replicating the method parameters on the automated system; only marginal optimization of parameters was necessary. The automation, relying on an x-y-z robot after manual protein precipitation, includes solid phase extraction, evaporation of the eluate, derivatization (silylation with N-methyl-N-trimethylsilyltrifluoroacetamide, MSTFA), and injection into a GC/MS. A quantitative analysis of almost 170 authentic serum samples and more than 50 authentic samples of other matrices such as urine, different tissues, and heart blood for cocaine, benzoylecgonine, methadone, morphine, codeine, 6-monoacetylmorphine, dihydrocodeine, and 7-aminoflunitrazepam was conducted with both methods, proving that the analytical results are equivalent even near the limits of quantification (low ng/ml range). To the best of our knowledge, this application is the first reported in the literature employing this sample preparation system.
Comprehensive analysis of mouse retinal mononuclear phagocytes.
Lückoff, Anika; Scholz, Rebecca; Sennlaub, Florian; Xu, Heping; Langmann, Thomas
2017-06-01
The innate immune system is activated in a number of degenerative and inflammatory retinal disorders such as age-related macular degeneration (AMD). Retinal microglia, choroidal macrophages, and recruited monocytes, collectively termed 'retinal mononuclear phagocytes', are critical determinants of ocular disease outcome. Many publications have described the presence of these cells in mouse models for retinal disease; however, only limited aspects of their behavior have been uncovered, and these have only been uncovered using a single detection method. The workflow presented here describes a comprehensive analysis strategy that allows characterization of retinal mononuclear phagocytes in vivo and in situ. We present standardized working steps for scanning laser ophthalmoscopy of microglia from MacGreen reporter mice (mice expressing the macrophage colony-stimulating factor receptor GFP transgene throughout the mononuclear phagocyte system), quantitative analysis of Iba1-stained retinal sections and flat mounts, CD11b-based retinal flow cytometry, and qRT-PCR analysis of key microglia markers. The protocol can be completed within 3 d, and we present data from retinas treated with laser-induced choroidal neovascularization (CNV), bright white-light exposure, and Fam161a-associated inherited retinal degeneration. The assays can be applied to any of the existing mouse models for retinal disorders and may be valuable for documenting immune responses in studies for immunomodulatory therapies.
Lee, Jaime B; Sohlberg, McKay Moore; Harn, Beth; Horner, Robert; Cherney, Leora R
2018-06-04
People with aphasia frequently present with nonlinguistic deficits, in addition to their compromised language abilities, which may contribute to their problems with reading comprehension. Treatment of attention, working memory and executive control may improve reading comprehension in individuals with aphasia, particularly those with mild reading problems. This single-case experimental design study evaluated the efficacy of Attention Process Training-3, an intervention combining direct attention training and metacognitive facilitation, for improving reading comprehension in individuals with mild aphasia. A multiple baseline design across six participants was used to evaluate treatment effects. The primary outcome measure was a maze reading task. Cognitive measures were administered pre- and post-treatment. Visual inspection of graphed maze reading performance data indicated a basic effect between APT-3 and improved maze reading for three of the six participants. Quantitative analyses, using Tau-U, corroborated findings identified through visual analysis. The overall effect size was significant (Tau = .48, p = .01). Results suggest that APT-3 has the potential to improve reading in individuals with aphasia, but that it may be more efficacious under certain conditions. Treatment and participant variables, including intensity of treatment and metacognitive strategy usage, are discussed as potential influences on participants' responsiveness to APT-3.
ERIC Educational Resources Information Center
Pilten, Gulhiz
2016-01-01
The purpose of the present research is investigating the effects of reciprocal teaching in comprehending expository texts. The research was designed with mixed method. The quantitative dimension of the present research was designed in accordance with pre-test-post-test control group experiment model. The quantitative dimension of the present…
Megger, Dominik A.; Philipp, Jos; Le-Trilling, Vu Thuy Khanh; Sitek, Barbara; Trilling, Mirko
2017-01-01
Interferons (IFNs) are pleiotropic cytokines secreted upon encounter of pathogens and tumors. Applying their antipathogenic, antiproliferative, and immune stimulatory capacities, recombinant IFNs are frequently prescribed as drugs to treat different diseases. IFNs act by changing the gene expression profile of cells. Due to characteristics such as rapid gene induction and signaling, IFNs also represent prototypical model systems for various aspects of biomedical research (e.g., signal transduction). In regard to the signaling and activated promoters, IFNs can be subdivided into two groups. Here, alterations of the cellular proteome of human cells treated with IFNα and IFNγ were elucidated in a time-resolved manner by quantitative proteome analysis. The majority of protein regulations were strongly IFN type and time dependent. In addition to the expected upregulation of IFN-responsive proteins, an astonishing number of proteins became profoundly repressed especially by IFNγ. Thus, our comprehensive analysis revealed important insights into the human IFN-regulated proteome and its dynamics of protein induction and repression. Interestingly, the new class of IFN-repressed genes comprises known host factors for highly relevant pathogens such as HIV, dengue virus, and hepatitis C virus. PMID:28959263
Quantitative high-performance liquid chromatography of nucleosides in biological materials.
Gehrke, C W; Kuo, K C; Davis, G E; Suits, R D; Waalkes, T P; Borek, E
1978-03-21
A rigorous, comprehensive, and reliable reversed-phase high-performance liquid chromatographic (HPLC) method has been developed for the analysis of ribonucleosides in urine (psi, m1A, m1I, m2G, A, m2(2)G). An initial isolation of ribonucleosides with an affinity gel containing an immobilized phenylboronic acid was used to improve selectivity and sensitivity. Response for all nucleosides was linear from 0.1 to 50 nmoles injected and good quantitation was obtained for 25 microliter or less of sample placed on the HPLC column. Excellent precision of analysis for urinary nucleosides was achieved on matrix dependent and independent samples, and the high resolution of the reversed-phase column allowed the complete separation of 9 nucleosides from other unidentified UV absorbing components at the 1-ng level. Supporting experimental data are presented on precision, recovery, chromatographic methods, minimum detection limit, retention time, relative molar response, sample clean-up, stability of nucleosides, boronate gel capacity, and application to analysis of urine from patients with leukemia and breast cancer. This method is now being used routinely for the determination of the concentration and ratios of nucleosides in urine from patients with different types of cancer and in chemotherapy response studies.
NASA Astrophysics Data System (ADS)
Peng, Tiefeng; Liu, Bin; Gao, Xuechao; Luo, Liqun; Sun, Hongjuan
2018-06-01
Expandable graphite is widely used as a new functional carbon material, especially as a fire retardant; however, its practical application is limited due to the high expansion temperature. In this work, the preparation process of low-temperature, highly expandable graphite was studied, using natural flake graphite as the raw material and KMnO4/HClO4/NH4NO3 as oxidative intercalation agents. The structure, morphology, functional groups and thermal properties were characterized during the expansion process by Fourier transform infrared spectroscopy (FTIR), Raman spectroscopy, thermogravimetry-differential scanning calorimetry (TG-DSC), X-ray diffraction (XRD), and scanning electron microscopy (SEM). The analysis showed that by oxidative intercalation, some oxygen-containing groups were grafted onto the edges and within the graphite layers. The intercalation reagent entered the graphite layers to increase the interlayer spacing. After expansion, the original flaky expandable graphite was completely transformed into worm-like expanded graphite. The order of the graphite intercalation compounds (GICs) was proposed and determined to be 3 for the prepared expandable graphite, based on quantitative XRD peak analysis. Meanwhile, the detailed intercalation mechanisms were also proposed. The comprehensive investigation provides a benchmark for the industrial application of such sulfur-free expanded graphite.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pierce, Karisa M.; Wood, Lianna F.; Wright, Bob W.
2005-12-01
A comprehensive two-dimensional (2D) retention time alignment algorithm was developed using a novel indexing scheme. The algorithm is termed comprehensive because it functions to correct the entire chromatogram in both dimensions and it preserves the separation information in both dimensions. Although the algorithm is demonstrated by correcting comprehensive two-dimensional gas chromatography (GC x GC) data, the algorithm is designed to correct shifting in all forms of 2D separations, such as LC x LC, LC x CE, CE x CE, and LC x GC. This 2D alignment algorithm was applied to three different data sets composed of replicate GC x GC separations of (1) three 22-component control mixtures, (2) three gasoline samples, and (3) three diesel samples. The three data sets were collected using slightly different temperature or pressure programs to engender significant retention time shifting in the raw data and then demonstrate subsequent corrections of that shifting upon comprehensive 2D alignment of the data sets. Thirty 12-min GC x GC separations from three 22-component control mixtures were used to evaluate the 2D alignment performance (10 runs/mixture). The average standard deviation of the first column retention time improved 5-fold from 0.020 min (before alignment) to 0.004 min (after alignment). Concurrently, the average standard deviation of second column retention time improved 4-fold from 3.5 ms (before alignment) to 0.8 ms (after alignment). Alignment of the 30 control mixture chromatograms took 20 min. The quantitative integrity of the GC x GC data following 2D alignment was also investigated. The mean integrated signal was determined for all components in the three 22-component mixtures for all 30 replicates. The average percent difference in the integrated signal for each component before and after alignment was 2.6%. Singular value decomposition (SVD) was applied to the 22-component control mixture data before and after alignment to show the restoration of trilinearity to the data, since trilinearity benefits chemometric analysis. By applying comprehensive 2D retention time alignment to all three data sets (control mixtures, gasoline samples, and diesel samples), classification by principal component analysis (PCA) substantially improved, resulting in 100% accurate scores clustering.
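The alignment itself is algorithmically involved, but the evaluation metric reported above (per-component standard deviation of retention times across replicate runs) is straightforward to illustrate. The sketch below uses synthetic retention-time data whose jitter mirrors the reported before/after values; it is not the authors' indexing algorithm.

```python
# Illustrative sketch, not the authors' algorithm: evaluating 2D alignment as the
# abstract describes, i.e., per-component standard deviations of first-dimension
# retention times across replicate GC x GC runs. The retention-time matrices are
# synthetic (runs x components), with jitter standing in for run-to-run shifts.
import numpy as np

def mean_retention_sd(rt_matrix):
    """Average, over components, of the SD of retention time across replicate runs."""
    return np.std(rt_matrix, axis=0, ddof=1).mean()

rng = np.random.default_rng(0)
true_rt1 = np.linspace(2.0, 10.0, 22)                        # 22 components, minutes
rt1_before = true_rt1 + rng.normal(0, 0.020, size=(10, 22))  # 10 shifted replicates
rt1_after = true_rt1 + rng.normal(0, 0.004, size=(10, 22))   # after alignment

print(f"mean SD before alignment: {mean_retention_sd(rt1_before):.3f} min")
print(f"mean SD after alignment:  {mean_retention_sd(rt1_after):.3f} min")
```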
ERIC Educational Resources Information Center
Sondergeld, Toni A.; Koskey, Kristin L. K.
2011-01-01
An abundance of comprehensive school reform (CSR) literature exists illustrating CSRs are effective in improving student outcomes. However, much of this research reports on top-down reforms, focuses on academic outcomes, and uses quantitative methods alone. Many educational researchers have argued for the use of mixed methods for providing a…
A model of comprehensive unification
NASA Astrophysics Data System (ADS)
Reig, Mario; Valle, José W. F.; Vaquera-Araujo, C. A.; Wilczek, Frank
2017-11-01
Comprehensive - that is, gauge and family - unification using spinors has many attractive features, but it has been challenged to explain chirality. Here, by combining an orbifold construction with more traditional ideas, we address that difficulty. Our candidate model features three chiral families and leads to an acceptable result for quantitative unification of couplings. A potential target for accelerator and astronomical searches emerges.
ERIC Educational Resources Information Center
Foley, Laura S.
2011-01-01
This research investigated factors that influence the implementation levels of evidence-based comprehension strategy instruction (CSI) among K-3 teachers. An explanatory design was chosen to gather and probe the data. Quantitative data were gathered via a mailed survey distributed through a representative sample of the 40 school districts (through…
Evaluating Comprehensive School Reform Models at Scale: Focus on Implementation
ERIC Educational Resources Information Center
Vernez, Georges; Karam, Rita; Mariano, Louis T.; DeMartini, Christine
2006-01-01
This study was designed to fill the "implementation measurement" gap. A methodology to quantitatively measure the level of Comprehensive School Reform (CSR) implementation that can be used across a variety of CSR models was developed, and then applied to measure actual implementation of four different CSR models in a large number of schools. The…
A Case Study of the Impact of Guided Reading Groups in Second Grade on Comprehension Improvement
ERIC Educational Resources Information Center
Lorent Deegan, Chanin E.
2010-01-01
This study combined both qualitative and quantitative research to determine the impact of instructional practices on comprehension improvement in second grade Guided Reading groups. Four second grade teachers and their 73 students ages seven through eight years old participated in this study. Additionally, the study examined the effects of Guided…
Does mClass Reading 3D Predict Student Reading Proficiency on High-Stakes Assessments?
ERIC Educational Resources Information Center
Bowles, Amy S.
2015-01-01
This quantitative, correlational study investigated the relationship between the North Carolina End of Grade Assessment of Reading Comprehension (NCEOG) and mClass Reading 3D assessment in a North Carolina elementary school. It especially examined the degree to which mClass Reading 3D measures predict scores on the reading comprehension portion of…
Wu, Xu; Zhu, Lin; Ma, Jiang; Ye, Yang; Lin, Ge
2017-10-25
Polyoxypregnane and its glycosides (POPs) are frequently present in plants of the Asclepiadaceae family, and have a variety of biological activities. There is a great need to comprehensively profile these phytochemicals and to quantify them for monitoring their contents in herbs and biological samples. However, POPs undergo extensive adduct ion formation in ESI-MS, which has posed a challenge for qualitative and quantitative analysis of POPs. In the present study, we took advantage of such extensive adduct ion formation to investigate the suitability of adduct ion-targeted analysis of POPs. For the qualitative analysis, we first demonstrated that the sodium and ammonium adduct ion-targeted product ion scans (PIS) provided adequate MS/MS fragmentations for structural characterization of POPs. Aided with precursor ion (PI) scans, which showed high selectivity and sensitivity and improved peak assignment confidence in conjunction with full scan (FS), the informative adduct ion-targeted PIS enabled rapid POPs profiling. For the quantification, we used formic acid rather than ammonium acetate as an additive in the mobile phase to avoid simultaneous formation of sodium and ammonium adduct ions, and greatly improved reproducibility of MS response of POPs. By monitoring the solely formed sodium adduct ions [M+Na]+, a method for simultaneous quantification of 25 POPs in the dynamic multiple reaction monitoring mode was then developed and validated. Finally, the aforementioned methods were applied to qualitative and quantitative analysis of POPs in the extract of a traditional Chinese medicinal herb, Marsdenia tenacissima (Roxb.) Wight et Arn., and in the plasma obtained from the rats treated with this herb. The results demonstrated that adduct ion formation could be optimized for the qualitative and quantitative analysis of POPs, and our developed PI/FS-PIS scanning and sole [M+Na]+ ion monitoring significantly improved the analysis of POPs in both herbal and biological samples. This study also provides implications for the analysis of other compounds which undergo extensive adduct ion formation in ESI-MS. Copyright © 2017 Elsevier B.V. All rights reserved.
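A small, hedged illustration of the adduct-targeting arithmetic: when monitoring [M+Na]+ (or profiling on [M+NH4]+), the precursor m/z follows directly from the neutral monoisotopic mass. The neutral mass below is hypothetical, not a specific POP from the study.

```python
# Hedged illustration: computing the singly charged adduct ions an ESI-MS method
# of this kind would target. Cation masses are standard monoisotopic values; the
# example neutral mass is hypothetical.
PROTON = 1.007276
NA = 22.989218       # Na+ (sodium cation, monoisotopic)
NH4 = 18.033823      # NH4+ (ammonium cation, monoisotopic)

def adduct_mz(neutral_mass, cation_mass, charge=1):
    """m/z of [M + cation]^charge+ for a neutral monoisotopic mass M."""
    return (neutral_mass + charge * cation_mass) / charge

M = 912.4820  # hypothetical neutral monoisotopic mass of a polyoxypregnane glycoside
print(f"[M+H]+   = {adduct_mz(M, PROTON):.4f}")
print(f"[M+Na]+  = {adduct_mz(M, NA):.4f}")
print(f"[M+NH4]+ = {adduct_mz(M, NH4):.4f}")
```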
Kudo, Toru; Sasaki, Yohei; Terashima, Shin; Matsuda-Imai, Noriko; Takano, Tomoyuki; Saito, Misa; Kanno, Maasa; Ozaki, Soichi; Suwabe, Keita; Suzuki, Go; Watanabe, Masao; Matsuoka, Makoto; Takayama, Seiji; Yano, Kentaro
2016-10-13
In quantitative gene expression analysis, normalization using a reference gene as an internal control is frequently performed for appropriate interpretation of the results. Efforts have been devoted to exploring superior novel reference genes using microarray transcriptomic data and to evaluating commonly used reference genes by targeting analysis. However, because the number of specifically detectable genes is totally dependent on probe design in the microarray analysis, exploration using microarray data may miss some of the best choices for the reference genes. Recently emerging RNA sequencing (RNA-seq) provides an ideal resource for comprehensive exploration of reference genes since this method is capable of detecting all expressed genes, in principle including even unknown genes. We report the results of a comprehensive exploration of reference genes using public RNA-seq data from plants such as Arabidopsis thaliana (Arabidopsis), Glycine max (soybean), Solanum lycopersicum (tomato) and Oryza sativa (rice). To select reference genes suitable for the broadest experimental conditions possible, candidates were surveyed by the following four steps: (1) evaluation of the basal expression level of each gene in each experiment; (2) evaluation of the expression stability of each gene in each experiment; (3) evaluation of the expression stability of each gene across the experiments; and (4) selection of top-ranked genes, after ranking according to the number of experiments in which the gene was expressed stably. Employing this procedure, 13, 10, 12 and 21 top candidates for reference genes were proposed in Arabidopsis, soybean, tomato and rice, respectively. Microarray expression data confirmed that the expression of the proposed reference genes under broad experimental conditions was more stable than that of commonly used reference genes. These novel reference genes will be useful for analyzing gene expression profiles across experiments carried out under various experimental conditions.
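The four-step screen lends itself to a compact implementation. The sketch below is a simplified stand-in, assuming one TPM-like expression matrix per experiment and using the coefficient of variation as the stability metric; the thresholds are assumptions for illustration, not the authors' criteria.

```python
# Simplified sketch of the four-step reference-gene screen described above.
# Input: one expression matrix (genes x samples) per experiment, e.g. TPM values.
import numpy as np

def stable_in_experiment(expr, min_expr=1.0, max_cv=0.25):
    """Steps 1-2: a gene is 'stably expressed' if its mean expression exceeds a
    floor and its CV (SD/mean) across samples stays below a ceiling."""
    mean = expr.mean(axis=1)
    cv = expr.std(axis=1, ddof=1) / np.where(mean > 0, mean, np.nan)
    return (mean >= min_expr) & (cv <= max_cv)

def rank_reference_candidates(experiments):
    """Steps 3-4: count, per gene, how many experiments call it stable, then rank."""
    counts = sum(stable_in_experiment(expr).astype(int) for expr in experiments)
    return np.argsort(counts)[::-1], counts

rng = np.random.default_rng(1)
experiments = [np.abs(rng.normal(10, 2, size=(50, 6))) for _ in range(4)]  # toy data
order, counts = rank_reference_candidates(experiments)
print("top candidate gene indices:", order[:5], "stable in", counts[order[:5]], "experiments")
```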
Comprehensive Analysis of the Gas- and Particle-Phase Products of VOC Oxidation
NASA Astrophysics Data System (ADS)
Bakker-Arkema, J.; Ziemann, P. J.
2017-12-01
Controlled environmental chamber studies are important for determining atmospheric reaction mechanisms and gas and aerosol products formed in the oxidation of volatile organic compounds (VOCs). Such information is necessary for developing detailed chemical models for use in predicting the atmospheric fate of VOCs and also secondary organic aerosol (SOA) formation. However, complete characterization of atmospheric oxidation reactions, including gas- and particle-phase product yields and reaction branching ratios, is difficult to achieve. In this work, we investigated the reactions of terminal and internal alkenes with OH radicals in the presence of NOx in an attempt to fully characterize the chemistry of these systems while minimizing and accounting for the inherent uncertainties associated with environmental chamber experiments. Gas-phase products (aldehydes formed by alkoxy radical decomposition) and particle-phase products (alkyl nitrates, β-hydroxynitrates, dihydroxynitrates, 1,4-hydroxynitrates, 1,4-hydroxycarbonyls, and dihydroxycarbonyls) formed through pathways involving addition of OH to the C=C double bond as well as H-atom abstraction were identified and quantified using a suite of analytical techniques. Particle-phase products were analyzed in real time with a thermal desorption particle beam mass spectrometer, and off-line by collection onto filters, extraction, and subsequent analysis of functional groups by derivatization-spectrophotometric methods developed in our lab. Derivatized products were also separated by liquid chromatography for molecular quantitation by UV absorbance and identification using chemical ionization-ion trap mass spectrometry. Gas-phase aldehydes were analyzed off-line by collection onto Tenax and a 5-channel denuder with subsequent analysis by gas chromatography, or by collection onto DNPH-coated cartridges and subsequent analysis by liquid chromatography. The full product identification and quantitation, with careful minimization of uncertainties for the various components of the experiment and analyses, demonstrates our capability to comprehensively and accurately analyze the complex chemical composition of products formed in the oxidation of organic compounds in laboratory chamber studies.
Ehling, Stefan; Reddy, Todime M
2014-02-19
A simple, rugged, quantitative, and confirmatory method based on liquid chromatography-mass spectrometry was developed and comprehensively validated for the analysis of the leucine metabolites β-hydroxy-β-methylbutyric acid (HMB) and α-hydroxyisocaproic acid (HICA) in bovine whole milk and yogurt. Mean accuracy (90-110% for HMB and 85-115% for HICA) and total precision (<10% RSD in most cases, except for <20% RSD for HMB at the limit of quantitation) at four concentration levels across three validation runs have been determined. Limits of quantitation for HMB and HICA in whole milk were 20 and 5 μg/L, respectively. Measured concentrations of HMB and HICA were <20-29 and 32-37 μg/L, respectively, in bovine whole milk and <5 and 3.0-15.2 mg/L, respectively, in yogurt. These concentrations are insufficient by large margins to deliver any musculoskeletal benefits, and fortification of milk and dairy products with HMB and/or HICA appears to be justified.
Reference condition approach to restoration planning
Nestler, J.M.; Theiling, C.H.; Lubinski, S.J.; Smith, D.L.
2010-01-01
Ecosystem restoration planning requires quantitative rigor to evaluate alternatives, define end states, report progress and perform environmental benefits analysis (EBA). Unfortunately, existing planning frameworks are, at best, semi-quantitative. In this paper, we: (1) describe a quantitative restoration planning approach based on a comprehensive, but simple mathematical framework that can be used to effectively apply knowledge and evaluate alternatives, (2) use the approach to derive a simple but precisely defined lexicon based on the reference condition concept and allied terms and (3) illustrate the approach with an example from the Upper Mississippi River System (UMRS) using hydrologic indicators. The approach supports the development of a scalable restoration strategy that, in theory, can be expanded to ecosystem characteristics such as hydraulics, geomorphology, habitat and biodiversity. We identify three reference condition types: best achievable condition (ABAC), measured magnitude (MMi, which can be determined at one or many times and places), and desired future condition (ADFC). When used with the mathematical framework, these provide a complete system of accounts useful for goal-oriented system-level management and restoration. Published in 2010 by John Wiley & Sons, Ltd.
Kazarian, Artaches A; Taylor, Mark R; Haddad, Paul R; Nesterenko, Pavel N; Paull, Brett
2013-12-01
The comprehensive separation and detection of hydrophobic and hydrophilic active pharmaceutical ingredients (APIs), their counter-ions (organic, inorganic) and excipients, using a single mixed-mode chromatographic column and a dual-injection approach, is presented. Using a mixed-mode Thermo Fisher Acclaim Trinity P1 column, APIs, their counter-ions and possible degradants were first separated using a combination of anion-exchange, cation-exchange and hydrophobic interactions, using a mobile phase consisting of a dual organic modifier/salt concentration gradient. A complementary method was also developed using the same column for the separation of hydrophilic bulk excipients, using hydrophilic interaction liquid chromatography (HILIC) under high organic solvent mobile phase conditions. These two methods were then combined within a single gradient run using dual sample injection, with the first injection at the start of the applied gradient (mixed-mode retention of solutes), followed by a second sample injection at the end of the gradient (HILIC retention of solutes). Detection using both ultraviolet absorbance and refractive index enabled the sensitive detection of APIs and UV-absorbing counter-ions, together with quantitative determination of bulk excipients. The developed approach was applied successfully to the analysis of dry powder inhalers (Flixotide®, Spiriva®), enabling comprehensive quantification of all APIs and excipients in the sample. Copyright © 2013 Elsevier B.V. All rights reserved.
Liu, Xuejiao; Zhang, Dongdong; Liu, Yu; Sun, Xizhuo; Han, Chengyi; Wang, Bingyuan; Ren, Yongcheng; Zhou, Junmei; Zhao, Yang; Shi, Yuanyuan; Hu, Dongsheng; Zhang, Ming
2017-05-01
Despite the inverse association between physical activity (PA) and incident hypertension, a comprehensive assessment of the quantitative dose-response association between PA and hypertension has not been reported. We performed a meta-analysis, including dose-response analysis, to quantitatively evaluate this association. We searched PubMed and Embase databases for articles published up to November 1, 2016. Random effects generalized least squares regression models were used to assess the quantitative association between PA and hypertension risk across studies. Restricted cubic splines were used to model the dose-response association. We identified 22 articles (29 studies) investigating the risk of hypertension with leisure-time PA or total PA, including 330 222 individuals and 67 698 incident cases of hypertension. The risk of hypertension was reduced by 6% (relative risk, 0.94; 95% confidence interval, 0.92-0.96) with each 10 metabolic equivalent of task (MET) h/wk increment of leisure-time PA. We found no evidence of a nonlinear dose-response association of PA and hypertension (P for nonlinearity = 0.094 for leisure-time PA and 0.771 for total PA). With the linear cubic spline model, when compared with inactive individuals, for those who met the guideline-recommended minimum level of moderate PA (10 MET h/wk), the risk of hypertension was reduced by 6% (relative risk, 0.94; 95% confidence interval, 0.92-0.97). This meta-analysis suggests that additional benefits for hypertension prevention occur as the amount of PA increases. © 2017 American Heart Association, Inc.
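Under the log-linear dose-response reported above, the summary estimate (RR 0.94 per 10 MET-h/wk) can be extrapolated multiplicatively to other activity levels. The sketch below only restates that arithmetic; it is not the meta-analytic model itself.

```python
# Illustration of what the reported linear (log-linear in RR) dose-response implies:
# RR = 0.94 per 10 MET-h/week of leisure-time PA, extrapolated multiplicatively.
# This reuses only the summary estimate quoted above.
import math

RR_PER_10_MET_H = 0.94  # relative risk of hypertension per 10 MET-h/week increment

def relative_risk(met_h_per_week):
    """RR vs. an inactive reference, assuming a log-linear dose-response."""
    return math.exp(math.log(RR_PER_10_MET_H) * met_h_per_week / 10.0)

for dose in (10, 20, 30, 50):
    print(f"{dose:>2} MET-h/week -> RR = {relative_risk(dose):.3f}")
```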
Hessling, Bernd; Büttner, Knut; Hecker, Michael; Becher, Dörte
2013-01-01
Quantitative LC-MALDI is an underrepresented method, especially in large-scale experiments. The additional fractionation step that is needed for most MALDI-TOF-TOF instruments, the comparatively long analysis time, and the very limited number of established software tools for the data analysis render LC-MALDI a niche application for large quantitative analyses beside the widespread LC–electrospray ionization workflows. Here, we used LC-MALDI in a relative quantification analysis of Staphylococcus aureus for the first time on a proteome-wide scale. Samples were analyzed in parallel with an LTQ-Orbitrap, which allowed cross-validation with a well-established workflow. With nearly 850 proteins identified in the cytosolic fraction and quantitative data for more than 550 proteins obtained with the MASCOT Distiller software, we were able to prove that LC-MALDI is able to process highly complex samples. The good correlation of quantities determined via this method and the LTQ-Orbitrap workflow confirmed the high reliability of our LC-MALDI approach for global quantification analysis. Because the existing literature reports differences for MALDI and electrospray ionization preferences and the respective experimental work was limited by technical or methodological constraints, we systematically compared biochemical attributes of peptides identified with either instrument. This genome-wide, comprehensive study revealed biases toward certain peptide properties for both MALDI-TOF-TOF- and LTQ-Orbitrap-based approaches. These biases are based on almost 13,000 peptides and result in a general complementarity of the two approaches that should be exploited in future experiments. PMID:23788530
Dataglove measurement of joint angles in sign language handshapes
Eccarius, Petra; Bour, Rebecca; Scheidt, Robert A.
2012-01-01
In sign language research, we understand little about articulatory factors involved in shaping phonemic boundaries or the amount (and articulatory nature) of acceptable phonetic variation between handshapes. To date, there exists no comprehensive analysis of handshape based on the quantitative measurement of joint angles during sign production. The purpose of our work is to develop a methodology for collecting and visualizing quantitative handshape data in an attempt to better understand how handshapes are produced at a phonetic level. In this pursuit, we seek to quantify the flexion and abduction angles of the finger joints using a commercial data glove (CyberGlove; Immersion Inc.). We present calibration procedures used to convert raw glove signals into joint angles. We then implement those procedures and evaluate their ability to accurately predict joint angle. Finally, we provide examples of how our recording techniques might inform current research questions. PMID:23997644
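Converting raw glove signals into joint angles usually reduces, per sensor, to fitting a mapping between known calibration postures and sensor output. A minimal sketch, assuming a linear two-posture calibration (flat hand at 0°, a fixed-angle block at 90°) with hypothetical sensor counts:

```python
# Minimal sketch of the kind of per-sensor calibration described above: mapping a
# raw glove signal to a joint angle with a two-posture linear fit. The calibration
# angles and raw readings are hypothetical.
def linear_calibration(raw_flat, raw_bent, angle_flat=0.0, angle_bent=90.0):
    """Return a function converting raw sensor counts to degrees, assuming the
    sensor responds linearly between the two calibration postures."""
    gain = (angle_bent - angle_flat) / (raw_bent - raw_flat)
    offset = angle_flat - gain * raw_flat
    return lambda raw: gain * raw + offset

# e.g. an index-finger flexion sensor: 120 counts with the hand flat, 210 at 90 deg
to_degrees = linear_calibration(raw_flat=120, raw_bent=210)
print(f"raw 160 counts -> {to_degrees(160):.1f} deg flexion")
```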
Westergaard, David; Stærfeldt, Hans-Henrik; Tønsberg, Christian; Jensen, Lars Juhl; Brunak, Søren
2018-02-01
Across academia and industry, text mining has become a popular strategy for keeping up with the rapid growth of the scientific literature. Text mining of the scientific literature has mostly been carried out on collections of abstracts, due to their availability. Here we present an analysis of 15 million English scientific full-text articles published during the period 1823-2016. We describe the development in article length and publication sub-topics during these nearly 250 years. We showcase the potential of text mining by extracting published protein-protein, disease-gene, and protein subcellular associations using a named entity recognition system, and quantitatively report on their accuracy using gold standard benchmark data sets. We subsequently compare the findings to corresponding results obtained on 16.5 million abstracts included in MEDLINE and show that text mining of full-text articles consistently outperforms using abstracts only.
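The benchmarking step amounts to scoring extracted associations against a gold-standard set with precision, recall, and F1. A minimal sketch with hypothetical disease-gene pairs:

```python
# Sketch of the benchmarking step: scoring extracted associations (e.g. disease-gene
# pairs) against a gold-standard set. The pair sets here are hypothetical placeholders.
def precision_recall_f1(predicted, gold):
    tp = len(predicted & gold)
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)) if (precision + recall) else 0.0
    return precision, recall, f1

predicted_pairs = {("BRCA1", "breast cancer"), ("TP53", "lung cancer"), ("APOE", "asthma")}
gold_pairs = {("BRCA1", "breast cancer"), ("TP53", "lung cancer"), ("CFTR", "cystic fibrosis")}
p, r, f = precision_recall_f1(predicted_pairs, gold_pairs)
print(f"precision={p:.2f} recall={r:.2f} F1={f:.2f}")
```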
The quest for improved reproducibility in MALDI mass spectrometry.
O'Rourke, Matthew B; Djordjevic, Steven P; Padula, Matthew P
2018-03-01
Reproducibility has been one of the biggest hurdles faced when attempting to develop quantitative protocols for MALDI mass spectrometry. The heterogeneous nature of sample recrystallization has made automated sample acquisition somewhat "hit and miss" with manual intervention needed to ensure that all sample spots have been analyzed. In this review, we explore the last 30 years of literature and anecdotal evidence that has attempted to address and improve reproducibility in MALDI MS. Though many methods have been attempted, we have discovered a significant publication history surrounding the use of nitrocellulose as a substrate to improve homogeneity of crystal formation and therefore reproducibility. We therefore propose that this is the most promising avenue of research for developing a comprehensive and universal preparation protocol for quantitative MALDI MS analysis. Mass Spec Rev 37:217-228, 2018. © 2016 Wiley Periodicals, Inc.
Quantification of local and global benefits from air pollution control in Mexico City.
Mckinley, Galen; Zuk, Miriam; Höjer, Morten; Avalos, Montserrat; González, Isabel; Iniestra, Rodolfo; Laguna, Israel; Martínez, Miguel A; Osnaya, Patricia; Reynales, Luz M; Valdés, Raydel; Martínez, Julia
2005-04-01
Complex sociopolitical, economic, and geographical realities cause the 20 million residents of Mexico City to suffer from some of the worst air pollution conditions in the world. Greenhouse gas emissions from the city are also substantial, and opportunities for joint local-global air pollution control are being sought. Although a plethora of measures to improve local air quality and reduce greenhouse gas emissions have been proposed for Mexico City, resources are not available for implementation of all proposed controls and thus prioritization must occur. Yet policy makers often do not conduct comprehensive quantitative analyses to inform these decisions. We reanalyze a subset of currently proposed control measures, and derive cost and health benefit estimates that are directly comparable. This study illustrates that improved quantitative analysis can change implementation prioritization for air pollution and greenhouse gas control measures in Mexico City.
Gupta, B L
1991-06-01
This review surveys the emergence of electron probe X-ray microanalysis as a quantitative method for measuring the chemical elements in situ. The extension of the method to the biological sciences under the influence of Ted Hall is reviewed. Some classical experiments by Hall and his colleagues in Cambridge, UK, previously unpublished, are described; as are some of the earliest quantitative results from the cryo-sections obtained in Cambridge and elsewhere. The progress of the methodology is critically evaluated from the earliest starts to the present state of the art. Particular attention has been focused on the application of the method in providing fresh insights into the role of ions in cell and tissue physiology and pathology. A comprehensive list of references is included for a further pursuit of the topics by the interested reader.
Aroma Characterization and Safety Assessment of a Beverage Fermented by Trametes versicolor.
Zhang, Yanyan; Fraatz, Marco Alexander; Müller, Julia; Schmitz, Hans-Joachim; Birk, Florian; Schrenk, Dieter; Zorn, Holger
2015-08-12
A cereal-based beverage was developed by fermentation of wort with the basidiomycete Trametes versicolor. The beverage possessed a fruity, fresh, and slightly floral aroma. The volatiles of the beverage were isolated by liquid-liquid extraction (LLE) and additionally by headspace solid phase microextraction (HS-SPME). The aroma compounds were analyzed by a gas chromatography system equipped with a tandem mass spectrometer and an olfactory detection port (GC-MS/MS-O) followed by aroma (extract) dilution analysis. Thirty-four different odor impressions were perceived, and 27 corresponding compounds were identified. Fifteen key odorants with flavor dilution (FD) factors ranging from 8 to 128 were quantitated, and their respective odor activity values (OAVs) were calculated. Six key odorants were synthesized de novo by T. versicolor. Furthermore, quantitative changes during the fermentation process were analyzed. To prepare for the market introduction of the beverage, a comprehensive safety assessment was performed.
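Odor activity values are the ratio of a compound's concentration to its odor threshold; compounds with OAV ≥ 1 are considered contributors to the perceived aroma. The sketch below illustrates the calculation with hypothetical concentrations and thresholds, not the paper's data.

```python
# Hedged illustration of the odor activity value (OAV) calculation mentioned above:
# OAV = concentration / odor threshold. Compound names are common aroma compounds,
# but all numbers are hypothetical.
odor_thresholds_ug_per_l = {"ethyl hexanoate": 5.0, "linalool": 25.0, "beta-damascenone": 0.05}
concentrations_ug_per_l = {"ethyl hexanoate": 120.0, "linalool": 18.0, "beta-damascenone": 1.2}

for compound, conc in concentrations_ug_per_l.items():
    oav = conc / odor_thresholds_ug_per_l[compound]
    flag = "odor-active" if oav >= 1 else "below threshold"
    print(f"{compound:>18}: OAV = {oav:6.1f} ({flag})")
```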
Davies, Stephen R; Jones, Kai; Goldys, Anna; Alamgir, Mahuiddin; Chan, Benjamin K H; Elgindy, Cecile; Mitchell, Peter S R; Tarrant, Gregory J; Krishnaswami, Maya R; Luo, Yawen; Moawad, Michael; Lawes, Douglas; Hook, James M
2015-04-01
Quantitative NMR spectroscopy (qNMR) has been examined for purity assessment using a range of organic calibration standards of varying structural complexities, certified using the traditional mass balance approach. Demonstrated equivalence between the two independent purity values confirmed the accuracy of qNMR and highlighted the benefit of using both methods in tandem to minimise the potential for hidden bias, thereby conferring greater confidence in the overall purity assessment. A comprehensive approach to purity assessment is detailed, utilising, where appropriate, multiple peaks in the qNMR spectrum, chosen on the basis of scientific reason and statistical analysis. Two examples are presented in which differences between the purity assignment by qNMR and mass balance are addressed in different ways depending on the requirement of the end user, affording fit-for-purpose calibration standards in a cost-effective manner.
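For reference, the standard qNMR purity relation compares the normalized integral of an analyte signal with that of an internal standard of known purity. The sketch below states that relation with hypothetical integrals, masses, and nuclei counts (dimethyl sulfone is shown as a typical internal standard, not necessarily the one used in this work).

```python
# Sketch of the standard qNMR purity relation (not the certification protocol itself):
# P_a = (I_a / I_std) * (N_std / N_a) * (M_a / M_std) * (m_std / m_a) * P_std,
# where I = integral, N = nuclei per quantified signal, M = molar mass,
# m = weighed mass, P = purity. All numbers below are hypothetical.
def qnmr_purity(I_a, N_a, M_a, m_a, I_std, N_std, M_std, m_std, P_std):
    return (I_a / I_std) * (N_std / N_a) * (M_a / M_std) * (m_std / m_a) * P_std

# hypothetical analyte vs. dimethyl sulfone (M = 94.13 g/mol, 6 equivalent protons)
purity = qnmr_purity(I_a=0.1384, N_a=2, M_a=230.26, m_a=10.12,
                     I_std=1.000, N_std=6, M_std=94.13, m_std=9.87, P_std=0.999)
print(f"assigned purity: {purity * 100:.2f} %")
```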
Poeppl, Timm B.; Langguth, Berthold; Laird, Angela R.; Eickhoff, Simon B.
2016-01-01
Reproductive behavior is mandatory for conservation of species and mediated by a state of sexual arousal (SA), involving both complex mental processes and bodily reactions. An early neurobehavioral model of SA proposes cognitive, emotional, motivational, and autonomic components. In a comprehensive quantitative meta-analysis on previous neuroimaging findings, we provide here evidence for distinct brain networks underlying psychosexual and physiosexual arousal. Psychosexual (i.e., mental sexual) arousal recruits brain areas crucial for cognitive evaluation, top-down modulation of attention and exteroceptive sensory processing, relevance detection and affective evaluation, as well as regions implicated in the representation of urges and in triggering autonomic processes. In contrast, physiosexual (i.e., physiological sexual) arousal is mediated by regions responsible for regulation and monitoring of initiated autonomic processes and emotions and for somatosensory processing. These circuits are interconnected by subcortical structures (putamen and claustrum) that provide exchange of sensorimotor information and crossmodal processing between and within the networks. Brain deactivations may imply attenuation of introspective processes and social cognition, but be necessary to release intrinsic inhibition of SA. PMID:23674246
Wang, Jiaming; Gambetta, Joanna M; Jeffery, David W
2016-05-18
Two rosé wines, representing a tropical and a fruity/floral style, were chosen from a previous study for further exploration by aroma extract dilution analysis (AEDA) and quantitative analysis. Volatiles were extracted using either liquid-liquid extraction (LLE) followed by solvent-assisted flavor evaporation (SAFE) or a recently developed dynamic headspace (HS) sampling method utilizing solid-phase extraction (SPE) cartridges. AEDA was conducted using gas chromatography-mass spectrometry/olfactometry (GC-MS/O) and a total of 51 aroma compounds with a flavor dilution (FD) factor ≥3 were detected. Quantitative analysis of 92 volatiles was undertaken in both wines for calculation of odor activity values. The fruity and floral wine style was mostly driven by 2-phenylethanol, β-damascenone, and a range of esters, whereas 3-SHA and several volatile acids were seen as essential for the tropical style. When extraction methods were compared, HS-SPE was as efficient as SAFE for extracting most esters and higher alcohols, which were associated with fruity and floral characters, but it was difficult to capture volatiles with greater polarity or higher boiling point that may still be important to perceived wine aroma.
Prototype Development of a Tradespace Analysis Tool for Spaceflight Medical Resources.
Antonsen, Erik L; Mulcahy, Robert A; Rubin, David; Blue, Rebecca S; Canga, Michael A; Shah, Ronak
2018-02-01
The provision of medical care in exploration-class spaceflight is limited by mass, volume, and power constraints, as well as limitations of available skillsets of crewmembers. A quantitative means of exploring the risks and benefits of inclusion or exclusion of onboard medical capabilities may help to inform the development of an appropriate medical system. A pilot project was designed to demonstrate the utility of an early tradespace analysis tool for identifying high-priority resources geared toward properly equipping an exploration mission medical system. Physician subject matter experts identified resources, tools, and skillsets required, as well as associated criticality scores of the same, to meet terrestrial, U.S.-specific ideal medical solutions for conditions of concern for exploration-class spaceflight. A database of diagnostic and treatment actions and resources was created based on this input and weighed against the probabilities of mission-specific medical events to help identify common and critical elements needed in a future exploration medical capability. Analysis of repository data demonstrates the utility of a quantitative method of comparing various medical resources and skillsets for future missions. Directed database queries can provide detailed comparative estimates concerning likelihood of resource utilization within a given mission and the weighted utility of tangible and intangible resources. This prototype tool demonstrates one quantitative approach to the complex needs and limitations of an exploration medical system. While this early version identified areas for refinement in future version development, more robust analysis tools may help to inform the development of a comprehensive medical system for future exploration missions. Antonsen EL, Mulcahy RA, Rubin D, Blue RS, Canga MA, Shah R. Prototype development of a tradespace analysis tool for spaceflight medical resources. Aerosp Med Hum Perform. 2018; 89(2):108-114.
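One plausible form for the weighted-utility query described above is an expected-utility score per resource: the sum over medical conditions of event probability times the resource's criticality for that condition. The sketch below is only a guess at that structure, with entirely hypothetical resources, conditions, and numbers; it is not the actual tool or its database schema.

```python
# Hedged sketch: ranking medical resources by expected utility, i.e. the sum over
# conditions of (event probability x criticality of the resource for that condition).
# All entries are hypothetical.
resources = {
    "ultrasound": {"renal stone": 0.9, "muscle injury": 0.6},
    "IV fluids":  {"renal stone": 0.7, "dehydration": 0.9},
}
event_probability = {"renal stone": 0.02, "muscle injury": 0.15, "dehydration": 0.10}

def expected_utility(criticality_by_condition):
    return sum(event_probability.get(cond, 0.0) * crit
               for cond, crit in criticality_by_condition.items())

ranking = sorted(resources, key=lambda r: expected_utility(resources[r]), reverse=True)
for r in ranking:
    print(f"{r:>10}: expected utility = {expected_utility(resources[r]):.3f}")
```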
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heyman, Heino M.; Zhang, Xing; Tang, Keqi
2016-02-16
Metabolomics is the quantitative analysis of all metabolites in a given sample. Due to the chemical complexity of the metabolome, optimal separations are required for comprehensive identification and quantification of sample constituents. This chapter provides an overview of both conventional and advanced separations methods in practice for reducing the complexity of metabolite extracts delivered to the mass spectrometer detector, and covers gas chromatography (GC), liquid chromatography (LC), capillary electrophoresis (CE), supercritical fluid chromatography (SFC) and ion mobility spectrometry (IMS) separation techniques coupled with mass spectrometry (MS) as both uni-dimensional and as multi-dimensional approaches.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoofnagle, Andrew N.; Whiteaker, Jeffrey R.; Carr, Steven A.
2015-12-30
The Clinical Proteomic Tumor Analysis Consortium (CPTAC) of the National Cancer Institute (NCI) is a comprehensive and coordinated effort to accelerate the understanding of the molecular basis of cancer through the application of robust technologies and workflows for the quantitative measurements of proteins. The Assay Development Working Group of the CPTAC Program aims to foster broad uptake of targeted mass spectrometry-based assays employing isotopically labeled peptides for confident assignment and quantification, including multiple reaction monitoring (MRM; also referred to as Selected Reaction Monitoring), parallel reaction monitoring (PRM), and other targeted methods.
ERIC Educational Resources Information Center
Gonzales, Wilkinson Daniel Wong; Torres, Patrisha Lliane O.
2016-01-01
Departing from Gonzales and Torres' (2015) quantitative quasi-experimental study on the effectiveness of cooperative learning (CL) on Grade 8 students in a private school in Manila, the present study attempts to uncover possible findings in the affective dimension through a 25-item survey questionnaire that quantitatively gauges the Grade 8…
ERIC Educational Resources Information Center
Dillon, Andrew; Gabbard, Ralph
1998-01-01
Published studies of hypermedia as an educational technology that have emphasized quantitative and empirical assessment of outcomes were reviewed. The evidence to date suggests that hypermedia benefits in education are limited to learning tasks depending on repeated manipulation and searching of information. Implications for instruction and…
Developing Quantitative Reasoning: Will Taking Traditional Math Courses Suffice? An Empirical Study
ERIC Educational Resources Information Center
Agustin, Ma Zenia; Agustin, Marcus; Brunkow, Paul; Thomas, Susan
2012-01-01
Southern Illinois University Edwardsville (SIUE) is a postbaccalaureate comprehensive university in the Midwest. In 2005, SIUE embarked on a challenging journey of general education reform. A review of the current general education program revealed that it is possible for a student to graduate from SIUE without taking a quantitative course. Hence,…
2015-01-01
Atropine, a muscarinic antagonist, is known to inhibit myopia progression in several animal models and humans. However, the mode of action has not yet been established. In this study, we used quantitative iTRAQ proteomic analysis to compare retinas collected from control and lens-induced myopic (LIM) mouse eyes treated with atropine. The myopic group received a (−15D) spectacle lens over the right eye on postnatal day 10 with or without atropine eye drops starting on postnatal day 24. Axial length was measured by optical low coherence interferometry (OLCI), AC-Master, and refraction was measured by an automated infrared photorefractor at postnatal 24, 38, and 52 days. Retinal tissue samples were pooled from six eyes for each group. The experiments were repeated twice, and technical replicates were also performed for liquid chromatography–tandem mass spectrometry (LC–MS/MS) analysis. MetaCore was used to perform protein profiling for pathway analysis. We identified a total of 3882 unique proteins with <1% FDR by analyzing the samples in replicates for two independent experiments. This is the largest mouse retina proteome reported to date. Thirty proteins were found to be up-regulated (ratio for myopia/control > global mean ratio + 1 standard deviation), and 28 proteins were down-regulated (ratio for myopia/control < global mean ratio - 1 standard deviation) in myopic eyes as compared with control retinas. Pathway analysis using MetaCore revealed regulation of γ-aminobutyric acid (GABA) levels in the myopic eyes. Detailed analysis of the quantitative proteomics data showed that the levels of GABA transporter 1 (GAT-1) were elevated in myopic retina and significantly reduced after atropine treatment. These results were further validated with immunohistochemistry and Western blot analysis. In conclusion, this study provides a comprehensive quantitative proteomic analysis of atropine-treated mouse retina and suggests the involvement of GABAergic signaling in the antimyopic effects of atropine in mouse eyes. The GABAergic transmission in the neural retina plays a pivotal role in the maintenance of axial eye growth in mammals. PMID:25211393
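The regulation call quoted above is a simple mean ± 1 SD rule on the myopia/control ratios. The sketch below reproduces that rule on randomly generated toy ratios (the protein count matches the 3882 identifications only for flavor).

```python
# Sketch of the regulation call described above: a protein counts as up-regulated
# when its myopia/control ratio exceeds the global mean ratio + 1 SD, and as
# down-regulated when it falls below the global mean ratio - 1 SD. Toy ratios only.
import numpy as np

rng = np.random.default_rng(42)
ratios = rng.lognormal(mean=0.0, sigma=0.15, size=3882)  # hypothetical myopia/control ratios

mean, sd = ratios.mean(), ratios.std(ddof=1)
up = ratios > mean + sd
down = ratios < mean - sd
print(f"up-regulated: {up.sum()}  down-regulated: {down.sum()}  "
      f"cutoffs: >{mean + sd:.2f} / <{mean - sd:.2f}")
```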
ERIC Educational Resources Information Center
Martinez-Alvarez, Patricia; Bannan, Brenda; Peters-Burton, Erin E.
2012-01-01
This research work examined the impact of a science-reading instructional sequence ("InSciRead") on fourth-grade Spanish-English dual language learners' ability to monitor their comprehension of content-related (erosion, deposition, and transportation) texts. We used a quantitative measure of students' ability to detect incongruities in a related…
ERIC Educational Resources Information Center
Belet Boyaci, S. Dilek; Güner, Mediha
2018-01-01
The objective of the present study was to determine the impact of authentic task-based material on reading comprehension, writing skills and writing motivation in the Turkish language course. The study was conducted with a mixed-design methodology. Quantitative data were collected with a quasi-experimental pre-test/post-test with…
ERIC Educational Resources Information Center
Tous, Maryam Danaye; Tahriri, Abdorreza; Haghighi, Sara
2015-01-01
The purpose of the present study was to examine the effect of instruction through debate on male and female EFL learners' reading comprehension. Also, their perception of critical thinking (CT) instruction was investigated. A quantitative research method with experimental pre-and post-tests design was conducted to collect the data. Eighty-eight…
ERIC Educational Resources Information Center
Lencioni, Gina M.
2013-01-01
The purpose of this study was to investigate the effects of explicit direct instruction and cooperative learning on reading comprehension in fourth grade students. A quasi-experimental design was used. There were six cognitive and three affective measures used to collect quantitative data. Cognitive measures included California State Test scores,…
Effect of an Interactive Literacy Website on Struggling Readers' Comprehension
ERIC Educational Resources Information Center
Caldwell, Bettie Johnson
2013-01-01
The issue of struggling readers is a major concern in the field of education. An overwhelming number of students are beginning school ill-equipped to read. The purpose of this quantitative study was to explore the effect of an interactive literacy website on the reading comprehension scores of 2nd grade students identified as struggling readers.…
ERIC Educational Resources Information Center
Harris-Bingham, Tranae
2017-01-01
The purpose of this study is to examine the relationships among the level of comprehensive school counseling program implementation, school counselors' leadership practices, and their schools' Title I status. Existing research examining the level of program implementation and school counselor leadership practices is limited and the representation…
Integrated protocol for reliable and fast quantification and documentation of electrophoresis gels.
Rehbein, Peter; Schwalbe, Harald
2015-06-01
Quantitative analysis of electrophoresis gels is an important part in molecular cloning, as well as in protein expression and purification. Parallel quantifications in yield and purity can be most conveniently obtained from densitometric analysis. This communication reports a comprehensive, reliable and simple protocol for gel quantification and documentation, applicable for single samples and with special features for protein expression screens. As major component of the protocol, the fully annotated code of a proprietary open source computer program for semi-automatic densitometric quantification of digitized electrophoresis gels is disclosed. The program ("GelQuant") is implemented for the C-based macro-language of the widespread integrated development environment of IGOR Pro. Copyright © 2014 Elsevier Inc. All rights reserved.
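Densitometric quantification of a gel band essentially integrates pixel intensity over the band region after background subtraction. The sketch below illustrates that idea on a synthetic image; it is not GelQuant itself, which is written in the IGOR Pro macro language.

```python
# Minimal sketch of densitometric band quantification of the kind GelQuant automates:
# integrate pixel intensity over a band region and subtract a local background.
import numpy as np

def band_volume(gel, rows, cols):
    """Integrated, background-corrected intensity of a rectangular band region.
    Background is estimated from the median pixel just outside the band columns."""
    band = gel[rows[0]:rows[1], cols[0]:cols[1]]
    flank = np.concatenate([gel[rows[0]:rows[1], max(cols[0] - 3, 0):cols[0]],
                            gel[rows[0]:rows[1], cols[1]:cols[1] + 3]], axis=1)
    background = np.median(flank)
    return float((band - background).clip(min=0).sum())

gel_image = np.random.default_rng(3).uniform(10, 20, size=(200, 300))  # toy "gel"
gel_image[80:100, 140:170] += 120.0                                    # synthetic band
print(f"band volume: {band_volume(gel_image, rows=(80, 100), cols=(140, 170)):.0f}")
```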
Enhancing Induction Coil Reliability
NASA Astrophysics Data System (ADS)
Kreter, K.; Goldstein, R.; Yakey, C.; Nemkov, V.
2014-12-01
In induction hardening, thermal fatigue is one of the main copper failure modes of induction heat treating coils. There have been papers published that describe this failure mode and others that describe some good design practices. The variables previously identified as the sources of thermal fatigue include radiation from the part surface, frequency, current, concentrator losses, water pressure and coil wall thickness. However, very little quantitative data on the factors that influence thermal fatigue in induction coils is available in the public domain. Using finite element analysis software, this study analyzes the effect of common inductor cooling design variables and quantifies the relative importance of these variables. A comprehensive case study of a single-shot induction coil with Fluxtrol A concentrator applied is used for the analysis.
NASA Astrophysics Data System (ADS)
Vidovič, Luka; Milanič, Matija; Majaron, Boris
2015-07-01
We combine pulsed photothermal radiometry (PPTR) depth profiling with diffuse reflectance spectroscopy (DRS) measurements for a comprehensive analysis of bruise evolution in vivo. While PPTR enables extraction of detailed depth distribution and concentration profiles of selected absorbers (e.g. melanin, hemoglobin), DRS provides information in a wide range of visible wavelengths and thus offers an additional insight into dynamics of the hemoglobin degradation products. Combining the two approaches enables us to quantitatively characterize bruise evolution dynamics. Our results indicate temporal variations of the bruise evolution parameters in the course of bruise self-healing process. The obtained parameter values and trends represent a basis for a future development of an objective technique for bruise age determination.
Building a Definition of Irritability From Academic Definitions and Lay Descriptions.
Barata, Paula C; Holtzman, Susan; Cunningham, Shannon; O'Connor, Brian P; Stewart, Donna E
2016-04-08
The current work builds a definition of irritability from both academic definitions and lay perspectives. In Study 1, a quantitative content analysis of academic definitions resulted in eight main content categories (i.e., behaviour, emotion or affect, cognition, physiological, qualifiers, irritant, stability or endurance, and other). In Study 2, a community sample of 39 adults participated in qualitative interviews. A deductive thematic analysis resulted in two main themes. The first main theme dealt with how participants positioned irritability in relation to other negative states. The second dealt with how participants constructed irritability as both a loss of control and as an experience that should be controlled. The discussion integrates the findings of both studies and provides a concise, but comprehensive definition.
Toraman, Hilal E; Franz, Kristina; Ronsse, Frederik; Van Geem, Kevin M; Marin, Guy B
2016-08-19
Insight into the composition of algae-derived bio-oils is crucial for the development of efficient conversion processes and better upgrading strategies for microalgae. Comprehensive two-dimensional gas chromatography (GC×GC) coupled to a nitrogen chemiluminescence detector (NCD) and a time-of-flight mass spectrometer (TOF-MS) makes it possible to obtain the detailed quantitative composition of the nitrogen-containing compounds in the aqueous and the organic fractions of fast pyrolysis bio-oils from microalgae. Normal-phase (apolar × mid-polar) and reverse-phase (polar × apolar) column combinations are investigated to optimize the separation of the detected nitrogen-containing compounds. The reverse-phase column combination gives the most detailed information on the nitrogen-containing compounds. The combined information from the GC×GC-TOF-MS (qualitative) and GC×GC-NCD (quantitative), with the use of a well-chosen internal standard, i.e. caprolactam, enables the identification and quantification of nitrogen-containing compounds belonging to 13 different classes: amines, imidazoles, amides, imides, nitriles, pyrazines, pyridines, indoles, pyrazoles, pyrimidines, quinolines, pyrimidinediones and other nitrogen-containing compounds which were not assigned to a specific class. The aqueous fraction mostly consists of amines (4.0 wt%) and imidazoles (2.8 wt%), corresponding to approximately 80 wt% of the total identified nitrogen-containing compounds. On the other hand, the organic fraction shows a more diverse distribution of nitrogen-containing compounds, with the majority of the compounds quantified as amides (3.0 wt%), indoles (2.0 wt%), amines (1.7 wt%) and imides (1.3 wt%), corresponding to approximately 65 wt% of the total identified nitrogen-containing compounds. Copyright © 2016 Elsevier B.V. All rights reserved.
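Quantification against a spiked internal standard such as caprolactam typically follows the single-point relation sketched below. The relative response factor of 1 and all numbers are assumptions for illustration, not values from the study.

```python
# Hedged sketch of single-point internal-standard quantification of the kind used
# with the caprolactam standard mentioned above. An equal-response assumption
# (RRF = 1) is often reasonable for a nitrogen-selective detector, but it and all
# numbers here are illustrative.
def analyte_wt_percent(area_analyte, area_is, mass_is_mg, mass_sample_mg, rrf=1.0):
    """wt% of an analyte from peak areas relative to a spiked internal standard."""
    mass_analyte_mg = (area_analyte / area_is) * mass_is_mg / rrf
    return 100.0 * mass_analyte_mg / mass_sample_mg

# e.g. an amide peak vs. the caprolactam internal standard in a bio-oil aliquot
content = analyte_wt_percent(area_analyte=5.4e6, area_is=2.1e6,
                             mass_is_mg=0.50, mass_sample_mg=45.0)
print(f"amide content: {content:.2f} wt%")
```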
Recommended health and safety guidelines for coal gasification pilot plants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1978-01-01
The greatest toxic hazards in coal conversion are presented by the known and suspected carcinogens, because they are hazardous at low levels of exposure, have delayed action with no immediate warning, and have grave consequences. As for routes of entry, it is to be noted that various solids and liquids may reach the body by inhalation of particles, deposition of particles, or indirectly by contact with dirty surfaces. Other toxicants are most likely to enter the body by inhalation. The overall carcinogenic hazard cannot be precisely estimated from chemical analysis alone, because the possible interactions are far too complex. Further, the hazard cannot at present be quantitatively defined by available biological tests. The same limitations probably apply to toxic effects other than carcinogenesis, with the possible exception of some immediate responses (e.g., chemical asphyxia, primary respiratory irritation). It is not practical to recommend comprehensive workplace exposure limits on a basis similar to those for individual toxicants; however, a limit for one important kind of hazard (high-boiling suspected carcinogens) can be recommended. The carcinogenic hazards associated with airborne particles and surface contamination are the most crucial of the whole spectrum and offer a practical target for control, if not for quantitative evaluation. The only direct quantitative evidence now available is from epidemiology in analogous circumstances, and there are severe limitations on the comprehensiveness and reliability of such evidence. Some specific targets for control through industrial hygiene practices can be identified. The presence of any strong irritant of the respiratory mucosa, other mucous surfaces, and the skin should be regarded as a danger signal because of possible potentiation of carcinogens and other toxicants.
Treff, Nathan R; Scott, Richard T
2013-03-15
Embryonic comprehensive chromosomal euploidy may represent a powerful biomarker to improve the success of IVF. However, there are a number of aneuploidy screening strategies to consider, including different technologic platforms with which to interrogate the embryonic DNA, and different embryonic developmental stages from which DNA can be analyzed. Although there are advantages and disadvantages associated with each strategy, a series of experiments producing evidence of accuracy, safety, clinical predictive value, and clinical efficacy indicate that trophectoderm biopsy and quantitative real-time polymerase chain reaction (qPCR)-based comprehensive chromosome screening (CCS) may represent a useful strategy to improve the success of IVF. This Biomarkers in Reproductive Medicine special issue review summarizes the accumulated experience with the development and clinical application of a 4-hour blastocyst qPCR-based CCS technology. Copyright © 2013 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.
Cardenas, Carlos E; Mohamed, Abdallah S R; Tao, Randa; Wong, Andrew J R; Awan, Mussadiq J; Kuruvila, Shirly; Aristophanous, Michalis; Gunn, G Brandon; Phan, Jack; Beadle, Beth M; Frank, Steven J; Garden, Adam S; Morrison, William H; Fuller, Clifton D; Rosenthal, David I
2017-07-01
Our department has a long-established comprehensive quality assurance (QA) planning clinic for patients undergoing radiation therapy (RT) for head and neck cancer. Our aim is to assess the impact of a real-time peer review QA process on the quantitative and qualitative radiation therapy plan changes in the era of intensity modulated RT (IMRT). Prospective data for 85 patients undergoing head and neck IMRT who presented at a biweekly QA clinic after simulation and contouring were collected. A standard data collection form was used to document alterations made during this process. The original pre-QA clinical target volumes (CTVs) approved by the treating-attending physicians were saved before QA and compared with post-QA consensus CTVs. Qualitative assessment was done according to predefined criteria. Dice similarity coefficients (DSC) and other volume overlap metrics were calculated for each CTV level and were used for quantitative comparison. Changes were categorized as major, minor, and trivial according to the degree of overlap. Patterns of failure were analyzed and correlated with plan changes. All 85 patients were examined by at least 1 head and neck subspecialist radiation oncologist who was not the treating-attending physician; 80 (94%) were examined by ≥3 faculty members. New clinical findings on physical examination were found in 12 patients (14%), leading to major plan changes. Quantitative DSC analysis revealed significantly better agreement in CTV1 (0.94 ± 0.10) contours than in CTV2 (0.82 ± 0.25) and CTV3 (0.86 ± 0.2) contours (P=.0002 and P=.03, respectively; matched-pair Wilcoxon test). The experience of the treating-attending radiation oncologist significantly affected DSC values when all CTV levels were considered (P=.012; matched-pair Wilcoxon test). After a median follow-up time of 38 months, only 10 patients (12%) had local recurrence, regional recurrence, or both, mostly in central high-dose areas. A comprehensive peer review planning clinic is an essential component of IMRT QA that led to major changes in one-third of the study population. This process ensured safety related to target definition and led to favorable disease control profiles, with no identifiable recurrences attributable to geometric misses or delineation errors. Copyright © 2016 Elsevier Inc. All rights reserved.
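The quantitative comparison above rests on the Dice similarity coefficient. As an illustration only (not the authors' code), the short Python sketch below computes DSC from two rasterized contour masks; the array names and toy volumes are invented.

```python
# Minimal sketch of the Dice similarity coefficient (DSC) used to compare
# pre- and post-QA clinical target volumes; assumes contours have been
# rasterized into boolean voxel masks of identical shape.
import numpy as np

def dice_similarity(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """DSC = 2|A ∩ B| / (|A| + |B|); 1.0 means identical volumes, 0.0 no overlap."""
    a = mask_a.astype(bool)
    b = mask_b.astype(bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(a, b).sum() / denom

# Example with toy 3-D masks standing in for pre-QA and consensus CTVs
pre_qa = np.zeros((10, 10, 10), dtype=bool);  pre_qa[2:7, 2:7, 2:7] = True
post_qa = np.zeros((10, 10, 10), dtype=bool); post_qa[3:8, 2:7, 2:7] = True
print(round(dice_similarity(pre_qa, post_qa), 3))  # ~0.8
```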
A comprehensive review of the SLMTA literature part 2: Measuring success.
Luman, Elizabeth T; Yao, Katy; Nkengasong, John N
2014-01-01
Since its introduction in 2009, the Strengthening Laboratory Management Toward Accreditation (SLMTA) programme has been implemented in 617 laboratories in 47 countries. We completed a systematic review of the published literature on SLMTA. The review consists of two companion papers; this article examines quantitative evidence presented in the publications along with a meta-analysis of selected results. We identified 28 published articles with data from SLMTA implementation. The SLMTA programme was evaluated through audits based on a standard checklist, which is divided into 12 sections corresponding to the 12 Quality System Essentials (QSEs). Several basic service delivery indicators reported by programmes were also examined. Results for various components of the programme were reviewed and summarised; a meta-analysis of QSE results grouped by the three stages of the quality cycle was conducted for 126 laboratories in 12 countries. Global programme data show improved quality in SLMTA laboratories in every country, with average improvements on audit scores of 25 percentage points. Meta-analysis identified Improvement Management as the weakest stage, with internal audit (8%) and occurrence management (16%) showing the lowest scores. Studies documented 19% - 95% reductions in turn-around times, 69% - 93% reductions in specimen rejection rates, 76% - 81% increases in clinician satisfaction rates, 67% - 85% improvements in external quality assessment results, 50% - 66% decreases in nonconformities and 67% increases in staff punctuality. The wide array of results reported provides a comprehensive picture of the SLMTA programme overall, suggesting a substantive impact on provision of quality laboratory services and patient care. These comprehensive results establish a solid data-driven foundation for program improvement and further expansion.
Millen, Barbara E; Quatromoni, Paula A; Pencina, Michael; Kimokoti, Ruth; Nam, Byung-H O; Cobain, Sonia; Kozak, Waldemar; Appugliese, Danielle P; Ordovas, Jose; D'Agostino, Ralph B
2005-11-01
To identify the dietary patterns of adult men and examine their relationships with nutrient intake and chronic disease risk over long-term follow-up. Baseline 145-item food frequency questionnaires from 1,666 Framingham Offspring-Spouse cohort men were used to identify comprehensive dietary patterns. Independent 3-day dietary records at baseline and 8 years later provided estimates of subjects' nutrient intake by dietary pattern. Chronic disease risk factor status was compared at baseline and 16-year follow-up across all male dietary patterns. Cluster analysis was applied to food frequency data to identify non-overlapping male dietary patterns. Analysis of covariance and logistic regression were used to compare nutrient intake, summary nutritional risk scores, and chronic disease risk status at baseline and follow-up by male dietary pattern. Five distinct and comprehensive dietary patterns of Framingham Offspring-Spouse men were identified and ordered according to overall nutritional risk: Transition to Heart Healthy, Higher Starch, Average Male, Lower Variety, and Empty Calories. Nutritional risk was high and varied by dietary pattern; key nutrient contrasts were stable over 8-year follow-up. Chronic disease risk also varied by dietary pattern and specific subgroup differences persisted over 16 years, notably rates of overweight/obesity and smoking. Quantitative cluster analysis applied to food frequency questionnaire data identified five distinct, comprehensive, and stable dietary patterns of adult Framingham Offspring-Spouse cohort men. The close associations between the dietary patterns, nutritional risk, and chronic disease profiles of men emphasize the importance of targeted preventive nutrition interventions to promote health in the male population.
Evolutionary Glycomics: Characterization of Milk Oligosaccharides in Primates
Tao, Nannan; Wu, Shuai; Kim, Jaehan; An, Hyun Joo; Hinde, Katie; Power, Michael L.; Gagneux, Pascal; German, J. Bruce; Lebrilla, Carlito B.
2011-01-01
Free oligosaccharides are abundant components of mammalian milk and have primary roles as prebiotic compounds, in immune defense, and in brain development. A mass spectrometry-based technique was applied to profile milk oligosaccharides from apes (chimpanzee, gorilla, and siamang), New World monkeys (golden lion tamarin and common marmoset), and an Old World monkey (rhesus). The purpose of this study was to evaluate the patterns of primate milk oligosaccharide composition from a phylogenetic perspective in order to assess the extent to which the composition of human milk oligosaccharides (hMOs) derives from ancestral primate patterns as opposed to more recent evolutionary events. Milk oligosaccharides were quantitated by nanoflow liquid chromatography on chip-based devices. The relative abundances of fucosylated and sialylated milk oligosaccharides in primates were also determined. For a systematic and comprehensive study of the evolutionary patterns of milk oligosaccharides, cluster analysis of primate milk was performed using the chromatographic profiles. In general, the oligosaccharides in primate milk, including human milk, are more complex and exhibit greater diversity than those in non-primate milk. A detailed comparison of the oligosaccharides across evolution revealed a non-sequential developmental pattern, i.e., primate milk oligosaccharides do not necessarily cluster according to the primate phylogeny. This report represents the first comprehensive and quantitative effort to profile and elucidate the structures of free milk oligosaccharides so that they can be related to glycan function in different primates. PMID:21214271
[Development of questionnaire to assess food intake in the University of Antioquia, Colombia].
Monsalve Álvarez, J Ma; González Zapata, L I
2011-01-01
To design a self-reported semi-quantitative questionnaire of frequency of food intake (QFFI) to assess the usual intake of foods and macronutrients in the academic community of the University of Antioquia (UoA). The questionnaire was designed by reviewing and analyzing previous studies on food intake and the established guidelines for the sale of dietary products at the UoA. For selecting the foods and defining the serving sizes, we applied the criterion of intake pattern and the weight or size of the most frequent serving for each food, according to what has been reported in previous studies. We undertook a descriptive exploratory study with the QFFI including 154 people from the UoA. The analysis covered the comprehension, applicability and comprehensiveness of the instrument for the participants, as well as the calculation of descriptive parameters in the exploratory component. The result was a semi-quantitative QFFI comprising 144 foods classified into 9 groups, with serving sizes familiar to the population and 9 categories of usual intake frequency over the previous year. The exploratory study showed differences in the intake patterns of the population groups analyzed. The designed QFFI was adequate for the study population: it was easily understood and administered, and it allowed the usual intake of the UoA community to be assessed.
Gündisch, Sibylle; Schott, Christina; Wolff, Claudia; Tran, Kai; Beese, Christian; Viertler, Christian; Zatloukal, Kurt; Becker, Karl-Friedrich
2013-01-01
Precise quantitation of protein biomarkers in clinical tissue specimens is a prerequisite for accurate and effective diagnosis, prognosis, and personalized medicine. Although progress is being made, protein analysis from formalin-fixed and paraffin-embedded tissues is still challenging. In previous reports, we showed that the novel formalin-free tissue preservation technology, the PAXgene Tissue System, allows the extraction of intact and immunoreactive proteins from PAXgene-fixed and paraffin-embedded (PFPE) tissues. In the current study, we focused on the analysis of phosphoproteins and the applicability of two-dimensional gel electrophoresis (2D-PAGE) and enzyme-linked immunosorbent assay (ELISA) to the analysis of a variety of malignant and non-malignant human tissues. Using western blot analysis, we found that phosphoproteins are quantitatively preserved in PFPE tissues, and signal intensities are comparable to those in paired, frozen tissues. Furthermore, proteins extracted from PFPE samples are suitable for 2D-PAGE and can be quantified by ELISA specific for denatured proteins. In summary, the PAXgene Tissue System reliably preserves phosphoproteins in human tissue samples, even after prolonged fixation or stabilization times, and is compatible with methods for protein analysis such as 2D-PAGE and ELISA. We conclude that the PAXgene Tissue System has the potential to serve as a versatile tissue fixative for modern pathology. PMID:23555997
Yu, Xiaodong; Li, Yang; Gu, Xiaofeng; Bao, Jiming; Yang, Huizhong; Sun, Li
2014-12-01
Water quality monitoring is a critical part of environmental management and protection, and the ability to determine contamination and impurity levels in water both qualitatively and quantitatively is especially important. Compared with currently available water quality monitoring methods and techniques, laser-induced breakdown spectroscopy (LIBS) has several advantages, including no need for sample pre-preparation, fast and easy operation, and a chemical-free process. Therefore, it is of great importance to understand the fundamentals of aqueous LIBS analysis and to apply this technique effectively to environmental monitoring. This article reviews the research conducted on LIBS analysis of liquid samples, covering LIBS theory, history and applications, quantitative analysis of metallic species in liquids, LIBS signal enhancement methods and data processing, the characteristics of plasma generated by a laser in water, and the factors affecting the accuracy of analysis results. Although many research works have focused on aqueous LIBS analysis, the detection limit and stability of this technique still need to be improved to satisfy the requirements of environmental monitoring standards. In addition, determination of nonmetallic species in liquids by LIBS is equally important and needs immediate attention from the community. This comprehensive review will assist readers in better understanding the aqueous LIBS technique and help to identify current research needs for environmental monitoring of water quality.
Development of economic consequence methodology for process risk analysis.
Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed
2015-04-01
A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies. © 2014 Society for Risk Analysis.
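To make the loss-function step concrete, the sketch below shows a classical quadratic Taguchi loss as a stand-in; the abstract does not give the exact "revised Taguchi" or "modified inverted normal" forms, and all parameter values are hypothetical.

```python
# Illustrative sketch only: maps a process deviation to an economic loss using
# the classical quadratic Taguchi loss L(x) = k*(x - target)^2, where k is set
# so that the loss equals a stated cost at the tolerance limit.
def taguchi_loss(x: float, target: float, tolerance: float, cost_at_tolerance: float) -> float:
    k = cost_at_tolerance / tolerance**2
    return k * (x - target) ** 2

# Hypothetical numbers: a reactor temperature 8 degrees off a 150 degree target,
# where a 10 degree deviation is assumed to cost $50,000.
print(taguchi_loss(158.0, 150.0, 10.0, 50_000.0))  # 32000.0
```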
Taralova, Ekaterina; Dupre, Christophe; Yuste, Rafael
2018-01-01
Animal behavior has been studied for centuries, but few efficient methods are available to automatically identify and classify it. Quantitative behavioral studies have been hindered by the subjective and imprecise nature of human observation, and the slow speed of annotating behavioral data. Here, we developed an automatic behavior analysis pipeline for the cnidarian Hydra vulgaris using machine learning. We imaged freely behaving Hydra, extracted motion and shape features from the videos, and constructed a dictionary of visual features to classify pre-defined behaviors. We also identified unannotated behaviors with unsupervised methods. Using this analysis pipeline, we quantified 6 basic behaviors and found surprisingly similar behavior statistics across animals within the same species, regardless of experimental conditions. Our analysis indicates that the fundamental behavioral repertoire of Hydra is stable. This robustness could reflect a homeostatic neural control of "housekeeping" behaviors, which may already have been present in the earliest nervous systems. PMID:29589829
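A minimal sketch of the general bag-of-visual-features idea described above follows; it is not the authors' pipeline, and the synthetic descriptors, codebook size and random-forest classifier are assumptions made purely for illustration.

```python
# Hypothetical bag-of-features behavior classifier: cluster per-frame descriptors
# into a visual dictionary, encode each clip as a codeword histogram, then
# classify pre-defined behaviors. Synthetic data stand in for video features.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_clips, frames_per_clip, n_descr = 60, 40, 8
clips = [rng.normal(size=(frames_per_clip, n_descr)) for _ in range(n_clips)]
labels = rng.integers(0, 3, size=n_clips)            # 3 toy behavior classes

# 1) Build a visual dictionary by clustering all per-frame descriptors
codebook = KMeans(n_clusters=16, n_init=10, random_state=0).fit(np.vstack(clips))

# 2) Encode each clip as a normalized histogram of codeword occurrences
def encode(clip):
    words = codebook.predict(clip)
    hist = np.bincount(words, minlength=16).astype(float)
    return hist / hist.sum()

X = np.array([encode(c) for c in clips])

# 3) Supervised classification of the pre-defined behaviors
clf = RandomForestClassifier(random_state=0).fit(X[:45], labels[:45])
print("held-out accuracy (toy data):", clf.score(X[45:], labels[45:]))
```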
Quantitative iTRAQ secretome analysis of Aspergillus niger reveals novel hydrolytic enzymes.
Adav, Sunil S; Li, An A; Manavalan, Arulmani; Punt, Peter; Sze, Siu Kwan
2010-08-06
The natural lifestyle of Aspergillus niger makes it an effective secretor of hydrolytic proteins, a trait that becomes critical when this species is exploited as a host for the commercial secretion of heterologous proteins. The protein secretion profiles of A. niger and its mutant at different pH values were explored using an iTRAQ-based quantitative proteomics approach coupled with liquid chromatography-tandem mass spectrometry (LC-MS/MS). This study characterized 102 highly confident unique proteins in the secretome with zero false discovery rate based on a decoy strategy. The iTRAQ technique identified and relatively quantified many hydrolyzing enzymes, such as cellulases, hemicellulases, glycoside hydrolases, proteases, peroxidases, and protein-translocating transporter proteins, during fermentation. Many of these enzymes have potential application in lignocellulosic biomass hydrolysis for biofuel production; for example, the cellulolytic and hemicellulolytic enzymes glucan 1,4-alpha-glucosidase, alpha-glucosidase C, endoglucanase, alpha-L-arabinofuranosidase, beta-mannosidase and glycosyl hydrolase, proteases such as tripeptidyl-peptidase and aspergillopepsin, and other enzymes including cytochrome c oxidase and glucose oxidase were highly expressed in the secretomes of A. niger and its mutant. In addition, specific enzyme production can be stimulated by controlling the pH of the culture medium. Our results provide a comprehensive secretory protein profile of A. niger and its regulation at different pH, and demonstrate the potential of iTRAQ-based quantitative proteomics for microbial secretome analysis.
Presley, Tennille; Kuppusamy, Periannan; Zweier, Jay L.; Ilangovan, Govindasamy
2006-01-01
Electron paramagnetic resonance (EPR) oximetry is being widely used to measure the oxygen consumption of cells, mitochondria, and submitochondrial particles. However, further improvement of this technique, in terms of data analysis, is required to use it as a quantitative tool. Here, we present a new approach for quantitative analysis of cellular respiration using EPR oximetry. The course of oxygen consumption by cells in suspension has been observed to have three distinct zones: pO2-independent respiration at higher pO2 ranges, pO2-dependent respiration at low pO2 ranges, and a static equilibrium with no change in pO2 at very low pO2 values. The present approach enables comprehensive analysis of all three zones together, considering the progression of O2 diffusion zones around each cell, their overlap over time, and their potential impact on the measured pO2 data. The results agree with previously established methods such as high-resolution respirometry. Additionally, it is demonstrated how the diffusion limitations can depend on cell density and consumption rate. In conclusion, the new approach establishes a more accurate and meaningful model for evaluating EPR oximetry data on cellular respiration and quantifying the related parameters. PMID:17012319
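As a rough illustration of the pO2-independent versus pO2-dependent consumption zones, the sketch below integrates a simple Michaelis-Menten-type model; this is not the authors' diffusion-zone model, and every parameter value is invented.

```python
# A hedged illustration: a Michaelis-Menten-type rate captures the transition
# from pO2-independent consumption at high pO2 to pO2-dependent consumption at
# low pO2. Parameters are made up and carry no physiological claim.
import numpy as np

def simulate_po2(p0=150.0, vmax=2.0, km=5.0, dt=0.05, t_end=120.0):
    """Integrate dpO2/dt = -vmax * pO2 / (km + pO2) with forward Euler.
    p0, km in mmHg; vmax in mmHg/min; returns time and pO2 arrays."""
    t = np.arange(0.0, t_end, dt)
    p = np.empty_like(t)
    p[0] = p0
    for i in range(1, t.size):
        rate = vmax * p[i - 1] / (km + p[i - 1])
        p[i] = max(p[i - 1] - rate * dt, 0.0)   # pO2 cannot go negative
    return t, p

t, p = simulate_po2()
print(f"pO2 after {t[-1]:.0f} min: {p[-1]:.2f} mmHg")
```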
Li, Qingye; Chang, Rong; Sun, Yijun; Li, Bosheng
2016-01-01
Low temperature (LT) is one of the most important abiotic stresses that can significantly reduce crop yield. To gain insight into how Spirulina responds to LT stress, comprehensive physiological and proteomic analyses were conducted in this study. Significant decreases in growth and pigment levels as well as excessive accumulation of compatible osmolytes were observed in response to LT stress. An isobaric tag for relative and absolute quantitation (iTRAQ)-based quantitative proteomics approach was used to identify changes in protein abundance in Spirulina under LT. A total of 3,782 proteins were identified, of which 1,062 showed differential expression. Bioinformatics analysis indicated that differentially expressed proteins that were enriched in photosynthesis, carbohydrate metabolism, amino acid biosynthesis, and translation are important for the maintenance of cellular homeostasis and metabolic balance in Spirulina when subjected to LT stress. The up-regulation of proteins involved in gluconeogenesis, starch and sucrose metabolism, and amino acid biosynthesis served as coping mechanisms of Spirulina in response to LT stress. Moreover, the down-regulated expression of proteins involved in glycolysis, TCA cycle, pentose phosphate pathway, photosynthesis, and translation were associated with reduced energy consumption. The findings of the present study allow a better understanding of the response of Spirulina to LT stress and may facilitate in the elucidation of mechanisms underlying LT tolerance. PMID:27902743
Wang, Zhenchao; Yu, Lu; Jin, Linhong; Wang, Wenli; Zhao, Qi; Ran, Longlu; Li, Xiangyang; Chen, Zhuo; Guo, Rong; Wei, Yongtian; Yang, Zhongcheng; Liu, Enlong; Hu, Deyu; Song, Baoan
2017-02-22
Diseases caused by southern rice black-streaked dwarf virus (SRBSDV) and rice ragged stunt virus (RRSV) considerably decrease grain yield. Therefore, determining rice cultivars with high resistance to SRBSDV and RRSV is necessary. In this study, rice cultivars with high resistance to SRBSDV and RRSV were evaluated through field trials in Shidian and Mangshi county, Yunnan province, China. SYBR Green I-based quantitative real-time polymerase chain reaction (qRT-PCR) analysis was used to quantitatively detect virus gene expression levels in different rice varieties. The following parameters were applied to evaluate rice resistance: acre yield (A.Y.), incidence of infected plants (I.I.P.), virus load (V.L.), disease index (D.I.), and insect quantity (I.Q.) per 100 clusters. Zhongzheyou1 (Z1) and Liangyou2186 (L2186) were judged the most suitable varieties, combining higher A.Y. with lower I.I.P., V.L., D.I. and I.Q. To investigate the mechanism of rice resistance, comparative label-free shotgun liquid chromatography tandem mass spectrometry (LC-MS/MS) proteomic approaches were applied to comprehensively characterize the proteomes underlying the varieties' SRBSDV tolerance. Systemic acquired resistance (SAR)-related proteins in Z1 and L2186 may account for the superior resistance of these varieties compared with Fengyouxiangzhan (FYXZ).
Thomas, Krista; Wechsler, Dominik; Chen, Yi-Min; Crain, Sheila; Quilliam, Michael A
2016-09-01
The implementation of instrumental analytical methods such as LC-MS for routine monitoring of toxins requires the availability of accurate calibration standards. This is a challenge because many toxins are rare, expensive, dangerous to handle, and/or unstable, and simple gravimetric procedures are not reliable for establishing accurate concentrations in solution. NMR has served as one method of qualitative and quantitative characterization of toxin calibration solution Certified Reference Materials (CRMs). LC with chemiluminescence N detection (LC-CLND) was selected as a complementary method for comprehensive characterization of CRMs because it provides a molar response to N. Here we report on our investigation of LC-CLND as a method suitable for quantitative analysis of nitrogenous toxins. It was demonstrated that a wide range of toxins could be analyzed quantitatively by LC-CLND. Furthermore, equimolar responses among diverse structures were established and it was shown that a single high-purity standard such as caffeine could be used for instrument calibration. The limit of detection was approximately 0.6 ng N. Measurement of several of Canada's National Research Council toxin CRMs with caffeine as the calibrant showed precision averaging 2% RSD and accuracy ranging from 97 to 102%. Application of LC-CLND to the production of calibration solution CRMs and the establishment of traceability of measurement results are presented.
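The single-standard calibration logic can be sketched as follows; this is an illustrative reconstruction rather than the published procedure, and the peak areas, concentrations and the choice of domoic acid as the example analyte are assumptions.

```python
# Hypothetical sketch of single-standard quantitation with an equimolar-nitrogen
# detector: caffeine (C8H10N4O2, 4 N atoms) calibrates the response per mole of
# nitrogen, which is then applied to any N-containing toxin.
MW_CAFFEINE, N_CAFFEINE = 194.19, 4

def nitrogen_response_factor(area_caffeine, conc_caffeine_ug_ml):
    """Detector response per micromole of nitrogen per mL of standard."""
    umol_caffeine = conc_caffeine_ug_ml / MW_CAFFEINE      # umol/mL
    return area_caffeine / (umol_caffeine * N_CAFFEINE)    # area per (umol N / mL)

def analyte_conc_ug_ml(area_analyte, rf, mw_analyte, n_atoms_analyte):
    umol_n = area_analyte / rf                             # umol N / mL
    return umol_n / n_atoms_analyte * mw_analyte           # ug/mL of analyte

rf = nitrogen_response_factor(area_caffeine=120_000, conc_caffeine_ug_ml=10.0)
# e.g. domoic acid (C15H21NO6, MW 311.33, one nitrogen) -- example values only
print(round(analyte_conc_ug_ml(area_analyte=45_000, rf=rf,
                               mw_analyte=311.33, n_atoms_analyte=1), 2))
```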
An Optimized Informatics Pipeline for Mass Spectrometry-Based Peptidomics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Chaochao; Monroe, Matthew E.; Xu, Zhe
2015-12-26
Comprehensive MS analysis of the peptidome, the intracellular and intercellular products of protein degradation, has the potential to provide novel insights on endogenous proteolytic processing and their utility in disease diagnosis and prognosis. Along with the advances in MS instrumentation, a plethora of proteomics data analysis tools have been applied for direct use in peptidomics; however, an evaluation of the currently available informatics pipelines for peptidomics data analysis has yet to be reported. In this study, we began by evaluating the results of several popular MS/MS database search engines, including MS-GF+, SEQUEST and MS-Align+, for peptidomics data analysis, followed by identification and label-free quantification using the well-established accurate mass and time (AMT) tag and newly developed informed quantification (IQ) approaches, both based on direct LC-MS analysis. Our results demonstrated that MS-GF+ outperformed both SEQUEST and MS-Align+ in identifying peptidome peptides. Using a database established from the MS-GF+ peptide identifications, both the AMT tag and IQ approaches provided significantly deeper peptidome coverage and fewer missing values for each individual data set than the MS/MS methods, while achieving robust label-free quantification. Besides having an excellent correlation with the AMT tag quantification results, IQ also provided slightly higher peptidome coverage than AMT. Taken together, we propose an optimal informatics pipeline combining MS-GF+ for initial database searching with IQ (or AMT) for identification and label-free quantification for high-throughput, comprehensive and quantitative peptidomics analysis.
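For readers unfamiliar with the AMT tag step, the sketch below shows the basic matching rule (accurate mass plus normalized elution time within tolerances); the tolerances, peptide entries and mass values are fabricated for illustration and do not come from the study.

```python
# Schematic sketch of accurate mass and time (AMT) tag matching under assumed
# tolerances (+/-10 ppm mass, +/-0.02 normalized elution time); the real
# pipeline uses dedicated tools and these database entries are invented.
AMT_DB = [  # (peptide, monoisotopic_mass_Da, normalized_elution_time)
    ("LVNEVTEFAK", 1148.6037, 0.42),
    ("SLHTLFGDK",  1015.5349, 0.35),
]

def match_feature(obs_mass, obs_net, ppm_tol=10.0, net_tol=0.02):
    hits = []
    for peptide, db_mass, db_net in AMT_DB:
        ppm_err = (obs_mass - db_mass) / db_mass * 1e6
        if abs(ppm_err) <= ppm_tol and abs(obs_net - db_net) <= net_tol:
            hits.append((peptide, round(ppm_err, 2)))
    return hits

# A detected LC-MS feature (deisotoped mass, NET) from a label-free run
print(match_feature(1148.6080, 0.415))   # -> [('LVNEVTEFAK', 3.74)]
```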
Wieser, Stefan; Axmann, Markus; Schütz, Gerhard J.
2008-01-01
We propose here an approach for the analysis of single-molecule trajectories which is based on a comprehensive comparison of an experimental data set with multiple Monte Carlo simulations of the diffusion process. It allows quantitative data analysis, particularly whenever analytical treatment of a model is infeasible. Simulations are performed on a discrete parameter space and compared with the experimental results by a nonparametric statistical test. The method provides a matrix of p-values that assess the probability for having observed the experimental data at each setting of the model parameters. We show the testing approach for three typical situations observed in the cellular plasma membrane: i), free Brownian motion of the tracer, ii), hop diffusion of the tracer in a periodic meshwork of squares, and iii), transient binding of the tracer to slowly diffusing structures. By plotting the p-value as a function of the model parameters, one can easily identify the most consistent parameter settings but also recover mutual dependencies and ambiguities which are difficult to determine by standard fitting routines. Finally, we used the test to reanalyze previous data obtained on the diffusion of the glycosylphosphatidylinositol-protein CD59 in the plasma membrane of the human T24 cell line. PMID:18805933
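A schematic version of this testing idea is sketched below for the free-Brownian-motion case: simulated and "experimental" single-step displacements are compared with a two-sample Kolmogorov-Smirnov test over a small grid of diffusion coefficients; the lag time, grid values and sample sizes are invented.

```python
# Conceptual sketch (with invented parameters): simulate free Brownian motion
# for each candidate diffusion coefficient, then compare simulated and
# experimental single-step displacements with a nonparametric test to obtain
# a p-value per parameter setting.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(1)
DT = 0.01                                    # s, lag time between observations

def simulate_steps(d_coeff, n_steps=2000):
    """2-D Brownian single-step displacements for diffusion coefficient d_coeff (um^2/s)."""
    sigma = np.sqrt(2.0 * d_coeff * DT)
    dx, dy = rng.normal(0, sigma, n_steps), rng.normal(0, sigma, n_steps)
    return np.hypot(dx, dy)

experimental = simulate_steps(0.5)           # stand-in for measured trajectories

for d in (0.1, 0.3, 0.5, 1.0):               # discrete parameter space
    p_value = ks_2samp(experimental, simulate_steps(d)).pvalue
    print(f"D = {d:.1f} um^2/s  ->  p = {p_value:.3f}")
```

Plotting such p-values over the parameter grid reproduces, in miniature, the p-value matrix idea described in the abstract.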
Analysis of the clonal repertoire of gene-corrected cells in gene therapy.
Paruzynski, Anna; Glimm, Hanno; Schmidt, Manfred; Kalle, Christof von
2012-01-01
Gene therapy-based clinical phase I/II studies using integrating retroviral vectors have successfully treated different monogenetic inherited diseases. However, with the increased efficiency of this therapy, severe side effects occurred in various gene therapy trials. In all cases, integration of the vector close to or within a proto-oncogene contributed substantially to the development of the malignancies. Thus, in-depth analysis of integration site patterns is of high importance to uncover potential clonal outgrowth and to assess the safety of gene transfer vectors and gene therapy protocols. Standard and nonrestrictive linear amplification-mediated PCR (nrLAM-PCR) in combination with high-throughput sequencing are technologies that allow comprehensive analysis of the clonal repertoire of gene-corrected cells and assessment of the safety of the vector system at an early stage on the molecular level. They enable clarification of the biological consequences of the vector system for the fate of the transduced cell. Furthermore, downstream real-time PCR allows a quantitative estimation of the clonality of individual cells and their clonal progeny. Here, we present a guideline that should allow researchers to perform comprehensive integration site analysis in preclinical and clinical studies. Copyright © 2012 Elsevier Inc. All rights reserved.
A comprehensive comparison of tools for differential ChIP-seq analysis
Steinhauser, Sebastian; Kurzawa, Nils; Eils, Roland
2016-01-01
ChIP-seq has become a widely adopted genomic assay in recent years to determine binding sites for transcription factors or enrichments for specific histone modifications. Besides detection of enriched or bound regions, an important question is to determine differences between conditions. While this is a common analysis for gene expression, for which a large number of computational approaches have been validated, the same question for ChIP-seq is particularly challenging owing to the complexity of ChIP-seq data in terms of noisiness and variability. Many different tools have been developed and published in recent years. However, a comprehensive comparison and review of these tools is still missing. Here, we have reviewed 14 tools, which have been developed to determine differential enrichment between two conditions. They differ in their algorithmic setups, and also in the range of applicability. Hence, we have benchmarked these tools on real data sets for transcription factors and histone modifications, as well as on simulated data sets to quantitatively evaluate their performance. Overall, there is a great variety in the type of signal detected by these tools with a surprisingly low level of agreement. Depending on the type of analysis performed, the choice of method will crucially impact the outcome. PMID:26764273
Optimizing separations in online comprehensive two-dimensional liquid chromatography.
Pirok, Bob W J; Gargano, Andrea F G; Schoenmakers, Peter J
2018-01-01
Online comprehensive two-dimensional liquid chromatography has become an attractive option for the analysis of complex nonvolatile samples found in various fields (e.g. environmental studies, food, life, and polymer sciences). Two-dimensional liquid chromatography complements the highly popular hyphenated systems that combine liquid chromatography with mass spectrometry. Two-dimensional liquid chromatography is also applied to the analysis of samples that are not compatible with mass spectrometry (e.g. high-molecular-weight polymers), providing important information on the distribution of the sample components along chemical dimensions (molecular weight, charge, lipophilicity, stereochemistry, etc.). Also, in comparison with conventional one-dimensional liquid chromatography, two-dimensional liquid chromatography provides a greater separation power (peak capacity). Because of the additional selectivity and higher peak capacity, the combination of two-dimensional liquid chromatography with mass spectrometry allows for simpler mixtures of compounds to be introduced in the ion source at any given time, improving quantitative analysis by reducing matrix effects. In this review, we summarize the rationale and principles of two-dimensional liquid chromatography experiments, describe advantages and disadvantages of combining different selectivities and discuss strategies to improve the quality of two-dimensional liquid chromatography separations. © 2017 The Authors. Journal of Separation Science published by WILEY-VCH Verlag GmbH & Co. KGaA.
Sullards, M. Cameron; Liu, Ying; Chen, Yanfeng; Merrill, Alfred H.
2011-01-01
Sphingolipids are a highly diverse category of molecules that serve not only as components of biological structures but also as regulators of numerous cell functions. Because so many of the structural features of sphingolipids give rise to their biological activity, there is a need for comprehensive or "sphingolipidomic" methods for identification and quantitation of as many individual subspecies as possible. This review defines sphingolipids as a class, briefly discusses classical methods for their analysis, and focuses primarily on liquid chromatography tandem mass spectrometry (LC-MS/MS) and tissue imaging mass spectrometry (TIMS). Recently, a set of evolving and expanding methods have been developed and rigorously validated for the extraction, identification, separation, and quantitation of sphingolipids by LC-MS/MS. Quantitation of these biomolecules is made possible via the use of an internal standard cocktail. The compounds that can be readily analyzed are free long-chain (sphingoid) bases, sphingoid base 1-phosphates, and more complex species such as ceramides, ceramide 1-phosphates, sphingomyelins, mono- and di-hexosylceramides, sulfatides, and novel compounds such as the 1-deoxy- and 1-(deoxymethyl)-sphingoid bases and their N-acyl-derivatives. These methods can be altered slightly to separate and quantitate isomeric species such as glucosyl/galactosylceramide. Because these techniques require the extraction of sphingolipids from their native environment, any information regarding their localization in histological slices is lost. Therefore, this review also describes methods for TIMS. This technique has been shown to be a powerful tool to determine the localization of individual molecular species of sphingolipids directly from tissue slices. PMID:21749933
A survey of tools for the analysis of quantitative PCR (qPCR) data.
Pabinger, Stephan; Rödiger, Stefan; Kriegner, Albert; Vierlinger, Klemens; Weinhäusel, Andreas
2014-09-01
Real-time quantitative polymerase chain reaction (qPCR) is a standard technique used in most laboratories for various applications in basic research. Analysis of qPCR data is a crucial part of the entire experiment, which has led to the development of a plethora of methods. The released tools either cover specific parts of the workflow or provide complete analysis solutions. Here, we surveyed 27 open-access software packages and tools for the analysis of qPCR data. The survey includes 8 Microsoft Windows-based, 5 web-based, 9 R-based, and 5 tools from other platforms. Reviewed packages and tools support the analysis of different qPCR applications, such as RNA quantification, DNA methylation, genotyping, identification of copy number variations, and digital PCR. We report an overview of the functionality, features and specific requirements of the individual software tools, such as data exchange formats, availability of a graphical user interface, included procedures for graphical data presentation, and offered statistical methods. In addition, we provide an overview of quantification strategies, and report various applications of qPCR. Our comprehensive survey showed that most tools use their own file format and only a fraction of the currently existing tools support the standardized data exchange format RDML. To allow a more streamlined and comparable analysis of qPCR data, more vendors and tools need to adopt the standardized format to encourage the exchange of data between instrument software, analysis tools, and researchers.
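As one example of the quantification strategies such tools implement, the sketch below shows the widely used 2^-ΔΔCt (Livak) relative-quantification calculation; the Ct values are hypothetical and the calculation assumes near-100% amplification efficiency.

```python
# Minimal sketch of the 2^-ddCt method for relative expression from qPCR Ct values.
def relative_expression(ct_target_sample, ct_ref_sample, ct_target_control, ct_ref_control):
    """Fold change of the target gene in a sample relative to a control,
    normalized to a reference (housekeeping) gene."""
    delta_ct_sample = ct_target_sample - ct_ref_sample
    delta_ct_control = ct_target_control - ct_ref_control
    delta_delta_ct = delta_ct_sample - delta_ct_control
    return 2.0 ** (-delta_delta_ct)

# Target gene appears ~2 cycles earlier in the treated sample -> ~4-fold up-regulation
print(relative_expression(ct_target_sample=22.0, ct_ref_sample=18.0,
                          ct_target_control=24.0, ct_ref_control=18.0))  # 4.0
```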
Molenaar, Remco J; Khurshed, Mohammed; Hira, Vashendriya V V; Van Noorden, Cornelis J F
2018-05-26
Altered cellular metabolism is a hallmark of many diseases, including cancer, cardiovascular diseases and infection. The metabolic motor units of cells are enzymes and their activity is heavily regulated at many levels, including the transcriptional, mRNA stability, translational, post-translational and functional level. This complex regulation means that conventional quantitative or imaging assays, such as quantitative mRNA experiments, Western blots and immunohistochemistry, yield incomplete information regarding the ultimate activity of enzymes, their function and/or their subcellular localization. Quantitative enzyme cytochemistry and histochemistry (i.e., metabolic mapping) show in-depth information on in situ enzymatic activity and its kinetics, function and subcellular localization in an almost true-to-nature situation. We describe a protocol to detect the activity of dehydrogenases, which are enzymes that perform redox reactions to reduce cofactors such as NAD(P)+ and FAD. Cells and tissue sections are incubated in a medium that is specific for the enzymatic activity of one dehydrogenase. Subsequently, the dehydrogenase that is the subject of investigation performs its enzymatic activity in its subcellular site. In a chemical reaction with the reaction medium, this ultimately generates blue-colored formazan at the site of the dehydrogenase's activity. The formazan's absorbance is therefore a direct measure of the dehydrogenase's activity and can be quantified using monochromatic light microscopy and image analysis. The quantitative aspect of this protocol enables researchers to draw statistical conclusions from these assays. Besides observational studies, this technique can be used for inhibition studies of specific enzymes. In this context, studies benefit from the true-to-nature advantages of metabolic mapping, giving in situ results that may be physiologically more relevant than in vitro enzyme inhibition studies. In all, metabolic mapping is an indispensable technique to study metabolism at the cellular or tissue level. The technique is easy to adopt, provides in-depth, comprehensive and integrated metabolic information and enables rapid quantitative analysis.
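The image-analysis step can be illustrated with a short sketch that converts transmission intensities to absorbance via A = -log10(I/I0); the pixel values and region of interest below are invented and the code is not part of the published protocol.

```python
# Hedged sketch of the quantification step: per-pixel absorbance of the formazan
# end product from a transmission image and a blank image at the same wavelength.
import numpy as np

def absorbance_map(transmitted, blank):
    """A = -log10(I / I0); higher absorbance indicates more formazan, i.e. more activity."""
    ratio = np.clip(transmitted / blank, 1e-6, 1.0)   # guard against noise above blank
    return -np.log10(ratio)

blank = np.full((4, 4), 200.0)                        # toy background intensities
stained = np.array([[200, 160, 100, 100],
                    [160, 100,  50,  50],
                    [100,  50,  20,  50],
                    [100,  50,  50, 100]], dtype=float)
A = absorbance_map(stained, blank)
print("mean absorbance in region of interest:", round(float(A.mean()), 3))
```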
Calypso: a user-friendly web-server for mining and visualizing microbiome-environment interactions.
Zakrzewski, Martha; Proietti, Carla; Ellis, Jonathan J; Hasan, Shihab; Brion, Marie-Jo; Berger, Bernard; Krause, Lutz
2017-03-01
Calypso is an easy-to-use online software suite that allows non-expert users to mine, interpret and compare taxonomic information from metagenomic or 16S rDNA datasets. Calypso has a focus on multivariate statistical approaches that can identify complex environment-microbiome associations. The software enables quantitative visualizations, statistical testing, multivariate analysis, supervised learning, factor analysis, multivariable regression, network analysis and diversity estimates. Comprehensive help pages, tutorials and videos are provided via a wiki page. The web-interface is accessible via http://cgenome.net/calypso/ . The software is programmed in Java, PERL and R and the source code is available from Zenodo ( https://zenodo.org/record/50931 ). The software is freely available for non-commercial users. l.krause@uq.edu.au. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
Ovesný, Martin; Křížek, Pavel; Borkovec, Josef; Švindrych, Zdeněk; Hagen, Guy M.
2014-01-01
Summary: ThunderSTORM is an open-source, interactive and modular plug-in for ImageJ designed for automated processing, analysis and visualization of data acquired by single-molecule localization microscopy methods such as photo-activated localization microscopy and stochastic optical reconstruction microscopy. ThunderSTORM offers an extensive collection of processing and post-processing methods so that users can easily adapt the process of analysis to their data. ThunderSTORM also offers a set of tools for creation of simulated data and quantitative performance evaluation of localization algorithms using Monte Carlo simulations. Availability and implementation: ThunderSTORM and the online documentation are both freely accessible at https://code.google.com/p/thunder-storm/ Contact: guy.hagen@lf1.cuni.cz Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24771516
ERIC Educational Resources Information Center
Boyce, Jared; Bowers, Alex J.
2018-01-01
Purpose: Instructional leadership has been an active area of educational administration research over the past 30 years. However, there has been significant divergence in how instructional leadership has been conceptualized over time. The purpose of this paper is to present a comprehensive review of 25 years of quantitative instructional…
Bax, Leon; Yu, Ly-Mee; Ikeda, Noriaki; Tsuruta, Harukazu; Moons, Karel G M
2006-10-13
Meta-analysis has become a well-known method for synthesis of quantitative data from previously conducted research in applied health sciences. So far, meta-analysis has been particularly useful in evaluating and comparing therapies and in assessing causes of disease. Consequently, the number of software packages that can perform meta-analysis has increased over the years. Unfortunately, it can take a substantial amount of time to get acquainted with some of these programs and most contain little or no interactive educational material. We set out to create and validate an easy-to-use and comprehensive meta-analysis package that would be simple enough programming-wise to remain available as a free download. We specifically aimed at students and researchers who are new to meta-analysis, with important parts of the development oriented towards creating internal interactive tutoring tools and designing features that would facilitate usage of the software as a companion to existing books on meta-analysis. We took an unconventional approach and created a program that uses Excel as a calculation and programming platform. The main programming language was Visual Basic, as implemented in Visual Basic 6 and Visual Basic for Applications in Excel 2000 and higher. The development took approximately two years and resulted in the 'MIX' program, which can be downloaded from the program's website free of charge. Next, we set out to validate the MIX output with two major software packages as reference standards, namely STATA (metan, metabias, and metatrim) and Comprehensive Meta-Analysis Version 2. Eight meta-analyses that had been published in major journals were used as data sources. All numerical and graphical results from analyses with MIX were identical to their counterparts in STATA and CMA. The MIX program distinguishes itself from most other programs by the extensive graphical output, the click-and-go (Excel) interface, and the educational features. The MIX program is a valid tool for performing meta-analysis and may be particularly useful in educational environments. It can be downloaded free of charge via http://www.mix-for-meta-analysis.info or http://sourceforge.net/projects/meta-analysis.
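For orientation, the sketch below shows the kind of fixed-effect, inverse-variance pooling that meta-analysis packages such as MIX automate; it is not MIX code, and the effect sizes and standard errors are made up.

```python
# Fixed-effect inverse-variance meta-analysis on fabricated study results.
import math

studies = [  # (effect size, e.g. log odds ratio, standard error)
    (0.30, 0.12),
    (0.10, 0.20),
    (0.25, 0.15),
]

weights = [1.0 / se**2 for _, se in studies]
pooled = sum(w * eff for (eff, _), w in zip(studies, weights)) / sum(weights)
se_pooled = math.sqrt(1.0 / sum(weights))
ci_low, ci_high = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
print(f"pooled effect = {pooled:.3f}  (95% CI {ci_low:.3f} to {ci_high:.3f})")
```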
Racism as a determinant of health: a protocol for conducting a systematic review and meta-analysis.
Paradies, Yin; Priest, Naomi; Ben, Jehonathan; Truong, Mandy; Gupta, Arpana; Pieterse, Alex; Kelaher, Margaret; Gee, Gilbert
2013-09-23
Racism is increasingly recognized as a key determinant of health. A growing body of epidemiological evidence shows strong associations between self-reported racism and poor health outcomes across diverse minority groups in developed countries. While the relationship between racism and health has received increasing attention over the last two decades, a comprehensive meta-analysis focused on the health effects of racism has yet to be conducted. The aim of this review protocol is to provide a structure from which to conduct a systematic review and meta-analysis of studies that assess the relationship between racism and health. This research will consist of a systematic review and meta-analysis. Studies will be considered for review if they are empirical studies reporting quantitative data on the association between racism and health for adults and/or children of all ages from any racial/ethnic/cultural groups. Outcome measures will include general health and well-being, physical health, mental health, healthcare use and health behaviors. Scientific databases (for example, Medline) will be searched using a comprehensive search strategy and reference lists will be manually searched for relevant studies. In addition, use of online search engines (for example, Google Scholar), key websites, and personal contact with experts will also be undertaken. Screening of search results and extraction of data from included studies will be independently conducted by at least two authors, including assessment of inter-rater reliability. Studies included in the review will be appraised for quality using tools tailored to each study design. Summary statistics of study characteristics and findings will be compiled and findings synthesized in a narrative summary as well as a meta-analysis. This review aims to examine associations between reported racism and health outcomes. This comprehensive and systematic review and meta-analysis of empirical research will provide a rigorous and reliable evidence base for future research, policy and practice, including information on the extent of available evidence for a range of racial/ethnic minority groups.
Robles, Estuardo
2017-09-01
In no vertebrate species do we possess an accurate, comprehensive tally of neuron types in the brain. This is in no small part due to the vast diversity of neuronal types that comprise complex vertebrate nervous systems. A fundamental goal of neuroscience is to construct comprehensive catalogs of cell types defined by structure, connectivity, and physiological response properties. This type of information will be invaluable for generating models of how assemblies of neurons encode and distribute sensory information and correspondingly alter behavior. This review summarizes recent efforts in the larval zebrafish to construct sensory projectomes, comprehensive analyses of axonal morphologies in sensory axon tracts. Focusing on the olfactory and optic tract, these studies revealed principles of sensory information processing in the olfactory and visual systems that could not have been directly quantified by other methods. In essence, these studies reconstructed the optic and olfactory tract in a virtual manner, providing insights into patterns of neuronal growth that underlie the formation of sensory axon tracts. Quantitative analysis of neuronal diversity revealed organizing principles that determine information flow through sensory systems in the zebrafish that are likely to be conserved across vertebrate species. The generation of comprehensive cell type classifications based on structural, physiological, and molecular features will lead to testable hypotheses on the functional role of individual sensory neuron subtypes in controlling specific sensory-evoked behaviors.
García-Fernández, Lisset; Benites, Carlos; Huamán, Byelca
2017-05-25
Identify the programmatic barriers that hinder access to comprehensive care of patients with tuberculosis and human immunodeficiency virus (TB/HIV) coinfection. This is a mixed-method study. Qualitative research was conducted via in-depth interviews with key actors and the quantitative component involved cross-sectional descriptive analysis of programmatic data from 2010-2015 on tuberculosis and HIV programs at health facilities in the cities of Lima and Iquitos. Twenty-two key actors in seven establishments were interviewed. The identified barriers were: little or no coordination between tuberculosis and HIV teams, separate management of tuberculosis and HIV cases at different levels of care, insufficient financing, limited or poorly trained human resources, and lack of an integrated information system. It was found that HIV screening in TB patients increased (from 18.8% in 2011 to 95.2% in 2015), isoniazid coverage of HIV patients declined (from 62% to 9%), and the proportion of deaths among TB/HIV coinfection cases averaged 20%. There is poor coordination between HIV and TB health strategies. Management of TB/HIV coinfection is fragmented into different levels of care, which has an impact on comprehensive patient care. As a result of this research, a technical document was prepared to establish joint procedures that should be implemented to improve comprehensive care of TB/HIV coinfection.
Cordero, Chiara; Rubiolo, Patrizia; Reichenbach, Stephen E; Carretta, Andrea; Cobelli, Luigi; Giardina, Matthew; Bicchi, Carlo
2017-01-13
The possibility to transfer methods from thermal to differential-flow modulated comprehensive two-dimensional gas chromatographic (GC×GC) platforms is of high interest to improve GC×GC flexibility and increase the compatibility of results from different platforms. The principles of method translation are here applied to an original method, developed for a loop-type thermal modulated GC×GC-MS/FID system, suitable for quali-quantitative screening of suspected fragrance allergens. The analysis conditions were translated to a reverse-injection differential flow modulated platform (GC×2GC-MS/FID) with a dual-parallel secondary column and dual detection. The experimental results, for a model mixture of suspected volatile allergens and for raw fragrance mixtures of different composition, confirmed the feasibility of translating methods by preserving the first-dimension (1D) elution order, as well as the relative alignment of resulting 2D peak patterns. A correct translation produced several benefits including an effective transfer of metadata (compound names, MS fragmentation pattern, response factors) by automatic template transformation and matching from the original/reference method to its translated counterpart. The correct translation provided: (a) 2D pattern repeatability, (b) MS fragmentation pattern reliability for identity confirmation, and (c) comparable response factors and quantitation accuracy within a concentration range of three orders of magnitude. The adoption of a narrow-bore (0.1 mm internal diameter) first-dimension column to operate under close-to-optimal conditions with the differential-flow modulation GC×GC platform was also advantageous, halving the total analysis time under the translated conditions. Copyright © 2016 Elsevier B.V. All rights reserved.
Gallina, Valentina; Torresan, Silvia; Critto, Andrea; Sperotto, Anna; Glade, Thomas; Marcomini, Antonio
2016-03-01
This paper presents a review of existing multi-risk assessment concepts and tools applied by organisations and projects providing the basis for the development of a multi-risk methodology in a climate change perspective. Relevant initiatives were developed for the assessment of multiple natural hazards (e.g. floods, storm surges, droughts) affecting the same area in a defined timeframe (e.g. year, season, decade). Major research efforts were focused on the identification and aggregation of multiple hazard types (e.g. independent, correlated, cascading hazards) by means of quantitative and semi-quantitative approaches. Moreover, several methodologies aim to assess the vulnerability of multiple targets to specific natural hazards by means of vulnerability functions and indicators at the regional and local scale. The overall results of the review show that multi-risk approaches do not consider the effects of climate change and mostly rely on the analysis of static vulnerability (i.e. no time-dependent vulnerabilities, no changes among exposed elements). A relevant challenge is therefore to develop comprehensive formal approaches for the assessment of different climate-induced hazards and risks, including dynamic exposure and vulnerability. This requires the selection and aggregation of suitable hazard and vulnerability metrics to make a synthesis of information about multiple climate impacts, the spatial analysis and ranking of risks, including their visualization and communication to end-users. To face these issues, climate impact assessors should develop cross-sectorial collaborations among different expertise (e.g. modellers, natural scientists, economists) integrating information on climate change scenarios with sectorial climate impact assessment, towards the development of a comprehensive multi-risk assessment process. Copyright © 2015 Elsevier Ltd. All rights reserved.
Savareear, Benjamin; Lizak, Radoslaw; Brokl, Michał; Wright, Chris; Liu, Chuan; Focant, Jean-Francois
2017-10-20
A method involving headspace solid-phase microextraction (HS-SPME) and comprehensive two-dimensional gas chromatography coupled to time-of-flight mass spectrometry (GC×GC-TOFMS) was developed and optimised to elucidate the volatile composition of the particulate phase fraction of aerosol produced by tobacco heating products (THPs). Three SPME fiber types were studied in terms of extraction capacity and precision measurements. Divinylbenzene/polydimethylsiloxane appeared to be the most efficient coating for these measurements. A central composite design of experiment was utilised for the optimization of the extraction conditions. Qualitative and semi-quantitative analysis of the headspace above THP aerosol condensate was carried out using the optimised extraction conditions. Semi-quantitative analyses of detected constituents were performed by assuming that their relative response factors to the closest internal standard (in terms of first-dimension retention time, 1tR) were equal to 1. Using deconvoluted mass spectral data (library similarity and reverse match >750) and linear retention indices (match window of ±15 index units), 205 peaks were assigned to individual compounds, 82 of which (including 43 substances previously reported to be present in tobacco) have not been reported previously in tobacco aerosol. The major volatile fraction of the headspace contained ketones, alcohols, aldehydes, alicyclic hydrocarbons, alkenes, and alkanes. The method was further applied to compare the volatiles from the particulate phase of THP aerosol with those of reference cigarette smoke and showed that the THP produced a less complex chemical mixture. This new method showed good efficiency and precision for the peak areas and peak numbers from the volatile fraction of the aerosol particulate phase for both THP and reference cigarettes. Copyright © 2017 Elsevier B.V. All rights reserved.
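The semi-quantification rule stated above (relative response factor assumed equal to 1) reduces to a simple ratio calculation, sketched below with invented peak areas and internal-standard amount.

```python
# Sketch of internal-standard semi-quantification with an assumed RRF of 1:
# estimated amount = (A_analyte / A_IS) * amount_IS / RRF.
def semi_quant_ng(area_analyte, area_internal_std, amount_internal_std_ng, rrf=1.0):
    return area_analyte / area_internal_std * amount_internal_std_ng / rrf

# An aerosol constituent whose peak is 0.4x the nearest internal standard (100 ng)
print(semi_quant_ng(area_analyte=52_000, area_internal_std=130_000,
                    amount_internal_std_ng=100.0))   # 40.0 ng
```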
Bohrn, Isabel C; Altmann, Ulrike; Jacobs, Arthur M
2012-09-01
A quantitative, coordinate-based meta-analysis combined data from 354 participants across 22 fMRI studies and one positron emission tomography (PET) study to identify the differences in neural correlates of figurative and literal language processing, and to investigate the role of the right hemisphere (RH) in figurative language processing. Studies that reported peak activations in standard space contrasting figurative vs. literal language processing at whole brain level in healthy adults were included. The left and right IFG, large parts of the left temporal lobe, the bilateral medial frontal gyri (medFG) and an area around the left amygdala emerged for figurative language processing across studies. Conditions requiring exclusively literal language processing did not activate any selective regions in most of the cases, but if so they activated the cuneus/precuneus, right MFG and the right IPL. No general RH advantage for metaphor processing could be found. On the contrary, significant clusters of activation for metaphor conditions were mostly lateralized to the left hemisphere (LH). Subgroup comparisons between experiments on metaphors, idioms, and irony/sarcasm revealed shared activations in left frontotemporal regions for idiom and metaphor processing. Irony/sarcasm processing was correlated with activations in midline structures such as the medFG, ACC and cuneus/precuneus. To test the graded salience hypothesis (GSH, Giora, 1997), novel metaphors were contrasted against conventional metaphors. In line with the GSH, RH involvement was found for novel metaphors only. Here we show that more analytic, semantic processes are involved in metaphor comprehension, whereas irony/sarcasm comprehension involves theory of mind processes. Copyright © 2012 Elsevier Ltd. All rights reserved.
A Quantitative Approach to Determine Analogous Areas Using Environmental Parameters
2008-03-01
degrees Celsius COADS Comprehensive Ocean - Atmosphere Data Set CONUS Continental United States CTD Conductivity/Temperature/Depth probe...consolidation of a marine database. Out of this effort came the Comprehensive Ocean - Atmosphere Data Set (COADS). The original 17 data sets were...National Oceanic and Atmospheric Administration (NOAA) has compiled a database of total sediment thickness of the global oceans and seas. These data are
Properties of young massive clusters obtained with different massive-star evolutionary models
NASA Astrophysics Data System (ADS)
Wofford, Aida; Charlot, Stéphane
We undertake a comprehensive comparative test of seven widely used spectral synthesis models using multi-band HST photometry of a sample of eight YMCs in two galaxies. We provide a first quantitative estimate of the accuracies and uncertainties of new models, show the good progress models have made in fitting high-quality observations, and highlight the need for further comprehensive comparative tests.
Nizio, Katie D; Harynuk, James J
2012-08-24
Alkyl phosphate-based gellants, used as viscosity builders in hydraulic fracturing fluids, have been implicated in numerous refinery-fouling incidents in North America. In response, industry developed an inductively coupled plasma optical emission spectroscopy (ICP-OES) based method for the analysis of total volatile phosphorus in distillate fractions of crude oil; however, this method is plagued by poor precision and a high limit of detection (0.5-1 μg phosphorus mL(-1)). Furthermore, this method cannot provide speciation information, which is critical for developing an understanding of the challenge of alkyl phosphates at a molecular level. An approach using comprehensive two-dimensional gas chromatography with nitrogen phosphorus detection (GC×GC-NPD) and post-column Deans switching is presented. This method provides qualitative and quantitative profiles of alkyl phosphates in industrial petroleum samples with increased precision and at levels comparable to or below those achievable by ICP-OES. A recovery study in a fracturing fluid sample and a profiling study of alkyl phosphates in four recovered fracturing fluid/crude oil mixtures (flowback) are also presented. Copyright © 2012 Elsevier B.V. All rights reserved.
Quantifying the Variability in Species' Vulnerability to Ocean Acidification
NASA Astrophysics Data System (ADS)
Kroeker, K. J.; Kordas, R. L.; Crim, R.; Gattuso, J.; Hendriks, I.; Singh, G. G.
2012-12-01
Ocean acidification represents a threat to marine species and ecosystems worldwide. As such, understanding the potential ecological impacts of acidification is a high priority for science, management, and policy. As research on the biological impacts of ocean acidification continues to expand at an exponential rate, a comprehensive understanding of the generalities and/or variability in organisms' responses and the corresponding levels of certainty of these potential responses is essential. Meta-analysis is a quantitative technique for summarizing the results of primary research studies and provides a transparent method to examine the generalities and/or variability in scientific results across numerous studies. Here, we perform the most comprehensive meta-analysis to date by synthesizing the results of 228 studies examining the biological impacts of ocean acidification. Our results reveal decreased survival, calcification, growth, reproduction and development in response to acidification across a broad range of marine organisms, as well as significant trait-mediated variation among taxonomic groups and enhanced sensitivity among early life history stages. In addition, our results reveal a pronounced sensitivity of molluscs to acidification, especially among the larval stages, and enhanced vulnerability to acidification with concurrent exposure to increased seawater temperatures across a diversity of organisms.
PyQuant: A Versatile Framework for Analysis of Quantitative Mass Spectrometry Data*
Mitchell, Christopher J.; Kim, Min-Sik; Na, Chan Hyun; Pandey, Akhilesh
2016-01-01
Quantitative mass spectrometry data necessitates an analytical pipeline that captures the accuracy and comprehensiveness of the experiments. Currently, data analysis is often coupled to specific software packages, which restricts the analysis to a given workflow and precludes a more thorough characterization of the data by other complementary tools. To address this, we have developed PyQuant, a cross-platform mass spectrometry data quantification application that is compatible with existing frameworks and can be used as a stand-alone quantification tool. PyQuant supports most types of quantitative mass spectrometry data including SILAC, NeuCode, 15N, 13C, or 18O and chemical methods such as iTRAQ or TMT and provides the option of adding custom labeling strategies. In addition, PyQuant can perform specialized analyses such as quantifying isotopically labeled samples where the label has been metabolized into other amino acids and targeted quantification of selected ions independent of spectral assignment. PyQuant is capable of quantifying search results from popular proteomic frameworks such as MaxQuant, Proteome Discoverer, and the Trans-Proteomic Pipeline in addition to several standalone search engines. We have found that PyQuant routinely quantifies a greater proportion of spectral assignments, with increases ranging from 25–45% in this study. Finally, PyQuant is capable of complementing spectral assignments between replicates to quantify ions missed because of lack of MS/MS fragmentation or that were omitted because of issues such as spectra quality or false discovery rates. This results in an increase of biologically useful data available for interpretation. In summary, PyQuant is a flexible mass spectrometry data quantification platform that is capable of interfacing with a variety of existing formats and is highly customizable, which permits easy configuration for custom analysis. PMID:27231314
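To illustrate the kind of arithmetic underlying isobaric (iTRAQ/TMT-style) quantification mentioned above, the sketch below aggregates reporter-ion intensities per protein and expresses them as ratios to a reference channel. This is not the PyQuant API; the proteins, channels, and intensities are made-up placeholders.

```python
# Generic illustration of isobaric reporter-ion quantification: reporter
# intensities are summed per protein across peptide-spectrum matches (PSMs)
# and expressed as ratios to a reference channel. Synthetic numbers only.
from collections import defaultdict

# (protein, {channel: reporter intensity}) for individual PSMs
psms = [
    ("P12345", {"126": 1.0e5, "127": 2.1e5, "128": 0.9e5}),
    ("P12345", {"126": 0.8e5, "127": 1.7e5, "128": 1.0e5}),
    ("Q67890", {"126": 3.0e5, "127": 3.2e5, "128": 2.8e5}),
]

totals = defaultdict(lambda: defaultdict(float))
for protein, channels in psms:
    for channel, intensity in channels.items():
        totals[protein][channel] += intensity

reference = "126"
for protein, channels in totals.items():
    ratios = {ch: val / channels[reference] for ch, val in channels.items()}
    print(protein, {ch: round(r, 2) for ch, r in ratios.items()})
```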
NASA Astrophysics Data System (ADS)
Luo, Keqin
1999-11-01
The electroplating industry, with over 10,000 plating plants nationwide, is one of the major waste-generating industries. Large quantities of wastewater, spent solvents, spent process solutions, and sludge are generated daily in plants, which imposes tremendous waste treatment and disposal costs and hinders the further development of the industry. The industry therefore urgently needs to identify the most technically effective and economically attractive methodologies and technologies for minimizing waste while maintaining production competitiveness. This dissertation aims at developing a novel waste minimization (WM) methodology using artificial intelligence, fuzzy logic, and fundamental knowledge in chemical engineering, together with an intelligent decision support tool. The WM methodology consists of two parts: a heuristic knowledge-based qualitative WM decision analysis and support methodology, and a fundamental knowledge-based quantitative process analysis methodology for waste reduction. In the former, a large number of WM strategies are represented as fuzzy rules; these form the main part of the knowledge base in the decision support tool, WMEP-Advisor. In the latter, various first-principles-based process dynamic models are developed. These models can characterize all three major types of operations in an electroplating plant, i.e., cleaning, rinsing, and plating. This allows a thorough process analysis of bath efficiency, chemical consumption, wastewater generation, sludge generation, etc. Additional models are developed for quantifying drag-out and evaporation, which are critical for waste reduction. The models are validated through numerous industrial experiments in a typical plating line of an industrial partner. The unique contribution of this research is that, for the first time, the electroplating industry can (i) systematically use available WM strategies, (ii) know quantitatively and accurately what is going on in each tank, and (iii) identify all WM opportunities through process improvement. This work forms a solid foundation for the further development of powerful technologies for comprehensive WM in the following decade.
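As a rough illustration of the kind of first-principles rinse-tank dynamics referred to above, the sketch below integrates a single well-mixed rinse-tank mass balance in which contaminant enters with drag-out and leaves with the rinse-water overflow. The parameter values are hypothetical and this is not the dissertation's model.

```python
# Minimal sketch of a first-principles rinse-tank mass balance: contaminant
# enters with drag-out from the plating bath and leaves with the rinse-water
# overflow. Parameters below are hypothetical.
import numpy as np

V = 200.0        # rinse tank volume, L
Q = 8.0          # rinse water flow rate, L/min
D = 0.05         # drag-out rate carried in by parts, L/min
C_bath = 120.0   # contaminant concentration in the plating bath, g/L

def rinse_concentration(t_end=240.0, dt=0.1, c0=0.0):
    """Integrate dC/dt = (D*C_bath - Q*C)/V with explicit Euler steps."""
    times = np.arange(0.0, t_end + dt, dt)
    c = np.empty_like(times)
    c[0] = c0
    for i in range(1, len(times)):
        c[i] = c[i-1] + dt * (D * C_bath - Q * c[i-1]) / V
    return times, c

t, c = rinse_concentration()
print(f"steady-state estimate: {D*C_bath/Q:.3f} g/L, simulated end value: {c[-1]:.3f} g/L")
```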
Raso, Cinzia; Cosentino, Carlo; Gaspari, Marco; Malara, Natalia; Han, Xuemei; McClatchy, Daniel; Park, Sung Kyu; Renne, Maria; Vadalà, Nuria; Prati, Ubaldo; Cuda, Giovanni; Mollace, Vincenzo; Amato, Francesco; Yates, John R.
2012-01-01
Cancer is currently considered the end point of numerous genomic and epigenomic mutations and the result of the interaction of transformed cells within the stromal microenvironment. The present work focuses on breast cancer, one of the most common malignancies affecting the female population in industrialized countries. In this study we perform a proteomic analysis of bioptic samples from human breast cancer, namely interstitial fluids and primary cells from normal vs. diseased tissues, using Tandem Mass Tags (TMT) quantitative mass spectrometry combined with the MudPIT technique. To the best of our knowledge, this work, with over 1700 proteins identified, represents the most comprehensive characterization of the breast cancer interstitial fluid proteome to date. Network analysis was used to identify functionally active networks in the breast cancer associated samples. From the list of differentially expressed genes we have retrieved the associated functional interaction networks. Many different signaling pathways were found to be activated, strongly linked to invasion, metastasis development and proliferation, and with significant cross-talk. This pilot study presents evidence that the proposed quantitative proteomic approach can be applied to discriminate between normal and tumoral samples and for the discovery of yet unknown carcinogenesis mechanisms and therapeutic strategies. PMID:22563702
Panizza, Elena; Branca, Rui M M; Oliviusson, Peter; Orre, Lukas M; Lehtiö, Janne
2017-07-03
Protein phosphorylation is involved in the regulation of most eukaryotic cell functions, and mass spectrometry-based analysis has made major contributions to our understanding of this regulation. However, the low abundance of phosphorylated species presents a major challenge in achieving comprehensive phosphoproteome coverage and robust quantification. In this study, we developed a workflow employing titanium dioxide phospho-enrichment coupled with isobaric labeling by Tandem Mass Tags (TMT) and high-resolution isoelectric focusing (HiRIEF) fractionation to perform in-depth quantitative phosphoproteomics starting with a low sample quantity. To benchmark the workflow, we analyzed HeLa cells upon pervanadate treatment or cell cycle arrest in mitosis. Analyzing 300 µg of peptides per sample, we identified 22,712 phosphorylation sites, of which 19,075 were localized with high confidence and 1,203 are phosphorylated tyrosine residues, representing 6.3% of all detected phospho-sites. HiRIEF fractions with the most acidic isoelectric points are enriched in multiply phosphorylated peptides, which represent 18% of all the phospho-peptides detected in the pH range 2.5-3.7. Cross-referencing with the PhosphoSitePlus database reveals 1,264 phosphorylation sites that have not been previously reported, and kinase association analysis suggests that a subset of these may be functional during the mitotic phase.
[Content determination of twelve major components in Tibetan medicine Zuozhu Daxi by UPLC].
Qu, Yan; Li, Jin-hua; Zhang, Chen; Li, Chun-xue; Dong, Hong-jiao; Wang, Chang-sheng; Zeng, Rui; Chen, Xiao-hu
2015-05-01
A quantitative analytical method based on ultra-high performance liquid chromatography (UPLC) was developed for simultaneously determining twelve components in the Tibetan medicine Zuozhu Daxi. SIMPCA 12.0 software was used for principal component analysis (PCA) and partial least squares discriminant analysis (PLS-DA) of the twelve components in 10 batches from four pharmaceutical factories. An Acquity UPLC BEH C18 column (2.1 mm x 100 mm, 1.7 µm) was adopted at a column temperature of 35 °C and eluted with acetonitrile (A)-0.05% phosphoric acid solution (B) as the mobile phase at a flow rate of 0.3 mL·min(-1). The injection volume was 1 µL. The detection wavelengths were set at 210 nm for alantolactone, isoalantolactone and oleanolic acid; 260 nm for strychnine and brucine; 288 nm for protopine; 306 nm for protopine, resveratrol and piperine; and 370 nm for quercetin and isorhamnetin. The results showed good separation among the index components, with good linearity (R2 = 0.9996) within the selected concentration range. The average sample recovery rates ranged from 99.44% to 101.8%, with RSD between 0.37% and 1.7%, indicating that the method is rapid and accurate with good repeatability and stability. The PCA and PLS-DA analysis of the sample determination results revealed great differences among samples from different pharmaceutical factories. The twelve components included in this study contributed significantly to the quantitative determination of the intrinsic quality of Zuozhu Daxi. The UPLC method established for the quantitative determination of the twelve components can provide a scientific basis for the comprehensive quality evaluation of Zuozhu Daxi.
Respiratory trace feature analysis for the prediction of respiratory-gated PET quantification.
Wang, Shouyi; Bowen, Stephen R; Chaovalitwongse, W Art; Sandison, George A; Grabowski, Thomas J; Kinahan, Paul E
2014-02-21
The benefits of respiratory gating in quantitative PET/CT vary tremendously between individual patients. Respiratory pattern is among many patient-specific characteristics that are thought to play an important role in gating-induced imaging improvements. However, the quantitative relationship between patient-specific characteristics of respiratory pattern and improvements in quantitative accuracy from respiratory-gated PET/CT has not been well established. If such a relationship could be estimated, then patient-specific respiratory patterns could be used to prospectively select appropriate motion compensation during image acquisition on a per-patient basis. This study was undertaken to develop a novel statistical model that predicts quantitative changes in PET/CT imaging due to respiratory gating. Free-breathing static FDG-PET images without gating and respiratory-gated FDG-PET images were collected from 22 lung and liver cancer patients on a PET/CT scanner. PET imaging quality was quantified with peak standardized uptake value (SUV(peak)) over lesions of interest. Relative differences in SUV(peak) between static and gated PET images were calculated to indicate quantitative imaging changes due to gating. A comprehensive multidimensional extraction of the morphological and statistical characteristics of respiratory patterns was conducted, resulting in 16 features that characterize representative patterns of a single respiratory trace. The six most informative features were subsequently extracted using a stepwise feature selection approach. The multiple-regression model was trained and tested based on a leave-one-subject-out cross-validation. The predicted quantitative improvements in PET imaging achieved an accuracy higher than 90% using a criterion with a dynamic error-tolerance range for SUV(peak) values. The results of this study suggest that our prediction framework could be applied to determine which patients would likely benefit from respiratory motion compensation when clinicians quantitatively assess PET/CT for therapy target definition and response assessment.
Respiratory trace feature analysis for the prediction of respiratory-gated PET quantification
NASA Astrophysics Data System (ADS)
Wang, Shouyi; Bowen, Stephen R.; Chaovalitwongse, W. Art; Sandison, George A.; Grabowski, Thomas J.; Kinahan, Paul E.
2014-02-01
The benefits of respiratory gating in quantitative PET/CT vary tremendously between individual patients. Respiratory pattern is among many patient-specific characteristics that are thought to play an important role in gating-induced imaging improvements. However, the quantitative relationship between patient-specific characteristics of respiratory pattern and improvements in quantitative accuracy from respiratory-gated PET/CT has not been well established. If such a relationship could be estimated, then patient-specific respiratory patterns could be used to prospectively select appropriate motion compensation during image acquisition on a per-patient basis. This study was undertaken to develop a novel statistical model that predicts quantitative changes in PET/CT imaging due to respiratory gating. Free-breathing static FDG-PET images without gating and respiratory-gated FDG-PET images were collected from 22 lung and liver cancer patients on a PET/CT scanner. PET imaging quality was quantified with peak standardized uptake value (SUVpeak) over lesions of interest. Relative differences in SUVpeak between static and gated PET images were calculated to indicate quantitative imaging changes due to gating. A comprehensive multidimensional extraction of the morphological and statistical characteristics of respiratory patterns was conducted, resulting in 16 features that characterize representative patterns of a single respiratory trace. The six most informative features were subsequently extracted using a stepwise feature selection approach. The multiple-regression model was trained and tested based on a leave-one-subject-out cross-validation. The predicted quantitative improvements in PET imaging achieved an accuracy higher than 90% using a criterion with a dynamic error-tolerance range for SUVpeak values. The results of this study suggest that our prediction framework could be applied to determine which patients would likely benefit from respiratory motion compensation when clinicians quantitatively assess PET/CT for therapy target definition and response assessment.
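The two quantitative steps described in the study above can be illustrated in a short sketch: (1) the relative SUVpeak difference between gated and static PET used as the prediction target, and (2) leave-one-subject-out evaluation of a multiple-regression model built on respiratory-trace features. The data below are random placeholders and the feature set is hypothetical, not the authors' feature definitions.

```python
# Sketch of (1) the relative SUVpeak change due to gating and (2) leave-one-
# subject-out evaluation of a multiple-regression predictor built on
# respiratory-trace features. All data are random placeholders.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut

rng = np.random.default_rng(0)
n_patients, n_features = 22, 6          # six selected trace features per patient
X = rng.normal(size=(n_patients, n_features))
suv_static = rng.uniform(3.0, 12.0, n_patients)
suv_gated = suv_static * (1.0 + rng.uniform(0.0, 0.3, n_patients))

# Relative SUVpeak change due to gating (the quantity to be predicted)
y = (suv_gated - suv_static) / suv_static

predictions = np.empty(n_patients)
for train_idx, test_idx in LeaveOneOut().split(X):
    model = LinearRegression().fit(X[train_idx], y[train_idx])
    predictions[test_idx] = model.predict(X[test_idx])

tolerance = 0.05                         # illustrative error-tolerance band
accuracy = np.mean(np.abs(predictions - y) <= tolerance)
print(f"within-tolerance prediction rate: {accuracy:.2f}")
```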
Comprehensive two-dimensional gas chromatography for the analysis of Fischer-Tropsch oil products.
van der Westhuizen, Rina; Crous, Renier; de Villiers, André; Sandra, Pat
2010-12-24
The Fischer-Tropsch (FT) process involves a series of catalysed reactions of carbon monoxide and hydrogen, originating from coal, natural gas or biomass, leading to a variety of synthetic chemicals and fuels. The benefits of comprehensive two-dimensional gas chromatography (GC×GC) compared to one-dimensional GC (1D-GC) for the detailed investigation of the oil products of low and high temperature FT processes are presented. GC×GC provides more accurate quantitative data to construct Anderson-Schulz-Flory (ASF) selectivity models that correlate the FT product distribution with reaction variables. On the other hand, the high peak capacity and sensitivity of GC×GC allow the detailed study of components present at trace level. Analyses of the aromatic and oxygenated fractions of a high temperature FT (HT-FT) process are presented. GC×GC data have been used to optimise or tune the HT-FT process by using a lab-scale micro-FT-reactor. Copyright © 2010 Elsevier B.V. All rights reserved.
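The ASF selectivity model mentioned above relates the product fraction at carbon number n to the chain-growth probability alpha, W_n/n being proportional to alpha^(n-1), so ln(W_n/n) is linear in n with slope ln(alpha). The sketch below fits alpha from a synthetic product distribution; it is only an illustration of the model, not the authors' analysis.

```python
# Sketch of an Anderson-Schulz-Flory (ASF) fit of the kind GC×GC product
# distributions feed into: ln(W_n/n) is linear in carbon number n with slope
# ln(alpha), where alpha is the chain-growth probability. Data are synthetic.
import numpy as np

alpha_true = 0.85
n = np.arange(5, 26)                       # carbon numbers covered by the oil fraction
w = n * (1 - alpha_true) ** 2 * alpha_true ** (n - 1)      # weight fractions
w *= 1.0 + np.random.default_rng(1).normal(0.0, 0.02, n.size)  # measurement noise

slope, intercept = np.polyfit(n, np.log(w / n), 1)
print(f"estimated chain-growth probability alpha = {np.exp(slope):.3f}")
```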
Al Feteisi, Hajar; Achour, Brahim; Rostami-Hodjegan, Amin; Barber, Jill
2015-01-01
Drug-metabolizing enzymes and transporters play an important role in drug absorption, distribution, metabolism and excretion and, consequently, they influence drug efficacy and toxicity. Quantification of drug-metabolizing enzymes and transporters in various tissues is therefore essential for comprehensive elucidation of drug absorption, distribution, metabolism and excretion. Recent advances in liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS) have improved the quantification of pharmacologically relevant proteins. This report presents an overview of mass spectrometry-based methods currently used for the quantification of drug-metabolizing enzymes and drug transporters, mainly focusing on applications and cost associated with various quantitative strategies based on stable isotope-labeled standards (absolute quantification peptide standards, quantification concatemers, protein standards for absolute quantification) and label-free analysis. In mass spectrometry, there is no simple relationship between signal intensity and analyte concentration. Proteomic strategies are therefore complex and several factors need to be considered when selecting the most appropriate method for an intended application, including the number of proteins and samples. Quantitative strategies require appropriate mass spectrometry platforms, yet choice is often limited by the availability of appropriate instrumentation. Quantitative proteomics research requires specialist practical skills and there is a pressing need to dedicate more effort and investment to training personnel in this area. Large-scale multicenter collaborations are also needed to standardize quantitative strategies in order to improve physiologically based pharmacokinetic models.
Bax, Leon; Yu, Ly-Mee; Ikeda, Noriaki; Tsuruta, Harukazu; Moons, Karel GM
2006-01-01
Background Meta-analysis has become a well-known method for synthesis of quantitative data from previously conducted research in applied health sciences. So far, meta-analysis has been particularly useful in evaluating and comparing therapies and in assessing causes of disease. Consequently, the number of software packages that can perform meta-analysis has increased over the years. Unfortunately, it can take a substantial amount of time to get acquainted with some of these programs and most contain little or no interactive educational material. We set out to create and validate an easy-to-use and comprehensive meta-analysis package that would be simple enough programming-wise to remain available as a free download. We specifically aimed at students and researchers who are new to meta-analysis, with important parts of the development oriented towards creating internal interactive tutoring tools and designing features that would facilitate usage of the software as a companion to existing books on meta-analysis. Results We took an unconventional approach and created a program that uses Excel as a calculation and programming platform. The main programming language was Visual Basic, as implemented in Visual Basic 6 and Visual Basic for Applications in Excel 2000 and higher. The development took approximately two years and resulted in the 'MIX' program, which can be downloaded from the program's website free of charge. Next, we set out to validate the MIX output with two major software packages as reference standards, namely STATA (metan, metabias, and metatrim) and Comprehensive Meta-Analysis Version 2. Eight meta-analyses that had been published in major journals were used as data sources. All numerical and graphical results from analyses with MIX were identical to their counterparts in STATA and CMA. The MIX program distinguishes itself from most other programs by the extensive graphical output, the click-and-go (Excel) interface, and the educational features. Conclusion The MIX program is a valid tool for performing meta-analysis and may be particularly useful in educational environments. It can be downloaded free of charge from the program's website. PMID:17038197
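One of the core calculations such packages perform is inverse-variance pooling of study effect sizes. The sketch below shows a fixed-effect pooled estimate with a 95% confidence interval; the effect sizes and standard errors are illustrative, not taken from the paper, and random-effects models, forest plots, and bias tests are omitted.

```python
# Minimal fixed-effect (inverse-variance) pooling of study effect sizes,
# the basic calculation behind meta-analysis software output.
# Effect sizes and standard errors below are illustrative only.
import math

effects = [0.30, 0.12, 0.45, 0.25]      # e.g., log odds ratios per study
std_errs = [0.10, 0.15, 0.20, 0.12]

weights = [1.0 / se**2 for se in std_errs]
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))

ci_low, ci_high = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"pooled effect = {pooled:.3f} (95% CI {ci_low:.3f} to {ci_high:.3f})")
```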
Development and application of a DNA microarray-based yeast two-hybrid system
Suter, Bernhard; Fontaine, Jean-Fred; Yildirimman, Reha; Raskó, Tamás; Schaefer, Martin H.; Rasche, Axel; Porras, Pablo; Vázquez-Álvarez, Blanca M.; Russ, Jenny; Rau, Kirstin; Foulle, Raphaele; Zenkner, Martina; Saar, Kathrin; Herwig, Ralf; Andrade-Navarro, Miguel A.; Wanker, Erich E.
2013-01-01
The yeast two-hybrid (Y2H) system is the most widely applied methodology for systematic protein–protein interaction (PPI) screening and the generation of comprehensive interaction networks. We developed a novel Y2H interaction screening procedure using DNA microarrays for high-throughput quantitative PPI detection. Applying a global pooling and selection scheme to a large collection of human open reading frames, proof-of-principle Y2H interaction screens were performed for the human neurodegenerative disease proteins huntingtin and ataxin-1. Using systematic controls for unspecific Y2H results and quantitative benchmarking, we identified and scored a large number of known and novel partner proteins for both huntingtin and ataxin-1. Moreover, we show that this parallelized screening procedure and the global inspection of Y2H interaction data are uniquely suited to define specific PPI patterns and their alteration by disease-causing mutations in huntingtin and ataxin-1. This approach takes advantage of the specificity and flexibility of DNA microarrays and of the existence of solid-related statistical methods for the analysis of DNA microarray data, and allows a quantitative approach toward interaction screens in human and in model organisms. PMID:23275563
Quantitative profiling of sphingolipids in wild Cordyceps and its mycelia by using UHPLC-MS
Mi, Jia-Ning; Wang, Jing-Rong; Jiang, Zhi-Hong
2016-01-01
In the present study, 101 sphingolipids in wild Cordyceps and its five mycelia were quantitatively profiled by using a fully validated UHPLC-MS method. The results revealed that a general rank order for the abundance of different classes of sphingolipids in wild Cordyceps and its mycelia is sphingoid bases/ceramides > phosphosphingolipids > glycosphingolipids. However, remarkable sphingolipid differences between wild Cordyceps and its mycelia were observed. One is that sphingoid base is the dominant sphingolipid in wild Cordyceps, whereas ceramide is the major sphingolipid in mycelia. Another difference is that the abundance of sphingomyelins in wild Cordyceps is almost 10-fold higher than that in most mycelia. The third one is that mycelia contain more inositol phosphorylceramides and glycosphingolipids than wild Cordyceps. Multivariate analysis was further employed to visualize the differences among wild Cordyceps and the different mycelia, leading to the identification of respective sphingolipids as potential chemical markers for the differentiation of wild Cordyceps and its related mycelia. This study represents the first report on the quantitative profiling of sphingolipids in wild Cordyceps and its related mycelia, which provided comprehensive chemical evidence for the quality control and rational utilization of wild Cordyceps and its mycelia. PMID:26868933
Systems microscopy: an emerging strategy for the life sciences.
Lock, John G; Strömblad, Staffan
2010-05-01
Dynamic cellular processes occurring in time and space are fundamental to all physiology and disease. To understand complex and dynamic cellular processes therefore demands the capacity to record and integrate quantitative multiparametric data from the four spatiotemporal dimensions within which living cells self-organize, and to subsequently use these data for the mathematical modeling of cellular systems. To this end, a raft of complementary developments in automated fluorescence microscopy, cell microarray platforms, quantitative image analysis and data mining, combined with multivariate statistics and computational modeling, now coalesce to produce a new research strategy, "systems microscopy", which facilitates systems biology analyses of living cells. Systems microscopy provides the crucial capacities to simultaneously extract and interrogate multiparametric quantitative data at resolution levels ranging from the molecular to the cellular, thereby elucidating a more comprehensive and richly integrated understanding of complex and dynamic cellular systems. The unique capacities of systems microscopy suggest that it will become a vital cornerstone of systems biology, and here we describe the current status and future prospects of this emerging field, as well as outlining some of the key challenges that remain to be overcome. Copyright 2010 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Bilen, Kadir; Ercan, Orhan
2016-01-01
The current study aimed to identify the views of students from the Faculty of Education and the Faculty of Theology from the University of Kahramanmaras Sutcu Imam regarding their comprehension and acceptance of the theory of evolution. A survey model was used involving a quantitative research design. The working group of the study was composed of…
ERIC Educational Resources Information Center
Saltiel, Henry
2011-01-01
A quasi-experiment using quantitative methods was conducted to examine the effects on academic student outcomes when a cohort of employed low-SES community college commuter students (the treatment group, N=198) participated in a comprehensive support and access intervention program, compared with similar students (the matched comparison group,…
Theory of sampling: four critical success factors before analysis.
Wagner, Claas; Esbensen, Kim H
2015-01-01
Food and feed materials characterization, risk assessment, and safety evaluations can only be ensured if QC measures are based on valid analytical data, stemming from representative samples. The Theory of Sampling (TOS) is the only comprehensive theoretical framework that fully defines all requirements to ensure sampling correctness and representativity, and to provide the guiding principles for sampling in practice. TOS also defines the concept of material heterogeneity and its impact on the sampling process, including the effects from all potential sampling errors. TOS's primary task is to eliminate bias-generating errors and to minimize sampling variability. Quantitative measures are provided to characterize material heterogeneity, on which an optimal sampling strategy should be based. Four critical success factors preceding analysis to ensure a representative sampling process are presented here.
Shen, Xiaomeng; Hu, Qiang; Li, Jun; Wang, Jianmin; Qu, Jun
2015-10-02
Comprehensive and accurate evaluation of data quality and false-positive biomarker discovery is critical to direct the method development/optimization for quantitative proteomics, which nonetheless remains challenging largely due to the high complexity and unique features of proteomic data. Here we describe an experimental null (EN) method to address this need. Because the method experimentally measures the null distribution (either technical or biological replicates) using the same proteomic samples, the same procedures and the same batch as the case-vs-control experiment, it correctly reflects the collective effects of technical variability (e.g., variation/bias in sample preparation, LC-MS analysis, and data processing) and project-specific features (e.g., characteristics of the proteome and biological variation) on the performances of quantitative analysis. As a proof of concept, we employed the EN method to assess the quantitative accuracy and precision and the ability to quantify subtle ratio changes between groups using different experimental and data-processing approaches and in various cellular and tissue proteomes. It was found that choices of quantitative features, sample size, experimental design, data-processing strategies, and quality of chromatographic separation can profoundly affect quantitative precision and accuracy of label-free quantification. The EN method was also demonstrated as a practical tool to determine the optimal experimental parameters and rational ratio cutoff for reliable protein quantification in specific proteomic experiments, for example, to identify the necessary number of technical/biological replicates per group that affords sufficient power for discovery. Furthermore, we assessed the ability of the EN method to estimate levels of false-positives in the discovery of altered proteins, using two concocted sample sets mimicking proteomic profiling using technical and biological replicates, respectively, where the true-positives/negatives are known and span a wide concentration range. It was observed that the EN method correctly reflects the null distribution in a proteomic system and accurately measures the false altered-protein discovery rate (FADR). In summary, the EN method provides a straightforward, practical, and accurate alternative to statistics-based approaches for the development and evaluation of proteomic experiments and can be universally adapted to various types of quantitative techniques.
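The core idea of the experimental-null approach can be sketched as follows: log2 ratios measured between replicates of the same sample form an empirical null distribution, from whose tails a ratio cutoff can be taken and the expected number of false "altered" calls estimated. The numbers below are synthetic and this is only an illustration of the concept, not the authors' implementation.

```python
# Sketch of the experimental-null idea: replicate-vs-replicate log2 ratios
# define the null; cutoffs from its tails are applied to the case-vs-control
# comparison, and the null itself indicates how many "altered" proteins would
# be expected by chance. All numbers are synthetic.
import numpy as np

rng = np.random.default_rng(42)
null_log2 = rng.normal(0.0, 0.25, 3000)        # replicate-vs-replicate ratios
case_log2 = np.concatenate([rng.normal(0.0, 0.25, 2800),   # unchanged proteins
                            rng.normal(1.0, 0.25, 200)])   # truly altered proteins

# Cutoffs at the 2.5th/97.5th percentiles of the experimental null
lo, hi = np.percentile(null_log2, [2.5, 97.5])
called = (case_log2 < lo) | (case_log2 > hi)
expected_false = np.mean((null_log2 < lo) | (null_log2 > hi)) * case_log2.size

print(f"cutoffs: {lo:.2f} / {hi:.2f} log2 units")
print(f"proteins called altered: {called.sum()}, "
      f"expected false calls from null: {expected_false:.0f}")
```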
Ma, Kevin C; Fernandez, James R; Amezcua, Lilyana; Lerner, Alex; Shiroishi, Mark S; Liu, Brent J
2015-12-01
MRI has been used to identify multiple sclerosis (MS) lesions in brain and spinal cord visually. Integrating patient information into an electronic patient record system has become key for modern patient care in medicine in recent years. Clinically, it is also necessary to track patients' progress in longitudinal studies, in order to provide comprehensive understanding of disease progression and response to treatment. As the amount of required data increases, there exists a need for an efficient systematic solution to store and analyze MS patient data, disease profiles, and disease tracking for both clinical and research purposes. An imaging informatics based system, called MS eFolder, has been developed as an integrated patient record system for data storage and analysis of MS patients. The eFolder system, with a DICOM-based database, includes a module for lesion contouring by radiologists, an MS lesion quantification tool to quantify MS lesion volume in 3D, brain parenchyma fraction analysis, and quantitative analysis and tracking of volume changes in longitudinal studies. Patient data, including MR images, have been collected retrospectively at the University of Southern California Medical Center (USC) and Los Angeles County Hospital (LAC). The MS eFolder utilizes web-based components, such as a browser-based graphical user interface (GUI) and a web-based database. The eFolder database stores patient clinical data (demographics, MS disease history, family history, etc.), MR imaging-related data found in DICOM headers, and lesion quantification results. Lesion quantification results are derived from radiologists' contours on brain MRI studies and quantified into 3-dimensional volumes and locations. Quantified results of white matter lesions are integrated into a structured report based on DICOM-SR protocol and templates. The user interface displays patient clinical information, original MR images, and structured reports of quantified results. The GUI also includes a data mining tool to handle unique search queries for MS. System workflow and dataflow steps have been designed based on the IHE post-processing workflow profile, including workflow process tracking, MS lesion contouring and quantification of MR images at a post-processing workstation, and storage of quantitative results as DICOM-SR in a DICOM-based storage system. The web-based GUI is designed to display zero-footprint DICOM web-accessible data objects (WADO) and the SR objects. The MS eFolder system has been designed and developed as an integrated data storage and mining solution in both clinical and research environments, while providing unique features, such as quantitative lesion analysis and disease tracking over a longitudinal study. The comprehensive integrated image and clinical data database of MS eFolder provides a platform for treatment assessment, outcomes analysis and decision-support. The proposed system serves as a platform for future quantitative analysis derived automatically from CAD algorithms that can also be integrated within the system for individual disease tracking and future MS-related research. Ultimately, the eFolder provides a decision-support infrastructure that can eventually be used as add-on value to the overall electronic medical record. Copyright © 2015 Elsevier Ltd. All rights reserved.
Ma, Kevin C.; Fernandez, James R.; Amezcua, Lilyana; Lerner, Alex; Shiroishi, Mark S.; Liu, Brent J.
2016-01-01
Purpose MRI has been used to identify multiple sclerosis (MS) lesions in brain and spinal cord visually. Integrating patient information into an electronic patient record system has become key for modern patient care in medicine in recent years. Clinically, it is also necessary to track patients' progress in longitudinal studies, in order to provide comprehensive understanding of disease progression and response to treatment. As the amount of required data increases, there exists a need for an efficient systematic solution to store and analyze MS patient data, disease profiles, and disease tracking for both clinical and research purposes. Method An imaging informatics based system, called MS eFolder, has been developed as an integrated patient record system for data storage and analysis of MS patients. The eFolder system, with a DICOM-based database, includes a module for lesion contouring by radiologists, an MS lesion quantification tool to quantify MS lesion volume in 3D, brain parenchyma fraction analysis, and quantitative analysis and tracking of volume changes in longitudinal studies. Patient data, including MR images, have been collected retrospectively at the University of Southern California Medical Center (USC) and Los Angeles County Hospital (LAC). The MS eFolder utilizes web-based components, such as a browser-based graphical user interface (GUI) and a web-based database. The eFolder database stores patient clinical data (demographics, MS disease history, family history, etc.), MR imaging-related data found in DICOM headers, and lesion quantification results. Lesion quantification results are derived from radiologists' contours on brain MRI studies and quantified into 3-dimensional volumes and locations. Quantified results of white matter lesions are integrated into a structured report based on DICOM-SR protocol and templates. The user interface displays patient clinical information, original MR images, and structured reports of quantified results. The GUI also includes a data mining tool to handle unique search queries for MS. System workflow and dataflow steps have been designed based on the IHE post-processing workflow profile, including workflow process tracking, MS lesion contouring and quantification of MR images at a post-processing workstation, and storage of quantitative results as DICOM-SR in a DICOM-based storage system. The web-based GUI is designed to display zero-footprint DICOM web-accessible data objects (WADO) and the SR objects. Summary The MS eFolder system has been designed and developed as an integrated data storage and mining solution in both clinical and research environments, while providing unique features, such as quantitative lesion analysis and disease tracking over a longitudinal study. The comprehensive integrated image and clinical data database of MS eFolder provides a platform for treatment assessment, outcomes analysis and decision-support. The proposed system serves as a platform for future quantitative analysis derived automatically from CAD algorithms that can also be integrated within the system for individual disease tracking and future MS-related research. Ultimately, the eFolder provides a decision-support infrastructure that can eventually be used as add-on value to the overall electronic medical record. PMID:26564667
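The 3D lesion quantification step described above essentially reduces to counting voxels inside the radiologist-contoured mask and multiplying by the voxel volume derived from the DICOM pixel spacing and slice thickness. The sketch below illustrates that arithmetic with a synthetic mask; it is not the eFolder code.

```python
# Sketch of 3D lesion volume quantification from a radiologist-contoured binary
# mask: count lesion voxels and multiply by the voxel volume derived from DICOM
# pixel spacing and slice thickness. The mask below is synthetic.
import numpy as np

pixel_spacing_mm = (0.9, 0.9)      # in-plane spacing from the DICOM header
slice_thickness_mm = 3.0

mask = np.zeros((40, 256, 256), dtype=bool)      # slices x rows x cols
mask[18:22, 100:112, 90:104] = True              # a toy lesion

voxel_volume_mm3 = pixel_spacing_mm[0] * pixel_spacing_mm[1] * slice_thickness_mm
lesion_volume_ml = mask.sum() * voxel_volume_mm3 / 1000.0
print(f"lesion volume: {lesion_volume_ml:.2f} mL")
```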
Landi, Nicole; Frost, Stephen J.; Mencl, W. Einar; Sandak, Rebecca; Pugh, Kenneth R.
2012-01-01
For accurate reading comprehension, readers must first learn to map letters to their corresponding speech sounds and meaning and then they must string the meanings of many words together to form a representation of the text. Furthermore, readers must master the complexities involved in parsing the relevant syntactic and pragmatic information necessary for accurate interpretation. Failure in this process can occur at multiple levels and cognitive neuroscience has been helpful in identifying the underlying causes of success and failure in reading single words and in reading comprehension. In general, neurobiological studies of skilled reading comprehension indicate a highly overlapping language circuit for single word reading, reading comprehension and listening comprehension with largely quantitative differences in a number of reading and language related areas. This paper reviews relevant research from studies employing neuroimaging techniques to study reading with a focus on the relationship between reading skill, single word reading, and text comprehension. PMID:23662034
Haghi, Ghasem; Arshi, Rohollah; Safaei, Alireza
2008-02-27
A high-performance liquid chromatography (HPLC) method for the qualitative and quantitative analysis of allantoin in silk and seed of Zea mays has been developed. Allantoin separation in crude extract was achieved using a C18 column and phosphate buffer solution (pH 3.0) as the mobile phase at ambient temperature, at a flow rate of 1.0 mL/min, with detection at 210 nm. The results showed that the amount of allantoin in samples was between 14 and 271 mg/100 g of dry plant material. A comprehensive validation of the method including sensitivity, linearity, repeatability, and recovery was conducted. The calibration curve was linear over the range of 0.2-200 microg/mL with a correlation coefficient of r2 > 0.999. Limit of detection (LOD, S/N=3) and limit of quantification (LOQ) values for allantoin were 0.05 and 0.2 microg/mL (1.0 and 4.0 ng), respectively. The relative standard deviation (RSD) for repeatability was within 1.2%. The average recovery of allantoin added to samples was 100.6% with an RSD of 1.5%.
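Figures like the linearity, back-calculated content, and spike recovery reported above come from standard external-calibration arithmetic. The sketch below fits a calibration line, back-calculates an unknown, and computes recovery; the standard concentrations, peak areas, and spike values are made up for illustration.

```python
# Sketch of external-calibration arithmetic: fit peak area vs. concentration,
# back-calculate an unknown, and compute spike recovery. Values are illustrative.
import numpy as np

conc = np.array([0.2, 1, 5, 25, 100, 200])                  # µg/mL standards
area = np.array([3.1, 15.4, 76.0, 381.0, 1523.0, 3040.0])   # detector response

slope, intercept = np.polyfit(conc, area, 1)
r = np.corrcoef(conc, area)[0, 1]
print(f"calibration: area = {slope:.2f}*conc + {intercept:.2f}, r^2 = {r**2:.4f}")

sample_area = 412.0
sample_conc = (sample_area - intercept) / slope
print(f"back-calculated sample concentration: {sample_conc:.1f} µg/mL")

# Spike recovery: (found in spiked sample - found in unspiked) / amount added
spiked_found, unspiked_found, added = 131.0, 27.0, 103.0
print(f"recovery: {100 * (spiked_found - unspiked_found) / added:.1f}%")
```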
Quantitative and Qualitative Changes in V-J α Rearrangements During Mouse Thymocytes Differentiation
Pasqual, Nicolas; Gallagher, Maighréad; Aude-Garcia, Catherine; Loiodice, Mélanie; Thuderoz, Florence; Demongeot, Jacques; Ceredig, Rod; Marche, Patrice Noël; Jouvin-Marche, Evelyne
2002-01-01
Knowledge of the complete nucleotide sequence of the mouse TCRAD locus allows an accurate determination of V-J rearrangement status. Using multiplex genomic PCR assays and real-time PCR analysis, we report a comprehensive and systematic analysis of the V-J recombination of the TCR α chain in normal mouse thymocytes during development. These respective qualitative and quantitative approaches give rise to four major points describing the control of gene rearrangements. (a) The V-J recombination pattern is not random during ontogeny and generates a limited TCR α repertoire; (b) V-J rearrangement control is intrinsic to the thymus; (c) each V gene rearranges to a set of contiguous J segments with a Gaussian-like frequency; (d) there are more rearrangements involving V genes at the 3′ side than at the 5′ end of the V region. Taken together, this reflects a preferential association of V and J gene segments according to their respective positions in the locus, indicating that accessibility of both V and J regions is coordinately regulated, but in different ways. These results provide a new insight into TCR α repertoire size and suggest a scenario for V usage during differentiation. PMID:12417627
A Local Vision on Soil Hydrology (John Dalton Medal Lecture)
NASA Astrophysics Data System (ADS)
Roth, K.
2012-04-01
After briefly looking back at some research trails of the past decades, and touching on the role of soils in our environmental machinery, a vision for the future of soil hydrology is offered. It is local in the sense of being based on limited experience as well as in the sense of focussing on local spatial scales, from 1 m to 1 km. Cornerstones of this vision are (i) rapid developments of quantitative observation technology, illustrated with the example of ground-penetrating radar (GPR), and (ii) the availability of ever more powerful computing facilities, which allow increasingly complicated model representations to be simulated in unprecedented detail. Together, they open a powerful and flexible approach to the quantitative understanding of soil hydrology where two lines are fitted: (i) potentially diverse measurements of the system of interest and their analysis and (ii) a comprehensive model representation, including architecture, material properties, forcings, and potentially unknown aspects, together with the same analysis as for (i). This approach pushes traditional inversion to operate on analyses, not on the underlying state variables, and to become flexible with respect to architecture and unknown aspects. The approach will be demonstrated for simple situations at test sites.
A comprehensive review of the SLMTA literature part 2: Measuring success
Yao, Katy; Nkengasong, John N.
2014-01-01
Background Since its introduction in 2009, the Strengthening Laboratory Management Toward Accreditation (SLMTA) programme has been implemented in 617 laboratories in 47 countries. Objective We completed a systematic review of the published literature on SLMTA. The review consists of two companion papers; this article examines quantitative evidence presented in the publications along with a meta-analysis of selected results. Methods We identified 28 published articles with data from SLMTA implementation. The SLMTA programme was evaluated through audits based on a standard checklist, which is divided into 12 sections corresponding to the 12 Quality System Essentials (QSEs). Several basic service delivery indicators reported by programmes were also examined. Results for various components of the programme were reviewed and summarised; a meta-analysis of QSE results grouped by the three stages of the quality cycle was conducted for 126 laboratories in 12 countries. Results Global programme data show improved quality in SLMTA laboratories in every country, with average improvements on audit scores of 25 percentage points. Meta-analysis identified Improvement Management as the weakest stage, with internal audit (8%) and occurrence management (16%) showing the lowest scores. Studies documented 19% – 95% reductions in turn-around times, 69% – 93% reductions in specimen rejection rates, 76% – 81% increases in clinician satisfaction rates, 67% – 85% improvements in external quality assessment results, 50% – 66% decreases in nonconformities and 67% increases in staff punctuality. Conclusions The wide array of results reported provides a comprehensive picture of the SLMTA programme overall, suggesting a substantive impact on provision of quality laboratory services and patient care. These comprehensive results establish a solid data-driven foundation for program improvement and further expansion. PMID:29043201
Methods, Tools and Current Perspectives in Proteogenomics *
Ruggles, Kelly V.; Krug, Karsten; Wang, Xiaojing; Clauser, Karl R.; Wang, Jing; Payne, Samuel H.; Fenyö, David; Zhang, Bing; Mani, D. R.
2017-01-01
With combined technological advancements in high-throughput next-generation sequencing and deep mass spectrometry-based proteomics, proteogenomics, i.e. the integrative analysis of proteomic and genomic data, has emerged as a new research field. Early efforts in the field were focused on improving protein identification using sample-specific genomic and transcriptomic sequencing data. More recently, integrative analysis of quantitative measurements from genomic and proteomic studies have identified novel insights into gene expression regulation, cell signaling, and disease. Many methods and tools have been developed or adapted to enable an array of integrative proteogenomic approaches and in this article, we systematically classify published methods and tools into four major categories, (1) Sequence-centric proteogenomics; (2) Analysis of proteogenomic relationships; (3) Integrative modeling of proteogenomic data; and (4) Data sharing and visualization. We provide a comprehensive review of methods and available tools in each category and highlight their typical applications. PMID:28456751
TASI: A software tool for spatial-temporal quantification of tumor spheroid dynamics.
Hou, Yue; Konen, Jessica; Brat, Daniel J; Marcus, Adam I; Cooper, Lee A D
2018-05-08
Spheroid cultures derived from explanted cancer specimens are an increasingly utilized resource for studying complex biological processes like tumor cell invasion and metastasis, representing an important bridge between the simplicity and practicality of 2-dimensional monolayer cultures and the complexity and realism of in vivo animal models. Temporal imaging of spheroids can capture the dynamics of cell behaviors and microenvironments, and when combined with quantitative image analysis methods, enables deep interrogation of biological mechanisms. This paper presents a comprehensive open-source software framework for Temporal Analysis of Spheroid Imaging (TASI) that allows investigators to objectively characterize spheroid growth and invasion dynamics. TASI performs spatiotemporal segmentation of spheroid cultures, extraction of features describing spheroid morpho-phenotypes, mathematical modeling of spheroid dynamics, and statistical comparisons of experimental conditions. We demonstrate the utility of this tool in an analysis of non-small cell lung cancer spheroids that exhibit variability in metastatic and proliferative behaviors.
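The basic spatiotemporal measurement behind tools of this kind can be illustrated by segmenting the spheroid in each frame and tracking the area of the largest connected object over time. The sketch below uses synthetic frames and simple Otsu thresholding with scikit-image; it is an independent illustration, not the TASI code or its segmentation pipeline.

```python
# Illustration of spatiotemporal spheroid quantification: segment each frame by
# Otsu thresholding and track the area of the largest connected object over
# time. Frames here are synthetic blobs, not microscopy data.
import numpy as np
from skimage.filters import threshold_otsu, gaussian
from skimage.measure import label, regionprops

def synthetic_frame(radius, size=256, noise=0.05, seed=0):
    yy, xx = np.mgrid[:size, :size]
    img = ((yy - size/2)**2 + (xx - size/2)**2 < radius**2).astype(float)
    img = gaussian(img, sigma=3)                     # soften the edge
    return img + np.random.default_rng(seed).normal(0, noise, img.shape)

areas = []
for t, radius in enumerate([30, 36, 43, 51, 60]):    # a growing spheroid
    frame = synthetic_frame(radius, seed=t)
    mask = frame > threshold_otsu(frame)
    labels = label(mask)
    largest = max(regionprops(labels), key=lambda r: r.area)
    areas.append(largest.area)

print("spheroid area (pixels) per time point:", areas)
```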
RAD-ADAPT: Software for modelling clonogenic assay data in radiation biology.
Zhang, Yaping; Hu, Kaiqiang; Beumer, Jan H; Bakkenist, Christopher J; D'Argenio, David Z
2017-04-01
We present a comprehensive software program, RAD-ADAPT, for the quantitative analysis of clonogenic assays in radiation biology. Two commonly used models for clonogenic assay analysis, the linear-quadratic model and the single-hit multi-target model, are included in the software. RAD-ADAPT uses the maximum likelihood estimation method to obtain parameter estimates under the assumption that cell colony count data follow a Poisson distribution. The program has an intuitive interface, generates model prediction plots, tabulates model parameter estimates, and allows automatic statistical comparison of parameters between different groups. The RAD-ADAPT interface is written using the statistical software R and the underlying computations are accomplished by the ADAPT software system for pharmacokinetic/pharmacodynamic systems analysis. The use of RAD-ADAPT is demonstrated using an example that examines the impact of pharmacologic ATM and ATR kinase inhibition on the human lung cancer cell line A549 after ionizing radiation. Copyright © 2017 Elsevier B.V. All rights reserved.
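The statistical core described above, linear-quadratic survival S(D) = exp(-(alpha*D + beta*D^2)) with Poisson-distributed colony counts, can be fitted by maximizing the Poisson log-likelihood. The sketch below is an independent illustration with synthetic dose/count data (RAD-ADAPT itself is built on R and ADAPT), with plating efficiency and the LQ parameters as hypothetical fit parameters.

```python
# Sketch of a linear-quadratic clonogenic-assay fit by Poisson maximum
# likelihood: expected colonies = plated * PE * exp(-(alpha*D + beta*D^2)).
# Doses and counts below are synthetic.
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

doses = np.array([0, 0, 2, 2, 4, 4, 6, 6, 8, 8], dtype=float)   # Gy
plated = np.array([200, 200, 400, 400, 1000, 1000, 4000, 4000, 10000, 10000])
counts = np.array([112, 105, 118, 126, 121, 110, 140, 133, 95, 102])

def neg_log_likelihood(params):
    pe, alpha, beta = params              # plating efficiency, LQ parameters
    mu = plated * pe * np.exp(-(alpha * doses + beta * doses**2))
    return -np.sum(counts * np.log(mu) - mu - gammaln(counts + 1))

fit = minimize(neg_log_likelihood, x0=[0.5, 0.3, 0.03],
               bounds=[(1e-4, 1.0), (0.0, 3.0), (0.0, 1.0)], method="L-BFGS-B")
pe, alpha, beta = fit.x
print(f"PE = {pe:.3f}, alpha = {alpha:.3f} Gy^-1, beta = {beta:.4f} Gy^-2")
```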
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gentry, T.; Schadt, C.; Zhou, J.
Microarray technology has the unparalleled potential to simultaneously determine the dynamics and/or activities of most, if not all, of the microbial populations in complex environments such as soils and sediments. Researchers have developed several types of arrays that characterize the microbial populations in these samples based on their phylogenetic relatedness or functional genomic content. Several recent studies have used these microarrays to investigate ecological issues; however, most have only analyzed a limited number of samples with relatively few experiments utilizing the full high-throughput potential of microarray analysis. This is due in part to the unique analytical challenges that these samples present with regard to sensitivity, specificity, quantitation, and data analysis. This review discusses specific applications of microarrays to microbial ecology research along with some of the latest studies addressing the difficulties encountered during analysis of complex microbial communities within environmental samples. With continued development, microarray technology may ultimately achieve its potential for comprehensive, high-throughput characterization of microbial populations in near real-time.
Evaluating comprehensiveness in children's healthcare.
Diniz, Suênia Gonçalves de Medeiros; Damasceno, Simone Soares; Coutinho, Simone Elizabeth Duarte; Toso, Beatriz Rosana Gonçalves de Oliveira; Collet, Neusa
2016-12-15
To evaluate the presence and extent of comprehensiveness in children's healthcare in the context of the Family Health Strategy. Evaluative, quantitative, cross-sectional study conducted with 344 family members of children at the Family Health Units of João Pessoa, PB, Brazil. Data were collected using the PCATool Brazil - child version and analysed according to descriptive and exploratory statistics. The attribute of comprehensiveness did not obtain satisfactory scores in the two evaluated dimensions, namely "available services" and "provided services". The low scores reveal that the attribute comprehensiveness is not employed as expected in a primary care unit and point to issues that must be addressed. It was concluded that the services should be restructured to ensure cross-sector performance in the provision of child care. It is also important to improve the relations between professionals and users to promote comprehensive and effective care.
Qiao, Xue; He, Wen-ni; Xiang, Cheng; Han, Jian; Wu, Li-jun; Guo, De-an; Ye, Min
2011-01-01
Spirodela polyrrhiza (L.) Schleid. is a traditional Chinese herbal medicine for the treatment of influenza. Despite its wide use in Chinese medicine, no report on quality control of this herb is available so far. To establish qualitative and quantitative analytical methods by high-performance liquid chromatography (HPLC) coupled with mass spectrometry (MS) for the quality control of S. polyrrhiza. The methanol extract of S. polyrrhiza was analysed by HPLC/ESI-MS(n). Flavonoids were identified by comparing with reference standards or according to their MS(n) (n = 2-4) fragmentation behaviours. Based on LC/MS data, a standardised HPLC fingerprint was established by analysing 15 batches of commercial herbal samples. Furthermore, quantitative analysis was conducted by determining five major flavonoids, namely luteolin 8-C-glucoside, apigenin 8-C-glucoside, luteolin 7-O-glucoside, apigenin 7-O-glucoside and luteolin. A total of 18 flavonoids were identified by LC/MS, and 14 of them were reported from this herb for the first time. The HPLC fingerprints contained 10 common peaks, and could differentiate good quality batches from counterfeits. The total contents of five major flavonoids in S. polyrrhiza varied significantly from 4.28 to 19.87 mg/g. Qualitative LC/MS and quantitative HPLC analytical methods were established for the comprehensive quality control of S. polyrrhiza. Copyright © 2011 John Wiley & Sons, Ltd.
Hempel, Kristina; Herbst, Florian-Alexander; Moche, Martin; Hecker, Michael; Becher, Dörte
2011-04-01
Staphylococcus aureus is capable of colonizing and infecting humans by its arsenal of surface-exposed and secreted proteins. Iron-limited conditions in mammalian body fluids serve as a major environmental signal to bacteria to express virulence determinants. Here we present a comprehensive, gel-free, and GeLC-MS/MS-based quantitative proteome profiling of S. aureus under this infection-relevant situation. (14)N/(15)N metabolic labeling and three complementary approaches were combined for relative quantitative analyses of surface-associated proteins. The surface-exposed and secreted proteome profiling approaches comprise trypsin shaving, biotinylation, and precipitation of the supernatant. By analysis of the outer subproteomic and cytoplasmic protein fractions, 1210 proteins could be identified, including 221 surface-associated proteins. Thus, access was enabled to 70% of the predicted cell wall-associated proteins, 80% of the predicted sortase substrates, two-thirds of the lipoproteins and more than 50% of secreted and cytoplasmic proteins. Under iron deficiency, 158 surface-associated proteins were quantified. Twenty-nine proteins were found in altered amounts, with surface-exposed proteins in particular being strongly induced, such as the iron-regulated surface determinant proteins IsdA, IsdB, IsdC and IsdD as well as lipid-anchored iron compound-binding proteins. This work provides a crucial basis for understanding S. aureus pathophysiology through methods that allow quantitative surface proteome profiling.
2015-01-01
Although both rhizome and root of Panax notoginseng are officially utilized as notoginseng in “Chinese Pharmacopoeia”, individual parts of the root were differently used in practice. To provide chemical evidence for the differentiated usage, quantitative comparison and metabolite profiling of different portions derived from the whole root, as well as commercial samples, were carried out, showing an overall higher content of saponins in rhizome, followed by main root, branch root, and fibrous root. Ginsenoside Rb2 was proposed as a potential marker with a content of 0.5 mg/g as a threshold value for differentiating rhizome from other parts. Multivariate analysis of the metabolite profile further suggested 32 saponins as potential markers for the discrimination of different parts of notoginseng. Collectively, the study provided comprehensive chemical evidence for the distinct usage of different parts of notoginseng and, hence, is of great importance for the rational application and exploitation of individual parts of notoginseng. PMID:25118819
Sankar, Martial; Nieminen, Kaisa; Ragni, Laura; Xenarios, Ioannis; Hardtke, Christian S
2014-02-11
Among various advantages, their small size makes model organisms preferred subjects of investigation. Yet, even in model systems, detailed analysis of numerous developmental processes at the cellular level is severely hampered by their scale. For instance, secondary growth of Arabidopsis hypocotyls creates a radial pattern of highly specialized tissues that comprises several thousand cells, starting from a few dozen. This dynamic process is difficult to follow because of its scale and because it can only be investigated invasively, precluding comprehensive understanding of the cell proliferation, differentiation, and patterning events involved. To overcome this limitation, we established an automated quantitative histology approach. We acquired hypocotyl cross-sections from tiled high-resolution images and extracted their information content using custom high-throughput image processing and segmentation. Coupled with automated cell type recognition through machine learning, we could establish a cellular-resolution atlas that reveals vascular morphodynamics during secondary growth, for example, equidistant phloem pole formation. DOI: http://dx.doi.org/10.7554/eLife.01567.001.
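The automated quantitative histology pipeline described above couples image segmentation with supervised cell-type recognition. As a minimal illustration of the machine-learning step only (not the authors' implementation), the sketch below trains a classifier on hypothetical per-cell shape features; the feature set, labels, and choice of a random forest are assumptions made for the example.

```python
# Minimal sketch of supervised cell-type recognition from per-cell features.
# The feature set (area, eccentricity, radial position) and the random-forest
# model are illustrative assumptions, not the published pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
n_cells = 1000

# Synthetic per-cell morphology features, as if extracted from segmented cross-sections.
X = np.column_stack([
    rng.lognormal(3.0, 0.5, n_cells),   # cell area (arbitrary units)
    rng.uniform(0.0, 1.0, n_cells),     # eccentricity
    rng.uniform(0.0, 1.0, n_cells),     # normalised radial position in the organ
])
# Hypothetical labels: 0 = xylem, 1 = cambium, 2 = phloem.
y = (X[:, 2] * 3).astype(int).clip(0, 2)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```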
NASA Astrophysics Data System (ADS)
Cederman, L.-E.; Conte, R.; Helbing, D.; Nowak, A.; Schweitzer, F.; Vespignani, A.
2012-11-01
A huge flow of quantitative social, demographic and behavioral data is becoming available that traces the activities and interactions of individuals, social patterns, transportation infrastructures and travel fluxes. This, together with innovative computational techniques and methods for modeling social actions in hybrid (natural and artificial) societies, has caused a qualitative change in the ways we model socio-technical systems. For the first time, society can be studied in a comprehensive fashion that addresses social and behavioral complexity. In other words, we are in a position to envision the development of a large-scale data and computational cyberinfrastructure, an exploratory of society, that provides quantitative anticipatory, explanatory and scenario-analysis capabilities for phenomena ranging from emerging infectious diseases to conflict and crime surges. The goal of this exploratory of society is to provide the basic infrastructure embedding the framework of tools and knowledge needed for the design of forecasting, anticipatory and crisis-management approaches to socio-technical systems, supporting future decision-making procedures by accelerating the scientific cycle that runs from data generation to prediction.
Study on Quality Standard of Processed Curcuma Longa Radix
Zhao, Yongfeng; Quan, Liang; Zhou, Haiting; Cao, Dong; Li, Wenbing; Yang, Zhuo
2017-01-01
To control the quality of Curcuma Longa Radix by establishing quality standards, this work added determinations of extract content and volatile oil content. Curcumin was selected as the internal marker, and the relative correction factors (RCFs) of demethoxycurcumin and bisdemethoxycurcumin were established by high-performance liquid chromatography (HPLC). The contents of the other components were then calculated from their RCFs. The rationality and feasibility of the method were evaluated by comparing the quantitative results of the external standard method (ESM) with those of quantitative analysis of multicomponents by single marker (QAMS). Ethanol extract content ranged from 9.749% to 15.644%, with a mean value of 13.473%. Volatile oil ranged from 0.45 to 0.90 mL/100 g, with a mean value of 0.66 mL/100 g. The method was accurate and feasible and can provide a reference for further comprehensive and effective control of the quality standard of Curcuma Longa Radix and its processed products. PMID:29375640
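QAMS, as used above, quantifies several components against a single calibrated marker through relative correction factors. A minimal numerical sketch of that calculation follows; all peak areas, factor values, and the curcumin content below are invented for illustration and are not taken from the study.

```python
# Minimal sketch of QAMS (quantitative analysis of multi-components by a
# single marker). All numbers are invented placeholders.

# Relative correction factors f = (A_marker * C_analyte) / (A_analyte * C_marker),
# pre-determined from mixed reference standards (hypothetical values).
rcf = {"demethoxycurcumin": 0.92, "bisdemethoxycurcumin": 0.85}

# Measured in the sample: peak areas and the curcumin content obtained from
# its own external calibration curve (hypothetical values).
area = {"curcumin": 152_000, "demethoxycurcumin": 88_000, "bisdemethoxycurcumin": 41_000}
c_curcumin = 12.4  # mg/g, from the curcumin calibration curve

for analyte, f in rcf.items():
    c = f * area[analyte] * c_curcumin / area["curcumin"]
    print(f"{analyte}: {c:.2f} mg/g")
```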
Chang, Yan-Li; Shen, Meng; Ren, Xue-Yang; He, Ting; Wang, Le; Fan, Shu-Sheng; Wang, Xiu-Huan; Li, Xiao; Wang, Xiao-Ping; Chen, Xiao-Yi; Sui, Hong; She, Gai-Mei
2018-04-19
Thymus quinquecostatus Celak is a species of thyme in China that has long been used as a condiment and herbal medicine. To set up the quality evaluation of T. quinquecostatus, response surface methodology (RSM) based on 2,2-diphenyl-1-picrylhydrazyl (DPPH) radical scavenging activity was introduced to optimize the extraction conditions, and the main indicator components were identified with a UPLC-LTQ-Orbitrap MSn method. The optimum ethanol concentration, solid-liquid ratio, and extraction time were 42.32%, 1:17.51, and 1.8 h, respectively. Thirty-five components, comprising 12 phenolic acids and 23 flavonoids, were unambiguously or tentatively identified in both positive and negative ion modes and employed for the comprehensive analysis of the optimum antioxidative fraction. A simple, reliable, and sensitive HPLC method was developed for the multi-component quantitative analysis of T. quinquecostatus using six characteristic and principal phenolic acids and flavonoids as reference compounds. Furthermore, chemometric methods (principal component analysis (PCA) and hierarchical clustering analysis (HCA)) showed that the growing area and harvest time of this herb are closely related to its quality. This study provides full-scale qualitative and quantitative information for the quality evaluation of T. quinquecostatus, offering a valuable reference for further study and development of this herb and laying the foundation for further study of its pharmacological efficacy.
[Comprehensibility of online-based patient education material in ophthalmology].
Heim, N; Faron, A; Fuchs, J; Martini, M; Reich, R H; Löffler, K
2017-05-01
Investigations have shown that the internet is increasing in importance as a source of information on medical issues. For most patients, information delivered or supported by hospitals and universities is considered to be the most reliable; however, the comprehensibility of the available information is often found wanting. Comprehensibility scores are formulae that allow a quantitative value for the readability of a document to be calculated. The purpose of this study was to analyze the comprehensibility of medical information published on the websites of the departments of ophthalmology of German university hospitals. We investigated medical information dealing with three eye diseases that can cause severe irreversible damage. The websites of 32 departments of ophthalmology of German university hospitals were examined. Information regarding cataracts, glaucoma and retinal detachment (amotio retinae) was identified and analyzed. All information was systematically analyzed for comprehensibility using the analysis program Text-Lab (http://www.text-lab.de) by calculating five readability scores: the Hohenheim comprehensibility index (HVI), the Amstad index, the simple measure of gobbledygook (G-SMOG) index, the Vienna non-fictional text formula (W-STX) and the readability index (LIX). In 59 cases (61.46%), useful text information from the institutions' homepages could be identified and analyzed. On average, the comprehensibility of the information was poor (HVI 7.91 ± 3.94, Amstad index 35.45 ± 11.85, Vienna formula 11.19 ± 1.93, G-SMOG 9.77 ± 1.42 and LIX 54.53 ± 6.67). In most cases, the patient information material was written far above the literacy level of the average population, and it must be assumed that the presented information is difficult to read for the majority of patients. A critical evaluation of accessible information material seems desirable, and available texts should be amended.
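Readability formulae of the kind applied here reduce a text to counts of sentences, words, and long words or syllables. As one concrete example, the LIX score (one of the five indices used) can be computed as words per sentence plus the percentage of words longer than six characters; the toy German text and the naive tokenisation below are illustrative only.

```python
# Illustrative computation of the LIX readability index:
# LIX = (words / sentences) + 100 * (long words / words),
# where long words have more than six characters. The tokenisation here is
# deliberately naive and serves only as an example.
import re

def lix(text: str) -> float:
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-zÄÖÜäöüß]+", text)
    long_words = [w for w in words if len(w) > 6]
    return len(words) / len(sentences) + 100.0 * len(long_words) / len(words)

sample = ("Die Netzhautablösung ist eine Erkrankung des Auges. "
          "Eine rechtzeitige Behandlung kann das Sehvermögen erhalten.")
print(f"LIX = {lix(sample):.1f}")  # higher values indicate harder texts
```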
Popp, Oliver; Müller, Dirk; Didzus, Katharina; Paul, Wolfgang; Lipsmeier, Florian; Kirchner, Florian; Niklas, Jens; Mauch, Klaus; Beaucamp, Nicola
2016-09-01
In-depth characterization of high-producer cell lines and bioprocesses is vital to ensure robust and consistent production of recombinant therapeutic proteins in high quantity and quality for clinical applications. This requires applying appropriate methods during bioprocess development to enable meaningful characterization of CHO clones and processes. Here, we present a novel hybrid approach for supporting comprehensive characterization of metabolic clone performance. The approach combines metabolite profiling with multivariate data analysis and fluxomics to enable a data-driven mechanistic analysis of key metabolic traits associated with desired cell phenotypes. We applied the methodology to quantify and compare metabolic performance in a set of 10 recombinant CHO-K1 producer clones and a host cell line. The comprehensive characterization enabled us to derive an extended set of clone performance criteria that not only captured growth and product formation, but also incorporated information on intracellular clone physiology and on metabolic changes during the process. These criteria served to establish a quantitative clone ranking and allowed us to identify metabolic differences between high-producing CHO-K1 clones yielding comparably high product titers. Through multivariate data analysis of the combined metabolite and flux data we uncovered common metabolic traits characteristic of high-producer clones in the screening setup. This included high intracellular rates of glutamine synthesis, low cysteine uptake, reduced excretion of aspartate and glutamate, and low intracellular degradation rates of branched-chain amino acids and of histidine. Finally, the above approach was integrated into a workflow that enables standardized high-content selection of CHO producer clones in a high-throughput fashion. In conclusion, the combination of quantitative metabolite profiling, multivariate data analysis, and mechanistic network model simulations can identify metabolic traits characteristic of high-performance clones and enables informed decisions on which clones provide a good match for a particular process platform. The proposed approach also provides a mechanistic link between observed clone phenotype, process setup, and feeding regimes, and thereby offers concrete starting points for subsequent process optimization. Biotechnol. Bioeng. 2016;113:2005-2019. © 2016 Wiley Periodicals, Inc.
Gunawardena, Harsha P.; Feltcher, Meghan E.; Wrobel, John A.; Gu, Sheng; Braunstein, Miriam; Chen, Xian
2015-01-01
The Mycobacterium tuberculosis (MTB) membrane is rich in antigens that are potential targets for diagnostics and the development of new vaccines. To better understand the mechanisms underlying MTB virulence and identify new targets for therapeutic intervention, we investigated the differential composition of membrane proteomes between virulent M. tuberculosis H37Rv (MTB) and the Mycobacterium bovis BCG vaccine strain. To compare the membrane proteomes, we used LC-MS/MS analysis in combination with label-free quantitative (LFQ) proteomics, utilizing the area-under-curve (AUC) of the extracted ion chromatograms (XIC) of peptides obtained from m/z and retention time alignment of MS1 features. With this approach, we obtained relative abundance ratios for 2,203 identified membrane-associated proteins with high confidence. Of these proteins, 294 showed statistically significant differences of at least 2-fold in relative abundance between MTB and BCG membrane fractions. Our comparative analysis detected several proteins associated with known genomic regions of difference between MTB and BCG as being absent, which validated the accuracy of our approach. In further support of our label-free quantitative data, we verified select protein differences by immunoblotting. To our knowledge, we have generated the first comprehensive and high-coverage profile of comparative membrane proteome changes between virulent MTB and its attenuated relative BCG, which helps elucidate the proteomic basis of the intrinsic virulence of the MTB pathogen. PMID:24093440
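In label-free quantification of this type, the abundance of an aligned peptide feature is taken as the area under the curve of its extracted ion chromatogram, and relative abundance is the ratio of those areas between samples. The following schematic sketch uses synthetic chromatograms; the peak shapes, intensities, and resulting ratio are invented.

```python
# Schematic label-free quantification: integrate the extracted ion chromatogram
# (XIC) of an aligned peptide feature in two samples and take the area ratio.
# Chromatogram shapes and intensities are synthetic.
import numpy as np

rt = np.linspace(24.0, 26.0, 200)                  # retention time (min)

def gaussian_peak(rt, apex, height, width=0.1):
    return height * np.exp(-0.5 * ((rt - apex) / width) ** 2)

xic_mtb = gaussian_peak(rt, 25.0, 1.0e6) + 5e3     # MTB H37Rv sample (synthetic)
xic_bcg = gaussian_peak(rt, 25.0, 2.5e5) + 5e3     # BCG sample (synthetic)

def auc(intensity, rt):
    # trapezoidal integration of the chromatographic peak
    return float(np.sum((intensity[1:] + intensity[:-1]) * np.diff(rt)) / 2.0)

ratio = auc(xic_mtb, rt) / auc(xic_bcg, rt)
print(f"relative abundance (MTB/BCG) = {ratio:.2f}, log2 = {np.log2(ratio):.2f}")
```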
Wang, Qin-Qin; Shen, Tao; Zuo, Zhi-Tian; Huang, Heng-Yu; Wang, Yuan-Zhong
2018-03-01
The accumulation of secondary metabolites in traditional Chinese medicine (TCM) is closely related to its origin. The identification of geographical origin and the quantitative evaluation of multiple components are of great significance for ensuring the quality of medicinal materials. In this study, the identification of Gentiana rigescens from different geographical origins was conducted by data fusion of Fourier transform infrared (FTIR) spectroscopy and high performance liquid chromatography (HPLC) in combination with partial least squares discriminant analysis; meanwhile, quantitative analysis of index components was conducted to provide an accurate and comprehensive identification and quality evaluation strategy for selecting the best production areas of G. rigescens. The FTIR and HPLC information of 169 G. rigescens samples from Yunnan, Sichuan, Guangxi and Guizhou Provinces was collected. The raw infrared spectra were pre-treated by multiplicative scatter correction, standard normal variate (SNV) and Savitzky-Golay (SG) derivative. The performances of FTIR, HPLC, low-level data fusion and mid-level data fusion for identification were then compared, and the contents of gentiopicroside, swertiamarin, loganic acid and sweroside were determined by HPLC. The results showed that the FTIR spectra of G. rigescens from different geographical origins differed, and the best pre-treatment method was SNV+SG-derivative (second derivative, 15 as the window parameter, and 2 as the polynomial order). In the prediction set, the accuracy of both low-level and mid-level data fusion (96.43%) was higher than that of FTIR or HPLC alone (94.64%); in the training set, the accuracy of low-level data fusion (100%) was higher than that of mid-level data fusion (99.12%). The contents of the iridoid glycosides were highest in samples from Yunnan. The average content of gentiopicroside, the bioactive marker in the Chinese Pharmacopoeia, was 47.40 mg·g⁻¹, with a maximum of 79.83 mg·g⁻¹. The contents of loganic acid, sweroside and gentiopicroside in Yunnan were significantly different from those in the other provinces (P < 0.05). Comparison of the total iridoid glycoside content of G. rigescens from different origins within Yunnan showed higher amounts in Eryuan Dali (68.59 mg·g⁻¹) and Yulong Lijiang (66.68 mg·g⁻¹), significantly higher than in Wuding Chuxiong (52.99 mg·g⁻¹), Chengjiang Yuxi (52.29 mg·g⁻¹) and Xundian Kunming (46.71 mg·g⁻¹) (P < 0.05), so these two places can be used as reference regions for screening cultivation sites and superior germplasm resources of G. rigescens. A comprehensive and accurate method for the identification and quality evaluation of G. rigescens was thus established by fusing FTIR and HPLC data and by HPLC quantitative analysis, which could support the development and utilization of G. rigescens. Copyright© by the Chinese Pharmaceutical Association.
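Low-level data fusion, as compared above, simply concatenates the scaled FTIR and HPLC variables into a single matrix before classification, and PLS-DA fits a PLS regression against dummy-coded origin labels. A minimal sketch with random placeholder data follows; the matrix sizes, scaling choices, and the two-component model are assumptions, not the settings used in the study.

```python
# Minimal sketch of low-level data fusion followed by PLS-DA.
# The FTIR and HPLC blocks are random placeholders; real spectra/peak tables
# would be substituted, and the number of latent variables tuned by CV.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
n_samples = 169
ftir = rng.normal(size=(n_samples, 300))     # pre-treated FTIR spectra
hplc = rng.normal(size=(n_samples, 4))       # e.g. four iridoid glycoside contents
y = rng.integers(0, 4, n_samples)            # geographical origin (4 provinces)

# Low-level fusion: autoscale each block, then concatenate variable-wise.
X = np.hstack([StandardScaler().fit_transform(ftir),
               StandardScaler().fit_transform(hplc)])

# PLS-DA: PLS regression on one-hot (dummy-coded) class membership.
Y = np.eye(4)[y]
pls = PLSRegression(n_components=2).fit(X, Y)
pred = pls.predict(X).argmax(axis=1)
print("apparent accuracy:", (pred == y).mean())
```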
NASA Astrophysics Data System (ADS)
Bandrowski, D.; Lai, Y.; Bradley, N.; Gaeuman, D. A.; Murauskas, J.; Som, N. A.; Martin, A.; Goodman, D.; Alvarez, J.
2014-12-01
In the field of river restoration science there is a growing need for analytical modeling tools and quantitative processes to help identify and prioritize project sites. 2D hydraulic models have become more common in recent years, and with the availability of robust data sets and computing technology it is now possible to evaluate large river systems at the reach scale. The Trinity River Restoration Program is now analyzing a 40-mile segment of the Trinity River to determine priority and implementation sequencing for its Phase II rehabilitation projects. A comprehensive approach and quantitative tool, referred to as 2D-Hydrodynamic Based Logic Modeling (2D-HBLM), has recently been developed to analyze this complex river system. This tool utilizes various hydraulic output parameters combined with biological, ecological, and physical metrics at user-defined spatial scales. These metrics and their associated algorithms are the underpinnings of the 2D-HBLM habitat module used to evaluate geomorphic characteristics, riverine processes, and habitat complexity. The habitat metrics are further integrated into a comprehensive Logic Model framework to perform statistical analyses to assess project prioritization. The Logic Model will analyze various potential project sites by evaluating connectivity using principal component methods. The 2D-HBLM tool will help inform management and decision makers by using a quantitative process to optimize desired response variables while balancing important limiting factors in determining the highest-priority locations within the river corridor for implementing restoration projects. Effective river restoration prioritization starts with well-crafted goals that identify the biological objectives, address underlying causes of habitat change, and recognize that social, economic, and land-use limiting factors may constrain restoration options (Beechie et al. 2008). Applying natural resources management actions, like restoration prioritization, is essential for successful project implementation (Conroy and Peterson, 2013). Evaluating tradeoffs and examining alternatives to improve fish habitat through optimization modeling is not just a trend but rather a scientific strategy that management needs to embrace and apply in its decision framework.
Antipsychotic-induced weight gain: a comprehensive research synthesis.
Allison, D B; Mentore, J L; Heo, M; Chandler, L P; Cappelleri, J C; Infante, M C; Weiden, P J
1999-11-01
The purpose of this study was to estimate and compare the effects of antipsychotics-both the newer ones and the conventional ones-on body weight. A comprehensive literature search identified 81 English- and non-English-language articles that included data on weight change in antipsychotic-treated patients. For each agent, a meta-analysis and random effects metaregression estimated the weight change after 10 weeks of treatment at a standard dose. A comprehensive narrative review was also conducted on all articles that did not yield quantitative information but did yield important qualitative information. Placebo was associated with a mean weight reduction of 0.74 kg. Among conventional agents, mean weight change ranged from a reduction of 0.39 kg with molindone to an increase of 3.19 kg with thioridazine. Among newer antipsychotic agents, mean increases were as follows: clozapine, 4.45 kg; olanzapine, 4.15 kg; sertindole, 2.92 kg; risperidone, 2.10 kg; and ziprasidone, 0.04 kg. Insufficient data were available to evaluate quetiapine at 10 weeks. Both conventional and newer antipsychotics are associated with weight gain. Among the newer agents, clozapine appears to have the greatest potential to induce weight gain, and ziprasidone the least. The differences among newer agents may affect compliance with medication and health risk.
Evaluation of risk communication in a mammography patient decision aid.
Klein, Krystal A; Watson, Lindsey; Ash, Joan S; Eden, Karen B
2016-07-01
We characterized patients' comprehension, memory, and impressions of risk communication messages in a patient decision aid (PtDA), Mammopad, and clarified perceived importance of numeric risk information in medical decision making. Participants were 75 women in their forties with average risk factors for breast cancer. We used mixed methods, comprising a risk estimation problem administered within a pretest-posttest design, and semi-structured qualitative interviews with a subsample of 21 women. Participants' positive predictive value estimates of screening mammography improved after using Mammopad. Although risk information was only briefly memorable, through content analysis, we identified themes describing why participants value quantitative risk information, and obstacles to understanding. We describe ways the most complicated graphic was incompletely comprehended. Comprehension of risk information following Mammopad use could be improved. Patients valued receiving numeric statistical information, particularly in pictograph format. Obstacles to understanding risk information, including potential for confusion between statistics, should be identified and mitigated in PtDA design. Using simple pictographs accompanied by text, PtDAs may enhance a shared decision-making discussion. PtDA designers and providers should be aware of benefits and limitations of graphical risk presentations. Incorporating comprehension checks could help identify and correct misapprehensions of graphically presented statistics. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Sudhagar, Arun; El-Matbouli, Mansour
2018-01-01
In recent years, with the advent of next-generation sequencing along with the development of various bioinformatics tools, RNA sequencing (RNA-Seq)-based transcriptome analysis has become much more affordable in the field of biological research. This technique has even opened up avenues to explore the transcriptome of non-model organisms for which a reference genome is not available. It has therefore led fish health researchers to adopt this technology to understand pathogenic processes and immune reactions in fish during infection. Recent studies using this technology have altered and updated the previous understanding of many diseases in fish. RNA-Seq has been employed in the study of fish pathogens such as bacteria, viruses, parasites, and oomycetes. It has also been helpful in unraveling immune mechanisms in fish. Additionally, RNA-Seq technology has paved the way for future work, such as genetic linkage mapping, quantitative trait analysis, disease-resistant strain or broodstock selection, and the development of effective vaccines and therapies. Until now, there have been no reviews that comprehensively summarize the studies that have used RNA-Seq to explore the mechanisms of infection of pathogens and the defense strategies of fish hosts. This review aims to summarize the contemporary understanding and findings with regard to infectious pathogens and the immune system of fish that have been achieved through RNA-Seq technology. PMID:29342931
NASA Astrophysics Data System (ADS)
Liu, Jiachao; Li, Ziyi; Chen, Kewei; Yao, Li; Wang, Zhiqun; Li, Kunchen; Guo, Xiaojuan
2011-03-01
Gray matter volume and cortical thickness are two indices of interest in brain structural magnetic resonance imaging research. Gray matter volume reflects mixed measurement information about the cerebral cortex, while cortical thickness reflects only the distance between the inner and outer surfaces of the cerebral cortex. Using Scaled Subprofile Modeling based on Principal Component Analysis (SSM_PCA) and Pearson's correlation analysis, this study provided quantitative comparisons and depicted both global and local relevance in order to comprehensively investigate morphometric abnormalities of the cerebral cortex in Alzheimer's disease (AD). Thirteen patients with AD and thirteen age- and gender-matched healthy controls were included in this study. Results showed that factor scores from the first 8 principal components accounted for ~53.38% of the total variance for gray matter volume, and ~50.18% for cortical thickness. Factor scores from the fifth principal component showed significant correlation. In addition, gray matter voxel-based volume was closely related to cortical thickness alterations in most cortical regions, especially in typically abnormal brain regions such as the insula and the parahippocampal gyrus in AD. These findings suggest that the two measurements are effective indices for understanding the neuropathology of AD. Studies using both gray matter volume and cortical thickness can separate the causes of the discrepancy, provide complementary information and give a comprehensive description of the morphological changes of brain structure.
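The analysis above pairs a PCA-based decomposition of each structural measure with Pearson correlation of the resulting subject scores. A compact sketch with simulated data is given below; the dimensions and the use of an ordinary scikit-learn PCA in place of SSM_PCA are assumptions made for illustration.

```python
# Compact illustration: derive principal-component scores for two structural
# measures (gray matter volume, cortical thickness) and correlate them across
# subjects. Data are simulated; SSM_PCA itself is not reproduced here.
import numpy as np
from sklearn.decomposition import PCA
from scipy.stats import pearsonr

rng = np.random.default_rng(2)
n_subjects, n_regions = 26, 500
shared = rng.normal(size=(n_subjects, 1))     # common structural factor across subjects
gm_volume = shared @ rng.normal(size=(1, n_regions)) + rng.normal(size=(n_subjects, n_regions))
thickness = shared @ rng.normal(size=(1, n_regions)) + rng.normal(size=(n_subjects, n_regions))

scores_gm = PCA(n_components=8).fit_transform(gm_volume)
scores_th = PCA(n_components=8).fit_transform(thickness)

r, p = pearsonr(scores_gm[:, 0], scores_th[:, 0])   # correlate first-component scores
print(f"r = {r:.2f}, p = {p:.3f}")
```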
Cortessis, Victoria K; Azadian, Moosa; Buxbaum, James; Sanogo, Fatimata; Song, Ashley Y; Sriprasert, Intira; Wei, Pengxiao C; Yu, Jing; Chung, Karine; Siegmund, Kimberly D
2018-04-25
The aim was to determine whether a history of conception by assisted reproductive technology (ART) is associated with the occurrence of one or more imprinting disorders of either maternal or paternal origin. We implemented a systematic review of the scholarly literature followed by comprehensive meta-analysis to quantitatively synthesize data from reports relating use of ART to occurrence of any imprinting disorder of humans, including Beckwith-Wiedemann (BWS), Angelman (AS), Prader-Willi (PWS), and Silver-Russell (SRS) syndromes, as well as transient neonatal diabetes mellitus (TNDB) and sporadic retinoblastoma (RB). The systematic review identified 13 reports presenting unique data from 23 studies that related conception following ART to occurrence of imprinting disorders. Multiple studies were identified for four disorders, for which meta-analysis yielded the following summary estimates of associations with a history of ART: AS, summary odds ratio (sOR) = 4.7 (95% confidence interval (CI) 2.6-8.5, 4 studies); BWS, sOR = 5.8 (95% CI 3.1-11.1, 8 studies); PWS, sOR = 2.2 (95% CI 1.6-3.0, 6 studies); SRS, sOR = 11.3 (95% CI 4.5-28.5, 3 studies). Only one study reported on each of TNDB and RB. Published data reveal positive associations between a history of ART conception and each of four imprinting disorders. The reasons for these associations warrant further investigation.
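Summary odds ratios of this kind are commonly obtained by pooling study-level log odds ratios with inverse-variance weights, adding a between-study variance term for a random-effects model (e.g., the DerSimonian-Laird estimator). The sketch below illustrates that calculation with invented study values, not the data summarized above.

```python
# DerSimonian-Laird random-effects pooling of study log odds ratios.
# The odds ratios and standard errors below are invented placeholders.
import numpy as np

or_i = np.array([3.1, 5.2, 7.8, 4.0])       # hypothetical study odds ratios
se_i = np.array([0.45, 0.60, 0.55, 0.38])   # standard errors of log(OR)

y = np.log(or_i)
w = 1.0 / se_i**2                            # fixed-effect (inverse-variance) weights
y_fixed = np.sum(w * y) / np.sum(w)

# Between-study variance (DerSimonian-Laird estimator).
Q = np.sum(w * (y - y_fixed) ** 2)
tau2 = max(0.0, (Q - (len(y) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

w_re = 1.0 / (se_i**2 + tau2)                # random-effects weights
y_re = np.sum(w_re * y) / np.sum(w_re)
se_re = np.sqrt(1.0 / np.sum(w_re))
ci = np.exp(y_re + np.array([-1.96, 1.96]) * se_re)
print(f"summary OR = {np.exp(y_re):.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```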
Burchill, Susan A; Beiske, Klaus; Shimada, Hiroyuki; Ambros, Peter F; Seeger, Robert; Tytgat, Godelieve A M; Brock, Penelope R; Haber, Michelle; Park, Julie R; Berthold, Frank
2017-04-01
The current study was conducted to expedite international standardized reporting of bone marrow disease in children with neuroblastoma and to improve equivalence of care. A multidisciplinary International Neuroblastoma Response Criteria Bone Marrow Working Group was convened by the US National Cancer Institute in January 2012 with representation from Europe, North America, and Australia. Practical, transferable recommendations to standardize the reporting of bone marrow disease were developed. To the authors' knowledge, the current study is the first to comprehensively present consensus criteria for the collection, analysis, and reporting of the percentage area of bone marrow parenchyma occupied by tumor cells in trephine biopsies. The quantitative analysis of neuroblastoma content in bone marrow aspirates by immunocytology and reverse transcriptase-quantitative polymerase chain reaction is revised. The inclusion of paired-like homeobox 2b (PHOX2B) for immunohistochemistry and reverse transcriptase-quantitative polymerase chain reaction is recommended. Recommendations for recording bone marrow response are provided. The authors endorse the quantitative assessment of neuroblastoma cell content in bilateral core needle (trephine) biopsies and aspirates in all children with neuroblastoma, with the exception of infants, in whom the evaluation of aspirates alone is advised. It is interesting to note that 5% disease is accepted as an internationally achievable level for disease assessment. The quantitative assessment of neuroblastoma cells is recommended to provide data from which evidence-based numerical criteria for the reporting of bone marrow response can be realized. This is particularly important in the minimal disease setting and when neuroblastoma detection in bone marrow is intermittent, where clinical impact has yet to be validated. The wide adoption of these harmonized criteria will enhance the ability to compare outcomes from different trials and facilitate collaborative trial design. Cancer 2017;123:1095-1105. © 2016 American Cancer Society.
Comprehensive chlorophyll composition in the main edible seaweeds.
Chen, Kewei; Ríos, José Julián; Pérez-Gálvez, Antonio; Roca, María
2017-08-01
Natural chlorophylls present in seaweeds have been studied with regard to their biological activities and health benefit effects. However, detailed studies characterizing the complete chlorophyll profile, either qualitatively or quantitatively, are scarce. This work presents a comprehensive spectrometric study of the chlorophyll derivatives present in the five main coloured edible seaweeds. The novel complete MS2 characterization of five chlorophyll derivatives (chlorophyll c2, chlorophyll c1, purpurin-18 a, pheophytin d and phytyl-purpurin-18 a) has made it possible to obtain fragmentation patterns associated with their different structural features. New chlorophyll derivatives, including some oxidative structures, have been identified and quantified for the first time in red, green and brown seaweeds. The quantitative data on chlorophyll content provide significant information for food composition databases of bioactive compounds. Copyright © 2017 Elsevier Ltd. All rights reserved.
Nishiyama, Takeshi; Suzuki, Masako; Adachi, Katsunori; Sumi, Satoshi; Okada, Kensuke; Kishino, Hirohisa; Sakai, Saeko; Kamio, Yoko; Kojima, Masayo; Suzuki, Sadao; Kanne, Stephen M
2014-05-01
We comprehensively compared all available questionnaires for measuring quantitative autistic traits (QATs) in terms of reliability and construct validity in 3,147 non-clinical and 60 clinical subjects with normal intelligence. We examined four full-length forms, the Subthreshold Autism Trait Questionnaire (SATQ), the Broader Autism Phenotype Questionnaire, the Social Responsiveness Scale2-Adult Self report (SRS2-AS), and the Autism-Spectrum Quotient (AQ). The SRS2-AS and the AQ each had several short forms that we also examined, bringing the total to 11 forms. Although all QAT questionnaires showed acceptable levels of test-retest reliability, the AQ and SRS2-AS, including their short forms, exhibited poor internal consistency and discriminant validity, respectively. The SATQ excelled in terms of classical test theory and also benefited from its short length.
Pucher, Katharina K; Candel, Math J J M; Krumeich, Anja; Boot, Nicole M W M; De Vries, Nanne K
2015-07-05
We report on the longitudinal quantitative and qualitative data resulting from a two-year trajectory (2008-2011) based on the DIagnosis of Sustainable Collaboration (DISC) model. This trajectory aimed to support regional coordinators of comprehensive school health promotion (CSHP) in systematically developing change management and project management to establish intersectoral collaboration. Multilevel analyses of quantitative data on the determinants of collaboration according to the DISC model were conducted, with 90 respondents (response rate 57%) at pretest and 69 respondents (52%) at posttest. NVivo analyses of the qualitative data collected during the trajectory included minutes of monthly or bimonthly personal and telephone interviews (N = 65) with regional coordinators, and documents they produced about their activities. The quantitative data showed major improvements in change management and project management. There were also improvements in consensus development, commitment formation, formalization of the CSHP, and alignment of policies, although organizational problems within the collaboration increased. Content analyses of the qualitative data identified five main management styles: (1) facilitating active involvement of relevant parties; (2) informing collaborating parties; (3) controlling and (4) supporting their task accomplishment; and (5) coordinating the collaborative processes. We have contributed to the fundamental understanding of the development of intersectoral collaboration by combining qualitative and quantitative data. Our results support a systematic approach to intersectoral collaboration using the DISC model. They also suggest five main management styles to improve intersectoral collaboration in the initial stage. The outcomes are useful for health professionals involved in similar ventures.
ERIC Educational Resources Information Center
Spencer, Mercedes; Wagner, Richard K.
2018-01-01
The purpose of this meta-analysis was to examine the comprehension problems of children who have a specific reading comprehension deficit (SCD), which is characterized by poor reading comprehension despite adequate decoding. The meta-analysis included 86 studies of children with SCD who were assessed in reading comprehension and oral language…
Sun, Na; Walch, Axel
2013-08-01
Mass spectrometry imaging (MSI) is a rapidly evolving technology that yields qualitative and quantitative distribution maps of small pharmaceutical-active molecules and their metabolites in tissue sections in situ. The simplicity, high sensitivity and ability to provide comprehensive spatial distribution maps of different classes of biomolecules make MSI a valuable tool to complement histopathology for diagnostics and biomarker discovery. In this review, qualitative and quantitative MSI of drugs and metabolites in tissue at therapeutic levels are discussed and the impact of this technique in drug discovery and clinical research is highlighted.
Standardization approaches in absolute quantitative proteomics with mass spectrometry.
Calderón-Celis, Francisco; Encinar, Jorge Ruiz; Sanz-Medel, Alfredo
2017-07-31
Mass spectrometry-based approaches have enabled important breakthroughs in quantitative proteomics in the last decades. This development is reflected in the better quantitative assessment of protein levels as well as to understand post-translational modifications and protein complexes and networks. Nowadays, the focus of quantitative proteomics shifted from the relative determination of proteins (ie, differential expression between two or more cellular states) to absolute quantity determination, required for a more-thorough characterization of biological models and comprehension of the proteome dynamism, as well as for the search and validation of novel protein biomarkers. However, the physico-chemical environment of the analyte species affects strongly the ionization efficiency in most mass spectrometry (MS) types, which thereby require the use of specially designed standardization approaches to provide absolute quantifications. Most common of such approaches nowadays include (i) the use of stable isotope-labeled peptide standards, isotopologues to the target proteotypic peptides expected after tryptic digestion of the target protein; (ii) use of stable isotope-labeled protein standards to compensate for sample preparation, sample loss, and proteolysis steps; (iii) isobaric reagents, which after fragmentation in the MS/MS analysis provide a final detectable mass shift, can be used to tag both analyte and standard samples; (iv) label-free approaches in which the absolute quantitative data are not obtained through the use of any kind of labeling, but from computational normalization of the raw data and adequate standards; (v) elemental mass spectrometry-based workflows able to provide directly absolute quantification of peptides/proteins that contain an ICP-detectable element. A critical insight from the Analytical Chemistry perspective of the different standardization approaches and their combinations used so far for absolute quantitative MS-based (molecular and elemental) proteomics is provided in this review. © 2017 Wiley Periodicals, Inc.
Chen, Teresa C.
2009-01-01
Purpose: To demonstrate that video-rate spectral domain optical coherence tomography (SDOCT) can qualitatively and quantitatively evaluate optic nerve head (ONH) and retinal nerve fiber layer (RNFL) glaucomatous structural changes. To correlate quantitative SDOCT parameters with disc photography and visual fields. Methods: SDOCT images from 4 glaucoma eyes (4 patients) with varying stages of open-angle glaucoma (ie, early, moderate, late) were qualitatively contrasted with 2 age-matched normal eyes (2 patients). Of 61 other consecutive patients recruited in an institutional setting, 53 eyes (33 patients) met inclusion/exclusion criteria for quantitative studies. Images were obtained using two experimental SDOCT systems, one utilizing a superluminescent diode and the other a titanium:sapphire laser source, with axial resolutions of about 6 μm and 3 μm, respectively. Results: Classic glaucomatous ONH and RNFL structural changes were seen in SDOCT images. An SDOCT reference plane 139 μm above the retinal pigment epithelium yielded cup-disc ratios that best correlated with masked physician disc photography cup-disc ratio assessments. The minimum distance band, a novel SDOCT neuroretinal rim parameter, showed good correlation with physician cup-disc ratio assessments, visual field mean deviation, and pattern standard deviation (P values range, .0003–.024). RNFL and retinal thickness maps correlated well with disc photography and visual field testing. Conclusions: To our knowledge, this thesis presents the first comprehensive qualitative and quantitative evaluation of SDOCT images of the ONH and RNFL in glaucoma. This pilot study provides basis for developing more automated quantitative SDOCT-specific glaucoma algorithms needed for future prospective multicenter national trials. PMID:20126502
Gan, Yanjun; Duan, Qingyun; Gong, Wei; ...
2014-01-01
Sensitivity analysis (SA) is a commonly used approach for identifying important parameters that dominate model behaviors. We use a newly developed software package, a Problem Solving environment for Uncertainty Analysis and Design Exploration (PSUADE), to evaluate the effectiveness and efficiency of ten widely used SA methods, including seven qualitative and three quantitative ones. All SA methods are tested using a variety of sampling techniques to screen out the most sensitive (i.e., important) parameters from the insensitive ones. The Sacramento Soil Moisture Accounting (SAC-SMA) model, which has thirteen tunable parameters, is used for illustration. The South Branch Potomac River basin near Springfield, West Virginia in the U.S. is chosen as the study area. The key findings from this study are: (1) For qualitative SA methods, Correlation Analysis (CA), Regression Analysis (RA), and Gaussian Process (GP) screening methods are shown to be not effective in this example. Morris One-At-a-Time (MOAT) screening is the most efficient, needing only 280 samples to identify the most important parameters, but it is the least robust method. Multivariate Adaptive Regression Splines (MARS), Delta Test (DT) and Sum-Of-Trees (SOT) screening methods need about 400-600 samples for the same purpose. Monte Carlo (MC), Orthogonal Array (OA) and Orthogonal Array based Latin Hypercube (OALH) are appropriate sampling techniques for them; (2) For quantitative SA methods, at least 2777 samples are needed for the Fourier Amplitude Sensitivity Test (FAST) to identify parameter main effects. The McKay method needs about 360 samples to evaluate the main effect and more than 1000 samples to assess the two-way interaction effect. OALH and LPτ (LPTAU) sampling techniques are more appropriate for the McKay method. For the Sobol' method, the minimum number of samples needed is 1050 to compute the first-order and total sensitivity indices correctly. These comparisons show that qualitative SA methods are more efficient but less accurate and robust than quantitative ones.
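For the quantitative methods mentioned (e.g., the Sobol' approach), first-order sensitivity indices can be estimated from paired Monte Carlo sample matrices. The sketch below applies a Saltelli-type pick-freeze estimator to a simple additive test function; it does not reproduce SAC-SMA or the PSUADE workflow.

```python
# First-order Sobol' sensitivity indices with a Saltelli-type pick-freeze
# estimator, demonstrated on a simple additive test function (not SAC-SMA).
import numpy as np

def model(x):                       # toy model: analytic S_i = a_i^2 / sum(a^2)
    a = np.array([1.0, 2.0, 0.5])
    return x @ a

rng = np.random.default_rng(3)
n, d = 20_000, 3
A = rng.uniform(size=(n, d))
B = rng.uniform(size=(n, d))
fA, fB = model(A), model(B)
varY = np.var(fA)

for i in range(d):
    AB_i = A.copy()
    AB_i[:, i] = B[:, i]            # vary only factor x_i between the two matrices
    S_i = np.mean(fB * (model(AB_i) - fA)) / varY
    print(f"S_{i+1} ≈ {S_i:.2f}")
```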
Arenal-type pyroclastic flows: A probabilistic event tree risk analysis
NASA Astrophysics Data System (ADS)
Meloy, Anthony F.
2006-09-01
A quantitative hazard-specific scenario-modelling risk analysis is performed at Arenal volcano, Costa Rica for the newly recognised Arenal-type pyroclastic flow (ATPF) phenomenon using an event tree framework. These flows are generated by the sudden depressurisation and fragmentation of an active basaltic andesite lava pool as a result of a partial collapse of the crater wall. The deposits of this type of flow include angular blocks and juvenile clasts, which are rarely found in other types of pyroclastic flow. An event tree analysis (ETA) is a useful tool and framework in which to analyse and graphically present the probabilities of the occurrence of many possible events in a complex system. Four event trees are created in the analysis, three of which are extended to investigate the varying individual risk faced by three generic representatives of the surrounding community: a resident, a worker, and a tourist. The raw numerical risk estimates determined by the ETA are converted into a set of linguistic expressions (i.e. VERY HIGH, HIGH, MODERATE etc.) using an established risk classification scale. Three individually tailored semi-quantitative risk maps are then created from a set of risk conversion tables to show how the risk varies for each individual in different areas around the volcano. In some cases, by relocating from the north to the south, the level of risk can be reduced by up to three classes. While the individual risk maps may be broadly applicable, and therefore of interest to the general community, the risk maps and associated probability values generated in the ETA are intended to be used by trained professionals and government agencies to evaluate the risk and effectively manage the long-term development of infrastructure and habitation. With the addition of fresh monitoring data, the combination of both long- and short-term event trees would provide a comprehensive and consistent method of risk analysis (both during and pre-crisis), and as such, an ETA is considered to be a valuable quantitative decision support tool.
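An event tree propagates probability by multiplying conditional branch probabilities along each path from the initiating event to an outcome, and individual risk additionally depends on exposure. The toy calculation below uses invented branch probabilities and exposure fractions that bear no relation to the Arenal estimates.

```python
# Toy event-tree calculation: multiply conditional probabilities along each
# branch and scale by exposure. All values are invented placeholders.
p_collapse = 0.05                     # annual probability of partial crater-wall collapse
p_atpf_given_collapse = 0.6           # probability an ATPF is generated given collapse
branches = {
    "flow reaches zone A": 0.30,
    "flow reaches zone B": 0.10,
}
exposure = {"resident": 0.9, "worker": 0.3, "tourist": 0.02}  # fraction of time present

for zone, p_reach in branches.items():
    p_event = p_collapse * p_atpf_given_collapse * p_reach
    for person, f_exposed in exposure.items():
        print(f"{zone:22s} {person:8s} annual risk ≈ {p_event * f_exposed:.2e}")
```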
Binary similarity measures for fingerprint analysis of qualitative metabolomic profiles.
Rácz, Anita; Andrić, Filip; Bajusz, Dávid; Héberger, Károly
2018-01-01
Contemporary metabolomic fingerprinting is based on multiple spectrometric and chromatographic signals, used either alone or combined with structural and chemical information on metabolic markers at the qualitative and semiquantitative level. However, signal shifting, convolution, and matrix effects may compromise metabolomic patterns. The recent increase in the use of qualitative metabolomic data, described by the presence (1) or absence (0) of particular metabolites, demonstrates great potential in the field of metabolomic profiling and fingerprint analysis. The aim of this study is a comprehensive evaluation of binary similarity measures for the elucidation of patterns among samples of different botanical origin and various metabolomic profiles. Nine qualitative metabolomic data sets covering a wide range of natural products and metabolomic profiles were used to assess 44 binary similarity measures for the fingerprinting of plant extracts and natural products. The measures were analyzed by the novel sum of ranking differences (SRD) method, searching for the most promising candidates. The Baroni-Urbani-Buser (BUB) and Hawkins-Dotson (HD) similarity coefficients were selected as the best measures by SRD and analysis of variance (ANOVA), while Dice (Di1), Yule, Russell-Rao, and Consonni-Todeschini 3 ranked worst. ANOVA revealed that concordantly and intermediately symmetric similarity coefficients are better candidates for metabolomic fingerprinting than asymmetric and correlation-based ones. Fingerprint analysis based on the BUB and HD coefficients and qualitative metabolomic data performed as well as quantitative metabolomic profile analysis. Fingerprint analysis based on qualitative metabolomic profiles and binary similarity measures thus proved to be a reliable way of finding the same or similar patterns in metabolomic data as those extracted from quantitative data.
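Binary similarity coefficients are functions of the 2x2 contingency counts a (shared presences), b and c (mismatches), and d (shared absences). The short sketch below computes three of the measures discussed; the formulas follow commonly published definitions and should be checked against the original paper before reuse.

```python
# Binary similarity coefficients computed from the 2x2 contingency counts of
# two presence/absence fingerprints: a = shared presences, b, c = mismatches,
# d = shared absences. Formulas follow commonly published definitions.
import numpy as np

def counts(x, y):
    x, y = np.asarray(x, bool), np.asarray(y, bool)
    a = np.sum(x & y); b = np.sum(x & ~y)
    c = np.sum(~x & y); d = np.sum(~x & ~y)
    return a, b, c, d

def dice(a, b, c, d):            return 2 * a / (2 * a + b + c)
def russell_rao(a, b, c, d):     return a / (a + b + c + d)
def baroni_urbani_buser(a, b, c, d):
    s = np.sqrt(a * d)
    return (s + a) / (s + a + b + c)

x = [1, 1, 0, 1, 0, 0, 1, 0]     # metabolite present/absent in sample 1
y = [1, 0, 0, 1, 0, 1, 1, 0]     # metabolite present/absent in sample 2
a, b, c, d = counts(x, y)
for name, f in [("Dice", dice), ("Russell-Rao", russell_rao),
                ("Baroni-Urbani-Buser", baroni_urbani_buser)]:
    print(f"{name}: {f(a, b, c, d):.3f}")
```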
Chen, Li-Hua; Wu, Yao; Guan, Yong-Mei; Jin, Chen; Zhu, Wei-Feng; Yang, Ming
2018-01-01
Fermented Cordyceps sinensis, the succedaneum of Cordyceps sinensis which is extracted and separated from Cordyceps sinensis by artificial fermentation, is commonly used in eastern Asia in clinical treatments due to its health benefits. In this paper, a new strategy for differentiating and comprehensively evaluating the quality of products of fermented Cordyceps sinensis has been established, based on high-performance liquid chromatography (HPLC) fingerprint analysis combined with similarity analysis (SA), hierarchical cluster analysis (HCA), and the quantitative analysis of multicomponents by single marker (QAMS). Ten common peaks were collected and analysed using SA, HCA, and QAMS. These methods indicated that the 30 fermented Cordyceps sinensis samples could be categorized into two groups by HCA. Five peaks were identified as uracil, uridine, adenine, guanosine, and adenosine, and according to the results from the diode array detector, which can be used to confirm peak purity, the purities of these compounds were greater than 990. Adenosine was chosen as the internal reference substance. The relative correction factors (RCF) between adenosine and the other four nucleosides were calculated and investigated using the QAMS method. Meanwhile, the accuracy of the QAMS method was confirmed by comparing its results with those of an external standard method using the cosines of the angles between the groups. No significant difference between the two methods was observed. In conclusion, the method established herein was efficient, successful in identifying the products of fermented Cordyceps sinensis, and scientifically valid to be applicable in the systematic quality control of fermented Cordyceps sinensis products.
NASA Astrophysics Data System (ADS)
Beach, Daniel G.
2017-08-01
Paralytic shellfish toxins (PSTs) are neurotoxins produced by dinoflagellates and cyanobacteria that cause paralytic shellfish poisoning in humans. PST quantitation by LC-MS is challenging because of their high polarity, lability as gas-phase ions, and large number of potentially interfering analogues. Differential mobility spectrometry (DMS) has the potential to improve the performance of LC-MS methods for PSTs in terms of selectivity and limits of detection. This work describes a comprehensive investigation of the separation of 16 regulated PSTs by DMS and the development of highly selective LC-DMS-MS methods for PST quantitation. The effects of all DMS parameters on the separation of PSTs from one another were first investigated in detail. The labile nature of 11α-gonyautoxin epimers gave unique insight into fragmentation of labile analytes before, during, and after the DMS analyzer. Two sets of DMS parameters were identified that either optimized the resolution of PSTs from one another or transmitted them at a limited number of compensation voltage (CV) values corresponding to structural subclasses. These were used to develop multidimensional LC-DMS-MS/MS methods using existing HILIC-MS/MS parameters. In both cases, improved selectivity was observed when using DMS, and the quantitative capabilities of a rapid UPLC-DMS-MS/MS method were evaluated. Limits of detection of the developed method were similar to those without DMS, and differences were highly analyte-dependant. Analysis of shellfish matrix reference materials showed good agreement with established methods. The developed methods will be useful in cases where specific matrix interferences are encountered in the LC-MS/MS analysis of PSTs in complex biological samples.
Adriaens, Michiel E; Prickaerts, Peggy; Chan-Seng-Yue, Michelle; van den Beucken, Twan; Dahlmans, Vivian E H; Eijssen, Lars M; Beck, Timothy; Wouters, Bradly G; Voncken, Jan Willem; Evelo, Chris T A
2016-01-01
A comprehensive assessment of the epigenetic dynamics in cancer cells is key to understanding the molecular mechanisms underlying cancer and to improving cancer diagnostics, prognostics and treatment. By combining genome-wide ChIP-seq epigenomics and microarray transcriptomics, we studied the effects of oxygen deprivation and subsequent reoxygenation on histone 3 trimethylation of lysine 4 (H3K4me3) and lysine 27 (H3K27me3) in a breast cancer cell line, serving as a model for abnormal oxygenation in solid tumors. A priori, epigenetic markings and gene expression levels are not only expected to vary greatly between hypoxic and normoxic conditions, but also to display a large degree of heterogeneity across the cell population. Whereas ChIP-seq data are traditionally treated as dichotomous, the model and experiment here necessitate a quantitative, data-driven analysis of both datasets. We first identified genomic regions with sustained epigenetic markings, which provided a sample-specific reference enabling quantitative ChIP-seq data analysis. Sustained H3K27me3 marking was located around centromeres and intergenic regions, while sustained H3K4me3 marking was associated with genes involved in RNA binding, translation and protein transport and localization. Dynamic marking with both H3K4me3 and H3K27me3 (hypoxia-induced bivalency) was found in CpG-rich regions at loci encoding factors that control developmental processes, congruent with observations in embryonic stem cells. In silico-identified epigenetically sustained and dynamic genomic regions were confirmed by ChIP-PCR in vitro, and the results obtained are corroborated by published data and current insights into epigenetic regulation.
Willett, N J; Thote, T; Hart, M; Moran, S; Guldberg, R E; Kamath, R V
2016-09-01
The development of effective therapies for cartilage protection has been limited by a lack of efficient quantitative cartilage imaging modalities in pre-clinical in vivo models. Our objectives were two-fold: first, to validate a new contrast-enhanced 3D imaging analysis technique, equilibrium partitioning of an ionic contrast agent-micro computed tomography (EPIC-μCT), in a rat medial meniscal transection (MMT) osteoarthritis (OA) model; and second, to quantitatively assess the sensitivity of EPIC-μCT to detect the effects of matrix metalloproteinase inhibitor (MMPi) therapy on cartilage degeneration. Rats underwent MMT surgery and tissues were harvested at 1, 2, and 3 weeks post-surgery or rats received an MMPi or vehicle treatment and tissues harvested 3 weeks post-surgery. Parameters of disease progression were evaluated using histopathology and EPIC-μCT. Correlations and power analyses were performed to compare the techniques. EPIC-μCT was shown to provide simultaneous 3D quantification of multiple parameters, including cartilage degeneration and osteophyte formation. In MMT animals treated with MMPi, OA progression was attenuated, as measured by 3D parameters such as lesion volume and osteophyte size. A post-hoc power analysis showed that 3D parameters for EPIC-μCT were more sensitive than 2D parameters requiring fewer animals to detect a therapeutic effect of MMPi. 2D parameters were comparable between EPIC-μCT and histopathology. This study demonstrated that EPIC-μCT has high sensitivity to provide 3D structural and compositional measurements of cartilage and bone in the joint. EPIC-μCT can be used in combination with histology to provide a comprehensive analysis to screen new potential therapies. Copyright © 2016 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.
Linking Advanced Visualization and MATLAB for the Analysis of 3D Gene Expression Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruebel, Oliver; Keranen, Soile V.E.; Biggin, Mark
Three-dimensional gene expression PointCloud data generated by the Berkeley Drosophila Transcription Network Project (BDTNP) provides quantitative information about the spatial and temporal expression of genes in early Drosophila embryos at cellular resolution. The BDTNP team visualizes and analyzes PointCloud data using the software application PointCloudXplore (PCX). To maximize the impact of novel, complex data sets, such as PointClouds, the data needs to be accessible to biologists and comprehensible to developers of analysis functions. We address this challenge by linking PCX and Matlab via a dedicated interface, thereby providing biologists seamless access to advanced data analysis functions and giving bioinformatics researchers the opportunity to integrate their analysis directly into the visualization application. To demonstrate the usefulness of this approach, we computationally model parts of the expression pattern of the gene even skipped using a genetic algorithm implemented in Matlab and integrated into PCX via our Matlab interface.
Han, Shuting; Taralova, Ekaterina; Dupre, Christophe; Yuste, Rafael
2018-03-28
Animal behavior has been studied for centuries, but few efficient methods are available to automatically identify and classify it. Quantitative behavioral studies have been hindered by the subjective and imprecise nature of human observation, and the slow speed of annotating behavioral data. Here, we developed an automatic behavior analysis pipeline for the cnidarian Hydra vulgaris using machine learning. We imaged freely behaving Hydra, extracted motion and shape features from the videos, and constructed a dictionary of visual features to classify pre-defined behaviors. We also identified unannotated behaviors with unsupervised methods. Using this analysis pipeline, we quantified 6 basic behaviors and found surprisingly similar behavior statistics across animals within the same species, regardless of experimental conditions. Our analysis indicates that the fundamental behavioral repertoire of Hydra is stable. This robustness could reflect a homeostatic neural control of "housekeeping" behaviors which could have been already present in the earliest nervous systems. © 2018, Han et al.
Cross-correlations between RMB exchange rate and international commodity markets
NASA Astrophysics Data System (ADS)
Lu, Xinsheng; Li, Jianfeng; Zhou, Ying; Qian, Yubo
2017-11-01
This paper employs multifractal detrended fluctuation analysis (MF-DFA) and multifractal detrended cross-correlation analysis (MF-DCCA) to study cross-correlation behaviors between China's RMB exchange rate market and four international commodity markets, using a comprehensive set of data covering the period from 22 July 2005 to 15 March 2016. Our empirical results from MF-DFA indicate that the RMB exchange rate is the most inefficient among the 4 selected markets. The results from the quantitative analysis confirm the existence of cross-correlations, and the results from MF-DCCA further confirm strong multifractal behavior between the RMB exchange rate and international commodity markets. We also demonstrate that the recent financial crisis had a significant impact on the cross-correlation behavior. Through the rolling window analysis, we find that the RMB exchange rates and international commodity prices are anti-persistently cross-correlated. The main sources of multifractality in the cross-correlations are long-range correlations between the RMB exchange rate and the aggregate commodity, energy and metals index.
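As a rough illustration of the MF-DFA step underlying these results, the sketch below is a minimal NumPy implementation for a single series; the segment sizes, detrending order, and q values are illustrative assumptions rather than the settings used in the paper, and the generalized Hurst exponents h(q) are read off as log-log slopes of the fluctuation function.

```python
import numpy as np

def mf_dfa(x, scales, q_values, poly_order=1):
    """Minimal MF-DFA: returns h(q) estimated from log-log slopes of F_q(s)."""
    x = np.asarray(x, dtype=float)
    profile = np.cumsum(x - x.mean())            # integrated profile Y(i)
    fq = np.zeros((len(q_values), len(scales)))
    for si, s in enumerate(scales):
        n_seg = len(profile) // s
        rms = []
        for v in range(n_seg):
            seg = profile[v * s:(v + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, poly_order), t)
            rms.append(np.mean((seg - trend) ** 2))   # F^2(s, v)
        rms = np.array(rms)
        for qi, q in enumerate(q_values):
            if q == 0:                                # limit case q -> 0
                fq[qi, si] = np.exp(0.5 * np.mean(np.log(rms)))
            else:
                fq[qi, si] = np.mean(rms ** (q / 2.0)) ** (1.0 / q)
    # h(q): slope of log F_q(s) versus log s
    return [np.polyfit(np.log(scales), np.log(fq[qi]), 1)[0]
            for qi in range(len(q_values))]

# Illustrative use on synthetic returns (not the RMB/commodity data)
rng = np.random.default_rng(0)
returns = rng.standard_normal(4096)
h = mf_dfa(returns, scales=[16, 32, 64, 128, 256], q_values=[-2, 0, 2])
print(dict(zip([-2, 0, 2], np.round(h, 3))))
```

A q-dependent h(q) indicates multifractality; MF-DCCA extends the same recipe to the covariance of detrended profiles of two series.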
Life cycle cost analysis of a stand-alone PV system in rural Kenya
NASA Astrophysics Data System (ADS)
Daly, Emma
The purpose of this quantitative research study was to determine the economic feasibility of a stand-alone PV system to electrify a rural area in Kenya. The research conducted involved a comprehensive review of all the relevant literature associated with the study. Methodologies were extrapolated from this extensive literature to develop a model for the complete design and economic analysis of a stand-alone PV system. A women's center in rural Kenya was used as a worked example to demonstrate the workings of the model. The results suggest that electrifying the center using a stand-alone PV system is an economically viable option which is encouraging for the surrounding area. This model can be used as a business model to determine the economic feasibility of a stand-alone PV system in alternative sites in Kenya.
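The abstract does not spell out the cost model, but a life cycle cost analysis of this kind typically discounts capital, operation and maintenance, and battery replacement costs to present value and divides by discounted lifetime energy to obtain a levelized cost; the sketch below illustrates that arithmetic with made-up figures (all costs, lifetimes, and the discount rate are assumptions, not values from the study).

```python
# Minimal life-cycle cost (LCC) sketch for a stand-alone PV system.
# All cost figures, lifetimes, and the discount rate below are illustrative
# assumptions, not values from the study.

def present_value(amount, rate, year):
    """Discount a future cash flow to year 0."""
    return amount / (1.0 + rate) ** year

def pv_system_lcc(capital, annual_om, battery_cost, battery_life,
                  lifetime=25, rate=0.08):
    lcc = capital
    lcc += sum(present_value(annual_om, rate, y) for y in range(1, lifetime + 1))
    # Battery bank replacements at the end of each battery life
    lcc += sum(present_value(battery_cost, rate, y)
               for y in range(battery_life, lifetime, battery_life))
    return lcc

def lcoe(lcc, annual_kwh, lifetime=25, rate=0.08):
    """Levelized cost of energy = LCC / discounted lifetime energy."""
    discounted_energy = sum(present_value(annual_kwh, rate, y)
                            for y in range(1, lifetime + 1))
    return lcc / discounted_energy

lcc = pv_system_lcc(capital=12000.0, annual_om=150.0,
                    battery_cost=3000.0, battery_life=7)
print(f"LCC  = ${lcc:,.0f}")
print(f"LCOE = ${lcoe(lcc, annual_kwh=4000.0):.3f}/kWh")
```

Comparing the resulting LCOE against the local tariff or the cost of a diesel alternative is the usual basis for the feasibility verdict.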
Evaluation of electron mobility in InSb quantum wells by means of percentage-impact
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mishima, T. D.; Edirisooriya, M.; Santos, M. B.
2014-05-15
In order to quantitatively analyze the contribution of each scattering factor toward the total carrier mobility, we use a new, convenient figure of merit termed the percentage impact. The mobility limit due to a scattering factor, which is widely used to summarize a scattering analysis, has its own advantages. However, a mobility limit is not quite appropriate for the above purpose. A comprehensive understanding of the difference in contribution among many scattering factors toward the total carrier mobility can be obtained by evaluating the percentage impacts of scattering factors, which can be straightforwardly calculated from their mobility limits and the total mobility. Our percentage impact analysis shows that threading dislocation is one of the dominant scattering factors for electron transport in InSb quantum wells at room temperature.
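The abstract does not state the formula, but if the scattering mechanisms are assumed to combine via Matthiessen's rule, one natural definition of the percentage impact P_i of mechanism i follows directly from its mobility limit μ_i and the total mobility μ_tot:

```latex
% Assuming Matthiessen's rule for independent scattering mechanisms,
% with \mu_i the mobility limit of mechanism i and \mu_{tot} the total mobility:
\begin{align}
  \frac{1}{\mu_{\mathrm{tot}}} &= \sum_i \frac{1}{\mu_i}, \\
  P_i\;(\%) &= \frac{1/\mu_i}{1/\mu_{\mathrm{tot}}} \times 100
            = \frac{\mu_{\mathrm{tot}}}{\mu_i} \times 100,
  \qquad \sum_i P_i = 100\% .
\end{align}
```

Under this reading the percentage impacts of all mechanisms sum to 100%, which is what makes them easier to compare directly than the mobility limits themselves.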
Mass Spectrometry Analysis of Spatial Protein Networks by Colocalization Analysis (COLA).
Mardakheh, Faraz K
2017-01-01
A major challenge in systems biology is comprehensive mapping of protein interaction networks. Crucially, such interactions are often dynamic in nature, necessitating methods that can rapidly mine the interactome across varied conditions and treatments to reveal change in the interaction networks. Recently, we described a fast mass spectrometry-based method to reveal functional interactions in mammalian cells on a global scale, by revealing spatial colocalizations between proteins (COLA) (Mardakheh et al., Mol Biosyst 13:92-105, 2017). As protein localization and function are inherently linked, significant colocalization between two proteins is a strong indication for their functional interaction. COLA uses rapid complete subcellular fractionation, coupled with quantitative proteomics to generate a subcellular localization profile for each protein quantified by the mass spectrometer. Robust clustering is then applied to reveal significant similarities in protein localization profiles, indicative of colocalization.
Numerical study of read scheme in one-selector one-resistor crossbar array
NASA Astrophysics Data System (ADS)
Kim, Sungho; Kim, Hee-Dong; Choi, Sung-Jin
2015-12-01
A comprehensive numerical circuit analysis of read schemes of a one selector-one resistance change memory (1S1R) crossbar array is carried out. Three schemes, the ground, V/2, and V/3 schemes, are compared with each other in terms of sensing margin and power consumption. Without the aid of a complex analytical approach or SPICE-based simulation, a simple numerical iteration method is developed to simulate entire current flows and node voltages within a crossbar array. Understanding such phenomena is essential in successfully evaluating the electrical specifications of selectors for suppressing intrinsic drawbacks of crossbar arrays, such as sneak current paths and series line resistance problems. This method provides a quantitative tool for the accurate analysis of crossbar arrays and provides guidelines for developing an optimal read scheme, array configuration, and selector device specifications.
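The iterative solution described can be reproduced with a simple Gauss-Seidel-style relaxation of Kirchhoff's current law at every word-line and bit-line node; the sketch below solves a small linear crossbar under a V/2 bias scheme with fixed cell resistances and no selector nonlinearity (array size, resistances, line resistance, and applied voltage are illustrative assumptions, not the paper's parameters).

```python
import numpy as np

def solve_crossbar(r_cell, v_word, v_bit, r_line=2.0, tol=1e-9, max_iter=5000):
    """Gauss-Seidel solution of node voltages in an N x N 1R crossbar.

    r_cell : (N, N) cell resistances; v_word / v_bit : driver voltages applied
    at the left end of each word line and the bottom end of each bit line.
    Returns the word-line and bit-line node voltage maps (each N x N).
    """
    n = r_cell.shape[0]
    g_cell, g_line = 1.0 / r_cell, 1.0 / r_line
    vw = np.tile(np.asarray(v_word, float)[:, None], (1, n))  # word-line nodes
    vb = np.tile(np.asarray(v_bit, float)[None, :], (n, 1))   # bit-line nodes
    for _ in range(max_iter):
        delta = 0.0
        for i in range(n):
            for j in range(n):
                # Word-line node (i, j): the driver fixes the j == 0 node.
                if j > 0:
                    g = g_cell[i, j] + g_line
                    num = g_cell[i, j] * vb[i, j] + g_line * vw[i, j - 1]
                    if j < n - 1:
                        g += g_line
                        num += g_line * vw[i, j + 1]
                    new = num / g
                    delta = max(delta, abs(new - vw[i, j])); vw[i, j] = new
                # Bit-line node (i, j): the driver fixes the i == n-1 node.
                if i < n - 1:
                    g = g_cell[i, j] + g_line
                    num = g_cell[i, j] * vw[i, j] + g_line * vb[i + 1, j]
                    if i > 0:
                        g += g_line
                        num += g_line * vb[i - 1, j]
                    new = num / g
                    delta = max(delta, abs(new - vb[i, j])); vb[i, j] = new
        if delta < tol:
            break
    return vw, vb

# V/2 read scheme on a 16 x 16 array: selected cell (0, 0) is low-resistance,
# all other cells high-resistance (values are illustrative assumptions).
N, V = 16, 1.0
r_cell = np.full((N, N), 1e6); r_cell[0, 0] = 1e4
v_word = np.full(N, V / 2); v_word[0] = V       # selected word line at V
v_bit = np.full(N, V / 2); v_bit[0] = 0.0       # selected bit line grounded
vw, vb = solve_crossbar(r_cell, v_word, v_bit)
i_cell = (vw[0, 0] - vb[0, 0]) / r_cell[0, 0]
print(f"selected-cell current ~ {i_cell * 1e6:.2f} uA")
```

Comparing the selected-cell current against the total current drawn by the drivers for each biasing scheme gives the sensing margin and power figures the paper discusses.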
David, Helena Maria Scherlowski Leal; Caufield, Catherine
2005-01-01
This exploratory study aimed to investigate factors related to the use of illicit and licit drugs and workplace violence in a group of women from popular classes in the city of Rio de Janeiro. A descriptive and analytic quantitative approach was used, as well as a qualitative approach through in-depth interviews with women who suffered or were suffering workplace violence, using the collective subject discourse analysis methodology. The results showed sociodemographic and work situations that can be considered as possible risk factors for drug consumption and workplace violence. The qualitative analysis shows how this group perceives the phenomena of drug use and workplace violence, expanding the comprehension of these issues and providing conceptual and methodological elements for additional studies on this subject.
Weishaar, Heide; Amos, Amanda; Collin, Jeff
2015-05-01
Networks and coalitions of stakeholders play a crucial role in the development and implementation of policies, with previous research highlighting that networks in tobacco control are characterised by an antagonism between supporters and opponents of comprehensive tobacco control policies. This UK-based study used quantitative and qualitative network analysis (drawing on 176 policy submissions and 32 interviews) to systematically map and analyse a network of actors involved in the development of European Union (EU) smoke-free policy. Policy debates were dominated by two coalitions of stakeholders with starkly opposing positions on the issue. One coalition, consisting primarily of health-related organisations, supported comprehensive EU smoke-free policy, whereas the other, led by tobacco manufacturers' organisations, opposed the policy initiative. The data suggest that, aided by strong political commitment of EU decision makers to develop smoke-free policy, advocates supporting comprehensive EU policy were able to frame policy debates in ways which challenged the tobacco industry's legitimacy. They then benefited from the stark polarisation between the two coalitions. The paper provides empirical evidence of the division between two distinct coalitions in tobacco policy debates and draws attention to the complex processes of consensus-seeking, alliance-building and strategic action which are integral to the development of EU policy. Highlighting network polarisation and industry isolation as factors which seemed to increase tobacco control success, the study demonstrates the potential significance and value of FCTC article 5.3 for tobacco control policy-making. Copyright © 2015 Elsevier Ltd. All rights reserved.
Enhanced Lipidome Coverage in Shotgun Analyses by using Gas-Phase Fractionation
NASA Astrophysics Data System (ADS)
Nazari, Milad; Muddiman, David C.
2016-11-01
A high resolving power shotgun lipidomics strategy using gas-phase fractionation and data-dependent acquisition (DDA) was applied toward comprehensive characterization of lipids in a hen ovarian tissue in an untargeted fashion. Using this approach, a total of 822 unique lipids across a diverse range of lipid categories and classes were identified based on their MS/MS fragmentation patterns. Classes of glycerophospholipids and glycerolipids, such as glycerophosphocholines (PC), glycerophosphoethanolamines (PE), and triglycerides (TG), are often the most abundant peaks observed in shotgun lipidomics analyses. These ions suppress the signal from low abundance ions and hinder the chances of characterizing low abundant lipids when DDA is used. These issues were circumvented by utilizing gas-phase fractionation, where DDA was performed on narrow m/z ranges instead of a broad m/z range. Employing gas-phase fractionation resulted in an increase in sensitivity by more than an order of magnitude in both positive- and negative-ion modes. Furthermore, the enhanced sensitivity increased the number of lipids identified by a factor of ≈4, and facilitated identification of low abundant lipids from classes such as cardiolipins that are often difficult to observe in untargeted shotgun analyses and require sample-specific preparation steps prior to analysis. This method serves as a resource for comprehensive profiling of lipids from many different categories and classes in an untargeted manner, as well as for targeted and quantitative analyses of individual lipids. Furthermore, this comprehensive analysis of the lipidome can serve as a species- and tissue-specific database for confident identification of other MS-based datasets, such as mass spectrometry imaging.
Henderson, Kathryn E; Falbe, Jennifer; Novak, Sarah A.; Wharton, Christopher; Long, Michael; O'Connell, Meghan L.; Fiore, Susan S.
2013-01-01
Background: In 2006, all local education agencies in the United States participating in federal school meal programs were required to establish school wellness policies. The aim of this study was to document the strength and comprehensiveness of one state's written district policies using a quantitative coding tool, and test whether the strength and comprehensiveness of the written policy predicted school level implementation and practices. Methods: School wellness policies from 151 Connecticut districts were evaluated using a quantitative coding system. In each district, school principal surveys were collected before and after the writing and expected implementation of wellness policies. Socio-demographic variables were assessed for each district, including enrollment, population density, political climate, racial composition and socio-economic status. Changes in school-level policy implementation before and after the federal wellness policy requirement were compared across districts by wellness policy strength, and policies were compared based on district-level demographic factors. Results: Statewide, fuller implementation of nutrition and physical activity policies at the school level was reported after adoption of written policies in 2006. Districts with stronger, more comprehensive policies were more successful in implementing those policies at the school level. Some socio-demographic characteristics predicted the strength of wellness policies; larger, urban districts and districts with a greater ratio of registered Democrats to Republicans wrote stronger policies. Conclusions: Written school wellness policies have the potential to promote significant improvements in the school environment. Future regulation of school wellness policies should focus on the importance of writing strong and comprehensive policies. PMID:22568461
Lee, Mikyung; Kim, Yangseok
2009-12-16
Genomic alterations frequently occur in many cancer patients and play important mechanistic roles in the pathogenesis of cancer. Furthermore, they can modify the expression level of genes due to altered copy number in the corresponding region of the chromosome. An accumulating body of evidence supports the possibility that strong genome-wide correlation exists between DNA content and gene expression. Therefore, more comprehensive analysis is needed to quantify the relationship between genomic alteration and gene expression. A well-designed bioinformatics tool is essential to perform this kind of integrative analysis. A few programs have already been introduced for integrative analysis. However, the published software performs comprehensive integrated analysis only to a limited extent, owing to the algorithms and visualization modules implemented. To address this issue, we have implemented the Java-based program CHESS to allow integrative analysis of two experimental data sets: genomic alteration and genome-wide expression profile. CHESS is composed of a genomic alteration analysis module and an integrative analysis module. The genomic alteration analysis module detects genomic alteration by applying a threshold-based method or the SW-ARRAY algorithm and investigates whether the detected alteration is phenotype specific or not. On the other hand, the integrative analysis module measures the genomic alteration's influence on gene expression. It is divided into two separate parts. The first part calculates overall correlation between comparative genomic hybridization ratio and gene expression level by applying the following three statistical methods: simple linear regression, Spearman rank correlation and Pearson's correlation. In the second part, CHESS detects the genes that are differentially expressed according to the genomic alteration pattern with three alternative statistical approaches: Student's t-test, Fisher's exact test and Chi square test. By successive operation of the two modules, users can clarify how gene expression levels are affected by the phenotype specific genomic alterations. As CHESS was developed in both Java application and web environments, it can be run on a web browser or a local machine. It also supports all experimental platforms if a properly formatted text file is provided to include the chromosomal position of probes and their gene identifiers. CHESS is a user-friendly tool for investigating disease specific genomic alterations and quantitative relationships between those genomic alterations and genome-wide gene expression profiling.
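The two parts of the integrative module map onto standard statistical calls; the sketch below runs the three correlation measures between a copy-number log-ratio vector and an expression vector and then tests expression differences between altered and unaltered samples on synthetic data (this is an illustration of the statistics named in the abstract, not the CHESS code, and the gain threshold and median split are assumptions).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic per-sample data for a single gene (illustrative, not CHESS output)
cgh_ratio = rng.normal(0.0, 0.3, 40)                     # aCGH log2 ratios
expression = 1.5 * cgh_ratio + rng.normal(0, 0.4, 40)    # correlated expression

# Part 1: overall correlation between copy number and expression
slope, intercept, r, p_lin, stderr = stats.linregress(cgh_ratio, expression)
rho, p_rho = stats.spearmanr(cgh_ratio, expression)
pearson_r, p_pear = stats.pearsonr(cgh_ratio, expression)
print(f"linear regression: slope={slope:.2f}, p={p_lin:.3g}")
print(f"Spearman rho={rho:.2f}, Pearson r={pearson_r:.2f}")

# Part 2: is expression different between altered and unaltered samples?
altered = cgh_ratio > 0.2                                # threshold-based "gain" call
t, p_t = stats.ttest_ind(expression[altered], expression[~altered])
high = expression > np.median(expression)
table = [[np.sum(altered & high), np.sum(altered & ~high)],
         [np.sum(~altered & high), np.sum(~altered & ~high)]]
odds, p_fisher = stats.fisher_exact(table)
chi2, p_chi2, dof, expected = stats.chi2_contingency(table)
print(f"t-test p={p_t:.3g}, Fisher p={p_fisher:.3g}, chi-square p={p_chi2:.3g}")
```

Running the same calculations gene by gene, genome wide, is essentially what the integrative analysis module automates and visualizes.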
DOE Office of Scientific and Technical Information (OSTI.GOV)
McDonald, Benjamin S.; Zalavadia, Mital A.; Miller, Brian W.
Environmental sampling and sample analyses by the International Atomic Energy Agency’s (IAEA) Network of Analytical Laboratories (NWAL) are a critical technical tool used to detect facility misuse under a Comprehensive Safeguards Agreement and to verify the absence of undeclared nuclear material activities under an Additional Protocol. Currently all environmental swipe samples (ESS) are screened using gamma spectrometry and x-ray fluorescence to estimate the amount of U and/or Pu in the ESS, to guide further analysis, and to assist in the shipment of ESS to the NWAL. Quantitative Digital Autoradiography for Environmental Samples (QDARES) is being developed to complement existing techniques through the use of a portable, real-time, high-spatial-resolution camera called the Ionizing-radiation Quantum Imaging Detector (iQID). The iQID constructs a spatial map of radionuclides within a sample or surface in real-time as charged particles (betas) and photons (gamma/x-rays) are detected and localized on an event-by-event basis. Knowledge of the location and nature of radioactive hot spots on the ESS could provide information for subsequent laboratory analysis. As a nondestructive technique, QDARES does not compromise the ESS chain of custody or subsequent laboratory analysis. In this paper we will present the system design and construction, characterization measurements with calibration sources, and initial measurements of ESS.
Du, Gang; Zhao, Haiyu; Song, Yuelin; Zhang, Qingwen; Wang, Yitao
2011-10-01
A high-performance liquid chromatography (HPLC) coupled with triple quadrupole mass spectrometry (MS/MS) method was developed for rapid determination of 13 isoflavones in Radix puerariae. A novel shell-type column, namely a Kinetex core-shell C18 column (50 mm × 2.1 mm i.d., 2.6 μm), and gradient elution were used during the analysis. The chromatographic peaks of the 13 investigated compounds were identified by comparing their retention time and MS data with the related reference compounds. Multiple-reaction monitoring (MRM) was employed for the quantitative analysis in negative ionization mode. All calibration curves showed good linearity (r² > 0.9990) within the test ranges. The LOD and LOQ were lower than 0.017 and 0.873 μg/mL on column, respectively. The intra- and inter-day precisions for the 13 analytes were <1.17 and 2.17%, respectively, and the recoveries were 93.1-104.4%. The validated method was applied for quantitative analysis of 13 isoflavones in 7 species of Radix puerariae. The results demonstrated that the HPLC-MS/MS system with the Kinetex column could be a promising analytical tool for the determination of isoflavones in traditional Chinese medicines, which is helpful for comprehensive evaluation of the quality of R. puerariae. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
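Calibration figures of merit of the kind reported here are usually derived from a linear fit of peak area versus concentration; the sketch below shows that calculation with made-up numbers, using ICH-style 3.3σ/slope and 10σ/slope expressions for LOD and LOQ, which is an assumption since the paper does not state how its limits were obtained.

```python
import numpy as np

# Illustrative calibration data for one analyte (concentration in ug/mL on
# column vs. MRM peak area); values are made up, not from the paper.
conc = np.array([0.05, 0.1, 0.5, 1.0, 5.0, 10.0])
area = np.array([1.1e3, 2.2e3, 1.08e4, 2.15e4, 1.07e5, 2.14e5])

slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
ss_res = np.sum((area - pred) ** 2)
ss_tot = np.sum((area - area.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot                       # linearity check, r^2 > 0.999

# ICH-style limits from the residual standard deviation (sigma) and slope
sigma = np.sqrt(ss_res / (len(conc) - 2))
lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(f"r^2 = {r2:.5f}, LOD ~ {lod:.3f} ug/mL, LOQ ~ {loq:.3f} ug/mL")

# Recovery for a spiked sample (illustrative numbers)
spiked_found, spiked_added, base = 5.9, 5.0, 1.0
recovery = (spiked_found - base) / spiked_added * 100
print(f"recovery = {recovery:.1f}%")
```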
Casey, Tammy M; Khan, Javed M; Bringans, Scott D; Koudelka, Tomas; Takle, Pari S; Downs, Rachael A; Livk, Andreja; Syme, Robert A; Tan, Kar-Chun; Lipscombe, Richard J
2017-02-03
This study aimed to compare the depth and reproducibility of total proteome and differentially expressed protein coverage in technical duplicates and triplicates using iTRAQ 4-plex, iTRAQ 8-plex, and TMT 6-plex reagents. The analysis was undertaken because comprehensive comparisons of isobaric mass tag reproducibility have not been widely reported in the literature. The highest number of proteins was identified with 4-plex, followed by 8-plex and then 6-plex reagents. Quantitative analyses revealed that more differentially expressed proteins were identified with 4-plex reagents than with 8-plex and 6-plex reagents. Replicate reproducibility was determined to be ≥69% for technical duplicates and ≥57% for technical triplicates. The results indicate that running an 8-plex or 6-plex experiment instead of a 4-plex experiment resulted in 26% or 39% fewer protein identifications, respectively. When 4-plex spectra were searched with three software tools (ProteinPilot, Mascot, and Proteome Discoverer), the highest number of protein identifications was obtained with Mascot. The analysis of negative controls demonstrated the importance of running experiments as replicates. Overall, this study demonstrates the advantages of using iTRAQ 4-plex reagents over iTRAQ 8-plex and TMT 6-plex reagents, provides estimates of technical duplicate and triplicate reproducibility, and emphasizes the value of running replicate samples.
Quantitative proteome-based systematic identification of SIRT7 substrates.
Zhang, Chaohua; Zhai, Zichao; Tang, Ming; Cheng, Zhongyi; Li, Tingting; Wang, Haiying; Zhu, Wei-Guo
2017-07-01
SIRT7 is a class III histone deacetylase that is involved in numerous cellular processes. Only six substrates of SIRT7 have been reported thus far, so we aimed to systematically identify SIRT7 substrates using stable-isotope labeling with amino acids in cell culture (SILAC) coupled with quantitative mass spectrometry (MS). Using SIRT7 +/+ and SIRT7 -/- mouse embryonic fibroblasts as our model system, we identified and quantified 1493 acetylation sites in 789 proteins, of which 261 acetylation sites in 176 proteins showed ≥2-fold change in acetylation state between SIRT7 -/- and SIRT7 +/+ cells. These proteins were considered putative SIRT7 substrates and were carried forward for further analysis. We then validated the predictive efficiency of the SILAC-MS experiment by assessing substrate acetylation status in vitro in six predicted proteins. We also performed a bioinformatic analysis of the MS data, which indicated that many of the putative protein substrates were involved in metabolic processes. Finally, we expanded our list of candidate substrates by performing a bioinformatics-based prediction analysis of putative SIRT7 substrates, using our list of putative substrates as a positive training set, and again validated a subset of the proteins in vitro. In summary, we have generated a comprehensive list of SIRT7 candidate substrates. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
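The ≥2-fold screening step is easy to reproduce on a site-level quantification table; the pandas sketch below uses a hypothetical table whose column names and values are assumptions, not data from the study.

```python
import pandas as pd

# Hypothetical SILAC site-level table (column names and values are assumptions):
# one row per acetylation site with normalized intensities in each genotype.
sites = pd.DataFrame({
    "protein":  ["P1", "P1", "P2", "P3"],
    "site":     ["K12", "K98", "K45", "K7"],
    "sirt7_ko": [8.1, 2.0, 4.4, 1.1],   # SIRT7-/- intensity
    "sirt7_wt": [2.0, 1.9, 1.0, 1.2],   # SIRT7+/+ intensity
})

sites["ratio_ko_wt"] = sites["sirt7_ko"] / sites["sirt7_wt"]
# Sites hyperacetylated >= 2-fold in the knockout nominate candidate substrates
candidates = sites[sites["ratio_ko_wt"] >= 2.0]
print(candidates[["protein", "site", "ratio_ko_wt"]])
print("putative substrate proteins:", sorted(candidates["protein"].unique()))
```

In the study the shortlisted proteins were then checked in vitro and used as a positive training set for the bioinformatic prediction step.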
Majeská Čudejková, Mária; Vojta, Petr; Valík, Josef; Galuszka, Petr
2016-09-25
The fungus Claviceps purpurea is a biotrophic phytopathogen widely used in the pharmaceutical industry for its ability to produce ergot alkaloids (EAs). The fungus attacks unfertilized ovaries of grasses and forms sclerotia, which represent the only type of tissue where the synthesis of EAs occurs. The biosynthetic pathway of EAs has been extensively studied; however, little is known concerning its regulation. Here, we present the quantitative transcriptome analysis of the sclerotial and mycelial tissues, providing a comprehensive view of the transcriptional differences between tissues that do and do not produce EAs, and between the pathogenic and non-pathogenic lifestyles. The results indicate metabolic changes coupled with sclerotial differentiation, which are likely needed as initiation factors for EA biosynthesis. One of the promising factors seems to be oxidative stress. Here, we focus on the identification of putative transcription factors and regulators involved in sclerotial differentiation, which might be involved in EA biosynthesis. To shed more light on the regulation of EA composition, whole transcriptome analysis of four industrial strains differing in their alkaloid spectra was performed. The results support the hypothesis that the composition of the amino acid pool in sclerotia is an important factor regulating the final structure of the ergopeptines produced by Claviceps purpurea. Copyright © 2016 Elsevier B.V. All rights reserved.
Napolitano, José G.; Gödecke, Tanja; Rodríguez-Brasco, María F.; Jaki, Birgit U.; Chen, Shao-Nong; Lankin, David C.; Pauli, Guido F.
2012-01-01
Botanical dietary supplements and herbal remedies are widely used for health promotion and disease prevention. Due to the high chemical complexity of these natural products, it is essential to develop new analytical strategies to guarantee their quality and consistency. In particular, the precise characterization of multiple botanical markers remains a challenge. This study demonstrates how a combination of computer-aided spectral analysis and 1D quantitative 1H NMR spectroscopy (qHNMR) generates the analytical foundation for innovative means of simultaneously identifying and quantifying botanical markers in complex mixtures. First, comprehensive 1H NMR profiles (fingerprints) of selected botanical markers were generated via 1H iterative Full Spin Analysis (HiFSA) with PERCH. Next, the 1H fingerprints were used to assign specific 1H resonances in the NMR spectra of reference materials, enriched fractions and crude extracts of Ginkgo biloba leaves. These 1H fingerprints were then used to verify the assignments by 2D NMR. Subsequently, a complete purity and composition assessment by means of 1D qHNMR was conducted. As its major strengths, this tandem approach enables the simultaneous quantification of multiple constituents without the need for identical reference materials, the semi-quantitative determination of particular sub-classes of components, and the detection of impurities and adulterants. PMID:22332915
The impact of Public Reporting on clinical outcomes: a systematic review and meta-analysis.
Campanella, Paolo; Vukovic, Vladimir; Parente, Paolo; Sulejmani, Adela; Ricciardi, Walter; Specchia, Maria Lucia
2016-07-22
To assess both qualitatively and quantitatively the impact of Public Reporting (PR) on clinical outcomes, we carried out a systematic review of published studies on this topic. PubMed, Web of Science and SCOPUS databases were searched to identify studies published from 1991 to 2014 that investigated the relationship between PR and clinical outcomes. Studies were considered eligible if they investigated the relationship between PR and clinical outcomes and comprehensively described the PR mechanism and the study design adopted. Among the clinical outcomes identified, meta-analysis was performed for the overall mortality rate, for which quantitative data were exhaustively reported in a sufficient number of studies. Two reviewers conducted all data extraction independently and disagreements were resolved through discussion. The same reviewers also evaluated the quality of the studies using a GRADE approach. Twenty-seven studies were included. Mainly, the effect of PR on clinical outcomes was positive. The meta-analysis regarding overall mortality included, in a context of high heterogeneity, 10 studies with a total of 1,840,401 experimental events and 3,670,446 control events and resulted in an RR of 0.85 (95% CI, 0.79-0.92). The introduction of PR programs at different levels of the healthcare sector is a challenging but rewarding public health strategy. Existing research covering different clinical outcomes supports the idea that PR could, in fact, stimulate providers to improve healthcare quality.
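For readers who want to reproduce this kind of pooling, the sketch below implements a DerSimonian-Laird random-effects combination of risk ratios, a common choice for heterogeneous mortality data; both the choice of model and the study-level counts are assumptions for illustration, not the review's actual data or necessarily its exact method.

```python
import numpy as np

def pool_risk_ratios(events_t, n_t, events_c, n_c):
    """DerSimonian-Laird random-effects pooled risk ratio with 95% CI."""
    e_t, e_c = np.asarray(events_t, float), np.asarray(events_c, float)
    n_t, n_c = np.asarray(n_t, float), np.asarray(n_c, float)
    log_rr = np.log((e_t / n_t) / (e_c / n_c))
    var = 1 / e_t - 1 / n_t + 1 / e_c - 1 / n_c        # variance of log RR
    w = 1 / var                                        # fixed-effect weights
    mu_fixed = np.sum(w * log_rr) / np.sum(w)
    q = np.sum(w * (log_rr - mu_fixed) ** 2)           # Cochran's Q
    k = len(log_rr)
    tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_star = 1 / (var + tau2)                          # random-effects weights
    mu = np.sum(w_star * log_rr) / np.sum(w_star)
    se = np.sqrt(1 / np.sum(w_star))
    rr, lo, hi = np.exp(mu), np.exp(mu - 1.96 * se), np.exp(mu + 1.96 * se)
    return rr, lo, hi, q, tau2

# Placeholder study data (deaths / patients in PR vs. control arms)
rr, lo, hi, q, tau2 = pool_risk_ratios(
    events_t=[120, 85, 40], n_t=[10000, 8000, 5000],
    events_c=[150, 100, 55], n_c=[10000, 8000, 5000])
print(f"pooled RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f}), tau^2 = {tau2:.4f}")
```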
Peddinti, Divyaswetha; Nanduri, Bindu; Kaya, Abdullah; Feugang, Jean M; Burgess, Shane C; Memili, Erdogan
2008-02-22
Male infertility is a major problem for mammalian reproduction. However, molecular details including the underlying mechanisms of male fertility are still not known. A thorough understanding of these mechanisms is essential for obtaining consistently high reproductive efficiency and for ensuring lower cost and time-loss for breeders. Using high and low fertility bull spermatozoa, here we employed differential detergent fractionation multidimensional protein identification technology (DDF-Mud PIT) and identified 125 putative biomarkers of fertility. We next used quantitative Systems Biology modeling and canonical protein interaction pathways and networks to show that high fertility spermatozoa differ from low fertility spermatozoa in four main ways. Compared to sperm from low fertility bulls, sperm from high fertility bulls have higher expression of proteins involved in: energy metabolism, cell communication, spermatogenesis, and cell motility. Our data also suggest the hypothesis that low fertility sperm DNA integrity may be compromised, because cell cycle: G2/M DNA damage checkpoint regulation was the most significant signaling pathway identified in low fertility spermatozoa. This is the first comprehensive description of the bovine spermatozoa proteome. Comparative proteomic analysis of high fertility and low fertility bulls, in the context of protein interaction networks, identified putative molecular markers associated with the high fertility phenotype.
Alsanosy, Rashad Mohammed; Mahfouz, Mohamed Salih; Gaffar, Abdelrahim Mutwakel
2013-01-01
Objectives. (1) To estimate the prevalence and behavioral patterns of Khat chewing and (2) to investigate factors that influenced the pattern of Khat use among undergraduate students in different higher education institutions in Jazan region, Saudi Arabia. Materials and Methods. A cross-sectional study using a pretested structured self-administered quantitative questionnaire was used to collect data. The SPSS version 17 software program was used for data analysis. Results. The overall current Khat chewing prevalence among higher education students was found to be 23.1%, significantly higher among males at 38.5% than among females at 2.1% (P < 0.001). Lifetime Khat chewing prevalence was 24.8%, significantly higher among males (40.5%) than among females (3.7%) (P < 0.001). Univariate analysis revealed that student gender, student smoking status, a friend's smoking, and Khat chewing were associated with a significantly high risk of Khat chewing (P < 0.001 for all). Conclusions. The trend of Khat use is increasing among higher education students in the Jazan region. A multilevel, value-based, comprehensive, and strategic long-term intervention plan is needed. The comprehensive plan may include social interventions geared toward creating recreational alternatives and opportunities for youth and a critical review of current authorities' interventions and services. PMID:23878809
Peddinti, Divyaswetha; Nanduri, Bindu; Kaya, Abdullah; Feugang, Jean M; Burgess, Shane C; Memili, Erdogan
2008-01-01
Background: Male infertility is a major problem for mammalian reproduction. However, molecular details including the underlying mechanisms of male fertility are still not known. A thorough understanding of these mechanisms is essential for obtaining consistently high reproductive efficiency and for ensuring lower cost and time-loss for breeders. Results: Using high and low fertility bull spermatozoa, here we employed differential detergent fractionation multidimensional protein identification technology (DDF-Mud PIT) and identified 125 putative biomarkers of fertility. We next used quantitative Systems Biology modeling and canonical protein interaction pathways and networks to show that high fertility spermatozoa differ from low fertility spermatozoa in four main ways. Compared to sperm from low fertility bulls, sperm from high fertility bulls have higher expression of proteins involved in: energy metabolism, cell communication, spermatogenesis, and cell motility. Our data also suggest the hypothesis that low fertility sperm DNA integrity may be compromised, because cell cycle: G2/M DNA damage checkpoint regulation was the most significant signaling pathway identified in low fertility spermatozoa. Conclusion: This is the first comprehensive description of the bovine spermatozoa proteome. Comparative proteomic analysis of high fertility and low fertility bulls, in the context of protein interaction networks, identified putative molecular markers associated with the high fertility phenotype. PMID:18294385
Zhang, Jin; Liu, Bobin; Li, Jianbo; Zhang, Li; Wang, Yan; Zheng, Huanquan; Lu, Mengzhu; Chen, Jun
2015-03-14
Heat shock proteins (Hsps) are molecular chaperones that are involved in many normal cellular processes and stress responses, and heat shock factors (Hsfs) are the transcriptional activators of Hsps. Hsfs and Hsps are widely coordinated in various biological processes. Although the roles of Hsfs and Hsps in stress responses have been well characterized in Arabidopsis, their roles in perennial woody species undergoing various environmental stresses remain unclear. Here, a comprehensive identification and analysis of Hsf and Hsp families in poplars is presented. In Populus trichocarpa, we identified 42 paralogous pairs, 66.7% resulting from a whole genome duplication. The gene structure and motif composition are relatively conserved in each subfamily. Microarray and quantitative real-time RT-PCR analyses showed that most of the Populus Hsf and Hsp genes are differentially expressed upon exposure to various stresses. A coexpression network between Populus Hsf and Hsp genes was generated based on their expression. Coordinated relationships were validated by transient overexpression and subsequent qPCR analyses. The comprehensive analysis indicates that different sets of PtHsps are downstream of particular PtHsfs and provides a basis for functional studies aimed at revealing the roles of these families in poplar development and stress responses.
Burmester, Bridget; Leathem, Janet; Merrick, Paul
2016-12-01
Research investigating how subjective cognitive complaints (SCCs) might reliably indicate impairments in objective cognitive functioning has produced highly varied findings, and despite attempts to synthesise this literature (e.g., Jonker et al. International Journal of Geriatric Psychiatry, 15, 983-991, 2000; Reid and MacLullich Dementia and Geriatric Cognitive Disorders, 22(5-6), 471-485, 2006; Crumley et al. Psychology and Aging, 29(2), 250-263, 2014), recent work continues to offer little resolution. This review provides both quantitative and qualitative synthesis of research conducted since the last comprehensive review in 2006, with the aim of identifying reasons for these discrepancies that might provide fruitful avenues for future exploration. Meta-analysis found a small but significant association between SCCs and objective cognitive function, although it was limited by large heterogeneity between studies and evidence of potential publication bias. Often, assessments of SCCs and objective cognitive function were brief or not formally validated. However, studies that employed more comprehensive SCC measures tended to find that SCCs were associated independently with both objective cognitive function and depressive symptoms. Further explicit investigation of how assessment measures relate to reports of SCCs, and the validity of the proposed 'compensation theory' of SCC aetiology, is recommended.
Diversity of rationality affects the evolution of cooperation
NASA Astrophysics Data System (ADS)
Chen, Yu-Zhong; Huang, Zi-Gang; Wang, Sheng-Jun; Zhang, Yan; Wang, Ying-Hai
2009-05-01
By modifying the Fermi updating rule, we introduce diversity of individual rationality into the evolutionary prisoner’s dilemma game, and our results show that this diversity heavily influences the evolution of cooperation. The cluster-forming mechanism of cooperators can be either highly enhanced or severely deteriorated by different distributions of rationality. A slight change in the rationality distribution may shift the whole system from the global absorbing state of cooperators to that of defectors. Based on a mean-field argument, quantitative analysis of the stability of cooperative clusters reveals the critical role played by agents with moderate degree values in the evolution of the whole system. Our work may provide a deeper comprehension of some related social phenomena.
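The Fermi updating rule with agent-specific rationality can be written in a few lines; in the sketch below, agent x adopts neighbor y's strategy with probability 1/(1 + exp((P_x - P_y)/K_x)), where K_x is the agent's own rationality (noise) parameter drawn from an assumed uniform distribution. This illustrates the general idea of rationality diversity rather than the authors' exact modification.

```python
import numpy as np

rng = np.random.default_rng(0)

def fermi_adopt_probability(payoff_x, payoff_y, k_x):
    """Probability that agent x adopts neighbor y's strategy (Fermi rule),
    with an agent-specific rationality/noise parameter k_x."""
    return 1.0 / (1.0 + np.exp((payoff_x - payoff_y) / k_x))

# Diversity of rationality: each agent draws its own K (assumed uniform here)
n_agents = 1000
k = rng.uniform(0.01, 1.0, n_agents)

# Example: agent 0 (payoff 1.0) compares its payoff with a neighbor earning 1.3
p = fermi_adopt_probability(1.0, 1.3, k[0])
print(f"K_0 = {k[0]:.2f}, adoption probability = {p:.2f}")
```

Small K_x makes the agent nearly deterministic (highly rational), while large K_x makes imitation almost random, which is exactly the axis along which the distribution of rationality is varied.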
Nanosecond laser coloration on stainless steel surface.
Lu, Yan; Shi, Xinying; Huang, Zhongjia; Li, Taohai; Zhang, Meng; Czajkowski, Jakub; Fabritius, Tapio; Huttula, Marko; Cao, Wei
2017-08-02
In this work, we present laser coloration on 304 stainless steel using a nanosecond laser. Surface modifications are tuned by adjusting the laser parameters of scanning speed, repetition rate, and pulse width. A comprehensive study of the physical mechanism leading to the appearance is presented. Microscopic patterns are measured and employed as input to simulate light-matter interference, while chemical states and crystal structures of the surface composites are analyzed to determine intrinsic colors. Quantitative analysis clarifies that the final colors and RGB values are combinations of structural colors and intrinsic colors from the oxidized pigments, with the latter dominating. The engineering and scientific insights into nanosecond laser coloration thus support large-scale utilization of the present route for colorful and resistant steels.
Three-dimensional Bragg coherent diffraction imaging of an extended ZnO crystal.
Huang, Xiaojing; Harder, Ross; Leake, Steven; Clark, Jesse; Robinson, Ian
2012-08-01
A complex three-dimensional quantitative image of an extended zinc oxide (ZnO) crystal has been obtained using Bragg coherent diffraction imaging integrated with ptychography. By scanning a 2.5 µm-long arm of a ZnO tetrapod across a 1.3 µm X-ray beam with fine step sizes while measuring a three-dimensional diffraction pattern at each scan spot, the three-dimensional electron density and projected displacement field of the entire crystal were recovered. The simultaneously reconstructed complex wavefront of the illumination combined with its coherence properties determined by a partial coherence analysis implemented in the reconstruction process provide a comprehensive characterization of the incident X-ray beam.
Krüger, Angela V; Jelier, Rob; Dzyubachyk, Oleh; Zimmerman, Timo; Meijering, Erik; Lehner, Ben
2015-02-15
Chromatin regulators are widely expressed proteins with diverse roles in gene expression, nuclear organization, cell cycle regulation, pluripotency, physiology and development, and are frequently mutated in human diseases such as cancer. Their inhibition often results in pleiotropic effects that are difficult to study using conventional approaches. We have developed a semi-automated nuclear tracking algorithm to quantify the divisions, movements and positions of all nuclei during the early development of Caenorhabditis elegans and have used it to systematically study the effects of inhibiting chromatin regulators. The resulting high dimensional datasets revealed that inhibition of multiple regulators, including F55A3.3 (encoding FACT subunit SUPT16H), lin-53 (RBBP4/7), rba-1 (RBBP4/7), set-16 (MLL2/3), hda-1 (HDAC1/2), swsn-7 (ARID2), and let-526 (ARID1A/1B) affected cell cycle progression and caused chromosome segregation defects. In contrast, inhibition of cir-1 (CIR1) accelerated cell division timing in specific cells of the AB lineage. The inhibition of RNA polymerase II also accelerated these division timings, suggesting that normal gene expression is required to delay cell cycle progression in multiple lineages in the early embryo. Quantitative analyses of the dataset suggested the existence of at least two functionally distinct SWI/SNF chromatin remodeling complex activities in the early embryo, and identified a redundant requirement for the egl-27 and lin-40 MTA orthologs in the development of endoderm and mesoderm lineages. Moreover, our dataset also revealed a characteristic rearrangement of chromatin to the nuclear periphery upon the inhibition of multiple general regulators of gene expression. Our systematic, comprehensive and quantitative datasets illustrate the power of single cell-resolution quantitative tracking and high dimensional phenotyping to investigate gene function. Furthermore, the results provide an overview of the functions of essential chromatin regulators during the early development of an animal. Copyright © 2014 Elsevier Inc. All rights reserved.
Mordini, Federico E; Haddad, Tariq; Hsu, Li-Yueh; Kellman, Peter; Lowrey, Tracy B; Aletras, Anthony H; Bandettini, W Patricia; Arai, Andrew E
2014-01-01
This study's primary objective was to determine the sensitivity, specificity, and accuracy of fully quantitative stress perfusion cardiac magnetic resonance (CMR) versus a reference standard of quantitative coronary angiography. We hypothesized that fully quantitative analysis of stress perfusion CMR would have high diagnostic accuracy for identifying significant coronary artery stenosis and exceed the accuracy of semiquantitative measures of perfusion and qualitative interpretation. Relatively few studies apply fully quantitative CMR perfusion measures to patients with coronary disease and comparisons to semiquantitative and qualitative methods are limited. Dual bolus dipyridamole stress perfusion CMR exams were performed in 67 patients with clinical indications for assessment of myocardial ischemia. Stress perfusion images alone were analyzed with a fully quantitative perfusion (QP) method and 3 semiquantitative methods including contrast enhancement ratio, upslope index, and upslope integral. Comprehensive exams (cine imaging, stress/rest perfusion, late gadolinium enhancement) were analyzed qualitatively with 2 methods including the Duke algorithm and standard clinical interpretation. A 70% or greater stenosis by quantitative coronary angiography was considered abnormal. The optimum diagnostic threshold for QP determined by receiver-operating characteristic curve occurred when endocardial flow decreased to <50% of mean epicardial flow, which yielded a sensitivity of 87% and specificity of 93%. The area under the curve for QP was 92%, which was superior to semiquantitative methods: contrast enhancement ratio: 78%; upslope index: 82%; and upslope integral: 75% (p = 0.011, p = 0.019, p = 0.004 vs. QP, respectively). Area under the curve for QP was also superior to qualitative methods: Duke algorithm: 70%; and clinical interpretation: 78% (p < 0.001 and p < 0.001 vs. QP, respectively). Fully quantitative stress perfusion CMR has high diagnostic accuracy for detecting obstructive coronary artery disease. QP outperforms semiquantitative measures of perfusion and qualitative methods that incorporate a combination of cine, perfusion, and late gadolinium enhancement imaging. These findings suggest a potential clinical role for quantitative stress perfusion CMR. Copyright © 2014 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
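The reported operating point can be illustrated by applying the endocardial-to-epicardial flow-ratio criterion to per-territory data and tabulating sensitivity, specificity, and accuracy against the angiographic reference; the flow values in the sketch below are synthetic placeholders, not the study's measurements.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic per-territory data: stress endo/epi flow ratio and the angiographic
# truth (>= 70% stenosis). All numbers are placeholders for illustration.
n = 200
stenosis = rng.random(n) < 0.35
flow_ratio = np.where(stenosis,
                      rng.normal(0.40, 0.12, n),   # ischemic territories
                      rng.normal(0.75, 0.12, n))   # normal territories

positive = flow_ratio < 0.50       # QP criterion: endo flow < 50% of epi flow
tp = np.sum(positive & stenosis)
tn = np.sum(~positive & ~stenosis)
fp = np.sum(positive & ~stenosis)
fn = np.sum(~positive & stenosis)

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
accuracy = (tp + tn) / n
print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}, "
      f"accuracy={accuracy:.2f}")
```

Sweeping the threshold over a range of flow-ratio values and plotting sensitivity against 1 - specificity reproduces the receiver-operating characteristic analysis used to pick the optimum cut-off.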
Load Model Verification, Validation and Calibration Framework by Statistical Analysis on Field Data
NASA Astrophysics Data System (ADS)
Jiao, Xiangqing; Liao, Yuan; Nguyen, Thai
2017-11-01
Accurate load models are critical for power system analysis and operation. A large amount of research work has been done on load modeling. Most of the existing research focuses on developing load models, while little has been done on developing formal load model verification and validation (V&V) methodologies or procedures. Most of the existing load model validation is based on qualitative rather than quantitative analysis. In addition, not all aspects of the model V&V problem have been addressed by the existing approaches. To complement the existing methods, this paper proposes a novel load model verification and validation framework that can systematically and more comprehensively examine a load model's effectiveness and accuracy. Statistical analysis, instead of visual checks, quantifies the load model's accuracy and provides a confidence level in the developed load model for model users. The analysis results can also be used to calibrate load models. The proposed framework can serve as guidance for utility engineers and researchers to systematically examine load models. The proposed method is demonstrated through analysis of field measurements collected from a utility system.
Racism as a determinant of health: a protocol for conducting a systematic review and meta-analysis
2013-01-01
Background: Racism is increasingly recognized as a key determinant of health. A growing body of epidemiological evidence shows strong associations between self-reported racism and poor health outcomes across diverse minority groups in developed countries. While the relationship between racism and health has received increasing attention over the last two decades, a comprehensive meta-analysis focused on the health effects of racism has yet to be conducted. The aim of this review protocol is to provide a structure from which to conduct a systematic review and meta-analysis of studies that assess the relationship between racism and health. Methods: This research will consist of a systematic review and meta-analysis. Studies will be considered for review if they are empirical studies reporting quantitative data on the association between racism and health for adults and/or children of all ages from any racial/ethnic/cultural groups. Outcome measures will include general health and well-being, physical health, mental health, healthcare use and health behaviors. Scientific databases (for example, Medline) will be searched using a comprehensive search strategy and reference lists will be manually searched for relevant studies. In addition, use of online search engines (for example, Google Scholar), key websites, and personal contact with experts will also be undertaken. Screening of search results and extraction of data from included studies will be independently conducted by at least two authors, including assessment of inter-rater reliability. Studies included in the review will be appraised for quality using tools tailored to each study design. Summary statistics of study characteristics and findings will be compiled and findings synthesized in a narrative summary as well as a meta-analysis. Discussion: This review aims to examine associations between reported racism and health outcomes. This comprehensive and systematic review and meta-analysis of empirical research will provide a rigorous and reliable evidence base for future research, policy and practice, including information on the extent of available evidence for a range of racial/ethnic minority groups. PMID:24059279
Rethinking vulnerability analysis and governance with emphasis on a participatory approach.
Rossignol, Nicolas; Delvenne, Pierre; Turcanu, Catrinel
2015-01-01
This article draws on vulnerability analysis as it emerged as a complement to classical risk analysis, and it aims at exploring its ability for nurturing risk and vulnerability governance actions. An analysis of the literature on vulnerability analysis allows us to formulate a three-fold critique: first, vulnerability analysis has been treated separately in the natural and the technological hazards fields. This separation prevents vulnerability from unleashing the full range of its potential, as it constrains appraisals into artificial categories and thus already closes down the outcomes of the analysis. Second, vulnerability analysis focused on assessment tools that are mainly quantitative, whereas qualitative appraisal is a key to assessing vulnerability in a comprehensive way and to informing policy making. Third, a systematic literature review of case studies reporting on participatory approaches to vulnerability analysis allows us to argue that participation has been important to address the above, but it remains too closed down in its approach and would benefit from embracing a more open, encompassing perspective. Therefore, we suggest rethinking vulnerability analysis as one part of a dynamic process between opening-up and closing-down strategies, in order to support a vulnerability governance framework. © 2014 Society for Risk Analysis.
Investigation and comprehensive evaluation of the litter pollution on the Heishijiao beach in Dalian
NASA Astrophysics Data System (ADS)
Han, Mengdi; Zhao, Kaiyuan; Zhang, Yan; Sui, Chuanguo
2018-02-01
From November 2015 to August 2016, this study investigated the classification of the litter on the Heishijiao beach in Dalian and made a comprehensive evaluation of the litter pollution on the beach in different seasons. According to the results, the litter on the Heishijiao beach in Dalian mainly comes from human offshore activities and other wastes, and spring sees the largest quantity of litter resulting from these activities. Most of the fragmental wastes are glass, plastic and paper, while there are only small amounts of metal, rubber and wooden products. On the Heishijiao beach, most fragmental litter items are small, followed by medium and large ones; outsized wastes are rare. The quantitative density of litter is highest in winter (9.0 items/m2), with an average quantitative density of 4.6 items/m2; the qualitative density of litter is highest in spring (8 g/m2), with an average qualitative density of 6.0 g/m2. The results of the comprehensive evaluation show that the litter pollution on the Heishijiao beach falls between “Average” and “Unsatisfactory”.
Quantitative developmental transcriptomes of the Mediterranean sea urchin Paracentrotus lividus.
Gildor, Tsvia; Malik, Assaf; Sher, Noa; Avraham, Linor; Ben-Tabou de-Leon, Smadar
2016-02-01
Embryonic development progresses through the timely activation of thousands of differentially activated genes. Quantitative developmental transcriptomes provide the means to relate global patterns of differentially expressed genes to the emerging body plans they generate. The sea urchin is one of the classic model systems for embryogenesis, and the models of its developmental gene regulatory networks are among the most comprehensive of their kind. Thus, the sea urchin embryo is an excellent system for studies of its global developmental transcriptional profiles. Here we produced quantitative developmental transcriptomes of the sea urchin Paracentrotus lividus (P. lividus) at seven developmental stages from the fertilized egg to the prism stage. We generated a de novo reference transcriptome and identified 29,817 genes that are expressed during this time period. We annotated and quantified gene expression at the different developmental stages and confirmed the reliability of the expression profiles by QPCR measurement of a subset of genes. The progression of embryo development is reflected in the observed global expression patterns and in our principal component analysis. Our study illuminates the rich patterns of gene expression that participate in sea urchin embryogenesis and provides an essential resource for further studies of the dynamic expression of P. lividus genes. Copyright © 2015 Elsevier B.V. All rights reserved.
Flightdeck Automation Problems (FLAP) Model for Safety Technology Portfolio Assessment
NASA Technical Reports Server (NTRS)
Ancel, Ersin; Shih, Ann T.
2014-01-01
NASA's Aviation Safety Program (AvSP) develops and advances methodologies and technologies to improve air transportation safety. The Safety Analysis and Integration Team (SAIT) conducts a safety technology portfolio assessment (PA) to analyze the program content, to examine the benefits and risks of products with respect to program goals, and to support programmatic decision making. The PA process includes systematic identification of current and future safety risks as well as tracking several quantitative and qualitative metrics to ensure the program goals are addressing prominent safety risks accurately and effectively. One of the metrics within the PA process involves using quantitative aviation safety models to gauge the impact of the safety products. This paper demonstrates the role of aviation safety modeling by providing model outputs and evaluating a sample of portfolio elements using the Flightdeck Automation Problems (FLAP) model. The model enables not only ranking of the quantitative relative risk reduction impact of all portfolio elements, but also highlighting the areas with high potential impact via sensitivity and gap analyses in support of the program office. Although the model outputs are preliminary and products are notional, the process shown in this paper is essential to a comprehensive PA of NASA's safety products in the current program and future programs/projects.
Han, Xianlin; Yang, Kui; Gross, Richard W.
2011-01-01
Since our last comprehensive review on multi-dimensional mass spectrometry-based shotgun lipidomics (Mass Spectrom. Rev. 24 (2005), 367), many new developments in the field of lipidomics have occurred. These developments include new strategies and refinements for shotgun lipidomic approaches that use direct infusion, including novel fragmentation strategies, identification of multiple new informative dimensions for mass spectrometric interrogation, and the development of new bioinformatic approaches for enhanced identification and quantitation of the individual molecular constituents that comprise each cell’s lipidome. Concurrently, advances in liquid chromatography-based platforms and novel strategies for quantitative matrix-assisted laser desorption/ionization mass spectrometry for lipidomic analyses have been developed. Through the synergistic use of this repertoire of new mass spectrometric approaches, the power and scope of lipidomics has been greatly expanded to accelerate progress toward the comprehensive understanding of the pleiotropic roles of lipids in biological systems. PMID:21755525
NASA Astrophysics Data System (ADS)
Ruan, Wenzhi; Yan, Limei; He, Jiansen; Zhang, Lei; Wang, Linghua; Wei, Yong
2018-06-01
Shock waves are believed to play an important role in plasma heating. Shock-like temporal jumps in radiation intensity and Doppler shift have been identified in the solar atmosphere. However, a quantitative diagnosis of the shocks in the solar atmosphere is still lacking, seriously hindering the understanding of shock dissipative heating of the solar atmosphere. Here, we propose a new method for quantitative shock diagnosis, based on the Rankine–Hugoniot equations and taking advantage of simultaneous imaging and spectroscopic observations from, e.g., IRIS (Interface Region Imaging Spectrograph). With this method, the key parameters of shock candidates can be derived, such as the bulk velocity and temperature of the plasma upstream and downstream of the shock, and the propagation speed and direction. The method is applied to the shock candidates observed by IRIS, and the overall characteristics of the shocks are revealed quantitatively for the first time. This method is also tested with the help of forward modeling, i.e., virtual observations of simulated shocks. The parameters obtained from the method are consistent with the parameters of the shock formed in the model and are independent of the viewing direction. Therefore, the method proposed here is applicable to the quantitative and comprehensive diagnosis of the observed shocks in the solar atmosphere.
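For reference, the standard hydrodynamic Rankine–Hugoniot jump conditions relating the upstream (1) and downstream (2) states in the shock rest frame are reproduced below; magnetic terms are omitted, and the paper's method applies relations of this general form to the observed quantities rather than exactly these equations.

```latex
% Hydrodynamic jump conditions across a steady shock (shock rest frame),
% with mass density \rho, normal flow speed u, pressure p, adiabatic index \gamma.
\begin{align}
  \rho_1 u_1 &= \rho_2 u_2, &&\text{(mass)}\\
  \rho_1 u_1^2 + p_1 &= \rho_2 u_2^2 + p_2, &&\text{(momentum)}\\
  \tfrac{1}{2}u_1^2 + \frac{\gamma}{\gamma-1}\frac{p_1}{\rho_1}
    &= \tfrac{1}{2}u_2^2 + \frac{\gamma}{\gamma-1}\frac{p_2}{\rho_2}.
    &&\text{(energy)}
\end{align}
```

Given line-of-sight velocities and temperatures on the two sides from the spectra, plus the apparent propagation speed from imaging, these relations over-determine the shock parameters, which is what allows a consistency check as well as a solution.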
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beck, R.N.; Cooper, M.D.
1985-09-01
This comprehensive report outlines the progress made during the past three years in the areas described below. In all instances, initial studies have been carried out and the technical feasibility of carrying through each study has been demonstrated. The studies described include development of cesium-130 and bromine-75 radioisotope generators, the feasibility of using rubidium-82 as a myocardial imaging agent, and radiochemical preparation of C-11 deoxyglucose. 28 refs. (DT)
Kiefl, Johannes; Cordero, Chiara; Nicolotti, Luca; Schieberle, Peter; Reichenbach, Stephen E; Bicchi, Carlo
2012-06-22
The continuing interest in non-targeted profiling has driven the development of tools for automated cross-sample analysis. Such tools were found to be either selective or not comprehensive, thus delivering a biased view of the qualitative/quantitative peak distribution across 2D sample chromatograms. Therefore, the performance of non-targeted approaches needs to be critically evaluated. This study focused on the development of a validation procedure for non-targeted, peak-based, GC×GC-MS data profiling. The procedure introduced performance parameters such as specificity, precision, accuracy, and uncertainty for a profiling method known as Comprehensive Template Matching. The performance was assessed by applying a three-week validation protocol based on CITAC/EURACHEM guidelines. Optimized ¹D and ²D retention time search windows, MS match factor threshold, detection threshold, and template threshold were evolved from two training sets by a semi-automated learning process. The effectiveness of the proposed settings in consistently matching 2D peak patterns was established by evaluating the rate of mismatched peaks and was expressed in terms of results accuracy. The study utilized 23 different 2D peak patterns providing the chemical fingerprints of raw and roasted hazelnuts (Corylus avellana L.) from different geographical origins, of diverse varieties and different roasting degrees. The validation results show that non-targeted peak-based profiling can be reliable with error rates lower than 10%, independent of the degree of analytical variance. The optimized Comprehensive Template Matching procedure was employed to study hazelnut roasting profiles and in particular to find marker compounds strongly dependent on the thermal treatment, and to establish the correlation of potential marker compounds to geographical origin and variety/cultivar and finally to reveal the characteristic release of aroma active compounds. Copyright © 2012 Elsevier B.V. All rights reserved.
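The core matching test, in which a candidate 2D peak is assigned to a template peak only if both retention-time deviations fall inside the optimized windows and the MS match factor clears its threshold, can be sketched as follows; the window widths and thresholds below are illustrative assumptions, not the validated settings.

```python
from dataclasses import dataclass

@dataclass
class Peak:
    rt1: float        # first-dimension retention time (min)
    rt2: float        # second-dimension retention time (s)
    ms_match: float   # spectral match factor vs. the template entry (0-999)

def matches_template(peak, template_peak, w1=0.20, w2=0.10, ms_threshold=700):
    """True if the candidate peak matches the template peak.

    Window widths (w1, w2) and the MS threshold are illustrative values,
    not the optimized settings from the validation study.
    """
    return (abs(peak.rt1 - template_peak.rt1) <= w1
            and abs(peak.rt2 - template_peak.rt2) <= w2
            and peak.ms_match >= ms_threshold)

template = Peak(rt1=24.50, rt2=1.80, ms_match=999)
candidates = [Peak(24.58, 1.85, 850), Peak(24.90, 1.82, 910), Peak(24.55, 1.78, 620)]
hits = [p for p in candidates if matches_template(p, template)]
print(f"{len(hits)} of {len(candidates)} candidate peaks match the template")
```

Counting missed and spurious assignments of this kind across replicate chromatograms is what yields the specificity, precision, and accuracy figures reported for the procedure.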
Costenoble, Roeland; Picotti, Paola; Reiter, Lukas; Stallmach, Robert; Heinemann, Matthias; Sauer, Uwe; Aebersold, Ruedi
2011-01-01
Decades of biochemical research have identified most of the enzymes that catalyze metabolic reactions in the yeast Saccharomyces cerevisiae. The adaptation of metabolism to changing nutritional conditions, in contrast, is much less well understood. As an important stepping stone toward such understanding, we exploit the power of proteomics assays based on selected reaction monitoring (SRM) mass spectrometry to quantify abundance changes of the 228 proteins that constitute the central carbon and amino-acid metabolic network in the yeast Saccharomyces cerevisiae, at five different metabolic steady states. Overall, 90% of the targeted proteins, including families of isoenzymes, were consistently detected and quantified in each sample, generating a proteomic data set that represents a nutritionally perturbed biological system at high reproducibility. The data set is near comprehensive because we detect 95–99% of all proteins that are required under a given condition. Interpreted through flux balance modeling, the data indicate that S. cerevisiae retains proteins not necessarily used in a particular environment. Further, the data suggest differential functionality for several metabolic isoenzymes. PMID:21283140
Quantitative proteomic analysis in breast cancer.
Tabchy, A; Hennessy, B T; Gonzalez-Angulo, A M; Bernstam, F M; Lu, Y; Mills, G B
2011-02-01
Much progress has recently been made in the genomic and transcriptional characterization of tumors. However, historically the characterization of cells at the protein level has suffered limitations in reproducibility, scalability and robustness. Recent technological advances have made it possible to accurately and reproducibly portray the global levels and active states of cellular proteins. Protein microarrays examine the native post-translational conformations of proteins including activated phosphorylated states, in a comprehensive high-throughput mode, and can map activated pathways and networks of proteins inside the cells. The reverse-phase protein microarray (RPPA) offers a unique opportunity to study signal transduction networks in small biological samples such as human biopsy material and can provide critical information for therapeutic decision-making and the monitoring of patients for targeted molecular medicine. By providing the key missing link to the story generated from genomic and gene expression characterization efforts, functional proteomics offer the promise of a comprehensive understanding of cancer. Several initial successes in breast cancer are showing that such information is clinically relevant. Copyright 2011 Prous Science, S.A.U. or its licensors. All rights reserved.
NASA Astrophysics Data System (ADS)
Guo, Lijuan; Yan, Haijun; Hao, Yongqi; Chen, Yun
2018-01-01
As urban power grids develop toward higher power supply reliability, appropriate methods are needed for the comprehensive evaluation of existing equipment. Given the breadth and multi-dimensionality of power system data, big data mining is used to explore the potential patterns and value in power system equipment data. Based on main transformer monitoring data and records of defects and faults, this paper integrates data on the power grid equipment environment. Apriori is used as an association rule mining algorithm to extract the frequent correlation factors of the main transformer, and the potential dependencies in the big data are analyzed through support and confidence. The integrated data are then analyzed by PCA, and an integrated quantitative scoring model is constructed. The evaluation algorithm and scheme are shown to be effective on a test set. This paper provides a new idea for data fusion in the smart grid and a reference for further evaluation of big data on power grid equipment.
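As a minimal illustration of the association step described above, the sketch below computes the support and confidence of one candidate rule over a few hypothetical transformer event records; the item names and records are invented for illustration and are not from the paper's data set.

```python
# Minimal sketch: support and confidence for an association rule over hypothetical
# transformer event records. Item names and records are illustrative assumptions.

records = [
    {"high_load", "oil_temp_high", "defect"},
    {"high_load", "oil_temp_high"},
    {"high_humidity", "defect"},
    {"high_load", "oil_temp_high", "defect"},
]

def support(itemset, records):
    """Fraction of records containing every item in the itemset."""
    return sum(itemset <= r for r in records) / len(records)

def confidence(antecedent, consequent, records):
    """Conditional frequency of the consequent given the antecedent."""
    return support(antecedent | consequent, records) / support(antecedent, records)

rule_a, rule_c = {"high_load", "oil_temp_high"}, {"defect"}
print("support:", support(rule_a | rule_c, records))
print("confidence:", confidence(rule_a, rule_c, records))
```

An Apriori implementation enumerates all frequent itemsets above a support threshold before computing such confidences; the sketch only shows the two measures for a single rule.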
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ehlen, Mark Andrew; Vugrin, Eric D.; Warren, Drake E.
In recent years, the nation has recognized that critical infrastructure protection should consider not only the prevention of disruptive events, but also the processes that infrastructure systems undergo to maintain functionality following disruptions. This more comprehensive approach has been termed critical infrastructure resilience (CIR). Given the occurrence of a particular disruptive event, the resilience of a system to that event is the system's ability to efficiently reduce both the magnitude and duration of the deviation from targeted system performance levels. Sandia National Laboratories (Sandia) has developed a comprehensive resilience assessment framework for evaluating the resilience of infrastructure and economic systems. The framework includes a quantitative methodology that measures resilience costs that result from a disruption to infrastructure function. The framework also includes a qualitative analysis methodology that assesses system characteristics that affect resilience in order to provide insight and direction for potential improvements to resilience. This paper describes the resilience assessment framework. This paper further demonstrates the utility of the assessment framework through application to a hypothetical scenario involving the disruption of a petrochemical supply chain by a hurricane.
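One common way to quantify resilience cost is the cumulative deviation of system performance from its target over the recovery horizon. The sketch below computes that quantity for a hypothetical recovery curve; it is a generic illustration under assumed data, not Sandia's assessment framework itself.

```python
# Minimal sketch of a resilience cost measure: the area between targeted and actual
# system performance after a disruption. The performance trace and target below are
# hypothetical assumptions.
import numpy as np

t = np.linspace(0.0, 30.0, 301)                      # days after disruption
target = np.ones_like(t)                             # normalized target performance
performance = 1.0 - 0.6 * np.exp(-t / 7.0)           # hypothetical recovery curve

systemic_impact = np.trapz(target - performance, t)  # performance-days lost
print(f"systemic impact (performance-days lost): {systemic_impact:.2f}")
```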
Clean Indoor Air Ordinance Coverage in the Appalachian Region of the United States
Liber, Alex; Pennell, Michael; Nealy, Darren; Hammer, Jana; Berman, Micah
2010-01-01
Objectives. We sought to quantitatively examine the pattern of, and socioeconomic factors associated with, adoption of clean indoor air ordinances in Appalachia. Methods. We collected and reviewed clean indoor air ordinances in Appalachian communities in 6 states and rated the ordinances for completeness of coverage in workplaces, restaurants, and bars. Additionally, we computed a strength score to measure coverage in 7 locations. We fit mixed-effects models to determine whether the presence of a comprehensive ordinance and the ordinance strength were related to community socioeconomic disadvantage. Results. Of the 332 communities included in the analysis, fewer than 20% had adopted a comprehensive workplace, restaurant, or bar ordinance. Most ordinances were weak, achieving on average only 43% of the total possible points. Communities with a higher unemployment rate were less likely and those with a higher education level were more likely to have a strong ordinance. Conclusions. The majority of residents in these communities are not protected from secondhand smoke. Efforts to pass strong statewide clean indoor air laws should take priority over local initiatives in these states. PMID:20466957
Occupational hazard evaluation model underground coal mine based on unascertained measurement theory
NASA Astrophysics Data System (ADS)
Deng, Quanlong; Jiang, Zhongan; Sun, Yaru; Peng, Ya
2017-05-01
To comprehensively evaluate the influence of several occupational hazards on miners' physical and mental health, an occupational hazard evaluation indicator system was established on the basis of unascertained measurement theory for combined quantitative and qualitative analysis. Indicator weights were determined by information entropy, the occupational hazard level was estimated with credible degree recognition criteria, and the evaluation model was programmed in Visual Basic. The model was applied to the comprehensive evaluation of occupational hazards at six posts in an underground coal mine, and the occupational hazard degree was graded; the evaluation results are consistent with the actual situation. The results show that dust and noise are the most prominent occupational hazard factors in the coal mine. Excavation face support workers are most affected, followed by heading machine drivers, coal cutter drivers, and coalface support-moving workers; the occupational hazard degree of these four types of workers is level II (mild). The occupational hazard degree of ventilation workers and safety inspection workers is level I. The evaluation model can evaluate underground coal mines objectively and accurately and can be employed in actual engineering practice.
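The entropy-weighting step mentioned above can be sketched as follows for a hypothetical posts-by-indicators matrix; the indicator values are invented, and the sketch does not reproduce the unascertained measurement model or the credible degree recognition criteria.

```python
# Minimal sketch of entropy-based indicator weighting. The matrix below
# (rows: posts, columns: indicators such as dust, noise, heat) is hypothetical.
import numpy as np

x = np.array([[8.0, 85.0, 28.0],
              [6.0, 90.0, 26.0],
              [4.0, 80.0, 30.0]])

p = x / x.sum(axis=0)                              # column-normalized proportions
k = 1.0 / np.log(x.shape[0])
entropy = -k * (p * np.log(p)).sum(axis=0)         # entropy of each indicator
weights = (1.0 - entropy) / (1.0 - entropy).sum()  # more informative indicators get larger weights
print("indicator weights:", np.round(weights, 3))
```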
López-Vallejo, Fabian; Nefzi, Adel; Bender, Andreas; Owen, John R.; Nabney, Ian T.; Houghten, Richard A.; Medina-Franco, Jose L.
2011-01-01
Combinatorial libraries continue to play a key role in drug discovery. To increase structural diversity, several experimental methods have been developed. However, limited effort has so far been devoted to quantifying the diversity of the broadly used diversity-oriented synthesis (DOS) libraries. Herein we report a comprehensive characterization of 15 bis-diazacyclic combinatorial libraries obtained through the libraries-from-libraries strategy, which is a DOS approach. Using MACCS keys, radial and several pharmacophoric fingerprints, as well as six molecular properties, the increased structural and property diversity of the libraries from libraries over the individual libraries was demonstrated. Comparison of the libraries to existing drugs, the NCI Diversity set, and the Molecular Libraries Small Molecule Repository revealed the structural uniqueness of the combinatorial libraries (mean similarity < 0.5 for any fingerprint representation). In particular, bis-cyclic thiourea libraries were the most structurally dissimilar to drugs while retaining drug-like character in property space. This study represents the first comprehensive quantification of the diversity of libraries from libraries, providing a solid quantitative approach to compare and contrast the diversity of DOS libraries with existing drugs or any other compound collection. PMID:21294850
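As a minimal illustration of the fingerprint-based similarity comparisons described above, the sketch below computes a MACCS-keys Tanimoto similarity between two compounds, assuming the open-source RDKit toolkit (the paper's own software is not specified here) and illustrative SMILES strings.

```python
# Minimal sketch of fingerprint-based similarity between two compounds, assuming the
# RDKit toolkit and illustrative SMILES strings (neither is taken from the paper).
from rdkit import Chem, DataStructs
from rdkit.Chem import MACCSkeys

smiles_a = "CC(=O)Oc1ccccc1C(=O)O"   # aspirin, as a stand-in library member
smiles_b = "c1ccccc1O"               # phenol, as a stand-in reference compound

fp_a = MACCSkeys.GenMACCSKeys(Chem.MolFromSmiles(smiles_a))
fp_b = MACCSkeys.GenMACCSKeys(Chem.MolFromSmiles(smiles_b))

# Tanimoto similarity lies in [0, 1]; a mean similarity < 0.5 to a reference set is
# the kind of criterion the abstract uses to call libraries structurally distinct.
print("Tanimoto (MACCS):", DataStructs.TanimotoSimilarity(fp_a, fp_b))
```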
Comprehensive analysis of Arabidopsis expression level polymorphisms with simple inheritance
Plantegenet, Stephanie; Weber, Johann; Goldstein, Darlene R; Zeller, Georg; Nussbaumer, Cindy; Thomas, Jérôme; Weigel, Detlef; Harshman, Keith; Hardtke, Christian S
2009-01-01
In Arabidopsis thaliana, gene expression level polymorphisms (ELPs) between natural accessions that exhibit simple, single locus inheritance are promising quantitative trait locus (QTL) candidates to explain phenotypic variability. It is assumed that such ELPs overwhelmingly represent regulatory element polymorphisms. However, comprehensive genome-wide analyses linking expression level, regulatory sequence and gene structure variation are missing, preventing definite verification of this assumption. Here, we analyzed ELPs observed between the Eil-0 and Lc-0 accessions. Compared with non-variable controls, 5′ regulatory sequence variation in the corresponding genes is indeed increased. However, ∼42% of all the ELP genes also carry major transcription unit deletions in one parent as revealed by genome tiling arrays, representing a >4-fold enrichment over controls. Within the subset of ELPs with simple inheritance, this proportion is even higher and deletions are generally more severe. Similar results were obtained from analyses of the Bay-0 and Sha accessions, using alternative technical approaches. Collectively, our results suggest that drastic structural changes are a major cause for ELPs with simple inheritance, corroborating experimentally observed indel preponderance in cloned Arabidopsis QTL. PMID:19225455
Takada, Junya; Honda, Norihiro; Hazama, Hisanao; Ioritani, Naomasa
2016-01-01
Background and Aims: Laser vaporization of the prostate is expected as a less invasive treatment for benign prostatic hyperplasia (BPH), via the photothermal effect. In order to develop safer and more effective laser vaporization of the prostate, it is essential to set optimal irradiation parameters based on quantitative evaluation of temperature distribution and thermally denatured depth in prostate tissue. Method: A simulation model was therefore devised with light propagation and heat transfer calculation, and the vaporized and thermally denatured depths were estimated by the simulation model. Results: The results of the simulation were compared with those of an ex vivo experiment and clinical trial. Based on the accumulated data, the vaporized depth strongly depended on the distance between the optical fiber and the prostate tissue, and it was suggested that contact laser irradiation could vaporize the prostate tissue most effectively. Additionally, it was suggested by analyzing thermally denatured depth comprehensively that laser irradiation at the distance of 3 mm between the optical fiber and the prostate tissue was useful for hemostasis. Conclusions: This study enabled quantitative and reproducible analysis of laser vaporization for BPH and will play a role in clarification of the safety and efficacy of this treatment. PMID:28765672
Smadi, Hanan; Sargeant, Jan M
2013-02-01
The current quantitative risk assessment model followed the framework proposed by the Codex Alimentarius to provide an estimate of the risk of human salmonellosis due to consumption of chicken breasts which were bought from Canadian retail stores and prepared in Canadian domestic kitchens. The model simulated the level of Salmonella contamination on chicken breasts throughout the retail-to-table pathway. The model used Canadian input parameter values, where available, to represent risk of salmonellosis. From retail until consumption, changes in the concentration of Salmonella on each chicken breast were modeled using equations for growth and inactivation. The model predicted an average of 318 cases of salmonellosis per 100,000 consumers per year. Potential reasons for this overestimation were discussed. A sensitivity analysis showed that concentration of Salmonella on chicken breasts at retail and food hygienic practices in private kitchens such as cross-contamination due to not washing cutting boards (or utensils) and hands after handling raw meat along with inadequate cooking contributed most significantly to the risk of human salmonellosis. The outcome from this model emphasizes that responsibility for protection from Salmonella hazard on chicken breasts is a shared responsibility. Data needed for a comprehensive Canadian Salmonella risk assessment were identified for future research. © 2012 Society for Risk Analysis.
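The retail-to-table logic described above can be sketched as a simple Monte Carlo simulation of log-scale growth and inactivation followed by a dose-response step; all distributions and dose-response parameters below are hypothetical placeholders, not the Canadian inputs used in the model.

```python
# Minimal sketch of a retail-to-table Monte Carlo: log-scale growth during storage,
# log reduction during cooking, then an illustrative beta-Poisson dose-response.
# Every distribution and parameter here is a hypothetical placeholder.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

initial_log_cfu = rng.normal(1.0, 1.0, n)      # log CFU per chicken breast at retail
storage_growth  = rng.uniform(0.0, 1.5, n)     # log growth during storage
cooking_kill    = rng.uniform(3.0, 7.0, n)     # log reduction from cooking

final_log_cfu = initial_log_cfu + storage_growth - cooking_kill
dose = 10.0 ** final_log_cfu

alpha, beta = 0.3, 50.0                        # illustrative beta-Poisson parameters
p_ill = 1.0 - (1.0 + dose / beta) ** (-alpha)
print(f"mean risk per serving: {p_ill.mean():.2e}")
```

In a full model, cross-contamination pathways and consumer behaviour would add further branches to this simulation; the sketch keeps only the growth/inactivation backbone.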
Selection of reference standard during method development using the analytical hierarchy process.
Sun, Wan-yang; Tong, Ling; Li, Dong-xiang; Huang, Jing-yi; Zhou, Shui-ping; Sun, Henry; Bi, Kai-shun
2015-03-25
The reference standard is critical for ensuring reliable and accurate method performance. One important issue is how to select the ideal one from the alternatives. Unlike the optimization of parameters, the criteria for a reference standard are often not directly measurable. The aim of this paper is to recommend a quantitative approach for the selection of the reference standard during method development, based on the analytical hierarchy process (AHP) as a decision-making tool. Six alternative single reference standards were assessed for the quantitative analysis of six phenolic acids from Salvia miltiorrhiza and its preparations by ultra-performance liquid chromatography. The AHP model simultaneously considered six criteria related to reference standard characteristics and method performance: feasibility of procurement, abundance in samples, chemical stability, accuracy, precision, and robustness. The priority of each alternative was calculated using the standard AHP analysis method. The results showed that protocatechuic aldehyde is the ideal reference standard, with rosmarinic acid, at about 79.8% of its priority, as the second choice. The determination results verified the evaluation ability of this model. The AHP allowed comprehensive consideration of the benefits and risks of the alternatives and proved an effective and practical tool for the selection of reference standards during method development. Copyright © 2015 Elsevier B.V. All rights reserved.
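As a minimal illustration of the AHP calculation underlying this approach, the sketch below derives criterion weights from the principal eigenvector of a pairwise comparison matrix and checks the consistency ratio; the 3x3 matrix is a hypothetical example, not the paper's six-criterion judgments.

```python
# Minimal sketch of the AHP priority calculation: the principal eigenvector of a
# pairwise comparison matrix gives criterion weights; the consistency ratio checks
# the coherence of the judgments. The matrix below is a hypothetical example.
import numpy as np

A = np.array([[1.0,   3.0, 5.0],
              [1/3.0, 1.0, 2.0],
              [1/5.0, 1/2.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)           # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}[n]   # Saaty random index values
print("weights:", np.round(weights, 3), " CR:", round(ci / ri, 3))
```

A consistency ratio below about 0.1 is the usual acceptance rule; alternatives are then scored by combining criterion weights with their performance under each criterion.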
Chen, Gang; Adleman, Nancy E.; Saad, Ziad S.; Leibenluft, Ellen; Cox, Robert W.
2014-01-01
All neuroimaging packages can handle group analysis with t-tests or general linear modeling (GLM). However, they are quite hamstrung when there are multiple within-subject factors or when quantitative covariates are involved in the presence of a within-subject factor. In addition, sphericity is typically assumed for the variance–covariance structure when there are more than two levels in a within-subject factor. To overcome such limitations in the traditional AN(C)OVA and GLM, we adopt a multivariate modeling (MVM) approach to analyzing neuroimaging data at the group level with the following advantages: a) there is no limit on the number of factors as long as sample sizes are deemed appropriate; b) quantitative covariates can be analyzed together with within-subject factors; c) when a within-subject factor is involved, three testing methodologies are provided: traditional univariate testing (UVT) with sphericity assumption (UVT-UC) and with correction when the assumption is violated (UVT-SC), and within-subject multivariate testing (MVT-WS); d) to correct for sphericity violation at the voxel level, we propose a hybrid testing (HT) approach that achieves equal or higher power via combining traditional sphericity correction methods (Greenhouse–Geisser and Huynh–Feldt) with MVT-WS. PMID:24954281
NASA Astrophysics Data System (ADS)
Lutz, Norbert W.; Bernard, Monique
2018-02-01
We recently suggested a new paradigm for statistical analysis of thermal heterogeneity in (semi-)aqueous materials by 1H NMR spectroscopy, using water as a temperature probe. Here, we present a comprehensive in silico and in vitro validation that demonstrates the ability of this new technique to provide accurate quantitative parameters characterizing the statistical distribution of temperature values in a volume of (semi-)aqueous matter. First, line shape parameters of numerically simulated water 1H NMR spectra are systematically varied to study a range of mathematically well-defined temperature distributions. Then, corresponding models based on measured 1H NMR spectra of agarose gel are analyzed. In addition, dedicated samples based on hydrogels or biological tissue are designed to produce temperature gradients changing over time, and dynamic NMR spectroscopy is employed to analyze the resulting temperature profiles at sub-second temporal resolution. Accuracy and consistency of the previously introduced statistical descriptors of temperature heterogeneity are determined: weighted median and mean temperature, standard deviation, temperature range, temperature mode(s), kurtosis, skewness, entropy, and relative areas under temperature curves. Potential and limitations of this method for quantitative analysis of thermal heterogeneity in (semi-)aqueous materials are discussed in view of prospective applications in materials science as well as biology and medicine.
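The statistical descriptors named above can be sketched for an assumed intensity-weighted temperature distribution as follows; this stands in for, and does not reproduce, the water-peak line-shape analysis itself.

```python
# Minimal sketch of the statistical descriptors of a temperature distribution,
# computed for a hypothetical intensity-weighted profile (not measured NMR data).
import numpy as np

temps = np.linspace(30.0, 45.0, 301)                         # degrees C
weights = np.exp(-0.5 * ((temps - 37.0) / 1.5) ** 2)         # hypothetical profile
weights /= weights.sum()

mean = np.sum(weights * temps)
std = np.sqrt(np.sum(weights * (temps - mean) ** 2))
skew = np.sum(weights * ((temps - mean) / std) ** 3)
kurt = np.sum(weights * ((temps - mean) / std) ** 4) - 3.0   # excess kurtosis
median = temps[np.searchsorted(np.cumsum(weights), 0.5)]     # weighted median
entropy = -np.sum(weights[weights > 0] * np.log(weights[weights > 0]))

print(f"mean={mean:.2f}, median={median:.2f}, std={std:.2f}, "
      f"skew={skew:.2f}, kurtosis={kurt:.2f}, entropy={entropy:.2f}")
```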
NASA Astrophysics Data System (ADS)
Cai, J.; Yin, H.; Varis, O.
2016-12-01
China faces a complicated puzzle in balancing the country's trade-offs among water and energy security, economic competitiveness, and environmental sustainability. It is therefore of prime importance to comprehend China's water and energy security under the effect of its economic structural changes. Analyses of this issue nevertheless remain few and far between, and a comprehensive picture has not been available that would help in understanding China's recent changes in economic structure, their spatial features, and their links to water and energy security and policy-making. Consequently, we addressed this information gap by performing an integrated and quantitative spatio-temporal analysis of the impacts of China's industrial transition on water use intensity (WUI) and energy-related carbon intensity (ERCI), two factors that serve as national targets for water and energy security. Our results quantitatively demonstrate, for the first time, the following significant and novel findings: 1) the primary industry (PI) dominated the WUI although its relative share decreased, and PI's WUI remained far higher than that of the secondary and tertiary industries (SI and TI); 2) SI dominated the total ERCI at both national and provincial scales; 3) the total WUI and ERCI had a significant positive correlation.
Pei, Fen; Li, Hongchun; Henderson, Mark J; Titus, Steven A; Jadhav, Ajit; Simeonov, Anton; Cobanoglu, Murat Can; Mousavi, Seyed H; Shun, Tongying; McDermott, Lee; Iyer, Prema; Fioravanti, Michael; Carlisle, Diane; Friedlander, Robert M; Bahar, Ivet; Taylor, D Lansing; Lezon, Timothy R; Stern, Andrew M; Schurdak, Mark E
2017-12-19
Quantitative Systems Pharmacology (QSP) is a drug discovery approach that integrates computational and experimental methods in an iterative way to gain a comprehensive, unbiased understanding of disease processes to inform effective therapeutic strategies. We report the implementation of QSP to Huntington's Disease, with the application of a chemogenomics platform to identify strategies to protect neuronal cells from mutant huntingtin induced death. Using the STHdh Q111 cell model, we investigated the protective effects of small molecule probes having diverse canonical modes-of-action to infer pathways of neuronal cell protection connected to drug mechanism. Several mechanistically diverse protective probes were identified, most of which showed less than 50% efficacy. Specific combinations of these probes were synergistic in enhancing efficacy. Computational analysis of these probes revealed a convergence of pathways indicating activation of PKA. Analysis of phospho-PKA levels showed lower cytoplasmic levels in STHdh Q111 cells compared to wild type STHdh Q7 cells, and these levels were increased by several of the protective compounds. Pharmacological inhibition of PKA activity reduced protection supporting the hypothesis that protection may be working, in part, through activation of the PKA network. The systems-level studies described here can be broadly applied to any discovery strategy involving small molecule modulation of disease phenotype.
Herrmann, Luise; Haase, Ilka; Blauhut, Maike; Barz, Nadine; Fischer, Markus
2014-12-17
Two cocoa types, Arriba and CCN-51, are cultivated in Ecuador. With regard to its unique aroma, Arriba is considered a fine cocoa type, while CCN-51 is a bulk cocoa because of its weaker aroma. Because Arriba is suspected of being mixed with CCN-51, there is interest in the analytical differentiation of the two types. Two methods to identify CCN-51 adulterations in Arriba cocoa were developed on the basis of differences in the chloroplast DNA. On the one hand, a different number of repeats of the sequence TAAAG in the inverted repeat region results in amplicons of different lengths for the two cocoa types, which can be detected by agarose gel electrophoresis, capillary gel electrophoresis, and denaturing high-performance liquid chromatography. On the other hand, single nucleotide polymorphisms (SNPs) between the CCN-51 and Arriba sequences represent restriction sites, which can be used for restriction fragment length polymorphism analysis. A semi-quantitative analysis based on these SNPs is feasible; an exact quantitation based on these results is, however, not achievable. These sequence variations were confirmed for a comprehensive cultivar collection of Arriba and CCN-51, for both bean and leaf samples.
Full-range public health leadership, part 1: quantitative analysis.
Carlton, Erik L; Holsinger, James W; Riddell, Martha; Bush, Heather
2015-01-01
Workforce and leadership development are central to the future of public health. However, public health has been slow to translate and apply leadership models from other professions and to incorporate local perspectives in understanding public health leadership. This study utilized the full-range leadership model in order to examine public health leadership. Specifically, it sought to measure leadership styles among local health department directors and to understand the context of leadership in local health departments. Leadership styles among local health department directors (n = 13) were examined using survey methodology. Quantitative analysis methods included descriptive statistics, boxplots, and Pearson bivariate correlations using SPSS v18.0. Self-reported leadership styles were highly correlated to leadership outcomes at the organizational level. However, they were not related to county health rankings. Results suggest the preeminence of leader behaviors and providing individual consideration to staff as compared to idealized attributes of leaders, intellectual stimulation, or inspirational motivation. Holistic leadership assessment instruments such as the multifactor leadership questionnaire can be useful in assessing public health leaders' approaches and outcomes. Comprehensive, 360-degree reviews may be especially helpful. Further research is needed to examine the effectiveness of public health leadership development models, as well as the extent that public health leadership impacts public health outcomes.
Wallace, Neal T; McConnell, K John
2013-10-01
This study assessed the impact of Oregon's 2007 parity law, which required behavioral health insurance parity, on rates of follow-up care provided within 30 days of psychiatric inpatient care. Data sources were claims (2005-2008) for 737 individuals with inpatient stays for a mental disorder who were continuously enrolled in insurance plans affected by the parity law (intervention group) or in commercial, self-insured plans that were not affected by the law (control group). A difference-in-difference analysis was used to compare rates of follow-up care before and after the parity law between discharges of individuals in the intervention group and the control group and between discharges of individuals in the intervention group who had or had not met preparity quantitative coverage limits during a coverage year. Estimates of the marginal effects of the parity law were adjusted for gender, discharge diagnosis, relationship to policy holder, and calendar quarter of discharge. The study included 353 discharges in the intervention group and 535 discharges in the control group. After the parity law, follow-up rates increased by 11% (p=.042) overall and by 20% for discharges of individuals who had met coverage limits (p=.028). The Oregon parity law was associated with a large increase in the rate of follow-up care, predominantly for discharges of individuals who had met preparity quantitative coverage limits. Given similarities between the law and the 2008 Mental Health Parity and Addiction Equity Act, the results may portend a national effect of more comprehensive parity laws.
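As a minimal illustration of the difference-in-difference logic, the sketch below contrasts the pre/post change in follow-up rates between an intervention and a control group; the rates are hypothetical, not the study's estimates, and the sketch omits the covariate adjustment used in the paper.

```python
# Minimal sketch of a 2x2 difference-in-differences estimate from hypothetical
# follow-up rates (intervention vs. control, before vs. after the parity law).
pre_intervention, post_intervention = 0.55, 0.66
pre_control, post_control = 0.58, 0.59

did = (post_intervention - pre_intervention) - (post_control - pre_control)
print(f"difference-in-differences estimate: {did:+.2f}")
```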
Frommenwiler, Débora Arruda; Kim, Jonghwan; Yook, Chang-Soo; Tran, Thi Thu Trang; Cañigueral, Salvador; Reich, Eike
2018-04-01
The quality of herbal drugs is usually controlled using several tests recommended in a monograph. HPTLC is the method of choice for identification in many pharmacopoeias. If combined with a suitable reference material for comparison, HPTLC can provide information beyond identification and thus may simplify quality control. This paper describes, as a proof of concept, how HPTLC can be applied to define specifications for an herbal reference material and to control the quality of an herbal drug according to these specifications. Based on multiple batches of cultivated Angelica gigas root, a specific HPTLC method for identification was optimized. This method can distinguish 27 related species. It also can detect the presence of mixtures of A. gigas with two other Angelica species traded as "Dang gui" and is suitable as well for quantitative assessment of samples in a test for minimum content of the sum of decursin and decursinol angelate. The new concept of "comprehensive HPTLC fingerprinting" is proposed: HPTLC fingerprints (images), which are used for identification, are converted into peak profiles and the intensities of selected zones are quantitatively compared to those of the corresponding zones of the reference material. Following a collaborative trial involving three laboratories in three countries, the method was applied to check the quality of further candidates for establishing an appropriate reference material. In conclusion, this case demonstrates that a single HPTLC analysis can provide information about identity, purity, and minimum content of markers of an herbal drug. Georg Thieme Verlag KG Stuttgart · New York.
Schmid, W; Rosland, J H; von Hofacker, S; Hunskår, I; Bruvik, F
2018-02-20
The use of music as therapy in multidisciplinary end-of-life care dates back to the 1970s, and nowadays music therapy (MT) is one of the most frequently used complementary therapies in in-patient palliative care in the US. However, existing research has investigated music therapy's potential impact mainly from one perspective, referring to either a quantitative or a qualitative paradigm. The aim of this review is to provide an overview of the users' and providers' perspectives on music therapy in palliative care within one research article. A systematic literature search was conducted using several databases, supplemented with a hand-search of journals, between November 1978 and December 2016. Inclusion criteria were: music therapy with adults in palliative care conducted by a certified music therapist. Both quantitative and qualitative studies in English, German or a Scandinavian language published in peer-reviewed journals were included. We aimed to identify and discuss the perspectives of both patients and health care providers on music therapy's impact in palliative care to forward a comprehensive understanding of its effectiveness, benefits and limitations. We investigated themes mentioned by patients within qualitative studies, as well as commonly chosen outcome measures in quantitative research. A qualitative approach utilizing inductive content analysis was carried out to analyze and categorize the data. Twelve articles, reporting on nine quantitative and three qualitative research studies, were included. Seven of the nine quantitative studies investigated pain as an outcome. All of the included quantitative studies reported positive effects of music therapy. Patients themselves associated MT with the expression of positive as well as challenging emotions and increased well-being. An overarching theme in both types of research is a psycho-physiological change through music therapy. Both quantitative and qualitative research showed positive changes in psycho-physiological well-being. The integration of the users' and providers' perspectives in future research, applicable for example in mixed-methods designs, is recommended.
Netzel, Pawel
2017-01-01
The United States is increasingly becoming a multi-racial society. To understand the multiple consequences of this overall trend for our neighborhoods, we need a methodology capable of spatio-temporal analysis of racial diversity at the local level but also across the entire U.S. Furthermore, such methodology should be accessible to stakeholders ranging from analysts to decision makers. In this paper we present a comprehensive framework for visualizing and analyzing diversity data that fulfills such requirements. The first component of our framework is a U.S.-wide, multi-year database of race sub-population grids which is freely available for download. These 30 m resolution grids have been developed using dasymetric modeling and are available for 1990, 2000, and 2010. We summarize numerous advantages of gridded population data over commonly used Census tract-aggregated data. Using these grids frees analysts from constructing their own and allows them to focus on diversity analysis. The second component of our framework is a set of U.S.-wide, multi-year diversity maps at 30 m resolution. A diversity map is our product that classifies the gridded population into 39 communities based on their degrees of diversity, dominant race, and population density. It provides spatial information on diversity in a single, easy-to-understand map that can be utilized by analysts and end users alike. Maps based on subsequent Censuses provide information about spatio-temporal dynamics of diversity. Diversity maps are accessible through the GeoWeb application SocScape (http://sil.uc.edu/webapps/socscape_usa/) for immediate online exploration. The third component of our framework is a proposal to quantitatively analyze diversity maps using a set of landscape metrics. Because of its form, a grid-based diversity map can be thought of as a diversity “landscape” and analyzed quantitatively using landscape metrics. We give a brief summary of the most pertinent metrics and demonstrate how they can be applied to diversity maps. PMID:28358862
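As a minimal illustration of a per-cell diversity score of the kind such maps are built from, the sketch below computes the Shannon entropy of sub-population shares in a single grid cell; the counts are hypothetical, and this is not the SocScape classification into 39 communities.

```python
# Minimal sketch of a per-cell diversity score: Shannon entropy over race
# sub-population shares in one grid cell. The counts below are hypothetical.
import numpy as np

def shannon_diversity(counts):
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()
    return -np.sum(p * np.log(p))

cell_counts = [120, 45, 30, 5]   # hypothetical sub-population counts in one 30 m cell
print(f"Shannon diversity: {shannon_diversity(cell_counts):.3f}")
# 0 corresponds to a single-group cell; log(k) is the maximum for k equally sized groups.
```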
Witt, Kristine L; Hsieh, Jui-Hua; Smith-Roe, Stephanie L; Xia, Menghang; Huang, Ruili; Zhao, Jinghua; Auerbach, Scott S; Hur, Junguk; Tice, Raymond R
2017-08-01
Genotoxicity potential is a critical component of any comprehensive toxicological profile. Compounds that induce DNA or chromosomal damage often activate p53, a transcription factor essential to cell cycle regulation. Thus, within the US Tox21 Program, we screened a library of ∼10,000 (∼8,300 unique) environmental compounds and drugs for activation of the p53-signaling pathway using a quantitative high-throughput screening assay employing HCT-116 cells (p53+/+) containing a stably integrated β-lactamase reporter gene under control of the p53 response element (p53RE). Cells were exposed (-S9) for 16 hr at 15 concentrations (generally 1.2 nM to 92 μM) three times, independently. Excluding compounds that failed analytical chemistry analysis or were suspected of inducing assay interference, 365 (4.7%) of 7,849 unique compounds were concluded to activate p53. As part of an in-depth characterization of our results, we first compared them with results from traditional in vitro genotoxicity assays (bacterial mutation, chromosomal aberration); ∼15% of known, direct-acting genotoxicants in our library activated the p53RE. Mining the Comparative Toxicogenomics Database revealed that these p53 actives were significantly associated with increased expression of p53 downstream genes involved in DNA damage responses. Furthermore, 53 chemical substructures associated with genotoxicity were enriched in certain classes of p53 actives, for example, anthracyclines (antineoplastics) and vinca alkaloids (tubulin disruptors). Interestingly, the tubulin disruptors manifested unusual nonmonotonic concentration response curves suggesting activity through a unique p53 regulatory mechanism. Through the analysis of our results, we aim to define a role for this assay as one component of a comprehensive toxicological characterization of large compound libraries. Environ. Mol. Mutagen. 58:494-507, 2017. © 2017 Wiley Periodicals, Inc.
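Concentration-response data of the kind generated by qHTS are commonly summarized by fitting a Hill equation; the sketch below does so with SciPy for invented activity data, as a generic illustration rather than the Tox21 curve-fitting pipeline.

```python
# Minimal sketch of Hill-equation fitting for a concentration-response curve.
# The concentrations and responses below are invented, not Tox21 data.
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, bottom, top, ac50, slope):
    return bottom + (top - bottom) / (1.0 + (ac50 / conc) ** slope)

conc = np.array([1.2e-3, 3.7e-3, 1.1e-2, 3.4e-2, 0.1, 0.31,
                 0.92, 2.8, 8.4, 25.0, 92.0])                 # uM
resp = np.array([1.0, 2.0, 1.5, 3.0, 5.0, 9.0,
                 20.0, 45.0, 70.0, 88.0, 95.0])               # % activation

popt, _ = curve_fit(hill, conc, resp,
                    p0=[0.0, 100.0, 1.0, 1.0],
                    bounds=([-10.0, 0.0, 1e-4, 0.1], [20.0, 200.0, 1e3, 10.0]))
print(f"fitted AC50: {popt[2]:.2f} uM, Hill slope: {popt[3]:.2f}")
```

Nonmonotonic responses such as those noted for the tubulin disruptors would not be captured by a single Hill curve and typically require flagging or alternative models.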
Moloi, Annesinah Hlengiwe; Watkins, David; Engel, Mark E; Mall, Sumaya; Zühlke, Liesl
2016-01-01
Introduction Rheumatic heart disease (RHD) is a chronic disease affecting the heart valves, secondary to group A streptococcal infection (GAS) and subsequent acute rheumatic fever (ARF). However, RHD cure and preventative measures are inextricably linked with socioeconomic development, as the disease mainly affects children and young adults living in poverty. In order to address RHD, public health officials and health policymakers require up-to-date knowledge on the epidemiology of GAS, ARF and RHD, as well as the existing enablers and gaps in delivery of evidence-based care for these conditions. We propose to conduct a systematic review to assess the literature comprehensively, synthesising all existing quantitative and qualitative data relating to RHD in Africa. Methods and analysis We plan to conduct a comprehensive literature search using a number of databases and reference lists of relevant articles published from January 1995 to December 2015. Two evaluators will independently review and extract data from each article. Additionally, we will assess overall study quality and risk of bias, using the Newcastle-Ottawa Scale and the Critical Appraisal Skills Programme criteria for quantitative and qualitative studies, respectively. We will meta-analyse estimates of prevalence, incidence, case fatality and mortality for each of the conditions separately for each country. Qualitative meta-analysis will be conducted for facilitators and barriers in RHD health access. Lastly, we will create a list of key stakeholders. This protocol is registered in the PROSPERO International Prospective Register of systematic reviews, registration number CRD42016032852. Ethics and dissemination The information provided by this review will inform and assist relevant stakeholders in identifying key areas of intervention, and designing and implementing evidence-based programmes and policies at the local and regional level. With slight modifications (ie, to the country terms in the search strategy), this protocol can be used as part of a needs assessment in any endemic country. PMID:27207627
NMR-based Metabolomics for Cancer Research
Metabolomics is considered a complementary tool to other omics platforms, providing a snapshot of the cellular biochemistry and physiology taking place at any instant. Metabolomics approaches have been widely used to provide comprehensive and quantitative analyses of the metabo...
DOT National Transportation Integrated Search
1996-10-01
This document presents quantitative national estimates of the magnitude of transportation's impacts on the natural environment. It is the most comprehensive compilation of environmental and transportation data to date. This document addresses all p...
Environmental Sustainability - Including Land and Water Use
Assessments of environmental sustainability can be conducted in many ways, with one of the most quantitative methods being Life Cycle Impact Assessment (LCIA). While historically LCIA has included a comprehensive list of impact categories including: ozone depletion, global c...
Petriwskyj, Andrea; Parker, Deborah; O'Dwyer, Siobhan; Moyle, Wendy; Nucifora, Nikki
2016-06-01
Recent studies have indicated that family caregivers of people with dementia have higher rates of depression, anxiety and hopelessness, as well as higher levels of burden, stress and distress. Not all caregivers, however, succumb to the negative effects of caring. Caregivers who are able to recover from, resist or adapt to the physical and psychological demands of caring can be considered "resilient". The objective of this review was to examine the existing evidence regarding interventions for building resilience in family caregivers of people living with dementia. This review considered studies that included family caregivers of people with dementia. Studies investigating interventions to build resilience in family caregivers were considered by the review. For qualitative studies, the phenomena of interest were family caregivers' experiences of the interventions including factors affecting implementation and their subjective experience of outcomes. Studies conducted in any cultural or geographical context and any settings including participants' homes in the community, residential aged care or hospital, medical or allied health practice were considered for inclusion. Quantitative studies incorporating experimental and descriptive study designs and qualitative studies, including, but not limited to, phenomenology, grounded theory, ethnography, action research and feminist research were considered for inclusion. Quantitative studies were included that contained either objective or subjective outcome measures (or a combination of both). In cases in which proxy measures of resilience were used, only those papers that explicitly related the aims of the intervention and the measurement of outcomes to resilience itself were considered for inclusion. Proxies could include, but were not limited to, self-efficacy, locus of control, perceived burden, psychological wellbeing, strength, coping, positive adjustment and resourcefulness. Qualitative studies were similarly considered for inclusion if they explicitly related the aims of the intervention to resilience. Eleven electronic databases were searched for research studies published in English in or after 1990. Quantitative and qualitative studies selected for retrieval were assessed by two independent reviewers for methodological validity using standardized critical appraisal instruments from the Joanna Briggs Institute Meta-Analysis of Statistics Assessment and Review Instrument (JBI-MAStARI) and Joanna Briggs Institute Qualitative Assessment and Review Instrument (JBI-QARI). Quantitative and qualitative data were extracted from publications included in the review using the standardized data extraction tools from JBI-MAStARI and JBI-QARI. It was not possible to pool quantitative findings for statistical meta-analysis using JBI-MAStARI. Qualitative research findings were too limited to be pooled using the JBI-QARI. The findings are presented in narrative form. The review included three publications reporting one quantitative intervention study and one mixed-method intervention study. There was a lack of available studies and, of the two intervention studies that were identified, neither found any statistically significant change in quantitative measures of resilience. Qualitative data suggested positive impacts of a poetry writing intervention and a positive experience of the intervention. The studies differed in both the nature of the intervention and the way resilience was conceptualized and operationalized. 
Consequently, it was not possible to offer any recommendations for practice. Implications for research relate to the development of a more comprehensive theory of resilience in family caregivers that can be used to develop and rigorously evaluate reliable and valid measures of resilience in line with that theory. Further, well-designed, sufficiently powered intervention studies informed by theory are needed.
NASA Astrophysics Data System (ADS)
Sarout, Joël.
2012-04-01
For the first time, a comprehensive and quantitative analysis of the domains of validity of popular wave propagation theories for porous/cracked media is provided. The case of a simple, yet versatile rock microstructure is detailed. The microstructural parameters controlling the applicability of the scattering theories, the effective medium theories, and the quasi-static (Gassmann limit) and dynamic (inertial) poroelasticity are analysed in terms of pore/crack characteristic size, geometry and connectivity. To this end, a new permeability model is devised combining the hydraulic radius and percolation concepts. The predictions of this model are compared to published micromechanical models of permeability for the limiting cases of capillary tubes and penny-shaped cracks. It is also compared to published experimental data on natural rocks in these limiting cases. It explicitly accounts for pore space topology around the percolation threshold and far above it. Thanks to this permeability model, the scattering, squirt-flow and Biot cut-off frequencies are quantitatively compared. This comparison leads to an explicit mapping of the domains of validity of these wave propagation theories as a function of the rock's actual microstructure. How this mapping impacts the interpretation of seismic, geophysical and ultrasonic wave velocity data is discussed. The methodology demonstrated here and the outcomes of this analysis are meant to constitute a quantitative guide for the selection of the most suitable modelling strategy for the prediction and/or interpretation of rock elastic properties in laboratory- or field-scale applications when information regarding the rock's microstructure is available.
Jiang, Xin; Xue, Yang; Zhou, Hongzhi; Li, Shouhong; Zhang, Zongmin; Hou, Rui; Ding, Yuxiang; Hu, Kaijin
2015-10-01
Reference genes are commonly used as a reliable approach to normalize the results of quantitative polymerase chain reaction (qPCR), and to reduce errors in the relative quantification of gene expression. Suitable reference genes belonging to numerous functional classes have been identified for various types of species and tissue. However, little is currently known regarding the most suitable reference genes for bone, specifically for the sheep mandibular condyle. Sheep are important for the study of human bone diseases, particularly for temporomandibular diseases. The present study aimed to identify a set of reference genes suitable for the normalization of qPCR data from the mandibular condyle of sheep. A total of 12 reference genes belonging to various functional classes were selected, and the expression stability of the reference genes was determined in both the normal and fractured area of the sheep mandibular condyle. RefFinder, which integrates the following currently available computational algorithms: geNorm, NormFinder, BestKeeper, and the comparative ΔCt method, was used to compare and rank the candidate reference genes. The results obtained from the four methods demonstrated a similar trend: RPL19, ACTB, and PGK1 were the most stably expressed reference genes in the sheep mandibular condyle. As determined by RefFinder comprehensive analysis, the results of the present study suggested that RPL19 is the most suitable reference gene for studies associated with the sheep mandibular condyle. In addition, ACTB and PGK1 may be considered suitable alternatives.
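As a minimal illustration of one of the four algorithms integrated by RefFinder, the sketch below ranks candidate genes by the comparative ΔCt stability measure (the mean standard deviation of pairwise ΔCt values, lower being more stable); the Ct values are hypothetical.

```python
# Minimal sketch of the comparative delta-Ct stability ranking used (alongside
# geNorm, NormFinder and BestKeeper) by RefFinder. Ct values below are hypothetical.
import numpy as np

genes = ["RPL19", "ACTB", "PGK1", "GAPDH"]
ct = np.array([[18.1, 17.9, 18.3, 18.0],      # rows: samples, columns: genes
               [19.0, 18.8, 19.4, 20.1],
               [18.5, 18.4, 18.9, 19.6]])

def delta_ct_stability(ct):
    """Mean SD of pairwise delta-Ct values for each gene (lower = more stable)."""
    n_genes = ct.shape[1]
    scores = []
    for i in range(n_genes):
        sds = [np.std(ct[:, i] - ct[:, j], ddof=1)
               for j in range(n_genes) if j != i]
        scores.append(float(np.mean(sds)))
    return scores

for gene, score in sorted(zip(genes, delta_ct_stability(ct)), key=lambda x: x[1]):
    print(f"{gene}: {score:.3f}")
```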
Measurements and Modeling of Nitric Oxide Formation in Counterflow, Premixed CH4/O2/N2 Flames
NASA Technical Reports Server (NTRS)
Thomsen, D. Douglas; Laurendeau, Normand M.
2000-01-01
Laser-induced fluorescence (LIF) measurements of NO concentration in a variety of CH4/O2/N2 flames are used to evaluate the chemical kinetics of NO formation. The analysis begins with previous measurements in flat, laminar, premixed CH4/O2/N2 flames stabilized on a water-cooled McKenna burner at pressures ranging from 1 to 14.6 atm, equivalence ratios from 0.5 to 1.6, and volumetric nitrogen/oxygen dilution ratios of 2.2, 3.1 and 3.76. These measured results are compared to predictions to determine the capabilities and limitations of the comprehensive kinetic mechanism developed by the Gas Research Institute (GRI), version 2.11. The model is shown to predict well the qualitative trends of NO formation in lean-premixed flames, while quantitatively underpredicting NO concentration by 30-50%. For rich flames, the model is unable to even qualitatively match the experimental results. These flames were found to be limited by low temperatures and an inability to separate the flame from the burner surface. In response to these limitations, a counterflow burner was designed for use in opposed premixed flame studies. A new LIF calibration technique was developed and applied to obtain quantitative measurements of NO concentration in laminar, counterflow premixed, CH4/O2/N2 flames at pressures ranging from 1 to 5.1 atm, equivalence ratios of 0.6 to 1.5, and an N2/O2 dilution ratio of 3.76. The counterflow premixed flame measurements are combined with measurements in burner-stabilized premixed flames and counterflow diffusion flames to build a comprehensive database for analysis of the GRI kinetic mechanism. Pathways, quantitative reaction path and sensitivity analyses are applied to the GRI mechanism for these flame conditions. The prompt NO mechanism is found to severely underpredict the amount of NO formed in rich premixed and nitrogen-diluted diffusion flames. This underprediction is traced to uncertainties in the CH kinetics as well as in the nitrogen oxidation chemistry. Suggestions are made which significantly improve the predictive capability of the GRI mechanism in near-stoichiometric, rich, premixed flames and in atmospheric-pressure, diffusion flames. However, the modified reaction mechanism is unable to model the formation of NO in ultra-rich, premixed or in high-pressure, nonpremixed flames, thus indicating the need for additional study under these conditions.
PyQuant: A Versatile Framework for Analysis of Quantitative Mass Spectrometry Data.
Mitchell, Christopher J; Kim, Min-Sik; Na, Chan Hyun; Pandey, Akhilesh
2016-08-01
Quantitative mass spectrometry data necessitates an analytical pipeline that captures the accuracy and comprehensiveness of the experiments. Currently, data analysis is often coupled to specific software packages, which restricts the analysis to a given workflow and precludes a more thorough characterization of the data by other complementary tools. To address this, we have developed PyQuant, a cross-platform mass spectrometry data quantification application that is compatible with existing frameworks and can be used as a stand-alone quantification tool. PyQuant supports most types of quantitative mass spectrometry data including SILAC, NeuCode, (15)N, (13)C, or (18)O and chemical methods such as iTRAQ or TMT and provides the option of adding custom labeling strategies. In addition, PyQuant can perform specialized analyses such as quantifying isotopically labeled samples where the label has been metabolized into other amino acids and targeted quantification of selected ions independent of spectral assignment. PyQuant is capable of quantifying search results from popular proteomic frameworks such as MaxQuant, Proteome Discoverer, and the Trans-Proteomic Pipeline in addition to several standalone search engines. We have found that PyQuant routinely quantifies a greater proportion of spectral assignments, with increases ranging from 25-45% in this study. Finally, PyQuant is capable of complementing spectral assignments between replicates to quantify ions missed because of lack of MS/MS fragmentation or that were omitted because of issues such as spectra quality or false discovery rates. This results in an increase of biologically useful data available for interpretation. In summary, PyQuant is a flexible mass spectrometry data quantification platform that is capable of interfacing with a variety of existing formats and is highly customizable, which permits easy configuration for custom analysis. © 2016 by The American Society for Biochemistry and Molecular Biology, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Qibin; Monroe, Matthew E.; Schepmoes, Athena A.
Non-enzymatic glycation of proteins is implicated in diabetes mellitus and its related complications. In this report, we extend our previous development and refinement of proteomics-based methods for the analysis of non-enzymatically glycated proteins to comprehensively identify glycated proteins in normal and diabetic human plasma and erythrocytes. Using immunodepletion, enrichment, and fractionation strategies, we identified 7749 unique glycated peptides, corresponding to 3742 unique glycated proteins. Semi-quantitative comparisons revealed a number of proteins with glycation levels significantly increased in diabetes relative to control samples and that erythrocyte proteins are more extensively glycated than plasma proteins. A glycation motif analysis revealed amino acids that are favored more than others in the protein primary structures in the vicinity of the glycation sites in both sample types. The glycated peptides and corresponding proteins reported here provide a foundation for the potential identification of novel markers for diabetes, glycemia, or diabetic complications.
Life Cycle Energy Analysis of Reclaimed Water Reuse Projects in Beijing.
Fan, Yupeng; Guo, Erhui; Zhai, Yuanzheng; Chang, Andrew C; Qiao, Qi; Kang, Peng
2018-01-01
To illustrate the benefits of water reuse projects, process-based life cycle analysis (LCA) can be combined with input-output LCA for project evaluation. Energy is the only evaluation parameter used in this study. Life cycle assessment of all energy inputs (LCEA) is carried out mainly through the life cycle inventory (LCI), taking into account the full life cycle of the project, including the construction, operation, and demolition phases. Assessment of the benefits of water reuse over the life cycle focuses on reductions in wastewater discharge and water savings. A comprehensive LCEA of a Beijing water reuse project built in 2014 shows that the benefits obtained from reclaimed water reuse far exceed the life cycle energy consumption. In this paper, the authors apply the LCEA model to quantitatively estimate the benefits of reclaimed water reuse projects.
Comprehensive Design Reliability Activities for Aerospace Propulsion Systems
NASA Technical Reports Server (NTRS)
Christenson, R. L.; Whitley, M. R.; Knight, K. C.
2000-01-01
This technical publication describes the methodology, model, software tool, input data, and analysis results that support aerospace design reliability studies. The focus of these activities is on propulsion system mechanical design reliability. The goal of these activities is to support design from a reliability perspective. Paralleling performance analyses in schedule and method, this requires the proper use of metrics in a validated reliability model useful for design, sensitivity, and trade studies. Design reliability analysis in this view is one of several critical design functions. A design reliability method is detailed and two example analyses are provided: one qualitative and the other quantitative. The use of aerospace and commercial data sources for quantification is discussed and sources are listed. A tool that was developed to support both types of analyses is presented. Finally, special topics discussed include the development of design criteria, issues of reliability quantification, quality control, and reliability verification.
Kleineke, V; Stamer, M; Zeisberger, M; Brandes, I; Meyer, T
2015-08-01
The aim was to determine whether successful and less successful rehabilitation facilities differ in the extent and quality of their interdisciplinary cooperation. This analysis is part of the MeeR project, which aims to identify characteristics of rehabilitation facilities related to successful rehabilitation. Six facilities were recruited based on a quantitative analysis: three that ranked above average and three that ranked below average in terms of their success in rehabilitating patients. Comprehensive qualitative data were collected at these six facilities. In the above-average facilities, the extent of interdisciplinary cooperation was greater than in the below-average facilities, the position of the medical profession was less dominant, and access to team meetings was broader. Promoting interdisciplinary cooperation is an important component of improving the success of rehabilitation facilities. © Georg Thieme Verlag KG Stuttgart · New York.
Application of the Probabilistic Dynamic Synthesis Method to the Analysis of a Realistic Structure
NASA Technical Reports Server (NTRS)
Brown, Andrew M.; Ferri, Aldo A.
1998-01-01
The Probabilistic Dynamic Synthesis method is a new technique for obtaining the statistics of a desired response engineering quantity for a structure with non-deterministic parameters. The method uses measured data from modal testing of the structure as the input random variables, rather than more "primitive" quantities like geometry or material variation. This modal information is much more comprehensive and easily measured than the "primitive" information. The probabilistic analysis is carried out using either response surface reliability methods or Monte Carlo simulation. A previous work verified the feasibility of the PDS method on a simple seven degree-of-freedom spring-mass system. In this paper, extensive issues involved with applying the method to a realistic three-substructure system are examined, and free and forced response analyses are performed. The results from using the method are promising, especially when the lack of alternatives for obtaining quantitative output for probabilistic structures is considered.
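As a minimal illustration of the Monte Carlo side of such an approach, the sketch below treats measured modal parameters as random variables and propagates them to a single-mode dynamic amplification factor; the distributions and response formula are illustrative assumptions, not the PDS method itself.

```python
# Minimal sketch: propagate uncertainty in modal parameters (treated as random
# variables) to a response statistic via Monte Carlo. The distributions and the
# single-mode amplification formula are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
n = 50_000

freq = rng.normal(120.0, 4.0, n)        # modal frequency [Hz], from test statistics
damping = np.clip(rng.normal(0.02, 0.004, n), 1e-4, None)  # modal damping ratio

drive = 110.0                           # hypothetical excitation frequency [Hz]
r = drive / freq
amplification = 1.0 / np.sqrt((1.0 - r ** 2) ** 2 + (2.0 * damping * r) ** 2)

print(f"mean amplification: {amplification.mean():.2f}, "
      f"99th percentile: {np.percentile(amplification, 99):.2f}")
```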
Zhang, Ying; Wang, Xi; Cui, Dan; Zhu, Jun
2016-12-01
Human whole saliva is a vital body fluid for studying the physiology and pathology of the oral cavity. As a powerful technique for biomarker discovery, MS-based proteomic strategies have been introduced for saliva analysis and identified hundreds of proteins and N-glycosylation sites. However, there is still a lack of quantitative analysis, which is necessary for biomarker screening and biological research. In this study, we establish an integrated workflow by the combination of stable isotope dimethyl labeling, HILIC enrichment, and high resolution MS for both quantification of the global proteome and N-glycoproteome of human saliva from oral ulcer patients. With the help of advanced bioinformatics, we comprehensively studied oral ulcers at both protein and glycoprotein scales. Bioinformatics analyses revealed that starch digestion and protein degradation activities are inhibited while the immune response is promoted in oral ulcer saliva. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Tunable Single-Cell Extraction for Molecular Analyses.
Guillaume-Gentil, Orane; Grindberg, Rashel V; Kooger, Romain; Dorwling-Carter, Livie; Martinez, Vincent; Ossola, Dario; Pilhofer, Martin; Zambelli, Tomaso; Vorholt, Julia A
2016-07-14
Because of cellular heterogeneity, the analysis of endogenous molecules from single cells is of significant interest and has major implications. While micromanipulation or cell sorting followed by cell lysis is already used for subsequent molecular examination, approaches that directly extract the contents of living cells remain a challenging but promising alternative for achieving non-destructive sampling and preservation of cellular context. Here, we demonstrate quantitative extraction from single cells with spatiotemporal control using fluidic force microscopy. We further present a comprehensive analysis of the soluble molecules withdrawn from the cytoplasm or the nucleus, including the detection of enzyme activities and transcript abundances. This approach has revealed the ability of cells to withstand the extraction of up to several picoliters and opens opportunities to study cellular dynamics and cell-cell communication under physiological conditions at the single-cell level. Copyright © 2016 Elsevier Inc. All rights reserved.
Snapshot Hyperspectral Volumetric Microscopy
NASA Astrophysics Data System (ADS)
Wu, Jiamin; Xiong, Bo; Lin, Xing; He, Jijun; Suo, Jinli; Dai, Qionghai
2016-04-01
The comprehensive analysis of biological specimens creates a demand for capturing the spatial, temporal, and spectral dimensions of visual information together. However, such high-dimensional video acquisition faces major challenges in achieving large data throughput and effective multiplexing. Here, we report snapshot hyperspectral volumetric microscopy, which computationally reconstructs hyperspectral profiles for high-resolution volumes of ~1000 μm × 1000 μm × 500 μm at video rate using a novel four-dimensional (4D) deconvolution algorithm. We validated the proposed approach with both numerical simulations for quantitative evaluation and experimental results on a prototype system. Applications such as biological component analysis in bright field and spectral unmixing of multiple fluorescence signals are demonstrated. Experiments on moving fluorescent beads and GFP-labelled Drosophila larvae indicate the great potential of our method for observing multiple fluorescent markers in dynamic specimens.
Automated Analysis of Fluorescence Microscopy Images to Identify Protein-Protein Interactions
Venkatraman, S.; Doktycz, M. J.; Qi, H.; ...
2006-01-01
The identification of protein interactions is important for elucidating biological networks. One obstacle in comprehensive interaction studies is the analysis of large datasets, particularly those containing images, so an automated system for analyzing image-based protein interaction datasets is needed. Such a system is described here; it automatically extracts features from fluorescence microscopy images obtained from a bacterial protein interaction assay. These features provide quantitative values that aid in the automated scoring of positive interactions. Experimental observations indicate that identifying at least 50% positive cells in an image is sufficient to detect a protein interaction. Based on this criterion, the automated system achieved 100% accuracy in detecting positive interactions for a dataset of 16 images. The algorithms were implemented in MATLAB, and the software is available on request from the authors.
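The ≥50% positive-cell rule is simple enough to restate as code. The sketch below is an illustrative Python rendering of that scoring criterion only; the published tool was implemented in MATLAB, its segmentation and feature-extraction stages are not reproduced here, and the per-cell calls shown are hypothetical.

```python
# Illustrative rendering of the image-level scoring rule described above:
# an image is called a positive protein interaction when at least 50 % of
# its segmented cells are classified as positive. The upstream cell
# segmentation and feature extraction are stood in for by precomputed
# boolean per-cell calls.

def score_image(cell_calls, threshold=0.50):
    """Return True if the fraction of positive cells meets the threshold."""
    if not cell_calls:
        return False
    positive_fraction = sum(cell_calls) / len(cell_calls)
    return positive_fraction >= threshold

# Hypothetical per-cell classifications for two images
image_a = [True, True, False, True, True, False]   # 4/6 positive -> interaction
image_b = [False, True, False, False]              # 1/4 positive -> no interaction

print(score_image(image_a))  # True
print(score_image(image_b))  # False
```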