Sample records for global quantitation methods

  1. Global combined precursor isotopic labeling and isobaric tagging (cPILOT) approach with selective MS(3) acquisition.

    PubMed

    Evans, Adam R; Robinson, Renã A S

    2013-11-01

    Recently, we reported a novel proteomics quantitation scheme termed "combined precursor isotopic labeling and isobaric tagging (cPILOT)" that allows for the identification and quantitation of nitrated peptides in as many as 12-16 samples in a single experiment. cPILOT offers enhanced multiplexing and posttranslational modification specificity; however, it excludes global quantitation of all peptides present in a mixture and, like other isobaric tagging methods, underestimates reporter ion ratios due to precursor co-isolation. Here, we present a novel chemical workflow for cPILOT that can be used for global tagging of all peptides in a mixture. Specifically, through low-pH precursor dimethylation of tryptic or LysC peptides followed by high-pH tandem mass tagging, the same reporter ion can be used twice in a single experiment. Also, to improve triple-stage mass spectrometry (MS(3)) data acquisition, a selective MS(3) method that focuses on product selection of the y1 fragment of lysine-terminated peptides is incorporated into the workflow. This novel cPILOT workflow has potential for global peptide quantitation that could lead to enhanced sample multiplexing and increase the number of quantifiable spectra obtained from MS(3) acquisition methods. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. A study of the temporal robustness of the growing global container-shipping network

    PubMed Central

    Wang, Nuo; Wu, Nuan; Dong, Ling-ling; Yan, Hua-kun; Wu, Di

    2016-01-01

    For any constantly expanding network, it must be determined whether the network remains robust as it grows. However, few studies have focused on this important network feature or on developing quantitative methods to analyze it. Given the formation and growth of the global container-shipping network, we proposed the concept of network temporal robustness and a quantitative method for measuring it. As an example, we collected container liner companies’ data at two time points (2004 and 2014) and built a shipping network with ports as nodes and routes as links. We thus obtained a quantitative value of the temporal robustness. The temporal robustness is a significant network property because, for the first time, we can clearly recognize that the shipping network has become more vulnerable to damage over the last decade: when the node failure scale reached 50% of the entire network, the temporal robustness was approximately −0.51% for random errors and −12.63% for intentional attacks. The proposed concept and analytical method described in this paper are significant for other network studies. PMID:27713549
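
    The node-failure analysis described in this record can be sketched in a few lines of Python (an illustrative toy, not the authors' temporal-robustness metric): track the size of the largest connected component of a small hypothetical port network as nodes are removed one by one.

```python
from collections import deque

def largest_component(adj, removed):
    """Size of the largest connected component, ignoring removed nodes."""
    seen = set(removed)
    best = 0
    for start in adj:
        if start in seen:
            continue
        queue, size = deque([start]), 0
        seen.add(start)
        while queue:
            node = queue.popleft()
            size += 1
            for nbr in adj[node]:
                if nbr not in seen:
                    seen.add(nbr)
                    queue.append(nbr)
        best = max(best, size)
    return best

def attack_curve(adj, order):
    """Largest-component sizes after removing nodes one by one in `order`."""
    removed, sizes = set(), []
    for node in order:
        removed.add(node)
        sizes.append(largest_component(adj, removed))
    return sizes

# Hypothetical hub-and-spoke port network: 'H' is a hub linked to all others.
adj = {'H': {'A', 'B', 'C', 'D'},
       'A': {'H', 'B'}, 'B': {'H', 'A'},
       'C': {'H', 'D'}, 'D': {'H', 'C'}}

# Intentional attack: remove the highest-degree port first.
targeted = sorted(adj, key=lambda n: len(adj[n]), reverse=True)
print(attack_curve(adj, targeted))
```

    Removing the hub immediately fragments the toy network, mirroring the vulnerability to intentional attacks reported above; random failures would be simulated by shuffling the removal order instead.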

  3. Quantitative proteomics in biological research.

    PubMed

    Wilm, Matthias

    2009-10-01

    Proteomics has enabled the direct investigation of biological material, at first through the analysis of individual proteins, then of lysates from cell cultures, and finally of extracts from tissues and biopsies from entire organisms. Its latest manifestation - quantitative proteomics - allows deeper insight into biological systems. This article reviews the different methods used to extract quantitative information from mass spectra. It follows the technical developments aimed toward global proteomics, the attempt to characterize every expressed protein in a cell by at least one peptide. When applications of the technology are discussed, the focus is placed on yeast biology. In particular, differential quantitative proteomics, the comparison between an experiment and its control, is very discriminating for proteins involved in the process being studied. When trying to understand biological processes on a molecular level, differential quantitative proteomics tends to give a clearer picture than global transcription analyses. As a result, MS has become an even more indispensable tool for biochemically motivated biological research.

  4. Assessing deep and shallow learning methods for quantitative prediction of acute chemical toxicity.

    PubMed

    Liu, Ruifeng; Madore, Michael; Glover, Kyle P; Feasel, Michael G; Wallqvist, Anders

    2018-05-02

    Animal-based methods for assessing chemical toxicity are struggling to meet testing demands. In silico approaches, including machine-learning methods, are promising alternatives. Recently, deep neural networks (DNNs) were evaluated and reported to outperform other machine-learning methods for quantitative structure-activity relationship modeling of molecular properties. However, most of the reported performance evaluations relied on global performance metrics, such as the root mean squared error (RMSE) between the predicted and experimental values of all samples, without considering the impact of sample distribution across the activity spectrum. Here, we carried out an in-depth analysis of DNN performance for quantitative prediction of acute chemical toxicity using several datasets. We found that the overall performance of DNN models on datasets of up to 30,000 compounds was similar to that of random forest (RF) models, as measured by the RMSE and correlation coefficients between the predicted and experimental results. However, our detailed analyses demonstrated that global performance metrics are inappropriate for datasets with a highly uneven sample distribution, because they are strongly biased toward the most populous regions of the toxicity spectrum. For highly toxic compounds, DNN and RF models trained on all samples performed much worse than the global performance metrics indicated. Surprisingly, our variable nearest neighbor method, which utilizes only structurally similar compounds to make predictions, performed reasonably well, suggesting that information from close near neighbors in the training sets is a key determinant of acute toxicity predictions.
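
    The distinction this record draws between global and stratified error metrics can be illustrated with a minimal numpy sketch (the numbers are synthetic, not the study's data): a global RMSE can look acceptable even when predictions for a sparsely populated region of the activity spectrum are poor.

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean squared error between two arrays."""
    return float(np.sqrt(np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2)))

# 95 "typical" compounds predicted well, 5 "highly toxic" ones predicted badly.
y_true = np.concatenate([np.full(95, 2.0), np.full(5, 6.0)])
y_pred = np.concatenate([np.full(95, 2.1), np.full(5, 3.0)])

global_rmse = rmse(y_true, y_pred)   # dominated by the 95 easy compounds
toxic_rmse = rmse(y_true[95:], y_pred[95:])  # the sparse, hard region
print(global_rmse, toxic_rmse)
```

    The global RMSE stays below 0.7 while the toxic-subset RMSE is 3.0, which is the bias the authors describe: stratifying the metric by activity region exposes failures the global number hides.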

  5. Comparative and Quantitative Global Proteomics Approaches: An Overview

    PubMed Central

    Deracinois, Barbara; Flahaut, Christophe; Duban-Deweer, Sophie; Karamanos, Yannis

    2013-01-01

    Proteomics has become a key tool for the study of biological systems. The comparison between two different physiological states allows unravelling of the cellular and molecular mechanisms involved in a biological process. Proteomics can confirm the presence of proteins suggested by their mRNA content and provides a direct measure of the quantity present in a cell. Global and targeted proteomics strategies can be applied. Targeted proteomics strategies limit the number of features that will be monitored and then optimise the methods to obtain the highest sensitivity and throughput for a large number of samples. The advantage of global proteomics strategies is that no hypothesis is required, other than a measurable difference in one or more protein species between the samples. Global proteomics methods attempt to separate, quantify, and identify all the proteins from a given sample. This review highlights only the different techniques for separation and quantification of proteins and peptides, in view of a comparative and quantitative global proteomics analysis. The in-gel and off-gel quantification of proteins is discussed, as well as the corresponding mass spectrometry technology. The overview focuses on the most widespread techniques, while keeping in mind that each approach is modular and often overlaps with the others. PMID:28250403

  6. Does Delivery Format Make a Difference in Learning about Global and Cultural Understanding?

    ERIC Educational Resources Information Center

    Rawls, Janita; Hammons, Stacy A.

    2016-01-01

    This study assessed a learning outcome for nontraditional seniors who were in accelerated degree programs in both online and on-site formats. Using items from the National Survey of Student Engagement, researchers explored engagement with global understanding and cultural awareness. A quantitative, single-case analysis method was used to determine…

  7. Quantitative comparison of in situ soil CO2 flux measurement methods

    Treesearch

    Jennifer D. Knoepp; James M. Vose

    2002-01-01

    Development of reliable regional or global carbon budgets requires accurate measurement of soil CO2 flux. We conducted laboratory and field studies to determine the accuracy and comparability of methods commonly used to measure in situ soil CO2 fluxes. Methods compared included CO2...

  8. Integrated and global pseudotargeted metabolomics strategy applied to screening for quality control markers of Citrus TCMs.

    PubMed

    Shu, Yisong; Liu, Zhenli; Zhao, Siyu; Song, Zhiqian; He, Dan; Wang, Menglei; Zeng, Honglian; Lu, Cheng; Lu, Aiping; Liu, Yuanyan

    2017-08-01

    Traditional Chinese medicine (TCM) exerts its therapeutic effect in a holistic fashion through the synergistic function of multiple characteristic constituents. The holism philosophy of TCM is coincident with the global and systematic theories of metabolomics. The proposed pseudotargeted metabolomics methodologies were employed to establish reliable quality control markers for use in the screening strategy of TCMs. Pseudotargeted metabolomics integrates the advantages of both targeted and untargeted methods. In the present study, targeted metabolomics using the gold-standard RRLC-QqQ-MS method was employed for an in vivo quantitative plasma pharmacochemistry study of characteristic prototypic constituents. Meanwhile, untargeted metabolomics using UHPLC-QE Orbitrap HRMS, with better specificity and selectivity, was employed for identification of untargeted metabolites in the complex plasma matrix. In all, 32 prototypic metabolites were quantitatively determined, and 66 biotransformed metabolites were convincingly identified after oral administration of standard extracts of four labeled Citrus TCMs. The global absorption and metabolism process of complex TCMs was thus depicted in a systematic manner.

  9. Markov State Models of gene regulatory networks.

    PubMed

    Chu, Brian K; Tse, Margaret J; Sato, Royce R; Read, Elizabeth L

    2017-02-06

    Gene regulatory networks with dynamics characterized by multiple stable states underlie cell fate decisions. Quantitative models that can link molecular-level knowledge of gene regulation to a global understanding of network dynamics have the potential to guide cell-reprogramming strategies. Networks are often modeled by the stochastic Chemical Master Equation, but methods for systematic identification of key properties of the global dynamics are currently lacking. Here, Markov State Models are constructed to coarse-grain the global dynamics of gene regulatory networks. The method identifies the number, phenotypes, and lifetimes of long-lived states for a set of common gene regulatory network models. Application of transition path theory to the constructed Markov State Model decomposes global dynamics into a set of dominant transition paths and associated relative probabilities for stochastic state-switching. In this proof-of-concept study, we found that the Markov State Model provides a general framework for analyzing and visualizing stochastic multistability and state-transitions in gene networks. Our results suggest that this framework, adopted from the field of atomistic Molecular Dynamics, can be a useful tool for quantitative Systems Biology at the network scale.
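
    A minimal sketch of the Markov State Model machinery referenced in this record (a generic two-state toy of my own, not the paper's gene-network construction): once stochastic dynamics are coarse-grained into a transition matrix, the stationary occupancies and the lifetimes of the metastable states can be read off directly.

```python
import numpy as np

# Toy 2-state transition matrix for a bistable switch: each state is "sticky"
# (high self-transition probability), i.e. long-lived/metastable.
T = np.array([[0.99, 0.01],
              [0.02, 0.98]])

# Stationary distribution: the left eigenvector of T with eigenvalue 1.
evals, evecs = np.linalg.eig(T.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi = pi / pi.sum()

# Mean lifetime of each metastable state (in steps): 1 / (exit probability).
lifetimes = 1.0 / (1.0 - np.diag(T))
print(pi, lifetimes)
```

    In this toy, state 0 is occupied two-thirds of the time and persists for ~100 steps on average; the same eigen-analysis scales to the many-state matrices built from simulated gene-network trajectories.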

  10. Use of local noise power spectrum and wavelet analysis in quantitative image quality assurance for EPIDs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Soyoung

    Purpose: To investigate the use of the local noise power spectrum (NPS) to characterize image noise and wavelet analysis to isolate defective pixels and inter-subpanel flat-fielding artifacts for quantitative quality assurance (QA) of electronic portal imaging devices (EPIDs). Methods: A total of 93 image sets, including custom-made bar-pattern images and open exposure images, were collected from four iViewGT a-Si EPID systems over three years. Global quantitative metrics such as the modulation transfer function (MTF), NPS, and detective quantum efficiency (DQE) were computed for each image set. Local NPS was also calculated for individual subpanels by sampling regions of interest within each subpanel of the EPID. The 1D NPS, obtained by radially averaging the 2D NPS, was fitted to a power-law function. The r-square value of the linear regression analysis was used as a singular metric to characterize the noise properties of individual subpanels of the EPID. The sensitivity of the local NPS was first compared with the global quantitative metrics using historical image sets. It was then compared with two commonly used commercial QA systems with images collected after applying two different EPID calibration methods (single-level gain and multilevel gain). To detect isolated defective pixels and inter-subpanel flat-fielding artifacts, the Haar wavelet transform was applied to the images. Results: Global quantitative metrics including MTF, NPS, and DQE showed little change over the period of data collection. On the contrary, a strong correlation between the local NPS (r-square values) and the variation of the EPID noise condition was observed. The local NPS analysis indicated image quality improvement, with the r-square values increasing from 0.80 ± 0.03 (before calibration) to 0.85 ± 0.03 (after single-level gain calibration) and to 0.96 ± 0.03 (after multilevel gain calibration), while the commercial QA systems failed to distinguish the image quality improvement between the two calibration methods. With wavelet analysis, defective pixels and inter-subpanel flat-fielding artifacts were clearly identified as spikes after thresholding the inversely transformed images. Conclusions: The proposed local NPS (r-square values) showed superior sensitivity to the noise-level variations of individual subpanels compared with global quantitative metrics such as MTF, NPS, and DQE. Wavelet analysis was effective in detecting isolated defective pixels and inter-subpanel flat-fielding artifacts. The proposed methods are promising for the early detection of imaging artifacts of EPIDs.
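
    The local-NPS metric described in this record can be sketched as follows, under stated assumptions (the ROI handling, detrending, and normalization here are simplified stand-ins for the authors' implementation): radially average a 2D noise power spectrum into a 1D curve, fit a power law in log-log space, and report the r-square of that linear fit.

```python
import numpy as np

def radial_nps_rsquare(roi):
    """r-square of a log-log power-law fit to the radially averaged NPS."""
    roi = roi - roi.mean()                        # remove the DC offset
    nps2d = np.abs(np.fft.fftshift(np.fft.fft2(roi))) ** 2 / roi.size
    n = roi.shape[0]
    cy, cx = n // 2, n // 2
    yy, xx = np.indices(roi.shape)
    r = np.hypot(yy - cy, xx - cx).astype(int)    # integer radius bins
    counts = np.bincount(r.ravel())
    sums = np.bincount(r.ravel(), weights=nps2d.ravel())
    radial = sums / np.maximum(counts, 1)         # radially averaged 1D NPS
    freq = np.arange(1, n // 2)                   # skip DC, stay below Nyquist
    x, y = np.log(freq), np.log(radial[1:n // 2])
    slope, intercept = np.polyfit(x, y, 1)        # power law = line in log-log
    resid = y - (slope * x + intercept)
    return 1.0 - resid.var() / y.var()            # r-square of the linear fit

rng = np.random.default_rng(0)
roi = rng.normal(size=(64, 64))                   # white-noise ROI stand-in
print(radial_nps_rsquare(roi))
```

    Applied per subpanel ROI, a rising r-square after recalibration would indicate the cleaner, more power-law-like noise behavior the record reports.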

  11. A collection of flow visualization techniques used in the Aerodynamic Research Branch

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Theoretical and experimental research on unsteady aerodynamic flows is discussed. Complex flow fields that involve separations, vortex interactions, and transonic flow effects were investigated. Flow visualization techniques are used to obtain a global picture of the flow phenomena before detailed quantitative studies are undertaken. A wide variety of methods are used to visualize fluid flow, and a sampling of these methods is presented. It is emphasized that the visualization techniques serve as a guide to thorough quantitative analysis and the subsequent physical understanding of these flow fields.

  12. Construction Theory and Noise Analysis Method of Global CGCS2000 Coordinate Frame

    NASA Astrophysics Data System (ADS)

    Jiang, Z.; Wang, F.; Bai, J.; Li, Z.

    2018-04-01

    The definition, renewal, and maintenance of geodetic datums have long been issues of international interest. In recent years, many countries have been studying and implementing the modernization and renewal of their local geodetic reference coordinate frames. Based on precise results from roughly 15 years of continuous observation by the state CORS (continuously operating reference station) network, together with the mainland GNSS (Global Navigation Satellite System) network observed between 1999 and 2007, this paper studies the construction of a mathematical model of the Global CGCS2000 frame, mainly analyzing the theory and algorithm of the two-step method for Global CGCS2000 Coordinate Frame formulation. Finally, the noise characteristics of the coordinate time series are estimated quantitatively using maximum likelihood estimation.

  13. Evaluation of a 3D local multiresolution algorithm for the correction of partial volume effects in positron emission tomography

    PubMed Central

    Le Pogam, Adrien; Hatt, Mathieu; Descourt, Patrice; Boussion, Nicolas; Tsoumpas, Charalampos; Turkheimer, Federico E.; Prunier-Aesch, Caroline; Baulieu, Jean-Louis; Guilloteau, Denis; Visvikis, Dimitris

    2011-01-01

    Purpose Partial volume effects (PVE) are consequences of the limited spatial resolution in emission tomography leading to under-estimation of uptake in tissues of size similar to the point spread function (PSF) of the scanner as well as activity spillover between adjacent structures. Among PVE correction methodologies, a voxel-wise mutual multi-resolution analysis (MMA) was recently introduced. MMA is based on the extraction and transformation of high resolution details from an anatomical image (MR/CT) and their subsequent incorporation into a low resolution PET image using wavelet decompositions. Although this method allows creating PVE corrected images, it is based on a 2D global correlation model which may introduce artefacts in regions where no significant correlation exists between anatomical and functional details. Methods A new model was designed to overcome these two issues (2D only and global correlation) using a 3D wavelet decomposition process combined with a local analysis. The algorithm was evaluated on synthetic, simulated and patient images, and its performance was compared to the original approach as well as the geometric transfer matrix (GTM) method. Results Quantitative performance was similar to the 2D global model and GTM in correlated cases. In cases where mismatches between anatomical and functional information were present the new model outperformed the 2D global approach, avoiding artefacts and significantly improving quality of the corrected images and their quantitative accuracy. Conclusions A new 3D local model was proposed for a voxel-wise PVE correction based on the original mutual multi-resolution analysis approach. Its evaluation demonstrated an improved and more robust qualitative and quantitative accuracy compared to the original MMA methodology, particularly in the absence of full correlation between anatomical and functional information. PMID:21978037

  14. Application of adenosine triphosphate affinity probe and scheduled multiple-reaction monitoring analysis for profiling global kinome in human cells in response to arsenite treatment.

    PubMed

    Guo, Lei; Xiao, Yongsheng; Wang, Yinsheng

    2014-11-04

    Phosphorylation of cellular components catalyzed by kinases plays important roles in cell signaling and proliferation. Quantitative assessment of perturbation in global kinome may provide crucial knowledge for elucidating the mechanisms underlying the cytotoxic effects of environmental toxicants. Here, we utilized an adenosine triphosphate (ATP) affinity probe coupled with stable isotope labeling by amino acids in cell culture (SILAC) to assess quantitatively the arsenite-induced alteration of global kinome in human cells. We constructed a SILAC-compatible kinome library for scheduled multiple-reaction monitoring (MRM) analysis and adopted on-the-fly recalibration of retention time shift, which provided better throughput of the analytical method and enabled the simultaneous quantification of the expression of ∼300 kinases in two LC-MRM runs. With this improved analytical method, we conducted an in-depth quantitative analysis of the perturbation of kinome of GM00637 human skin fibroblast cells induced by arsenite exposure. Several kinases involved in cell cycle progression, including cyclin-dependent kinases (CDK1 and CDK4) and Aurora kinases A, B, and C, were found to be hyperactivated, and the altered expression of CDK1 was further validated by Western analysis. In addition, treatment with a CDK inhibitor, flavopiridol, partially restored the arsenite-induced growth inhibition of human skin fibroblast cells. Thus, sodium arsenite may confer its cytotoxic effect partly through the aberrant activation of CDKs and the resultant perturbation of cell cycle progression. Together, we developed a high-throughput, SILAC-compatible, and MRM-based kinome profiling method and demonstrated that the method is powerful in deciphering the molecular modes of action of a widespread environmental toxicant. The method should be generally applicable for uncovering the cellular pathways triggered by other extracellular stimuli.
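
    The on-the-fly retention-time recalibration mentioned in this record can be sketched as a simple linear refit (an assumption of mine; the authors' exact algorithm is not specified here): fit observed versus library retention times for already-identified anchor peptides, then use the fit to re-center scheduled MRM windows.

```python
def linear_recalibrate(library_rt, observed_rt):
    """Least-squares line mapping library RT -> observed RT (slope, intercept)."""
    n = len(library_rt)
    mx = sum(library_rt) / n
    my = sum(observed_rt) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(library_rt, observed_rt))
    sxx = sum((x - mx) ** 2 for x in library_rt)
    slope = sxy / sxx
    return slope, my - slope * mx

# Toy example: anchor peptides drift by a constant 0.5 min shift.
lib = [10.0, 20.0, 30.0, 40.0]
obs = [10.5, 20.5, 30.5, 40.5]
slope, intercept = linear_recalibrate(lib, obs)

# Re-center a scheduled acquisition window around the predicted RT.
predicted = slope * 25.0 + intercept
print(slope, intercept, predicted)
```

    In practice the shift need not be constant, which is why a slope is fitted as well; refitting repeatedly during the run is what makes the recalibration "on the fly."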

  15. Application of Adenosine Triphosphate Affinity Probe and Scheduled Multiple-Reaction Monitoring Analysis for Profiling Global Kinome in Human Cells in Response to Arsenite Treatment

    PubMed Central

    2015-01-01

    Phosphorylation of cellular components catalyzed by kinases plays important roles in cell signaling and proliferation. Quantitative assessment of perturbation in global kinome may provide crucial knowledge for elucidating the mechanisms underlying the cytotoxic effects of environmental toxicants. Here, we utilized an adenosine triphosphate (ATP) affinity probe coupled with stable isotope labeling by amino acids in cell culture (SILAC) to assess quantitatively the arsenite-induced alteration of global kinome in human cells. We constructed a SILAC-compatible kinome library for scheduled multiple-reaction monitoring (MRM) analysis and adopted on-the-fly recalibration of retention time shift, which provided better throughput of the analytical method and enabled the simultaneous quantification of the expression of ∼300 kinases in two LC-MRM runs. With this improved analytical method, we conducted an in-depth quantitative analysis of the perturbation of kinome of GM00637 human skin fibroblast cells induced by arsenite exposure. Several kinases involved in cell cycle progression, including cyclin-dependent kinases (CDK1 and CDK4) and Aurora kinases A, B, and C, were found to be hyperactivated, and the altered expression of CDK1 was further validated by Western analysis. In addition, treatment with a CDK inhibitor, flavopiridol, partially restored the arsenite-induced growth inhibition of human skin fibroblast cells. Thus, sodium arsenite may confer its cytotoxic effect partly through the aberrant activation of CDKs and the resultant perturbation of cell cycle progression. Together, we developed a high-throughput, SILAC-compatible, and MRM-based kinome profiling method and demonstrated that the method is powerful in deciphering the molecular modes of action of a widespread environmental toxicant. The method should be generally applicable for uncovering the cellular pathways triggered by other extracellular stimuli. PMID:25301106

  16. Assessment of acute myocarditis by cardiac magnetic resonance imaging: Comparison of qualitative and quantitative analysis methods.

    PubMed

    Imbriaco, Massimo; Nappi, Carmela; Puglia, Marta; De Giorgi, Marco; Dell'Aversana, Serena; Cuocolo, Renato; Ponsiglione, Andrea; De Giorgi, Igino; Polito, Maria Vincenza; Klain, Michele; Piscione, Federico; Pace, Leonardo; Cuocolo, Alberto

    2017-10-26

    To compare cardiac magnetic resonance (CMR) qualitative and quantitative analysis methods for the noninvasive assessment of myocardial inflammation in patients with suspected acute myocarditis (AM). A total of 61 patients with suspected AM underwent coronary angiography and CMR. Qualitative analysis was performed applying the Lake Louise Criteria (LLC), followed by quantitative analysis based on the evaluation of edema ratio (ER) and global relative enhancement (RE). Diagnostic performance was assessed for each method by measuring the area under the curve (AUC) of the receiver operating characteristic analyses. The final diagnosis of AM was based on symptoms and signs suggestive of cardiac disease, evidence of myocardial injury as defined by electrocardiogram changes, elevated troponin I, exclusion of coronary artery disease by coronary angiography, and clinical and echocardiographic follow-up at 3 months after admission to the chest pain unit. In all patients, coronary angiography did not show significant coronary artery stenosis. Troponin I and creatine kinase levels were higher in patients with AM compared to those without (both P < .001). There were no significant differences among LLC, T2-weighted short inversion time inversion recovery (STIR) sequences, and early (EGE) and late (LGE) gadolinium-enhancement sequences for the diagnosis of AM. The AUCs for qualitative (T2-weighted STIR 0.92, EGE 0.87, and LGE 0.88) and quantitative (ER 0.89 and global RE 0.80) analyses were also similar. Qualitative and quantitative CMR analysis methods show similar diagnostic accuracy for the diagnosis of AM. These findings suggest that a simplified approach using a shortened CMR protocol including only T2-weighted STIR sequences might be useful to rule out AM in patients with acute coronary syndrome and normal coronary angiography.
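
    The ROC analysis used in this record can be illustrated with a small sketch (the edema-ratio values below are hypothetical, not the study's measurements): the AUC equals the Mann-Whitney probability that a randomly chosen case scores higher than a randomly chosen control.

```python
def auc(scores_pos, scores_neg):
    """AUC via the Mann-Whitney rank statistic; ties count as half a win."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical edema-ratio values for myocarditis cases vs. controls.
er_cases = [2.4, 2.1, 2.6, 1.9, 2.3]
er_controls = [1.8, 1.6, 2.0, 1.7]

print(auc(er_cases, er_controls))
```

    An AUC near 1.0 means the marker separates cases from controls almost perfectly; values like the 0.80-0.92 range reported above indicate good but imperfect discrimination.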

  17. Evaluation of a High Intensity Focused Ultrasound-Immobilized Trypsin Digestion and 18O-Labeling Method for Quantitative Proteomics

    PubMed Central

    López-Ferrer, Daniel; Hixson, Kim K.; Smallwood, Heather; Squier, Thomas C.; Petritis, Konstantinos; Smith, Richard D.

    2009-01-01

    A new method that uses immobilized trypsin concomitant with ultrasonic irradiation results in ultra-rapid digestion and thorough 18O labeling for quantitative protein comparisons. The reproducible and highly efficient method provided effective digestions in <1 min with a minimized amount of enzyme required compared to traditional methods. This method was demonstrated for digestion of both simple and complex protein mixtures, including bovine serum albumin, a global proteome extract from the bacteria Shewanella oneidensis, and mouse plasma, as well as 18O labeling of such complex protein mixtures, which validated the application of this method for differential proteomic measurements. This approach is simple, reproducible, cost effective, rapid, and thus well-suited for automation. PMID:19555078

  18. Global, quantitative and dynamic mapping of protein subcellular localization.

    PubMed

    Itzhak, Daniel N; Tyanova, Stefka; Cox, Jürgen; Borner, Georg Hh

    2016-06-09

    Subcellular localization critically influences protein function, and cells control protein localization to regulate biological processes. We have developed and applied Dynamic Organellar Maps, a proteomic method that allows global mapping of protein translocation events. We initially used maps statically to generate a database with localization and absolute copy number information for over 8700 proteins from HeLa cells, approaching comprehensive coverage. All major organelles were resolved, with exceptional prediction accuracy (estimated at >92%). Combining spatial and abundance information yielded an unprecedented quantitative view of HeLa cell anatomy and organellar composition, at the protein level. We subsequently demonstrated the dynamic capabilities of the approach by capturing translocation events following EGF stimulation, which we integrated into a quantitative model. Dynamic Organellar Maps enable the proteome-wide analysis of physiological protein movements, without requiring any reagents specific to the investigated process, and will thus be widely applicable in cell biology.

  19. Negative health system effects of Global Fund's investments in AIDS, tuberculosis and malaria from 2002 to 2009: systematic review.

    PubMed

    Car, Josip; Paljärvi, Tapio; Car, Mate; Kazeem, Ayodele; Majeed, Azeem; Atun, Rifat

    2012-10-01

    By using the Global Fund as a case example, we aim to critically evaluate the evidence generated from 2002 to 2009 for potential negative health system effects of Global Health Initiatives (GHIs). Design: systematic review of the research literature. Setting: developing countries. Interventions: all interventions potentially affecting health systems that were funded by the Global Fund. Outcome measures: negative health system effects of Global Fund investments as reported by study authors. We identified 24 studies commenting on adverse effects on health systems arising from Global Fund investments. Sixteen were quantitative studies, six were qualitative, and two used both quantitative and qualitative methods, but none explicitly stated that the studies were originally designed to capture or to assess health system effects (positive or negative). Only seemingly anecdotal evidence or the authors' perceptions/interpretations of circumstances could be extracted from the included studies. This study shows that much of the currently available evidence generated between 2002 and 2009 on GHIs' potential negative health system effects is not of the quality expected or needed to best serve the academic or broader community. The majority of the reviewed research did not fulfil the requirements of rigorous scientific evidence.

  20. Indicators of Family Care for Development for Use in Multicountry Surveys

    PubMed Central

    Kariger, Patricia; Engle, Patrice; Britto, Pia M. Rebello; Sywulka, Sara M.; Menon, Purnima

    2012-01-01

    Indicators of family care for development are essential for ascertaining whether families are providing their children with an environment that leads to positive developmental outcomes. This project aimed to develop indicators from a set of items, measuring family care practices and resources important for caregiving, for use in epidemiologic surveys in developing countries. A mixed-method (quantitative and qualitative) design was used for item selection and evaluation. Qualitative and quantitative analyses were conducted to examine the validity of candidate items in several country samples. Qualitative methods included the use of global expert panels to identify and evaluate the performance of each candidate item as well as in-country focus groups to test the content validity of the items. The quantitative methods included analyses of item-response distributions, using bivariate techniques. The selected items measured two family care practices (support for a learning/stimulating environment and limit-setting techniques) and caregiving resources (adequacy of the alternate caregiver when the mother worked). Six play-activity items, indicative of support for a learning/stimulating environment, were included in the core module of UNICEF's Multiple Indicator Cluster Survey 3. The other items were included in optional modules. This project provided, for the first time, a globally relevant set of items for assessing family care practices and resources in epidemiological surveys. These items have multiple uses, including national monitoring and cross-country comparisons of the status of family care for development. The information obtained will reinforce attention to efforts to improve support for children's development. PMID:23304914

  1. Analyzing the impacts of global trade and investment on non-communicable diseases and risk factors: a critical review of methodological approaches used in quantitative analyses.

    PubMed

    Cowling, Krycia; Thow, Anne Marie; Pollack Porter, Keshia

    2018-05-24

    A key mechanism through which globalization has impacted health is the liberalization of trade and investment, yet relatively few studies to date have used quantitative methods to investigate the impacts of global trade and investment policies on non-communicable diseases and risk factors. Recent reviews of this literature have found heterogeneity in results and a range of quality across studies, which may be in part attributable to a lack of conceptual clarity and methodological inconsistencies. This study is a critical review of methodological approaches used in the quantitative literature on global trade and investment and diet, tobacco, alcohol, and related health outcomes, with the objective of developing recommendations and providing resources to guide future robust, policy-relevant research. A review of reviews, expert review, and reference tracing were employed to identify relevant studies, which were evaluated using a novel quality assessment tool designed for this research. Eight review articles and 34 quantitative studies were identified for inclusion. Important ways to improve this literature were identified and discussed: clearly defining exposures of interest and not conflating trade and investment; exploring mechanisms of broader relationships; increasing the use of individual-level data; ensuring consensus and consistency in key confounding variables; utilizing more sector-specific versus economy-wide trade and investment indicators; testing and adequately adjusting for autocorrelation and endogeneity when using longitudinal data; and presenting results from alternative statistical models and sensitivity analyses. To guide the development of future analyses, recommendations for international data sources for selected trade and investment indicators, as well as key gaps in the literature, are presented.
More methodologically rigorous and consistent approaches in future quantitative studies on the impacts of global trade and investment policies on non-communicable diseases and risk factors can help to resolve inconsistencies of existing research and generate useful information to guide policy decisions.

  2. Environmental Sustainability - Including Land and Water Use

    EPA Science Inventory

    Assessments of environmental sustainability can be conducted in many ways with one of the most quantitative methods including Life Cycle Impact Assessment (LCIA). While historically LCIA has included a comprehensive list of impact categories including: ozone depletion, global c...

  3. Quantitative assessment of the differential impacts of arbuscular and ectomycorrhiza on soil carbon cycling.

    PubMed

    Soudzilovskaia, Nadejda A; van der Heijden, Marcel G A; Cornelissen, Johannes H C; Makarov, Mikhail I; Onipchenko, Vladimir G; Maslov, Mikhail N; Akhmetzhanova, Asem A; van Bodegom, Peter M

    2015-10-01

    A significant fraction of carbon stored in the Earth's soil moves through arbuscular mycorrhiza (AM) and ectomycorrhiza (EM). The impacts of AM and EM on the soil carbon budget are poorly understood. We propose a method to quantify the mycorrhizal contribution to carbon cycling, explicitly accounting for the abundance of plant-associated and extraradical mycorrhizal mycelium. We discuss the need to acquire additional data to use our method, and present our new global database holding information on plant species-by-site intensity of root colonization by mycorrhizas. We demonstrate that the degree of mycorrhizal fungal colonization has globally consistent patterns across plant species. This suggests that the level of plant species-specific root colonization can be used as a plant trait. To exemplify our method, we assessed the differential impacts of AM : EM ratio and EM shrub encroachment on carbon stocks in sub-arctic tundra. AM and EM affect tundra carbon stocks at different magnitudes, and via partly distinct dominant pathways: via extraradical mycelium (both EM and AM) and via mycorrhizal impacts on above- and belowground biomass carbon (mostly AM). Our method provides a powerful tool for the quantitative assessment of mycorrhizal impact on local and global carbon cycling processes, paving the way towards an improved understanding of the role of mycorrhizas in the Earth's carbon cycle. © 2015 The Authors. New Phytologist © 2015 New Phytologist Trust.

  4. Influence of an Education Abroad Program on the Intercultural Sensitivity of STEM Undergraduates: A Mixed Methods Study

    ERIC Educational Resources Information Center

    Demetry, Chrysanthe; Vaz, Richard F.

    2017-01-01

    Education abroad programs are becoming more common as a mechanism for developing the global competencies of engineering graduates. An increasing body of research shows that intercultural learning does not occur "de facto" in such programs. This study used quantitative and qualitative methods to explore changes in students' intercultural…

  5. Evaluation of a 3D local multiresolution algorithm for the correction of partial volume effects in positron emission tomography.

    PubMed

    Le Pogam, Adrien; Hatt, Mathieu; Descourt, Patrice; Boussion, Nicolas; Tsoumpas, Charalampos; Turkheimer, Federico E; Prunier-Aesch, Caroline; Baulieu, Jean-Louis; Guilloteau, Denis; Visvikis, Dimitris

    2011-09-01

    Partial volume effects (PVEs) are consequences of the limited spatial resolution in emission tomography leading to underestimation of uptake in tissues of size similar to the point spread function (PSF) of the scanner as well as activity spillover between adjacent structures. Among PVE correction methodologies, a voxel-wise mutual multiresolution analysis (MMA) was recently introduced. MMA is based on the extraction and transformation of high resolution details from an anatomical image (MR/CT) and their subsequent incorporation into a low-resolution PET image using wavelet decompositions. Although this method allows creating PVE corrected images, it is based on a 2D global correlation model, which may introduce artifacts in regions where no significant correlation exists between anatomical and functional details. A new model was designed to overcome these two issues (2D only and global correlation) using a 3D wavelet decomposition process combined with a local analysis. The algorithm was evaluated on synthetic, simulated and patient images, and its performance was compared to the original approach as well as the geometric transfer matrix (GTM) method. Quantitative performance was similar to the 2D global model and GTM in correlated cases. In cases where mismatches between anatomical and functional information were present, the new model outperformed the 2D global approach, avoiding artifacts and significantly improving quality of the corrected images and their quantitative accuracy. A new 3D local model was proposed for a voxel-wise PVE correction based on the original mutual multiresolution analysis approach. Its evaluation demonstrated an improved and more robust qualitative and quantitative accuracy compared to the original MMA methodology, particularly in the absence of full correlation between anatomical and functional information.

  6. Automated extraction of pleural effusion in three-dimensional thoracic CT images

    NASA Astrophysics Data System (ADS)

    Kido, Shoji; Tsunomori, Akinori

    2009-02-01

    Quantitative measurement of the volume of accumulated pleural effusion in three-dimensional thoracic CT images is important for the diagnosis of pulmonary diseases. However, correct automated extraction of pleural effusion is difficult. A conventional extraction algorithm using a gray-level threshold cannot separate pleural effusion from the thoracic wall or mediastinum, because the density of pleural effusion in CT images is similar to that of those structures. We have therefore developed an automated extraction method for pleural effusion that extracts the lung area together with the effusion. Our method uses a lung template, obtained from a normal lung, to segment lungs with pleural effusions. The registration process consists of two steps. The first step is a global matching between the normal and abnormal lungs based on organs such as the bronchi, bones (ribs, sternum, and vertebrae), and the upper surface of the liver, extracted with a region-growing algorithm. The second step is a local matching between the normal lung and the abnormal lung deformed by the parameters obtained from the global matching. Finally, we segment the lung with pleural effusion using the template deformed by the parameters obtained from both the global and the local matching. We compared our method with a conventional gray-level threshold extraction method and two published methods. The extraction rates of pleural effusion obtained with our method were much higher than those of the other methods. This automated extraction approach is promising for the diagnosis of pulmonary diseases because it provides a quantitative volume of accumulated pleural effusion.

  7. Global, quantitative and dynamic mapping of protein subcellular localization

    PubMed Central

    Itzhak, Daniel N; Tyanova, Stefka; Cox, Jürgen; Borner, Georg HH

    2016-01-01

    Subcellular localization critically influences protein function, and cells control protein localization to regulate biological processes. We have developed and applied Dynamic Organellar Maps, a proteomic method that allows global mapping of protein translocation events. We initially used maps statically to generate a database with localization and absolute copy number information for over 8700 proteins from HeLa cells, approaching comprehensive coverage. All major organelles were resolved, with exceptional prediction accuracy (estimated at >92%). Combining spatial and abundance information yielded an unprecedented quantitative view of HeLa cell anatomy and organellar composition, at the protein level. We subsequently demonstrated the dynamic capabilities of the approach by capturing translocation events following EGF stimulation, which we integrated into a quantitative model. Dynamic Organellar Maps enable the proteome-wide analysis of physiological protein movements, without requiring any reagents specific to the investigated process, and will thus be widely applicable in cell biology. DOI: http://dx.doi.org/10.7554/eLife.16950.001 PMID:27278775

  8. Automated Quantitative Nuclear Cardiology Methods

    PubMed Central

    Motwani, Manish; Berman, Daniel S.; Germano, Guido; Slomka, Piotr J.

    2016-01-01

    Quantitative analysis of SPECT and PET has become a major part of nuclear cardiology practice. Current software tools can automatically segment the left ventricle, quantify function, establish myocardial perfusion maps and estimate global and local measures of stress/rest perfusion – all with minimal user input. State-of-the-art automated techniques have been shown to offer high diagnostic accuracy for detecting coronary artery disease, as well as predict prognostic outcomes. This chapter briefly reviews these techniques, highlights several challenges and discusses the latest developments. PMID:26590779

  9. Empowering people to change occupational behaviours to address critical global issues.

    PubMed

    Ikiugu, Moses N; Westerfield, Madeline A; Lien, Jamie M; Theisen, Emily R; Cerny, Shana L; Nissen, Ranelle M

    2015-06-01

    The greatest threat to human well-being in this century is climate change and related global issues. We examined the effectiveness of the Modified Instrumentalism in Occupational Therapy model as a framework for facilitating occupational behaviour change to address climate change and related issues. Eleven individuals participated in this mixed-methods single-subject-design study. Data were gathered using the Modified Assessment and Intervention Instrument for Instrumentalism in Occupational Therapy and Daily Occupational Inventories. Quantitative data were analyzed using two- and three-standard deviation band methods. Qualitative data were analyzed using heuristic phenomenological procedures. Occupational performance changed for five participants. Participants' feelings shifted from frustration and helplessness to empowerment and a desire for action. They felt empowered to find occupation-based solutions to the global issues. Occupation-based interventions that increase personal awareness of the connection between occupational performance and global issues could empower people to be agents for action to ameliorate the issues.

  10. Orthogonal Comparison of GC-MS and 1H NMR Spectroscopy for Short Chain Fatty Acid Quantitation.

    PubMed

    Cai, Jingwei; Zhang, Jingtao; Tian, Yuan; Zhang, Limin; Hatzakis, Emmanuel; Krausz, Kristopher W; Smith, Philip B; Gonzalez, Frank J; Patterson, Andrew D

    2017-08-01

    Short chain fatty acids (SCFAs) are important regulators of host physiology and metabolism and may contribute to obesity and associated metabolic diseases. Interest in SCFAs has increased in part due to the recognized importance of how production of SCFAs by the microbiota may signal to the host. Therefore, reliable, reproducible, and affordable methods for SCFA profiling are required for accurate identification and quantitation. In the current study, four different methods for SCFA (acetic acid, propionic acid, and butyric acid) extraction and quantitation were compared using two independent platforms: gas chromatography coupled with mass spectrometry (GC-MS) and ¹H nuclear magnetic resonance (NMR) spectroscopy. Sensitivity, recovery, repeatability, matrix effect, and validation using mouse fecal samples were determined across all methods. The GC-MS propyl esterification method exhibited superior sensitivity for acetic acid and butyric acid measurement (LOD < 0.01 μg mL⁻¹, LOQ < 0.1 μg mL⁻¹) and recovery accuracy (99.4%-108.3% recovery for a 100 μg mL⁻¹ mixed SCFA standard spike-in and 97.8%-101.8% recovery for a 250 μg mL⁻¹ mixed SCFA standard spike-in). NMR methods, using either quantitation relative to an internal standard or quantitation from a calibration curve, yielded better repeatability and minimal matrix effects compared to the GC-MS methods. All methods generated good calibration curve linearity (R² > 0.99) and comparable measurements of fecal SCFA concentration. Lastly, these methods were used to quantitate fecal SCFAs obtained from conventionally raised (CONV-R) and germ-free (GF) mice. Results from global metabolomic analysis of feces generated by ¹H NMR and bomb calorimetry were used to further validate these approaches.
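The calibration-curve quantitation mentioned above can be sketched in a few lines: fit a linear response curve from standards, check linearity via R², then invert the line to estimate an unknown concentration. This is a minimal illustration only; the standard concentrations and peak areas below are hypothetical, not values from the study.

```python
# Sketch of quantitation via an external calibration curve (hypothetical data).
import statistics

def fit_calibration(concs, responses):
    """Ordinary least-squares fit: response = slope * conc + intercept."""
    mean_x = statistics.fmean(concs)
    mean_y = statistics.fmean(responses)
    sxx = sum((x - mean_x) ** 2 for x in concs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(concs, responses))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    # Coefficient of determination (R^2) to check linearity (> 0.99 expected)
    ss_res = sum((y - (slope * x + intercept)) ** 2
                 for x, y in zip(concs, responses))
    ss_tot = sum((y - mean_y) ** 2 for y in responses)
    r2 = 1 - ss_res / ss_tot
    return slope, intercept, r2

def quantify(response, slope, intercept):
    """Invert the calibration line to estimate concentration (μg/mL)."""
    return (response - intercept) / slope

# Hypothetical standards: concentration (μg/mL) vs. integrated peak area
concs = [10, 50, 100, 250]
areas = [1.1e4, 5.2e4, 1.03e5, 2.55e5]
slope, intercept, r2 = fit_calibration(concs, areas)
unknown = quantify(8.0e4, slope, intercept)  # roughly 78 μg/mL here
```

The same inversion step applies whether the response is a GC-MS peak area or an NMR peak integral; only the calibration standards change.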

  11. Behavioral Economics and Empirical Public Policy

    ERIC Educational Resources Information Center

    Hursh, Steven R.; Roma, Peter G.

    2013-01-01

    The application of economics principles to the analysis of behavior has yielded novel insights on value and choice across contexts ranging from laboratory animal research to clinical populations to national trends of global impact. Recent innovations in demand curve methods provide a credible means of quantitatively comparing qualitatively…

  12. Endobiogeny: a global approach to systems biology (part 1 of 2).

    PubMed

    Lapraz, Jean-Claude; Hedayat, Kamyar M

    2013-01-01

    Endobiogeny is a global systems approach to human biology that may offer an advance in clinical medicine, grounded in the scientific principles of rigor and experimentation and in the humanistic principles of individualization of care and alleviation of suffering with minimization of harm. Endobiogeny is neither a movement away from modern science nor an uncritical embrace of pre-rational methods of inquiry, but a synthesis of quantitative and qualitative relationships reflected in a systems approach to life and based on new mathematical paradigms of pattern recognition.

  13. A method for examining temporal changes in cyanobacterial harmful algal bloom spatial extent using satellite remote sensing..

    EPA Science Inventory

    Cyanobacterial harmful algal blooms (CyanoHAB) are thought to be increasing globally over the past few decades, but relatively little quantitative information is available about the spatial extent of blooms. Satellite remote sensing provides a potential technology for identifying...

  14. Global gene expression analysis by combinatorial optimization.

    PubMed

    Ameur, Adam; Aurell, Erik; Carlsson, Mats; Westholm, Jakub Orzechowski

    2004-01-01

    Generally, there is a trade-off between methods of gene expression analysis that are precise but labor-intensive, e.g. RT-PCR, and methods that scale up to global coverage but are not quite as quantitative, e.g. microarrays. In the present paper, we show how a known method of gene expression profiling (K. Kato, Nucleic Acids Res. 23, 3685-3690 (1995)), which relies on a fairly small number of steps, can be turned into a global gene expression measurement by advanced data post-processing, with potentially little loss of accuracy. Post-processing here entails solving an ancillary combinatorial optimization problem. Validation is performed on in silico experiments generated from the FANTOM database of full-length mouse cDNA. We present two variants of the method. One uses state-of-the-art commercial software for solving problems of this kind; the other uses a code developed by us specifically for this purpose, released in the public domain under the GPL license.

  15. The Mistreatment of Women during Childbirth in Health Facilities Globally: A Mixed-Methods Systematic Review.

    PubMed

    Bohren, Meghan A; Vogel, Joshua P; Hunter, Erin C; Lutsiv, Olha; Makh, Suprita K; Souza, João Paulo; Aguiar, Carolina; Saraiva Coneglian, Fernando; Diniz, Alex Luíz Araújo; Tunçalp, Özge; Javadi, Dena; Oladapo, Olufemi T; Khosla, Rajat; Hindin, Michelle J; Gülmezoglu, A Metin

    2015-06-01

    Despite growing recognition of neglectful, abusive, and disrespectful treatment of women during childbirth in health facilities, there is no consensus at a global level on how these occurrences are defined and measured. This mixed-methods systematic review aims to synthesize qualitative and quantitative evidence on the mistreatment of women during childbirth in health facilities to inform the development of an evidence-based typology of the phenomenon. We searched PubMed, CINAHL, and Embase databases and grey literature using a predetermined search strategy to identify qualitative, quantitative, and mixed-methods studies on the mistreatment of women during childbirth across all geographical and income-level settings. We used a thematic synthesis approach to synthesize the qualitative evidence and assessed the confidence in the qualitative review findings using the CERQual approach. In total, 65 studies were included from 34 countries. Qualitative findings were organized under seven domains: (1) physical abuse, (2) sexual abuse, (3) verbal abuse, (4) stigma and discrimination, (5) failure to meet professional standards of care, (6) poor rapport between women and providers, and (7) health system conditions and constraints. Due to high heterogeneity of the quantitative data, we were unable to conduct a meta-analysis; instead, we present descriptions of study characteristics, outcome measures, and results. Additional themes identified in the quantitative studies are integrated into the typology. This systematic review presents a comprehensive, evidence-based typology of the mistreatment of women during childbirth in health facilities, and demonstrates that mistreatment can occur at the level of interaction between the woman and provider, as well as through systemic failures at the health facility and health system levels. 
We propose this typology be adopted to describe the phenomenon and be used to develop measurement tools and inform future research, programs, and interventions.

  16. Towards collation and modelling of the global cost of armed violence on civilians.

    PubMed

    Taback, Nathan; Coupland, Robin

    2005-01-01

    A method is described which translates qualitative reports about armed violence into meaningful quantitative data allowing an evidence-based approach to the causes and effects of the global health impact of armed violence on unarmed people. Analysis of 100 randomly selected news reports shows that the type of weapon used, the psychological aspect of the violence, the number of weapons in use and the victims' vulnerability independently influence the mortality of victims. Data collated by the same method could be analysed together with indicators of poverty, development and health so illuminating the relationship between such indicators and degradation of peoples' physical security through acts of armed violence. The method could also help uphold the laws of war and human rights.

  17. Evaluation of body-wise and organ-wise registrations for abdominal organs

    NASA Astrophysics Data System (ADS)

    Xu, Zhoubing; Panjwani, Sahil A.; Lee, Christopher P.; Burke, Ryan P.; Baucom, Rebeccah B.; Poulose, Benjamin K.; Abramson, Richard G.; Landman, Bennett A.

    2016-03-01

    Identifying cross-sectional and longitudinal correspondence in the abdomen on computed tomography (CT) scans is necessary for quantitatively tracking change and understanding population characteristics, yet abdominal image registration is a challenging problem. The key difficulty is the large variation in organ dimensions and shapes across subjects. The current standard is global, or body-wise, registration, which aligns images based on global topology. Although this method produces decent results, it is substantially influenced by outliers, leaving room for significant improvement. Here, we study a new local (organ-wise) registration approach that first creates organ-specific bounding boxes and then uses these regions of interest (ROIs) to align references to the target. Based on the Dice Similarity Coefficient (DSC), Mean Surface Distance (MSD), and Hausdorff Distance (HD), the organ-wise approach is demonstrated to yield significantly better results by minimizing the distorting effects of organ variation. This paper exclusively compares the two registration methods, providing novel quantitative and qualitative comparison data, and is a subset of the more comprehensive problem of improving multi-atlas segmentation through organ normalization.
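The Dice Similarity Coefficient used as the primary overlap score above has a very compact definition: twice the intersection of two label masks divided by the sum of their sizes. A minimal sketch, with hypothetical voxel label sets standing in for real segmentation masks:

```python
# Minimal sketch of the Dice Similarity Coefficient (DSC) for overlap scoring.
def dice(mask_a, mask_b):
    """DSC = 2|A ∩ B| / (|A| + |B|); 1.0 means perfect overlap."""
    a, b = set(mask_a), set(mask_b)
    if not a and not b:
        return 1.0  # two empty masks agree trivially
    return 2 * len(a & b) / (len(a) + len(b))

# Hypothetical voxel indices: a 4x4 reference organ mask vs. a registered
# mask shifted by one voxel along x (12 of 16 voxels overlap).
reference = {(x, y, 0) for x in range(4) for y in range(4)}
registered = {(x + 1, y, 0) for x in range(4) for y in range(4)}
score = dice(reference, registered)  # 2*12 / (16+16) = 0.75
```

Surface-based metrics such as MSD and HD complement DSC by penalizing boundary disagreement that a pure volume-overlap score can miss.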

  18. A Dual-Color Reporter Assay of Cohesin-Mediated Gene Regulation in Budding Yeast Meiosis.

    PubMed

    Fan, Jinbo; Jin, Hui; Yu, Hong-Guo

    2017-01-01

    In this chapter, we describe a quantitative fluorescence-based assay of gene expression using the ratio of the reporter green fluorescent protein (GFP) to the internal red fluorescent protein (RFP) control. With this dual-color heterologous reporter assay, we have revealed cohesin-regulated genes and discovered a cis-acting DNA element, the Ty1-LTR, which interacts with cohesin and regulates gene expression during yeast meiosis. The method described here provides an effective cytological approach for the quantitative analysis of global gene expression in budding yeast meiosis.

  19. AI/OR computational model for integrating qualitative and quantitative design methods

    NASA Technical Reports Server (NTRS)

    Agogino, Alice M.; Bradley, Stephen R.; Cagan, Jonathan; Jain, Pramod; Michelena, Nestor

    1990-01-01

    A theoretical framework for integrating qualitative and numerical computational methods for optimally-directed design is described. The theory is presented as a computational model and features of implementations are summarized where appropriate. To demonstrate the versatility of the methodology we focus on four seemingly disparate aspects of the design process and their interaction: (1) conceptual design, (2) qualitative optimal design, (3) design innovation, and (4) numerical global optimization.

  20. Spatial and Global Sensory Suppression Mapping Encompassing the Central 10° Field in Anisometropic Amblyopia.

    PubMed

    Li, Jingjing; Li, Jinrong; Chen, Zidong; Liu, Jing; Yuan, Junpeng; Cai, Xiaoxiao; Deng, Daming; Yu, Minbin

    2017-01-01

    We investigated the efficacy of a novel dichoptic mapping paradigm in evaluating visual function in anisometropic amblyopes. Using standard clinical measures of visual function (visual acuity, stereo acuity, Bagolini lenses, and neutral density filters) and a novel quantitative mapping technique, 26 patients with anisometropic amblyopia (mean age = 19.15 ± 4.42 years) were assessed. Two additional psychophysical measurements of interocular suppression were obtained with dichoptic global motion coherence and binocular phase combination tasks. Luminance reduction was achieved by placing neutral density filters in front of the normal eye. Our study revealed that suppression changes across the central 10° visual field with mean luminance modulation in amblyopes as well as in normal controls. Using simulation and elimination of interocular suppression, we identified a novel method that effectively reflects the distribution of suppression in anisometropic amblyopia. Additionally, the new quantitative mapping technique was in good agreement with conventional clinical measures, such as interocular acuity difference (P < 0.001) and stereo acuity (P = 0.005). There was good consistency between the interocular suppression results from the dichoptic mapping paradigm and the results of the other two psychophysical methods (suppression mapping versus binocular phase combination, P < 0.001; suppression mapping versus global motion coherence, P = 0.005). The dichoptic suppression mapping technique is an effective method for representing impaired visual function in patients with anisometropic amblyopia. It offers potential for "micro-"antisuppression mapping tests and therapies for amblyopia.

  1. The Mistreatment of Women during Childbirth in Health Facilities Globally: A Mixed-Methods Systematic Review

    PubMed Central

    Bohren, Meghan A.; Vogel, Joshua P.; Hunter, Erin C.; Lutsiv, Olha; Makh, Suprita K.; Souza, João Paulo; Aguiar, Carolina; Saraiva Coneglian, Fernando; Diniz, Alex Luíz Araújo; Tunçalp, Özge; Javadi, Dena; Oladapo, Olufemi T.; Khosla, Rajat; Hindin, Michelle J.; Gülmezoglu, A. Metin

    2015-01-01

    Background Despite growing recognition of neglectful, abusive, and disrespectful treatment of women during childbirth in health facilities, there is no consensus at a global level on how these occurrences are defined and measured. This mixed-methods systematic review aims to synthesize qualitative and quantitative evidence on the mistreatment of women during childbirth in health facilities to inform the development of an evidence-based typology of the phenomenon. Methods and Findings We searched PubMed, CINAHL, and Embase databases and grey literature using a predetermined search strategy to identify qualitative, quantitative, and mixed-methods studies on the mistreatment of women during childbirth across all geographical and income-level settings. We used a thematic synthesis approach to synthesize the qualitative evidence and assessed the confidence in the qualitative review findings using the CERQual approach. In total, 65 studies were included from 34 countries. Qualitative findings were organized under seven domains: (1) physical abuse, (2) sexual abuse, (3) verbal abuse, (4) stigma and discrimination, (5) failure to meet professional standards of care, (6) poor rapport between women and providers, and (7) health system conditions and constraints. Due to high heterogeneity of the quantitative data, we were unable to conduct a meta-analysis; instead, we present descriptions of study characteristics, outcome measures, and results. Additional themes identified in the quantitative studies are integrated into the typology. Conclusions This systematic review presents a comprehensive, evidence-based typology of the mistreatment of women during childbirth in health facilities, and demonstrates that mistreatment can occur at the level of interaction between the woman and provider, as well as through systemic failures at the health facility and health system levels. 
We propose this typology be adopted to describe the phenomenon and be used to develop measurement tools and inform future research, programs, and interventions. PMID:26126110

  2. Quantitative Image Recovery From Measured Blind Backscattered Data Using a Globally Convergent Inverse Method

    DTIC Science & Technology

    2012-01-01

    His research interests include inverse problems related to superresolution imaging and metamaterial design. Dr. Fiddy is a Fellow of the Optical Society of America, the IOP, and The

  3. A Review of Quantitative Methods for Evaluating Impacts of Climate Change on Urban Water Infrastructure

    EPA Science Inventory

    It is widely accepted that global climate change will impact the regional and local climate and alter some aspects of the hydrologic cycle, which in turn can affect the performance of urban water supply, wastewater, and storm water infrastructure. How the urban water infrastr...

  4. Locally Weighted Learning Methods for Predicting Dose-Dependent Toxicity with Application to the Human Maximum Recommended Daily Dose

    DTIC Science & Technology

    2012-09-10

    Advanced Technology Research Center, U.S. Army Medical Research and Materiel Command, Fort Detrick, Maryland 21702, United States. ABSTRACT: Toxicological ...species. Thus, it is more advantageous to predict the toxicological effects of a compound on humans directly from the human toxicological data of related ...compounds. However, many popular quantitative structure-activity relationship (QSAR) methods that build a single global model by fitting all training

  5. Assessing the performance of quantitative image features on early stage prediction of treatment effectiveness for ovary cancer patients: a preliminary investigation

    NASA Astrophysics Data System (ADS)

    Zargari, Abolfazl; Du, Yue; Thai, Theresa C.; Gunderson, Camille C.; Moore, Kathleen; Mannel, Robert S.; Liu, Hong; Zheng, Bin; Qiu, Yuchen

    2018-02-01

    The objective of this study is to investigate the performance of global and local features in estimating the characteristics of highly heterogeneous metastatic tumours, in order to accurately predict treatment effectiveness for advanced-stage ovarian cancer patients. To achieve this, a quantitative image analysis scheme was developed to estimate a total of 103 features from three different groups: shape and density, wavelet, and Gray Level Difference Method (GLDM) features. Shape and density features are global features, applied directly to the entire target image; wavelet and GLDM features are local features, applied to divided blocks of the target image. To assess performance, the new scheme was applied to a retrospective dataset containing 120 recurrent, high-grade ovarian cancer patients. The results indicate that the three best-performing features are skewness, root-mean-square (rms), and the mean of local GLDM texture, indicating the importance of integrating local features. In addition, the average prediction performance is comparable among the three categories. This investigation concluded that the local features contain at least as much tumour heterogeneity information as the global features, which may be meaningful for improving the predictive performance of quantitative image markers for the diagnosis and prognosis of ovarian cancer patients.
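Two of the best-performing global features named above, skewness and rms, are simple statistics of the image intensity distribution. A minimal sketch over a flattened list of pixel values (the values below are hypothetical, not data from the study):

```python
# Sketch of two global image features: root-mean-square and skewness.
import math

def rms(values):
    """Root-mean-square of the intensity values."""
    return math.sqrt(sum(v * v for v in values) / len(values))

def skewness(values):
    """Population (Fisher) skewness: E[(x - mu)^3] / sigma^3."""
    n = len(values)
    mu = sum(values) / n
    var = sum((v - mu) ** 2 for v in values) / n
    sigma = math.sqrt(var)
    if sigma == 0:
        return 0.0  # constant image has no asymmetry
    return sum((v - mu) ** 3 for v in values) / (n * sigma ** 3)

# Hypothetical flattened pixel intensities with a long right tail
pixels = [0, 0, 1, 1, 2, 2, 3, 10]
features = {"rms": rms(pixels), "skewness": skewness(pixels)}
```

A strongly positive skewness, as in this toy distribution, indicates a few high-intensity regions against a darker background, the kind of asymmetry that heterogeneous tumours can produce.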

  6. Quantitative Trait Locus Analysis of SIX1-SIX6 with Retinal Nerve Fiber Layer Thickness in Individuals of European Descent

    PubMed Central

    Kuo, Jane Z.; Zangwill, Linda M.; Medeiros, Felipe A.; Liebmann, Jeffery M.; Girkin, Christopher A.; Hammel, Na’ama; Rotter, Jerome I.; Weinreb, Robert N.

    2015-01-01

    Purpose To perform a quantitative trait locus (QTL) analysis and evaluate whether a locus between SIX1 and SIX6 is associated with retinal nerve fiber layer (RNFL) thickness in individuals of European descent. Design Observational, multi-center, cross-sectional study. Methods 231 participants were recruited from the Diagnostic Innovations in Glaucoma Study and the African Descent and Glaucoma Evaluation Study. Association of rs10483727 in SIX1-SIX6 with global and sectoral RNFL thickness was performed. Quantitative trait analysis with the additive model of inheritance was analyzed using linear regression. Trend analysis was performed to evaluate the mean global and sectoral RNFL thickness with 3 genotypes of interest (T/T, C/T, C/C). All models were adjusted for age and gender. Results Direction of association between T allele and RNFL thickness was consistent in the global and different sectoral RNFL regions. Each copy of the T risk allele in rs10483727 was associated with −0.16 μm thinner global RNFL thickness (β=−0.16, 95% CI: −0.28 to −0.03; P=0.01). Similar patterns were found for the sectoral regions, including inferior (P=0.03), inferior-nasal (P=0.017), superior-nasal (P=0.0025), superior (P=0.002) and superior-temporal (P=0.008). The greatest differences were observed in the superior and inferior quadrants, supporting clinical observations for RNFL thinning in glaucoma. Thinner global RNFL was found in subjects with T/T genotypes compared to subjects with C/T and C/C genotypes (P=0.044). Conclusions Each copy of the T risk allele has an additive effect and was associated with thinner global and sectoral RNFL. Findings from this QTL analysis further support a genetic contribution to glaucoma pathophysiology. PMID:25849520

  7. [Globalization of acupuncture technology innovation: a quantitative analysis based on acupuncture patents in the U.S.A].

    PubMed

    Pan, Wei; Hu, Yuan-Jia; Wang, Yi-Tao

    2011-08-01

The structure of the international flow of acupuncture knowledge was explored in this article to promote the globalization of acupuncture technology innovation. Statistical methods were adopted to reveal the geographical distribution of acupuncture patents in the U.S.A., the factors influencing the cumulative advantage of acupuncture techniques, and the innovation value of acupuncture patent applications. Social network analysis was also utilized to establish a global innovation network of acupuncture technology. The results show that the cumulative strength of acupuncture technology correlates with the patent retention period, that the innovative value of an acupuncture invention correlates with the frequency of patent citation, and that the U.S.A. and Canada occupy central positions in the global acupuncture information and technology delivery system.

  8. Global estimates of country health indicators: useful, unnecessary, inevitable?

    PubMed Central

    AbouZahr, Carla; Boerma, Ties; Hogan, Daniel

    2017-01-01

Background: The MDG era relied on global health estimates to fill data gaps and ensure temporal and cross-country comparability in reporting progress. Monitoring the Sustainable Development Goals will present new challenges, requiring enhanced capacities to generate, analyse, interpret and use country-produced data. Objective: To summarize the development of global health estimates and discuss their utility and limitations from global and country perspectives. Design: Descriptive paper based on findings of intercountry workshops, reviews of the literature, and synthesis of experiences. Results: Producers of global health estimates focus on the technical soundness of estimation methods and the comparability of results across countries and over time. By contrast, country users are more concerned about the extent of their involvement in the estimation process and hesitate to buy into estimates derived using methods their technical staff cannot explain and that differ from national data sources. Quantitative summaries of uncertainty may be of limited practical use in policy discussions where decisions need to be made about what to do next. Conclusions: Greater transparency and involvement of country partners in the development of global estimates will help improve ownership, strengthen country capacities for data production and use, and reduce reliance on externally produced estimates. PMID:28532307

  9. An improved method for quantitatively measuring the sequences of total organic carbon and black carbon in marine sediment cores

    NASA Astrophysics Data System (ADS)

    Xu, Xiaoming; Zhu, Qing; Zhou, Qianzhi; Liu, Jinzhong; Yuan, Jianping; Wang, Jianghai

    2018-01-01

Understanding the global carbon cycle is critical for uncovering the mechanisms of global warming and remediating its adverse effects on human activities. Organic carbon in marine sediments is an indispensable part of the global carbon reservoir in global carbon cycling. Evaluating such a reservoir calls for quantitative studies of marine carbon burial, which depend closely on quantifying total organic carbon and black carbon in marine sediment cores and subsequently obtaining their high-resolution temporal sequences. However, conventional methods for detecting the contents of total organic carbon or black carbon cannot resolve the following specific difficulties: (1) a very limited amount of each subsample versus the diverse analytical items, (2) a low and fluctuating recovery rate of total organic carbon or black carbon versus the reproducibility of carbon data, and (3) a large number of subsamples versus the need for rapid batch measurements. In this work, by (i) adopting customized disposable ceramic crucibles with micropore-controlled ability, (ii) developing self-made or customized facilities for the acidification and chemothermal oxidation procedures, and (iii) optimizing the procedures and the carbon-sulfur analyzer, we have built a novel Wang-Xu-Yuan method (the WXY method) for measuring the contents of total organic carbon or black carbon in marine sediment cores. The method includes the procedures of pretreatment, weighing, acidification, chemothermal oxidation and quantification, and can fully meet the requirements for establishing high-resolution temporal sequences, in terms of recovery, experimental efficiency, accuracy and reliability of the measurements, and homogeneity of samples. In particular, the use of disposable ceramic crucibles markedly simplifies the experimental scenario, resulting in very high recovery rates for total organic carbon and black carbon.
This new technique may provide significant support for revealing the mechanism of carbon burial and evaluating the capacity of marine carbon accumulation and sequestration.

  10. Informing urban carbon emissions with atmospheric observations: motivation, methods, and reducing uncertainties.

    NASA Astrophysics Data System (ADS)

    Kort, E. A.; Ware, J.; Duren, R. M.; Schimel, D.; Miller, C. E.; Decola, P.

    2014-12-01

    Urban regions play a dominant role in the anthropogenic perturbation to atmospheric carbon dioxide and methane. With increasing urbanization (notably in developing nations) and increasing emissions, quantitative observational information on emissions of CO2 and CH4 becomes critical for improved understanding of the global carbon cycle and for carbon management/policy decisions. In this presentation, we will discuss the impact uncertainty in anthropogenic emissions has on global carbon-climate understanding, providing broad geophysical motivation for urban studies. We will further discuss observations of urban regions at different scales (satellite vs. in-situ), and investigate the information content of these complementary methods for answering targeted questions on both global carbon fluxes and regional management decisions. Finally, we will present new attempts at reducing uncertainty in high-resolution inversions leveraging remotely sensed aerosol profiles to constrain both mixing depths and vertical distributions of trace gases.

  11. New approaches for the analysis of confluent cell layers with quantitative phase digital holographic microscopy

    NASA Astrophysics Data System (ADS)

    Pohl, L.; Kaiser, M.; Ketelhut, S.; Pereira, S.; Goycoolea, F.; Kemper, Björn

    2016-03-01

Digital holographic microscopy (DHM) enables high-resolution, non-destructive inspection of technical surfaces and minimally invasive, label-free live cell imaging. However, the analysis of confluent cell layers represents a challenge, as quantitative DHM phase images in this case do not provide sufficient information for image segmentation, determination of the cellular dry mass, or calculation of the cell thickness. We present novel strategies for the analysis of confluent cell layers with quantitative DHM phase contrast utilizing a histogram-based evaluation procedure. The applicability of our approach is illustrated by quantification of drug-induced cell morphology changes, and it is shown that the method can reliably quantify global morphology changes of confluent cell layers.
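The paper's actual procedure is more involved, but the core idea of a histogram-based, segmentation-free readout can be sketched as follows; the simulated phase images and the assumed 0.4 rad peak shift are illustrative stand-ins, not the authors' data:

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulated DHM phase images (radians) of a confluent cell layer before and
# after a drug-induced morphology change (cells flatten -> lower phase).
phase_before = rng.normal(2.0, 0.3, size=(256, 256))
phase_after = rng.normal(1.6, 0.3, size=(256, 256))

def histogram_peak(img, bins=200, value_range=(0.0, 4.0)):
    """Phase value at the mode of the image histogram."""
    counts, edges = np.histogram(img, bins=bins, range=value_range)
    i = int(np.argmax(counts))
    return 0.5 * (edges[i] + edges[i + 1])

# Because no per-cell segmentation is needed, the shift of the histogram
# peak tracks a global morphology change of the whole confluent layer.
peak_shift = histogram_peak(phase_before) - histogram_peak(phase_after)
```

The point of the sketch is that a whole-image histogram statistic remains well defined even when individual cell boundaries cannot be segmented.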

  12. Quantitative microbial faecal source tracking with sampling guided by hydrological catchment dynamics.

    PubMed

    Reischer, G H; Haider, J M; Sommer, R; Stadler, H; Keiblinger, K M; Hornek, R; Zerobin, W; Mach, R L; Farnleitner, A H

    2008-10-01

    The impairment of water quality by faecal pollution is a global public health concern. Microbial source tracking methods help to identify faecal sources but the few recent quantitative microbial source tracking applications disregarded catchment hydrology and pollution dynamics. This quantitative microbial source tracking study, conducted in a large karstic spring catchment potentially influenced by humans and ruminant animals, was based on a tiered sampling approach: a 31-month water quality monitoring (Monitoring) covering seasonal hydrological dynamics and an investigation of flood events (Events) as periods of the strongest pollution. The detection of a ruminant-specific and a human-specific faecal Bacteroidetes marker by quantitative real-time PCR was complemented by standard microbiological and on-line hydrological parameters. Both quantitative microbial source tracking markers were detected in spring water during Monitoring and Events, with preponderance of the ruminant-specific marker. Applying multiparametric analysis of all data allowed linking the ruminant-specific marker to general faecal pollution indicators, especially during Events. Up to 80% of the variation of faecal indicator levels during Events could be explained by ruminant-specific marker levels proving the dominance of ruminant faecal sources in the catchment. Furthermore, soil was ruled out as a source of quantitative microbial source tracking markers. This study demonstrates the applicability of quantitative microbial source tracking methods and highlights the prerequisite of considering hydrological catchment dynamics in source tracking study design.
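The reported "up to 80% of the variation ... explained by ruminant-specific marker levels" corresponds to a coefficient of determination from regressing faecal indicator levels on marker levels. A minimal sketch on synthetic numbers (the variables, slope and noise level are assumptions for illustration, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 60

# Synthetic flood-event samples: log10 ruminant-marker concentrations
# (from quantitative real-time PCR) vs. log10 levels of a standard
# faecal indicator (e.g. E. coli).
log_marker = rng.uniform(2.0, 6.0, size=n)
log_indicator = 0.8 * log_marker + 0.5 + rng.normal(0.0, 0.45, size=n)

# Fraction of indicator variation explained by the source-specific marker
slope, intercept = np.polyfit(log_marker, log_indicator, 1)
residuals = log_indicator - (slope * log_marker + intercept)
r_squared = 1.0 - residuals.var() / log_indicator.var()
```

A high `r_squared` during events is what licenses the conclusion that the corresponding source (here, ruminants) dominates the faecal pollution signal.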

  13. Quantitative proteomics in cardiovascular research: global and targeted strategies

    PubMed Central

    Shen, Xiaomeng; Young, Rebeccah; Canty, John M.; Qu, Jun

    2014-01-01

    Extensive technical advances in the past decade have substantially expanded quantitative proteomics in cardiovascular research. This has great promise for elucidating the mechanisms of cardiovascular diseases (CVD) and the discovery of cardiac biomarkers used for diagnosis and treatment evaluation. Global and targeted proteomics are the two major avenues of quantitative proteomics. While global approaches enable unbiased discovery of altered proteins via relative quantification at the proteome level, targeted techniques provide higher sensitivity and accuracy, and are capable of multiplexed absolute quantification in numerous clinical/biological samples. While promising, technical challenges need to be overcome to enable full utilization of these techniques in cardiovascular medicine. Here we discuss recent advances in quantitative proteomics and summarize applications in cardiovascular research with an emphasis on biomarker discovery and elucidating molecular mechanisms of disease. We propose the integration of global and targeted strategies as a high-throughput pipeline for cardiovascular proteomics. Targeted approaches enable rapid, extensive validation of biomarker candidates discovered by global proteomics. These approaches provide a promising alternative to immunoassays and other low-throughput means currently used for limited validation. PMID:24920501

  14. A Spatial Method to Calculate Small-Scale Fisheries Extent

    NASA Astrophysics Data System (ADS)

    Johnson, A. F.; Moreno-Báez, M.; Giron-Nava, A.; Corominas, J.; Erisman, B.; Ezcurra, E.; Aburto-Oropeza, O.

    2016-02-01

Despite global catch per unit effort having redoubled since the 1950s, the global fishing fleet is estimated to be twice the size that the oceans can sustainably support. To gauge the collateral impacts of fishing intensity, we must be able to estimate the spatial extent and number of fishing vessels in the oceans. The methods that currently exist are built around electronic tracking and logbook systems and generally focus on industrial fisheries. The spatial extent of fishing therefore remains elusive for many small-scale fleets, even though these fisheries land the same biomass for human consumption as industrial fisheries. Current methods are data-intensive and require extensive extrapolation when estimates are made across large spatial scales. We present an accessible, spatial method for calculating the extent of small-scale fisheries based on two simple measures that are available, or at least easily estimable, in even the most data-poor fisheries: the number of boats and the local coastal human population. We demonstrate that this method is fishery-type independent and can be used to quantitatively evaluate the efficacy of growth in small-scale fisheries. This method provides an important first step towards estimating the fishing extent of the small-scale fleet, globally.

  15. Quantitative in vivo optical tomography of cancer progression & vasculature development in adult zebrafish

    PubMed Central

    Kumar, Sunil; Lockwood, Nicola; Ramel, Marie-Christine; Correia, Teresa; Ellis, Matthew; Alexandrov, Yuriy; Andrews, Natalie; Patel, Rachel; Bugeon, Laurence; Dallman, Margaret J.; Brandner, Sebastian; Arridge, Simon; Katan, Matilda; McGinty, James; Frankel, Paul; French, Paul M.W.

    2016-01-01

    We describe a novel approach to study tumour progression and vasculature development in vivo via global 3-D fluorescence imaging of live non-pigmented adult zebrafish utilising angularly multiplexed optical projection tomography with compressive sensing (CS-OPT). This “mesoscopic” imaging method bridges a gap between established ~μm resolution 3-D fluorescence microscopy techniques and ~mm-resolved whole body planar imaging and diffuse tomography. Implementing angular multiplexing with CS-OPT, we demonstrate the in vivo global imaging of an inducible fluorescently labelled genetic model of liver cancer in adult non-pigmented zebrafish that also present fluorescently labelled vasculature. In this disease model, addition of a chemical inducer (doxycycline) drives expression of eGFP tagged oncogenic K-RASV12 in the liver of immune competent animals. We show that our novel in vivo global imaging methodology enables non-invasive quantitative imaging of the development of tumour and vasculature throughout the progression of the disease, which we have validated against established methods of pathology including immunohistochemistry. We have also demonstrated its potential for longitudinal imaging through a study of vascular development in the same zebrafish from early embryo to adulthood. We believe that this instrument, together with its associated analysis and data management tools, constitute a new platform for in vivo cancer studies and drug discovery in zebrafish disease models. PMID:27259259

  16. Transgenerational epigenetics: Inheritance of global cytosine methylation and methylation-related epigenetic markers in the shrub Lavandula latifolia.

    PubMed

    Herrera, Carlos M; Alonso, Conchita; Medrano, Mónica; Pérez, Ricardo; Bazaga, Pilar

    2018-04-01

The ecological and evolutionary significance of natural epigenetic variation (i.e., variation not based on DNA sequence variants) will depend critically on whether epigenetic states are transmitted from parents to offspring, but little is known about epigenetic inheritance in nonmodel plants. We present a quantitative analysis of transgenerational transmission of global DNA cytosine methylation (= proportion of all genomic cytosines that are methylated) and individual epigenetic markers (= methylation status of anonymous MSAP markers) in the shrub Lavandula latifolia. Methods based on parent-offspring correlations and parental variance component estimation were applied to epigenetic features of field-growing plants ('maternal parents') and greenhouse-grown progenies. Transmission of genetic markers (AFLP) was also assessed for reference. Maternal parents differed significantly in global DNA cytosine methylation (range = 21.7-36.7%). Greenhouse-grown maternal families differed significantly in global methylation, and their differences were significantly related to maternal origin. Methylation-sensitive amplified polymorphism (MSAP) markers exhibited significant transgenerational transmission, as denoted by a significant maternal variance component of marker scores in greenhouse families and significant mother-offspring correlations of marker scores. Although transmission-related measurements for global methylation and MSAP markers were quantitatively lower than those for the AFLP markers taken as reference, this study has revealed extensive transgenerational transmission of genome-wide global cytosine methylation and anonymous epigenetic markers in L. latifolia. The similarity of results for global cytosine methylation and epigenetic markers lends robustness to this conclusion and stresses the value of considering both types of information in epigenetic studies of nonmodel plants. © 2018 Botanical Society of America.

  17. Herbal hepatotoxicity and WHO global introspection method.

    PubMed

    Teschke, Rolf; Eickhoff, Axel; Wolff, Albrecht; Frenzel, Christian; Schulze, Johannes

    2013-01-01

Herbal hepatotoxicity is a rare but highly disputed disease because numerous confounding variables may complicate accurate causality assessment. Case evaluation is even more difficult when the WHO global introspection method (WHO method) is applied as the diagnostic algorithm. This method lacks liver specificity, hepatotoxicity validation, and quantitative items, basic qualifications required for a sound evaluation of hepatotoxicity cases. Consequently, no data are available on its reliability, sensitivity, specificity, or positive and negative predictive values. Its scope is also limited by the fact that it cannot discriminate between a positive and a negative causality attribution, thereby encouraging overdiagnosis and overreporting of cases. The WHO method ignores uncertainties regarding daily dose; temporal association; start, duration, and end of herbal use; time to onset of the adverse reaction; and the course of liver values after herb discontinuation. Insufficiently considered or ignored are comedications, preexisting liver diseases, alternative explanations upon clinical assessment, and exclusion of infections by hepatitis A-C, cytomegalovirus (CMV), Epstein-Barr virus (EBV), herpes simplex virus (HSV), and varicella zoster virus (VZV). As an alternative, we clearly prefer the CIOMS (Council for International Organizations of Medical Sciences) scale, which is structured, quantitative, liver specific, and validated for hepatotoxicity. In conclusion, causality of herbal hepatotoxicity is best assessed by the liver-specific CIOMS scale validated for hepatotoxicity rather than the obsolete WHO method, which is liver unspecific and not validated for hepatotoxicity. CIOMS-based assessments will ensure the correct diagnosis and exclude alternative diagnoses that may require other specific therapies.

  18. Determining suitable locations for seed transfer under climate change: a global quantitative method

    Treesearch

    Kevin M. Potter; William W. Hargrove

    2012-01-01

    Changing climate conditions will complicate efforts to match seed sources with the environments to which they are best adapted. Tree species distributions may have to shift to match new environmental conditions, potentially requiring the establishment of some species entirely outside of their current distributions to thrive. Even within the portions of tree species...

  19. Synthesis in land change science: methodological patterns, challenges, and guidelines.

    PubMed

    Magliocca, Nicholas R; Rudel, Thomas K; Verburg, Peter H; McConnell, William J; Mertz, Ole; Gerstner, Katharina; Heinimann, Andreas; Ellis, Erle C

Global and regional economic and environmental changes are increasingly influencing local land use, livelihoods, and ecosystems. At the same time, cumulative local land changes are driving global and regional changes in biodiversity and the environment. To understand the causes and consequences of these changes, land change science (LCS) draws on a wide array of synthetic and meta-study techniques to generate global and regional knowledge from local case studies of land change. Here, we review the characteristics and applications of synthesis methods in LCS and assess the current state of synthetic research based on a meta-analysis of synthesis studies from 1995 to 2012. Publication of synthesis research is accelerating, with a clear trend toward increasingly sophisticated and quantitative methods, including meta-analysis. Detailed trends in synthesis objectives, methods, and the land change phenomena and world regions most commonly studied are presented. Significant challenges to successful synthesis research in LCS are also identified, including issues of interpretability and comparability across case studies and the limits of and biases in the geographic coverage of case studies. Nevertheless, synthesis methods based on local case studies will remain essential for generating systematic global and regional understanding of local land change for the foreseeable future, and multiple opportunities exist to accelerate and enhance the reliability of synthetic LCS research in the future. Demand for global and regional knowledge generation will continue to grow to support adaptation and mitigation policies consistent with both the local realities and the regional and global environmental and economic contexts of land change.

  20. Quantitative Analysis Tools and Digital Phantoms for Deformable Image Registration Quality Assurance.

    PubMed

    Kim, Haksoo; Park, Samuel B; Monroe, James I; Traughber, Bryan J; Zheng, Yiran; Lo, Simon S; Yao, Min; Mansur, David; Ellis, Rodney; Machtay, Mitchell; Sohn, Jason W

    2015-08-01

This article proposes quantitative analysis tools and digital phantoms to quantify the intrinsic errors of deformable image registration (DIR) systems and to establish quality assurance (QA) procedures for clinical use of DIR systems, utilizing local and global error analysis methods with clinically realistic digital image phantoms. Landmark-based image registration verifications are suitable only for images with significant feature points. To address this shortfall, we adapted a deformation vector field (DVF) comparison approach with new analysis techniques to quantify the results. Digital image phantoms are derived from data sets of actual patient images (a reference image set, R, and a test image set, T). Image sets from the same patient taken at different times are registered with deformable methods, producing a reference DVFref. Applying DVFref to the original reference image deforms T into a new image R'. The data set R', T, and DVFref thus constitutes a realistic truth set and can therefore be used to analyze any DIR system and expose intrinsic errors by comparing DVFref and DVFtest. For quantitative error analysis, two methods were used to calculate and delineate the differences between DVFs: (1) a local error analysis tool that displays deformation error magnitudes with color mapping on each image slice, and (2) a global error analysis tool that calculates a deformation error histogram, which describes a cumulative probability function of errors for each anatomical structure. Three digital image phantoms were generated from three patients with head and neck, lung, and liver cancer. The DIR QA was evaluated using the head and neck case. © The Author(s) 2014.
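The two DVF-comparison analyses, a local per-voxel error magnitude map and a global cumulative error distribution, can be sketched as follows. The fields are random stand-ins: `dvf_test` is simulated as `dvf_ref` plus noise, mimicking a DIR system's recovery error, rather than output of a real registration:

```python
import numpy as np

rng = np.random.default_rng(2)
shape = (8, 32, 32)  # small (z, y, x) grid; clinical DVFs are much larger

# Stand-in fields: dvf_ref plays the role of the known-truth DVFref,
# dvf_test the DVF recovered by the DIR system under evaluation.
dvf_ref = rng.normal(0.0, 2.0, size=shape + (3,))
dvf_test = dvf_ref + rng.normal(0.0, 0.5, size=shape + (3,))

# (1) Local analysis: per-voxel deformation error magnitude, suitable for
#     colour mapping on each image slice.
err_mag = np.linalg.norm(dvf_test - dvf_ref, axis=-1)
mid_slice = err_mag[shape[0] // 2]  # one slice for display

# (2) Global analysis: cumulative probability function of the errors (the
#     deformation error histogram) over an anatomical-structure mask.
mask = np.ones(shape, dtype=bool)  # stand-in for a structure mask
errors = np.sort(err_mag[mask])
cdf = np.arange(1, errors.size + 1) / errors.size
p95 = errors[np.searchsorted(cdf, 0.95)]  # error bounding 95% of voxels
```

Per-structure masks let the global tool report a separate error distribution for each anatomical structure, as the abstract describes.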

  1. Taqman real-time quantitative PCR for identification of western flower thrip (Frankliniella occidentalis) for plant quarantine

    PubMed Central

    Huang, K. S.; Lee, S. E.; Yeh, Y.; Shen, G. S.; Mei, E.; Chang, C. M.

    2010-01-01

    Western flower thrip (Frankliniella occidentalis) is a major global pest of agricultural products. It directly damages crops through feeding, oviposition activity or transmission of several plant viruses. We describe a Taqman real-time quantitative PCR detection system, which can rapidly identify F. occidentalis from thrips larvae to complement the traditional morphological identification. The data showed that our detection system targeted on the ribosomal RNA gene regions of F. occidentalis has high sensitivity and specificity. The rapid method can be used for on-site testing of samples at ports-of-entry in the future. PMID:20129946

  2. Taqman real-time quantitative PCR for identification of western flower thrip (Frankliniella occidentalis) for plant quarantine.

    PubMed

    Huang, K S; Lee, S E; Yeh, Y; Shen, G S; Mei, E; Chang, C M

    2010-08-23

    Western flower thrip (Frankliniella occidentalis) is a major global pest of agricultural products. It directly damages crops through feeding, oviposition activity or transmission of several plant viruses. We describe a Taqman real-time quantitative PCR detection system, which can rapidly identify F. occidentalis from thrips larvae to complement the traditional morphological identification. The data showed that our detection system targeted on the ribosomal RNA gene regions of F. occidentalis has high sensitivity and specificity. The rapid method can be used for on-site testing of samples at ports-of-entry in the future.

  3. A method to characterize the roughness of 2-D line features: recrystallization boundaries.

    PubMed

    Sun, J; Zhang, Y B; Dahl, A B; Conradsen, K; Juul Jensen, D

    2017-03-01

    A method is presented, which allows quantification of the roughness of nonplanar boundaries of objects for which the neutral plane is not known. The method provides quantitative descriptions of both the local and global characteristics. How the method can be used to estimate the sizes of rough features and local curvatures is also presented. The potential of the method is illustrated by quantification of the roughness of two recrystallization boundaries in a pure Al specimen characterized by scanning electron microscopy. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.
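The abstract does not reproduce the estimator itself, but the general idea of quantifying the roughness of a line feature when no neutral plane is known can be illustrated by measuring deviations from a smoothed version of the boundary itself. Everything below, including the moving-average reference, is an illustrative assumption rather than the authors' method:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic digitized boundary y(x) of a rough 2-D line feature: a tilted
# line plus waviness plus fine-scale noise (all parameters assumed).
x = np.linspace(0.0, 100.0, 501)
y = 0.05 * x + 2.0 * np.sin(x / 8.0) + rng.normal(0.0, 0.2, x.size)

# With no known neutral plane, take a smoothed version of the boundary
# as the local reference and measure deviations from it.
win = 51
y_ref = np.convolve(y, np.ones(win) / win, mode="same")
dev = y - y_ref
valid = dev[win // 2 : -(win // 2)]  # discard filter edge effects

rms_roughness = np.sqrt(np.mean(valid ** 2))  # one global descriptor
local_roughness = np.abs(dev)                 # local descriptor along the line
```

The smoothing window sets the scale separating "shape" from "roughness", which is the kind of choice the paper's local-versus-global descriptors make explicit.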

  4. Barriers to global health development: An international quantitative survey

    PubMed Central

    2017-01-01

Background: Global health's goal of reducing low- and middle-income country versus high-income country health disparities faces complex challenges. Although there have been discussions of barriers, there has not been a broad-based, quantitative survey of such barriers. Methods: 432 global health professionals were invited via email to participate in an online survey, with 268 (62%) participating. The survey assessed participants' (A) demographic and global health background, (B) perceptions regarding the seriousness of 66 barriers, (C) detailed ratings of the barriers designated most serious, and (D) potential solutions. Results: Thirty-four (of 66) barriers were seen as moderately or more serious, highlighting the widespread, significant challenges global health development faces. Perceived barrier seriousness differed significantly across domains: Resource Limitations mean = 2.47 (0–4 Likert scale), Priority Selection mean = 2.20, Corruption, Lack of Competence mean = 1.87, Social and Cultural Barriers mean = 1.68. Some system-level predictors showed significant but relatively limited relations. For instance, for Global Health Domain, HIV and Mental Health had higher levels of perceived Social and Cultural Barriers than other GH Domains. Individual-level global health experience predictors had small but significant effects: the seriousness of (a) Corruption, Lack of Competence and (b) Priority Selection barriers was positively correlated with how LMIC-oriented respondents were (e.g., weeks/year spent in LMICs), whereas Academic Global Health Achievement (e.g., number of global health publications) was negatively correlated with overall barrier seriousness. Conclusions: That comparatively few system-level predictors (e.g., Organization Type) were significant suggests these barriers may be relatively fundamental at the system level.
Individual-level and system-level effects do have policy implications; e.g., Priority Selection barriers were among the most serious, yet the effects of how LMIC-oriented a professional was versus their level of academic global health achievement ran in opposite directions, suggesting that increased discussion of priorities between LMIC-based and other professionals may be useful. It is hoped the 22 suggested solutions will provide useful ideas for addressing global health barriers. PMID:28972971

  5. Prediction of Coronal Mass Ejections From Vector Magnetograms: Quantitative Measures as Predictors

    NASA Technical Reports Server (NTRS)

    Falconer, D. A.; Moore, R. L.; Gary, G. A.; Rose, M. Franklin (Technical Monitor)

    2001-01-01

We derived two quantitative measures of an active region's global nonpotentiality from the region's vector magnetogram: 1) the net current (I(sub N)), and 2) the length of the strong-shear, strong-field main neutral line (L(sub ss)), and used these two measures in a pilot study of the CME productivity of 4 active regions. We compared the global nonpotentiality measures to the active regions' CME productivity determined from GOES and Yohkoh/SXT observations. We found that two of the active regions were highly globally nonpotential and were CME productive, while the other two active regions had little global nonpotentiality and produced no CMEs. At the Fall 2000 AGU, we reported on an expanded study (12 active regions and 17 magnetograms) in which we evaluated four quantitative global measures of an active region's magnetic field and compared these measures with the CME productivity. The four global measures (all derived from MSFC vector magnetograms) included our two previous measures (I(sub N) and L(sub ss)) as well as two new ones: the total magnetic flux (PHI), a measure of an active region's size, and the normalized twist (alpha(bar) = mu I(sub N)/PHI). We found that the three quantitative measures of global nonpotentiality (I(sub N), L(sub ss), alpha(bar)) were all well correlated (greater than 99% confidence level) with an active region's CME productivity within plus or minus 2 days of the day of the magnetogram. We will now report our findings on how good our quantitative measures are as predictors of active-region CME productivity, using only CMEs that occurred after the magnetogram. We report the preliminary skill test of these quantitative measures as predictors. We compare the CME prediction success of our quantitative measures to the prediction success based on an active region's past CME productivity. We examine the handful of false positives and false negatives to look for improvements to our predictors.
This work is funded by NSF through the Space Weather Program and by NASA through the Solar Physics Supporting Research and Technology Program.
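Of the four measures, only the normalized twist is given by an explicit formula in the abstract, alpha(bar) = mu I(sub N)/PHI. A one-function numeric sketch, where the sample current and flux values are hypothetical and mu is taken as the vacuum permeability:

```python
import math

MU0 = 4 * math.pi * 1e-7  # vacuum permeability mu, in H/m

def normalized_twist(net_current, total_flux):
    """alpha(bar) = mu * I_N / PHI from the abstract; with I_N in amperes
    and PHI in webers, the result has units of 1/m."""
    return MU0 * net_current / total_flux

# Hypothetical active-region values: net current ~1e12 A and total
# magnetic flux ~1e22 Mx = 1e14 Wb.
alpha_bar = normalized_twist(1e12, 1e14)
```

Dividing the net current by the total flux normalizes away active-region size, which is why alpha(bar) behaves as a nonpotentiality measure while PHI alone measures size.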

  6. Global Lithospheric Apparent Susceptibility Distribution Converted from Geomagnetic Models by CHAMP and Swarm Satellite Magnetic Measurements

    NASA Astrophysics Data System (ADS)

    Du, Jinsong; Chen, Chao; Xiong, Xiong; Li, Yongdong; Liang, Qing

    2016-04-01

    Recently, because of continually accumulated magnetic measurements by CHAMP satellite and Swarm constellation of three satellites and well developed methodologies and techniques of data processing and geomagnetic field modeling etc., global lithospheric magnetic anomaly field models become more and more reliable. This makes the quantitative interpretation of lithospheric magnetic anomaly field possible for having an insight into large-scale magnetic structures in the crust and uppermost mantle. Many different approaches have been utilized to understand the magnetized sources, such as forward, inversion, statistics, correlation analysis, Euler deconvolution, signal transformations etc. Among all quantitative interpretation methods, the directly converting a magnetic anomaly map into a magnetic susceptibility anomaly map proposed by Arkani-Hamed & Strangway (1985) is, we think, the most fast quantitative interpretation tool for global studies. We just call this method AS85 hereinafter for short. Although Gubbins et al. (2011) provided a formula to directly calculate the apparent magnetic vector distribution, the AS85 method introduced constraints of magnetized direction and thus corresponding results are expected to be more robust especially in world-wide continents. Therefore, in this study, we first improved the AS85 method further considering non-axial dipolar inducing field using formulae by Nolte & Siebert (1987), initial model or priori information for starting coefficients in the apparent susceptibility conversion, hidden longest-wavelength components of lithospheric magnetic field and field contaminations from global oceanic remanent magnetization. Then, we used the vertically integrated susceptibility model by Hemant & Maus (2005) and vertically integrated remanent magnetization model by Masterton et al. (2013) to test the validity of our improved method. 
Subsequently, we applied the conversion method to geomagnetic field models built from CHAMP and Swarm satellite magnetic measurements and obtained global lithospheric apparent susceptibility distribution models. Finally, we compared these models with previous results in the literature and with other geophysical, geodetic and geologic data. Both the tests and the applications suggest that the improved AS85 method can be adopted as a fast and effective interpretation tool for global induced large-scale magnetic anomaly field models expressed in spherical harmonics. Arkani-Hamed, J. & Strangway, D.W., 1985. Lateral variations of apparent magnetic susceptibility of lithosphere deduced from Magsat data, J. Geophys. Res., 90(B3), 2655-2664. Gubbins, D., Ivers, D., Masterton, S.M. & Winch, D.E., 2011. Analysis of lithospheric magnetization in vector spherical harmonics, Geophys. J. Int., 187(1), 99-117. Hemant, K. & Maus, S., 2005. Geological modeling of the new CHAMP magnetic anomaly maps using a geographical information system technique, J. Geophys. Res., 110, B12103, doi: 10.1029/2005JB003837. Masterton, S.M., Gubbins, D., Müller, R.D. & Singh, K.H., 2013. Forward modeling of oceanic lithospheric magnetization, Geophys. J. Int., 192(3), 951-962. Nolte, H.J. & Siebert, M., 1987. An analytical approach to the magnetic field of the Earth's crust, J. Geophys., 61, 69-76. This study is supported by State Key Laboratory of Geodesy and Earth's Dynamics (Institute of Geodesy and Geophysics, Chinese Academy of Sciences) (SKLGED2015-5-5-EZ), Natural Science Fund of Hubei Province (2015CFB361), International Cooperation Project in Science and Technology of China (2010DFA24580), China Postdoctoral Science Foundation (2015M572217 and 2014T70753), Hubei Subsurface Multi-scale Imaging Key Laboratory (Institute of Geophysics and Geomatics, China University of Geosciences, Wuhan) (SMIL-2015-06) and National Natural Science Foundation of China (41574070, 41104048 and 41504065).

  7. Maximum entropy estimation of a Benzene contaminated plume using ecotoxicological assays.

    PubMed

    Wahyudi, Agung; Bartzke, Mariana; Küster, Eberhard; Bogaert, Patrick

    2013-01-01

    Ecotoxicological bioassays, e.g. based on Danio rerio teratogenicity (DarT) or the acute luminescence inhibition of Vibrio fischeri, could potentially yield significant benefits for detecting on-site contaminations on a qualitative or semi-quantitative basis. The aim was to use the observed effects of two ecotoxicological assays to estimate the extent of a benzene groundwater contamination plume. We used a Maximum Entropy (MaxEnt) method to rebuild a bivariate probability table that links the observed toxicity from the bioassays with benzene concentrations. Compared with direct mapping of the contamination plume as obtained from groundwater samples, the MaxEnt concentration map exhibits on average slightly higher concentrations, though the global pattern is close to it. This suggests that MaxEnt is a valuable method for building a relationship between quantitative data, e.g. contaminant concentrations, and more qualitative or indirect measurements in a spatial mapping framework, which is especially useful when a clear quantitative relation is not at hand. Copyright © 2012 Elsevier Ltd. All rights reserved.
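    The core of the MaxEnt reconstruction can be sketched in stripped-down form: with only marginal totals as constraints, the maximum-entropy joint table is recovered by iterative proportional fitting. The class labels and marginal values below are invented for illustration and are not data from the study.

```python
# Illustrative sketch: maximum-entropy reconstruction of a bivariate
# probability table (toxicity class x concentration class) subject to
# fixed marginals, via iterative proportional fitting (IPF).
# All numbers are made up for illustration; they are not from the study.

def ipf(row_marg, col_marg, iters=200):
    """Return the maximum-entropy joint table with the given marginals."""
    n, m = len(row_marg), len(col_marg)
    p = [[1.0 / (n * m)] * m for _ in range(n)]  # uniform start
    for _ in range(iters):
        # scale rows to match the row marginals
        for i in range(n):
            s = sum(p[i])
            p[i] = [v * row_marg[i] / s for v in p[i]]
        # scale columns to match the column marginals
        for j in range(m):
            s = sum(p[i][j] for i in range(n))
            for i in range(n):
                p[i][j] *= col_marg[j] / s
    return p

# Hypothetical marginals: P(toxicity class) and P(concentration class)
tox = [0.5, 0.3, 0.2]      # low / medium / high observed toxicity
conc = [0.6, 0.25, 0.15]   # low / medium / high benzene concentration
table = ipf(tox, conc)
print([round(sum(row), 3) for row in table])   # -> [0.5, 0.3, 0.2]
```

With only marginal constraints the MaxEnt solution is the product of the marginals; the study's table additionally encodes the observed toxicity-concentration coupling, which extra constraints would capture.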

  8. Global human capital: integrating education and population.

    PubMed

    Lutz, Wolfgang; KC, Samir

    2011-07-29

    Almost universally, women with higher levels of education have fewer children. Better education is associated with lower mortality, better health, and different migration patterns. Hence, the global population outlook depends greatly on further progress in education, particularly of young women. By 2050, the highest and lowest education scenarios--assuming identical education-specific fertility rates--result in world population sizes of 8.9 and 10.0 billion, respectively. Better education also matters for human development, including health, economic growth, and democracy. Existing methods of multi-state demography can quantitatively integrate education into standard demographic analysis, thus adding the "quality" dimension.
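    The leverage of education-specific fertility on the population outlook can be illustrated with a toy multi-state projection. The states, rates and starting compositions below are invented, and daughters simply inherit their mothers' education state in this stripped-down sketch.

```python
# Toy multi-state illustration: identical education-specific fertility,
# different education compositions -> different population outcomes.
# All rates and population shares are invented for illustration.

def next_generation(pop, fert):
    """pop, fert: dicts keyed by education state; daughters inherit
    their mothers' state in this stripped-down sketch."""
    return {s: pop[s] * fert[s] for s in pop}

fert = {"low": 1.2, "high": 0.8}     # daughters per woman (illustrative)
slow = {"low": 70.0, "high": 30.0}   # slow education expansion scenario
fast = {"low": 30.0, "high": 70.0}   # fast education expansion scenario

for _ in range(2):                   # project two generations
    slow = next_generation(slow, fert)
    fast = next_generation(fast, fert)

print(round(sum(slow.values()), 1), round(sum(fast.values()), 1))  # -> 120.0 88.0
```

Even with identical education-specific rates, the scenario with more educated women ends up markedly smaller, which is the mechanism behind the 8.9 versus 10.0 billion scenario spread.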

  9. High-throughput, label-free, single-cell, microalgal lipid screening by machine-learning-equipped optofluidic time-stretch quantitative phase microscopy.

    PubMed

    Guo, Baoshan; Lei, Cheng; Kobayashi, Hirofumi; Ito, Takuro; Yalikun, Yaxiaer; Jiang, Yiyue; Tanaka, Yo; Ozeki, Yasuyuki; Goda, Keisuke

    2017-05-01

    The development of reliable, sustainable, and economical sources of alternative fuels to petroleum is required to tackle the global energy crisis. One such alternative is microalgal biofuel, which is expected to play a key role in reducing the detrimental effects of global warming, as microalgae absorb atmospheric CO2 via photosynthesis. Unfortunately, conventional analytical methods only provide population-averaged lipid amounts and fail to characterize a diverse population of microalgal cells with single-cell resolution in a non-invasive and interference-free manner. Here, high-throughput, label-free, single-cell screening of lipid-producing microalgal cells with optofluidic time-stretch quantitative phase microscopy was demonstrated. In particular, Euglena gracilis, an attractive microalgal species that produces wax esters (suitable for biodiesel and aviation fuel after refinement) within lipid droplets, was investigated. The optofluidic time-stretch quantitative phase microscope is based on an integration of a hydrodynamic-focusing microfluidic chip, an optical time-stretch quantitative phase microscope, and a digital image processor equipped with machine learning. As a result, it provides both the opacity and phase maps of every single cell at a high throughput of 10,000 cells/s, enabling accurate cell classification without the need for fluorescent staining. Specifically, the dataset was used to characterize heterogeneous populations of E. gracilis cells under two different culture conditions (nitrogen-sufficient and nitrogen-deficient) and to achieve cell classification with an error rate of only 2.15%. The method holds promise as an effective analytical tool for microalgae-based biofuel production. © 2017 International Society for Advancement of Cytometry.
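    The classification step can be sketched in miniature: each cell is reduced to a feature vector (here, mean opacity and mean phase shift) and assigned to the nearest class centroid. The synthetic feature clouds and the nearest-centroid rule below are illustrative stand-ins for the paper's machine-learning pipeline, not its actual model or data.

```python
# Minimal sketch of label-free classification from two morphological
# features (mean opacity, mean phase shift) with a nearest-centroid rule.
# Feature values are synthetic, not measurements from the paper.
import math
import random

random.seed(0)

def cloud(mu, n):
    """Isotropic Gaussian cloud of n points around a class centroid."""
    return [(random.gauss(mu[0], 0.5), random.gauss(mu[1], 0.5)) for _ in range(n)]

train = {"N-sufficient": cloud((1.0, 1.0), 200),
         "N-deficient": cloud((3.0, 2.5), 200)}
centroids = {c: (sum(x for x, _ in pts) / len(pts),
                 sum(y for _, y in pts) / len(pts))
             for c, pts in train.items()}

def classify(p):
    return min(centroids, key=lambda c: math.dist(p, centroids[c]))

# Resubstitution error only, for illustration (no held-out set)
labelled = [(p, c) for c, pts in train.items() for p in pts]
errors = sum(classify(p) != c for p, c in labelled)
print(f"error rate: {errors / len(labelled):.2%}")
```

A real pipeline would use held-out data and a stronger classifier, but the structure (features in, class label out, error rate as the figure of merit) is the same.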

  10. Development and application of a DNA microarray-based yeast two-hybrid system

    PubMed Central

    Suter, Bernhard; Fontaine, Jean-Fred; Yildirimman, Reha; Raskó, Tamás; Schaefer, Martin H.; Rasche, Axel; Porras, Pablo; Vázquez-Álvarez, Blanca M.; Russ, Jenny; Rau, Kirstin; Foulle, Raphaele; Zenkner, Martina; Saar, Kathrin; Herwig, Ralf; Andrade-Navarro, Miguel A.; Wanker, Erich E.

    2013-01-01

    The yeast two-hybrid (Y2H) system is the most widely applied methodology for systematic protein–protein interaction (PPI) screening and the generation of comprehensive interaction networks. We developed a novel Y2H interaction screening procedure using DNA microarrays for high-throughput quantitative PPI detection. Applying a global pooling and selection scheme to a large collection of human open reading frames, proof-of-principle Y2H interaction screens were performed for the human neurodegenerative disease proteins huntingtin and ataxin-1. Using systematic controls for unspecific Y2H results and quantitative benchmarking, we identified and scored a large number of known and novel partner proteins for both huntingtin and ataxin-1. Moreover, we show that this parallelized screening procedure and the global inspection of Y2H interaction data are uniquely suited to define specific PPI patterns and their alteration by disease-causing mutations in huntingtin and ataxin-1. This approach takes advantage of the specificity and flexibility of DNA microarrays and of the existence of well-established statistical methods for the analysis of DNA microarray data, and allows a quantitative approach toward interaction screens in humans and in model organisms. PMID:23275563

  11. Charting organellar importomes by quantitative mass spectrometry

    PubMed Central

    Peikert, Christian D.; Mani, Jan; Morgenstern, Marcel; Käser, Sandro; Knapp, Bettina; Wenger, Christoph; Harsman, Anke; Oeljeklaus, Silke; Schneider, André; Warscheid, Bettina

    2017-01-01

    Protein import into organelles is essential for all eukaryotes and facilitated by multi-protein translocation machineries. Analyses of whether a protein is transported into an organelle are largely restricted to single constituents. This renders knowledge about imported proteins incomplete, limiting our understanding of organellar biogenesis and function. Here we introduce a method that enables charting an organelle's importome. The approach relies on inducible RNAi-mediated knockdown of an essential subunit of a translocase to impair import, combined with quantitative mass spectrometry. To highlight its potential, we established the mitochondrial importome of Trypanosoma brucei, comprising 1,120 proteins including 331 new candidates. Furthermore, the method allows for the identification of proteins with dual or multiple locations and of the substrates of distinct protein import pathways. We demonstrate the specificity and versatility of this ImportOmics method by targeting import factors in mitochondria and glycosomes, which demonstrates its potential for globally studying protein import and the inventories of organelles. PMID:28485388
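    The logic of the importome call can be sketched as a simple filter: proteins whose abundance drops in the knockdown (import-impaired) condition relative to control are flagged as candidate import substrates. The protein names, ratios and cutoff below are invented, not values from the study.

```python
# Sketch of an importome call: proteins depleted in the knockdown
# (impaired import) relative to control are flagged as candidate import
# substrates. Ratios and the cutoff are hypothetical illustrations.

ratios = {"ProtA": 0.35,   # knockdown/control abundance ratios
          "ProtB": 1.02,
          "ProtC": 0.48,
          "ProtD": 0.97}
CUTOFF = 0.67              # hypothetical depletion threshold

substrates = sorted(p for p, r in ratios.items() if r < CUTOFF)
print(substrates)          # -> ['ProtA', 'ProtC']
```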

  12. The Post-9/11 GI Bill: Insights from Veterans Using Department of Veterans Affairs Educational Benefits

    ERIC Educational Resources Information Center

    Bell, Geri L.; Boland, Elizabeth A.; Dudgeon, Brian; Johnson, Kurt

    2013-01-01

    Since the Post-9/11 GI Bill was implemented in August of 2009, increasing numbers of veterans returning from the Global War on Terror (GWT) have drawn on Department of Veterans Affairs (VA) educational benefits. Based on the findings of a mixed-methods study, quantitative and qualitative survey responses from veterans enrolled at a major…

  13. Investigation of Carbon Fiber Architecture in Braided Composites Using X-Ray CT Inspection

    NASA Technical Reports Server (NTRS)

    Rhoads, Daniel J.; Miller, Sandi G.; Roberts, Gary D.; Rauser, Richard W.; Golovaty, Dmitry; Wilber, J. Patrick; Espanol, Malena I.

    2017-01-01

    During the fabrication of braided carbon fiber composite materials, process variations occur which affect the fiber architecture. Quantitative measurements of local and global fiber architecture variations are needed to determine the potential effect of process variations on mechanical properties of the cured composite. Although non-destructive inspection via X-ray CT imaging is a promising approach, difficulties in quantitative analysis of the data arise due to the similar densities of the material constituents. In an effort to gain more quantitative information about features related to fiber architecture, methods have been explored to improve the details that can be captured by X-ray CT imaging. Metal-coated fibers and thin veils are used as inserts to extract detailed information about fiber orientations and inter-ply behavior from X-ray CT images.

  14. Qualitative to quantitative: linked trajectory of method triangulation in a study on HIV/AIDS in Goa, India.

    PubMed

    Bailey, Ajay; Hutter, Inge

    2008-10-01

    With 3.1 million people estimated to be living with HIV/AIDS in India and 39.5 million globally, the epidemic has challenged academics to identify behaviours and their underlying beliefs in the effort to reduce the risk of HIV transmission. The Health Belief Model (HBM) is frequently used to identify risk behaviours and adherence behaviour in the field of HIV/AIDS. Risk behaviour studies that apply the HBM have been largely quantitative, and use of qualitative methodology is rare. The marriage of qualitative and quantitative methods has never been easy. The challenge lies in triangulating the methods. Method triangulation has largely been used to combine insights from qualitative and quantitative methods, but not to link the two. In this paper we suggest a linked trajectory of method triangulation (LTMT). The linked trajectory aims first to gather individual-level information through in-depth interviews and then to present that information as vignettes in focus group discussions. We thereby validate the information obtained from the in-depth interviews and gather the emic concepts that arise from the interaction, capturing both the interpretation and the interaction angles of the qualitative method. Further, the qualitative information gained is used to design a survey, so that the survey questions are grounded and contextualized. We employed this linked trajectory of method triangulation in a study on the risk assessment of HIV/AIDS among migrant and mobile men. Fieldwork was carried out in Goa, India. Data come from two waves of studies: first, an explorative qualitative study (2003); second, a larger study (2004-2005) including in-depth interviews (25), focus group discussions (21) and a survey (n=1259). By employing the qualitative-to-quantitative LTMT we can not only contextualize the existing concepts of the HBM, but also validate new concepts and identify new risk groups.

  15. The Quantitative Evaluation of the Clinical and Translational Science Awards (CTSA) Program Based on Science Mapping and Scientometric Analysis

    PubMed Central

    Zhang, Yin; Wang, Lei

    2013-01-01

    The Clinical and Translational Science Awards (CTSA) program is one of the most important initiatives in translational medical funding. The quantitative evaluation of the efficiency and performance of the CTSA program provides an important reference for decision making in global translational medical funding. Using science mapping and scientometric analytic tools, this study quantitatively analyzed the scientific articles funded by the CTSA program. The results showed that the productivity of the CTSA program has increased steadily since 2008. In addition, the emerging trends of the research funded by the CTSA program covered clinical and basic medical research fields. The academic benefits of the CTSA program included helping its members build a robust academic home for Clinical and Translational Science and attract additional financial support. This study provided a quantitative evaluation of the CTSA program based on science mapping and scientometric analysis. Further research is required to compare and optimize other quantitative methods and to integrate various research results. PMID:24330689

  16. The quantitative evaluation of the Clinical and Translational Science Awards (CTSA) program based on science mapping and scientometric analysis.

    PubMed

    Zhang, Yin; Wang, Lei; Diao, Tianxi

    2013-12-01

    The Clinical and Translational Science Awards (CTSA) program is one of the most important initiatives in translational medical funding. The quantitative evaluation of the efficiency and performance of the CTSA program provides an important reference for decision making in global translational medical funding. Using science mapping and scientometric analytic tools, this study quantitatively analyzed the scientific articles funded by the CTSA program. The results showed that the productivity of the CTSA program has increased steadily since 2008. In addition, the emerging trends of the research funded by the CTSA program covered clinical and basic medical research fields. The academic benefits of the CTSA program included helping its members build a robust academic home for Clinical and Translational Science and attract additional financial support. This study provided a quantitative evaluation of the CTSA program based on science mapping and scientometric analysis. Further research is required to compare and optimize other quantitative methods and to integrate various research results. © 2013 Wiley Periodicals, Inc.

  17. Global and 3D Spatial Assessment of Neuroinflammation in Rodent Models of Multiple Sclerosis

    PubMed Central

    Gupta, Shashank; Utoft, Regine; Hasseldam, Henrik; Schmidt-Christensen, Anja; Hannibal, Tine Dahlbaek; Hansen, Lisbeth; Fransén-Pettersson, Nina; Agarwal-Gupta, Noopur; Rozell, Björn; Andersson, Åsa; Holmberg, Dan

    2013-01-01

    Multiple Sclerosis (MS) is a progressive autoimmune inflammatory and demyelinating disease of the central nervous system (CNS). T cells play a key role in the progression of neuroinflammation in MS and also in the experimental autoimmune encephalomyelitis (EAE) animal models of the disease. A technology for quantitative and three-dimensional (3D) spatial assessment of inflammation in this and other CNS inflammatory conditions is much needed. Here we present a procedure for 3D spatial assessment and global quantification of the development of neuroinflammation based on Optical Projection Tomography (OPT). Applying this approach to the analysis of rodent models of MS, we provide global quantitative data on the major inflammatory component as a function of the clinical course. Our data demonstrate a strong correlation between the development and progression of neuroinflammation and clinical disease in several mouse models and a rat model of MS, refining the information regarding the spatial dynamics of the inflammatory component in EAE. This method provides a powerful tool to investigate the effect of environmental and genetic forces and for assessing the therapeutic effects of drug therapy in animal models of MS and other neuroinflammatory/neurodegenerative disorders. PMID:24124545

  18. 'Talk to me': a mixed methods study on preferred physician behaviours during end-of-life communication from the patient perspective.

    PubMed

    Abdul-Razzak, Amane; Sherifali, Diana; You, John; Simon, Jessica; Brazil, Kevin

    2016-08-01

    Despite the recognized importance of end-of-life (EOL) communication between patients and physicians, the extent and quality of such communication is lacking. We sought to understand patient perspectives on physician behaviours during EOL communication. In this mixed methods study, we conducted quantitative and qualitative strands and then merged the data sets during a mixed methods analysis phase. In the quantitative strand, we used the quality of communication tool (QOC) to measure physician behaviours that predict global rating of satisfaction in EOL communication skills, while in the qualitative strand we conducted semi-structured interviews. During the mixed methods analysis, we compared and contrasted the qualitative and quantitative data. Participants were seriously ill inpatients at three tertiary care hospitals in Canada. We found convergence between the qualitative and quantitative strands: patients desire candid information from their physician and a sense of familiarity. The quantitative results (n = 132) suggest a paucity of certain EOL communication behaviours in this seriously ill population with a limited prognosis. The qualitative findings (n = 16) suggest that at times physicians did not engage in EOL communication despite patient readiness, while at other times this may represent an appropriate deferral after assessment of a patient's lack of readiness. Avoidance of certain EOL topics may not always be a failure if it results from an assessment of lack of patient readiness. This has implications for future tool development: a measure could be built in to assess whether physician behaviours align with patient readiness. © 2015 The Authors. Health Expectations Published by John Wiley & Sons Ltd.

  19. Modeling and simulation of surfactant-polymer flooding using a new hybrid method

    NASA Astrophysics Data System (ADS)

    Daripa, Prabir; Dutta, Sourav

    2017-04-01

    Chemical enhanced oil recovery by surfactant-polymer (SP) flooding has been studied in two space dimensions. A new global pressure for incompressible, immiscible, multicomponent two-phase porous media flow has been derived in the context of SP flooding. This has been used to formulate a system of flow equations that incorporates the effect of capillary pressure and also the effect of polymer and surfactant on viscosity, interfacial tension and relative permeabilities of the two phases. The coupled system of equations for pressure, water saturation, polymer concentration and surfactant concentration has been solved using a new hybrid method in which the elliptic global pressure equation is solved using a discontinuous finite element method and the transport equations for water saturation and concentrations of the components are solved by a Modified Method Of Characteristics (MMOC) in the multicomponent setting. Numerical simulations have been performed to validate the method, both qualitatively and quantitatively, and to evaluate the relative performance of the various flooding schemes for several different heterogeneous reservoirs.
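    The MMOC component can be illustrated in its simplest setting, linear advection on a periodic 1D grid: each node is traced back along its characteristic and the old profile is interpolated at the foot point. The paper's MMOC handles nonlinear multicomponent transport coupled to the pressure solve; this sketch shows only the underlying idea.

```python
# Illustrative 1D method-of-characteristics step for linear advection
# c_t + v*c_x = 0 on a periodic grid: trace each node back a distance
# v*dt along its characteristic and linearly interpolate the old profile
# there. Grid sizes and the pulse below are invented for illustration.

def moc_step(c, v, dt, dx):
    n = len(c)
    out = []
    for i in range(n):
        x = (i * dx - v * dt) % (n * dx)   # foot of the characteristic
        j = int(x // dx) % n               # left neighbour on the old grid
        w = (x - j * dx) / dx              # linear interpolation weight
        out.append((1 - w) * c[j] + w * c[(j + 1) % n])
    return out

c = [1.0 if 4 <= i < 8 else 0.0 for i in range(20)]   # square pulse
for _ in range(5):
    c = moc_step(c, v=1.0, dt=0.1, dx=0.1)
print(round(sum(c), 6))   # total mass of the advected pulse
```

Characteristic tracing is not limited by a CFL condition, which is one reason MMOC-type schemes are attractive for the transport equations in this kind of coupled system.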

  20. Local facet approximation for image stitching

    NASA Astrophysics Data System (ADS)

    Li, Jing; Lai, Shiming; Liu, Yu; Wang, Zhengming; Zhang, Maojun

    2018-01-01

    Image stitching aims at eliminating multiview parallax and generating a seamless panorama given a set of input images. This paper proposes a local adaptive stitching method, which could achieve both accurate and robust image alignments across the whole panorama. A transformation estimation model is introduced by approximating the scene as a combination of neighboring facets. Then, the local adaptive stitching field is constructed using a series of linear systems of the facet parameters, which enables the parallax handling in three-dimensional space. We also provide a concise but effective global projectivity preserving technique that smoothly varies the transformations from local adaptive to global planar. The proposed model is capable of stitching both normal images and fisheye images. The efficiency of our method is quantitatively demonstrated in the comparative experiments on several challenging cases.
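    The "local adaptive to global planar" transition can be sketched as a position-dependent blend of transformations; the toy affine maps and single blending weight below stand in for the paper's facet parameters and stitching field.

```python
# Sketch of smoothly varying a warp from a locally estimated transform
# to a global planar one: a convex combination weighted by how close a
# point is to the region the local estimate came from. The 2x3 affine
# matrices and weights here are invented, not the paper's facet model.

def blend(local_t, global_t, w):
    """Elementwise convex combination of two 2x3 affine matrices."""
    return [[w * l + (1 - w) * g for l, g in zip(lr, gr)]
            for lr, gr in zip(local_t, global_t)]

def apply(t, p):
    x, y = p
    return (t[0][0] * x + t[0][1] * y + t[0][2],
            t[1][0] * x + t[1][1] * y + t[1][2])

local_t = [[1.02, 0.01, 3.0], [0.00, 0.98, -2.0]]   # local alignment
global_t = [[1.00, 0.00, 5.0], [0.00, 1.00, 0.0]]   # global planar map
# near the overlap (w ~ 1) the local map dominates; far away (w ~ 0)
# the warp degrades gracefully to the global map
print(apply(blend(local_t, global_t, 1.0), (10.0, 10.0)))
print(apply(blend(local_t, global_t, 0.0), (10.0, 10.0)))
```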

  1. Evaluation of fuzzy inference systems using fuzzy least squares

    NASA Technical Reports Server (NTRS)

    Barone, Joseph M.

    1992-01-01

    Efforts to develop evaluation methods for fuzzy inference systems which are not based on crisp, quantitative data or processes (i.e., where the phenomenon the system is built to describe or control is inherently fuzzy) are just beginning. This paper suggests that the method of fuzzy least squares can be used to perform such evaluations. Regressing the desired outputs onto the inferred outputs can provide both global and local measures of success. The global measures have some value in an absolute sense, but they are particularly useful when competing solutions (e.g., different numbers of rules, different fuzzy input partitions) are being compared. The local measure described here can be used to identify specific areas of poor fit where special measures (e.g., the use of emphatic or suppressive rules) can be applied. Several examples are discussed which illustrate the applicability of the method as an evaluation tool.
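    A minimal fuzzy least-squares fit in the spirit described here (regressing desired outputs onto inferred outputs) can be sketched for symmetric triangular fuzzy numbers, using a Diamond-style distance that penalizes both center and spread mismatch. The closed form below assumes a positive slope, and the data are invented.

```python
# Sketch of a fuzzy least-squares fit: data are symmetric triangular
# fuzzy numbers (center, spread); the fit minimizes squared differences
# of centers plus squared differences of spreads. Assumes slope > 0.
# The data pairs below are invented for illustration.

def fuzzy_ls(xs, ys):
    """xs, ys: lists of (center, spread) pairs. Returns slope, intercept."""
    n = len(xs)
    mx = sum(c for c, _ in xs) / n
    my = sum(c for c, _ in ys) / n
    sxy = sum((cx - mx) * (cy - my) for (cx, _), (cy, _) in zip(xs, ys))
    sxx = sum((cx - mx) ** 2 for cx, _ in xs)
    sse = sum(sx * sy for (_, sx), (_, sy) in zip(xs, ys))
    sss = sum(sx ** 2 for _, sx in xs)
    a = (sxy + sse) / (sxx + sss)   # slope couples centers and spreads
    b = my - a * mx                  # intercept from centers only
    return a, b

# Desired vs. inferred outputs of a hypothetical fuzzy inference system
inferred = [(1.1, 0.2), (2.0, 0.3), (3.1, 0.3), (4.0, 0.35)]
desired = [(1.0, 0.2), (2.1, 0.25), (2.9, 0.3), (4.2, 0.4)]
a, b = fuzzy_ls(inferred, desired)
print(round(a, 3), round(b, 3))
```

A slope near 1 and an intercept near 0 would indicate a good global fit of inferred to desired outputs, which is the kind of global success measure the abstract describes.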

  2. Desirability-based methods of multiobjective optimization and ranking for global QSAR studies. Filtering safe and potent drug candidates from combinatorial libraries.

    PubMed

    Cruz-Monteagudo, Maykel; Borges, Fernanda; Cordeiro, M Natália D S; Cagide Fajin, J Luis; Morell, Carlos; Ruiz, Reinaldo Molina; Cañizares-Carmenate, Yudith; Dominguez, Elena Rosa

    2008-01-01

    Up to now, very few applications of multiobjective optimization (MOOP) techniques to quantitative structure-activity relationship (QSAR) studies have been reported in the literature, and none of them reports the optimization of objectives related directly to the final pharmaceutical profile of a drug. In this paper, a MOOP method based on Derringer's desirability function that allows conducting global QSAR studies, simultaneously considering the potency, bioavailability, and safety of a set of drug candidates, is introduced. The results of the desirability-based MOOP (the levels of the predictor variables concurrently producing the best possible compromise between the properties determining an optimal drug candidate) are used for the implementation of a ranking method that is also based on the application of desirability functions. This method allows ranking drug candidates with unknown pharmaceutical properties from combinatorial libraries according to the degree of similarity with the previously determined optimal candidate. Application of this method will make it possible to filter the most promising drug candidates of a library (the best-ranked candidates), which should have the best pharmaceutical profile (the best compromise between potency, safety and bioavailability). In addition, a validation method for the ranking process, as well as a quantitative measure of the quality of a ranking, the ranking quality index (Psi), is proposed. The usefulness of the desirability-based methods of MOOP and ranking is demonstrated by their application to a library of 95 fluoroquinolones, reporting their Gram-negative antibacterial activity and mammalian cell cytotoxicity. Finally, the combined use of the desirability-based methods of MOOP and ranking proposed here seems to be a valuable tool for rational drug discovery and development.
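    The desirability machinery can be sketched with one-sided Derringer-type functions and a geometric-mean overall score; the property names, specification limits and candidate values below are hypothetical, not the paper's fluoroquinolone data.

```python
# Sketch of Derringer-type desirability scoring: each property is mapped
# to [0, 1] and candidates are ranked by the geometric mean of their
# individual desirabilities. All limits and values are hypothetical.

def d_larger_is_better(y, low, target, s=1.0):
    """0 below `low`, 1 above `target`, power ramp in between."""
    if y <= low:
        return 0.0
    if y >= target:
        return 1.0
    return ((y - low) / (target - low)) ** s

def overall(ds):
    prod = 1.0
    for d in ds:
        prod *= d
    return prod ** (1.0 / len(ds))   # geometric mean: any zero kills the score

props = {"cpd_A": {"potency": 7.5, "safety": 0.8, "bioavail": 0.6},
         "cpd_B": {"potency": 9.0, "safety": 0.1, "bioavail": 0.9}}
spec = {"potency": (5.0, 9.0), "safety": (0.0, 1.0), "bioavail": (0.2, 0.8)}

def score(p):
    return overall([d_larger_is_better(p[k], *spec[k]) for k in spec])

ranking = sorted(props, key=lambda c: score(props[c]), reverse=True)
print(ranking)   # -> ['cpd_A', 'cpd_B']
```

Note how the geometric mean enforces the compromise: cpd_B is the most potent candidate but its poor safety score drags its overall desirability below cpd_A's.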

  3. Networking among young global health researchers through an intensive training approach: a mixed methods exploratory study

    PubMed Central

    2014-01-01

    Background Networks are increasingly regarded as essential in health research aimed at influencing practice and policies. Less research has focused on the role networking can play in researchers’ careers and its broader impacts on capacity strengthening in health research. We used the Canadian Coalition for Global Health Research (CCGHR) annual Summer Institute for New Global Health Researchers (SIs) as an opportunity to explore networking among new global health researchers. Methods A mixed-methods exploratory study was conducted among SI alumni and facilitators who had participated in at least one SI between 2004 and 2010. Alumni and facilitators completed an online short questionnaire, and a subset participated in an in-depth interview. Thematic analysis of the qualitative data was triangulated with quantitative results and CCGHR reports on SIs. Synthesis occurred through the development of a process model relevant to networking through the SIs. Results Through networking at the SIs, participants experienced decreased isolation and strengthened working relationships. Participants accessed new knowledge, opportunities, and resources through networking during the SI. Post-SI, participants reported ongoing contact and collaboration, although most participants desired more opportunities for interaction. They made suggestions for structural supports to networking among new global health researchers. Conclusions Networking at the SI contributed positively to opportunities for individuals, and contributed to the formation of a network of global health researchers. Intentional inclusion of networking in health research capacity strengthening initiatives, with supportive resources and infrastructure could create dynamic, sustainable networks accessible to global health researchers around the world. PMID:24460819

  4. Seasonal Variability in Global Eddy Diffusion and the Effect on Thermospheric Neutral Density

    NASA Astrophysics Data System (ADS)

    Pilinski, M.; Crowley, G.

    2014-12-01

    We describe a method for making single-satellite estimates of the seasonal variability in global-average eddy diffusion coefficients. Eddy diffusion values as a function of time between January 2004 and January 2008 were estimated from residuals of neutral density measurements made by the CHallenging Minisatellite Payload (CHAMP) and simulations made using the Thermosphere Ionosphere Mesosphere Electrodynamics - Global Circulation Model (TIME-GCM). The eddy diffusion coefficient results are quantitatively consistent with previous estimates based on satellite drag observations and are qualitatively consistent with other measurement methods such as sodium lidar observations and eddy-diffusivity models. The eddy diffusion coefficient values estimated between January 2004 and January 2008 were then used to generate new TIME-GCM results. Based on these results, the RMS difference between the TIME-GCM model and density data from a variety of satellites is reduced by an average of 5%. This result indicates that global thermospheric density modeling can be improved by using data from a single satellite like CHAMP. This approach also demonstrates how eddy diffusion could be estimated in near real-time from satellite observations and used to drive a global circulation model like TIME-GCM. Although the use of global values improves modeled neutral densities, there are some limitations of this method, which are discussed, including that the latitude-dependence of the seasonal neutral-density signal is not completely captured by a global variation of eddy diffusion coefficients. This demonstrates the need for a latitude-dependent specification of eddy diffusion consistent with diffusion observations made by other techniques.

  5. Seasonal variability in global eddy diffusion and the effect on neutral density

    NASA Astrophysics Data System (ADS)

    Pilinski, M. D.; Crowley, G.

    2015-04-01

    We describe a method for making single-satellite estimates of the seasonal variability in global-average eddy diffusion coefficients. Eddy diffusion values as a function of time were estimated from residuals of neutral density measurements made by the Challenging Minisatellite Payload (CHAMP) and simulations made using the thermosphere-ionosphere-mesosphere electrodynamics global circulation model (TIME-GCM). The eddy diffusion coefficient results are quantitatively consistent with previous estimates based on satellite drag observations and are qualitatively consistent with other measurement methods such as sodium lidar observations and eddy diffusivity models. Eddy diffusion coefficient values estimated between January 2004 and January 2008 were then used to generate new TIME-GCM results. Based on these results, the root-mean-square sum for the TIME-GCM model is reduced by an average of 5% when compared to density data from a variety of satellites, indicating that the fidelity of global density modeling can be improved by using data from a single satellite like CHAMP. This approach also demonstrates that eddy diffusion could be estimated in near real-time from satellite observations and used to drive a global circulation model like TIME-GCM. Although the use of global values improves modeled neutral densities, there are limitations to this method, which are discussed, including that the latitude dependence of the seasonal neutral-density signal is not completely captured by a global variation of eddy diffusion coefficients. This demonstrates the need for a latitude-dependent specification of eddy diffusion which is also consistent with diffusion observations made by other techniques.
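    The estimation idea common to both of these eddy diffusion studies can be reduced to a one-parameter least-squares fit: if modeled density responds roughly linearly to the eddy diffusion coefficient near a reference value, the adjustment dK follows directly from the density residuals and the model sensitivities. All numbers below are synthetic.

```python
# Stripped-down sketch of the single-satellite estimation idea: assume
# model density responds roughly linearly to the eddy diffusion
# coefficient near a reference value, and read a least-squares
# adjustment dK off the density residuals. All numbers are synthetic.

def estimate_dk(residuals, sensitivities):
    """residuals r_i = rho_obs_i - rho_model_i; sensitivities s_i =
    d(rho_model_i)/dK. Least-squares dK minimizing sum (r_i - s_i*dK)^2."""
    num = sum(r * s for r, s in zip(residuals, sensitivities))
    den = sum(s * s for s in sensitivities)
    return num / den

# Synthetic check: residuals generated from a known dK = 12.5 plus noise
true_dk = 12.5
sens = [0.8, 1.1, 0.9, 1.3, 1.0]
noise = [0.3, -0.2, 0.1, -0.4, 0.2]
resid = [s * true_dk + n for s, n in zip(sens, noise)]
print(round(estimate_dk(resid, sens), 2))   # -> 12.46, near the true 12.5
```

In the actual studies the "sensitivities" come from TIME-GCM runs and the residuals from CHAMP densities, and the fit is repeated per time interval to recover the seasonal variation.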

  6. The Role of Introductory Geosciences in Students' Quantitative Literacy

    NASA Astrophysics Data System (ADS)

    Wenner, J. M.; Manduca, C.; Baer, E. M.

    2006-12-01

    Quantitative literacy is more than mathematics; it is about reasoning with data. Colleges and universities have begun to recognize the distinction between mathematics and quantitative literacy, modifying curricula to reflect the need for numerate citizens. Although students may view geology as 'rocks for jocks', the geosciences are truthfully rife with data, making introductory geoscience topics excellent context for developing the quantitative literacy of students with diverse backgrounds. In addition, many news items that deal with quantitative skills, such as the global warming phenomenon, have their basis in the Earth sciences and can serve as timely examples of the importance of quantitative literacy for all students in introductory geology classrooms. Participants at a workshop held in 2006, 'Infusing Quantitative Literacy into Introductory Geoscience Courses,' discussed and explored the challenges and opportunities associated with the inclusion of quantitative material and brainstormed about effective practices for imparting quantitative literacy to students with diverse backgrounds. The tangible results of this workshop add to the growing collection of quantitative materials available through the DLESE- and NSF-supported Teaching Quantitative Skills in the Geosciences website, housed at SERC. There, faculty can find a collection of pages devoted to the successful incorporation of quantitative literacy in introductory geoscience. The resources on the website are designed to help faculty to increase their comfort with presenting quantitative ideas to students with diverse mathematical abilities. A methods section on "Teaching Quantitative Literacy" (http://serc.carleton.edu/quantskills/methods/quantlit/index.html) focuses on connecting quantitative concepts with geoscience context and provides tips, trouble-shooting advice and examples of quantitative activities. 
The goal in this section is to provide faculty with material that can be readily incorporated into existing introductory geoscience courses. In addition, participants at the workshop (http://serc.carleton.edu/quantskills/workshop06/index.html) submitted and modified more than 20 activities and model courses (with syllabi) designed to use best practices for helping introductory geoscience students to become quantitatively literate. We present insights from the workshop and other sources for a framework that can aid in increasing quantitative literacy of students from a variety of backgrounds in the introductory geoscience classroom.

  7. Prediction of Coronal Mass Ejections from Vector Magnetograms: Quantitative Measures as Predictors

    NASA Astrophysics Data System (ADS)

    Falconer, D. A.; Moore, R. L.; Gary, G. A.

    2001-05-01

    In a pilot study of 4 active regions (Falconer, D.A. 2001, JGR, in press), we derived two quantitative measures of an active region's global nonpotentiality from the region's vector magnetogram, 1) the net current (IN), and 2) the length of the strong-shear, strong-field main neutral line (LSS), and used these two measures to gauge the CME productivity of the active regions. We compared the global nonpotentiality measures to the active regions' CME productivity determined from GOES and Yohkoh/SXT observations. We found that two of the active regions were highly globally nonpotential and were CME productive, while the other two active regions had little global nonpotentiality and produced no CMEs. At the Fall 2000 AGU (Falconer, Moore, & Gary, 2000, EOS 81, 48 F998), we reported on an expanded study (12 active regions and 17 magnetograms) in which we evaluated four quantitative global measures of an active region's magnetic field and compared these measures with the CME productivity. The four global measures (all derived from MSFC vector magnetograms) included our two previous measures (IN and LSS) as well as two new ones, the total magnetic flux (Φ ) (a measure of an active region's size), and the normalized twist (α =μ IN/Φ ). We found that the three measures of global nonpotentiality (IN, LSS, α ) were all well correlated (>99% confidence level) with an active region's CME productivity within ±2 days of the day of the magnetogram. We will now report on our findings of how good our quantitative measures are as predictors of active-region CME productivity, using only CMEs that occurred after the magnetogram. We report the preliminary skill test of these quantitative measures as predictors. We compare the CME prediction success of our quantitative measures to the CME prediction success based on an active region's past CME productivity. We examine the cases of the handful of false positives and false negatives to look for improvements to our predictors. 
This work is funded by NSF through the Space Weather Program and by NASA through the Solar Physics Supporting Research and Technology Program.
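The skill test mentioned above can be illustrated with a standard forecast-verification statistic. The sketch below computes the Heidke skill score from a 2x2 contingency table of predicted versus observed CME productivity; the counts are invented for illustration and are not the study's data.

```python
# Hypothetical sketch: scoring a binary CME-productivity predictor with the
# Heidke skill score (HSS). The example counts are illustrative only.

def heidke_skill_score(hits, misses, false_alarms, correct_nulls):
    """HSS = (observed correct - expected correct) / (total - expected correct)."""
    n = hits + misses + false_alarms + correct_nulls
    correct = hits + correct_nulls
    # Number correct expected by chance, from the marginal totals.
    expected = ((hits + misses) * (hits + false_alarms)
                + (correct_nulls + misses) * (correct_nulls + false_alarms)) / n
    return (correct - expected) / (n - expected)

# A perfect forecast scores 1; a random forecast scores about 0.
score = heidke_skill_score(hits=8, misses=2, false_alarms=3, correct_nulls=17)
print(f"HSS = {score:.3f}")
```

A skill score of this kind makes the comparison against a climatology-style baseline (e.g., predicting from past CME productivity alone) explicit, since both predictors can be scored on the same contingency-table footing.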

  8. Automatic 3D liver segmentation based on deep learning and globally optimized surface evolution

    NASA Astrophysics Data System (ADS)

    Hu, Peijun; Wu, Fa; Peng, Jialin; Liang, Ping; Kong, Dexing

    2016-12-01

    The detection and delineation of the liver from abdominal 3D computed tomography (CT) images are fundamental tasks in computer-assisted liver surgery planning. However, automatic and accurate segmentation, especially liver detection, remains challenging due to complex backgrounds, ambiguous boundaries, heterogeneous appearances and highly varied shapes of the liver. To address these difficulties, we propose an automatic segmentation framework based on a 3D convolutional neural network (CNN) and globally optimized surface evolution. First, a deep 3D CNN is trained to learn a subject-specific probability map of the liver, which gives the initial surface and acts as a shape prior in the following segmentation step. Then, both global and local appearance information from the prior segmentation are adaptively incorporated into a segmentation model, which is globally optimized via surface evolution. The proposed method has been validated on 42 CT images from the public Sliver07 database and local hospitals. On the Sliver07 online testing set, the proposed method achieves an overall score of 80.3 ± 4.5, yielding a mean Dice similarity coefficient of 97.25 ± 0.65%, and an average symmetric surface distance of 0.84 ± 0.25 mm. The quantitative validations and comparisons show that the proposed method is accurate and effective for clinical application.
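The Dice similarity coefficient reported above is a simple overlap metric; the sketch below computes it on tiny synthetic masks, which stand in for the Sliver07 volumes.

```python
import numpy as np

# Minimal sketch of the Dice similarity coefficient (DSC) used to evaluate
# segmentations. The masks are small synthetic stand-ins, not CT data.

def dice(seg, ref):
    """DSC = 2|A ∩ B| / (|A| + |B|) for boolean masks."""
    seg, ref = np.asarray(seg, bool), np.asarray(ref, bool)
    denom = seg.sum() + ref.sum()
    return 2.0 * np.logical_and(seg, ref).sum() / denom if denom else 1.0

seg = np.zeros((8, 8), bool); seg[2:6, 2:6] = True   # predicted mask (16 voxels)
ref = np.zeros((8, 8), bool); ref[3:7, 2:6] = True   # reference mask (16 voxels)
print(round(dice(seg, ref), 3))  # 12 shared voxels -> 2*12/(16+16) = 0.75
```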

  9. Towards machine ecoregionalization of Earth's landmass using pattern segmentation method

    NASA Astrophysics Data System (ADS)

    Nowosad, Jakub; Stepinski, Tomasz F.

    2018-07-01

    We present and evaluate a quantitative method for delineation of ecophysiographic regions throughout the entire terrestrial landmass. The method uses the new pattern-based segmentation technique which attempts to emulate the qualitative, weight-of-evidence approach to a delineation of ecoregions in a computer code. An ecophysiographic region is characterized by homogeneous physiography defined by the cohesiveness of patterns of four variables: land cover, soils, landforms, and climatic patterns. Homogeneous physiography is a necessary but not sufficient condition for a region to be an ecoregion, thus machine delineation of ecophysiographic regions is the first, important step toward global ecoregionalization. In this paper, we focus on the first-order approximation of the proposed method - delineation on the basis of the patterns of the land cover alone. We justify this approximation by the existence of significant spatial associations between various physiographic variables. The resulting ecophysiographic regionalization (ECOR) is shown to be more physiographically homogeneous than existing global ecoregionalizations (Terrestrial Ecoregions of the World (TEW) and Bailey's Ecoregions of the Continents (BEC)). The presented quantitative method has the advantage of being transparent and objective. It can be verified, easily updated, modified and customized for specific applications. Each region in ECOR contains detailed, SQL-searchable information about physiographic patterns within it. It also has a computer-generated label. To give a sense of how ECOR compares to TEW and, in the U.S., to EPA Level III ecoregions, we contrast these different delineations using two specific sites as examples. We conclude that ECOR yields a regionalization somewhat similar to EPA Level III ecoregions, but for the entire world, and by automatic means.

  10. Evaluating the impact of climate change on landslide occurrence, hazard, and risk: from global to regional scale.

    NASA Astrophysics Data System (ADS)

    Gariano, Stefano Luigi; Guzzetti, Fausto

    2017-04-01

    According to the fifth report of the Intergovernmental Panel on Climate Change, "warming of the climate system is unequivocal". The influence of climate changes on slope stability and landslides is also undisputable. Nevertheless, the quantitative evaluation of the impact of global warming, and the related changes in climate, on landslides remains a complex question to be solved. The evidence that climate and landslides act at only partially overlapping spatial and temporal scales complicates the evaluation. Different research fields, including, e.g., climatology, physics, hydrology, geology, hydrogeology, geotechnics, soil science, environmental science, and social science, must be considered. Climatic, environmental, demographic, and economic changes are closely linked, through complex feedbacks, to landslide occurrence and variation. Thus, a holistic, multidisciplinary approach is necessary. We reviewed the literature on landslide-climate studies, and found a bias in their geographical distribution, with most studies centered in Europe and North America, and large parts of the world not investigated. We examined advantages and drawbacks of the approaches adopted to evaluate the effects of climate variations on landslides, including prospective modelling and retrospective methods that use landslide and climate records, and paleo-environmental information. We found that the results of landslide-climate studies depend more on the emission scenarios, the global circulation models, the regional climate models, and the methods to downscale the climate variables, than on the description of the variables controlling slope processes. Using ensembles of projections based on a range of emissions scenarios would reduce (or at least quantify) the uncertainties in the obtained results. 
    We performed a preliminary global assessment of the future landslide impact, presenting a global distribution of the projected impact of climate change on landslide activity and abundance. Where global warming is expected to increase the frequency and intensity of severe rainfall events, a primary trigger of shallow, rapid-moving landslides that cause many landslide fatalities, an increase in the number of people exposed to landslide risk is to be expected. Furthermore, we defined a group of objective and reproducible methods for the quantitative evaluation of the past and future (expected) variations in landslide occurrence and distribution, and in the impact and risk to the population, as a result of changes in climatic and environmental factors (particularly, land use changes), at regional scale. The methods were tested in a southern Italian region, but they can easily be applied in other physiographic and climatic regions, where adequate information is available.

  11. Global Relative Quantification with Liquid Chromatography–Matrix-assisted Laser Desorption Ionization Time-of-flight (LC-MALDI-TOF)—Cross–validation with LTQ-Orbitrap Proves Reliability and Reveals Complementary Ionization Preferences*

    PubMed Central

    Hessling, Bernd; Büttner, Knut; Hecker, Michael; Becher, Dörte

    2013-01-01

    Quantitative LC-MALDI is an underrepresented method, especially in large-scale experiments. The additional fractionation step that is needed for most MALDI-TOF-TOF instruments, the comparatively long analysis time, and the very limited number of established software tools for the data analysis render LC-MALDI a niche application for large quantitative analyses beside the widespread LC–electrospray ionization workflows. Here, we used LC-MALDI in a relative quantification analysis of Staphylococcus aureus for the first time on a proteome-wide scale. Samples were analyzed in parallel with an LTQ-Orbitrap, which allowed cross-validation with a well-established workflow. With nearly 850 proteins identified in the cytosolic fraction and quantitative data for more than 550 proteins obtained with the MASCOT Distiller software, we were able to prove that LC-MALDI is able to process highly complex samples. The good correlation of quantities determined via this method and the LTQ-Orbitrap workflow confirmed the high reliability of our LC-MALDI approach for global quantification analysis. Because the existing literature reports differences for MALDI and electrospray ionization preferences and the respective experimental work was limited by technical or methodological constraints, we systematically compared biochemical attributes of peptides identified with either instrument. This genome-wide, comprehensive study revealed biases toward certain peptide properties for both MALDI-TOF-TOF- and LTQ-Orbitrap-based approaches. These biases are based on almost 13,000 peptides and result in a general complementarity of the two approaches that should be exploited in future experiments. PMID:23788530
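The cross-validation step described above amounts to correlating the relative quantities measured by the two platforms. The sketch below does this with synthetic log2 ratios, which stand in for the S. aureus measurements from MASCOT Distiller.

```python
import numpy as np

# Hedged sketch: compare relative protein quantities from two instruments by
# correlating their log2 ratios. All values are synthetic, not study data.

rng = np.random.default_rng(0)
true_log2 = rng.normal(0.0, 1.0, 500)             # "true" fold changes, 500 proteins
maldi = true_log2 + rng.normal(0.0, 0.3, 500)     # LC-MALDI estimate + noise
orbitrap = true_log2 + rng.normal(0.0, 0.3, 500)  # LTQ-Orbitrap estimate + noise

r = np.corrcoef(maldi, orbitrap)[0, 1]            # Pearson correlation of ratios
print(f"Pearson r = {r:.2f}")
```

A high correlation between platforms supports reliability of either workflow, while the residual scatter is where complementary ionization preferences would show up.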

  12. Global relative quantification with liquid chromatography-matrix-assisted laser desorption ionization time-of-flight (LC-MALDI-TOF)--cross-validation with LTQ-Orbitrap proves reliability and reveals complementary ionization preferences.

    PubMed

    Hessling, Bernd; Büttner, Knut; Hecker, Michael; Becher, Dörte

    2013-10-01

    Quantitative LC-MALDI is an underrepresented method, especially in large-scale experiments. The additional fractionation step that is needed for most MALDI-TOF-TOF instruments, the comparatively long analysis time, and the very limited number of established software tools for the data analysis render LC-MALDI a niche application for large quantitative analyses beside the widespread LC-electrospray ionization workflows. Here, we used LC-MALDI in a relative quantification analysis of Staphylococcus aureus for the first time on a proteome-wide scale. Samples were analyzed in parallel with an LTQ-Orbitrap, which allowed cross-validation with a well-established workflow. With nearly 850 proteins identified in the cytosolic fraction and quantitative data for more than 550 proteins obtained with the MASCOT Distiller software, we were able to prove that LC-MALDI is able to process highly complex samples. The good correlation of quantities determined via this method and the LTQ-Orbitrap workflow confirmed the high reliability of our LC-MALDI approach for global quantification analysis. Because the existing literature reports differences for MALDI and electrospray ionization preferences and the respective experimental work was limited by technical or methodological constraints, we systematically compared biochemical attributes of peptides identified with either instrument. This genome-wide, comprehensive study revealed biases toward certain peptide properties for both MALDI-TOF-TOF- and LTQ-Orbitrap-based approaches. These biases are based on almost 13,000 peptides and result in a general complementarity of the two approaches that should be exploited in future experiments.

  13. The 2006 William Feinberg lecture: shifting the paradigm from stroke to global vascular risk estimation.

    PubMed

    Sacco, Ralph L

    2007-06-01

    By the year 2010, it is estimated that 18.1 million people worldwide will die annually because of cardiovascular diseases and stroke. "Global vascular risk" more broadly includes the multiple overlapping disease silos of stroke, myocardial infarction, peripheral arterial disease, and vascular death. Estimation of global vascular risk requires consideration of a variety of variables including demographics, environmental behaviors, and risk factors. Data from multiple studies suggest continuous linear relationships between the physiological vascular risk modulators of blood pressure, lipids, and blood glucose rather than treating these conditions as categorical risk factors. Constellations of risk factors may be more relevant than individual categorical components. Exciting work with novel risk factors may also have predictive value in estimates of global vascular risk. Advances in imaging have led to the measurement of subclinical conditions such as carotid intima-media thickness and subclinical brain conditions such as white matter hyperintensities and silent infarcts. These subclinical measurements may be intermediate stages in the transition from asymptomatic to symptomatic vascular events, appear to be associated with the fundamental vascular risk factors, and represent opportunities to more precisely quantitate disease progression. The expansion of studies in molecular epidemiology and detection of genetic markers underlying vascular risks also promises to extend our precision of global vascular risk estimation. Global vascular risk estimation will require quantitative methods that bundle these multi-dimensional data into more precise estimates of future risk. The power of genetic information coupled with data on demographics, risk-inducing behaviors, vascular risk modulators, biomarkers, and measures of subclinical conditions should provide the most realistic approximation of an individual's future global vascular risk. 
The ultimate public health benefit, however, will depend on not only identification of global vascular risk but also the realization that we can modify this risk and prove the prediction models wrong.
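Bundling multi-dimensional risk data into a single estimate is commonly done with a logistic model. The sketch below is purely illustrative: the variables and coefficients are hypothetical and do not represent any validated global vascular risk score.

```python
import math

# Illustrative only: a logistic model that "bundles" several risk modulators
# into one probability. All coefficients are hypothetical, not a real score.

def global_vascular_risk(age, systolic_bp, ldl, smoker):
    # Hypothetical log-odds: intercept plus weighted continuous risk modulators,
    # treating blood pressure and lipids as continuous rather than categorical.
    z = (-9.0 + 0.07 * age + 0.02 * systolic_bp
         + 0.01 * ldl + 0.7 * (1 if smoker else 0))
    return 1.0 / (1.0 + math.exp(-z))  # probability in (0, 1)

low = global_vascular_risk(45, 115, 100, False)
high = global_vascular_risk(70, 160, 160, True)
assert 0 < low < high < 1  # risk rises monotonically with each factor
print(round(low, 3), round(high, 3))
```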

  14. Decadal Changes in Global Ocean Annual Primary Production

    NASA Technical Reports Server (NTRS)

    Gregg, Watson; Conkright, Margarita E.; Behrenfeld, Michael J.; Ginoux, Paul; Casey, Nancy W.; Koblinsky, Chester J. (Technical Monitor)

    2002-01-01

    The Sea-viewing Wide Field-of-View Sensor (SeaWiFS) has produced the first multi-year time series of global ocean chlorophyll observations since the demise of the Coastal Zone Color Scanner (CZCS) in 1986. Global observations from 1997-present from SeaWiFS combined with observations from 1979-1986 from the CZCS should in principle provide an opportunity to observe decadal changes in global ocean annual primary production, since chlorophyll is the primary driver for estimates of primary production. However, incompatibilities between algorithms have so far precluded quantitative analysis. We have developed and applied compatible processing methods for the CZCS, using modern advances in atmospheric correction and consistent bio-optical algorithms to advance the CZCS archive to comparable quality with SeaWiFS. We applied blending methodologies, where in situ data observations are incorporated into the CZCS and SeaWiFS data records, to improve the residuals. These re-analyzed, blended data records provide maximum compatibility and permit, for the first time, a quantitative analysis of the changes in global ocean primary production in the early-to-mid 1980s and the present, using synoptic satellite observations. An intercomparison of the global and regional primary production from these blended satellite observations is important to understand global climate change and the effects on ocean biota. Photosynthesis by chlorophyll-containing phytoplankton is responsible for biotic uptake of carbon in the oceans and potentially ultimately from the atmosphere. Global ocean annual primary production decreased from the CZCS record to SeaWiFS, by nearly 6% from the early 1980s to the present. Annual primary production in the high latitudes was responsible for most of the decadal change. Conversely, primary production in the low latitudes generally increased, with the exception of the tropical Pacific. 
The differences and similarities of the two data records provide evidence of how the Earth's climate may be changing and how ocean biota respond. Furthermore, the results have implications for the ocean carbon cycle.

  15. Estimating Fallout Building Attributes from Architectural Features and Global Earthquake Model (GEM) Building Descriptions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dillon, Michael B.; Kane, Staci R.

    A nuclear explosion has the potential to injure or kill tens to hundreds of thousands (or more) of people through exposure to fallout (external gamma) radiation. Existing buildings can protect their occupants (reducing fallout radiation exposures) by placing material and distance between fallout particles and individuals indoors. Prior efforts have determined an initial set of building attributes suitable to reasonably assess a given building’s protection against fallout radiation. The current work provides methods to determine the quantitative values for these attributes from (a) common architectural features and data and (b) buildings described using the Global Earthquake Model (GEM) taxonomy. These methods will be used to improve estimates of fallout protection for operational US Department of Defense (DoD) and US Department of Energy (DOE) consequence assessment models.

  16. Quantitative petri net model of gene regulated metabolic networks in the cell.

    PubMed

    Chen, Ming; Hofestädt, Ralf

    2011-01-01

    A method to exploit hybrid Petri nets (HPN) for quantitatively modeling and simulating gene regulated metabolic networks is demonstrated. A global kinetic modeling strategy and a Petri net modeling algorithm are applied to simulate bioprocess functioning and to analyze the model. With the model, the interrelations between pathway analysis and the metabolic control mechanism are outlined. The dynamics of the metabolites are simulated and visualized diagrammatically using a HPN tool, Visual Object Net ++. An explanation of the observed behavior of the urea cycle is proposed to indicate possibilities for metabolic engineering and medical care. Finally, the perspective of Petri nets for modeling and simulation of metabolic networks is discussed.
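The hybrid Petri net idea can be sketched in a few lines: continuous places hold metabolite amounts and transitions fire at kinetic rates, with an enzyme level as a crude stand-in for gene regulation. The reaction chain, rate constants, and gating below are invented for illustration; they are not the paper's urea-cycle model or the Visual Object Net ++ tool.

```python
# Toy sketch of a hybrid-Petri-net style simulation: continuous places hold
# metabolite amounts; transitions move mass between them at kinetic rates.
# The enzyme factor gating transition 1 mimics gene regulation. Hypothetical
# rates and species, integrated with simple Euler steps.

def simulate(steps=1000, dt=0.01, k1=1.0, k2=0.5, enzyme=1.0):
    A, B, C = 10.0, 0.0, 0.0                 # places: substrate, intermediate, product
    for _ in range(steps):
        v1 = k1 * A * enzyme                 # transition 1: A -> B, enzyme-gated
        v2 = k2 * B                          # transition 2: B -> C
        A, B, C = A - v1 * dt, B + (v1 - v2) * dt, C + v2 * dt
    return A, B, C

A, B, C = simulate()
print(f"A={A:.3f} B={B:.3f} C={C:.3f}")  # total mass A+B+C stays at 10
```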

  17. Quantitative Analysis of Critical Factors for the Climate Impact of Landfill Mining.

    PubMed

    Laner, David; Cencic, Oliver; Svensson, Niclas; Krook, Joakim

    2016-07-05

    Landfill mining has been proposed as an innovative strategy to mitigate environmental risks associated with landfills, to recover secondary raw materials and energy from the deposited waste, and to enable high-valued land uses at the site. The present study quantitatively assesses the importance of specific factors and conditions for the net contribution of landfill mining to global warming using a novel, set-based modeling approach and provides policy recommendations for facilitating the development of projects contributing to global warming mitigation. Building on life-cycle assessment, scenario modeling and sensitivity analysis methods are used to identify critical factors for the climate impact of landfill mining. The net contributions to global warming of the scenarios range from -1550 (saving) to 640 (burden) kg CO2e per Mg of excavated waste. Nearly 90% of the results' total variation can be explained by changes in four factors, namely the landfill gas management in the reference case (i.e., alternative to mining the landfill), the background energy system, the composition of the excavated waste, and the applied waste-to-energy technology. Based on the analyses, circumstances under which landfill mining should be prioritized or not are identified and sensitive parameters for the climate impact assessment of landfill mining are highlighted.
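The set-based idea above can be sketched as a Monte Carlo over discrete scenario factors, ranking each factor by the share of output variance its levels explain. The response function and effect sizes below are invented for illustration; they are not the study's life-cycle model.

```python
import numpy as np

# Hedged sketch: sample the four scenario factors named above, evaluate a toy
# net-contribution function, and compute first-order variance shares.

rng = np.random.default_rng(1)
n = 20000
factors = {
    "gas_mgmt_ref": rng.integers(0, 3, n),   # landfill-gas management in the reference case
    "energy_system": rng.integers(0, 3, n),  # background energy system
    "waste_comp": rng.integers(0, 3, n),     # composition of the excavated waste
    "wte_tech": rng.integers(0, 3, n),       # waste-to-energy technology
}
# Toy net contribution to global warming (kg CO2e per Mg excavated waste).
y = (-400 * factors["gas_mgmt_ref"] + 250 * factors["energy_system"]
     - 150 * factors["waste_comp"] + 60 * factors["wte_tech"]
     + rng.normal(0, 50, n))

shares = {}
for name, x in factors.items():
    cond_means = np.array([y[x == lvl].mean() for lvl in range(3)])
    shares[name] = np.var(cond_means[x]) / y.var()  # Var(E[y|x]) / Var(y)
    print(f"{name:14s} explains {shares[name]:5.1%} of output variance")
```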

  18. An exploratory sequential design to validate measures of moral emotions.

    PubMed

    Márquez, Margarita G; Delgado, Ana R

    2017-05-01

    This paper presents an exploratory and sequential mixed methods approach to validating measures of knowledge of the moral emotions of contempt, anger and disgust. The sample comprised 60 participants in the qualitative phase, when a measurement instrument was designed. Item stems, response options and correction keys were planned following the results obtained in a descriptive phenomenological analysis of the interviews. In the quantitative phase, the scale was used with a sample of 102 Spanish participants, and the results were analysed with the Rasch model. In the qualitative phase, salient themes included reasons, objects and action tendencies. In the quantitative phase, good psychometric properties were obtained. The model fit was adequate. However, some changes had to be made to the scale in order to improve the proportion of variance explained. Substantive and methodological implications of this mixed-methods study are discussed. Had the study used a single research method in isolation, aspects of the global understanding of contempt, anger and disgust would have been lost.
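The Rasch model used in the quantitative phase has a simple closed form: the probability of endorsing an item depends only on the difference between person ability and item difficulty. A minimal sketch, with illustrative values:

```python
import math

# Minimal sketch of the dichotomous Rasch model: P(correct) depends only on
# person ability theta minus item difficulty b. Values are illustrative.

def rasch_p(theta, b):
    """P(correct) = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

print(rasch_p(0.0, 0.0))   # ability equals difficulty -> 0.5
print(rasch_p(1.5, 0.0))   # a more able person -> higher probability
```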

  19. Networking among young global health researchers through an intensive training approach: a mixed methods exploratory study.

    PubMed

    Lenters, Lindsey M; Cole, Donald C; Godoy-Ruiz, Paula

    2014-01-25

    Networks are increasingly regarded as essential in health research aimed at influencing practice and policies. Less research has focused on the role networking can play in researchers' careers and its broader impacts on capacity strengthening in health research. We used the Canadian Coalition for Global Health Research (CCGHR) annual Summer Institute for New Global Health Researchers (SIs) as an opportunity to explore networking among new global health researchers. A mixed-methods exploratory study was conducted among SI alumni and facilitators who had participated in at least one SI between 2004 and 2010. Alumni and facilitators completed an online short questionnaire, and a subset participated in an in-depth interview. Thematic analysis of the qualitative data was triangulated with quantitative results and CCGHR reports on SIs. Synthesis occurred through the development of a process model relevant to networking through the SIs. Through networking at the SIs, participants experienced decreased isolation and strengthened working relationships. Participants accessed new knowledge, opportunities, and resources through networking during the SI. Post-SI, participants reported ongoing contact and collaboration, although most participants desired more opportunities for interaction. They made suggestions for structural supports to networking among new global health researchers. Networking at the SI contributed positively to opportunities for individuals, and contributed to the formation of a network of global health researchers. Intentional inclusion of networking in health research capacity strengthening initiatives, with supportive resources and infrastructure could create dynamic, sustainable networks accessible to global health researchers around the world.

  20. Quantitative Global Heat Transfer in a Mach-6 Quiet Tunnel

    NASA Technical Reports Server (NTRS)

    Sullivan, John P.; Schneider, Steven P.; Liu, Tianshu; Rubal, Justin; Ward, Chris; Dussling, Joseph; Rice, Cody; Foley, Ryan; Cai, Zeimin; Wang, Bo

    2012-01-01

    This project developed quantitative methods for obtaining heat transfer from temperature sensitive paint (TSP) measurements in the Mach-6 quiet tunnel at Purdue, which is a Ludwieg tube with a downstream valve, moderately-short flow duration and low levels of heat transfer. Previous difficulties with inferring heat transfer from TSP in the Mach-6 quiet tunnel were traced to (1) the large transient heat transfer that occurs during the unusually long tunnel startup and shutdown, (2) the non-uniform thickness of the insulating coating, (3) inconsistencies and imperfections in the painting process and (4) the low levels of heat transfer observed on slender models at typical stagnation temperatures near 430K. Repeated measurements were conducted on 7 degree-half-angle sharp circular cones at zero angle of attack in order to evaluate the techniques, isolate the problems and identify solutions. An attempt at developing a two-color TSP method is also summarized.
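One standard way to infer surface heat flux from a measured surface-temperature history (such as a TSP trace) is the Cook-Felderman relation for one-dimensional conduction into a semi-infinite substrate; the abstract does not state that this exact reduction was used, so the sketch below is a generic illustration with invented material properties and a synthetic trace.

```python
import math

# Hedged sketch: Cook-Felderman data reduction, q(t_n) from a discrete surface
# temperature history, assuming 1-D conduction into a semi-infinite substrate.
# rho_c_k (= rho * c * k) and the temperature trace are illustrative.

def cook_felderman(times, temps, rho_c_k):
    coeff = 2.0 * math.sqrt(rho_c_k) / math.sqrt(math.pi)
    t_n = times[-1]
    total = 0.0
    for i in range(1, len(times)):
        total += (temps[i] - temps[i - 1]) / (
            math.sqrt(t_n - times[i]) + math.sqrt(t_n - times[i - 1]))
    return coeff * total

# Synthetic trace: T - T0 = a*sqrt(t) is the response to a constant heat flux,
# so the recovered q should be roughly a*sqrt(pi*rho_c_k)/2.
a, rck = 2.0, 1.5e6                         # illustrative values
times = [i * 1e-3 for i in range(501)]      # 0 to 0.5 s
temps = [a * math.sqrt(t) for t in times]
q = cook_felderman(times, temps, rck)
print(f"q = {q:.0f} W/m^2")
```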

  1. Boiling points of halogenated aliphatic compounds: a quantitative structure-property relationship for prediction and validation.

    PubMed

    Oberg, Tomas

    2004-01-01

    Halogenated aliphatic compounds have many technical uses, but substances within this group are also ubiquitous environmental pollutants that can affect the ozone layer and contribute to global warming. The establishment of quantitative structure-property relationships is of interest not only to fill in gaps in the available database but also to validate experimental data already acquired. The three-dimensional structures of 240 compounds were modeled with molecular mechanics prior to the generation of empirical descriptors. Two bilinear projection methods, principal component analysis (PCA) and partial-least-squares regression (PLSR), were used to identify outliers. PLSR was subsequently used to build a multivariate calibration model by extracting the latent variables that describe most of the covariation between the molecular structure and the boiling point. Boiling points were also estimated with an extension of the group contribution method of Stein and Brown.
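The multivariate calibration step can be sketched with a bare-bones PLS1 (NIPALS-style) regression: latent variables are extracted to capture the covariance between the descriptor matrix and the boiling point. The data below are synthetic stand-ins for the 240 compounds and their empirical descriptors.

```python
import numpy as np

# Minimal PLS1 regression (NIPALS-style) sketch. Synthetic descriptors and
# responses stand in for the molecular descriptors and boiling points.

def pls1_fit(X, y, n_components):
    X, y = X.copy(), y.copy()
    W, P, q = [], [], []
    for _ in range(n_components):
        w = X.T @ y; w /= np.linalg.norm(w)   # weight: direction of X-y covariance
        t = X @ w                             # scores
        p = X.T @ t / (t @ t)                 # X loadings
        qk = y @ t / (t @ t)                  # y loading
        X -= np.outer(t, p); y = y - qk * t   # deflate X and y
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    return W @ np.linalg.inv(P.T @ W) @ q     # regression coefficient vector

rng = np.random.default_rng(2)
X = rng.normal(size=(240, 10))                # 240 compounds, 10 descriptors
beta = np.array([5.0, -3.0, 2.0] + [0.0] * 7)
y = X @ beta + rng.normal(0, 0.1, 240)        # centered boiling points + noise
B = pls1_fit(X, y, n_components=3)
r2 = 1 - np.sum((y - X @ B) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"R^2 = {r2:.3f}")
```

Inspecting the scores `t` from each extracted component is also the natural place to spot outliers, which is how PCA and PLSR are used for that purpose in the abstract.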

  2. Seismic waveform inversion best practices: regional, global and exploration test cases

    NASA Astrophysics Data System (ADS)

    Modrak, Ryan; Tromp, Jeroen

    2016-09-01

    Reaching the global minimum of a waveform misfit function requires careful choices about the nonlinear optimization, preconditioning and regularization methods underlying an inversion. Because waveform inversion problems are susceptible to erratic convergence associated with strong nonlinearity, one or two test cases are not enough to reliably inform such decisions. We identify best practices, instead, using four seismic near-surface problems, one regional problem and two global problems. To make meaningful quantitative comparisons between methods, we carry out hundreds of inversions, varying one aspect of the implementation at a time. Comparing nonlinear optimization algorithms, we find that limited-memory BFGS provides computational savings over nonlinear conjugate gradient methods in a wide range of test cases. Comparing preconditioners, we show that a new diagonal scaling derived from the adjoint of the forward operator provides better performance than two conventional preconditioning schemes. Comparing regularization strategies, we find that projection, convolution, Tikhonov regularization and total variation regularization are effective in different contexts. Besides questions of one strategy or another, reliability and efficiency in waveform inversion depend on close numerical attention and care. Implementation details involving the line search and restart conditions have a strong effect on computational cost, regardless of the chosen nonlinear optimization algorithm.
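The value of diagonal preconditioning can be illustrated on a toy problem: steepest descent on an ill-conditioned quadratic misfit converges far faster when the gradient is scaled by the inverse diagonal of the Hessian. This is only a schematic analogue of the adjoint-derived diagonal scaling discussed above, not the paper's operator.

```python
import numpy as np

# Toy illustration: diagonal preconditioning on an ill-conditioned quadratic
# misfit f(x) = 0.5 x^T H x - g0^T x. Values are illustrative.

def descend(H, g0, precond, iters=200, step=0.9):
    x = np.zeros(len(g0))
    for _ in range(iters):
        grad = H @ x - g0
        x -= step * (precond * grad)       # elementwise diagonal scaling
    return 0.5 * x @ H @ x - g0 @ x        # final misfit value

H = np.diag([1.0, 10.0, 100.0, 1000.0])    # condition number 1000
g0 = np.ones(4)
plain = descend(H, g0, precond=np.ones(4) / 1000.0)   # stable unscaled step
scaled = descend(H, g0, precond=1.0 / np.diag(H))     # diagonal preconditioner
print(f"misfit: plain={plain:.4f} preconditioned={scaled:.4f}")
```

After the same number of iterations the preconditioned run reaches the minimum (-0.5555 for this problem) while the plain run is still far away, which is the kind of gap that only shows up reliably across many test cases.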

  3. Advances in methods for detection of anaerobic ammonium oxidizing (anammox) bacteria.

    PubMed

    Li, Meng; Gu, Ji-Dong

    2011-05-01

    Anaerobic ammonium oxidation (anammox), the biochemical process oxidizing ammonium into dinitrogen gas using nitrite as an electron acceptor, was recognized for its significant role in the global nitrogen cycle only recently, and its ubiquitous distribution in a wide range of environments has changed our knowledge about the contributors to the global nitrogen cycle. Currently, several groups of methods are used to detect anammox bacteria based on their physiological and biochemical characteristics, their cellular chemical composition, and both the 16S rRNA gene and selective functional genes as biomarkers, including the hydrazine oxidoreductase and nitrite reductase encoding genes hzo and nirS, respectively. Results from these methods, coupled with advances in quantitative PCR, reverse transcription of mRNA and stable isotope labeling, have improved our understanding of the distribution, diversity, and activity of anammox bacteria in different environments, both natural and engineered. In this review, we summarize the methods used to detect anammox bacteria in various environments, highlight the strengths and weaknesses of these methods, and discuss potential future developments of existing and new techniques.
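The quantitative PCR mentioned above typically quantifies gene copies via a standard curve: the threshold cycle (Ct) is linear in log10 copy number, so unknowns are read off a fitted line. The numbers below are synthetic, with roughly ideal amplification efficiency (about -3.3 cycles per decade).

```python
import numpy as np

# Sketch of qPCR standard-curve quantification. Standards and Ct values are
# synthetic; a gene such as hzo or nirS would be targeted in practice.

log10_copies = np.array([3.0, 4.0, 5.0, 6.0, 7.0])   # standard dilution series
ct = np.array([33.1, 29.8, 26.5, 23.2, 19.9])        # measured threshold cycles
slope, intercept = np.polyfit(log10_copies, ct, 1)   # Ct = slope*log10(N) + b

ct_unknown = 25.0
est = 10 ** ((ct_unknown - intercept) / slope)       # estimated gene copies
print(f"slope={slope:.2f} cycles/decade, estimated copies ~ {est:.3g}")
```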

  4. Multiscale moment-based technique for object matching and recognition

    NASA Astrophysics Data System (ADS)

    Thio, HweeLi; Chen, Liya; Teoh, Eam-Khwang

    2000-03-01

    A new method is proposed to extract features from an object for matching and recognition. The proposed features are a combination of local and global characteristics: local characteristics from the 1-D signature function that is defined at each pixel on the object boundary, and global characteristics from the moments that are generated from the signature function. The boundary of the object is first extracted; the signature function is then generated by computing the angle between two lines from every point on the boundary as a function of position along the boundary. This signature function is position, scale and rotation invariant (PSRI). The shape of the signature function is then described quantitatively using moments. The moments of the signature function are thus global characteristics of a local feature set. Using moments as the eventual features instead of the signature function itself reduces the time and complexity of an object matching application. Multiscale moments are implemented to produce several sets of moments for more accurate matching. The multiscale technique is essentially a coarse-to-fine procedure and makes the proposed method more robust to noise. The method is proposed to match and recognize objects under simple transformations, such as translation, scale changes, rotation and skewing. A simple logo indexing system is implemented to illustrate the performance of the proposed method.
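The signature-plus-moments idea can be sketched directly: at each boundary point, take the angle between the lines to neighbours a few steps back and ahead, then summarize the signature with raw moments. The polygon, neighbour offset, and moment orders below are illustrative; the point is that rotating the shape leaves the signature, and hence the moments, unchanged.

```python
import math

# Sketch: angle-based boundary signature and its raw moments, with a check
# that the moments are rotation invariant. Shape and parameters illustrative.

def signature(points, k=1):
    n = len(points)
    sig = []
    for i in range(n):
        (x0, y0), (x1, y1), (x2, y2) = points[i - k], points[i], points[(i + k) % n]
        a1 = math.atan2(y0 - y1, x0 - x1)        # direction to previous neighbour
        a2 = math.atan2(y2 - y1, x2 - x1)        # direction to next neighbour
        sig.append((a2 - a1) % (2 * math.pi))    # angle between the two lines
    return sig

def moments(sig, orders=(1, 2, 3)):
    n = len(sig)
    return [sum(s ** p for s in sig) / n for p in orders]

square = [(0, 0), (2, 0), (2, 2), (0, 2)]
theta = 0.7                                      # arbitrary rotation angle
rot = [(x * math.cos(theta) - y * math.sin(theta),
        x * math.sin(theta) + y * math.cos(theta)) for x, y in square]
print(moments(signature(square)), moments(signature(rot)))
```

Comparing a handful of moments instead of the full signature is what makes the matching step cheap, at the cost of some descriptive detail.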

  5. Are quantitative sensitivity analysis methods always reliable?

    NASA Astrophysics Data System (ADS)

    Huang, X.

    2016-12-01

    Physical parameterizations developed to represent subgrid-scale physical processes include various uncertain parameters, leading to large uncertainties in today's Earth System Models (ESMs). Sensitivity Analysis (SA) is an efficient approach to quantitatively determine how the uncertainty of an evaluation metric can be apportioned to each parameter. SA can also identify the most influential parameters, thereby reducing the dimensionality of the parametric space. In previous studies, SA-based approaches such as Sobol' and Fourier amplitude sensitivity testing (FAST) divide the parameters into sensitive and insensitive groups; the former is retained while the latter is eliminated from further study. However, these approaches ignore the loss of the interaction effects between the retained parameters and the eliminated ones, which are also part of the total sensitivity indices. As a result, traditional SA approaches and tools may identify the wrong sensitive parameters. In this study, we propose a dynamic global sensitivity analysis method (DGSAM), which iteratively removes the least important parameter until only two parameters remain. We use CLM-CASA, a global terrestrial model, as an example to verify our findings, with sample sizes ranging from 7,000 to 280,000. The results show that DGSAM identifies more influential parameters, as confirmed by parameter calibration experiments using four popular optimization methods. For example, optimization using the top three parameters selected by DGSAM achieved a 10% improvement over Sobol', and the computational cost of calibration was reduced to 1/6 of the original. In the future, it will be necessary to explore alternative SA methods that emphasize parameter interactions.
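
As background for the variance-based indices the record refers to, here is a minimal first-order Sobol' sketch using a pick-freeze estimator. This is not the authors' DGSAM; the toy model, sample sizes, and estimator variant are assumptions for illustration.

```python
import numpy as np

def first_order_sobol(f, d, n=100_000, seed=0):
    """Saltelli-style pick-freeze estimate of first-order Sobol' indices.

    S_i = E[f(A) * (f(A_B^i) - f(B))] / Var(f), where A_B^i is B with its
    i-th column replaced by A's i-th column (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    A = rng.uniform(size=(n, d))
    B = rng.uniform(size=(n, d))
    fA, fB = f(A), f(B)
    var = fA.var()
    S = np.empty(d)
    for i in range(d):
        ABi = B.copy()
        ABi[:, i] = A[:, i]          # "freeze" parameter i at A's values
        S[i] = np.mean(fA * (f(ABi) - fB)) / var
    return S

# Toy model where parameter 1 dominates: analytically S1 ~ 0.990, S2 ~ 0.010.
f = lambda X: X[:, 0] + 0.1 * X[:, 1]
S = first_order_sobol(f, d=2)
```

Eliminating a parameter on the basis of such first-order indices alone is exactly where the interaction effects discussed above can be lost, since total-effect indices also include variance shared between parameters.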

  6. Training the next generation of global health advocates through experiential education: A mixed-methods case study evaluation.

    PubMed

    Hoffman, Steven J; Silverberg, Sarah L

    2015-10-15

    This case study evaluates a global health education experience aimed at training the next generation of global health advocates. Demand for and interest in global health among Canadian students are well documented, despite the difficulty of integrating meaningful experiences into curricula. Global health advocacy was taught to 19 undergraduate students at McMaster University through an experiential education course, during which they developed a national advocacy campaign on global access to medicines. A quantitative survey and an analysis of social network dynamics were conducted, along with a qualitative analysis of written work and course evaluations. Data were interpreted through a thematic synthesis approach. Themes were identified related to students' learning outcomes, experience, and class dynamics. The experiential education format helped students gain authentic, real-world experience in global health advocacy and leadership. The tangible implications of their course work were a key motivating factor. While experiential education is an effective tool for some learning outcomes, it is not suitable for all; in addition, group dynamics and evaluation methods affect the learning environment. Real-world global health issues, public health practice, and advocacy approaches can be effectively taught through experiential education, alongside skills like communication and professionalism. Students developed a nuanced understanding of the many strategies, challenges, and barriers involved in advocating for public health ideas. These experiences are potentially empowering and confidence-building despite the heavy time commitment they require. Attention should be given to how such experiences are designed, as course dynamics and grading structure significantly influence students' experience.

  7. Navigating the network: signaling cross-talk in hematopoietic cells

    PubMed Central

    Fraser, Iain D C; Germain, Ronald N

    2009-01-01

    Recent studies in hematopoietic cells have led to a growing appreciation of the diverse modes of molecular and functional cross-talk between canonical signaling pathways. However, these intersections represent only the tip of the iceberg. Emerging global analytical methods are providing an even richer and more complete picture of the many components that measurably interact in a network manner to produce cellular responses. Here we highlight the pieces in this Focus, emphasize the limitations of the present canonical pathway paradigm, and discuss the value of a systems biology approach using more global, quantitative experimental design and data analysis strategies. Lastly, we urge caution about overly facile interpretation of genome- and proteome-level studies. PMID:19295628

  8. 3D optic disc reconstruction via a global fundus stereo algorithm.

    PubMed

    Bansal, M; Sizintsev, M; Eledath, J; Sawhney, H; Pearson, D J; Stone, R A

    2013-01-01

    This paper presents a novel method to recover the 3D structure of the optic disc in the retina from two uncalibrated fundus images. Retinal images are commonly uncalibrated when acquired clinically, creating rectification challenges as well as significant radiometric and blur differences within the stereo pair. By exploiting structural peculiarities of the retina, we modified the Graph Cuts computational stereo method (one of the current state-of-the-art methods) to yield a high-quality algorithm for fundus stereo reconstruction. Extensive qualitative and quantitative experimental evaluation (using OCT scans as 3D ground truth) on our own and publicly available datasets shows the superiority of the proposed method over the alternatives.

  9. The application of systems thinking in health: why use systems thinking?

    PubMed

    Peters, David H

    2014-08-26

    This paper explores the question of what systems thinking adds to the field of global health. Observing that elements of systems thinking are already common in public health research, the article discusses which of the large body of theories, methods, and tools associated with systems thinking are more useful. The paper reviews the origins of systems thinking, describing a range of the theories, methods, and tools. A common thread is the idea that the behavior of systems is governed by common principles that can be discovered and expressed. They each address problems of complexity, which is a frequent challenge in global health. The different methods and tools are suited to different types of inquiry and involve both qualitative and quantitative techniques. The paper concludes by emphasizing that explicit models used in systems thinking provide new opportunities to understand and continuously test and revise our understanding of the nature of things, including how to intervene to improve people's health.

  10. Classification-based quantitative analysis of stable isotope labeling by amino acids in cell culture (SILAC) data.

    PubMed

    Kim, Seongho; Carruthers, Nicholas; Lee, Joohyoung; Chinni, Sreenivasa; Stemmer, Paul

    2016-12-01

    Stable isotope labeling by amino acids in cell culture (SILAC) is a practical and powerful approach for quantitative proteomic analysis. A key advantage of SILAC is the ability to detect the isotopically labeled peptides simultaneously in a single instrument run, guaranteeing relative quantitation for a large number of peptides without the variation introduced by separate experiments. However, few approaches are available for assessing protein ratios, and none of the existing algorithms pays particular attention to proteins with only one peptide hit. We introduce new quantitative approaches to SILAC protein-level summaries using classification-based methodologies, such as Gaussian mixture models fitted with the EM algorithm, their Bayesian counterparts, and K-means clustering. In addition, a new approach is developed that combines a Gaussian mixture model with particle swarm optimization (PSO), a stochastic, metaheuristic global optimization algorithm, to avoid premature convergence or becoming stuck in a local optimum. Our simulation studies show that the newly developed PSO-based method performs best in terms of F1 score, and the proposed methods further demonstrate the ability to detect potential markers in real SILAC experimental data. The developed approach is applicable regardless of the number of peptide hits a protein has, rescuing many proteins that would otherwise be discarded. Furthermore, no additional correction for multiple comparisons is necessary for the developed methods, enabling direct interpretation of the analysis outcomes. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
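
A minimal sketch of the classification idea: fit a two-component 1-D Gaussian mixture to log ratios and assign each protein to the "unchanged" or "regulated" component. This is plain EM on simulated data, not the authors' PSO-augmented method; all numbers below are assumptions.

```python
import numpy as np

def gmm1d_em(x, n_iter=100):
    """Two-component 1-D Gaussian mixture fitted by EM (illustrative sketch)."""
    x = np.asarray(x, float)
    mu = np.array([x.min(), x.max()])          # spread-out initialization
    sd = np.array([x.std(), x.std()]) + 1e-6
    w = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: responsibility of each component for each point
        pdf = w * np.exp(-0.5 * ((x[:, None] - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
        r = pdf / pdf.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and standard deviations
        nk = r.sum(axis=0)
        w = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sd = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk) + 1e-6
    return w, mu, sd, r.argmax(axis=1)

# Simulated log2 SILAC ratios: unchanged proteins near 0, regulated near +2.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0, 0.3, 300), rng.normal(2, 0.3, 60)])
w, mu, sd, labels = gmm1d_em(x)
```

Because the assignment depends only on a protein's ratio, not on its number of peptide hits, even single-peptide proteins receive a classification, which is the property the abstract emphasizes.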

  11. Global Profiling of Reactive Oxygen and Nitrogen Species in Biological Systems

    PubMed Central

    Zielonka, Jacek; Zielonka, Monika; Sikora, Adam; Adamus, Jan; Joseph, Joy; Hardy, Micael; Ouari, Olivier; Dranka, Brian P.; Kalyanaraman, Balaraman

    2012-01-01

    Herein we describe a high-throughput fluorescence- and HPLC-based methodology for global profiling of reactive oxygen and nitrogen species (ROS/RNS) in biological systems. The combined use of HPLC and fluorescence detection is key to successful implementation and validation of this methodology. Included here are methods to specifically detect and quantitate the products formed from the interaction between ROS/RNS and fluorogenic probes, as follows: superoxide using hydroethidine, peroxynitrite using boronate-based probes, nitric oxide-derived nitrosating species using 4,5-diaminofluorescein, and hydrogen peroxide and other oxidants using 10-acetyl-3,7-dihydroxyphenoxazine (Amplex® Red) with and without horseradish peroxidase, respectively. In this study, we demonstrate real-time monitoring of ROS/RNS in activated macrophages using high-throughput fluorescence and HPLC methods. This global profiling approach, the simultaneous detection of multiple ROS/RNS-derived products of fluorescent probes, will be useful in unraveling the complex role of ROS/RNS in redox regulation, cell signaling, and cellular oxidative processes, and in high-throughput screening of anti-inflammatory antioxidants. PMID:22139901

  12. Validation of the Mass-Extraction-Window for Quantitative Methods Using Liquid Chromatography High Resolution Mass Spectrometry.

    PubMed

    Glauser, Gaétan; Grund, Baptiste; Gassner, Anne-Laure; Menin, Laure; Henry, Hugues; Bromirski, Maciej; Schütz, Frédéric; McMullen, Justin; Rochat, Bertrand

    2016-03-15

    A paradigm shift is underway in the field of quantitative liquid chromatography-mass spectrometry (LC-MS) analysis thanks to the arrival of recent high-resolution mass spectrometers (HRMS). The capability of HRMS to perform sensitive and reliable quantifications of a large variety of analytes in HR-full scan mode is showing that it is now realistic to perform quantitative and qualitative analysis with the same instrument. Moreover, HR-full scan acquisition offers a global view of sample extracts and allows retrospective investigations as virtually all ionized compounds are detected with a high sensitivity. In time, the versatility of HRMS together with the increasing need for relative quantification of hundreds of endogenous metabolites should promote a shift from triple-quadrupole MS to HRMS. However, a current "pitfall" in quantitative LC-HRMS analysis is the lack of HRMS-specific guidance for validated quantitative analyses. Indeed, false positive and false negative HRMS detections are rare, albeit possible, if inadequate parameters are used. Here, we investigated two key parameters for the validation of LC-HRMS quantitative analyses: the mass accuracy (MA) and the mass-extraction-window (MEW) that is used to construct the extracted-ion-chromatograms. We propose MA-parameters, graphs, and equations to calculate rational MEW width for the validation of quantitative LC-HRMS methods. MA measurements were performed on four different LC-HRMS platforms. Experimentally determined MEW values ranged between 5.6 and 16.5 ppm and depended on the HRMS platform, its working environment, the calibration procedure, and the analyte considered. The proposed procedure provides a fit-for-purpose MEW determination and prevents false detections.
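
The MEW itself reduces to a ±ppm window around the target m/z used to build the extracted-ion-chromatogram. A sketch follows; the analyte mass is hypothetical, and the 10 ppm half-width is merely one value inside the 5.6-16.5 ppm range the record reports.

```python
def mass_extraction_window(mz, ppm):
    """Lower and upper m/z bounds of the extraction window for a target ion.

    ppm is the half-width: a 10 ppm setting extracts mz * (1 ± 10e-6)."""
    half = mz * ppm / 1e6
    return mz - half, mz + half

# Hypothetical analyte at m/z 524.2649 with a ±10 ppm window.
lo, hi = mass_extraction_window(524.2649, 10)
```

Too narrow a window (relative to the platform's actual mass accuracy) clips real signal and causes false negatives; too wide a window admits isobaric interferences and causes false positives, which is why the record argues the width must be validated per platform and per analyte.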

  13. Failure to Integrate Quantitative Measurement Methods of Ocular Inflammation Hampers Clinical Practice and Trials on New Therapies for Posterior Uveitis.

    PubMed

    Herbort, Carl P; Tugal-Tutkun, Ilknur; Neri, Piergiorgio; Pavésio, Carlos; Onal, Sumru; LeHoang, Phuc

    2017-05-01

    Uveitis is one of the fields in ophthalmology where a tremendous evolution has taken place over the past 25 years. Not only did we gain access to more efficient, more targeted, and better tolerated therapies, but precise and quantitative measurement methods were developed in parallel, allowing the clinician to evaluate these therapies and adjust therapeutic intervention with a high degree of precision. Objective and quantitative measurement of the global level of intraocular inflammation became possible for most inflammatory diseases with direct or spill-over anterior chamber inflammation, thanks to laser flare photometry. The amount of retinal inflammation could be quantified by using fluorescein angiography to score retinal angiographic signs. Indocyanine green angiography gave imaging insight into the hitherto inaccessible choroidal compartment, making it possible to quantify choroiditis by scoring indocyanine green angiographic signs. Optical coherence tomography has enabled measurement and objective monitoring of retinal and choroidal thickness. This multimodal quantitative appraisal of intraocular inflammation provides exceptional security in monitoring uveitis. What is enigmatic, however, is the slow pace at which these improvements are being integrated in some areas. What is even more difficult to understand is that clinical trials assessing new therapeutic agents still mostly rely on subjective parameters, such as clinical evaluation of vitreous haze, as their main endpoint, whereas a whole array of precise, quantitative, and objective modalities is available for the design of clinical studies. The scope of this work was to review the quantitative investigations that have improved the management of uveitis in the past two to three decades.

  14. Agreement between the methods: Subjective Global Nutritional Assessment and the nutritional assessment of the World Health Organization.

    PubMed

    Pimenta, Fabiana S; Oliveira, Cássia M; Hattori, Wallisen T; Teixeira, Kely R

    2017-11-12

    To assess the agreement between the results of the Subjective Global Nutritional Assessment questionnaire, adapted for children and adolescents of the Brazilian population, and nutritional status assessment using the growth curves and classification of the World Health Organization, in a pediatric hospital service. This was an analytical, quantitative, cross-sectional study. During the data collection period, the nutritional status of all patients from 0 to 12 years of age admitted to the pediatric unit of a university hospital was concomitantly assessed with the Subjective Global Nutritional Assessment and the World Health Organization curves. The Kappa and Kendall coefficients were used to determine the agreement between these methods, at a significance level of 5%. Sixty-one children participated, with a predominance of males. The highest frequency of equivalent results occurred in the group classified as well nourished, and only the height/age variable showed close agreement between the methods; additionally, there was a good correlation between the assessment tools only for the weight/height variable. Given the low agreement between the methods, combining both may be beneficial for the nutritional assessment of pediatric patients, contributing to the early diagnosis of nutritional alterations and facilitating the use of adequate dietary therapy. Copyright © 2017. Published by Elsevier Editora Ltda.
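
The Kappa agreement measure used above is Cohen's kappa: observed agreement corrected for agreement expected by chance. A sketch with hypothetical classifications (the labels and values below are invented, not data from the study):

```python
import numpy as np

def cohens_kappa(a, b):
    """Cohen's kappa for two raters' categorical labels: (po - pe) / (1 - pe)."""
    a, b = np.asarray(a), np.asarray(b)
    cats = np.union1d(a, b)
    po = np.mean(a == b)                                        # observed agreement
    pe = sum(np.mean(a == c) * np.mean(b == c) for c in cats)   # chance agreement
    return (po - pe) / (1 - pe)

# Hypothetical classifications (0 = well nourished, 1 = malnourished) by two methods.
sga = [0, 0, 0, 1, 1, 0, 1, 0, 0, 1]
who = [0, 0, 1, 1, 1, 0, 0, 0, 0, 1]
k = cohens_kappa(sga, who)   # po = 0.8, pe = 0.52, kappa = 0.28/0.48
```

Kappa near 1 indicates close agreement; values well below that, as reported between the two assessment tools, motivate the record's suggestion to combine the methods rather than rely on either alone.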

  15. Spillover systems in a telecoupled Anthropocene: typology, methods, and governance for global sustainability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Jianguo; Dou, Yue; Batistella, Mateus

    The world has become increasingly telecoupled through distant flows of information, energy, people, organisms, goods, and matter. Recent advances suggest that telecouplings such as trade and species invasion often generate spillover systems with profound effects. To untangle spillover complexity, we make the first attempt to develop a typology of spillover systems based on six criteria: flows from and to sending and receiving systems, distances from sending and receiving systems, types of spillover effects, sizes of spillover systems, roles of agents in spillover systems, and the origin of spillover systems. Furthermore, we highlight a portfolio of qualitative and quantitative methods for detecting the often-overlooked spillover systems. To effectively govern spillover systems for global sustainability, we also propose an overall goal (minimize negative and maximize positive spillover effects) and three general principles (fairness, responsibility, and capability).

  16. Spillover systems in a telecoupled Anthropocene: typology, methods, and governance for global sustainability

    DOE PAGES

    Liu, Jianguo; Dou, Yue; Batistella, Mateus; ...

    2018-05-05

    The world has become increasingly telecoupled through distant flows of information, energy, people, organisms, goods, and matter. Recent advances suggest that telecouplings such as trade and species invasion often generate spillover systems with profound effects. To untangle spillover complexity, we make the first attempt to develop a typology of spillover systems based on six criteria: flows from and to sending and receiving systems, distances from sending and receiving systems, types of spillover effects, sizes of spillover systems, roles of agents in spillover systems, and the origin of spillover systems. Furthermore, we highlight a portfolio of qualitative and quantitative methods for detecting the often-overlooked spillover systems. To effectively govern spillover systems for global sustainability, we also propose an overall goal (minimize negative and maximize positive spillover effects) and three general principles (fairness, responsibility, and capability).

  17. Does remote sensing help translating local SGD investigation to large spatial scales?

    NASA Astrophysics Data System (ADS)

    Moosdorf, N.; Mallast, U.; Hennig, H.; Schubert, M.; Knoeller, K.; Neehaul, Y.

    2016-02-01

    Within the last 20 years, studies on submarine groundwater discharge (SGD) have revealed numerous processes, temporal behaviors, and quantitative estimates, as well as best-practice localization methods. This wealth of information is valuable for understanding the magnitude and effects of SGD at the respective location. Yet, since local conditions vary, translating local understanding, magnitudes, and effects to a regional or global scale is not trivial. In contrast, modeling approaches (e.g. the 228Ra budget) tackling SGD on a global scale do provide quantitative global estimates but have not been related to local investigations. This gap between the two approaches, local and global, and the combination and/or translation of either one to the other, represents one of the major challenges the SGD community currently faces. But what if remote sensing could provide information that serves as a translation between the two, similar to transfer functions in many other disciplines, allowing extrapolation from in-situ investigated and quantified SGD (discrete information) to regional scales or beyond? Admittedly, the sketched future is ambitious, and we will certainly not be able to present a complete solution to the question raised. Nonetheless, we will show a remote sensing based approach that is already able to identify potential SGD sites independent of location or hydrogeological conditions. The core of the approach is multi-temporal thermal information of the water surface: SGD-influenced sites display a smaller thermal variation (thermal anomalies) than surrounding uninfluenced areas. Despite its apparent simplicity, the automated approach has helped to localize several sites that could be validated with proven in-situ methods. At the same time, it carries the risk of identifying false positives, which can only be avoided if we can 'calibrate' the obtained thermal anomalies against in-situ data.
    We will present the pros and cons of our approach with the intention of contributing to a solution for translating SGD investigations to larger scales.
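
The core idea, flagging pixels whose sea-surface temperature varies less over time than their surroundings, can be sketched as follows. The synthetic scene, the (time, rows, cols) layout, and the 0.5 threshold factor are assumptions for illustration, not the authors' algorithm.

```python
import numpy as np

def thermal_anomaly_mask(stack, factor=0.5):
    """Flag pixels with unusually small temporal SST variation (illustrative sketch).

    stack: (time, rows, cols) array of sea-surface temperatures. A pixel is
    flagged when its temporal std is below `factor` times the scene median,
    mimicking the damped thermal variation expected at SGD-influenced sites."""
    std = stack.std(axis=0)
    return std < factor * np.median(std)

# Synthetic scene: strong seasonal signal everywhere except one pixel whose
# temperature is held steady by hypothetical groundwater inflow.
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 24)
scene = 20 + 5 * np.sin(t)[:, None, None] + rng.normal(0, 0.2, (24, 8, 8))
scene[:, 3, 3] = 20 + rng.normal(0, 0.2, 24)
mask = thermal_anomaly_mask(scene)
```

Any other process that stabilizes surface temperature (river plumes, upwelling) would trigger the same flag, which is exactly the false-positive risk the abstract raises and why calibration against in-situ data is needed.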

  18. Complex and dynamic landscape of RNA polyadenylation revealed by PAS-Seq

    PubMed Central

    Shepard, Peter J.; Choi, Eun-A; Lu, Jente; Flanagan, Lisa A.; Hertel, Klemens J.; Shi, Yongsheng

    2011-01-01

    Alternative polyadenylation (APA) of mRNAs has emerged as an important mechanism for post-transcriptional gene regulation in higher eukaryotes. Although microarrays have recently been used to characterize APA globally, they have a number of serious limitations that prevent comprehensive and highly quantitative analysis. To better characterize APA and its regulation, we have developed a deep sequencing-based method called Poly(A) Site Sequencing (PAS-Seq) for quantitatively profiling RNA polyadenylation at the transcriptome level. PAS-Seq not only accurately and comprehensively identifies poly(A) junctions in mRNAs and noncoding RNAs, but also provides quantitative information on the relative abundance of polyadenylated RNAs. PAS-Seq analyses of human and mouse transcriptomes showed that 40%–50% of all expressed genes produce alternatively polyadenylated mRNAs. Furthermore, our study detected evolutionarily conserved polyadenylation of histone mRNAs and revealed novel features of mitochondrial RNA polyadenylation. Finally, PAS-Seq analyses of mouse embryonic stem (ES) cells, neural stem/progenitor (NSP) cells, and neurons not only identified more poly(A) sites than what was found in the entire mouse EST database, but also detected significant changes in the global APA profile that lead to lengthening of 3′ untranslated regions (UTR) in many mRNAs during stem cell differentiation. Together, our PAS-Seq analyses revealed a complex landscape of RNA polyadenylation in mammalian cells and the dynamic regulation of APA during stem cell differentiation. PMID:21343387

  19. An index to assess the health and benefits of the global ocean.

    PubMed

    Halpern, Benjamin S; Longo, Catherine; Hardy, Darren; McLeod, Karen L; Samhouri, Jameal F; Katona, Steven K; Kleisner, Kristin; Lester, Sarah E; O'Leary, Jennifer; Ranelletti, Marla; Rosenberg, Andrew A; Scarborough, Courtney; Selig, Elizabeth R; Best, Benjamin D; Brumbaugh, Daniel R; Chapin, F Stuart; Crowder, Larry B; Daly, Kendra L; Doney, Scott C; Elfes, Cristiane; Fogarty, Michael J; Gaines, Steven D; Jacobsen, Kelsey I; Karrer, Leah Bunce; Leslie, Heather M; Neeley, Elizabeth; Pauly, Daniel; Polasky, Stephen; Ris, Bud; St Martin, Kevin; Stone, Gregory S; Sumaila, U Rashid; Zeller, Dirk

    2012-08-30

    The ocean plays a critical role in supporting human well-being, from providing food, livelihoods and recreational opportunities to regulating the global climate. Sustainable management aimed at maintaining the flow of a broad range of benefits from the ocean requires a comprehensive and quantitative method to measure and monitor the health of coupled human–ocean systems. We created an index comprising ten diverse public goals for a healthy coupled human–ocean system and calculated the index for every coastal country. Globally, the overall index score was 60 out of 100 (range 36–86), with developed countries generally performing better than developing countries, but with notable exceptions. Only 5% of countries scored higher than 70, whereas 32% scored lower than 50. The index provides a powerful tool to raise public awareness, direct resource management, improve policy and prioritize scientific research.

  20. Global estimates of shark catches using trade records from commercial markets.

    PubMed

    Clarke, Shelley C; McAllister, Murdoch K; Milner-Gulland, E J; Kirkwood, G P; Michielsens, Catherine G J; Agnew, David J; Pikitch, Ellen K; Nakano, Hideki; Shivji, Mahmood S

    2006-10-01

    Despite growing concerns about overexploitation of sharks, the lack of accurate, species-specific harvest data often hampers quantitative stock assessment. In such cases, trade studies can provide insights into exploitation unavailable from traditional monitoring. We applied Bayesian statistical methods to trade data, in combination with genetic identification, to estimate, by species, the annual number of globally traded shark fins, the most commercially valuable product from a group of species often unrecorded in harvest statistics. Our results provide the first fishery-independent estimate of the scale of shark catches worldwide and indicate that shark biomass in the fin trade is three to four times higher than shark catch figures reported in the only global database. Comparison of our estimates with approximated stock-assessment reference points for one of the most commonly traded species, the blue shark, suggests that current trade volumes in numbers of sharks are close to or possibly exceeding maximum sustainable yield levels.

  1. Feature tracking CMR reveals abnormal strain in preclinical arrhythmogenic right ventricular dysplasia/ cardiomyopathy: a multisoftware feasibility and clinical implementation study.

    PubMed

    Bourfiss, Mimount; Vigneault, Davis M; Aliyari Ghasebeh, Mounes; Murray, Brittney; James, Cynthia A; Tichnell, Crystal; Mohamed Hoesein, Firdaus A; Zimmerman, Stefan L; Kamel, Ihab R; Calkins, Hugh; Tandri, Harikrishna; Velthuis, Birgitta K; Bluemke, David A; Te Riele, Anneline S J M

    2017-09-01

    Regional right ventricular (RV) dysfunction is the hallmark of Arrhythmogenic Right Ventricular Dysplasia/Cardiomyopathy (ARVD/C), but is currently only qualitatively evaluated in the clinical setting. Feature Tracking Cardiovascular Magnetic Resonance (FT-CMR) is a novel quantitative method that uses cine CMR to calculate strain values. However, most prior FT-CMR studies in ARVD/C have focused on global RV strain using different software methods, complicating implementation of FT-CMR in clinical practice. We aimed to assess the clinical value of global and regional strain using FT-CMR in ARVD/C and to determine differences between commercially available FT-CMR software packages. We analyzed cine CMR images of 110 subjects (39 overt ARVD/C [mutation+/phenotype+], 40 preclinical ARVD/C [mutation+/phenotype-] and 31 control subjects) for global and regional (subtricuspid, anterior, apical) RV strain in the horizontal longitudinal axis using four FT-CMR software methods (Multimodality Tissue Tracking, TomTec, Medis and Circle Cardiovascular Imaging). Intersoftware agreement was assessed using Bland-Altman plots. For global strain, all methods showed reduced strain in overt ARVD/C patients compared to control subjects (p < 0.041), whereas none distinguished preclinical from control subjects (p > 0.275). For regional strain, overt ARVD/C patients showed reduced strain compared to control subjects in all segments, which reached statistical significance in the subtricuspid region for all software methods (p < 0.037), in the anterior wall for two methods (p < 0.005) and in the apex for one method (p = 0.012). Preclinical subjects showed abnormal subtricuspid strain compared to control subjects using one of the software methods (p = 0.009). Agreement between software methods for absolute strain values was low (Intraclass Correlation Coefficient = 0.373).
Despite large intersoftware variability of FT-CMR derived strain values, all four software methods distinguished overt ARVD/C patients from control subjects by both global and subtricuspid strain values. In the subtricuspid region, one software package distinguished preclinical from control subjects, suggesting the potential to identify early ARVD/C prior to overt disease expression.
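
The Bland-Altman comparison used above reduces to the bias (mean paired difference) and the 95% limits of agreement between two methods. A sketch with hypothetical strain values, not data from the study:

```python
import numpy as np

def bland_altman(a, b):
    """Bias and 95% limits of agreement between two methods' paired measurements."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)   # half-width of the limits of agreement
    return bias, bias - loa, bias + loa

# Hypothetical global RV strain values (%) from two software packages.
sw1 = [-20.1, -18.4, -22.3, -15.0, -19.8]
sw2 = [-17.9, -16.5, -19.7, -13.2, -18.1]
bias, lo, hi = bland_altman(sw1, sw2)
```

A large bias or wide limits, as the low intersoftware ICC above implies, means absolute strain values from different packages cannot be used interchangeably, even when each package separates patient groups on its own.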

  2. Alzheimer disease: Quantitative analysis of I-123-iodoamphetamine SPECT brain imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hellman, R.S.; Tikofsky, R.S.; Collier, B.D.

    1989-07-01

    To enable a more quantitative diagnosis of senile dementia of the Alzheimer type (SDAT), the authors developed and tested a semiautomated method to define regions of interest (ROIs) for quantitating results from single photon emission computed tomography (SPECT) of regional cerebral blood flow performed with N-isopropyl iodine-123-iodoamphetamine (IMP). SPECT/IMP imaging was performed in ten patients with probable SDAT and seven healthy subjects. Multiple ROIs were manually and semiautomatically generated, and uptake was quantitated for each ROI. Mean cortical activity was estimated as the average of the mean activity in 24 semiautomatically generated ROIs; mean cerebellar activity was determined from the mean activity in separate ROIs. A ratio of parietal to cerebellar activity less than 0.60 and a ratio of parietal to mean cortical activity less than 0.90 allowed correct categorization of nine of ten and eight of ten patients with SDAT, respectively, and of all control subjects. The degree of diminished mental status observed in patients with SDAT correlated with both global and regional changes in IMP uptake.
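
The two ratio cutoffs reported above reduce to a simple check per subject. A sketch follows; the cutoffs (0.60 and 0.90) come from the record, but the ROI uptake values and the function name are hypothetical.

```python
def sdat_ratio_flags(parietal, cerebellar, mean_cortical):
    """Apply the reported cutoffs: parietal/cerebellar < 0.60 and
    parietal/mean-cortical < 0.90 each flag possible SDAT (illustrative only;
    inputs are hypothetical ROI uptake values in arbitrary units)."""
    return parietal / cerebellar < 0.60, parietal / mean_cortical < 0.90

# A hypothetical SDAT-like subject: both ratios fall below their cutoffs.
flags = sdat_ratio_flags(parietal=45.0, cerebellar=90.0, mean_cortical=60.0)
```

Normalizing parietal uptake against cerebellar or mean cortical activity removes global scaling differences between scans, so the ratio isolates the regional hypoperfusion characteristic of SDAT.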

  3. Measuring stigma affecting sex workers (SW) and men who have sex with men (MSM): A systematic review.

    PubMed

    Fitzgerald-Husek, Alanna; Van Wert, Michael J; Ewing, Whitney F; Grosso, Ashley L; Holland, Claire E; Katterl, Rachel; Rosman, Lori; Agarwal, Arnav; Baral, Stefan D

    2017-01-01

    Stigma involves discrediting a person or group based on a perceived attribute, behaviour or reputation associated with them. Sex workers (SW) and men who have sex with men (MSM) are key populations who are often at increased risk for the acquisition and transmission of HIV and who are affected by stigma that can negatively impact their health and well-being. Although stigma was included as an indicator in the US National HIV/AIDS Strategic Plan and there have been consultations focused on adding a stigma indicator within PEPFAR and the Global Fund in relation to potentiating HIV risks among key populations, there remains limited consensus on the appropriate measurement of SW- or MSM-associated stigma. Consequently, this systematic review summarizes studies using quantitative, qualitative, or mixed methods approaches to measure stigma affecting sex workers and men who have sex with men. This systematic review included English, French, and Spanish peer-reviewed research of any study design measuring SW- or MSM-associated stigma. Articles were published from January 1, 2004 to March 26, 2014 in PsycINFO, PubMed, EMBASE, CINAHL Plus, Global Health, and World Health Organization Global Health Library Regional Indexes. Of the 541 articles reviewed, the majority measured stigma toward MSM (over 97%), were conducted in North America, used quantitative methods, and focused on internalized stigma. With the inclusion of addressing stigma in several domestic and international HIV strategies, there is a need to ensure the use of validated metrics for stigma. The field to date has completed limited measurement of stigma affecting sex workers, and limited measurement of stigma affecting MSM outside of higher income settings. Moving forward requires a concerted effort integrating validated metrics of stigma into health-related surveys and programs for key populations.

  4. Measuring stigma affecting sex workers (SW) and men who have sex with men (MSM): A systematic review

    PubMed Central

    Fitzgerald-Husek, Alanna; Van Wert, Michael J.; Ewing, Whitney F.; Holland, Claire E.; Katterl, Rachel; Rosman, Lori; Baral, Stefan D.

    2017-01-01

    Background Stigma involves discrediting a person or group based on a perceived attribute, behaviour or reputation associated with them. Sex workers (SW) and men who have sex with men (MSM) are key populations who are often at increased risk for the acquisition and transmission of HIV and who are affected by stigma that can negatively impact their health and well-being. Although stigma was included as an indicator in the US National HIV/AIDS Strategic Plan and there have been consultations focused on adding a stigma indicator within PEPFAR and the Global Fund in relation to potentiating HIV risks among key populations, there remains limited consensus on the appropriate measurement of SW- or MSM-associated stigma. Consequently, this systematic review summarizes studies using quantitative, qualitative, or mixed methods approaches to measure stigma affecting sex workers and men who have sex with men. Methods and findings This systematic review included English, French, and Spanish peer-reviewed research of any study design measuring SW- or MSM-associated stigma. Articles were published from January 1, 2004 to March 26, 2014 in PsycINFO, PubMed, EMBASE, CINAHL Plus, Global Health, and World Health Organization Global Health Library Regional Indexes. Of the 541 articles reviewed, the majority measured stigma toward MSM (over 97%), were conducted in North America, used quantitative methods, and focused on internalized stigma. Conclusions With the inclusion of addressing stigma in several domestic and international HIV strategies, there is a need to ensure the use of validated metrics for stigma. The field to date has completed limited measurement of stigma affecting sex workers, and limited measurement of stigma affecting MSM outside of higher income settings. Moving forward requires a concerted effort integrating validated metrics of stigma into health-related surveys and programs for key populations. PMID:29190642

  5. Introducing global health into the undergraduate medical school curriculum using an e-learning program: a mixed method pilot study.

    PubMed

    Gruner, Douglas; Pottie, Kevin; Archibald, Douglas; Allison, Jill; Sabourin, Vicki; Belcaid, Imane; McCarthy, Anne; Brindamour, Mahli; Augustincic Polec, Lana; Duke, Pauline

    2015-09-02

    Physicians need global health competencies to provide effective care to culturally and linguistically diverse patients. Medical schools are seeking innovative approaches to support global health learning. This pilot study evaluated e-learning versus peer-reviewed articles to improve conceptual knowledge of global health. A mixed methods study using a randomized-controlled trial (RCT) and qualitative inquiry consisting of four post-intervention focus groups. Outcomes included pre/post knowledge quiz and self-assessment measures based on validated tools from a Global Health CanMEDS Competency Model. RCT results were analyzed using SPSS-21 and focus group transcripts coded using NVivo-9 and recoded using thematic analysis. One hundred and sixty-one pre-clerkship medical students from three Canadian medical schools participated in 2012-2013: 59 completed all elements of the RCT, and 24 participated in the focus groups. Overall, comparing pre to post results, both groups showed a significant increase in mean knowledge (quiz) scores and in 5/7 self-assessed competencies (p < 0.05). These quantitative data were triangulated with the focus group findings, which revealed knowledge acquisition with both approaches. There was no statistically significant difference between the two approaches. Participants highlighted their preference for e-learning to introduce new global health knowledge and as a repository of resources. They also mentioned personal interest in global health, online convenience, and integration into the curriculum as incentives to complete the e-learning. Barriers in the beta version of the e-learning included content overload and technical difficulties. Both the e-learning and the peer-reviewed PDF articles improved global health conceptual knowledge. Many students, however, preferred e-learning given its interactive, multi-media approach, access to links and reference materials, and its capacity to engage and re-engage over long periods of time.

  6. The effectiveness of constructivist science instructional methods on middle school students' student achievement and motivation

    NASA Astrophysics Data System (ADS)

    Brooks, John

    A problem facing science educators is determining the most effective means of science instruction so that students will meet or exceed the new rigorous standards. The theoretical framework for this study was based on reform and research efforts that have informed science teachers that using constructivism is the best method of science instruction. The purpose of this study was to investigate how the constructivist method of science instruction affected student achievement and student motivation in a sixth grade science classroom. The guiding research question involved understanding which method of science instruction would be most effective at improving student achievement in science. Other sub-questions included the factors that contribute to student motivation in science and the method of science instruction students receive that affects motivation to learn science. Quantitative data were collected using a pre-test and post-test single group design. T-test and ANCOVA were used to test quantitative hypotheses. Qualitative data were collected using student reflective journals and classroom discussions. Students' perspectives were transcribed, coded and used to further inform quantitative findings. The findings of this study supported the recommendations made by science reformists that the best method of science instruction was a constructivist method. This study also found that participant comments favored constructivist taught classes. The implications for social change at the local level included potential increases in student achievement in science and possibly increased understanding that can facilitate similar changes at other schools. From a global perspective, constructivist-oriented methods might result in students becoming more interested in majoring in science at the college level and in becoming part of a scientifically literate work force.

  7. A Novel Feature-Tracking Echocardiographic Method for the Quantitation of Regional Myocardial Function

    PubMed Central

    Pirat, Bahar; Khoury, Dirar S.; Hartley, Craig J.; Tiller, Les; Rao, Liyun; Schulz, Daryl G.; Nagueh, Sherif F.; Zoghbi, William A.

    2012-01-01

    Objectives The aim of this study was to validate a novel, angle-independent, feature-tracking method for the echocardiographic quantitation of regional function. Background A new echocardiographic method, Velocity Vector Imaging (VVI) (syngo Velocity Vector Imaging technology, Siemens Medical Solutions, Ultrasound Division, Mountain View, California), has been introduced, based on feature tracking (incorporating speckle and endocardial border tracking) that allows the quantitation of endocardial strain, strain rate (SR), and velocity. Methods Seven dogs were studied during baseline and various interventions causing alterations in regional function: dobutamine, 5-min coronary occlusion with reperfusion up to 1 h, followed by dobutamine and esmolol infusions. Echocardiographic images were acquired from short- and long-axis views of the left ventricle. Segment-length sonomicrometry crystals were used as the reference method. Results Changes in systolic strain in ischemic segments were tracked well with VVI during the different states of regional function. There was a good correlation between circumferential and longitudinal systolic strain by VVI and sonomicrometry (r = 0.88 and r = 0.83, respectively, p < 0.001). Strain measurements in the nonischemic basal segments also demonstrated a significant correlation between the 2 methods (r = 0.65, p < 0.001). Similarly, a significant relation was observed for circumferential and longitudinal SR between the 2 methods (r = 0.94, p < 0.001 and r = 0.90, p < 0.001, respectively). The relation of endocardial velocity to changes in strain by sonomicrometry was weaker owing to significant cardiac translation. Conclusions Velocity Vector Imaging, a new feature-tracking method, can accurately assess regional myocardial function at the endocardial level and is a promising clinical tool for the simultaneous quantification of regional and global myocardial function. PMID:18261685
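    The agreement statistics reported in this record (e.g., r = 0.88 for circumferential strain) are Pearson correlations between paired VVI and sonomicrometry measurements. A minimal sketch of that computation, using invented strain values rather than the study's data:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two paired series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical paired systolic strain values (%): VVI vs. sonomicrometry
vvi  = [-18.0, -15.5, -9.8, -4.2, -20.1, -12.3]
sono = [-19.2, -14.8, -10.5, -5.0, -21.0, -11.7]
r = pearson_r(vvi, sono)
```

The same function applied to each segment type (ischemic, nonischemic basal) would reproduce per-region agreement figures of the kind quoted above.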

  8. Patterns and Emerging Trends in Global Ocean Health

    PubMed Central

    Halpern, Benjamin S.; Longo, Catherine; Lowndes, Julia S. Stewart; Best, Benjamin D.; Frazier, Melanie; Katona, Steven K.; Kleisner, Kristin M.; Rosenberg, Andrew A.; Scarborough, Courtney; Selig, Elizabeth R.

    2015-01-01

    International and regional policies aimed at managing ocean ecosystem health need quantitative and comprehensive indices to synthesize information from a variety of sources, consistently measure progress, and communicate with key constituencies and the public. Here we present the second annual global assessment of the Ocean Health Index, reporting current scores and annual changes since 2012, recalculated using updated methods and data based on the best available science, for 221 coastal countries and territories. The Index measures performance of ten societal goals for healthy oceans on a quantitative scale of increasing health from 0 to 100, and combines these scores into a single Index score, for each country and globally. The global Index score improved one point (from 67 to 68), while many country-level Index and goal scores had larger changes. Per-country Index scores ranged from 41–95 and, on average, improved by 0.06 points (range -8 to +12). Globally, average scores increased for individual goals by as much as 6.5 points (coastal economies) and decreased by as much as 1.2 points (natural products). Annual updates of the Index, even when not all input data have been updated, provide valuable information to scientists, policy makers, and resource managers because patterns and trends can emerge from the data that have been updated. Changes of even a few points indicate potential successes (when scores increase) that merit recognition, or concerns (when scores decrease) that may require mitigative action, with changes of more than 10–20 points representing large shifts that deserve greater attention. Goal scores showed remarkably little covariance across regions, indicating low redundancy in the Index, such that each goal delivers information about a different facet of ocean health. Together these scores provide a snapshot of global ocean health and suggest where countries have made progress and where a need for further improvement exists. PMID:25774678
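    The aggregation described in this record (ten goal scores per country, each 0-100, combined into a single country Index score and then a global score) can be sketched as follows. All numbers are invented and goals are equally weighted here; the published Index uses weighted aggregation and goal-specific models:

```python
def country_index(goal_scores):
    """Equal-weight mean of the ten goal scores (0-100) for one country."""
    return sum(goal_scores) / len(goal_scores)

def global_index(per_country):
    """Unweighted mean across countries (a simplification of the real Index)."""
    scores = [country_index(g) for g in per_country.values()]
    return sum(scores) / len(scores)

# Hypothetical goal scores for three countries
data = {
    "A": [70, 65, 80, 90, 55, 60, 75, 85, 68, 72],
    "B": [50, 45, 60, 70, 40, 55, 65, 58, 52, 48],
    "C": [88, 92, 79, 85, 90, 83, 87, 91, 80, 86],
}
# Annual change per country against an assumed prior-year score of 67
delta = {c: country_index(g) - 67.0 for c, g in data.items()}
```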

  9. A Quantitative Study of Global Software Development Teams, Requirements, and Software Projects

    ERIC Educational Resources Information Center

    Parker, Linda L.

    2016-01-01

    The study explored the relationship between global software development teams, effective software requirements, and stakeholders' perception of successful software development projects within the field of information technology management. It examined the critical relationship between Global Software Development (GSD) teams creating effective…

  10. Assessing the Expected Impact of Global Health Treaties: Evidence From 90 Quantitative Evaluations

    PubMed Central

    Røttingen, John-Arne

    2015-01-01

    We assessed what impact can be expected from global health treaties on the basis of 90 quantitative evaluations of existing treaties on trade, finance, human rights, conflict, and the environment. It appears treaties consistently succeed in shaping economic matters and consistently fail in achieving social progress. There are at least 3 differences between these domains that point to design characteristics that new global health treaties can incorporate to achieve positive impact: (1) incentives for those with power to act on them; (2) institutions designed to bring edicts into effect; and (3) interests advocating their negotiation, adoption, ratification, and domestic implementation. Experimental and quasiexperimental evaluations of treaties would provide more information about what can be expected from this type of global intervention. PMID:25393196

  11. Quantifying (dis)agreement between direct detection experiments in a halo-independent way

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feldstein, Brian; Kahlhoefer, Felix, E-mail: brian.feldstein@physics.ox.ac.uk, E-mail: felix.kahlhoefer@physics.ox.ac.uk

    We propose an improved method to study recent and near-future dark matter direct detection experiments with small numbers of observed events. Our method determines in a quantitative and halo-independent way whether the experiments point towards a consistent dark matter signal and identifies the best-fit dark matter parameters. To achieve true halo independence, we apply a recently developed method based on finding the velocity distribution that best describes a given set of data. For a quantitative global analysis we construct a likelihood function suitable for small numbers of events, which allows us to determine the best-fit particle physics properties of dark matter considering all experiments simultaneously. Based on this likelihood function we propose a new test statistic that quantifies how well the proposed model fits the data and how large the tension between different direct detection experiments is. We perform Monte Carlo simulations in order to determine the probability distribution function of this test statistic and to calculate the p-value for both the dark matter hypothesis and the background-only hypothesis.

  12. Movement Correction Method for Human Brain PET Images: Application to Quantitative Analysis of Dynamic [18F]-FDDNP Scans

    PubMed Central

    Wardak, Mirwais; Wong, Koon-Pong; Shao, Weber; Dahlbom, Magnus; Kepe, Vladimir; Satyamurthy, Nagichettiar; Small, Gary W.; Barrio, Jorge R.; Huang, Sung-Cheng

    2010-01-01

    Head movement during a PET scan (especially a dynamic scan) can affect both the qualitative and quantitative aspects of an image, making it difficult to accurately interpret the results. The primary objective of this study was to develop a retrospective image-based movement correction (MC) method and evaluate its implementation on dynamic [18F]-FDDNP PET images of cognitively intact controls and patients with Alzheimer's disease (AD). Methods Dynamic [18F]-FDDNP PET images, used for in vivo imaging of beta-amyloid plaques and neurofibrillary tangles, were obtained from 12 AD patients and 9 age-matched controls. For each study, a transmission scan was first acquired for attenuation correction. An accurate retrospective MC method that corrected for transmission-emission misalignment as well as emission-emission misalignment was applied to all studies. Zero movement between the transmission scan and the first emission scan was not assumed. Logan analysis with cerebellum as the reference region was used to estimate various regional distribution volume ratio (DVR) values in the brain before and after MC. Discriminant analysis was used to build a predictive model for group membership, using data with and without MC. Results MC improved the image quality and quantitative values in [18F]-FDDNP PET images. In this subject population, medial temporal (MTL) did not show a significant difference between controls and AD patients before MC. However, after MC, significant differences in DVR values were seen in frontal, parietal, posterior cingulate (PCG), MTL, lateral temporal (LTL), and global regions between the two groups (P < 0.05). In controls and AD patients, the variability of regional DVR values (as measured by the coefficient of variation) decreased on average by >18% after MC. Mean DVR separation between controls and AD patients was higher in frontal, MTL, LTL, and global regions after MC. Group classification by discriminant analysis based on [18F]-FDDNP DVR values was markedly improved after MC.
Conclusion The streamlined and easy-to-use MC method presented in this work significantly improves the image quality and the measured tracer kinetics of [18F]-FDDNP PET images. The proposed MC method has the potential to be applied to PET studies on patients with other disorders (e.g., Down syndrome and Parkinson's disease) and to brain PET scans with other molecular imaging probes. PMID:20080894
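    The variability measure quoted in this record is the coefficient of variation (CoV = SD / mean) of regional DVR values. A minimal sketch of that metric and its percent change after movement correction, with invented DVR values rather than the study's data:

```python
import statistics

def cov(values):
    """Coefficient of variation: sample standard deviation divided by the mean."""
    return statistics.stdev(values) / statistics.mean(values)

def cov_change_pct(before, after):
    """Percent change in CoV after movement correction (negative = less variable)."""
    return 100.0 * (cov(after) - cov(before)) / cov(before)

# Hypothetical frontal-region DVR values for one group, before vs. after MC
dvr_before = [1.02, 1.18, 0.95, 1.25, 1.08, 0.90]
dvr_after  = [1.05, 1.12, 1.01, 1.15, 1.08, 0.99]
```

With these invented values the change is negative, i.e. correction reduces the spread, mirroring the >18% average decrease reported above.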

  13. Discovering hidden relationships between renal diseases and regulated genes through 3D network visualizations

    PubMed Central

    2010-01-01

    Background In a recent study, two-dimensional (2D) network layouts were used to visualize and quantitatively analyze the relationship between chronic renal diseases and regulated genes. The results revealed complex relationships between disease type, gene specificity, and gene regulation type, which led to important insights about the underlying biological pathways. Here we describe an attempt to extend our understanding of these complex relationships by reanalyzing the data using three-dimensional (3D) network layouts, displayed through 2D and 3D viewing methods. Findings The 3D network layout (displayed through the 3D viewing method) revealed that genes implicated in many diseases (non-specific genes) tended to be predominantly down-regulated, whereas genes regulated in a few diseases (disease-specific genes) tended to be up-regulated. This new global relationship was quantitatively validated through comparison to 1000 random permutations of networks of the same size and distribution. Our new finding appeared to be the result of using specific features of the 3D viewing method to analyze the 3D renal network. Conclusions The global relationship between gene regulation and gene specificity is the first clue from human studies that there exist common mechanisms across several renal diseases, which suggest hypotheses for the underlying mechanisms. Furthermore, the study suggests hypotheses for why the 3D visualization helped to make salient a new regularity that was difficult to detect in 2D. Future research that tests these hypotheses should enable a more systematic understanding of when and how to use 3D network visualizations to reveal complex regularities in biological networks. PMID:21070623
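    The validation step described in this record, comparing an observed pattern against 1000 random network permutations, is a permutation test. A minimal sketch on toy data, testing whether disease-specific genes (regulated in few diseases) tend to be up-regulated; the gene counts and labels are invented:

```python
import random

def assoc(specificity, upregulated):
    """Difference in mean specificity: up-regulated minus down-regulated genes."""
    up = [s for s, u in zip(specificity, upregulated) if u]
    down = [s for s, u in zip(specificity, upregulated) if not u]
    return sum(up) / len(up) - sum(down) / len(down)

def permutation_pvalue(specificity, upregulated, n_perm=1000, seed=0):
    """One-sided p-value: how often a random relabelling is as negative as observed."""
    rng = random.Random(seed)
    observed = assoc(specificity, upregulated)
    labels = list(upregulated)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(labels)
        if assoc(specificity, labels) <= observed:
            hits += 1
    return hits / n_perm

# Toy data: genes regulated in few diseases (low counts) tend to be up-regulated
n_diseases = [1, 1, 2, 2, 7, 8, 9, 6, 1, 2, 8, 7]
is_up      = [1, 1, 1, 1, 0, 0, 0, 0, 1, 1, 0, 0]
```

A small p-value here means the observed specificity/regulation relationship is unlikely under random relabelling, which is the logic behind the 1000-permutation comparison above.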

  14. Qualitative and quantitative comparison of brand name and generic protein pharmaceuticals using isotope tags for relative and absolute quantification and matrix-assisted laser desorption/ionization tandem time-of-flight mass spectrometry.

    PubMed

    Ye, Hongping; Hill, John; Kauffman, John; Han, Xianlin

    2010-05-01

    The capability of iTRAQ (isotope tags for relative and absolute quantification) reagents coupled with matrix-assisted laser desorption/ionization tandem time-of-flight mass spectrometry (MALDI-TOF/TOF-MS) as a qualitative and quantitative technique for the analysis of complicated protein pharmaceutical mixtures was evaluated. Mixtures of Somavert and Miacalcin with a small amount of bovine serum albumin (BSA) as an impurity were analyzed. Both Somavert and Miacalcin were qualitatively identified, and BSA was detected at levels as low as 0.8 mol%. Genotropin and Somavert were compared in a single experiment, and all of the distinct amino acid residues from the two proteins were readily identified. Four somatropin drug products (Genotropin, Norditropin, Jintropin, and Omnitrope) were compared using the iTRAQ/MALDI-MS method to determine the similarity between their primary structures and quantify the amount of protein in each product. All four product samples were well labeled and successfully compared when a filtration cleanup step preceded iTRAQ labeling. The quantitative accuracy of the iTRAQ method was evaluated. In all cases, the accuracy of experimentally determined protein ratios was higher than 90%, and the relative standard deviation (RSD) was less than 10%. The iTRAQ and global internal standard technology (GIST) methods were compared, and the iTRAQ method provided both higher sequence coverage and enhanced signal intensity. Published by Elsevier Inc.
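    The accuracy and RSD figures quoted in this record follow from simple formulas. A minimal sketch with hypothetical replicate reporter-ion ratios for a nominal 1:1 mixture; the accuracy definition used here (100% minus the mean relative error against the expected ratio) is one plausible reading, not necessarily the authors' exact definition:

```python
import statistics

def ratio_accuracy_pct(measured, expected):
    """Accuracy as 100% minus the mean relative error of measured vs. expected ratios."""
    rel_errors = [abs(m - expected) / expected for m in measured]
    return 100.0 * (1.0 - sum(rel_errors) / len(rel_errors))

def rsd_pct(measured):
    """Relative standard deviation of replicate ratio measurements, in percent."""
    return 100.0 * statistics.stdev(measured) / statistics.mean(measured)

# Hypothetical replicate reporter-ion ratios for a 1:1 protein mixture
ratios = [0.97, 1.04, 0.97, 1.02, 1.06]
```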

  15. Laser-induced breakdown spectroscopy (LIBS) to measure quantitatively soil carbon with emphasis on soil organic carbon. A review.

    PubMed

    Senesi, Giorgio S; Senesi, Nicola

    2016-09-28

    Soil organic carbon (OC) measurement is a crucial factor for quantifying soil C pools and inventories and monitoring the inherent temporal and spatial heterogeneity and changes of soil OC content. These are relevant issues in addressing sustainable management of terrestrial OC aiming to enhance C sequestration in soil, thus mitigating the impact of increasing CO2 concentration in the atmosphere and related effects on global climate change. Nowadays, dry combustion by an elemental analyzer or wet combustion by dichromate oxidation of the soil sample are the most recommended and commonly used methods for quantitative soil OC determination. However, the widely recognized uncertainties and limitations of these classical, labour-intensive methods have prompted research efforts focusing on the development and application of more advanced and appealing techniques and methods for the measurement of soil OC in the laboratory and possibly in situ in the field. Among these, laser-induced breakdown spectroscopy (LIBS) has attracted the greatest interest for its unique advantages. After an introduction and a highlight of the LIBS basic principles, instrumentation, methodologies and supporting chemometric methods, the main body of this review provides an historical and critical overview of the developments and results obtained to date by the application of LIBS to the quantitative measurement of soil C and especially OC content. A brief critical summary of LIBS advantages and limitations/drawbacks including some final remarks and future perspectives concludes this review. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Advancing tuberculosis drug regimen development through innovative quantitative translational pharmacology methods and approaches.

    PubMed

    Hanna, Debra; Romero, Klaus; Schito, Marco

    2017-03-01

    The development of novel tuberculosis (TB) multi-drug regimens that are more efficacious and of shorter duration requires a robust drug development pipeline. Advances in quantitative modeling and simulation can be used to maximize the utility of patient-level data from prior and contemporary clinical trials, thus optimizing study design for anti-TB regimens. This perspective article highlights the work of seven project teams developing first-in-class translational and quantitative methodologies that aim to inform drug development decision-making, dose selection, trial design, and safety assessments, in order to achieve shorter and safer therapies for patients in need. These tools offer the opportunity to evaluate multiple hypotheses and provide a means to identify, quantify, and understand relevant sources of variability, to optimize translation and clinical trial design. When incorporated into the broader regulatory sciences framework, these efforts have the potential to transform the development paradigm for TB combination development, as well as other areas of global health. Copyright © 2016. Published by Elsevier Ltd.

  17. Collection, Measurement and Treatment of Microorganism Using Dielectrophoretic Micro Devices

    NASA Astrophysics Data System (ADS)

    Uchida, Satoshi

    Constant monitoring of manufacturing processes has become essential in the food industry because of the global expansion of microbial infection. The micro-scale dielectrophoretic method is an attractive technique for the direct manipulation and quantitative detection of bioparticles. The electrical system is capable of rapid and simple treatments that meet the strict legal controls on food safety. In this paper, newly developed techniques for bacterial concentration, detection, and sterilization using dielectrophoresis in a micro reactor are reviewed. The perspective toward an integrated micro device combining these components is also discussed.

  18. Advanced IR System For Supersonic Boundary Layer Transition Flight Experiment

    NASA Technical Reports Server (NTRS)

    Banks, Daniel W.

    2008-01-01

    Infrared thermography is a preferred method for investigating transition in flight: it is global and non-intrusive, and it can also be used to visualize and characterize other fluid-mechanic phenomena such as shock impingement and separation. The F-15-based system was updated with a new camera and digital video recorder to support high-Reynolds-number transition tests. Digital recording improves image quality and analysis capability, allows accurate quantitative (temperature) measurements, and permits greater enhancement through image processing, enabling analysis of smaller-scale phenomena.

  19. Identification of Salmonella Typhimurium deubiquitinase SseL substrates by immunoaffinity enrichment and quantitative proteomic analysis

    DOE PAGES

    Nakayasu, Ernesto S.; Sydor, Michael A.; Brown, Roslyn N.; ...

    2015-07-06

    Ubiquitination is a key protein post-translational modification that regulates many important cellular pathways and whose levels are regulated by equilibrium between the activities of ubiquitin ligases and deubiquitinases. Here we present a method to identify specific deubiquitinase substrates based on treatment of cell lysates with recombinant enzymes, immunoaffinity purification, and global quantitative proteomic analysis. As a model system to identify substrates, we used a virulence-related deubiquitinase secreted by Salmonella enterica serovar Typhimurium into the host cells, SseL. By using this approach, two SseL substrates were identified in the RAW 264.7 murine macrophage-like cell line, S100A6 and heterogeneous nuclear ribonucleoprotein K, in addition to the previously reported K63-linked ubiquitin chains. These substrates were further validated by a combination of enzymatic and binding assays. Finally, this method can be used for the systematic identification of substrates of deubiquitinases from other organisms and applied to study their functions in physiology and disease.

  20. Identification of Salmonella Typhimurium deubiquitinase SseL substrates by immunoaffinity enrichment and quantitative proteomic analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakayasu, Ernesto S.; Sydor, Michael A.; Brown, Roslyn N.

    Ubiquitination is a key protein post-translational modification that regulates many important cellular pathways and whose levels are regulated by equilibrium between the activities of ubiquitin ligases and deubiquitinases. Here we present a method to identify specific deubiquitinase substrates based on treatment of cell lysates with recombinant enzymes, immunoaffinity purification, and global quantitative proteomic analysis. As a model system to identify substrates, we used a virulence-related deubiquitinase secreted by Salmonella enterica serovar Typhimurium into the host cells, SseL. By using this approach, two SseL substrates were identified in the RAW 264.7 murine macrophage-like cell line, S100A6 and heterogeneous nuclear ribonucleoprotein K, in addition to the previously reported K63-linked ubiquitin chains. These substrates were further validated by a combination of enzymatic and binding assays. Finally, this method can be used for the systematic identification of substrates of deubiquitinases from other organisms and applied to study their functions in physiology and disease.

  1. A PCR primer bank for quantitative gene expression analysis.

    PubMed

    Wang, Xiaowei; Seed, Brian

    2003-12-15

    Although gene expression profiling by microarray analysis is a useful tool for assessing global levels of transcriptional activity, variability associated with the data sets usually requires that observed differences be validated by some other method, such as real-time quantitative polymerase chain reaction (real-time PCR). However, non-specific amplification of non-target genes is frequently observed in the latter, confounding the analysis in approximately 40% of real-time PCR attempts when primer-specific labels are not used. Here we present an experimentally validated algorithm for the identification of transcript-specific PCR primers on a genomic scale that can be applied to real-time PCR with sequence-independent detection methods. An online database, PrimerBank, has been created for researchers to retrieve primer information for their genes of interest. PrimerBank currently contains 147 404 primers encompassing most known human and mouse genes. The primer design algorithm has been tested by conventional and real-time PCR for a subset of 112 primer pairs with a success rate of 98.2%.
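    Transcript specificity of the kind the primer-design algorithm in this record enforces can be illustrated with a naive uniqueness check: accept a primer pair only if both primers match exactly one transcript. This toy sketch uses exact substring matching over a two-entry hypothetical transcript set; the real algorithm works genome-wide with mismatch tolerance and thermodynamic criteria:

```python
def revcomp(seq):
    """Reverse complement of a DNA sequence."""
    comp = {"A": "T", "T": "A", "G": "C", "C": "G"}
    return "".join(comp[b] for b in reversed(seq))

def is_specific(forward, reverse, transcripts):
    """True if the primer pair hits exactly one transcript: the forward primer and
    the reverse-complemented reverse primer both occur in the same unique target."""
    hits = [name for name, seq in transcripts.items()
            if forward in seq and revcomp(reverse) in seq]
    return len(hits) == 1

# Toy transcript set (hypothetical sequences and gene names)
transcripts = {
    "geneA": "ATGGCGTACGTTAGCTAGCCGGATCCATTTGGA",
    "geneB": "ATGCCCGGGAAATTTCCCGGGTTTAAACCCGGG",
}
```

A pair whose primers both land only in geneA passes the check; a pair matching both transcripts fails, which is the failure mode that causes non-specific amplification in real-time PCR.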

  2. Graph pyramids for protein function prediction

    PubMed Central

    2015-01-01

    Background Uncovering the hidden organizational characteristics and regularities among biological sequences is the key issue for detailed understanding of an underlying biological phenomenon. Thus, pattern recognition from nucleic acid sequences is an important task for protein function prediction. As proteins from the same family exhibit similar characteristics, homology-based approaches predict protein functions via protein classification. But conventional classification approaches mostly rely on global features, considering only strong protein similarity matches. This leads to significant loss of prediction accuracy. Methods Here we construct the Protein-Protein Similarity (PPS) network, which captures the subtle properties of protein families. The proposed method considers the local as well as the global features, by examining the interactions among 'weakly interacting proteins' in the PPS network and by using hierarchical graph analysis via the graph pyramid. Different underlying properties of the protein families are uncovered by operating the proposed graph based features at various pyramid levels. Results Experimental results on benchmark data sets show that the proposed hierarchical voting algorithm using the graph pyramid helps to improve computational efficiency as well as the protein classification accuracy. Quantitatively, among 14,086 test sequences, on average the proposed method misclassified only 21.1 sequences, whereas the baseline BLAST-score-based global feature matching method misclassified 362.9 sequences. With each correctly classified test sequence, the fast incremental learning ability of the proposed method further enhances the training model. Thus it has achieved more than 96% protein classification accuracy using only 20% per class training data. PMID:26044522

  3. Quantitative measurement of phosphoproteome response to osmotic stress in arabidopsis based on Library-Assisted eXtracted Ion Chromatogram (LAXIC).

    PubMed

    Xue, Liang; Wang, Pengcheng; Wang, Lianshui; Renzi, Emily; Radivojac, Predrag; Tang, Haixu; Arnold, Randy; Zhu, Jian-Kang; Tao, W Andy

    2013-08-01

    Global phosphorylation changes in plants in response to environmental stress have been relatively poorly characterized to date. Here we introduce a novel mass spectrometry-based label-free quantitation method that facilitates systematic profiling of plant phosphoproteome changes with high efficiency and accuracy. This method employs synthetic peptide libraries tailored specifically as internal standards for complex phosphopeptide samples and, accordingly, a local normalization algorithm, LAXIC, which calculates phosphopeptide abundance normalized locally against co-eluting library peptides. Normalization was achieved in a small time frame centered on each phosphopeptide to compensate for the diverse ion suppression effect across retention time. The label-free LAXIC method was further combined with a linear regression function to accurately measure phosphoproteome responses to osmotic stress in Arabidopsis. Of the 2027 unique phosphopeptides identified and 1850 quantified in Arabidopsis samples, 468 regulated phosphopeptides representing 497 phosphosites showed significant changes. Several known and novel components in the abiotic stress pathway were identified, illustrating the capability of this method to identify critical signaling events amid dynamic and complex phosphorylation. Further assessment of those regulated proteins may help shed light on the phosphorylation response to osmotic stress in plants.
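    The local-normalization idea in this record, dividing each phosphopeptide's intensity by that of library peptides co-eluting within a small retention-time window, can be sketched as follows. The window width and intensities are invented, and the published LAXIC algorithm adds further weighting and the linear-regression step:

```python
def laxic_normalize(phospho, library, rt_window=1.0):
    """Normalize each phosphopeptide intensity by the mean intensity of
    library peptides eluting within +/- rt_window minutes of it.

    phospho: list of (retention_time, intensity) for sample phosphopeptides
    library: list of (retention_time, intensity) for spiked-in standards
    Returns locally normalized abundances (None if no library peptide co-elutes).
    """
    normalized = []
    for rt, intensity in phospho:
        local = [li for lrt, li in library if abs(lrt - rt) <= rt_window]
        normalized.append(intensity / (sum(local) / len(local)) if local else None)
    return normalized

# Hypothetical elution profile: library intensities drift with retention time,
# mimicking the ion-suppression effect the local window compensates for
library = [(10.0, 1000.0), (10.5, 1100.0), (20.0, 500.0), (20.4, 500.0)]
phospho = [(10.2, 2100.0), (20.2, 1000.0)]
```

Note that the two phosphopeptides have very different raw intensities but identical locally normalized abundances, which is exactly the suppression-compensating behavior described above.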

  4. A multi-centre evaluation of eleven clinically feasible brain PET/MRI attenuation correction techniques using a large cohort of patients.

    PubMed

    Ladefoged, Claes N; Law, Ian; Anazodo, Udunna; St Lawrence, Keith; Izquierdo-Garcia, David; Catana, Ciprian; Burgos, Ninon; Cardoso, M Jorge; Ourselin, Sebastien; Hutton, Brian; Mérida, Inés; Costes, Nicolas; Hammers, Alexander; Benoit, Didier; Holm, Søren; Juttukonda, Meher; An, Hongyu; Cabello, Jorge; Lukas, Mathias; Nekolla, Stephan; Ziegler, Sibylle; Fenchel, Matthias; Jakoby, Bjoern; Casey, Michael E; Benzinger, Tammie; Højgaard, Liselotte; Hansen, Adam E; Andersen, Flemming L

    2017-02-15

    To accurately quantify the radioactivity concentration measured by PET, emission data need to be corrected for photon attenuation; however, the MRI signal cannot easily be converted into attenuation values, making attenuation correction (AC) in PET/MRI challenging. In order to further improve the current vendor-implemented MR-AC methods for absolute quantification, a number of prototype methods have been proposed in the literature. These can be categorized into three types: template/atlas-based, segmentation-based, and reconstruction-based. These proposed methods in general demonstrated improvements compared to vendor-implemented AC, and many studies report deviations in PET uptake after AC of only a few percent from a gold-standard CT-AC. Using a unified quantitative evaluation with identical metrics, subject cohort, and common CT-based reference, the aims of this study were to evaluate a selection of novel methods proposed in the literature, and to identify the ones suitable for clinical use. In total, 11 AC methods were evaluated: two vendor-implemented (MR-ACDIXON and MR-ACUTE), five based on template/atlas information (MR-ACSEGBONE (Koesters et al., 2016), MR-ACONTARIO (Anazodo et al., 2014), MR-ACBOSTON (Izquierdo-Garcia et al., 2014), MR-ACUCL (Burgos et al., 2014), and MR-ACMAXPROB (Merida et al., 2015)), one based on simultaneous reconstruction of attenuation and emission (MR-ACMLAA (Benoit et al., 2015)), and three based on image segmentation (MR-ACMUNICH (Cabello et al., 2015), MR-ACCAR-RiDR (Juttukonda et al., 2015), and MR-ACRESOLUTE (Ladefoged et al., 2015)). We selected 359 subjects who were scanned using one of the following radiotracers: [18F]FDG (210), [11C]PiB (51), and [18F]florbetapir (98). The comparison to AC with a gold-standard CT was performed both globally and regionally, with a special focus on robustness and outlier analysis. 
The average performance in PET tracer uptake was within ±5% of CT for all of the proposed methods, with the average±SD global percentage bias in PET FDG uptake for each method being: MR-ACDIXON (-11.3±3.5)%, MR-ACUTE (-5.7±2.0)%, MR-ACONTARIO (-4.3±3.6)%, MR-ACMUNICH (3.7±2.1)%, MR-ACMLAA (-1.9±2.6)%, MR-ACSEGBONE (-1.7±3.6)%, MR-ACUCL (0.8±1.2)%, MR-ACCAR-RiDR (-0.4±1.9)%, MR-ACMAXPROB (-0.4±1.6)%, MR-ACBOSTON (-0.3±1.8)%, and MR-ACRESOLUTE (0.3±1.7)%, ordered by average bias. The overall best performing methods (MR-ACBOSTON, MR-ACMAXPROB, MR-ACRESOLUTE and MR-ACUCL, ordered alphabetically) showed regional average errors within ±3% of PET with CT-AC in all regions of the brain with FDG, and the same four methods, as well as MR-ACCAR-RiDR, showed that for 95% of the patients, 95% of brain voxels had an uptake that deviated by less than 15% from the reference. Comparable performance was obtained with PiB and florbetapir. All of the proposed novel methods have an average global performance within likely acceptable limits (±5% of the CT-based reference), and the main difference among the methods was found in the robustness, outlier analysis, and clinical feasibility. Overall, the best performing methods were MR-ACBOSTON, MR-ACMAXPROB, MR-ACRESOLUTE and MR-ACUCL, ordered alphabetically. These methods all minimized the number of outliers, the standard deviation, and the average global and local error. The methods MR-ACMUNICH and MR-ACCAR-RiDR were both within acceptable quantitative limits, so these methods should be considered if processing time is a factor. The method MR-ACSEGBONE also demonstrates promising results, and performs well within the likely acceptable quantitative limits. For clinical routine scans where processing time can be a key factor, this vendor-provided solution currently outperforms most methods. 
With the performance of the methods presented here, it may be concluded that the challenge of improving the accuracy of MR-AC in adult brains with normal anatomy has been solved to a quantitatively acceptable degree, with residual errors smaller than the quantification reproducibility of PET imaging. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
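
    The global percentage-bias metric quoted above is, in essence, the mean relative deviation of MR-AC uptake from the CT-AC reference over brain voxels. The sketch below illustrates that calculation on toy arrays; the array names, shapes, and the simulated ~2% underestimation are assumptions for illustration, not the study's pipeline:

```python
import numpy as np

def global_percent_bias(pet_mrac, pet_ctac, mask):
    """Mean and SD of the relative deviation (%) of MR-AC uptake from
    the CT-AC reference, over voxels inside a brain mask (sketch)."""
    ref = pet_ctac[mask]
    dev = 100.0 * (pet_mrac[mask] - ref) / ref
    return dev.mean(), dev.std()

# Toy example: a 'method' that underestimates uptake by about 2%.
rng = np.random.default_rng(1)
ct = rng.uniform(5.0, 10.0, size=(32, 32, 32))          # reference uptake
mr = ct * rng.normal(0.98, 0.01, size=ct.shape)         # biased MR-AC uptake
mask = np.ones(ct.shape, dtype=bool)
bias, sd = global_percent_bias(mr, ct, mask)
```

    On real data the mask would restrict the comparison to brain voxels, and the same statistic computed per anatomical region yields the regional errors reported above.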

  5. Simple and ultra-fast recognition and quantitation of compounded monoclonal antibodies: Application to flow injection analysis combined to UV spectroscopy and matching method.

    PubMed

    Jaccoulet, E; Schweitzer-Chaput, A; Toussaint, B; Prognon, P; Caudron, E

    2018-09-01

    Compounding of monoclonal antibodies (mAbs) is constantly increasing in hospitals. Quality control (QC) of the compounded mAbs, based on quantification and identification, is required to prevent potential errors, and a fast method is needed to manage outpatient chemotherapy administration. A simple and ultra-fast (less than 30 s) method using flow injection analysis associated with a least-squares matching method from the analyzer software was performed and evaluated for the routine hospital QC of three compounded mAbs: bevacizumab, infliximab and rituximab. The method was evaluated through qualitative and quantitative parameters. Preliminary analysis of the UV absorption and second-derivative spectra of the mAbs allowed us to adapt analytical conditions according to the therapeutic range of each mAb. In terms of quantitative QC, linearity, accuracy and precision were assessed as specified in ICH guidelines. Very satisfactory recovery was achieved, and the RSD (%) of the intermediate precision was less than 1.1%. Qualitative analytical parameters were also evaluated in terms of specificity, sensitivity and global precision through a confusion matrix. Results proved to be concentration- and mAb-dependent, and excellent (100%) specificity and sensitivity were reached within a specific concentration range. Finally, routine application to "real life" samples (n = 209) from different batches of the three mAbs complied with the specifications of the quality control, i.e. excellent identification (100%) and measured concentrations within ±15% of target inside the calibration range. The successful combination of second-derivative spectroscopy and partial least-squares matching demonstrated the interest of FIA for the ultra-fast QC of mAbs after compounding. Copyright © 2018 Elsevier B.V. All rights reserved.
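
    Identification performance of this kind is typically summarized per class from a confusion matrix. A minimal sketch follows; the matrix entries are hypothetical values chosen for illustration (they are not the study's results), with rows as true identities and columns as predicted identities:

```python
import numpy as np

def per_class_sensitivity_specificity(confusion, labels):
    """Per-class sensitivity (recall) and specificity from a confusion
    matrix whose rows are true classes and columns predicted classes."""
    confusion = np.asarray(confusion, dtype=float)
    total = confusion.sum()
    out = {}
    for i, lab in enumerate(labels):
        tp = confusion[i, i]
        fn = confusion[i].sum() - tp          # missed members of class i
        fp = confusion[:, i].sum() - tp       # other classes called i
        tn = total - tp - fn - fp
        out[lab] = (tp / (tp + fn), tn / (tn + fp))
    return out

# Hypothetical identification results for three mAb preparations.
labels = ["bevacizumab", "infliximab", "rituximab"]
cm = [[70, 0, 0],
      [1, 68, 0],
      [0, 0, 70]]
metrics = per_class_sensitivity_specificity(cm, labels)
```

    With a perfectly diagonal matrix, every class reaches the 100% sensitivity and specificity reported within the validated concentration range.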

  6. A novel feature-tracking echocardiographic method for the quantitation of regional myocardial function: validation in an animal model of ischemia-reperfusion.

    PubMed

    Pirat, Bahar; Khoury, Dirar S; Hartley, Craig J; Tiller, Les; Rao, Liyun; Schulz, Daryl G; Nagueh, Sherif F; Zoghbi, William A

    2008-02-12

    The aim of this study was to validate a novel, angle-independent, feature-tracking method for the echocardiographic quantitation of regional function. A new echocardiographic method, Velocity Vector Imaging (VVI) (syngo Velocity Vector Imaging technology, Siemens Medical Solutions, Ultrasound Division, Mountain View, California), has been introduced, based on feature tracking that incorporates speckle and endocardial border tracking and allows the quantitation of endocardial strain, strain rate (SR), and velocity. Seven dogs were studied at baseline and during various interventions causing alterations in regional function: dobutamine, a 5-min coronary occlusion with reperfusion up to 1 h, followed by dobutamine and esmolol infusions. Echocardiographic images were acquired from short- and long-axis views of the left ventricle. Segment-length sonomicrometry crystals were used as the reference method. Changes in systolic strain in ischemic segments were tracked well with VVI during the different states of regional function. There was a good correlation between circumferential and longitudinal systolic strain by VVI and sonomicrometry (r = 0.88 and r = 0.83, respectively, p < 0.001). Strain measurements in the nonischemic basal segments also demonstrated a significant correlation between the 2 methods (r = 0.65, p < 0.001). Similarly, a significant relation was observed for circumferential and longitudinal SR between the 2 methods (r = 0.94, p < 0.001 and r = 0.90, p < 0.001, respectively). The relation of endocardial velocity to changes in strain by sonomicrometry was weaker, owing to significant cardiac translation. Velocity Vector Imaging, a new feature-tracking method, can accurately assess regional myocardial function at the endocardial level and is a promising clinical tool for the simultaneous quantification of regional and global myocardial function.

  7. Quantitative measures detect sensory and motor impairments in multiple sclerosis

    PubMed Central

    Newsome, Scott D.; Wang, Joseph I.; Kang, Jonathan Y.; Calabresi, Peter A.; Zackowski, Kathleen M.

    2011-01-01

    Background Sensory and motor dysfunction in multiple sclerosis (MS) is often assessed with rating scales which rely heavily on clinical judgment. Quantitative devices may be more precise than rating scales. Objective To quantify lower extremity sensorimotor measures in individuals with MS, evaluate the extent to which they can detect functional systems impairments, and determine their relationship to global disability measures. Methods We tested 145 MS subjects and 58 controls. Vibration thresholds were quantified using a Vibratron-II device. Strength was quantified by a hand-held dynamometer. We also recorded the Expanded Disability Status Scale (EDSS) and timed 25-foot walk (T25FW). T-tests and Wilcoxon rank-sum tests were used to compare group data. Spearman correlations were used to assess relationships between each measure. We also used a stepwise linear regression model to determine how much the quantitative measures explain the variance in the respective functional systems scores (FSS). Results EDSS scores ranged from 0 to 7.5, mean disease duration was 10.4±9.6 years, and 66% of subjects were female. In RRMS, but not progressive MS, poorer vibration sensation correlated with a worse EDSS score, whereas the progressive groups' ankle/hip strength changed significantly with EDSS progression. Notably, not only did sensorimotor measures significantly correlate with global disability measures (EDSS), but they had improved sensitivity, detecting impairments in up to 32% of MS subjects with normal sensory FSS. Conclusions Sensory and motor deficits can be quantified using clinically accessible tools and distinguish differences among MS subtypes. We show that quantitative sensorimotor measures are more sensitive than FSS from the EDSS. These tools have the potential to be used as clinical outcome measures in practice and in future MS clinical trials of neurorehabilitative and neuroreparative interventions. PMID:21458828

  8. Integrating global health with medical education.

    PubMed

    Aulakh, Alex; Tweed, Sam; Moore, Jolene; Graham, Wendy

    2017-04-01

    Globalisation has implications for the next generation of doctors, and thus for medical education. Increasingly, global health is being taught in medical schools, although its incorporation into an already full curriculum presents challenges. Global health was introduced into the MBChB curriculum at the University of Aberdeen through a student-selected component (SSC) as part of an existing medical humanities block. The Global Health and Humanities (GHH) module was first delivered in the autumn of 2013 and will shortly enter its third year. This student-led study used quantitative and qualitative methods to assess the module's appropriateness and effectiveness for strengthening learning on global health, consisting of online surveys for course participants and semi-structured interviews with faculty members. Integrating global health into the undergraduate medical curriculum by way of an SSC was regarded by teaching staff as an effective and realistic approach. A recognised strength of delivering global health as part of the medical humanities block was the opportunity to expose students to the social determinants of health through interdisciplinary teaching. Participating students all agreed that the learning approach strengthened both their knowledge of global health and a range of generic skills. SSCs are, by definition, self-selecting, and will tend to attract students who already have an interest in the topic - here, global health. A wide range of learning opportunities is needed to integrate global health throughout medical curricula, and to reach all students. © 2016 John Wiley & Sons Ltd.

  9. Social change, globalization and transcultural psychiatry--some considerations from a study on women and depression.

    PubMed

    Dech, Heike; Ndetei, David M; Machleidt, Wieland

    2003-01-01

    Transcultural psychiatry, of which Emil Kraepelin is considered the scientific founder, has in its 100 years of tradition not only developed a varied range of methods but has also seen changes in its scientific questions as well as in related research and clinical applications. Whereas transcultural research on the psychopathology of depression contributed to the further development of psychiatric nosology, transcultural psychiatry has recently been increasingly confronted with phenomena of social change and globalization. One region where such conditions can be observed in particular is Africa, where the dissolution of traditional norms and support systems and growing economic insecurity place a considerable burden especially on women. As an example, results from a cross-sectional study of East African women, using a two-step design as well as qualitative and quantitative standardized psychiatric methods, are discussed with respect to the association between social change, psychosocial risk factors and the development of depressive disorders. Efficient clinical methods for the diagnosis and treatment of new risk groups will have to be developed, of which an important aspect will be crisis intervention.

  10. Late Eocene to early Oligocene quantitative paleotemperature record: Evidence from continental halite fluid inclusions

    PubMed Central

    Zhao, Yan-jun; Zhang, Hua; Liu, Cheng-lin; Liu, Bao-kun; Ma, Li-chun; Wang, Li-cheng

    2014-01-01

    Climate changes within Cenozoic extreme climate events such as the Paleocene–Eocene Thermal Maximum and the First Oligocene Glacial provide good opportunities to estimate global climate trends for the present and future. However, quantitative paleotemperature data for Cenozoic climatic reconstruction are still lacking, hindering a better understanding of past and future climate conditions. In this contribution, quantitative paleotemperatures were determined from fluid inclusion homogenization temperature (Th) data from continental halite of the first member of the Shahejie Formation (SF1; probably late Eocene to early Oligocene) in Bohai Bay Basin, North China. The primary textures of the SF1 halite, typified by cumulate and chevron halite, suggest that the halite was deposited in shallow saline water and that halite Th can serve as a temperature proxy. In total, 121 Th measurements from primary, single-phase aqueous fluid inclusions at different depths were acquired by the cooling nucleation method. The results show that all Th range from 17.7°C to 50.7°C, with maximum homogenization temperatures (ThMAX) of 50.5°C at a depth of 3028.04 m and 50.7°C at 3188.61 m, respectively. Both ThMAX values presented here are significantly higher than the highest temperature recorded in this region since 1954 and agree with global temperature models for the year 2100 predicted by the Intergovernmental Panel on Climate Change. PMID:25047483

  11. The place of words and numbers in psychiatric research.

    PubMed

    Falissard, Bruno; Révah, Anne; Yang, Suzanne; Fagot-Largeault, Anne

    2013-11-18

    In recent decades, there has been widespread debate in the human and social sciences regarding the compatibility and the relative merits of quantitative and qualitative approaches in research. In psychiatry, depending on disciplines and traditions, objects of study can be represented either in words or using two types of mathematization. In the latter case, the use of mathematics in psychiatry is most often only local, as opposed to global as in the case of classical mechanics. Relationships between these objects of study can in turn be explored in three different ways: 1/ by a hermeneutic process, 2/ using statistics, the most frequent method in psychiatric research today, 3/ using equations, i.e. using mathematical relationships that are formal and deterministic. The 3 ways of representing entities (with language, locally with mathematics or globally with mathematics) and the 3 ways of expressing the relationships between entities (using hermeneutics, statistics or equations) can be combined in a cross-tabulation, and nearly all nine combinations can be described using examples. A typology of this nature may be useful in assessing which epistemological perspectives are currently dominant in a constantly evolving field such as psychiatry, and which other perspectives still need to be developed. It also contributes to undermining the overly simplistic and counterproductive beliefs that accompany the assumption of a Manichean "quantitative/qualitative" dichotomy. Systematic examination of this set of typologies could be useful in indicating new directions for future research beyond the quantitative/qualitative divide.

  13. Building global models for fat and total protein content in raw milk based on historical spectroscopic data in the visible and short-wave near infrared range.

    PubMed

    Melenteva, Anastasiia; Galyanin, Vladislav; Savenkova, Elena; Bogomolov, Andrey

    2016-07-15

    A large set of fresh cow milk samples, collected from many suppliers over a large geographical area in Russia during a year, has been analyzed by optical spectroscopy in the range 400-1100 nm in accordance with a previously developed scatter-based technique. The global (i.e. resistant to seasonal, genetic, regional and other variations of the milk composition) models for fat and total protein content, which were built using partial least-squares (PLS) regression, exhibit satisfactory prediction performances enabling their practical application in the dairy industry. The root mean-square errors of prediction (RMSEP) were 0.09 and 0.10 for fat and total protein content, respectively. The issues of raw milk analysis and multivariate modelling based on historical spectroscopic data have been considered, and approaches to the creation of global models and their transfer between instruments have been proposed. Availability of global models should significantly facilitate the dissemination of optical spectroscopic methods for laboratory and in-line quantitative milk analysis. Copyright © 2016. Published by Elsevier Ltd.
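
    A PLS calibration of this kind can be sketched with a small single-response NIPALS implementation. The synthetic "spectra" below are stand-ins with an assumed latent-factor structure, not the paper's milk data, and the RMSEP computed here is an in-sample illustration rather than a true prediction error:

```python
import numpy as np

def pls1_fit(X, y, n_comp):
    """Single-response PLS via the NIPALS algorithm (minimal sketch)."""
    xm, ym = X.mean(axis=0), y.mean()
    Xr, yr = X - xm, y - ym
    W, P, Q = [], [], []
    for _ in range(n_comp):
        w = Xr.T @ yr                       # weight from X-y covariance
        w /= np.linalg.norm(w)
        t = Xr @ w                          # scores
        tt = t @ t
        p = Xr.T @ t / tt                   # X loadings
        q = (yr @ t) / tt                   # y loading
        Xr = Xr - np.outer(t, p)            # deflate
        yr = yr - q * t
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    beta = W @ np.linalg.solve(P.T @ W, Q)  # coefficients in X space
    return xm, ym, beta

def pls1_predict(model, X):
    xm, ym, beta = model
    return (X - xm) @ beta + ym

# Synthetic 'spectra': 120 samples built from three latent factors
# that also determine the analyte concentration y.
rng = np.random.default_rng(0)
scores = rng.normal(size=(120, 3))
loadings = rng.normal(size=(3, 60))
X = scores @ loadings + 0.01 * rng.normal(size=(120, 60))
y = scores @ np.array([1.0, -2.0, 0.5])

model = pls1_fit(X, y, n_comp=3)
rmsep = np.sqrt(np.mean((pls1_predict(model, X) - y) ** 2))
```

    Building a "global" model in the paper's sense amounts to fitting such a calibration on samples spanning seasonal, genetic, and regional variation, so the latent factors capture composition rather than nuisance variation.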

  14. Gender counts: A systematic review of evaluations of gender-integrated health interventions in low- and middle-income countries.

    PubMed

    Schriver, Brittany; Mandal, Mahua; Muralidharan, Arundati; Nwosu, Anthony; Dayal, Radhika; Das, Madhumita; Fehringer, Jessica

    2017-11-01

    As a result of new global priorities, there is a growing need for high-quality evaluations of gender-integrated health programmes. This systematic review examined 99 peer-reviewed articles on evaluations of gender-integrated (accommodating and transformative) health programmes with regard to their theory of change (ToC), study design, gender integration in data collection and analysis, and gender measures used. Half of the evaluations explicitly described a ToC or conceptual framework (n = 50) that guided strategies for their interventions. Over half (61%) of the evaluations used quantitative methods exclusively; 11% used qualitative methods exclusively; and 28% used mixed methods. Qualitative methods were not commonly detailed. Evaluations of transformative interventions were less likely than those of accommodating interventions to employ randomised controlled trials. Two-thirds of the reviewed evaluations reported including at least one specific gender-related outcome (n = 18 accommodating, n = 44 transformative). To strengthen evaluations of gender-integrated programmes, we recommend the use of ToCs, explicitly including gender in the ToC, use of gender-sensitive measures, mixed-method designs, in-depth descriptions of qualitative methods, and attention to gender-related factors in data collection logistics. We also recommend further research to develop valid and reliable gender measures that are globally relevant.

  15. On determining the point of no return in climate change

    NASA Astrophysics Data System (ADS)

    van Zalinge, Brenda C.; Feng, Qing Yi; Aengenheyster, Matthias; Dijkstra, Henk A.

    2017-08-01

    Earth's global mean surface temperature has increased by about 1.0 °C over the period 1880-2015. One of the main causes is thought to be the increase in atmospheric greenhouse gases. If greenhouse gas emissions are not substantially decreased, several studies indicate that there will be a dangerous anthropogenic interference with climate by the end of this century. However, there is no good quantitative measure to determine when it is too late to start reducing greenhouse gas emissions in order to avoid such dangerous interference. In this study, we develop a method for determining a so-called point of no return for several greenhouse gas emission scenarios. The method is based on a combination of aspects of stochastic viability theory and linear response theory; the latter is used to estimate the probability density function of the global mean surface temperature. The innovative element in this approach is the applicability to high-dimensional climate models as demonstrated by the results obtained with the PlaSim model.
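
    As a toy illustration of the linear-response idea, the mean temperature response to a forcing scenario can be written as a convolution with a Green's function, and an exceedance probability then read off from an assumed Gaussian spread. Everything below (the response timescale, sensitivity, noise level, and forcing scenario) is an invented assumption for illustration, not a PlaSim result:

```python
import math

def mean_response(forcing, dt, tau=30.0, sens=0.8):
    """Linear-response mean temperature anomaly: discretized solution of
    dT/dt = (sens * F(t) - T) / tau, i.e. convolution of the forcing
    with an exponential Green's function (toy model)."""
    out, state = [], 0.0
    for f in forcing:
        state += dt * (sens * f - state) / tau
        out.append(state)
    return out

def prob_exceed(mean, sigma, threshold):
    """P(T > threshold) for a Gaussian with the given mean and spread."""
    z = (threshold - mean) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# Toy scenario: forcing ramps linearly for 100 years, then is held fixed.
dt = 1.0
forcing = [min(t / 100.0, 1.0) * 5.0 for t in range(300)]
temps = mean_response(forcing, dt)
p_end = prob_exceed(temps[-1], sigma=0.3, threshold=2.0)
```

    In this framing, a point of no return is the last time at which switching to a mitigation scenario still keeps the exceedance probability below a chosen tolerance, which is where the viability-theory ingredient of the paper's method enters.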

  16. Quantitative Simulation of QARBM Challenge Events During Radiation Belt Enhancements

    NASA Astrophysics Data System (ADS)

    Li, W.; Ma, Q.; Thorne, R. M.; Bortnik, J.; Chu, X.

    2017-12-01

    Various physical processes are known to affect energetic electron dynamics in the Earth's radiation belts, but their quantitative effects at different times and locations in space need further investigation. This presentation focuses on the quantitative roles of the various physical processes that affect Earth's radiation belt electron dynamics during the radiation belt enhancement challenge events (storm-time vs. non-storm-time) selected by the GEM Quantitative Assessment of Radiation Belt Modeling (QARBM) focus group. We construct realistic global distributions of whistler-mode chorus waves, adopt various versions of radial diffusion models (statistical and event-specific), and use the global evolution of other potentially important plasma waves, including plasmaspheric hiss, magnetosonic waves, and electromagnetic ion cyclotron waves, from all available multi-satellite measurements. These state-of-the-art wave properties and distributions on a global scale are used to calculate diffusion coefficients, which are then adopted as inputs to simulate the dynamical electron evolution using a 3D diffusion simulation during the storm-time and non-storm-time acceleration events, respectively. We explore the similarities and differences in the dominant physical processes that cause radiation belt electron dynamics during the storm-time and non-storm-time acceleration events. The quantitative role of each physical process is determined by comparing against the Van Allen Probes electron observations at different energies, pitch angles, and L-MLT regions. This quantitative comparison further indicates instances when quasilinear theory is sufficient to explain the observed electron dynamics, and when nonlinear interactions are required to reproduce the energetic electron evolution observed by the Van Allen Probes.
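
    The radial-diffusion component of such simulations evolves the phase space density f according to ∂f/∂t = L² ∂/∂L (D_LL/L² ∂f/∂L). Below is a minimal explicit finite-difference sketch; the grid, the constant prefactor in the D_LL ∝ L¹⁰ scaling (real parameterizations are activity-dependent), the time step, and the fixed boundary values are illustrative assumptions, not the presentation's actual model:

```python
import numpy as np

def radial_diffusion(f, L, dt, steps, d0=1e-8):
    """Explicit flux-form integration of df/dt = L^2 d/dL (D_LL/L^2 df/dL)
    with D_LL = d0 * L**10 and fixed (Dirichlet) boundary values."""
    dL = L[1] - L[0]
    Lh = 0.5 * (L[:-1] + L[1:])           # half-grid points
    coef = d0 * Lh**10 / Lh**2            # D_LL / L^2 at half points
    f = f.copy()
    for _ in range(steps):
        flux = coef * np.diff(f) / dL     # (D_LL/L^2) df/dL at half points
        f[1:-1] += dt * L[1:-1]**2 * np.diff(flux) / dL
    return f

L = np.linspace(3.0, 7.0, 41)
f0 = np.zeros_like(L)
f0[-1] = 1.0                              # outer-boundary source
f1 = radial_diffusion(f0, L, dt=1e-3, steps=2000)
```

    Because D_LL grows steeply with L, the outer-boundary source diffuses inward quickly at high L and much more slowly at low L, which is the qualitative behavior these simulations combine with local wave-particle diffusion in energy and pitch angle.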

  17. 3-D DNA methylation phenotypes correlate with cytotoxicity levels in prostate and liver cancer cell models

    PubMed Central

    2013-01-01

    Background The spatial organization of the genome is being evaluated as a novel indicator of toxicity in conjunction with drug-induced global DNA hypomethylation and concurrent chromatin reorganization. 3D quantitative DNA methylation imaging (3D-qDMI) was applied as a cell-by-cell high-throughput approach to investigate this matter by assessing genome topology through the immunofluorescent nuclear distribution patterns of 5-methylcytosine (MeC) and global DNA (4′,6-diamidino-2-phenylindole = DAPI) in labeled nuclei. Methods Differential progression of global DNA hypomethylation was studied by comparatively dosing zebularine (ZEB) and 5-azacytidine (AZA). Treated and untreated (control) human prostate and liver cancer cells were subjected to confocal scanning microscopy and dedicated 3D image analysis for the following features: differential nuclear MeC/DAPI load and codistribution patterns, cell similarity based on these patterns, and corresponding differences in the topology of low-intensity MeC (LIM) and low-intensity DAPI (LID) sites. Results Both agents generated a high fraction of similar MeC phenotypes across applied concentrations. ZEB exerted similar effects at 10–100-fold higher drug concentrations than its AZA analogue: concentration-dependent progression of global cytosine demethylation, validated by measuring differential MeC levels in repeat sequences using MethyLight, and a concurrent increase in nuclear LIM densities correlated with cellular growth reduction and cytotoxicity. Conclusions 3D-qDMI demonstrated the capability of quantitating dose-dependent, drug-induced spatial progression of DNA demethylation in cell nuclei, independent of interphase cell-cycle stages and in conjunction with cytotoxicity. The results support the notion of DNA methylation topology being considered as a potential indicator of causal impacts on chromatin distribution, with a conceivable application in epigenetic drug toxicology. PMID:23394161

  18. Challenging the in-vivo assessment of biomechanical properties of the uterine cervix: A critical analysis of ultrasound based quasi-static procedures.

    PubMed

    Maurer, M M; Badir, S; Pensalfini, M; Bajka, M; Abitabile, P; Zimmermann, R; Mazza, E

    2015-06-25

    Measuring the stiffness of the uterine cervix might be useful in the prediction of preterm delivery, a still unsolved health issue of global dimensions. Recently, a number of clinical studies have addressed this topic, proposing quantitative methods for the assessment of the mechanical properties of the cervix. Quasi-static elastography, maximum compressibility using ultrasound and aspiration tests have been applied for this purpose. The results obtained with the different methods seem to provide contradictory information about the physiologic development of cervical stiffness during pregnancy. Simulations and experiments were performed in order to rationalize the findings obtained with ultrasound based, quasi-static procedures. The experimental and computational results clearly illustrate that standardization of quasi-static elastography leads to repeatable strain values, but for different loading forces. Since force cannot be controlled, this current approach does not allow the distinction between a globally soft and stiff cervix. It is further shown that introducing a reference elastomer into the elastography measurement might overcome the problem of force standardization, but a careful mechanical analysis is required to obtain reliable stiffness values for cervical tissue. In contrast, the maximum compressibility procedure leads to a repeatable, semi-quantitative assessment of cervical consistency, due to the nonlinear nature of the mechanical behavior of cervical tissue. The evolution of cervical stiffness in pregnancy obtained with this procedure is in line with data from aspiration tests. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. Reflection imaging of China ink-perfused brain vasculature using confocal laser-scanning microscopy after clarification of brain tissue by the Spalteholz method.

    PubMed

    Gutierre, R C; Vannucci Campos, D; Mortara, R A; Coppi, A A; Arida, R M

    2017-04-01

    Confocal laser-scanning microscopy is a useful tool for visualizing neurons and glia in transparent preparations of brain tissue from laboratory animals. Currently, imaging capillaries and venules in transparent brain tissues requires the use of fluorescent proteins. Here, we show that vessels can be imaged by confocal laser-scanning microscopy in transparent cortical, hippocampal and cerebellar preparations after clarification of China ink-injected specimens by the Spalteholz method. This method may be suitable for global, three-dimensional, quantitative analyses of vessels, including stereological estimations of total volume and length and of surface area of vessels, which constitute indirect approaches to investigate angiogenesis. © 2017 Anatomical Society.

  20. Global-Mindedness and Intercultural Competence: A Quantitative Study of Pre-Service Teachers

    ERIC Educational Resources Information Center

    Cui, Qi

    2013-01-01

    This study assessed pre-service teachers' levels of global-mindedness and intercultural competence using the Global-Mindedness Scale (GMS) and the Cultural Intelligence Scale (CQS) and investigated the correlation between the two. The study examined whether the individual scale factors such as gender, perceived competence in non-native language or…

  1. Non-biased and efficient global amplification of a single-cell cDNA library

    PubMed Central

    Huang, Huan; Goto, Mari; Tsunoda, Hiroyuki; Sun, Lizhou; Taniguchi, Kiyomi; Matsunaga, Hiroko; Kambara, Hideki

    2014-01-01

    Analysis of single-cell gene expression promises a more precise understanding of molecular mechanisms of a living system. Most techniques only allow studies of the expression of limited numbers of gene species. When cDNA amplification was carried out to analyse more genes, amplification biases were frequently reported. A non-biased and efficient global-amplification method, which uses a single-cell cDNA library immobilized on beads, was developed for analysing entire gene expression profiles of single cells. Every step in this analysis, from reverse transcription to cDNA amplification, was optimized. By degrading and removing excess primers, bias due to the digestion of cDNA was prevented. Since residual reagents, which affect the efficiency of each subsequent reaction, could be removed by washing the beads, conditions for uniform and maximized amplification of cDNAs were achieved. The differences in amplification rates for eight randomly selected genes were within 1.5-fold, which is negligible for most applications of single-cell analysis. The global amplification yields a large amount of amplified cDNA (>100 μg) from a single cell (2 pg mRNA), an amount sufficient for downstream analysis. The proposed global-amplification method was used to quantitatively analyse transcript ratios of multiple cDNA targets (from several copies to several thousand copies). PMID:24141095

  2. Climate change and dengue: a critical and systematic review of quantitative modelling approaches

    PubMed Central

    2014-01-01

    Background Many studies have found associations between climatic conditions and dengue transmission. However, there is a debate about the future impacts of climate change on dengue transmission. This paper reviewed epidemiological evidence on the relationship between climate and dengue, with a focus on quantitative methods for assessing the potential impacts of climate change on global dengue transmission. Methods A literature search was conducted in October 2012, using the electronic databases PubMed, Scopus, ScienceDirect, ProQuest, and Web of Science. The search focused on peer-reviewed journal articles published in English from January 1991 through October 2012. Results Sixteen studies met the inclusion criteria, and most showed that the transmission of dengue is highly sensitive to climatic conditions, especially temperature, rainfall and relative humidity. Studies on the potential impacts of climate change on dengue indicate increased climatic suitability for transmission and an expansion of the geographic regions at risk during this century. A variety of quantitative modelling approaches were used in the studies. Several key methodological issues and current knowledge gaps were identified through this review. Conclusions Assembling spatio-temporal patterns of dengue transmission compatible with long-term data on climate and other socio-ecological changes would advance projections of dengue risks associated with climate change. PMID:24669859

  3. Recent Achievements in Characterizing the Histone Code and Approaches to Integrating Epigenomics and Systems Biology.

    PubMed

    Janssen, K A; Sidoli, S; Garcia, B A

    2017-01-01

    Functional epigenetic regulation occurs by dynamic modification of chromatin, including genetic material (i.e., DNA methylation), histone proteins, and other nuclear proteins. Due to the highly complex nature of the histone code, mass spectrometry (MS) has become the leading technique for identification of single and combinatorial histone modifications. MS has now surpassed antibody-based strategies owing to its automation, high resolution, and accurate quantitation. Moreover, multiple analytical approaches have been developed for global quantitation of posttranslational modifications (PTMs), including large-scale characterization of modification coexistence (middle-down and top-down proteomics), which is not currently possible with any other biochemical strategy. Recently, our group and others have simplified and increased the effectiveness of analyzing histone PTMs by improving multiple MS methods and data analysis tools. This review provides an overview of the major achievements in the analysis of histone PTMs using MS, with a focus on the most recent improvements. We speculate that the current state-of-the-art workflow for histone analysis is highly reliable in terms of identification and quantitation accuracy, and it has the potential to become a routine method for systems biology thanks to the possibility of integrating histone MS results with genomics and proteomics datasets. © 2017 Elsevier Inc. All rights reserved.

  4. An Integrated Strategy for Global Qualitative and Quantitative Profiling of Traditional Chinese Medicine Formulas: Baoyuan Decoction as a Case

    NASA Astrophysics Data System (ADS)

    Ma, Xiaoli; Guo, Xiaoyu; Song, Yuelin; Qiao, Lirui; Wang, Wenguang; Zhao, Mingbo; Tu, Pengfei; Jiang, Yong

    2016-12-01

    Clarification of the chemical composition of traditional Chinese medicine formulas (TCMFs) is a challenge due to the variety of structures and the complexity of plant matrices. Herein, an integrated strategy was developed by hyphenating ultra-performance liquid chromatography (UPLC), quadrupole time-of-flight (Q-TOF) mass spectrometry, hybrid triple quadrupole-linear ion trap mass spectrometry (Qtrap-MS), and the novel post-acquisition data processing software UNIFI to achieve automatic, rapid, accurate, and comprehensive qualitative and quantitative analysis of the chemical components in TCMFs. As a proof of concept, chemical profiling was performed on Baoyuan decoction (BYD), an ancient TCMF clinically used for the treatment of coronary heart disease that consists of Ginseng Radix et Rhizoma, Astragali Radix, Glycyrrhizae Radix et Rhizoma Praeparata Cum Melle, and Cinnamomi Cortex. As many as 236 compounds were plausibly or unambiguously identified, and 175 compounds were quantified or relatively quantified by the scheduled multiple reaction monitoring (sMRM) method. The findings demonstrate that this strategy, integrating the rapidity of UNIFI software, the efficiency of UPLC, the accuracy of Q-TOF-MS, and the sensitivity and quantitation ability of Qtrap-MS, provides a method for efficient and comprehensive chemome characterization and quality control of complex TCMFs.

  5. QuantFusion: Novel Unified Methodology for Enhanced Coverage and Precision in Quantifying Global Proteomic Changes in Whole Tissues.

    PubMed

    Gunawardena, Harsha P; O'Brien, Jonathon; Wrobel, John A; Xie, Ling; Davies, Sherri R; Li, Shunqiang; Ellis, Matthew J; Qaqish, Bahjat F; Chen, Xian

    2016-02-01

    Single quantitative platforms such as label-based or label-free quantitation (LFQ) present compromises in accuracy, precision, protein sequence coverage, and speed of quantifiable proteomic measurements. To maximize the quantitative precision and the number of quantifiable proteins, or the quantifiable coverage of tissue proteomes, we have developed a unified approach, termed QuantFusion, that combines the quantitative ratios of all peptides measured by both LFQ and label-based methodologies. Here, we demonstrate the use of QuantFusion in determining the proteins differentially expressed in a pair of patient-derived tumor xenografts (PDXs) representing two major breast cancer (BC) subtypes, basal and luminal. Label-based in-spectra quantitative peptides derived from amino acid-coded tagging (AACT, also known as SILAC) of a non-malignant mammary cell line were uniformly added to each xenograft at a constant predefined ratio, from which Ratio-of-Ratio estimates were obtained for the label-free peptides paired with AACT peptides in each PDX tumor. A mixed model statistical analysis was used to determine global differential protein expression by combining complementary quantifiable peptide ratios measured by LFQ and Ratio-of-Ratios, respectively. With the minimum number of replicates required to obtain statistically significant ratios, QuantFusion uses these distinct mechanisms to "rescue" the missing data inherent to both LFQ and label-based quantitation. Combining quantifiable peptide data from both schemes increased the overall number of peptide-level measurements and protein-level estimates. In our analysis of the PDX tumor proteomes, QuantFusion increased the number of distinct peptide ratios by 65%, representing differentially expressed proteins between the BC subtypes.
This improvement in quantifiable coverage, in turn, not only increased the number of measurable protein fold-changes by 8% but also increased the average precision of the quantitative estimates by 181%, so that some BC subtype-specific proteins were rescued by QuantFusion. Thus, incorporating data from multiple quantitative approaches while accounting for measurement variability at both the peptide and global protein levels makes QuantFusion unique in obtaining increased coverage and quantitative precision for tissue proteomes. © 2016 by The American Society for Biochemistry and Molecular Biology, Inc.

  6. Graph pyramids for protein function prediction.

    PubMed

    Sandhan, Tushar; Yoo, Youngjun; Choi, Jin; Kim, Sun

    2015-01-01

    Uncovering the hidden organizational characteristics and regularities among biological sequences is the key issue for detailed understanding of an underlying biological phenomenon. Pattern recognition from nucleic acid sequences is thus an important task for protein function prediction. As proteins from the same family exhibit similar characteristics, homology-based approaches predict protein functions via protein classification. But conventional classification approaches mostly rely on global features, considering only strong protein similarity matches. This leads to significant loss of prediction accuracy. Here we construct the Protein-Protein Similarity (PPS) network, which captures the subtle properties of protein families. The proposed method considers local as well as global features, by examining the interactions among 'weakly interacting proteins' in the PPS network and by using hierarchical graph analysis via the graph pyramid. Different underlying properties of the protein families are uncovered by operating the proposed graph-based features at various pyramid levels. Experimental results on benchmark data sets show that the proposed hierarchical voting algorithm using the graph pyramid improves computational efficiency as well as protein classification accuracy. Quantitatively, among 14,086 test sequences, the proposed method misclassified on average only 21.1 sequences, whereas the baseline BLAST-score-based global feature matching method misclassified 362.9 sequences. With each correctly classified test sequence, the fast incremental learning ability of the proposed method further enhances the training model. The method thus achieved more than 96% protein classification accuracy using only 20% per-class training data.

  7. Evaluation of global and regional right ventricular systolic function in patients with pulmonary hypertension using a novel speckle tracking method.

    PubMed

    Pirat, Bahar; McCulloch, Marti L; Zoghbi, William A

    2006-09-01

    This study sought to demonstrate that a novel speckle-tracking method can be used to assess right ventricular (RV) global and regional systolic function. Fifty-eight patients with pulmonary arterial hypertension (11 men; mean age 53 +/- 14 years) and 19 age-matched controls were studied. Echocardiographic images in apical planes were analyzed by conventional manual tracing for volumes and ejection fractions and by novel software (Axius Velocity Vector Imaging). Myocardial velocity, strain rate, and strain were determined at the basal, mid, and apical segments of the RV free wall and ventricular septum by Velocity Vector Imaging. RV volumes and ejection fractions obtained with manual tracing correlated strongly with the same indexes obtained by the Velocity Vector Imaging method in all subjects (r = 0.95 to 0.98, p < 0.001 for all). Peak systolic myocardial velocities, strain rate, and strain were significantly impaired in patients with pulmonary arterial hypertension compared with controls and were most altered in patients with the most severe pulmonary arterial hypertension (p < 0.05 for all). Pulmonary artery systolic pressure and a Doppler index of pulmonary vascular resistance were independent predictors of RV strain (r = -0.61 and r = -0.65, respectively, p < 0.05 for both). In conclusion, the new automated Velocity Vector Imaging method provides simultaneous quantitation of global and regional RV function that is angle independent and can be applied retrospectively to already stored digital images.

  8. Using GPS To Teach More Than Accurate Positions.

    ERIC Educational Resources Information Center

    Johnson, Marie C.; Guth, Peter L.

    2002-01-01

    Undergraduate science majors need practice in critical thinking, quantitative analysis, and judging whether their calculated answers are physically reasonable. Develops exercises using handheld Global Positioning System (GPS) receivers. Reinforces students' abilities to think quantitatively, make realistic "back of the envelope"…

  9. Seasonal changes of organic matter quality and quantity at the outlet of a forested karst system (La Roche Saint Alban, French Alps)

    NASA Astrophysics Data System (ADS)

    Tissier, Grégory; Perrette, Yves; Dzikowski, Marc; Poulenard, Jérome; Hobléa, Fabien; Malet, Emmanuel; Fanget, Bernard

    2013-03-01

    Because of its impact on water quality, organic matter (OM) in karst groundwater has been widely studied. The present article describes a method for monitoring OM in karst aquifers characterized by quick responses to rainfall. This method combines weekly manual sampling and continuous monitoring to provide qualitative and quantitative information about OM flow. Weekly samples were analyzed for Total Organic Carbon (TOC) content and spectrofluorescence, while continuous monitoring was carried out at the main spring, using a field fluorimeter (310/400-700 nm and 280/300-600 nm) to quantify chromophoric organic matter (COM). The type and quantity of COM were defined by decomposing Excitation Emission Matrices (EEMs) and by applying a 2D fluorescence decomposition method. Continuous monitoring data showed that the dominant COM was humic-like (HL). We found three types of relationship between HL and discharge and between HL and TOC, showing that caution must be exercised when using field fluorimeter measurements to quantify TOC. Each relationship was characterized by global differences in OM content and by the presence of different percentages of non-chromophoric organic matter. These three relationships are associated with changes in hydrology and microorganism activity during the year. We used these relationships to estimate the annual OM flow (about 15 kg/ha/year) and thereby quantify OM flow during the year. Our results show the importance of non-chromophoric organic matter in such estimations. This work illustrates the need to couple qualitative and quantitative monitoring of OM at karst springs to improve global comprehension of karst systems and of the sources involved in the OM flow.
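    The annual OM flow figure quoted above amounts to integrating concentration times discharge over the year and normalizing by catchment area. A minimal sketch of that bookkeeping, with synthetic hourly series and an illustrative catchment area (none of the numeric values below are the study's measurements):

```python
import numpy as np

# Hedged sketch: annual organic-matter flux from a concentration and a
# discharge time series. Constant synthetic values stand in for the
# field-fluorimeter-derived TOC and the spring discharge record.
hours = 24 * 365
toc_mg_l = np.full(hours, 2.0)        # TOC concentration, mg/L (illustrative)
discharge_l_s = np.full(hours, 50.0)  # spring discharge, L/s (illustrative)

# mg exported per hour = (mg/L) * (L/s) * 3600 s; sum over the year
flux_mg = np.sum(toc_mg_l * discharge_l_s * 3600.0)

area_ha = 250.0                        # hypothetical catchment area
flux_kg_per_ha = flux_mg / 1e6 / area_ha   # mg -> kg, then per hectare
```

With these made-up inputs the result lands near the order of magnitude reported (about 15 kg/ha/year), which is the point of the normalization rather than a validation of the study.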

  10. A 3D global-to-local deformable mesh model based registration and anatomy-constrained segmentation method for image guided prostate radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou Jinghao; Kim, Sung; Jabbour, Salma

    2010-03-15

    Purpose: In the external beam radiation treatment of prostate cancers, successful implementation of adaptive radiotherapy and conformal radiation dose delivery is highly dependent on precise and expeditious segmentation and registration of the prostate volume between the simulation and the treatment images. The purpose of this study is to develop a novel, fast, and accurate segmentation and registration method to increase the computational efficiency to meet the restricted clinical treatment time requirement in image guided radiotherapy. Methods: The method developed in this study used soft tissues to capture the transformation between the 3D planning CT (pCT) images and 3D cone-beam CT (CBCT) treatment images. The method incorporated a global-to-local deformable mesh model based registration framework as well as an automatic anatomy-constrained robust active shape model (ACRASM) based segmentation algorithm in the 3D CBCT images. The global registration was based on the mutual information method, and the local registration was to minimize the Euclidian distance of the corresponding nodal points from the global transformation of deformable mesh models, which implicitly used the information of the segmented target volume. The method was applied on six data sets of prostate cancer patients. Target volumes delineated by the same radiation oncologist on the pCT and CBCT were chosen as the benchmarks and were compared to the segmented and registered results. The distance-based and the volume-based estimators were used to quantitatively evaluate the results of segmentation and registration. Results: The ACRASM segmentation algorithm was compared to the original active shape model (ASM) algorithm by evaluating the values of the distance-based estimators. With respect to the corresponding benchmarks, the mean distance ranged from -0.85 to 0.84 mm for ACRASM and from -1.44 to 1.17 mm for ASM.
The mean absolute distance ranged from 1.77 to 3.07 mm for ACRASM and from 2.45 to 6.54 mm for ASM. The volume overlap ratio ranged from 79% to 91% for ACRASM and from 44% to 80% for ASM. These data demonstrated that the segmentation results of ACRASM were in better agreement with the corresponding benchmarks than those of ASM. The developed registration algorithm was quantitatively evaluated by comparing the registered target volumes from the pCT to the benchmarks on the CBCT. The mean distance and the root mean square error ranged from 0.38 to 2.2 mm and from 0.45 to 2.36 mm, respectively, between the CBCT images and the registered pCT. The mean overlap ratio of the prostate volumes ranged from 85.2% to 95% after registration. The average time of the ACRASM-based segmentation was under 1 min. The average time of the global transformation was from 2 to 4 min on two 3D volumes, and the average time of the local transformation was from 20 to 34 s on two deformable superquadric mesh models. Conclusions: A novel and fast segmentation and deformable registration method was developed to capture the transformation between the planning and treatment images for external beam radiotherapy of prostate cancers. This method increases computational efficiency and may provide a foundation for real-time adaptive radiotherapy.
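    The distance- and volume-based estimators used in this evaluation can be sketched on binary masks. The abstract does not state which overlap definition was used, so the Dice-style ratio below (twice the intersection over the sum of volumes) and the toy masks are assumptions for illustration:

```python
import numpy as np

def volume_overlap_ratio(seg, ref):
    """Dice-style overlap of two binary 3D masks -- one plausible
    reading of the paper's volume-based estimator (assumption)."""
    seg, ref = seg.astype(bool), ref.astype(bool)
    inter = np.logical_and(seg, ref).sum()
    return 2.0 * inter / (seg.sum() + ref.sum())

def mean_surface_distance(signed_distances):
    """Mean and RMS of signed surface distances, the shape of the
    distance-based estimators reported (mean distance, RMSE)."""
    d = np.asarray(signed_distances, dtype=float)
    return d.mean(), np.sqrt((d ** 2).mean())

# toy example: two overlapping cubic "prostate" masks on a 10^3 grid
a = np.zeros((10, 10, 10), dtype=bool); a[2:8, 2:8, 2:8] = True
b = np.zeros((10, 10, 10), dtype=bool); b[3:9, 3:9, 3:9] = True
ratio = volume_overlap_ratio(a, b)
md, rmse = mean_surface_distance([-1.0, 0.0, 1.0, 2.0])
```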

  11. Method for local temperature measurement in a nanoreactor for in situ high-resolution electron microscopy.

    PubMed

    Vendelbo, S B; Kooyman, P J; Creemer, J F; Morana, B; Mele, L; Dona, P; Nelissen, B J; Helveg, S

    2013-10-01

    In situ high-resolution transmission electron microscopy (TEM) of solids under reactive gas conditions can be facilitated by microelectromechanical system devices called nanoreactors. These nanoreactors are windowed cells containing nanoliter volumes of gas at ambient pressures and elevated temperatures. However, due to the high spatial confinement of the reaction environment, traditional methods for measuring process parameters, such as the local temperature, are difficult to apply. To address this issue, we devise an electron energy loss spectroscopy (EELS) method that probes the local temperature of the reaction volume under inspection by the electron beam. The local gas density, as measured using quantitative EELS, is combined with the inherent relation between gas density and temperature, as described by the ideal gas law, to obtain the local temperature. Using this method we determined the temperature gradient in a nanoreactor in situ, while the average, global temperature was monitored by a traditional measurement of the electrical resistivity of the heater. The local gas temperatures had a maximum of 56 °C deviation from the global heater values under the applied conditions. The local temperatures, obtained with the proposed method, are in good agreement with predictions from an analytical model. Copyright © 2013 Elsevier B.V. All rights reserved.
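    The density-to-temperature conversion described in this record follows directly from the ideal gas law, T = p / (n k_B), with n the gas number density obtained from quantitative EELS. A minimal sketch, where the pressure and density values are hypothetical placeholders rather than the paper's measurements:

```python
# Local temperature from a locally measured gas number density via the
# ideal gas law, assuming the nanoreactor pressure p is known and uniform.
K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def local_temperature(pressure_pa, number_density_m3):
    """T = p / (n * k_B) for an ideal gas."""
    return pressure_pa / (number_density_m3 * K_B)

# e.g. ~1 bar and a number density as might be derived from EELS
# (both numbers are illustrative, not from the study)
t = local_temperature(1.0e5, 1.45e25)
```

Comparing such locally derived temperatures against the heater's resistivity-based global reading is what yields the reported gradient (up to 56 °C deviation under the applied conditions).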

  12. Metaphors of Primary School Students Relating to the Concept of Global Warming

    ERIC Educational Resources Information Center

    Dogru, Mustafa; Sarac, Esra

    2013-01-01

    The purpose of this study is to reveal the metaphors of primary school students (n = 362) relating to the concept of global warming. Data collected by completing the expression of "global warming is like..., because..." of the students were analysed by use of qualitative and quantitative data analysis techniques. According to findings of…

  13. A comparative analysis of human plasma and serum proteins by combining native PAGE, whole-gel slicing and quantitative LC-MS/MS: Utilizing native MS-electropherograms in proteomic analysis for discovering structure and interaction-correlated differences.

    PubMed

    Wen, Meiling; Jin, Ya; Manabe, Takashi; Chen, Shumin; Tan, Wen

    2017-12-01

    MS identification has long been used for PAGE-separated protein bands, but global and systematic quantitation utilizing MS after PAGE has remained rare and not been reported for native PAGE. Here we reported on a new method combining native PAGE, whole-gel slicing and quantitative LC-MS/MS, aiming at comparative analysis on not only abundance, but also structures and interactions of proteins. A pair of human plasma and serum samples were used as test samples and separated on a native PAGE gel. Six lanes of each sample were cut, each lane was further sliced into thirty-five 1.1 mm × 1.1 mm squares and all the squares were subjected to standardized procedures of in-gel digestion and quantitative LC-MS/MS. The results comprised 958 data rows that each contained abundance values of a protein detected in one square in eleven gel lanes (one plasma lane excluded). The data were evaluated to have satisfactory reproducibility of assignment and quantitation. Totally 315 proteins were assigned, with each protein assigned in 1-28 squares. The abundance distributions in the plasma and serum gel lanes were reconstructed for each protein, named as "native MS-electropherograms". Comparison of the electropherograms revealed significant plasma-versus-serum differences on 33 proteins in 87 squares (fold difference > 2 or < 0.5, p < 0.05). Many of the differences matched with accumulated knowledge on protein interactions and proteolysis involved in blood coagulation, complement and wound healing processes. We expect this method would be useful to provide more comprehensive information in comparative proteomic analysis, on both quantities and structures/interactions. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Verification of watershed vegetation restoration policies, arid China

    PubMed Central

    Zhang, Chengqi; Li, Yu

    2016-01-01

    Verification of restoration policies that have been implemented is important for simultaneously reducing global environmental risks and meeting economic development goals. This paper proposes a novel method, based on the idea of multiple time scales, to verify ecological restoration policies in the Shiyang River drainage basin, arid China. We integrated modern pollen transport characteristics of the entire basin and pollen records from 8 Holocene sedimentary sections, and quantitatively reconstructed the millennial-scale changes of watershed vegetation zones by defining a new pollen-precipitation index. Meanwhile, the Empirical Orthogonal Function method was used to quantitatively analyze spatial and temporal variations of the Normalized Difference Vegetation Index in summer (June to August) of 2000–2014. By contrasting the vegetation changes mainly controlled by millennial-scale natural ecological evolution with those under modern ecological restoration measures, we found that vegetation changes of the entire Shiyang River drainage basin are synchronous on both time scales, and that the current ecological restoration policies meet the requirements of long-term restoration objectives and show promising early results for ecological environmental restoration. Our findings present an innovative method to verify river ecological restoration policies, and also provide a scientific basis for proposing future emphases of ecological restoration strategies. PMID:27470948

  15. Verification of watershed vegetation restoration policies, arid China

    NASA Astrophysics Data System (ADS)

    Zhang, Chengqi; Li, Yu

    2016-07-01

    Verification of restoration policies that have been implemented is important for simultaneously reducing global environmental risks and meeting economic development goals. This paper proposes a novel method, based on the idea of multiple time scales, to verify ecological restoration policies in the Shiyang River drainage basin, arid China. We integrated modern pollen transport characteristics of the entire basin and pollen records from 8 Holocene sedimentary sections, and quantitatively reconstructed the millennial-scale changes of watershed vegetation zones by defining a new pollen-precipitation index. Meanwhile, the Empirical Orthogonal Function method was used to quantitatively analyze spatial and temporal variations of the Normalized Difference Vegetation Index in summer (June to August) of 2000-2014. By contrasting the vegetation changes mainly controlled by millennial-scale natural ecological evolution with those under modern ecological restoration measures, we found that vegetation changes of the entire Shiyang River drainage basin are synchronous on both time scales, and that the current ecological restoration policies meet the requirements of long-term restoration objectives and show promising early results for ecological environmental restoration. Our findings present an innovative method to verify river ecological restoration policies, and also provide a scientific basis for proposing future emphases of ecological restoration strategies.
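    The Empirical Orthogonal Function step named in this record is conventionally implemented as an SVD of the space-time anomaly matrix. A sketch on synthetic data, where the random field and array shapes merely stand in for the study's 2000-2014 summer NDVI grids:

```python
import numpy as np

# EOF analysis of a (time x space) anomaly matrix via SVD.
# 15 "summers" x 40 "grid cells" of synthetic data (illustrative only).
rng = np.random.default_rng(0)
field = rng.normal(size=(15, 40))

anom = field - field.mean(axis=0)        # remove the temporal mean per cell
u, s, vt = np.linalg.svd(anom, full_matrices=False)

eofs = vt                      # rows are the spatial patterns (EOFs)
pcs = u * s                    # principal-component time series
var_frac = s**2 / (s**2).sum() # fraction of variance per mode
```

The leading EOF/PC pairs then summarize the dominant spatial pattern of vegetation change and its temporal evolution; by construction the modes reconstruct the anomaly field exactly.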

  16. Globalization as a Driver or Bottleneck for Sustainable Development: Some Empirical, Cross-National Reflections on Basic Issues of International Health Policy and Management

    PubMed Central

    Tausch, Arno

    2013-01-01

    Background: This article looks at the long-term, structural determinants of environmental and public health performance in the world system. Methods: In multiple standard ordinary least squares (OLS) regression models, we tested the effects of 26 standard predictor variables, including the ‘four freedoms’ of goods, capital, labour and services, on the following indicators of sustainable development and public health: avoiding net trade of ecological footprint global hectare (gha) per person; avoiding high carbon emissions per million US dollars GDP; avoiding high CO2 per capita (gha/cap); avoiding high ecological footprint per capita; avoiding becoming victim of natural disasters; a good performance on the Environmental Performance Index (EPI); a good performance on the Happy Life Years (HLYs) scale; and a good performance on the Happy Planet Index (HPI). Results: Our research showed that the apprehensions of quantitative research, critical of neo-liberal globalization, are fully vindicated by the significant negative environmental and public health effects of the foreign savings rate. High foreign savings are indeed a driver of global footprint, and are a blockade against a satisfactory HPI performance. The new international division of labour is one of the prime drivers of high CO2 per capita emissions. Multinational Corporation (MNC) penetration, the master variable of most quantitative dependency theories, blocks EPI and several other socially important processes. Worker remittances have a significant positive effect on the HPI, and HLYs. Conclusion: We re-analysed the solid macro-political and macro-sociological evidence on a global scale, published in the world’s leading peer-reviewed social science, ecological and public health journals, which seem to indicate that there are contradictions between unfettered globalization and unconstrained world economic openness and sustainable development and public health development. 
We suggest that there seems to be a strong interaction between ‘transnational capitalist penetration’ and ‘environmental and public health degradation’. Global policy-making finally should dare to take the globalization-critical organizations of ‘civil society’ seriously. This conclusion not only holds for the countries of the developed “West”, but also, increasingly, for the growing democracy and civil society movements around the globe, in countries as diverse as Brazil, Russia, China, or ever larger parts of the Muslim world. PMID:24596855

  17. A Novel Health Evaluation Strategy for Multifunctional Self-Validating Sensors

    PubMed Central

    Shen, Zhengguang; Wang, Qi

    2013-01-01

    The performance evaluation of sensors is very important in actual application. In this paper, a theory based on multi-variable information fusion is studied to evaluate the health level of multifunctional sensors. A novel conception of health reliability degree (HRD) is defined to indicate a quantitative health level, which is different from traditional so-called qualitative fault diagnosis. To evaluate the health condition from both local and global perspectives, the HRD of a single sensitive component at multiple time points and the overall multifunctional sensor at a single time point are defined, respectively. The HRD methodology is emphasized by using multi-variable data fusion technology coupled with a grey comprehensive evaluation method. In this method, to acquire the distinct importance of each sensitive unit and the sensitivity of different time points, the information entropy and analytic hierarchy process method are used, respectively. In order to verify the feasibility of the proposed strategy, a health evaluating experimental system for multifunctional self-validating sensors was designed. The five different health level situations have been discussed. Successful results show that the proposed method is feasible, the HRD could be used to quantitatively indicate the health level and it does have a fast response to the performance changes of multifunctional sensors. PMID:23291576
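    The information-entropy step used in this record to grade the importance of each sensitive unit can be sketched as follows. The decision matrix is made up for illustration, and the paper's further fusion with the analytic hierarchy process and grey comprehensive evaluation is omitted:

```python
import numpy as np

def entropy_weights(x):
    """Entropy-weight method: indicators (columns of x) whose values
    diverge more across samples carry more information and receive
    larger weights. x is a non-negative (n_samples, n_indicators) matrix."""
    p = x / x.sum(axis=0)                          # column-normalize to a distribution
    with np.errstate(divide="ignore", invalid="ignore"):
        plogp = np.where(p > 0, p * np.log(p), 0.0)
    e = -plogp.sum(axis=0) / np.log(x.shape[0])    # entropy per indicator, in [0, 1]
    d = 1.0 - e                                    # degree of divergence
    return d / d.sum()                             # weights summing to 1

# illustrative data: indicator 1 varies far more than indicator 0
x = np.array([[0.9, 0.2],
              [0.8, 0.9],
              [0.7, 0.1]])
w = entropy_weights(x)
```

Here the second indicator, being more dispersed across the three samples, receives the larger weight, which is the behaviour the HRD fusion relies on.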

  18. Health risk behaviours amongst school adolescents: protocol for a mixed methods study.

    PubMed

    El Achhab, Youness; El Ammari, Abdelghaffar; El Kazdouh, Hicham; Najdi, Adil; Berraho, Mohamed; Tachfouti, Nabil; Lamri, Driss; El Fakir, Samira; Nejjari, Chakib

    2016-11-29

    Determining the risky behaviours of adolescents provides valuable information for designing appropriate intervention programmes to advance adolescents' health. However, these behaviours are not fully addressed by researchers in a comprehensive approach. We report the protocol of a mixed methods study designed to investigate the health risk behaviours of Moroccan adolescents, with the goal of identifying suitable strategies to address their health concerns. We used a sequential two-phase explanatory mixed methods study design. The approach begins with the collection of quantitative data, followed by the collection of qualitative data to explain and enrich the quantitative findings. In the first phase, the global school-based student health survey (GSHS) was administered to 800 students who were between 14 and 19 years of age. The second phase engaged adolescents, parents and teachers in focus groups and assessed education documents to explore the level of coverage of health education in the middle school curriculum. To obtain opinions about strategies to reduce Moroccan adolescents' health risk behaviours, a nominal group technique will be used. The findings of this mixed methods sequential explanatory study will provide insights into the risk behaviours that need to be considered if intervention programmes and preventive strategies are to be designed to promote adolescents' health in Moroccan schools.

  19. Global, long-term surface reflectance records from Landsat

    USDA-ARS?s Scientific Manuscript database

    Global, long-term monitoring of changes in Earth’s land surface requires quantitative comparisons of satellite images acquired under widely varying atmospheric conditions. Although physically based estimates of surface reflectance (SR) ultimately provide the most accurate representation of Earth’s s...

  20. Fast Paleogene Motion of the Pacific Hotspots from Revised Global Plate Circuit Constraints

    NASA Technical Reports Server (NTRS)

    Raymond, C.; Stock, J.; Cande, S.

    2000-01-01

    Major improvements in late Cretaceous-early Tertiary Pacific-Antarctica plate reconstructions, and new East-West Antarctica rotations, allow a more definitive test of the relative motion between hotspots using global plate circuit reconstructions with quantitative uncertainties.

  1. Global soil-climate-biome diagram: linking soil properties to climate and biota

    NASA Astrophysics Data System (ADS)

    Zhao, X.; Yang, Y.; Fang, J.

    2017-12-01

    As a critical component of the Earth system, soils interact strongly with both climate and biota and provide fundamental ecosystem services that maintain food, climate, and human security. Despite significant progress in digital soil mapping techniques and the rapidly growing quantity of observed soil information, quantitative linkages between soil properties, climate and biota at the global scale remain unclear. By compiling a large global soil database, we mapped seven major soil properties (bulk density [BD]; sand, silt and clay fractions; soil pH; soil organic carbon [SOC] density [SOCD]; and soil total nitrogen [STN] density [STND]) based on machine learning algorithms (regional random forest [RF] model) and quantitatively assessed the linkage between soil properties, climate and biota at the global scale. Our results demonstrated a global soil-climate-biome diagram, which improves our understanding of the strong correspondence between soils, climate and biomes. Soil pH decreased with greater mean annual precipitation (MAP) and lower mean annual temperature (MAT), and the critical MAP for the transition from alkaline to acidic soil pH decreased with decreasing MAT. Specifically, the critical MAP ranged from 400-500 mm when the MAT exceeded 10 °C but could decrease to 50-100 mm when the MAT was approximately 0 °C. SOCD and STND were tightly linked; both increased in accordance with lower MAT and higher MAP across terrestrial biomes. Global stocks of SOC and STN were estimated to be 788 ± 39.4 Pg (10^15 g, or billion tons) and 63 ± 3.3 Pg in the upper 30-cm soil layer, respectively, but these values increased to 1654 ± 94.5 Pg and 133 ± 7.8 Pg in the upper 100-cm soil layer, respectively. These results reveal quantitative linkages between soil properties, climate and biota at the global scale, suggesting co-evolution of the soil, climate and biota under conditions of global environmental change.
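
    A minimal sketch of the random-forest mapping step, assuming scikit-learn and purely synthetic covariates. The real model uses many more predictors and observed soil profiles; the variable names, sample sizes, and the toy response function below are illustrative only:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 500
# Hypothetical training table: one row per soil profile.
mat = rng.uniform(-10, 25, n)       # mean annual temperature (MAT), °C
map_ = rng.uniform(50, 2000, n)     # mean annual precipitation (MAP), mm
# Toy response mimicking the reported trend: SOC density increases
# with lower MAT and higher MAP.
socd = 20.0 - 0.5 * mat + 0.005 * map_ + rng.normal(0, 1, n)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(np.column_stack([mat, map_]), socd)

# Predict SOCD for a cold, wet grid cell vs. a hot, dry one.
cold_wet, hot_dry = model.predict([[0.0, 1500.0], [22.0, 200.0]])
```

    Fitted on gridded climate covariates instead of this toy table, the same predict step produces the global property maps described above.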

  2. Summary of Quantitative Interpretation of Image Far Ultraviolet Auroral Data

    NASA Technical Reports Server (NTRS)

    Frey, H. U.; Immel, T. J.; Mende, S. B.; Gerard, J.-C.; Hubert, B.; Habraken, S.; Span, J.; Gladstone, G. R.; Bisikalo, D. V.; Shematovich, V. I.

    2002-01-01

    Direct imaging of the magnetosphere by instruments on the IMAGE spacecraft is supplemented by simultaneous observations of the global aurora in three far ultraviolet (FUV) wavelength bands. The purpose of the multi-wavelength imaging is to study the global auroral particle and energy input from the magnetosphere into the atmosphere. This paper describes the method for quantitative interpretation of FUV measurements. The Wide-Band Imaging Camera (WIC) provides broad-band ultraviolet images of the aurora with maximum spatial and temporal resolution by imaging the nitrogen lines and bands between 140 and 180 nm wavelength. The Spectrographic Imager (SI), a dual-wavelength monochromatic instrument, images both Doppler-shifted Lyman alpha emissions produced by precipitating protons, in the SI-12 channel, and OI 135.6 nm emissions in the SI-13 channel. From the SI-12 Doppler-shifted Lyman alpha images it is possible to obtain the precipitating proton flux, provided assumptions are made regarding the mean energy of the protons. Knowledge of the proton (flux and energy) component allows the calculation of the contribution produced by protons in the WIC and SI-13 instruments. Comparison of the corrected WIC and SI-13 signals provides a measure of the electron mean energy, which can then be used to determine the electron energy flux. To accomplish this, reliable emission modeling and instrument calibrations are required. In-flight calibration using early-type stars was used to validate the pre-flight laboratory calibrations and determine long-term trends in sensitivity. In general, very reasonable agreement is found between in-situ measurements and remote quantitative determinations.

  3. Regional Ventilation Changes in the Lung: Treatment Response Mapping by Using Hyperpolarized Gas MR Imaging as a Quantitative Biomarker.

    PubMed

    Horn, Felix C; Marshall, Helen; Collier, Guilhem J; Kay, Richard; Siddiqui, Salman; Brightling, Christopher E; Parra-Robles, Juan; Wild, Jim M

    2017-09-01

    Purpose To assess the magnitude of regional response to respiratory therapeutic agents in the lungs by using treatment response mapping (TRM) with hyperpolarized gas magnetic resonance (MR) imaging. TRM was used to quantify regional physiologic response in adults with asthma who underwent a bronchodilator challenge. Materials and Methods This study was approved by the national research ethics committee and was performed with informed consent. Imaging was performed in 20 adult patients with asthma by using hyperpolarized helium-3 (3He) ventilation MR imaging. Two sets of baseline images were acquired before inhalation of a bronchodilating agent (salbutamol 400 μg), and one set was acquired after. All images were registered for voxelwise comparison. Regional treatment response, ΔR(r), was calculated as the difference in regional gas distribution (R[r] = ratio of inhaled gas to total volume of a voxel when normalized for lung inflation volume) before and after intervention. A voxelwise activation threshold from the variability of the baseline images was applied to ΔR(r) maps. The summed global treatment response map (ΔRnet) was then used as a global lung index for comparison with metrics of bronchodilator response measured by using spirometry and the global imaging metric percentage ventilated volume (%VV). Results ΔRnet showed significant correlation (P < .01) with changes in forced expiratory volume in 1 second (r = 0.70), forced vital capacity (r = 0.84), and %VV (r = 0.56). A significant (P < .01) positive treatment effect was detected with all metrics; however, ΔRnet showed a lower intersubject coefficient of variation (64%) than all of the other tests (coefficient of variation, ≥99%). Conclusion TRM provides regional quantitative information on changes in inhaled gas ventilation in response to therapy. This method could be used as a sensitive regional outcome metric for novel respiratory interventions.
© RSNA, 2017 Online supplemental material is available for this article.
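
    The voxelwise computation described above (regional gas distribution, baseline-variability threshold, summed response) can be illustrated with synthetic volumes. The images, noise level, and the decision to sum only positive gains are stand-in assumptions, not the paper's exact pipeline:

```python
import numpy as np

rng = np.random.default_rng(1)
shape = (8, 8, 8)
# Hypothetical registered ventilation images (signal per voxel):
base1 = rng.uniform(0.5, 1.0, shape)            # baseline scan 1
base2 = base1 + rng.normal(0, 0.02, shape)      # baseline scan 2 (repeat)
post = base1.copy()
post[:4] += 0.3                                 # bronchodilator opens one region

def R(img):
    """Regional gas distribution: each voxel's signal as a fraction of the
    whole-lung total, i.e. normalized for inflation volume."""
    return img / img.sum()

# Voxelwise treatment response and an activation threshold derived from
# the variability between the two baseline scans.
dR = R(post) - R(base1)
threshold = np.abs(R(base2) - R(base1)).max()
dR_net = dR[dR > threshold].sum()   # summed response (positive gains only)
```

    Voxels whose change exceeds the baseline repeatability are counted as true response; here they cluster in the artificially opened region.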

  4. Common species link global ecosystems to climate change: dynamical evidence in the planktonic fossil record.

    PubMed

    Hannisdal, Bjarte; Haaga, Kristian Agasøster; Reitan, Trond; Diego, David; Liow, Lee Hsiang

    2017-07-12

    Common species shape the world around us, and changes in their commonness signify large-scale shifts in ecosystem structure and function. However, our understanding of long-term ecosystem response to environmental forcing in the deep past is centred on species richness, neglecting the disproportional impact of common species. Here, we use common and widespread species of planktonic foraminifera in deep-sea sediments to track changes in observed global occupancy (proportion of sampled sites at which a species is present and observed) through the turbulent climatic history of the last 65 Myr. Our approach is sensitive to relative changes in global abundance of the species set and robust to factors that bias richness estimators. Using three independent methods for detecting causality, we show that the observed global occupancy of planktonic foraminifera has been dynamically coupled to past oceanographic changes captured in deep-ocean temperature reconstructions. The causal inference does not imply a direct mechanism, but is consistent with an indirect, time-delayed causal linkage. Given the strong quantitative evidence that a dynamical coupling exists, we hypothesize that mixotrophy (symbiont hosting) may be an ecological factor linking the global abundance of planktonic foraminifera to long-term climate changes via the relative extent of oligotrophic oceans. © 2017 The Authors.

  5. Microdialysis as an Important Technique in Systems Pharmacology-a Historical and Methodological Review.

    PubMed

    Hammarlund-Udenaes, Margareta

    2017-09-01

    Microdialysis has contributed with very important knowledge to the understanding of target-specific concentrations and their relationship to pharmacodynamic effects from a systems pharmacology perspective, aiding in the global understanding of drug effects. This review focuses on the historical development of microdialysis as a method to quantify the pharmacologically very important unbound tissue concentrations and of recent findings relating to modeling microdialysis data to extrapolate from rodents to humans, understanding distribution of drugs in different tissues and disease conditions. Quantitative microdialysis developed very rapidly during the early 1990s. Method development was in focus in the early years including development of quantitative microdialysis, to be able to estimate true extracellular concentrations. Microdialysis has significantly contributed to the understanding of active transport at the blood-brain barrier and in other organs. Examples are presented where microdialysis together with modeling has increased the knowledge on tissue distribution between species, in overweight patients and in tumors, and in metabolite contribution to drug effects. More integrated metabolomic studies are still sparse within the microdialysis field, although a great potential for tissue and disease-specific measurements is evident.

  6. Spatial and temporal patterns of stranded intertidal marine debris: is there a picture of global change?

    PubMed

    Browne, Mark Anthony; Chapman, M Gee; Thompson, Richard C; Amaral Zettler, Linda A; Jambeck, Jenna; Mallos, Nicholas J

    2015-06-16

    Floating and stranded marine debris is widespread. Increasing sea levels and altered rainfall, solar radiation, wind speed, waves, and oceanic currents associated with climatic change are likely to transfer more debris from coastal cities into marine and coastal habitats. Marine debris causes economic and ecological impacts, but understanding their scope requires quantitative information on spatial patterns and trends in the amounts and types of debris at a global scale. There are very few large-scale programs to measure debris, but many peer-reviewed and published scientific studies of marine debris describe local patterns. Unfortunately, methods of defining debris, sampling, and interpreting patterns in space or time vary considerably among studies; yet if data could be synthesized across studies, a global picture of the problem might become available. We analyzed 104 published scientific papers on marine debris to determine how to evaluate this. Although many studies were well designed to answer specific questions, the varying definitions of what constitutes marine debris, the methods used to measure it, and the scope and scale of the studies mean that no general picture can emerge from this wealth of data. These problems are detailed to guide future studies, and guidelines are provided to enable the collection of more comparable data to better manage this growing problem.

  7. Comparison of Image Processing Techniques for Nonviable Tissue Quantification in Late Gadolinium Enhancement Cardiac Magnetic Resonance Images.

    PubMed

    Carminati, M Chiara; Boniotti, Cinzia; Fusini, Laura; Andreini, Daniele; Pontone, Gianluca; Pepi, Mauro; Caiani, Enrico G

    2016-05-01

    The aim of this study was to compare the performance of quantitative methods, either semiautomated or automated, for left ventricular (LV) nonviable tissue analysis from cardiac magnetic resonance late gadolinium enhancement (CMR-LGE) images. The investigated segmentation techniques were: (i) n-standard deviations thresholding; (ii) full width at half maximum thresholding; (iii) Gaussian mixture model classification; and (iv) fuzzy c-means clustering. These algorithms were applied either in each short axis slice (single-slice approach) or globally considering the entire short-axis stack covering the LV (global approach). CMR-LGE images from 20 patients with ischemic cardiomyopathy were retrospectively selected, and results from each technique were assessed against manual tracing. All methods provided comparable performance in terms of accuracy in scar detection, computation of local transmurality, and high correlation in scar mass compared with the manual technique. In general, no significant difference between single-slice and global approach was noted. The reproducibility of manual and investigated techniques was confirmed in all cases with slightly lower results for the nSD approach. Automated techniques resulted in accurate and reproducible evaluation of LV scars from CMR-LGE in ischemic patients with performance similar to the manual technique. Their application could minimize user interaction and computational time, even when compared with semiautomated approaches.
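
    One of the compared techniques, full-width-at-half-maximum (FWHM) thresholding, can be sketched as follows. This is a common variant (scar = voxels above half the maximal enhanced intensity within the myocardium); the study's exact implementation may differ, and the toy image values are hypothetical:

```python
import numpy as np

def fwhm_scar_mask(intensity, myocardium_mask):
    """FWHM thresholding for LGE images: label as scar every myocardial
    voxel whose intensity is at least half of the maximal enhanced
    intensity found within the myocardium."""
    vals = intensity[myocardium_mask]
    thr = vals.max() / 2.0                  # half of maximum enhancement
    return myocardium_mask & (intensity >= thr)

# Toy LGE slice: remote myocardium ~100, enhanced scar ~400.
img = np.full((6, 6), 100.0)
img[2:4, 2:4] = 400.0
myo = np.ones_like(img, dtype=bool)
scar = fwhm_scar_mask(img, myo)
```

    The n-standard-deviations variant differs only in the threshold (mean of a remote region plus n times its standard deviation); both reduce the per-slice or global analysis to a single thresholding rule.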

  8. Enhancing SMAP Soil Moisture Retrievals via Superresolution Techniques

    NASA Astrophysics Data System (ADS)

    Beale, K. D.; Ebtehaj, A. M.; Romberg, J. K.; Bras, R. L.

    2017-12-01

    Soil moisture is a key state variable that modulates land-atmosphere interactions, and its high-resolution global-scale estimates are essential for improved weather forecasting, drought prediction, crop management, and the safety of troop mobility. Currently, NASA's Soil Moisture Active/Passive (SMAP) satellite provides a global picture of soil moisture variability at a resolution of 36 km, which is prohibitive for some hydrologic applications. The goal of this research is to enhance the resolution of SMAP passive microwave retrievals by a factor of 2 to 4 using modern superresolution techniques that rely on knowledge of high-resolution land surface models. In this work, we explore several superresolution techniques, including an empirical dictionary method, a learned dictionary method, and a three-layer convolutional neural network. Using a year of global high-resolution land surface model simulations as a training set, we found that we are able to produce high-resolution soil moisture maps that outperform the original low-resolution observations both qualitatively and quantitatively. In particular, on a patch-by-patch basis we are able to produce estimates of high-resolution soil moisture maps that improve on the original low-resolution patches by an average of 6% in terms of mean-squared error and 14% in terms of the structural similarity index.
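
    The patch-by-patch scoring described above (percent mean-squared-error reduction relative to the low-resolution input) can be illustrated with synthetic patches. All values here are toy data, not SMAP retrievals, and the half-recovered estimate is a deliberately contrived stand-in for a superresolution output:

```python
import numpy as np

def mse(a, b):
    return float(((a - b) ** 2).mean())

def improvement(hr_truth, lr_input, sr_estimate):
    """Percent MSE reduction of a super-resolved patch relative to the
    (upsampled) low-resolution input patch."""
    return 100.0 * (1.0 - mse(hr_truth, sr_estimate) / mse(hr_truth, lr_input))

# Toy enhancement: truth, a coarse-cell input, and a partial estimate.
rng = np.random.default_rng(2)
truth = rng.uniform(0.1, 0.4, (16, 16))     # volumetric soil moisture
lr = truth.mean() * np.ones_like(truth)     # one value per coarse cell
sr = 0.5 * truth + 0.5 * lr                 # halfway-recovered estimate
gain = improvement(truth, lr, sr)
```

    Because the contrived estimate removes exactly half of the residual, the MSE drops by a factor of four, giving a 75% gain on this patch.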

  9. Discussion of skill improvement in marine ecosystem dynamic models based on parameter optimization and skill assessment

    NASA Astrophysics Data System (ADS)

    Shen, Chengcheng; Shi, Honghua; Liu, Yongzhi; Li, Fen; Ding, Dewen

    2016-07-01

    Marine ecosystem dynamic models (MEDMs) are important tools for the simulation and prediction of marine ecosystems. This article summarizes the methods and strategies used for the improvement and assessment of MEDM skill, and it attempts to establish a technical framework to inspire further ideas concerning MEDM skill improvement. The skill of MEDMs can be improved by parameter optimization (PO), which is an important step in model calibration. An efficient approach to solve the problem of PO constrained by MEDMs is the global treatment of both sensitivity analysis and PO. Model validation is an essential step following PO, which validates the efficiency of model calibration by analyzing and estimating the goodness-of-fit of the optimized model. Additionally, by focusing on the degree of impact of various factors on model skill, model uncertainty analysis can supply model users with a quantitative assessment of model confidence. Research on MEDMs is ongoing; however, improvement in model skill still lacks global treatments and its assessment is not integrated. Thus, the predictive performance of MEDMs is not strong and model uncertainties lack quantitative descriptions, limiting their application. Therefore, a large number of case studies concerning model skill should be performed to promote the development of a scientific and normative technical framework for the improvement of MEDM skill.

  10. A spatial method to calculate small-scale fisheries effort in data poor scenarios.

    PubMed

    Johnson, Andrew Frederick; Moreno-Báez, Marcia; Giron-Nava, Alfredo; Corominas, Julia; Erisman, Brad; Ezcurra, Exequiel; Aburto-Oropeza, Octavio

    2017-01-01

    To gauge the collateral impacts of fishing we must know where fishing boats operate and how much they fish. Although small-scale fisheries land approximately the same amount of fish for human consumption as industrial fleets globally, methods of estimating their fishing effort are comparatively poor. We present an accessible, spatial method of calculating the effort of small-scale fisheries based on two simple measures that are available, or at least easily estimated, in even the most data-poor fisheries: the number of boats and the local coastal human population. We illustrate the method using a small-scale fisheries case study from the Gulf of California, Mexico, and show that our measure of Predicted Fishing Effort (PFE), measured as the number of boats operating in a given area per day adjusted by the number of people in local coastal populations, can accurately predict fisheries landings in the Gulf. Comparing our values of PFE to commercial fishery landings throughout the Gulf also indicates that the current number of small-scale fishing boats in the Gulf is approximately double what is required to land theoretical maximum fish biomass. Our method is fishery-type independent and can be used to quantitatively evaluate the efficacy of growth in small-scale fisheries. This new method provides an important first step towards estimating the fishing effort of small-scale fleets globally.
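
    The idea behind PFE can be sketched with an assumed functional form: boats operating per day in an area, scaled by the local coastal population. The exact weighting in the paper may differ, and every number and parameter name below is hypothetical:

```python
def predicted_fishing_effort(n_boats, coastal_population, area_km2,
                             trips_per_boat_per_day=1.0):
    """Hypothetical sketch of a PFE-style index: boat-days in an area,
    scaled by the local coastal population (a proxy for demand and
    labor), per unit area."""
    boats_per_day = n_boats * trips_per_boat_per_day
    return boats_per_day * coastal_population / area_km2

# Two hypothetical coastal zones of equal area.
pfe_a = predicted_fishing_effort(120, 5000, 250.0)
pfe_b = predicted_fishing_effort(40, 1200, 250.0)
```

    The appeal of such an index is that both inputs (boat counts and population) are obtainable even in data-poor fisheries, which is the paper's central point.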

  11. Global Analysis of a-, b-, and c-Type Transitions Involving Tunneling Components of K= 0 and 1 States of the Methanol Dimer

    NASA Astrophysics Data System (ADS)

    Lugez, C. L.; Lovas, F. J.; Hougen, J. T.; Ohashi, N.

    1999-03-01

    Spectral data on K = 0 and 1 levels of the methanol dimer available from previous and present Fourier transform microwave measurements have been interpreted globally, using a group-theoretically derived effective Hamiltonian and corresponding tunneling matrix elements to describe the splittings arising from a large number of tunneling motions. In the present work, 302 new measurements (40 K = 1-1 and 262 K = 1-0 transitions) were added to the previous data set to give a total of 584 assigned transitions with J ≤ 6. As a result of the rather complete K = 0, 1 data set for J ≤ 4, the lone-pair exchange tunneling splittings were obtained experimentally. Matrix element expansions in J(J + 1) used in the previous K = 0 formalism were modified to apply to K > 0, essentially by making a number of real coefficients complex, as required by the generalized internal-axis-method tunneling formalism. To reduce the number of adjustable parameters to an acceptable level in both the K = 0 and K = 1 effective Hamiltonians (used in separate K = 0 and K = 1 least-squares fits), a rather large number of assumptions concerning probably negligible parameters had to be made. The present fitting results should thus be considered as providing assurance of the group-theoretical line assignments as well as a nearly quantitative global interpretation of the tunneling splittings, even though they do not yet unambiguously determine the relative contributions from all 25 group-theoretically inequivalent tunneling motions in this complex, nor do they permit quantitative extrapolation to higher K levels.

  12. Agenda, extended abstracts, and bibliographies for a workshop on Deposit modeling, mineral resources assessment, and their role in sustainable development

    USGS Publications Warehouse

    Briskey, Joseph A.; Schulz, Klaus J.

    2002-01-01

    Global demand for mineral resources continues to increase because of increasing global population and the desire and efforts to improve living standards worldwide. The ability to meet this growing demand for minerals is affected by the concerns about possible environmental degradation associated with minerals production and by competing land uses. Informed planning and decisions concerning sustainability and resource development require a long-term perspective and an integrated approach to land-use, resource, and environmental management worldwide. This, in turn, requires unbiased information on the global distribution of identified and especially undiscovered resources, the economic and political factors influencing their development, and the potential environmental consequences of their exploitation. The purpose of the IGC workshop is to review the state-of-the-art in mineral-deposit modeling and quantitative resource assessment and to examine their role in the sustainability of mineral use. The workshop will address such questions as: Which of the available mineral-deposit models and assessment methods are best suited for predicting the locations, deposit types, and amounts of undiscovered nonfuel mineral resources remaining in the world? What is the availability of global geologic, mineral deposit, and mineral-exploration information? How can mineral-resource assessments be used to address economic and environmental issues? Presentations will include overviews of assessment methods used in previous national and other small-scale assessments of large regions as well as resulting assessment products and their uses.

  13. Global morphological analysis of marine viruses shows minimal regional variation and dominance of non-tailed viruses.

    PubMed

    Brum, Jennifer R; Schenck, Ryan O; Sullivan, Matthew B

    2013-09-01

    Viruses influence oceanic ecosystems by causing mortality of microorganisms, altering nutrient and organic matter flux via lysis and auxiliary metabolic gene expression and changing the trajectory of microbial evolution through horizontal gene transfer. Limited host range and differing genetic potential of individual virus types mean that investigations into the types of viruses that exist in the ocean and their spatial distribution throughout the world's oceans are critical to understanding the global impacts of marine viruses. Here we evaluate viral morphological characteristics (morphotype, capsid diameter and tail length) using a quantitative transmission electron microscopy (qTEM) method across six of the world's oceans and seas sampled through the Tara Oceans Expedition. Extensive experimental validation of the qTEM method shows that neither sample preservation nor preparation significantly alters natural viral morphological characteristics. The global sampling analysis demonstrated that morphological characteristics did not vary consistently with depth (surface versus deep chlorophyll maximum waters) or oceanic region. Instead, temperature, salinity and oxygen concentration, but not chlorophyll a concentration, were more explanatory in evaluating differences in viral assemblage morphological characteristics. Surprisingly, given that the majority of cultivated bacterial viruses are tailed, non-tailed viruses appear to numerically dominate the upper oceans as they comprised 51-92% of the viral particles observed. Together, these results document global marine viral morphological characteristics, show that their minimal variability is more explained by environmental conditions than geography and suggest that non-tailed viruses might represent the most ecologically important targets for future research.

  14. Robust membrane detection based on tensor voting for electron tomography.

    PubMed

    Martinez-Sanchez, Antonio; Garcia, Inmaculada; Asano, Shoh; Lucic, Vladan; Fernandez, Jose-Jesus

    2014-04-01

    Electron tomography enables three-dimensional (3D) visualization and analysis of the subcellular architecture at a resolution of a few nanometers. Segmentation of structural components present in 3D images (tomograms) is often necessary for their interpretation. However, it is severely hampered by a number of factors that are inherent to electron tomography (e.g. noise, low contrast, distortion). Thus, there is a need for new and improved computational methods to facilitate this challenging task. In this work, we present a new method for membrane segmentation that is based on anisotropic propagation of the local structural information using the tensor voting algorithm. The local structure at each voxel is then refined according to the information received from other voxels. Because voxels belonging to the same membrane have coherent structural information, the underlying global structure is strengthened. In this way, local information is easily integrated at a global scale to yield segmented structures. The method performs well under the low signal-to-noise ratios typically found in tomograms of vitrified samples under cryo-tomography conditions and can bridge gaps present on membranes. Its performance is demonstrated by applications to tomograms of different biological samples and by quantitative comparison with a standard template-matching procedure. Copyright © 2014 Elsevier Inc. All rights reserved.

  15. High-throughput label-free screening of Euglena gracilis with optofluidic time-stretch quantitative phase microscopy

    NASA Astrophysics Data System (ADS)

    Guo, Baoshan; Lei, Cheng; Ito, Takuro; Yaxiaer, Yalikun; Kobayashi, Hirofumi; Jiang, Yiyue; Tanaka, Yo; Ozeki, Yasuyuki; Goda, Keisuke

    2017-02-01

    The development of reliable, sustainable, and economical sources of alternative fuels is an important, but challenging goal for the world. As an alternative to liquid fossil fuels, microalgal biofuel is expected to play a key role in reducing the detrimental effects of global warming since microalgae absorb atmospheric CO2 via photosynthesis. Unfortunately, conventional analytical methods only provide population-averaged lipid contents and fail to characterize a diverse population of microalgal cells with single-cell resolution in a noninvasive and interference-free manner. Here we demonstrate high-throughput label-free single-cell screening of lipid-producing microalgal cells with optofluidic time-stretch quantitative phase microscopy. In particular, we use Euglena gracilis - an attractive microalgal species that produces wax esters (suitable for biodiesel and aviation fuel after refinement) within lipid droplets. Our optofluidic time-stretch quantitative phase microscope is based on an integration of a hydrodynamic-focusing microfluidic chip, an optical time-stretch phase-contrast microscope, and a digital image processor equipped with machine learning. As a result, it provides both the opacity and phase contents of every single cell at a high throughput of 10,000 cells/s. We characterize heterogeneous populations of E. gracilis cells under two different culture conditions to evaluate their lipid production efficiency. Our method holds promise as an effective analytical tool for microalgae-based biofuel production.

  16. Filling the knowledge gap: Integrating quantitative genetics and genomics in graduate education and outreach

    USDA-ARS?s Scientific Manuscript database

    The genomics revolution provides vital tools to address global food security. Yet to be incorporated into livestock breeding, molecular techniques need to be integrated into a quantitative genetics framework. Within the U.S., with shrinking faculty numbers with the requisite skills, the capacity to ...

  17. Globalization in science education: an inevitable and beneficial trend.

    PubMed

    Charlton, Bruce G; Andras, Peter

    2006-01-01

    Globalization is one aspect of the larger phenomenon of modernization, which describes societies characterized by progressive growth in the complexity of communications. Despite its inevitable problems, globalization is a generally desirable phenomenon, since it enables increased efficiency, effectiveness and capability of societies and thereby, potentially benefits most people most of the time. Scientific research was one of the first global communication systems, especially at its most advanced levels. And high quality scientific education at the post-doctoral level is also now essentially global. The next steps will be for lower level science education - at doctoral, undergraduate, and even school teaching levels - to become progressively globalized. This phenomenon is already happening in the mathematical and quantitative sciences, and will probably spread to include other kinds of science. But to be efficient requires the development of a trading medium of internationally standardized and quantitative educational credits - for instance, standard certificates, objective comparative examinations, and a hierarchical qualifications structure (which will almost certainly be based on the United States system). Globalized education also requires a common language for organizational communications, which is already in place for the quantitative and mathematical sciences, and will be increasingly the case as competence in a simplified form of international scientific English becomes more universal. As such a global science education system grows there will be increased competition and migration of teachers and students. The law of comparative advantage suggests that such mobility will encourage societies to specialize in what they do best. 
For example, some countries (even among wealthy nations) may provide little advanced scientific education, and import the necessary expertise from abroad - this situation seems to be developing in Germany and France, who lack any top-quality research universities. Conversely, just a few countries may provide the bulk of advanced science education teaching - as well as applied and pure research personnel - for the rest of the world: potentially China and India might supply most of world's mathematical expertise. In conclusion, there are two complementary aspects to the globalization of science education: these are standardization and specialization. We anticipate a simultaneous trend towards international convergence of basic educational structures, certificates and English usage; with increasing national differentiation of specialist educational functions.

  18. Quantitative computed tomography measurements of emphysema for diagnosing asthma-chronic obstructive pulmonary disease overlap syndrome

    PubMed Central

    Xie, Mengshuang; Wang, Wei; Dou, Shuang; Cui, Liwei; Xiao, Wei

    2016-01-01

    Background The diagnostic criteria of asthma–COPD overlap syndrome (ACOS) are controversial. Emphysema is characteristic of COPD and usually does not exist in typical asthma patients. Emphysema in patients with asthma suggests the coexistence of COPD. Quantitative computed tomography (CT) allows repeated evaluation of emphysema noninvasively. We investigated the value of quantitative CT measurements of emphysema in the diagnosis of ACOS. Methods This study included 404 participants: 151 asthma patients, 125 COPD patients, and 128 normal control subjects. All the participants underwent pulmonary function tests and a high-resolution CT scan. Emphysema measurements were taken with the Airway Inspector software. The asthma patients were divided into high and low emphysema index (EI) groups based on the percentage of low attenuation areas less than −950 Hounsfield units. The characteristics of asthma patients with high EI were compared with those having low EI or COPD. Results The normal value of the percentage of low attenuation areas less than −950 Hounsfield units in Chinese adults aged >40 years was 2.79% ± 2.37%. COPD patients indicated more severe emphysema and more upper-zone-predominant distribution of emphysema than asthma patients or controls. Thirty-two (21.2%) of the 151 asthma patients had high EI. Compared with asthma patients with low EI, those with high EI were significantly older, more likely to be male, had more pack-years of smoking, had more upper-zone-predominant distribution of emphysema, and had greater airflow limitation. There were no significant differences in sex ratios, pack-years of smoking, airflow limitation, or emphysema distribution between asthma patients with high EI and COPD patients. A greater number of acute exacerbations were seen in asthma patients with high EI compared with those with low EI or COPD. 
Conclusion Asthma patients with high EI fulfill the features of ACOS, as described in the Global Initiative for Asthma and Global Initiative for Chronic Obstructive Lung Disease guidelines. Quantitative CT measurements of emphysema may help in diagnosing ACOS. PMID:27226711
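The emphysema index used in this record is the percentage of lung voxels with attenuation below −950 Hounsfield units (%LAA-950). A minimal sketch of that computation, using hypothetical Hounsfield-unit values in place of a real segmented CT scan:

```python
import numpy as np

def emphysema_index(hu_values, threshold=-950):
    """Percentage of lung voxels below the HU threshold (%LAA-950)."""
    hu = np.asarray(hu_values, dtype=float)
    return 100.0 * np.mean(hu < threshold)

# Hypothetical lung-voxel HU values: mostly normal aerated lung (~-850 HU),
# with a small emphysema-like component below -950 HU.
rng = np.random.default_rng(0)
voxels = np.concatenate([
    rng.normal(-850, 40, 9_700),   # normal aerated lung
    rng.normal(-970, 10, 300),     # emphysema-like low attenuation
])
ei = emphysema_index(voxels)
print(f"EI (%LAA < -950 HU): {ei:.2f}%")
```

In practice the voxel values would come from a lung segmentation of the CT volume; the threshold of −950 HU is the one stated in the abstract.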

  19. Gene expression profiling of single cells on large-scale oligonucleotide arrays

    PubMed Central

    Hartmann, Claudia H.; Klein, Christoph A.

    2006-01-01

    Over the last decade, important insights into the regulation of cellular responses to various stimuli were gained by global gene expression analyses of cell populations. More recently, specific cell functions and underlying regulatory networks of rare cells isolated from their natural environment moved to the center of attention. However, low cell numbers still hinder gene expression profiling of rare ex vivo material in biomedical research. Therefore, we developed a robust method for gene expression profiling of single cells on high-density oligonucleotide arrays with excellent coverage of low abundance transcripts. The protocol was extensively tested with freshly isolated single cells of very low mRNA content including single epithelial, mature and immature dendritic cells and hematopoietic stem cells. Quantitative PCR confirmed that the PCR-based global amplification method did not change the relative ratios of transcript abundance and unsupervised hierarchical cluster analysis revealed that the histogenetic origin of an individual cell is correctly reflected by the gene expression profile. Moreover, the gene expression data from dendritic cells demonstrate that cellular differentiation and pathway activation can be monitored in individual cells. PMID:17071717

  20. A lymphocyte spatial distribution graph-based method for automated classification of recurrence risk on lung cancer images

    NASA Astrophysics Data System (ADS)

    García-Arteaga, Juan D.; Corredor, Germán; Wang, Xiangxue; Velcheti, Vamsidhar; Madabhushi, Anant; Romero, Eduardo

    2017-11-01

    Tumor-infiltrating lymphocytes (TILs) occur when various classes of white blood cells migrate from the bloodstream towards the tumor, infiltrating it. The presence of TILs is predictive of the response of the patient to therapy. In this paper, we show how the automatic detection of lymphocytes in digital H&E histopathological images and the quantitative evaluation of the global lymphocyte configuration, evaluated through global features extracted from non-parametric graphs constructed from the lymphocytes' detected positions, can be correlated to the patient's outcome in early-stage non-small cell lung cancer (NSCLC). The method was assessed on a tissue microarray cohort composed of 63 NSCLC cases. From the evaluated graphs, minimum spanning trees and K-nn graphs showed the highest predictive ability, yielding F1 scores of 0.75 and 0.72 and accuracies of 0.67 and 0.69, respectively. The predictive power of the proposed methodology indicates that graphs may be used to develop objective measures of the infiltration grade of tumors, which can, in turn, be used by pathologists to improve the decision making and treatment planning processes.
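The graph-based features described here can be illustrated with a minimal sketch: build a minimum spanning tree over hypothetical lymphocyte centroids and extract simple global edge-length statistics. The specific features used in the paper are not given in this abstract, so the statistics below are only representative examples:

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.sparse.csgraph import minimum_spanning_tree

def mst_features(points):
    """Global features of the minimum spanning tree over detected
    lymphocyte positions: mean, std and max edge length."""
    d = squareform(pdist(points))      # dense pairwise distance matrix
    mst = minimum_spanning_tree(d)     # sparse MST with n-1 edges
    edges = mst.data
    return {"mean": edges.mean(), "std": edges.std(), "max": edges.max()}

# Hypothetical lymphocyte centroids (pixel coordinates).
rng = np.random.default_rng(1)
pts = rng.uniform(0, 100, size=(50, 2))
feats = mst_features(pts)
print(feats)
```

Such global statistics would then feed a classifier predicting recurrence risk, as in the paper's workflow.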

  1. Alternative difference analysis scheme combining R -space EXAFS fit with global optimization XANES fit for X-ray transient absorption spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhan, Fei; Tao, Ye; Zhao, Haifeng

    Time-resolved X-ray absorption spectroscopy (TR-XAS), based on the laser-pump/X-ray-probe method, is powerful in capturing the change of the geometrical and electronic structure of the absorbing atom upon excitation. TR-XAS data analysis is generally performed on the laser-on minus laser-off difference spectrum. Here, a new analysis scheme is presented for the TR-XAS difference fitting in both the extended X-ray absorption fine-structure (EXAFS) and the X-ray absorption near-edge structure (XANES) regions. R-space EXAFS difference fitting could quickly provide the main quantitative structure change of the first shell. The XANES fitting part introduces a global non-derivative optimization algorithm and optimizes the local structure change in a flexible way where both the core XAS calculation package and the search method in the fitting shell are changeable. The scheme was applied to the TR-XAS difference analysis of the Fe(phen)3 spin crossover complex and yielded reliable distance change and excitation population.

  2. Alternative difference analysis scheme combining R-space EXAFS fit with global optimization XANES fit for X-ray transient absorption spectroscopy.

    PubMed

    Zhan, Fei; Tao, Ye; Zhao, Haifeng

    2017-07-01

    Time-resolved X-ray absorption spectroscopy (TR-XAS), based on the laser-pump/X-ray-probe method, is powerful in capturing the change of the geometrical and electronic structure of the absorbing atom upon excitation. TR-XAS data analysis is generally performed on the laser-on minus laser-off difference spectrum. Here, a new analysis scheme is presented for the TR-XAS difference fitting in both the extended X-ray absorption fine-structure (EXAFS) and the X-ray absorption near-edge structure (XANES) regions. R-space EXAFS difference fitting could quickly provide the main quantitative structure change of the first shell. The XANES fitting part introduces a global non-derivative optimization algorithm and optimizes the local structure change in a flexible way where both the core XAS calculation package and the search method in the fitting shell are changeable. The scheme was applied to the TR-XAS difference analysis of the Fe(phen)3 spin crossover complex and yielded reliable distance change and excitation population.
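As a toy illustration of the non-derivative global fit described above, the sketch below recovers a distance change and excitation population from a synthetic difference spectrum. The sigmoidal `xas` function is only a stand-in for a real core XAS calculation package, and all numbers (distances, energy grid) are hypothetical:

```python
import numpy as np
from scipy.optimize import differential_evolution

# Toy stand-in for a core XAS calculation: the absorption edge position
# shifts with the first-shell distance d (hypothetical model).
def xas(energy, d):
    return 1.0 / (1.0 + np.exp(-(energy - 10.0 * d)))

energy = np.linspace(15, 30, 200)
d_ground = 2.0                      # assumed ground-state distance (Å)
true_dd, true_pop = 0.2, 0.3        # distance change, excited fraction
measured = true_pop * (xas(energy, d_ground + true_dd)
                       - xas(energy, d_ground))

def residual(params):
    dd, pop = params
    model = pop * (xas(energy, d_ground + dd) - xas(energy, d_ground))
    return np.sum((model - measured) ** 2)

# Non-derivative global search over (distance change, excitation population).
result = differential_evolution(residual,
                                bounds=[(-0.5, 0.5), (0.0, 1.0)],
                                seed=0, tol=1e-10)
print(result.x)   # best-fit (dd, pop)
```

The same pattern — wrap a forward spectrum calculation in a scalar residual and hand it to a derivative-free global optimizer — is what makes the calculation package and search method interchangeable, as the abstract notes.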

  3. Will Systems Biology Deliver Its Promise and Contribute to the Development of New or Improved Vaccines? What Really Constitutes the Study of "Systems Biology" and How Might Such an Approach Facilitate Vaccine Design.

    PubMed

    Germain, Ronald N

    2017-10-16

    A dichotomy exists in the field of vaccinology about the promise versus the hype associated with application of "systems biology" approaches to rational vaccine design. Some feel it is the only way to efficiently uncover currently unknown parameters controlling desired immune responses or discover what elements actually mediate these responses. Others feel that traditional experimental, often reductionist, methods for incrementally unraveling complex biology provide a more solid way forward, and that "systems" approaches are costly ways to collect data without gaining true insight. Here I argue that both views are inaccurate. This is largely because of confusion about what can be gained from classical experimentation versus statistical analysis of large data sets (bioinformatics) versus methods that quantitatively explain emergent properties of complex assemblies of biological components, with the latter reflecting what was previously called "physiology." Reductionist studies will remain essential for generating detailed insight into the functional attributes of specific elements of biological systems, but such analyses lack the power to provide a quantitative and predictive understanding of global system behavior. But by employing (1) large-scale screening methods for discovery of unknown components and connections in the immune system ( omics ), (2) statistical analysis of large data sets ( bioinformatics ), and (3) the capacity of quantitative computational methods to translate these individual components and connections into models of emergent behavior ( systems biology ), we will be able to better understand how the overall immune system functions and to determine with greater precision how to manipulate it to produce desired protective responses. Copyright © 2017 Cold Spring Harbor Laboratory Press; all rights reserved.

  4. 78 FR 12372 - UBS AG, et al.; Notice of Application and Temporary Order

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-22

    ... financial planning and wealth management consulting, asset-based and advisory services and transaction-based... Limited (``ESC GP''); UBS Financial Services Inc. (``UBSFS''); UBS Alternative and Quantitative... Switzerland, is a Swiss-based global financial services firm. UBS AG and its subsidiaries provide global...

  5. Modeling ready biodegradability of fragrance materials.

    PubMed

    Ceriani, Lidia; Papa, Ester; Kovarich, Simona; Boethling, Robert; Gramatica, Paola

    2015-06-01

    In the present study, quantitative structure-activity relationships were developed for predicting ready biodegradability of approximately 200 heterogeneous fragrance materials. Two classification methods, classification and regression tree (CART) and k-nearest neighbors (kNN), were applied to perform the modeling. The models were validated with multiple external prediction sets, and the structural applicability domain was verified by the leverage approach. The best models had good sensitivity (internal ≥80%; external ≥68%), specificity (internal ≥80%; external 73%), and overall accuracy (≥75%). Results from the comparison with BIOWIN global models, based on the group contribution method, show that the specific models developed in the present study perform better in prediction than BIOWIN6, in particular for the correct classification of not readily biodegradable fragrance materials. © 2015 SETAC.
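The train/external-validation workflow described above can be sketched with a from-scratch kNN classifier on hypothetical descriptor data (not the actual fragrance set, and omitting the CART model and applicability-domain check):

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=5):
    """Classify each test row by majority vote among its k nearest
    training neighbours (Euclidean distance)."""
    preds = []
    for x in X_test:
        d = np.linalg.norm(X_train - x, axis=1)
        nearest = y_train[np.argsort(d)[:k]]
        preds.append(np.bincount(nearest).argmax())
    return np.array(preds)

# Hypothetical molecular-descriptor data: two well-separated classes
# standing in for readily vs. not readily biodegradable materials.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (60, 4)),
               rng.normal(3.0, 1.0, (60, 4))])
y = np.array([0] * 60 + [1] * 60)
idx = rng.permutation(120)
train, test = idx[:90], idx[90:]          # hold out an external set

y_pred = knn_predict(X[train], y[train], X[test], k=5)
print(f"external accuracy: {np.mean(y_pred == y[test]):.2f}")
```

Sensitivity and specificity, the metrics reported in the abstract, follow from the confusion matrix of `y_pred` against `y[test]`.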

  6. Retrieving the hydrous minerals on Mars by sparse unmixing and the Hapke model using MRO/CRISM data

    NASA Astrophysics Data System (ADS)

    Lin, Honglei; Zhang, Xia

    2017-05-01

    The hydrous minerals on Mars preserve records of potential past aqueous activity. Quantitative information regarding mineralogical composition would enable a better understanding of the formation processes of these hydrous minerals, and provide unique insights into ancient habitable environments and the geological evolution of Mars. The Compact Reconnaissance Imaging Spectrometer for Mars (CRISM) has the advantage of both a high spatial and spectral resolution, which makes it suitable for the quantitative analysis of minerals on Mars. However, few studies have attempted to quantitatively retrieve the mineralogical composition of hydrous minerals on Mars using visible-infrared (VISIR) hyperspectral data due to their distribution characteristics (relatively low concentrations, located primarily in Noachian terrain, and unclear or unknown background minerals) and limitations of the spectral unmixing algorithms. In this study, we developed a modified sparse unmixing (MSU) method, combining the Hapke model with sparse unmixing. The MSU method considers the nonlinear mixed effects of minerals and avoids the difficulty of determining the spectra and number of endmembers from the image. The proposed method was tested successfully using laboratory mixture spectra and an Airborne Visible Infrared Imaging Spectrometer (AVIRIS) image of the Cuprite site (Nevada, USA). Then it was applied to CRISM hyperspectral images over Gale crater. Areas of hydrous mineral distribution were first identified by spectral features of water and hydroxyl absorption. The MSU method was performed on these areas, and the abundances were retrieved. The results indicated that the hydrous minerals consisted mostly of hydrous silicates, with abundances of up to 35%, as well as hydrous sulfates, with abundances ≤10%. Several main subclasses of hydrous minerals (e.g., Fe/Mg phyllosilicate, prehnite, and kieserite) were retrieved. 
Among these, Fe/Mg-phyllosilicate was the most abundant, with abundances ranging up to almost 30%, followed by prehnite and kieserite, with abundances lower than 15%. Our results are consistent with related research and in situ analyses of data from the rover Curiosity; thus, our method has the potential to be widely used for quantitative mineralogical mapping at the global scale of the surface of Mars.
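The sketch below illustrates plain linear sparse unmixing (nonnegative, l1-regularized least squares solved by projected proximal gradient) on a hypothetical spectral library. The paper's MSU method additionally couples sparse unmixing with the nonlinear Hapke model, which is not reproduced here:

```python
import numpy as np

def sparse_unmix(A, b, lam=0.01, iters=2000):
    """Nonnegative sparse abundances: min ||Ax-b||^2 + lam*||x||_1, x >= 0,
    solved by projected ISTA (proximal gradient)."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2     # safe gradient step size
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        g = A.T @ (A @ x - b)
        x = np.maximum(x - step * (g + lam), 0.0)  # shrink and project
    return x

# Hypothetical spectral library: 5 endmember spectra over 50 bands.
rng = np.random.default_rng(0)
A = np.abs(rng.normal(size=(50, 5)))
true_x = np.array([0.7, 0.0, 0.3, 0.0, 0.0])   # sparse true abundances
b = A @ true_x + rng.normal(scale=0.01, size=50)

x_hat = sparse_unmix(A, b, lam=0.005)
print(np.round(x_hat, 2))
```

Because the library carries many candidate endmembers while each pixel contains few, the l1 term drives most abundances to zero, which is what lets the method avoid choosing the number of endmembers from the image.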

  7. A fast cross-validation method for alignment of electron tomography images based on Beer-Lambert law.

    PubMed

    Yan, Rui; Edwards, Thomas J; Pankratz, Logan M; Kuhn, Richard J; Lanman, Jason K; Liu, Jun; Jiang, Wen

    2015-11-01

    In electron tomography, accurate alignment of tilt series is an essential step in attaining high-resolution 3D reconstructions. Nevertheless, quantitative assessment of alignment quality has remained a challenging issue, even though many alignment methods have been reported. Here, we report a fast and accurate method, tomoAlignEval, based on the Beer-Lambert law, for the evaluation of alignment quality. Our method is able to globally estimate the alignment accuracy by measuring the goodness of the log-linear relationship of the beam intensity attenuations at different tilt angles. Extensive tests with experimental data demonstrated its robust performance with stained and cryo samples. Our method is not only significantly faster but also more sensitive than measurements of tomogram resolution using the Fourier shell correlation method (FSCe/o). From these tests, we also conclude that while current alignment methods are sufficiently accurate for stained samples, inaccurate alignments remain a major limitation for high resolution cryo-electron tomography. Copyright © 2015 Elsevier Inc. All rights reserved.
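The Beer-Lambert criterion can be sketched as follows: under the law, log intensity is linear in 1/cos(tilt angle), so the goodness of a log-linear fit scores alignment quality. R² is used below as the goodness measure and all numbers are hypothetical; the paper's exact statistic is not specified in this abstract:

```python
import numpy as np

# Beer-Lambert model for a tilt series: I(theta) = I0 * exp(-c / cos(theta)),
# so ln(I) is linear in 1/cos(theta).
def log_linear_r2(tilt_deg, intensity):
    x = 1.0 / np.cos(np.deg2rad(tilt_deg))
    y = np.log(intensity)
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    return 1.0 - np.sum(resid ** 2) / np.sum((y - y.mean()) ** 2)

tilts = np.arange(-60, 61, 2, dtype=float)     # hypothetical tilt scheme
ideal = 1000.0 * np.exp(-1.2 / np.cos(np.deg2rad(tilts)))
rng = np.random.default_rng(0)
noisy = ideal * rng.normal(1.0, 0.01, tilts.size)   # well-behaved series
print(f"R^2 = {log_linear_r2(tilts, noisy):.4f}")
```

A poorly aligned series would scatter intensities away from the log-linear trend, lowering the score.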

  8. A fast cross-validation method for alignment of electron tomography images based on Beer-Lambert law

    PubMed Central

    Yan, Rui; Edwards, Thomas J.; Pankratz, Logan M.; Kuhn, Richard J.; Lanman, Jason K.; Liu, Jun; Jiang, Wen

    2015-01-01

    In electron tomography, accurate alignment of tilt series is an essential step in attaining high-resolution 3D reconstructions. Nevertheless, quantitative assessment of alignment quality has remained a challenging issue, even though many alignment methods have been reported. Here, we report a fast and accurate method, tomoAlignEval, based on the Beer-Lambert law, for the evaluation of alignment quality. Our method is able to globally estimate the alignment accuracy by measuring the goodness of the log-linear relationship of the beam intensity attenuations at different tilt angles. Extensive tests with experimental data demonstrated its robust performance with stained and cryo samples. Our method is not only significantly faster but also more sensitive than measurements of tomogram resolution using the Fourier shell correlation method (FSCe/o). From these tests, we also conclude that while current alignment methods are sufficiently accurate for stained samples, inaccurate alignments remain a major limitation for high resolution cryo-electron tomography. PMID:26455556

  9. Comparison of Quantitative Mass Spectrometry Platforms for Monitoring Kinase ATP Probe Uptake in Lung Cancer.

    PubMed

    Hoffman, Melissa A; Fang, Bin; Haura, Eric B; Rix, Uwe; Koomen, John M

    2018-01-05

    Recent developments in instrumentation and bioinformatics have led to new quantitative mass spectrometry platforms including LC-MS/MS with data-independent acquisition (DIA) and targeted analysis using parallel reaction monitoring mass spectrometry (LC-PRM), which provide alternatives to well-established methods, such as LC-MS/MS with data-dependent acquisition (DDA) and targeted analysis using multiple reaction monitoring mass spectrometry (LC-MRM). These tools have been used to identify signaling perturbations in lung cancers and other malignancies, supporting the development of effective kinase inhibitors and, more recently, providing insights into therapeutic resistance mechanisms and drug repurposing opportunities. However, detection of kinases in biological matrices can be challenging; therefore, activity-based protein profiling enrichment of ATP-utilizing proteins was selected as a test case for exploring the limits of detection of low-abundance analytes in complex biological samples. To examine the impact of different MS acquisition platforms, quantification of kinase ATP uptake following kinase inhibitor treatment was analyzed by four different methods: LC-MS/MS with DDA and DIA, LC-MRM, and LC-PRM. For discovery data sets, DIA increased the number of identified kinases by 21% and reduced missingness when compared with DDA. In this context, MRM and PRM were most effective at identifying global kinome responses to inhibitor treatment, highlighting the value of a priori target identification and manual evaluation of quantitative proteomics data sets. We compare results for a selected set of desthiobiotinylated peptides from PRM, MRM, and DIA and identify considerations for selecting a quantification method and postprocessing steps that should be used for each data acquisition strategy.

  10. Problem solving for breast health care delivery in low and middle resource countries (LMCs): consensus statement from the Breast Health Global Initiative.

    PubMed

    Harford, Joe B; Otero, Isabel V; Anderson, Benjamin O; Cazap, Eduardo; Gradishar, William J; Gralow, Julie R; Kane, Gabrielle M; Niëns, Laurens M; Porter, Peggy L; Reeler, Anne V; Rieger, Paula T; Shockney, Lillie D; Shulman, Lawrence N; Soldak, Tanya; Thomas, David B; Thompson, Beti; Winchester, David P; Zelle, Sten G; Badwe, Rajendra A

    2011-04-01

    International collaborations like the Breast Health Global Initiative (BHGI) can help low and middle income countries (LMCs) to establish or improve breast cancer control programs by providing evidence-based, resource-stratified guidelines for the management and control of breast cancer. The Problem Solving Working Group of the BHGI 2010 Global Summit met to develop a consensus statement on problem-solving strategies addressing breast cancer in LMCs. To better assess breast cancer burden in poorly studied populations, countries require accurate statistics regarding breast cancer incidence and mortality. To better identify health care system strengths and weaknesses, countries require reasonable indicators of true health system quality and capacity. Using qualitative and quantitative research methods, countries should formulate cancer control strategies to identify both system inefficiencies and patient barriers. Patient navigation programs linked to public advocacy efforts feed and strengthen functional early detection and treatment programs. Cost-effectiveness research and implementation science are tools that can guide and expand successful pilot programs. Copyright © 2011 Elsevier Ltd. All rights reserved.

  11. Quantification of habitat fragmentation reveals extinction risk in terrestrial mammals

    PubMed Central

    Crooks, Kevin R.; Burdett, Christopher L.; Theobald, David M.; King, Sarah R. B.; Rondinini, Carlo; Boitani, Luigi

    2017-01-01

    Although habitat fragmentation is often assumed to be a primary driver of extinction, global patterns of fragmentation and its relationship to extinction risk have not been consistently quantified for any major animal taxon. We developed high-resolution habitat fragmentation models and used phylogenetic comparative methods to quantify the effects of habitat fragmentation on the world’s terrestrial mammals, including 4,018 species across 26 taxonomic Orders. Results demonstrate that species with more fragmentation are at greater risk of extinction, even after accounting for the effects of key macroecological predictors, such as body size and geographic range size. Species with higher fragmentation had smaller ranges and a lower proportion of high-suitability habitat within their range, and most high-suitability habitat occurred outside of protected areas, further elevating extinction risk. Our models provide a quantitative evaluation of extinction risk assessments for species, allow for identification of emerging threats in species not classified as threatened, and provide maps of global hotspots of fragmentation for the world’s terrestrial mammals. Quantification of habitat fragmentation will help guide threat assessment and strategic priorities for global mammal conservation. PMID:28673992

  12. Quantification of habitat fragmentation reveals extinction risk in terrestrial mammals.

    PubMed

    Crooks, Kevin R; Burdett, Christopher L; Theobald, David M; King, Sarah R B; Di Marco, Moreno; Rondinini, Carlo; Boitani, Luigi

    2017-07-18

    Although habitat fragmentation is often assumed to be a primary driver of extinction, global patterns of fragmentation and its relationship to extinction risk have not been consistently quantified for any major animal taxon. We developed high-resolution habitat fragmentation models and used phylogenetic comparative methods to quantify the effects of habitat fragmentation on the world's terrestrial mammals, including 4,018 species across 26 taxonomic Orders. Results demonstrate that species with more fragmentation are at greater risk of extinction, even after accounting for the effects of key macroecological predictors, such as body size and geographic range size. Species with higher fragmentation had smaller ranges and a lower proportion of high-suitability habitat within their range, and most high-suitability habitat occurred outside of protected areas, further elevating extinction risk. Our models provide a quantitative evaluation of extinction risk assessments for species, allow for identification of emerging threats in species not classified as threatened, and provide maps of global hotspots of fragmentation for the world's terrestrial mammals. Quantification of habitat fragmentation will help guide threat assessment and strategic priorities for global mammal conservation.

  13. Global Conformational Selection and Local Induced Fit for the Recognition between Intrinsic Disordered p53 and CBP

    PubMed Central

    Yu, Qingfen; Ye, Wei; Wang, Wei; Chen, Hai-Feng

    2013-01-01

    The transactivation domain (TAD) of tumor suppressor p53 can bind with the nuclear coactivator binding domain (NCBD) of cyclic-AMP response element binding protein (CBP) and activate transcription. NMR experiments demonstrate that both apo-NCBD and TAD are intrinsically disordered and that the bound NCBD/TAD complex undergoes a transition to a well-folded state. The recognition mechanism between intrinsically disordered proteins is still hotly debated. Molecular dynamics (MD) simulations in explicit solvent are used to study the recognition mechanism between intrinsically disordered TAD and NCBD. The average RMSD values between bound and corresponding apo states and Kolmogorov-Smirnov P test analysis indicate that TAD and NCBD may follow an induced fit mechanism. Quantitative analysis indicates there is also a global conformational selection. In summary, the recognition of TAD and NCBD might obey a local induced fit and global conformational selection. These conclusions are further supported by high-temperature unbinding kinetics and room temperature landscape analysis. These methods can be used to study the recognition mechanism of other intrinsically disordered proteins. PMID:23555731
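A two-sample Kolmogorov-Smirnov test of the kind mentioned above can be sketched with hypothetical per-snapshot RMSD samples for the apo and bound ensembles (the values below are illustrative, not simulation output):

```python
import numpy as np
from scipy.stats import ks_2samp

# Hypothetical per-snapshot RMSD values (Å) for apo vs. bound ensembles.
rng = np.random.default_rng(0)
rmsd_apo = rng.normal(6.0, 1.0, 500)     # broad, disordered apo ensemble
rmsd_bound = rng.normal(2.0, 0.5, 500)   # compact, folded bound ensemble

stat, p = ks_2samp(rmsd_apo, rmsd_bound)
print(f"KS statistic = {stat:.3f}, p = {p:.2e}")
```

A large KS statistic with a small p-value indicates the two conformational distributions differ, the kind of evidence the abstract uses to distinguish induced fit from conformational selection.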

  14. A global time-dependent model of thunderstorm electricity. I - Mathematical properties of the physical and numerical models

    NASA Technical Reports Server (NTRS)

    Browning, G. L.; Tzur, I.; Roble, R. G.

    1987-01-01

    A time-dependent model is introduced that can be used to simulate the interaction of a thunderstorm with its global electrical environment. The model solves the continuity equation of the Maxwell current, which is assumed to be composed of the conduction, displacement, and source currents. Boundary conditions which can be used in conjunction with the continuity equation to form a well-posed initial-boundary value problem are determined. Properties of various components of solutions of the initial-boundary value problem are analytically determined. The results indicate that the problem has two time scales, one determined by the background electrical conductivity and the other by the time variation of the source function. A numerical method for obtaining quantitative results is introduced, and its properties are studied. Some simulation results on the evolution of the displacement and conduction currents during the electrification of a storm are presented.

  15. Multidimensional Scaling Analysis of the Dynamics of a Country Economy

    PubMed Central

    Mata, Maria Eugénia

    2013-01-01

    This paper analyzes the Portuguese short-run business cycles over the last 150 years and presents the multidimensional scaling (MDS) for visualizing the results. The analytical and numerical assessment of this long-run perspective reveals periods with close connections between the macroeconomic variables related to government accounts equilibrium, balance of payments equilibrium, and economic growth. The MDS method is adopted for a quantitative statistical analysis. In this way, similarity clusters of several historical periods emerge in the MDS maps, namely, in identifying similarities and dissimilarities that identify periods of prosperity and crises, growth, and stagnation. Such features are major aspects of collective national achievement, to which can be associated the impact of international problems such as the World Wars, the Great Depression, or the current global financial crisis, as well as national events in the context of broad political blueprints for the Portuguese society in the rising globalization process. PMID:24294132
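Classical (Torgerson) MDS, one standard variant of the MDS method used here, can be sketched as follows with hypothetical macroeconomic indicator vectors standing in for the historical data:

```python
import numpy as np

def classical_mds(D, dims=2):
    """Classical (Torgerson) MDS: embed points in `dims` dimensions
    from a matrix of pairwise distances D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)
    order = np.argsort(vals)[::-1][:dims]    # largest eigenvalues first
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0.0))

# Hypothetical yearly indicator vectors; distances between years become
# a 2-D map in which similar periods cluster together.
rng = np.random.default_rng(0)
years = np.vstack([rng.normal(0, 1, (5, 2)),   # e.g. crisis-like years
                   rng.normal(5, 1, (5, 2))])  # e.g. growth-like years
D = np.linalg.norm(years[:, None] - years[None, :], axis=-1)
coords = classical_mds(D)
print(coords.shape)
```

On a map like this, years from similar historical periods land near each other, which is how the similarity clusters described in the abstract emerge.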

  16. Global grey matter volume in adult bipolar patients with and without lithium treatment: A meta-analysis.

    PubMed

    Sun, Yue Ran; Herrmann, Nathan; Scott, Christopher J M; Black, Sandra E; Khan, Maisha M; Lanctôt, Krista L

    2018-01-01

    The goal of this meta-analysis was to quantitatively summarize the evidence available on the differences in grey matter volume between lithium-treated and lithium-free bipolar patients. A systematic search was conducted in Cochrane Central, Embase, MEDLINE, and PsycINFO databases for original peer-reviewed journal articles that reported on global grey matter volume in lithium-medicated and lithium-free bipolar patients. The standardized mean difference (SMD), computed as Hedges' g, was used to calculate effect size in a random-effects model. Risk of publication bias was assessed using Egger's test and quality of evidence was assessed using standard criteria. There were 15 studies with a total of 854 patients (368 lithium-medicated, 486 lithium-free) included in the meta-analysis. Global grey matter volume was significantly larger in lithium-treated bipolar patients compared to lithium-free patients (SMD: 0.17, 95% CI: 0.01-0.33; z = 2.11, p = 0.035). Additionally, there was a difference in global grey matter volume between groups in studies that employed semi-automated segmentation methods (SMD: 0.66, 95% CI: 0.01-1.31; z = 1.99, p = 0.047), but no significant difference in studies that used fully-automated segmentation. No publication bias was detected (bias coefficient = - 0.65, p = 0.46). Variability in imaging methods and lack of high-quality evidence limits the interpretation of the findings. Results suggest that lithium-treated patients have a greater global grey matter volume than those who were lithium-free. Further study of the relationship between lithium and grey matter volume may elucidate the therapeutic potential of lithium in conditions characterized by abnormal changes in brain structure. Crown Copyright © 2017. Published by Elsevier B.V. All rights reserved.
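The effect-size machinery named in this abstract can be sketched as follows: Hedges' g per study, then a random-effects pool. The DerSimonian-Laird estimator is assumed here (the abstract only says "random-effects model"), and the study values are hypothetical, not the meta-analysis data:

```python
import numpy as np

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Bias-corrected standardized mean difference (Hedges' g) and variance."""
    sp = np.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    j = 1.0 - 3.0 / (4.0 * (n1 + n2) - 9.0)   # small-sample correction
    var_d = (n1 + n2) / (n1 * n2) + d**2 / (2.0 * (n1 + n2))
    return j * d, j**2 * var_d

def random_effects_pool(gs, vs):
    """DerSimonian-Laird random-effects pooled effect and standard error."""
    gs, vs = np.asarray(gs), np.asarray(vs)
    w = 1.0 / vs
    fixed = np.sum(w * gs) / np.sum(w)
    q = np.sum(w * (gs - fixed) ** 2)                 # heterogeneity Q
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(gs) - 1)) / c)          # between-study variance
    w_star = 1.0 / (vs + tau2)
    pooled = np.sum(w_star * gs) / np.sum(w_star)
    return pooled, np.sqrt(1.0 / np.sum(w_star))

# Hypothetical per-study grey matter volumes (mL): lithium vs. lithium-free,
# as (mean1, sd1, n1, mean2, sd2, n2).
studies = [(705, 60, 30, 690, 55, 28),
           (660, 50, 25, 655, 52, 40),
           (712, 58, 45, 700, 61, 50),
           (688, 47, 20, 680, 49, 22)]
effects = [hedges_g(*s) for s in studies]
pooled, se = random_effects_pool([e[0] for e in effects],
                                 [e[1] for e in effects])
print(f"pooled g = {pooled:.2f} (SE {se:.2f}, z = {pooled / se:.2f})")
```

The z statistic and confidence interval reported in the abstract follow directly from `pooled` and `se`.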

  17. Overestimation of molecular and modelling methods and underestimation of traditional taxonomy leads to real problems in assessing and handling of the world's biodiversity.

    PubMed

    Löbl, Ivan

    2014-02-27

    Since the 1992 Rio Convention on Biological Diversity, the earth's biodiversity is a matter of constant public interest, but the community of scientists who describe and delimit species in mega-diverse animal groups, i.e. the bulk of global biodiversity, faces ever-increasing impediments. The problems are rooted in poor understanding of specificity of taxonomy, and overestimation of quantitative approaches and modern technology. A high proportion of the animal species still remains to be discovered and studied, so a more balanced approach to the situation is needed.

  18. Translation into Brazilian Portuguese and validation of the "Quantitative Global Scarring Grading System for Post-acne Scarring" *

    PubMed Central

    Cachafeiro, Thais Hofmann; Escobar, Gabriela Fortes; Maldonado, Gabriela; Cestari, Tania Ferreira

    2014-01-01

    The "Quantitative Global Scarring Grading System for Postacne Scarring" was developed in English for acne scar grading, based on the number and severity of each type of scar. The aims of this study were to translate this scale into Brazilian Portuguese and verify its reliability and validity. The study followed five steps: Translation, Expert Panel, Back Translation, Approval of authors and Validation. The translated scale showed high internal consistency and high test-retest reliability, confirming its reproducibility. Therefore, it has been validated for our population and can be recommended as a reliable instrument to assess acne scarring. PMID:25184939
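Internal consistency of a translated scale is commonly quantified with Cronbach's alpha; the abstract does not name the statistic, so the following is only an assumed illustration with hypothetical item scores:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_subjects x n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)

# Hypothetical item scores: four scale items driven by one latent
# severity score, so internal consistency should be high.
rng = np.random.default_rng(0)
latent = rng.normal(0, 1, 40)
scores = np.column_stack([latent + rng.normal(0, 0.3, 40)
                          for _ in range(4)])
alpha = cronbach_alpha(scores)
print(f"alpha = {alpha:.2f}")
```

Test-retest reliability, the other property the abstract verifies, would instead correlate total scores from two rating sessions.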

  19. The Ether Wind and the Global Positioning System.

    ERIC Educational Resources Information Center

    Muller, Rainer

    2000-01-01

    Explains how students can perform a refutation of the ether theory using information from the Global Positioning System (GPS). Discusses the functioning of the GPS, qualitatively describes how position determination would be affected by an ether wind, and illustrates the pertinent ideas with a simple quantitative model. (WRM)

  20. Evaluating Sustainable Development Solutions Quantitatively: Competence Modelling for GCE and ESD

    ERIC Educational Resources Information Center

    Böhm, Marko; Eggert, Sabina; Barkmann, Jan; Bögeholz, Susanne

    2016-01-01

    To comprehensively address global environmental challenges such as biodiversity loss, citizens need an understanding of the socio-economic fundamentals of human behaviour in relation to natural resources. We argue that Global Citizenship Education and Education for Sustainable Development provide a core set of socio-economic competencies that can…

  1. Good match exploration for infrared face recognition

    NASA Astrophysics Data System (ADS)

    Yang, Changcai; Zhou, Huabing; Sun, Sheng; Liu, Renfeng; Zhao, Ji; Ma, Jiayi

    2014-11-01

    Establishing good feature correspondence is a critical prerequisite and a challenging task for infrared (IR) face recognition. Recent studies revealed that the scale invariant feature transform (SIFT) descriptor outperforms other local descriptors for feature matching. However, it only uses local appearance information for matching, and hence inevitably leads to a number of false matches. To address this issue, this paper explores global structure information (GSI) among SIFT correspondences, and proposes a new method, SIFT-GSI, for good match exploration. This is achieved by fitting a smooth mapping function for the underlying correct matches, which involves softassign and deterministic annealing. Quantitative comparisons with state-of-the-art methods on a publicly available IR human face database demonstrate that SIFT-GSI significantly outperforms other methods for feature matching, and hence it is able to improve the reliability of IR face recognition systems.

  2. Lipidomics by ultrahigh performance liquid chromatography-high resolution mass spectrometry and its application to complex biological samples

    PubMed Central

    Triebl, Alexander; Trötzmüller, Martin; Hartler, Jürgen; Stojakovic, Tatjana; Köfeler, Harald C

    2018-01-01

    An improved approach for selective and sensitive identification and quantitation of lipid molecular species using reversed phase chromatography coupled to high resolution mass spectrometry was developed. The method is applicable to a wide variety of biological matrices using a simple liquid-liquid extraction procedure. Together, this approach combines three selectivity criteria: Reversed phase chromatography separates lipids according to their acyl chain length and degree of unsaturation and is capable of resolving positional isomers of lysophospholipids, as well as structural isomers of diacyl phospholipids and glycerolipids. Orbitrap mass spectrometry delivers the elemental composition of both positive and negative ions with high mass accuracy. Finally, automatically generated tandem mass spectra provide structural insight into numerous glycerolipids, phospholipids, and sphingolipids within a single run. Method validation resulted in a linearity range of more than four orders of magnitude, good values for accuracy and precision at biologically relevant concentration levels, and limits of quantitation of a few femtomoles on column. Hundreds of lipid molecular species were detected and quantified in three different biological matrices, which cover well the wide variety and complexity of various model organisms in lipidomic research. Together with a reliable software package, this method is a prime choice for global lipidomic analysis of even the most complex biological samples. PMID:28415015

  3. Deferasirox, deferiprone and desferrioxamine treatment in thalassemia major patients: cardiac iron and function comparison determined by quantitative magnetic resonance imaging

    PubMed Central

    Pepe, Alessia; Meloni, Antonella; Capra, Marcello; Cianciulli, Paolo; Prossomariti, Luciano; Malaventura, Cristina; Putti, Maria Caterina; Lippi, Alma; Romeo, Maria Antonietta; Bisconte, Maria Grazia; Filosa, Aldo; Caruso, Vincenzo; Quarta, Antonella; Pitrolo, Lorella; Missere, Massimiliano; Midiri, Massimo; Rossi, Giuseppe; Positano, Vincenzo; Lombardi, Massimo; Maggio, Aurelio

    2011-01-01

    Background: Oral deferiprone was suggested to be more effective than subcutaneous desferrioxamine for removing heart iron. The oral once-daily chelator deferasirox has recently become commercially available, but its long-term efficacy on cardiac iron and function has not yet been established. Our study aimed to compare the effectiveness of deferasirox, deferiprone and desferrioxamine on myocardial and liver iron concentrations and bi-ventricular function in thalassemia major patients by means of quantitative magnetic resonance imaging. Design and Methods: From the first 550 thalassemia subjects enrolled in the Myocardial Iron Overload in Thalassemia network, we retrospectively selected thalassemia major patients who had been receiving one chelator alone for longer than one year. We identified three groups of patients: 24 treated with deferasirox, 42 treated with deferiprone and 89 treated with desferrioxamine. Myocardial iron concentrations were measured by the multislice multiecho T2* technique. Biventricular function parameters were quantitatively evaluated by cine images. Liver iron concentrations were measured by the multiecho T2* technique. Results: The global heart T2* value was significantly higher in the deferiprone group (34±11 ms) than in the deferasirox (21±12 ms) and desferrioxamine (27±11 ms) groups (P=0.0001). We found higher left ventricular ejection fractions in the deferiprone and desferrioxamine groups versus the deferasirox group (P=0.010). Liver iron concentration, measured as the T2* signal, was significantly lower in the desferrioxamine group versus the deferiprone and deferasirox groups (P=0.004). Conclusions: The cohort of patients treated with oral deferiprone showed less myocardial iron burden and better global systolic ventricular function compared to the patients treated with oral deferasirox or subcutaneous desferrioxamine. PMID:20884710
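    The T2* values reported above come from fitting a signal-decay curve to multi-echo MR images. A minimal sketch of that fit, assuming a monoexponential decay model and synthetic echo times and signals (the standard model for T2* mapping, not the study's specific pipeline):

```python
import numpy as np

def fit_t2star(te_ms, signal):
    """Estimate T2* (ms) from multi-echo signals via a log-linear fit.

    Assumes monoexponential decay S(TE) = S0 * exp(-TE / T2*); taking logs
    gives a straight line whose slope is -1/T2*.
    """
    slope, intercept = np.polyfit(te_ms, np.log(signal), 1)
    return -1.0 / slope

# Synthetic echo train: S0 = 100, true T2* = 25 ms.
te = np.array([2.0, 4.0, 6.0, 8.0, 10.0, 12.0])
sig = 100.0 * np.exp(-te / 25.0)
print(round(fit_t2star(te, sig), 1))  # 25.0
```

In practice a nonlinear fit with a noise-floor term is often preferred at low signal, but the log-linear form shows the core relationship.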

  4. Development of a quantitative fluorescence single primer isothermal amplification-based method for the detection of Salmonella.

    PubMed

    Wang, Jianchang; Li, Rui; Hu, Lianxia; Sun, Xiaoxia; Wang, Jinfeng; Li, Jing

    2016-02-16

    Food-borne disease caused by Salmonella has long been, and continues to be, an important global public health problem, necessitating rapid and accurate detection of Salmonella in food. Real-time PCR is the most recently developed approach for Salmonella detection. Single primer isothermal amplification (SPIA), a novel gene amplification technique, has emerged as an attractive microbiological testing method. SPIA is performed at a constant temperature, eliminating the need for an expensive thermo-cycler. In addition, SPIA reactions can be accomplished in 30 min, faster than real-time PCR, which usually takes over 2 h. We developed a quantitative fluorescence SPIA-based method for the detection of Salmonella. Using Salmonella Typhimurium genomic DNA as template and a primer targeting the Salmonella invA gene, we showed that the detection limit of SPIA was 2.0 × 10¹ fg DNA. Its successful amplification of genomic DNA from different Salmonella serotypes, but not of non-Salmonella bacterial DNA, demonstrated the specificity of SPIA. Furthermore, this method was validated with artificially contaminated beef. In conclusion, we showed high sensitivity and specificity of SPIA in the detection of Salmonella, comparable to real-time PCR. In addition, SPIA is faster and more cost-effective (no expensive thermo-cyclers required), making it a potential alternative for field detection of Salmonella in resource-limited settings that are commonly encountered in developing countries. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. Micro-CT Imaging Reveals Mekk3 Heterozygosity Prevents Cerebral Cavernous Malformations in Ccm2-Deficient Mice

    PubMed Central

    Choi, Jaesung P.; Foley, Matthew; Zhou, Zinan; Wong, Weng-Yew; Gokoolparsadh, Naveena; Arthur, J. Simon C.; Li, Dean Y.; Zheng, Xiangjian

    2016-01-01

    Mutations in the CCM1 (also known as KRIT1), CCM2, or CCM3 (also known as PDCD10) genes cause cerebral cavernous malformation (CCM) in humans. Mouse models of CCM disease have been established by deleting Ccm genes in postnatal animals. These mouse models provide invaluable tools to investigate molecular mechanisms and therapeutic approaches for CCM disease. However, the full value of these animal models is limited by the lack of an accurate and quantitative method to assess lesion burden and progression. In the present study we have established a refined and detailed contrast-enhanced X-ray micro-CT method to measure CCM lesion burden in mouse brains. As this study utilized a voxel dimension of 9.5 μm (leading to a minimum feature size of approximately 25 μm), it is sufficient to measure CCM lesion volume and number globally and accurately, and to provide high-resolution 3-D mapping of CCM lesions in mouse brains. Using this method, we found that loss of Ccm1 or Ccm2 in neonatal endothelium confers CCM lesions in the mouse hindbrain of similar total volume and number. This quantitative approach also demonstrated a rescue of CCM lesions by simultaneous deletion of one allele of Mekk3. This method should enhance the value of the established mouse models for studying the molecular basis and potential therapies for CCM and other cerebrovascular diseases. PMID:27513872

  6. Evaluation of bone formation in calcium phosphate scaffolds with μCT-method validation using SEM.

    PubMed

    Lewin, S; Barba, A; Persson, C; Franch, J; Ginebra, M-P; Öhman-Mägi, C

    2017-10-05

    There is a plethora of calcium phosphate (CaP) scaffolds used as synthetic substitutes for bone grafts. Scaffold performance is often evaluated from the quantity of bone formed within or in direct contact with the scaffold. Micro-computed tomography (μCT) allows three-dimensional evaluation of bone formation inside scaffolds. However, the almost identical X-ray attenuation of CaP and bone makes these phases difficult to separate in μCT images. Commonly, segmentation of bone in μCT images is based on gray-scale intensity, with manually determined global thresholds. However, image analysis methods, and methods for manual thresholding in particular, lack standardization and may consequently suffer from subjectivity. The aim of the present study was to provide a methodological framework for addressing these issues. Bone formation in two types of CaP scaffold architectures (foamed and robocast), obtained from a larger animal study (a 12-week canine model), was evaluated by μCT. In addition, cross-sectional scanning electron microscopy (SEM) images were acquired as references to determine thresholds and to validate the result. μCT datasets were registered to the corresponding SEM reference. Global thresholds were then determined by quantitatively correlating the area fractions in the μCT image with the area fractions in the corresponding SEM image. For comparison, area fractions were also quantified using global thresholds determined manually by two different approaches. In the validation, the manually determined thresholds resulted in large average errors in area fraction (up to 17%), whereas for the evaluation using SEM references, the errors were estimated to be less than 3%. Furthermore, basing the thresholds on a single SEM reference gave lower errors than determining them manually. This study provides an objective, robust and less error-prone method to determine global thresholds for the evaluation of bone formation in CaP scaffolds.
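    The SEM-referenced thresholding idea can be sketched in a few lines: given the bone area fraction measured in a registered SEM reference, choose the gray value whose μCT bone area fraction best matches it. The toy slice and the assumed 25% SEM fraction below are illustrative, not the study's data or code.

```python
import numpy as np

def threshold_from_reference(uct_slice, target_bone_fraction):
    """Pick the gray-value threshold whose bone area fraction in the uCT
    slice best matches the area fraction measured in a registered SEM
    reference image."""
    best_t, best_err = None, np.inf
    for t in np.unique(uct_slice):
        frac = np.mean(uct_slice >= t)  # fraction of pixels classified as bone
        err = abs(frac - target_bone_fraction)
        if err < best_err:
            best_t, best_err = t, err
    return best_t

# Toy 4x4 slice; suppose the SEM reference says 25% of the area is bone.
img = np.array([[10, 20, 30, 40],
                [10, 20, 30, 40],
                [10, 20, 30, 40],
                [10, 20, 30, 40]])
print(threshold_from_reference(img, 0.25))  # 40
```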

  7. Determination of somatropin charged variants by capillary zone electrophoresis - optimisation, verification and implementation of the European pharmacopoeia method.

    PubMed

    Storms, S M; Feltus, A; Barker, A R; Joly, M-A; Girard, M

    2009-03-01

    Measurement of somatropin charged variants by isoelectric focusing was replaced with capillary zone electrophoresis in the January 2006 European Pharmacopoeia Supplement 5.3, based on results from an interlaboratory collaborative study. Due to incompatibilities and method-robustness issues encountered prior to verification, a number of method parameters required optimisation. As the use of a diode array detector at 195 nm or 200 nm led to a loss of resolution, a variable wavelength detector using a 200 nm filter was employed. Improved injection repeatability was obtained by increasing the injection time and pressure, and changing the sample diluent from water to running buffer. Finally, definition of capillary pre-treatment and rinse procedures resulted in more consistent separations over time. Method verification data are presented demonstrating linearity, specificity, repeatability, intermediate precision, limit of quantitation, sample stability, solution stability, and robustness. Based on these experiments, several modifications to the current method have been recommended and incorporated into the European Pharmacopoeia to help improve method performance across laboratories globally.

  8. Identifiability and identification of trace continuous pollutant source.

    PubMed

    Qu, Hongquan; Liu, Shouwen; Pang, Liping; Hu, Tao

    2014-01-01

    Accidental pollution events often threaten people's health and lives, and prompt identification of the pollutant source is necessary so that remedial actions can be taken. In this paper, a method is developed to identify a sudden, continuous trace-emission pollutant source in an enclosed space. A location probability model is first set up, and the source is then identified by searching for the global optimum of the location probability. To assess the identifiability performance of the presented method, the concept of a synergy degree of velocity fields is introduced to quantitatively analyze the impact of the velocity field on identification performance. Based on this concept, simulation cases were conducted, and the application conditions of the method were obtained from the simulation studies. To verify the presented method, we designed an experiment and identified an unknown source appearing in the experimental space. The result showed that the method can identify a sudden continuous trace source when the studied situation satisfies the application conditions.
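    The search for a global optimum of a location probability can be sketched as a brute-force grid search over candidate source locations. The 1-D forward model, sensor layout, and Gaussian likelihood below are illustrative assumptions only, not the paper's model:

```python
import numpy as np

def identify_source(candidates, forward_model, measured, sigma=1.0):
    """Return the candidate source location with the highest location probability.

    Each candidate's probability is a Gaussian likelihood of the mismatch
    between measured sensor concentrations and those predicted by the forward
    model; the brute-force search stands in for the paper's global optimisation.
    """
    def log_prob(loc):
        resid = measured - forward_model(loc)
        return -np.sum(resid ** 2) / (2 * sigma ** 2)
    return max(candidates, key=log_prob)

# Hypothetical forward model: concentration decays with sensor-source distance.
sensors = np.array([0.0, 5.0, 10.0])
def model(src_x):
    return 1.0 / (1.0 + np.abs(sensors - src_x))

measured = model(5.0)                  # readings produced by a source at x = 5
candidates = np.linspace(0, 10, 21)    # search grid, 0.5 m spacing
print(identify_source(candidates, model, measured))  # 5.0
```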

  9. Quantitative Metrics for Provenance in the Global Change Information System

    NASA Astrophysics Data System (ADS)

    Sherman, R. A.; Tipton, K.; Elamparuthy, A.

    2017-12-01

    The Global Change Information System (GCIS) is an open-source web-based resource to provide traceable provenance for government climate information, particularly the National Climate Assessment and other climate science reports from the U.S. Global Change Research Program. Since 2014, GCIS has been adding and updating information and linking records to make the system as complete as possible for the key reports. Our total count of records has grown to well over 20,000, but until recently there hasn't been an easy way to measure how well all those records were serving the mission of providing provenance. The GCIS team has recently established quantitative measures of whether each record has sufficient metadata and linkages to be useful for users of our featured climate reports. We will describe our metrics and show how they can be used to guide future development of GCIS and aid users of government climate data.

  10. Quantifying the isotopic composition of NOx emission sources: An analysis of collection methods

    NASA Astrophysics Data System (ADS)

    Fibiger, D.; Hastings, M.

    2012-04-01

    We analyze various collection methods for nitrogen oxides, NOx (NO2 and NO), used to evaluate the nitrogen isotopic composition (δ15N). Atmospheric NOx is a major contributor to acid rain deposition upon its conversion to nitric acid; it also plays a significant role in determining air quality through the production of tropospheric ozone. NOx is released by both anthropogenic (fossil fuel combustion, biomass burning, aircraft emissions) and natural (lightning, biogenic production in soils) sources. Global concentrations of NOx are rising because of increased anthropogenic emissions, while natural source emissions also contribute significantly to the global NOx burden. The contributions of both natural and anthropogenic sources and their considerable variability in space and time make it difficult to attribute local NOx concentrations (and, thus, nitric acid) to a particular source. Several recent studies suggest that variability in the isotopic composition of nitric acid deposition is related to variability in the isotopic signatures of NOx emission sources. Nevertheless, the isotopic composition of most NOx sources has not been thoroughly constrained. Ultimately, the direct capture and quantification of the nitrogen isotopic signatures of NOx sources will allow for the tracing of NOx emissions sources and their impact on environmental quality. Moreover, this will provide a new means by which to verify emissions estimates and atmospheric models. We present laboratory results of methods used for capturing NOx from air into solution. A variety of methods have been used in field studies, but no independent laboratory verification of the efficiencies of these methods has been performed. When analyzing isotopic composition, it is important that NOx be collected quantitatively or the possibility of fractionation must be constrained. 
We have found that collection efficiency can vary widely under different conditions in the laboratory and fractionation does not vary predictably with collection efficiency. For example, prior measurements frequently utilized triethanolamine solution for collecting NOx, but the collection efficiency was found to drop quickly as the solution aged. The most promising method tested is a NaOH/KMnO4 solution (Margeson and Knoll, Anal. Chem., 1985) which can collect NOx quantitatively from the air. Laboratory tests of previously used methods, along with progress toward creating a suitable and verifiable field deployable collection method will be presented.

  11. A comparison of performance of automatic cloud coverage assessment algorithm for Formosat-2 image using clustering-based and spatial thresholding methods

    NASA Astrophysics Data System (ADS)

    Hsu, Kuo-Hsien

    2012-11-01

    Formosat-2 imagery is high-spatial-resolution (2 m GSD) remote sensing satellite data comprising one panchromatic band and four multispectral bands (blue, green, red, near-infrared). An essential step in the daily processing of received Formosat-2 images is to estimate the cloud statistics of each image using an Automatic Cloud Coverage Assessment (ACCA) algorithm; the cloud statistics are subsequently recorded as important metadata in the image product catalog. In this paper, we propose an ACCA method with two consecutive stages: pre-processing and post-processing analysis. For pre-processing analysis, unsupervised K-means classification, Sobel's method, a thresholding method, non-cloudy pixel re-examination, and a cross-band filter method are applied in sequence to determine the cloud statistics. For post-processing analysis, the box-counting fractal method is applied. In other words, the cloud statistics are first determined via pre-processing analysis, and their correctness across spectral bands is then cross-examined qualitatively and quantitatively via post-processing analysis. The selection of an appropriate thresholding method is critical to the result of the ACCA method. Therefore, in this work, we first conducted a series of experiments on clustering-based and spatial thresholding methods, including Otsu's, local entropy (LE), joint entropy (JE), global entropy (GE), and global relative entropy (GRE) methods, for performance comparison. The results show that Otsu's and the GE methods both perform better than the others for Formosat-2 imagery. Our proposed ACCA method, with Otsu's method selected as the thresholding method, successfully extracted the cloudy pixels of Formosat-2 images for accurate cloud statistic estimation.
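    Otsu's method, which the study found to perform best, chooses the threshold that maximises the between-class variance of the gray-level histogram. A standard textbook implementation on a hypothetical bimodal image (not the authors' code) is:

```python
import numpy as np

def otsu_threshold(gray):
    """Otsu's method: choose the threshold maximising between-class variance."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    p = hist / hist.sum()
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = p[:t].sum(), p[t:].sum()      # class probabilities
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * p[:t]).sum() / w0        # class means
        mu1 = (np.arange(t, 256) * p[t:]).sum() / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_t, best_var = t, var_between
    return best_t

# Bimodal toy image: dark "clear" pixels near 30, bright "cloud" pixels near 220.
rng = np.random.default_rng(1)
img = np.concatenate([rng.normal(30, 5, 500), rng.normal(220, 5, 500)]).clip(0, 255)
t = otsu_threshold(img)
assert 35 < t < 215  # threshold falls between the two modes
```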

  12. GOSAT/TANSO-FTS Measurement of Volcanic and Geothermal CO2 Emissions

    NASA Astrophysics Data System (ADS)

    Schwandner, Florian M.; Carn, Simon A.; Newhall, Christopher G.

    2010-05-01

    Approximately one tenth of the Earth's human population lives within direct reach of volcanic hazards. Being able to provide sufficiently early and scientifically sound warning is key to volcanic hazard mitigation. Quantitative time-series monitoring of volcanic CO2 emissions will likely play a key role in such early warning activities in the future. An impending volcanic eruption, or any potentially disastrous activity involving movement of magma in the subsurface, is often preceded by an early increase in CO2 emissions. Conventionally, volcanic CO2 monitoring is done either in campaigns of soil emission measurements (grids of one-time measuring points) that are labor intensive and slow, or by ground-based remote FTIR measurements in emission plumes. These methods are not easily available at all sites of potential activity and are prohibitively costly to employ on a large number of volcanoes. In addition, both of these ground-based approaches pose a significant risk to the workers conducting the measurements. Some aircraft-based measurements have also been conducted in the past; however, these are limited by the usually meager funding of individual observatories, the hazard such flights pose to equipment and crew, and the inaccessibility of parts of the plume due to ash hazards. The core motivation for this study is therefore to develop a method for volcanic CO2 monitoring from space that provides sufficient coverage, resolution, and data quality for quantitative time-series monitoring and correlation with other available datasets, from a safe distance and with potentially global reach. In summary, the purpose of the proposed research is to quantify volcanic CO2 emissions using satellite-borne observations. Quantitative estimates will be useful for warning of impending volcanic eruptions and for assessing the contribution of volcanic CO2 to global greenhouse gas emissions.
    Our approach encompasses method development and testing for the detection of volcanic CO2 anomalies using GOSAT, correlation with Aura/OMI-, AIRS-, and ASTER-determined SO2 fluxes, and ground-based monitoring of CO2 and other geophysical and geochemical parameters. This will provide the groundwork for future higher-spatial-resolution satellite missions. This is a joint effort between two GOSAT-IBUKI data application projects: "Satellite-Borne Quantification of Carbon Dioxide Emissions from Volcanoes and Geothermal Areas" (PI Schwandner), and "Application of GOSAT/TANSO-FTS to the Measurement of Volcanic CO2 Emissions" (PI Carn).

  13. Climate reconstruction analysis using coexistence likelihood estimation (CRACLE): a method for the estimation of climate using vegetation.

    PubMed

    Harbert, Robert S; Nixon, Kevin C

    2015-08-01

    Plant distributions have long been understood to be correlated with the environmental conditions to which species are adapted, and climate is one of the major components driving species distributions. Therefore, it is expected that the plants coexisting in a community reflect the local environment, particularly climate. Presented here is a method for the estimation of climate from local plant species coexistence data. The method, Climate Reconstruction Analysis using Coexistence Likelihood Estimation (CRACLE), is a likelihood-based method that employs specimen collection data at a global scale for the inference of species climate tolerance. CRACLE calculates the maximum joint likelihood of coexistence, given individual species climate tolerance characterizations, to estimate the expected climate. Plant distribution data for more than 4000 species were used to show that this method accurately infers expected climate profiles for 165 sites with diverse climatic conditions. Estimates differ from the WorldClim global climate model by less than 1.5°C on average for mean annual temperature and less than ∼250 mm for mean annual precipitation. This is a significant improvement upon other plant-based climate-proxy methods. CRACLE validates long-hypothesized interactions between climate and local associations of plant species. Furthermore, CRACLE successfully estimates climate that is consistent with the widely used WorldClim model and therefore may be applied to the quantitative estimation of paleoclimate in future studies. © 2015 Botanical Society of America, Inc.

  14. A Novel Triplex Quantitative PCR Strategy for Quantification of Toxigenic and Nontoxigenic Vibrio cholerae in Aquatic Environments

    PubMed Central

    Bliem, Rupert; Schauer, Sonja; Plicka, Helga; Obwaller, Adelheid; Sommer, Regina; Steinrigl, Adolf; Alam, Munirul; Reischer, Georg H.; Farnleitner, Andreas H.

    2015-01-01

    Vibrio cholerae is a severe human pathogen and a frequent member of aquatic ecosystems. Quantification of V. cholerae in environmental water samples is therefore fundamental for ecological studies and health risk assessment. Besides time-consuming cultivation techniques, quantitative PCR (qPCR) has the potential to provide reliable quantitative data and offers the opportunity to quantify multiple targets simultaneously. A novel triplex qPCR strategy was developed in order to simultaneously quantify toxigenic and nontoxigenic V. cholerae in environmental water samples. To obtain quality-controlled PCR results, an internal amplification control was included. The qPCR assay was specific, highly sensitive, and quantitative across the tested 5-log dynamic range down to a method detection limit of 5 copies per reaction. Repeatability and reproducibility were high for all three tested target genes. For environmental application, global DNA recovery (GR) rates were assessed for drinking water, river water, and water from different lakes. GR rates ranged from 1.6% to 76.4% and were dependent on the environmental background. Uncorrected and GR-corrected V. cholerae abundances were determined in two lakes with extremely high turbidity. Uncorrected abundances ranged from 4.6 × 10² to 2.3 × 10⁴ cell equivalents liter⁻¹, whereas GR-corrected abundances ranged from 4.7 × 10³ to 1.6 × 10⁶ cell equivalents liter⁻¹. GR-corrected qPCR results were in good agreement with an independent cell-based direct detection method but were up to 1.6 log higher than cultivation-based abundances. We recommend the newly developed triplex qPCR strategy as a powerful tool to simultaneously quantify toxigenic and nontoxigenic V. cholerae in various aquatic environments for ecological studies as well as for risk assessment programs. PMID:25724966
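    The quantitation logic behind GR correction can be sketched in two steps: convert a quantification cycle (Cq) to copy number via a standard curve, then divide by the recovery rate. The slope and intercept below are hypothetical standard-curve values, not those of the published assay; note how a low recovery (e.g. 1.6%) raises the corrected estimate by orders of magnitude, as in the lake data above.

```python
def copies_from_cq(cq, slope=-3.32, intercept=37.0):
    """Convert a qPCR quantification cycle (Cq) to copy number using a
    standard curve Cq = slope * log10(copies) + intercept.
    Slope/intercept here are hypothetical illustrative values."""
    return 10 ** ((cq - intercept) / slope)

def recovery_corrected(copies_per_liter, gr):
    """Correct a raw abundance for the global DNA recovery (GR) rate,
    e.g. gr = 0.016 for a matrix recovering only 1.6% of DNA."""
    return copies_per_liter / gr

raw = copies_from_cq(27.04)          # Cq of 27.04 on this curve
print(round(raw))                    # 1000
print(round(recovery_corrected(1000, 0.016)))  # 62500
```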

  15. The NIST Quantitative Infrared Database

    PubMed Central

    Chu, P. M.; Guenther, F. R.; Rhoderick, G. C.; Lafferty, W. J.

    1999-01-01

    With the recent developments in Fourier transform infrared (FTIR) spectrometers it is becoming more feasible to place these instruments in field environments. As a result, there has been an enormous increase in the use of FTIR techniques for a variety of qualitative and quantitative chemical measurements. These methods offer the possibility of fully automated real-time quantitation of many analytes; therefore FTIR has great potential as an analytical tool. Recently, the U.S. Environmental Protection Agency (U.S. EPA) has developed protocol methods for emissions monitoring using both extractive and open-path FTIR measurements. Depending upon the analyte, the experimental conditions and the analyte matrix, approximately 100 of the hazardous air pollutants (HAPs) listed in the 1990 U.S. EPA Clean Air Act amendment (CAAA) can be measured. The National Institute of Standards and Technology (NIST) has initiated a program to provide quality-assured infrared absorption coefficient data based on NIST-prepared primary gas standards. Currently, absorption coefficient data have been acquired for approximately 20 of the HAPs. For each compound, the absorption coefficient spectrum was calculated using nine transmittance spectra at 0.12 cm⁻¹ resolution and the Beer's law relationship. The uncertainties in the absorption coefficient data were estimated from the linear regressions of the transmittance data and considerations of other error sources such as the nonlinear detector response. For absorption coefficient values greater than 1 × 10⁻⁴ (μmol/mol)⁻¹ m⁻¹ the average relative expanded uncertainty is 2.2%. This quantitative infrared database is currently an ongoing project at NIST. Additional spectra will be added to the database as they are acquired. Our current plans include continued data acquisition of the compounds listed in the CAAA, as well as the compounds that contribute to global warming and ozone depletion.
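    The Beer's-law regression used to derive an absorption coefficient from several transmittance standards can be sketched for a single wavenumber; the coefficient value, path length, and standard concentrations below are illustrative assumptions, not NIST data.

```python
import numpy as np

def absorption_coefficient(conc_umol_per_mol, pathlength_m, transmittance):
    """Estimate the absorption coefficient [(umol/mol)^-1 m^-1] at one
    wavenumber by regressing absorbance -ln(T) against concentration times
    path length (Beer's law) over several gas standards."""
    x = np.asarray(conc_umol_per_mol) * pathlength_m
    y = -np.log(np.asarray(transmittance))
    slope, intercept = np.polyfit(x, y, 1)
    return slope

# Synthetic standards: true coefficient 2e-4 (umol/mol)^-1 m^-1, 10 m path.
conc = np.array([10.0, 20.0, 40.0, 80.0])   # umol/mol
T = np.exp(-2e-4 * conc * 10.0)
alpha = absorption_coefficient(conc, 10.0, T)  # recovers ~2e-4
```

Regressing over multiple standards, rather than using a single spectrum, is what allows the linear-fit residuals to feed the uncertainty estimate described above.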

  16. An analysis of Science Olympiad participants' perceptions regarding their experience with the science and engineering academic competition

    NASA Astrophysics Data System (ADS)

    Wirt, Jennifer L.

    Science education and literacy, along with a focus on the other STEM fields, have been a center of attention on the global scale for decades; the 1950s race to space is often considered the starting point. Through the years, the attention has spread to highlight the United States' scientific literacy rankings on international testing. The ever-expanding global economy and global workplace make literacy in the STEM fields a necessity. Science and academic competitions are worthy of study to determine the overall and specific positive and negative aspects of their incorporation in students' educational experiences. Science Olympiad is a national science and engineering competition that engages thousands of students each year. The purpose of this study was to analyze the perceptions of Science Olympiad participants in terms of science learning and interest, 21st-century skills and abilities, perceived influence on careers, and the overall benefits of being involved in Science Olympiad. The study sought to determine whether there were any differences of perception when gender was viewed as a factor. Data were acquired through the Science Olympiad survey database and consisted of 635 usable surveys, split evenly between males and females. This study employed a mixed-methods analysis. The qualitative data allowed the individual perceptions of the respondents to be highlighted and acknowledged, while the quantitative data allowed generalizations to be identified. The qualitative and quantitative data clearly showed that Science Olympiad had an impact on the career choices of participants. The qualitative data showed that participants gained an increased level of learning and interest in science and STEM areas, 21st-century skills, and overall positive benefits as a result of being involved; the qualitative data were almost exclusively positive. The quantitative data, however, did not capture the significance of each researched category that the qualitative anecdotal evidence depicted. The data showed that females were engaged in STEM areas when involved in Science Olympiad. Recommendations were made for further study to help delineate the data using different research questions, and to further study the impact of Science Olympiad using the same research questions used in this study.

  17. Quantitative force measurements using frequency modulation atomic force microscopy—theoretical foundations

    NASA Astrophysics Data System (ADS)

    Sader, John E.; Uchihashi, Takayuki; Higgins, Michael J.; Farrell, Alan; Nakayama, Yoshikazu; Jarvis, Suzanne P.

    2005-03-01

    Use of the atomic force microscope (AFM) in quantitative force measurements inherently requires a theoretical framework enabling conversion of the observed deflection properties of the cantilever to an interaction force. In this paper, the theoretical foundations of using frequency modulation atomic force microscopy (FM-AFM) in quantitative force measurements are examined and rigorously elucidated, with consideration being given to both 'conservative' and 'dissipative' interactions. This includes a detailed discussion of the underlying assumptions involved in such quantitative force measurements, the presentation of globally valid explicit formulae for evaluation of so-called 'conservative' and 'dissipative' forces, discussion of the origin of these forces, and analysis of the applicability of FM-AFM to quantitative force measurements in liquid.

  18. Abnormal EEG Power Spectra in Acute Transient Global Amnesia: A Quantitative EEG Study.

    PubMed

    Imperatori, Claudio; Farina, Benedetto; Todini, Federico; Di Blasi, Chiara; Mazzucchi, Edoardo; Brunetti, Valerio; Della Marca, Giacomo

    2018-06-01

    Transient global amnesia (TGA) is a clinical syndrome characterized by retrograde and anterograde amnesia without other neurological deficits. Although electroencephalography (EEG) methods are commonly used in both clinical and research settings with TGA patients, few studies have investigated the neurophysiological patterns in TGA using quantitative EEG (qEEG). The main aim of the present study was to extend previous findings by exploring EEG power spectra differences between patients with acute TGA and healthy controls using the exact low-resolution brain electromagnetic tomography software (eLORETA). EEG was recorded during 5 minutes of resting state. Sixteen patients (mean age: 66.81 ± 7.94 years) during acute TGA and 16 healthy subjects were enrolled. All patients showed hippocampal or parahippocampal signal abnormalities in diffusion-weighted magnetic resonance imaging performed 2 to 5 days after the onset of TGA. Compared with healthy controls, TGA patients showed a decrease of theta power localized in the temporal lobe (Brodmann areas, BAs, 21-22-38) and frontal lobe (BAs 8-9-44-45). A decrease of EEG beta power in the bilateral precuneus (BA 7) and in the bilateral postcentral gyrus (BAs 3-4-5) was also observed in TGA individuals. Taken together, our results could reflect the neurophysiological substrate of the severe impairment of both episodic and autobiographical memory that affects TGA patients during the acute phase.

  19. Qualitative and quantitative analysis of solar hydrogen generation literature from 2001 to 2014.

    PubMed

    Maghami, Mohammad Reza; Asl, Shahin Navabi; Rezadad, Mohammad Esmaeil; Ale Ebrahim, Nader; Gomes, Chandima

    Solar hydrogen generation is one of the newer topics in the field of renewable energy, and research on hydrogen generation has recently grown dramatically in many countries. Many studies have examined hydrogen generation from natural resources such as wind, solar and coal. In this work we evaluated the global scientific production of solar hydrogen generation papers from 2001 to 2014 in all journals of all subject categories of the Science Citation Index compiled by the Institute for Scientific Information (ISI), Philadelphia, USA. "Solar hydrogen generation" was used as the keyword to search titles, abstracts and keywords. The published output analysis showed that research on hydrogen generation from the sun increased steadily over the past 14 years, and the annual paper production in 2013 was about three times that of 2010. A total of 141 papers published from 2001 onwards were considered in this research. There are clear distinctions among author keywords used in publications from the five most prolific countries, the USA, China, Australia, Germany and India, in solar hydrogen studies. Quantitative and qualitative analysis methods were used to evaluate the development of global scientific production in this specific research field. The analytical results provide several key findings and an overview of hydrogen production based on solar generation.

  20. Can Global Weed Assemblages Be Used to Predict Future Weeds?

    PubMed Central

    Morin, Louise; Paini, Dean R.; Randall, Roderick P.

    2013-01-01

    Predicting which plant taxa are more likely to become weeds in a region presents significant challenges to both researchers and government agencies. Often it is done in a qualitative or semi-quantitative way. In this study, we explored the potential of using the quantitative self-organising map (SOM) approach to analyse global weed assemblages and estimate likelihoods of plant taxa becoming weeds before and after they have been moved to a new region. The SOM approach examines plant taxa associations by analysing where a taxon is recorded as a weed and what other taxa are recorded as weeds in those regions. The dataset analysed was extracted from a pre-existing, extensive worldwide database of plant taxa recorded as weeds or other related status and, following reformatting, included 187 regions and 6690 plant taxa. To assess the value of the SOM approach we selected Australia as a case study. We found that the key limitation in using such an analytical approach lies with the dataset used. The classification of a taxon as a weed in the literature is often based not on actual data documenting the economic, environmental and/or social impact of the taxon, but mostly on human perceptions that the taxon is troublesome or simply not wanted in a particular situation. The adoption of consistent and objective criteria that incorporate a standardized approach for impact assessment of plant taxa will be necessary to develop a new global database suitable for making predictions regarding weediness using methods like SOM. It may, however, be more realistic to opt for a classification system that focuses on the invasive characteristics of plant taxa without any inference to impacts; defining these characteristics would still require some level of research to avoid bias from human perceptions and value systems. PMID:23393591
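    The SOM at the heart of this analysis can be sketched in a few dozen lines. The toy implementation below is illustrative only (the grid size, learning-rate schedule, and binary presence/absence encoding are our assumptions, not details from the study): each region is encoded as a binary weed-presence vector, and regions with similar weed assemblages end up mapping to the same or nearby map nodes.

```python
import math
import random

def train_som(data, rows=3, cols=3, epochs=200, lr0=0.5, seed=0):
    """Train a tiny self-organising map on binary presence/absence vectors."""
    rng = random.Random(seed)
    dim = len(data[0])
    # initialise each node's weight vector randomly in [0, 1]
    nodes = [[rng.random() for _ in range(dim)] for _ in range(rows * cols)]
    sigma0 = max(rows, cols) / 2.0
    for t in range(epochs):
        frac = t / epochs
        lr = lr0 * (1.0 - frac)                  # decaying learning rate
        sigma = max(sigma0 * (1.0 - frac), 0.5)  # shrinking neighbourhood radius
        x = rng.choice(data)
        # best-matching unit: node closest to the sample in input space
        bmu = min(range(len(nodes)),
                  key=lambda i: sum((nodes[i][d] - x[d]) ** 2 for d in range(dim)))
        br, bc = divmod(bmu, cols)
        for i, w in enumerate(nodes):
            r, c = divmod(i, cols)
            h = math.exp(-((r - br) ** 2 + (c - bc) ** 2) / (2.0 * sigma ** 2))
            for d in range(dim):
                w[d] += lr * h * (x[d] - w[d])   # pull node toward the sample
    return nodes

def bmu_index(nodes, x, cols):
    """Grid coordinates of the best-matching unit for vector x."""
    i = min(range(len(nodes)),
            key=lambda j: sum((nodes[j][d] - x[d]) ** 2 for d in range(len(x))))
    return divmod(i, cols)
```

    After training, taxa or regions that co-occur end up near one another on the map, which is the property the study exploits to estimate weediness likelihoods.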

  1. Cenozoic planktonic marine diatom diversity and correlation to climate change

    USGS Publications Warehouse

    Lazarus, David; Barron, John; Renaudie, Johan; Diver, Patrick; Türke, Andreas

    2014-01-01

    Marine planktonic diatoms export carbon to the deep ocean, playing a key role in the global carbon cycle. Although commonly thought to have diversified over the Cenozoic as global oceans cooled, only two conflicting quantitative reconstructions exist, both from the Neptune deep-sea microfossil occurrences database. Total diversity shows Cenozoic increase but is sample size biased; conventional subsampling shows little net change. We calculate diversity from a separately compiled new diatom species range catalog, and recalculate Neptune subsampled-in-bin diversity using new methods to correct for increasing Cenozoic geographic endemism and decreasing Cenozoic evenness. We find coherent, substantial Cenozoic diversification in both datasets. Many living cold water species, including species important for export productivity, originate only in the latest Miocene or younger. We make a first quantitative comparison of diatom diversity to the global Cenozoic benthic δ18O (climate) and carbon cycle records (δ13C, and 20-0 Ma pCO2). Warmer climates are strongly correlated with lower diatom diversity (raw: rho = .92, p < .001), even when pCO2 levels were only moderately higher than today. Diversity is strongly correlated to both δ13C and pCO2 over the last 15 my (for both: r>.9, detrended r>.6, all p<.001), but only weakly over the earlier Cenozoic, suggesting increasingly strong linkage of diatom and climate evolution in the Neogene. Our results suggest that many living marine planktonic diatom species may be at risk of extinction in future warm oceans, with an unknown but potentially substantial negative impact on the ocean biologic pump and oceanic carbon sequestration. We cannot however extrapolate our my-scale correlations with generic climate proxies to anthropogenic time-scales of warming without additional species-specific information on proximate ecologic controls.
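    The rank correlations and detrending this record relies on can be illustrated from first principles. This is a generic sketch, not the authors' code (the helper names are ours, and ties in the ranking step are ignored for brevity): Spearman's rho is Pearson's r computed on ranks, and first-differencing is one simple way to detrend two series before correlating them.

```python
def rank(xs):
    """1-based ranks of a sequence (no tie handling, for brevity)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    for rk, i in enumerate(order):
        r[i] = rk + 1.0
    return r

def pearson(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def spearman(x, y):
    """Spearman's rho = Pearson's r on the ranks."""
    return pearson(rank(x), rank(y))

def detrend(xs):
    # first differences remove a shared long-term trend before correlating
    return [b - a for a, b in zip(xs, xs[1:])]
```

    Correlating `detrend(diversity)` against `detrend(proxy)` is what distinguishes the "raw" from the "detrended" coefficients quoted in the abstract.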

  2. Global morphological analysis of marine viruses shows minimal regional variation and dominance of non-tailed viruses

    PubMed Central

    Brum, Jennifer R; Schenck, Ryan O; Sullivan, Matthew B

    2013-01-01

    Viruses influence oceanic ecosystems by causing mortality of microorganisms, altering nutrient and organic matter flux via lysis and auxiliary metabolic gene expression and changing the trajectory of microbial evolution through horizontal gene transfer. Limited host range and differing genetic potential of individual virus types mean that investigations into the types of viruses that exist in the ocean and their spatial distribution throughout the world's oceans are critical to understanding the global impacts of marine viruses. Here we evaluate viral morphological characteristics (morphotype, capsid diameter and tail length) using a quantitative transmission electron microscopy (qTEM) method across six of the world's oceans and seas sampled through the Tara Oceans Expedition. Extensive experimental validation of the qTEM method shows that neither sample preservation nor preparation significantly alters natural viral morphological characteristics. The global sampling analysis demonstrated that morphological characteristics did not vary consistently with depth (surface versus deep chlorophyll maximum waters) or oceanic region. Instead, temperature, salinity and oxygen concentration, but not chlorophyll a concentration, were more explanatory in evaluating differences in viral assemblage morphological characteristics. Surprisingly, given that the majority of cultivated bacterial viruses are tailed, non-tailed viruses appear to numerically dominate the upper oceans as they comprised 51–92% of the viral particles observed. Together, these results document global marine viral morphological characteristics, show that their minimal variability is more explained by environmental conditions than geography and suggest that non-tailed viruses might represent the most ecologically important targets for future research. PMID:23635867

  3. Mixed methods systematic review exploring mentorship outcomes in nursing academia.

    PubMed

    Nowell, Lorelli; Norris, Jill M; Mrklas, Kelly; White, Deborah E

    2017-03-01

    The aim of this study was to report on a mixed methods systematic review that critically examines the evidence for mentorship in nursing academia. Nursing education institutions globally have issued calls for mentorship. There is emerging evidence to support the value of mentorship in other disciplines, but the extant state of the evidence in nursing academia is not known. A comprehensive review of the evidence is required. A mixed methods systematic review. Five databases (MEDLINE, CINAHL, EMBASE, ERIC, PsycINFO) were searched using an a priori search strategy from inception to 2 November 2015 to identify quantitative, qualitative and mixed methods studies. Grey literature searches were also conducted in electronic databases (ProQuest Dissertations and Theses, Index to Theses) and mentorship conference proceedings and by hand searching the reference lists of eligible studies. Study quality was assessed prior to inclusion using standardized critical appraisal instruments from the Joanna Briggs Institute. A convergent qualitative synthesis design was used where results from qualitative, quantitative and mixed methods studies were transformed into qualitative findings. Mentorship outcomes were mapped to a theory-informed framework. Thirty-four studies were included in this review, from the 3001 records initially retrieved. In general, mentorship had a positive impact on behavioural, career, attitudinal, relational and motivational outcomes; however, the methodological quality of studies was weak. This review can inform the objectives of mentorship interventions and contribute to a more rigorous approach to studies that assess mentorship outcomes. © 2016 John Wiley & Sons Ltd.

  4. Global Warming - Are We on Thin Ice?

    NASA Technical Reports Server (NTRS)

    Tucker, Compton J.

    2007-01-01

    The evidence for global warming is very conclusive for the past 400-500 years. Prior to the 16th century, proxy surface temperature data are regionally good but lack a global distribution. The speaker will review surface temperature reconstruction based upon ice cores, coral cores, tree rings, deep sea sediments, and boreholes, and discuss the controversy surrounding global warming. This will be contrasted with the excellent data we have from the satellite era of Earth observations over the past 30+ years, which enables the quantitative study of climate across earth science disciplines.

  5. Routine Clinical Quantitative Rest Stress Myocardial Perfusion for Managing Coronary Artery Disease: Clinical Relevance of Test-Retest Variability.

    PubMed

    Kitkungvan, Danai; Johnson, Nils P; Roby, Amanda E; Patel, Monika B; Kirkeeide, Richard; Gould, K Lance

    2017-05-01

    Positron emission tomography (PET) quantifies stress myocardial perfusion (in cc/min/g) and coronary flow reserve to noninvasively guide the management of coronary artery disease. This study determined their test-retest precision within minutes and daily biological variability, essential for bounding clinical decision-making or risk stratification based on low-flow ischemic thresholds or follow-up changes. Randomized trials of fractional flow reserve-guided percutaneous coronary interventions established an objective, quantitative, outcomes-driven standard of physiological stenosis severity. However, pressure-derived fractional flow reserve requires an invasive coronary angiogram and was originally validated by comparison to noninvasive PET. The time course and test-retest precision of serial quantitative rest-rest and stress-stress global myocardial perfusion by PET minutes and days apart in the same patient were compared in 120 volunteers undergoing 708 serial quantitative PET perfusion scans using rubidium 82 (Rb-82) and dipyridamole stress with a 2-dimensional PET-computed tomography scanner (GE DST 16) and University of Texas HeartSee software with our validated perfusion model. Test-retest methodological precision (coefficient of variation) for serial quantitative global myocardial perfusion minutes apart is ±10% (mean ΔSD at rest ±0.09, at stress ±0.23 cc/min/g) and for days apart is ±21% (mean ΔSD at rest ±0.2, at stress ±0.46 cc/min/g), reflecting added biological variability. Global myocardial perfusion at 8 min after 4-min dipyridamole infusion is 10% higher than at the standard 4 min after dipyridamole. Test-retest methodological precision of global PET myocardial perfusion by serial rest or stress PET minutes apart is ±10%. Day-to-different-day biological plus methodological variability is ±21%, thereby establishing boundaries of variability on physiological severity to guide or follow coronary artery disease management.
Maximum stress increases perfusion and coronary flow reserve, thereby reducing potentially falsely low values mimicking ischemia. Copyright © 2017 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.

  6. Use of quantitative molecular diagnostic methods to identify causes of diarrhoea in children: a reanalysis of the GEMS case-control study.

    PubMed

    Liu, Jie; Platts-Mills, James A; Juma, Jane; Kabir, Furqan; Nkeze, Joseph; Okoi, Catherine; Operario, Darwin J; Uddin, Jashim; Ahmed, Shahnawaz; Alonso, Pedro L; Antonio, Martin; Becker, Stephen M; Blackwelder, William C; Breiman, Robert F; Faruque, Abu S G; Fields, Barry; Gratz, Jean; Haque, Rashidul; Hossain, Anowar; Hossain, M Jahangir; Jarju, Sheikh; Qamar, Farah; Iqbal, Najeeha Talat; Kwambana, Brenda; Mandomando, Inacio; McMurry, Timothy L; Ochieng, Caroline; Ochieng, John B; Ochieng, Melvin; Onyango, Clayton; Panchalingam, Sandra; Kalam, Adil; Aziz, Fatima; Qureshi, Shahida; Ramamurthy, Thandavarayan; Roberts, James H; Saha, Debasish; Sow, Samba O; Stroup, Suzanne E; Sur, Dipika; Tamboura, Boubou; Taniuchi, Mami; Tennant, Sharon M; Toema, Deanna; Wu, Yukun; Zaidi, Anita; Nataro, James P; Kotloff, Karen L; Levine, Myron M; Houpt, Eric R

    2016-09-24

    Diarrhoea is the second leading cause of mortality in children worldwide, but establishing the cause can be complicated by diverse diagnostic approaches and varying test characteristics. We used quantitative molecular diagnostic methods to reassess causes of diarrhoea in the Global Enteric Multicenter Study (GEMS). GEMS was a study of moderate to severe diarrhoea in children younger than 5 years in Africa and Asia. We used quantitative real-time PCR (qPCR) to test for 32 enteropathogens in stool samples from cases and matched asymptomatic controls from GEMS, and compared pathogen-specific attributable incidences with those found with the original GEMS microbiological methods, including culture, EIA, and reverse-transcriptase PCR. We calculated revised pathogen-specific burdens of disease and assessed causes in individual children. We analysed 5304 sample pairs. For most pathogens, incidence was greater with qPCR than with the original methods, particularly for adenovirus 40/41 (around five times), Shigella spp or enteroinvasive Escherichia coli (EIEC) and Campylobacter jejuni or C coli (around two times), and heat-stable enterotoxin-producing E coli ([ST-ETEC] around 1·5 times). The six most attributable pathogens became, in descending order, Shigella spp, rotavirus, adenovirus 40/41, ST-ETEC, Cryptosporidium spp, and Campylobacter spp. Pathogen-attributable diarrhoeal burden was 89·3% (95% CI 83·2-96·0) at the population level, compared with 51·5% (48·0-55·0) in the original GEMS analysis. The top six pathogens accounted for 77·8% (74·6-80·9) of all attributable diarrhoea. With use of model-derived quantitative cutoffs to assess individual diarrhoeal cases, 2254 (42·5%) of 5304 cases had one diarrhoea-associated pathogen detected and 2063 (38·9%) had two or more, with Shigella spp and rotavirus being the pathogens most strongly associated with diarrhoea in children with mixed infections.
A quantitative molecular diagnostic approach improved population-level and case-level characterisation of the causes of diarrhoea and indicated a high burden of disease associated with six pathogens, for which targeted treatment should be prioritised. Bill & Melinda Gates Foundation. Copyright © 2016 Elsevier Ltd. All rights reserved.
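    The notion of a pathogen-attributable fraction can be illustrated with the textbook case-control approximation. This is a deliberately simplified sketch using a crude unmatched odds ratio; GEMS itself used matched controls and conditional logistic regression with quantitative qPCR cutoffs, which this toy function does not reproduce.

```python
def attributable_fraction(cases_pos, cases_total, controls_pos, controls_total):
    """Crude attributable fraction of disease for one pathogen.

    AF = (proportion of cases positive) * (1 - 1/OR), where OR is the
    unmatched cross-product odds ratio from a 2x2 case-control table.
    """
    a = cases_pos                     # exposed cases
    b = cases_total - cases_pos       # unexposed cases
    c = controls_pos                  # exposed controls
    d = controls_total - controls_pos # unexposed controls
    odds_ratio = (a * d) / (b * c)
    exposed_case_frac = cases_pos / cases_total
    return exposed_case_frac * (1.0 - 1.0 / odds_ratio)
```

    For example, a pathogen detected in 60 of 100 cases but only 20 of 100 controls has OR = 6 and a crude attributable fraction of 0.5.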

  7. Quantitative research on microscopic deformation behavior of Ti-6Al-4V two-phase titanium alloy based on finite element method

    NASA Astrophysics Data System (ADS)

    Peng, Yan; Chen, Guoxing; Sun, Jianliang; Shi, Baodong

    2018-04-01

    The microscopic deformation of Ti-6Al-4V titanium alloy shows great inhomogeneity due to its duplex microstructure consisting of two phases. In order to study the deformation behaviors of the constituent phases, a 2D FE model based on the realistic microstructure is established with the MSC.Marc nonlinear FE software, and a tensile simulation is carried out. The simulated global stress-strain response agrees with the tensile test result. Then the strain and stress distributions in the constituent phases and their evolution with increasing global strain are analyzed. The results show that the strain and stress partitioning between the two phases is considerable: most of the strain is concentrated in the soft primary α phase, while the hard transformed β matrix carries most of the stress. At a global strain of 0.05, deformation bands oriented at 45° to the stretch direction and local stress in the primary α phase near the interface between the two phases are observed, and they become more significant when the global strain increases to 0.1. The strain and stress concentration factors of the two phases differ markedly at different macroscopic deformation stages, but eventually tend to stabilize.
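    The concentration factors discussed above are simply phase averages normalised by the macroscopic (global) value, with the global value recoverable from a rule-of-mixtures volume average. A minimal sketch (the function names and the two-phase example values are illustrative, not taken from the paper):

```python
def volume_average(fractions, values):
    """Rule-of-mixtures estimate of the macroscopic (global) value."""
    assert abs(sum(fractions) - 1.0) < 1e-9, "volume fractions must sum to 1"
    return sum(f * v for f, v in zip(fractions, values))

def concentration_factor(phase_value, global_value):
    # > 1: the phase carries more than its share of the global field
    return phase_value / global_value
```

    With, say, 60% soft α at 400 MPa and 40% hard β at 700 MPa, the global stress is 520 MPa; the β factor exceeds 1 (stress concentrates there) while the α factor falls below 1.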

  8. Quantitative estimation of global patterns of surface ocean biological productivity and its seasonal variation on timescales from centuries to millennia

    NASA Astrophysics Data System (ADS)

    Loubere, Paul; Fariduddin, Mohammad

    1999-03-01

    We present a quantitative method, based on the relative abundances of benthic foraminifera in deep-sea sediments, for estimating surface ocean biological productivity over the timescale of centuries to millennia. We calibrate the method using a global data set composed of 207 samples from the Atlantic, Pacific, and Indian Oceans from a water depth range between 2300 and 3600 m. The sample set was developed so that other, potentially significant, environmental variables would be uncorrelated to overlying surface ocean productivity. A regression of assemblages against productivity yielded an r2 = 0.89, demonstrating a strong productivity signal in the faunal data. In addition, we examined assemblage response to annual variability in biological productivity (seasonality). Our data set included a range of seasonalities, which we quantified into a seasonality index using the pigment color bands from the coastal zone color scanner (CZCS). The response of benthic foraminiferal assemblage composition to our seasonality index was tested with regression analysis. We obtained a statistically highly significant r2 = 0.75. Further, discriminant function analysis revealed a clear separation among sample groups based on surface ocean productivity and our seasonality index. Finally, we tested the response of benthic foraminiferal assemblages to three different modes of seasonality. We observed a distinct separation of our samples into groups representing low seasonal variability, strong seasonality with a single main productivity event in the year, and strong seasonality with multiple productivity events in the year. Reconstructing surface ocean biological productivity with benthic foraminifera will aid in modeling marine biogeochemical cycles. Also, estimating the mode and range of annual seasonality will provide insight into changing oceanic processes, allowing the examination of the mechanisms causing changes in the marine biotic system over time.
This article contains supplementary material.
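    The regression-based calibration step can be illustrated with a minimal ordinary least-squares fit and its coefficient of determination. The paper regresses full assemblage compositions against productivity, whereas this sketch shows only the single-predictor case; the function name and data are illustrative.

```python
def linreg_r2(x, y):
    """OLS fit y = a + b*x; returns (a, b, r-squared)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    # r2 = 1 - residual sum of squares / total sum of squares
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return a, b, 1.0 - ss_res / ss_tot
```

    An r2 of 0.89 (as reported) means 89% of the variance in productivity is explained by the assemblage predictors.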

  9. 77 FR 42052 - Self-Regulatory Organizations; The NASDAQ Stock Market LLC; NYSE Arca, Inc.; Order Instituting...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-17

    ... proposed MQP would be a program designed to promote market quality in certain securities listed on NASDAQ... contain the specific quantitative listing requirements for listing on the Global Select, Global Market... pilot (for comparative purposes), volume metrics, NBBO bid/ask spread differentials, LMM participation...

  10. Cortical hypometabolism and hypoperfusion in Parkinson's disease is extensive: probably even at early disease stages.

    PubMed

    Borghammer, Per; Chakravarty, Mallar; Jonsdottir, Kristjana Yr; Sato, Noriko; Matsuda, Hiroshi; Ito, Kengo; Arahata, Yutaka; Kato, Takashi; Gjedde, Albert

    2010-05-01

    Recent cerebral blood flow (CBF) and glucose consumption (CMRglc) studies of Parkinson's disease (PD) revealed conflicting results. Using simulated data, we previously demonstrated that the often-reported subcortical hypermetabolism in PD could be explained as an artifact of biased global mean (GM) normalization, and that low-magnitude, extensive cortical hypometabolism is best detected by alternative data-driven normalization methods. Thus, we hypothesized that PD is characterized by extensive cortical hypometabolism but no concurrent widespread subcortical hypermetabolism, and tested this hypothesis on three independent samples of PD patients. We compared SPECT CBF images of 32 early-stage and 33 late-stage PD patients with those of 60 matched controls. We also compared PET FDG images from 23 late-stage PD patients with those of 13 controls. Three different normalization methods were compared: (1) GM normalization, (2) cerebellum normalization, (3) reference cluster normalization (Yakushev et al.). We employed standard voxel-based statistics (fMRIstat) and principal component analysis (SSM). Additionally, we performed a meta-analysis of all quantitative CBF and CMRglc studies in the literature to investigate whether GM values in PD are decreased. Voxel-based analysis with GM normalization and the SSM method performed similarly, i.e., both detected decreases in small cortical clusters and concomitant increases in extensive subcortical regions. Cerebellum normalization revealed more widespread cortical decreases but no subcortical increase. In all comparisons, the Yakushev method detected nearly identical patterns of very extensive cortical hypometabolism. Lastly, the meta-analyses demonstrated that global CBF and CMRglc values are decreased in PD. Based on the results, we conclude that PD most likely has widespread cortical hypometabolism, even at early disease stages. In contrast, extensive subcortical hypermetabolism is probably not a feature of PD.

  11. Identifying sources of fugitive emissions in industrial facilities using trajectory statistical methods

    NASA Astrophysics Data System (ADS)

    Brereton, Carol A.; Johnson, Matthew R.

    2012-05-01

    Fugitive pollutant sources from the oil and gas industry are typically quite difficult to find within industrial plants and refineries, yet they are a significant contributor of global greenhouse gas emissions. A novel approach for locating fugitive emission sources using computationally efficient trajectory statistical methods (TSM) has been investigated in detailed proof-of-concept simulations. Four TSMs were examined in a variety of source emissions scenarios developed using transient CFD simulations on the simplified geometry of an actual gas plant: potential source contribution function (PSCF), concentration weighted trajectory (CWT), residence time weighted concentration (RTWC), and quantitative transport bias analysis (QTBA). Quantitative comparisons were made using a correlation measure based on search area from the source(s). PSCF, CWT and RTWC could all distinguish areas near major sources from the surroundings. QTBA successfully located sources in only some cases, even when provided with a large data set. RTWC, given sufficient domain trajectory coverage, distinguished source areas best, but otherwise could produce false source predictions. Using RTWC in conjunction with CWT could overcome this issue as well as reduce sensitivity to noise in the data. The results demonstrate that TSMs are a promising approach for identifying fugitive emissions sources within complex facility geometries.
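    Of the four TSMs, the concentration weighted trajectory (CWT) method is the simplest to sketch: each grid cell is assigned the residence-time-weighted mean of the receptor concentrations of all trajectories that passed through it, so cells repeatedly crossed by high-concentration trajectories stand out as likely source areas. A toy implementation (the 2D grid layout and data format are our assumptions, not the study's):

```python
from collections import defaultdict

def concentration_weighted_trajectory(trajectories, concentrations, cell):
    """CWT field: per grid cell, the residence-weighted mean concentration.

    trajectories: list of point lists [(x, y), ...], one per trajectory
    concentrations: receptor concentration measured for each trajectory
    cell: grid cell size; residence time is approximated by point counts
    """
    num = defaultdict(float)  # sum of concentration x residence count
    den = defaultdict(float)  # total residence count
    for points, c in zip(trajectories, concentrations):
        for x, y in points:
            key = (int(x // cell), int(y // cell))
            num[key] += c
            den[key] += 1.0
    return {k: num[k] / den[k] for k in num}
```

    Pairing this averaging with RTWC's redistribution step, as the authors suggest, helps suppress the false source predictions that either method alone can produce.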

  12. Physics-Based Image Segmentation Using First Order Statistical Properties and Genetic Algorithm for Inductive Thermography Imaging.

    PubMed

    Gao, Bin; Li, Xiaoqing; Woo, Wai Lok; Tian, Gui Yun

    2018-05-01

    Thermographic inspection has been widely applied to non-destructive testing and evaluation with the capabilities of rapid, contactless, and large surface area detection. Image segmentation is considered essential for identifying and sizing defects. To attain a high-level performance, specific physics-based models that describe defects generation and enable the precise extraction of target region are of crucial importance. In this paper, an effective genetic first-order statistical image segmentation algorithm is proposed for quantitative crack detection. The proposed method automatically extracts valuable spatial-temporal patterns from unsupervised feature extraction algorithm and avoids a range of issues associated with human intervention in laborious manual selection of specific thermal video frames for processing. An internal genetic functionality is built into the proposed algorithm to automatically control the segmentation threshold to render enhanced accuracy in sizing the cracks. Eddy current pulsed thermography will be implemented as a platform to demonstrate surface crack detection. Experimental tests and comparisons have been conducted to verify the efficacy of the proposed method. In addition, a global quantitative assessment index F-score has been adopted to objectively evaluate the performance of different segmentation algorithms.
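    The F-score adopted for the quantitative assessment combines pixel-wise precision and recall of a predicted segmentation mask against ground truth. A generic sketch (encoding the masks as flat 0/1 lists is our simplification):

```python
def f_score(pred, truth, beta=1.0):
    """Pixel-wise F-score of a predicted binary mask against ground truth."""
    tp = sum(1 for p, t in zip(pred, truth) if p and t)
    fp = sum(1 for p, t in zip(pred, truth) if p and not t)
    fn = sum(1 for p, t in zip(pred, truth) if t and not p)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0.0:
        return 0.0
    b2 = beta * beta
    return (1 + b2) * precision * recall / (b2 * precision + recall)
```

    With beta = 1 this is the familiar F1, the harmonic mean of precision and recall, which penalises both over- and under-segmentation of the crack region.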

  13. [Progress in stable isotope labeled quantitative proteomics methods].

    PubMed

    Zhou, Yuan; Shan, Yichu; Zhang, Lihua; Zhang, Yukui

    2013-06-01

    Quantitative proteomics is an important research field in the post-genomics era. There are two strategies for proteome quantification: label-free methods and stable isotope labeling methods, the latter of which have become the most important strategy for quantitative proteomics at present. In the past few years, a number of quantitative methods have been developed, supporting rapid advances in biological research. In this work, we discuss progress in stable isotope labeling methods for quantitative proteomics, including relative and absolute quantitation, and then give our opinions on the outlook for proteome quantification methods.

  14. Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast

    PubMed Central

    Pang, Wei; Coghill, George M.

    2015-01-01

    In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First the Morven framework itself is briefly introduced in terms of the model formalism employed and output format. We then built a qualitative model for the biophysical process of osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on an existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively, and the simulation results are presented with an analysis. Finally the future development of the Morven framework for modelling dynamic biological systems is discussed. PMID:25864377

  15. High-fidelity target sequencing of individual molecules identified using barcode sequences: de novo detection and absolute quantitation of mutations in plasma cell-free DNA from cancer patients.

    PubMed

    Kukita, Yoji; Matoba, Ryo; Uchida, Junji; Hamakawa, Takuya; Doki, Yuichiro; Imamura, Fumio; Kato, Kikuya

    2015-08-01

    Circulating tumour DNA (ctDNA) is an emerging field of cancer research. However, current ctDNA analysis is usually restricted to one or a few mutation sites due to technical limitations. In the case of massively parallel DNA sequencers, the number of false positives caused by a high read error rate is a major problem. In addition, the final sequence reads do not represent the original DNA population due to the global amplification step during template preparation. We established a high-fidelity target sequencing system of individual molecules identified in plasma cell-free DNA using barcode sequences; this system consists of the following two steps. (i) A novel target sequencing method that adds barcode sequences by adaptor ligation. This method uses linear amplification to eliminate the errors introduced during the early cycles of polymerase chain reaction. (ii) The monitoring and removal of erroneous barcode tags. This process identifies the individual molecules that have been sequenced, enabling absolute quantitation of the number of mutations. Using plasma cell-free DNA from patients with gastric or lung cancer, we demonstrated that the system achieved near complete elimination of false positives and enabled de novo detection and absolute quantitation of mutations in plasma cell-free DNA. © The Author 2015. Published by Oxford University Press on behalf of Kazusa DNA Research Institute.
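    The error-suppression idea behind barcode tagging can be sketched generically: reads sharing a barcode derive from one original molecule, so a per-position majority vote removes errors present in only a minority of that molecule's reads, and the number of distinct barcodes gives an absolute molecule count. This toy sketch omits the erroneous-barcode filtering step the authors describe, and the data layout is our assumption.

```python
from collections import Counter, defaultdict

def consensus_by_barcode(reads):
    """Collapse (barcode, sequence) reads to one consensus per molecule.

    Reads are grouped by barcode; each consensus base is the majority
    vote across that barcode's reads, suppressing minority errors.
    """
    groups = defaultdict(list)
    for barcode, seq in reads:
        groups[barcode].append(seq)
    consensus = {}
    for barcode, seqs in groups.items():
        consensus[barcode] = "".join(
            Counter(bases).most_common(1)[0][0] for bases in zip(*seqs))
    return consensus
```

    A true mutation appears in every read of a barcode group and survives the vote, whereas a sequencing error in a single read of the group is discarded.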

  16. Multiplex real-time PCR monitoring of intestinal helminths in humans reveals widespread polyparasitism in Northern Samar, the Philippines.

    PubMed

    Gordon, Catherine A; McManus, Donald P; Acosta, Luz P; Olveda, Remigio M; Williams, Gail M; Ross, Allen G; Gray, Darren J; Gobert, Geoffrey N

    2015-06-01

    The global socioeconomic importance of helminth parasitic disease is underpinned by the considerable clinical impact on millions of people. While helminth polyparasitism is considered common in the Philippines, little has been done to survey its extent in endemic communities. High morphological similarity of eggs between related species complicates conventional microscopic diagnostic methods which are known to lack sensitivity, particularly in low intensity infections. Multiplex quantitative PCR diagnostic methods can provide rapid, simultaneous identification of multiple helminth species from a single stool sample. We describe a multiplex assay for the differentiation of Ascaris lumbricoides, Necator americanus, Ancylostoma, Taenia saginata and Taenia solium, building on our previously published findings for Schistosoma japonicum. Of 545 human faecal samples examined, 46.6% were positive for at least three different parasite species. High prevalences of S. japonicum (90.64%), A. lumbricoides (58.17%), T. saginata (42.57%) and A. duodenale (48.07%) were recorded. Neither T. solium nor N. americanus were found to be present. The utility of molecular diagnostic methods for monitoring helminth parasite prevalence provides new information on the extent of polyparasitism in the Philippines municipality of Palapag. These methods and findings have potential global implications for the monitoring of neglected tropical diseases and control measures. Copyright © 2015 Australian Society for Parasitology Inc. Published by Elsevier Ltd. All rights reserved.

  17. Shuttle Entry Imaging Using Infrared Thermography

    NASA Technical Reports Server (NTRS)

    Horvath, Thomas; Berry, Scott; Alter, Stephen; Blanchard, Robert; Schwartz, Richard; Ross, Martin; Tack, Steve

    2007-01-01

    During the Columbia Accident Investigation, imaging teams supporting debris shedding analysis were hampered by poor entry image quality and the general lack of information on optical signatures associated with a nominal Shuttle entry. After the accident, recommendations were made to NASA management to develop and maintain a state-of-the-art imagery database for Shuttle engineering performance assessments and to improve entry imaging capability to support anomaly and contingency analysis during a mission. As a result, the Space Shuttle Program sponsored an observation campaign to qualitatively characterize a nominal Shuttle entry over the widest possible Mach number range. The initial objectives focused on an assessment of the capability to identify/resolve debris liberated from the Shuttle during entry, characterization of potential anomalous events associated with RCS jet firings, and unusual phenomena associated with the plasma trail. The aeroheating technical community viewed the Space Shuttle Program sponsored activity as an opportunity to influence the observation objectives and incrementally demonstrate key elements of a quantitative spatially resolved temperature measurement capability over a series of flights. One long-term desire of the Shuttle engineering community is to calibrate boundary layer transition prediction methodologies that are presently part of the Shuttle damage assessment process using flight data provided by a controlled Shuttle flight experiment. Quantitative global imaging may offer a complementary method of data collection to more traditional methods such as surface thermocouples. This paper reviews the process used by the engineering community to influence data collection methods and analysis of global infrared images of the Shuttle obtained during hypersonic entry. Emphasis is placed upon airborne imaging assets sponsored by the Shuttle program during Return to Flight.
Visual and IR entry imagery were obtained with airborne imaging platforms available within DoD, along with agency assets developed and optimized for use during Shuttle ascent, to demonstrate capability (i.e., tracking, acquisition of multispectral data, spatial resolution) and identify system limitations (i.e., radiance modeling, saturation) using state-of-the-art imaging instrumentation and communication systems. Global infrared intensity data have been transformed to temperature by comparison to Shuttle flight thermocouple data. Reasonable agreement is found between the flight thermography images and numerical predictions. A discussion of lessons learned and potential application to a proposed Shuttle boundary layer transition flight test is presented.

  18. Emerging methods to study bacteriophage infection at the single-cell level.

    PubMed

    Dang, Vinh T; Sullivan, Matthew B

    2014-01-01

    Bacteria and their viruses (phages) are abundant across diverse ecosystems and their interactions influence global biogeochemical cycles and incidence of disease. Problematically, both classical and metagenomic methods insufficiently assess the host specificity of phages and phage-host infection dynamics in nature. Here we review emerging methods to study phage-host interaction and infection dynamics with a focus on those that offer resolution at the single-cell level. These methods leverage ever-increasing sequence data to identify virus signals from single-cell amplified genome datasets or to produce primers/probes to target particular phage-bacteria pairs (digital PCR and phageFISH), even in complex communities. All three methods enable study of phage infection of uncultured bacteria from environmental samples, while the latter also discriminates between phage-host interaction outcomes (e.g., lytic, chronic, lysogenic) in model systems. Together these techniques enable quantitative, spatiotemporal studies of phage-bacteria interactions from environmental samples of any ecosystem, which will help elucidate and predict the ecological and evolutionary impacts of specific phage-host pairings in nature.

  19. Advancing methods for research on household water insecurity: Studying entitlements and capabilities, socio-cultural dynamics, and political processes, institutions and governance.

    PubMed

    Wutich, Amber; Budds, Jessica; Eichelberger, Laura; Geere, Jo; Harris, Leila; Horney, Jennifer; Jepson, Wendy; Norman, Emma; O'Reilly, Kathleen; Pearson, Amber; Shah, Sameer; Shinn, Jamie; Simpson, Karen; Staddon, Chad; Stoler, Justin; Teodoro, Manuel P; Young, Sera

    2017-11-01

    Household water insecurity has serious implications for the health, livelihoods and wellbeing of people around the world. Existing methods to assess the state of household water insecurity focus largely on water quality, quantity or adequacy, source or reliability, and affordability. These methods have significant advantages in terms of their simplicity and comparability, but are widely recognized to oversimplify and underestimate the global burden of household water insecurity. In contrast, a broader definition of household water insecurity should include entitlements and human capabilities, sociocultural dynamics, and political institutions and processes. This paper proposes a mix of qualitative and quantitative methods that can be widely adopted across cultural, geographic, and demographic contexts to assess hard-to-measure dimensions of household water insecurity. In doing so, it critically evaluates existing methods for assessing household water insecurity and suggests ways in which methodological innovations advance a broader definition of household water insecurity.

  20. A Global Sensitivity Analysis Method on Maximum Tsunami Wave Heights to Potential Seismic Source Parameters

    NASA Astrophysics Data System (ADS)

    Ren, Luchuan

    2015-04-01

Luchuan Ren, Jianwei Tian, Mingli Hong (Institute of Disaster Prevention, Sanhe, Hebei Province, 065201, P.R. China). The uncertainties of the maximum tsunami wave heights in the offshore area stem partly from uncertainties of the potential seismic tsunami source parameters. A global sensitivity analysis method for the maximum tsunami wave heights with respect to the potential seismic source parameters is put forward in this paper. The tsunami wave heights are calculated by COMCOT (the Cornell Multi-grid Coupled Tsunami Model), on the assumption that an earthquake with magnitude MW8.0 occurred at the northern fault segment along the Manila Trench and triggered a tsunami in the South China Sea. We select the simulated maximum tsunami wave heights at specific offshore sites to verify the validity of the method proposed in this paper. To rank the importance of the uncertainties of the potential seismic source parameters (the earthquake magnitude, the focal depth, the strike angle, dip angle and slip angle, etc.) in generating uncertainties of the maximum tsunami wave heights, we chose the Morris method to analyze the sensitivity of the maximum tsunami wave heights to these parameters, and give several qualitative descriptions of their linear or nonlinear effects on the maximum tsunami wave heights. We then quantitatively analyze the sensitivity of the maximum tsunami wave heights to these parameters, and the interaction effects among the parameters, by means of the extended FAST method.
The results show that the maximum tsunami wave heights are very sensitive to the earthquake magnitude, followed successively by the epicenter location, the strike angle and the dip angle; the interaction effects between the sensitive parameters are pronounced at specific offshore sites; and the importance order in generating uncertainties of the maximum tsunami wave heights differs for the same group of parameters at different specific sites. These results help to better understand the relationship between the tsunami wave heights and the seismic tsunami source parameters. Keywords: Global sensitivity analysis; Tsunami wave height; Potential seismic tsunami source parameter; Morris method; Extended FAST method
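The Morris screening step described in this record can be sketched with a minimal one-at-a-time implementation. This is a toy stand-in: the linear model and unit parameter ranges below are illustrative, not the COMCOT tsunami model or the actual source parameters.

```python
import numpy as np

def morris_screening(model, bounds, r=20, delta=0.1, seed=0):
    """Simplified Morris one-at-a-time screening.

    Returns mu* (the mean absolute elementary effect) per parameter,
    a common measure for ranking parameter importance."""
    rng = np.random.default_rng(seed)
    bounds = np.asarray(bounds, float)
    k = len(bounds)
    lo, hi = bounds[:, 0], bounds[:, 1]
    ee = np.zeros((r, k))
    for t in range(r):
        # random base point in the unit cube, kept clear of the upper edge
        x = rng.uniform(0.0, 1.0 - delta, size=k)
        f0 = model(lo + x * (hi - lo))
        for i in range(k):
            xp = x.copy()
            xp[i] += delta                    # perturb one parameter at a time
            ee[t, i] = (model(lo + xp * (hi - lo)) - f0) / delta
    return np.abs(ee).mean(axis=0)            # mu* per parameter

# Toy stand-in for the wave-height model: parameter 0 ("magnitude") dominates.
toy = lambda p: 2.0 * p[0] + 0.5 * p[1] + 0.1 * p[2]
mu_star = morris_screening(toy, [(0.0, 1.0)] * 3)
```

For a linear model the elementary effects equal the coefficients, so `mu_star` ranks the three parameters in descending importance; the study applies the same idea, plus the extended FAST method for interaction effects, to simulated maximum wave heights.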

  1. Occupational risk assessment in the construction industry in Iran.

    PubMed

    Seifi Azad Mard, Hamid Reza; Estiri, Ali; Hadadi, Parinaz; Seifi Azad Mard, Mahshid

    2017-12-01

Occupational accidents in the construction industry are more common than in other fields, and in developing countries, especially Iran, these accidents are more severe than the global average. Studies that trace the sources of these accidents and suggest solutions for them are therefore valuable. In this study, a combination of the failure mode and effects analysis method and fuzzy theory is used as a semi-qualitative-quantitative method for analyzing risks and failure modes. The main causes of occupational accidents in this field were identified and analyzed based on three factors: severity, detection, and occurrence. Corrective actions were then suggested to reduce the occupational risks, prioritized according to whether the risks are high or low priority. Finally, the results showed that these actions decreased high-priority risks by 40%.
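The FMEA prioritization step can be illustrated with a crisp risk-priority-number (RPN) sketch. The study itself uses a fuzzy variant; the hazard names and 1-10 scores below are hypothetical, not taken from the paper.

```python
# Each failure mode is scored 1-10 on severity (S), occurrence (O),
# and detection (D); RPN = S * O * D ranks the risks for corrective action.
hazards = {
    "fall from height":   {"S": 9, "O": 7, "D": 4},
    "struck by object":   {"S": 7, "O": 6, "D": 5},
    "electrical contact": {"S": 8, "O": 3, "D": 6},
}

def rpn(scores):
    return scores["S"] * scores["O"] * scores["D"]

# highest RPN first: these are the high-priority risks targeted for action
ranked = sorted(hazards, key=lambda h: rpn(hazards[h]), reverse=True)
```

A fuzzy FMEA replaces the crisp S/O/D scores with membership functions and fuzzy rules, but the ranking output serves the same purpose.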

  2. Local Descriptors of Dynamic and Nondynamic Correlation.

    PubMed

    Ramos-Cordoba, Eloy; Matito, Eduard

    2017-06-13

Quantitatively accurate electronic structure calculations rely on the proper description of electron correlation. A judicious choice of the approximate quantum chemistry method depends upon the importance of dynamic and nondynamic correlation, which is usually assessed by scalar measures. Existing measures of electron correlation do not consider separately the regions of the Cartesian space where dynamic or nondynamic correlation are most important. We introduce real-space descriptors of dynamic and nondynamic electron correlation that admit orbital decomposition. Integration of the local descriptors yields global numbers that can be used to quantify dynamic and nondynamic correlation. Illustrative examples over different chemical systems with varying electron correlation regimes are used to demonstrate the capabilities of the local descriptors. Since the expressions only require orbitals and occupation numbers, they can be readily applied in the context of local correlation methods, hybrid methods, density matrix functional theory, and fractional-occupancy density functional theory.

  3. Quantitative developmental transcriptomes of the Mediterranean sea urchin Paracentrotus lividus.

    PubMed

    Gildor, Tsvia; Malik, Assaf; Sher, Noa; Avraham, Linor; Ben-Tabou de-Leon, Smadar

    2016-02-01

Embryonic development progresses through the timely activation of thousands of differentially activated genes. Quantitative developmental transcriptomes provide the means to relate global patterns of differentially expressed genes to the emerging body plans they generate. The sea urchin is one of the classic model systems for embryogenesis, and the models of its developmental gene regulatory networks are among the most comprehensive of their kind. Thus, the sea urchin embryo is an excellent system for studies of its global developmental transcriptional profiles. Here we produced quantitative developmental transcriptomes of the sea urchin Paracentrotus lividus (P. lividus) at seven developmental stages, from the fertilized egg to the prism stage. We generated a de novo reference transcriptome and identified 29,817 genes that are expressed in this time period. We annotated and quantified gene expression at the different developmental stages and confirmed the reliability of the expression profiles by QPCR measurement of a subset of genes. The progression of embryo development is reflected in the observed global expression patterns and in our principal component analysis. Our study illuminates the rich patterns of gene expression that participate in sea urchin embryogenesis and provides an essential resource for further studies of the dynamic expression of P. lividus genes. Copyright © 2015 Elsevier B.V. All rights reserved.
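The principal component analysis mentioned in this record can be sketched on a synthetic stages-by-genes matrix (illustrative data, not the P. lividus transcriptome). For expression that changes smoothly with developmental time, the stage scores on the first component recover the temporal ordering.

```python
import numpy as np

rng = np.random.default_rng(1)
stages, genes = 7, 50                         # seven stages, toy gene set
time = np.linspace(0.0, 1.0, stages)[:, None]
# each gene ramps up with development, plus small measurement noise
X = time * rng.normal(2.0, 0.5, genes) + rng.normal(0.0, 0.05, (stages, genes))

Xc = X - X.mean(axis=0)                       # center each gene
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
pc1 = U[:, 0] * s[0]                          # stage scores on PC1
var_explained = s[0] ** 2 / (s ** 2).sum()
```

Here `pc1` is monotone across the seven stages (up to the arbitrary sign of an SVD component), mirroring how the study's PCA reflects the progression of embryo development.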

  4. Quantitative Profiling of Feruloylated Arabinoxylan Side-Chains from Graminaceous Cell Walls

    PubMed Central

    Schendel, Rachel R.; Meyer, Marleen R.; Bunzel, Mirko

    2016-01-01

    Graminaceous arabinoxylans are distinguished by decoration with feruloylated monosaccharidic and oligosaccharidic side-chains. Although it is hypothesized that structural complexity and abundance of these feruloylated arabinoxylan side-chains may contribute, among other factors, to resistance of plant cell walls to enzymatic degradation, quantitative profiling approaches for these structural units in plant cell wall materials have not been described yet. Here we report the development and application of a rapid and robust method enabling the quantitative comparison of feruloylated side-chain profiles in cell wall materials following mildly acidic hydrolysis, C18-solid phase extraction (SPE), reduction under aprotic conditions, and liquid chromatography with diode-array detection/mass spectrometry (LC-DAD/MS) separation and detection. The method was applied to the insoluble fiber/cell wall materials isolated from 12 whole grains: wild rice (Zizania aquatica L.), long-grain brown rice (Oryza sativa L.), rye (Secale cereale L.), kamut (Triticum turanicum Jakubz.), wheat (Triticum aestivum L.), spelt (Triticum spelta L.), intermediate wheatgrass (Thinopyrum intermedium), maize (Zea mays L.), popcorn (Zea mays L. var. everta), oat (Avena sativa L.) (dehulled), barley (Hordeum vulgare L.) (dehulled), and proso millet (Panicum miliaceum L.). Between 51 and 96% of the total esterified monomeric ferulates were represented in the quantified compounds captured in the feruloylated side-chain profiles, which confirms the significance of these structures to the global arabinoxylan structure in terms of quantity. The method provided new structural insights into cereal grain arabinoxylans, in particular, that the structural moiety α-l-galactopyranosyl-(1→2)-β-d-xylopyranosyl-(1→2)-5-O-trans-feruloyl-l-arabinofuranose (FAXG), which had previously only been described in maize, is ubiquitous to cereal grains. PMID:26834763

  5. Establishment and validation of analytical reference panels for the standardization of quantitative BCR-ABL1 measurements on the international scale.

    PubMed

    White, Helen E; Hedges, John; Bendit, Israel; Branford, Susan; Colomer, Dolors; Hochhaus, Andreas; Hughes, Timothy; Kamel-Reid, Suzanne; Kim, Dong-Wook; Modur, Vijay; Müller, Martin C; Pagnano, Katia B; Pane, Fabrizio; Radich, Jerry; Cross, Nicholas C P; Labourier, Emmanuel

    2013-06-01

    Current guidelines for managing Philadelphia-positive chronic myeloid leukemia include monitoring the expression of the BCR-ABL1 (breakpoint cluster region/c-abl oncogene 1, non-receptor tyrosine kinase) fusion gene by quantitative reverse-transcription PCR (RT-qPCR). Our goal was to establish and validate reference panels to mitigate the interlaboratory imprecision of quantitative BCR-ABL1 measurements and to facilitate global standardization on the international scale (IS). Four-level secondary reference panels were manufactured under controlled and validated processes with synthetic Armored RNA Quant molecules (Asuragen) calibrated to reference standards from the WHO and the NIST. Performance was evaluated in IS reference laboratories and with non-IS-standardized RT-qPCR methods. For most methods, percent ratios for BCR-ABL1 e13a2 and e14a2 relative to ABL1 or BCR were robust at 4 different levels and linear over 3 logarithms, from 10% to 0.01% on the IS. The intraassay and interassay imprecision was <2-fold overall. Performance was stable across 3 consecutive lots, in multiple laboratories, and over a period of 18 months to date. International field trials demonstrated the commutability of the reagents and their accurate alignment to the IS within the intra- and interlaboratory imprecision of IS-standardized methods. The synthetic calibrator panels are robust, reproducibly manufactured, analytically calibrated to the WHO primary standards, and compatible with most BCR-ABL1 RT-qPCR assay designs. The broad availability of secondary reference reagents will further facilitate interlaboratory comparative studies and independent quality assessment programs, which are of paramount importance for worldwide standardization of BCR-ABL1 monitoring results and the optimization of current and new therapeutic approaches for chronic myeloid leukemia. © 2013 American Association for Clinical Chemistry.
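The scale alignment that these reference panels support can be sketched as follows. The copy numbers and the conversion factor below are hypothetical; real laboratories derive their lab-specific conversion factor through sample exchange with an IS reference laboratory.

```python
import math

def percent_ratio(bcr_abl1_copies, control_copies):
    """Raw percent ratio of BCR-ABL1 to a control gene (e.g. ABL1)."""
    return 100.0 * bcr_abl1_copies / control_copies

def to_international_scale(raw_percent, conversion_factor):
    """Align a lab's raw ratio to the IS via its lab-specific CF."""
    return raw_percent * conversion_factor

def molecular_response(is_percent):
    """Log10 reduction from the 100% IS baseline (e.g. MR3 = 0.1% IS)."""
    return math.log10(100.0 / is_percent)

raw = percent_ratio(12, 20_000)            # hypothetical copy numbers
is_pct = to_international_scale(raw, 1.6)  # hypothetical CF of 1.6
```

Calibrated panels such as those described above let a laboratory check that its measured ratios, after CF conversion, land on the expected IS levels (10% down to 0.01%).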

  6. Methods for assessing geodiversity

    NASA Astrophysics Data System (ADS)

    Zwoliński, Zbigniew; Najwer, Alicja; Giardino, Marco

    2017-04-01

The accepted systematics of geodiversity assessment methods will be presented in three categories: qualitative, quantitative and qualitative-quantitative. Qualitative methods are usually descriptive methods that are suited to nominal and ordinal data. Quantitative methods use a different set of parameters and indicators to determine the characteristics of geodiversity in the area being researched. Qualitative-quantitative methods are a good combination of the collection of quantitative data (i.e. digital) and cause-effect data (i.e. relational and explanatory). It seems that at the current stage of the development of geodiversity research methods, qualitative-quantitative methods are the most advanced and best assess the geodiversity of the study area. Their particular advantage is the integration of data from different sources and with different substantive content. Among the distinguishing features of the quantitative and qualitative-quantitative methods for assessing geodiversity are their wide use within geographic information systems, both at the stage of data collection and data integration, as well as numerical processing and their presentation. The unresolved problem for these methods, however, is the possibility of their validation. It seems that currently the best method of validation is direct field verification. In the next few years, the development of qualitative-quantitative methods connected with cognitive issues, oriented towards ontology and the Semantic Web, is to be expected.

  7. NetMHCpan, a Method for Quantitative Predictions of Peptide Binding to Any HLA-A and -B Locus Protein of Known Sequence

    PubMed Central

    Nielsen, Morten; Lundegaard, Claus; Blicher, Thomas; Lamberth, Kasper; Harndahl, Mikkel; Justesen, Sune; Røder, Gustav; Peters, Bjoern; Sette, Alessandro; Lund, Ole; Buus, Søren

    2007-01-01

Background: Binding of peptides to Major Histocompatibility Complex (MHC) molecules is the single most selective step in the recognition of pathogens by the cellular immune system. The human MHC class I system (HLA-I) is extremely polymorphic. The number of registered HLA-I molecules has now surpassed 1500. Characterizing the specificity of each separately would be a major undertaking. Principal Findings: Here, we have drawn on a large database of known peptide-HLA-I interactions to develop a bioinformatics method, which takes both peptide and HLA sequence information into account, and generates quantitative predictions of the affinity of any peptide-HLA-I interaction. Prospective experimental validation of peptides predicted to bind to previously untested HLA-I molecules, cross-validation, and retrospective prediction of known HIV immune epitopes and endogenous presented peptides, all successfully validate this method. We further demonstrate that the method can be applied to perform a clustering analysis of MHC specificities and suggest using this clustering to select particularly informative novel MHC molecules for future biochemical and functional analysis. Conclusions: Encompassing all HLA molecules, this high-throughput computational method lends itself to epitope searches that are not only genome- and pathogen-wide, but also HLA-wide. Thus, it offers a truly global analysis of immune responses supporting rational development of vaccines and immunotherapy. It also promises to provide new basic insights into HLA structure-function relationships. The method is available at http://www.cbs.dtu.dk/services/NetMHCpan. PMID:17726526

  8. Comparison of analytical methods of brain [18F]FDG-PET after severe traumatic brain injury.

    PubMed

    Madsen, Karine; Hesby, Sara; Poulsen, Ingrid; Fuglsang, Stefan; Graff, Jesper; Larsen, Karen B; Kammersgaard, Lars P; Law, Ian; Siebner, Hartwig R

    2017-11-01

Loss of consciousness has been shown to reduce cerebral metabolic rates of glucose (CMRglc) measured by brain [18F]FDG-PET. Measurements of regional metabolic patterns normalized to global cerebral metabolism or to the cerebellum may underestimate widespread reductions. The aim of this study was to compare quantification methods of whole-brain glucose metabolism, including whole-brain [18F]FDG uptake normalized to uptake in the cerebellum, normalized to injected activity, normalized to plasma tracer concentration, and two methods for estimating CMRglc. Six patients suffering from severe traumatic brain injury (TBI) and ten healthy controls (HC) underwent a 10-min static [18F]FDG-PET scan and venous blood sampling. Except for normalization to the cerebellum, all quantification methods found a significantly lower level of whole-brain glucose metabolism, by 25-33%, in TBI patients compared with HC. Accordingly, these measurements correlated with the level of consciousness. Our study demonstrates that the analysis method of the [18F]FDG-PET data has a substantial impact on the estimated whole-brain cerebral glucose metabolism in patients with severe TBI. Importantly, the SUVR method, which is often used in a clinical setting, was not able to distinguish patients with severe TBI from HC at the whole-brain level. We recommend supplementing a static [18F]FDG scan with a single venous blood sample in future studies of patients with severe TBI or reduced level of consciousness. This can be used for simple semi-quantitative uptake values by normalizing brain activity uptake to plasma tracer concentration, or for quantitative estimates of CMRglc. Copyright © 2017 Elsevier B.V. All rights reserved.
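The study's central point, that a cerebellum ratio (SUVR) can hide a uniform global reduction while plasma normalization reveals it, can be sketched with illustrative numbers (not patient data):

```python
def suvr(brain_uptake, cerebellum_uptake):
    """Whole-brain uptake normalized to the cerebellum."""
    return brain_uptake / cerebellum_uptake

def plasma_normalized(brain_uptake, plasma_concentration):
    """Whole-brain uptake normalized to venous plasma tracer concentration."""
    return brain_uptake / plasma_concentration

healthy = {"brain": 10.0, "cerebellum": 9.0, "plasma": 2.0}
tbi     = {"brain": 7.0,  "cerebellum": 6.3, "plasma": 2.0}  # uniform -30%

suvr_h = suvr(healthy["brain"], healthy["cerebellum"])
suvr_t = suvr(tbi["brain"], tbi["cerebellum"])          # ratio unchanged
pn_h = plasma_normalized(healthy["brain"], healthy["plasma"])
pn_t = plasma_normalized(tbi["brain"], tbi["plasma"])   # deficit visible
```

Because a global reduction scales the brain and the cerebellum alike, the SUVR is identical in both cases, whereas the plasma-normalized value drops by the full 30%.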

  9. Enhanced contractility of intraparenchymal arterioles after global cerebral ischaemia in rat - new insights into the development of delayed cerebral hypoperfusion.

    PubMed

    Spray, S; Johansson, S E; Radziwon-Balicka, A; Haanes, K A; Warfvinge, K; Povlsen, G K; Kelly, P A T; Edvinsson, L

    2017-08-01

Delayed cerebral hypoperfusion is a secondary complication found in the days after transient global cerebral ischaemia that worsens the ischaemic damage inflicted by the initial transient episode of global cerebral ischaemia. A recent study demonstrated increased cerebral vasoconstriction in the large arteries on the brain surface (pial arteries) after global cerebral ischaemia. However, smaller arterioles inside the brain (parenchymal arterioles) are equally important in the regulation of cerebral blood flow, and yet their pathophysiology after global cerebral ischaemia is largely unknown. Therefore, we investigated whether increased contractility occurs in the intraparenchymal arterioles. Global cerebral ischaemia was induced in male Wistar rats by bilateral common carotid occlusion for 15 min combined with hypovolaemia. Regional cerebral blood flow was determined by quantitative autoradiography. Intraparenchymal arterioles were isolated and pressurized, and concentration-response curves to endothelin-1, with and without the endothelin B receptor-selective antagonist BQ788, were generated. Endothelin B receptor expression was investigated by quantitative flow cytometry and immunohistochemistry. We observed increased endothelin-1-mediated contractility of parenchymal arterioles correlating with reduced cerebral blood flow of the cortex, hippocampus and caudate nucleus 48 h after global cerebral ischaemia. The increased endothelin-1-mediated contractility was abolished by BQ788, and the vascular smooth muscle cell-specific expression of endothelin B receptors was significantly increased after global cerebral ischaemia. Increased endothelin-1-mediated contractility and expression of endothelin B receptors in the intraparenchymal vasculature contribute to the development of delayed cerebral hypoperfusion after global cerebral ischaemia, in combination with vascular changes of the pial vasculature. © 2016 Scandinavian Physiological Society. Published by John Wiley & Sons Ltd.

  10. Characterization and quantitative analysis of phenylpropanoid amides in eggplant (Solanum melongena L.) by high performance liquid chromatography coupled with diode array detection and hybrid ion trap time-of-flight mass spectrometry.

    PubMed

    Sun, Jing; Song, Yue-Lin; Zhang, Jing; Huang, Zheng; Huo, Hui-Xia; Zheng, Jiao; Zhang, Qian; Zhao, Yun-Fang; Li, Jun; Tu, Peng-Fei

    2015-04-08

Eggplant (Solanum melongena L.) is a famous edible and medicinal plant. Although it is widely cultivated and used, data on parts other than the fruit are limited. The present study focused on the qualitative and quantitative analysis of the chemical constituents, particularly phenylpropanoid amides (PAs), in eggplant. The mass fragmentation patterns of PAs were proposed using seven authentic compounds with the assistance of a hybrid ion trap time-of-flight mass spectrometer. Thirty-seven compounds (27 PAs and 10 others) were detected and plausibly assigned in the different parts of eggplant. Afterward, a reliable method based on liquid chromatography coupled with diode array detection was developed, validated, and applied for the simultaneous determination of seven PAs and three caffeoylquinic acids in 17 batches of eggplant roots with satisfactory accuracy, precision, and reproducibility, which could not only provide global chemical insight into eggplant but also offer a reliable tool for quality control.

  11. SVM-Based Synthetic Fingerprint Discrimination Algorithm and Quantitative Optimization Strategy

    PubMed Central

    Chen, Suhang; Chang, Sheng; Huang, Qijun; He, Jin; Wang, Hao; Huang, Qiangui

    2014-01-01

    Synthetic fingerprints are a potential threat to automatic fingerprint identification systems (AFISs). In this paper, we propose an algorithm to discriminate synthetic fingerprints from real ones. First, four typical characteristic factors—the ridge distance features, global gray features, frequency feature and Harris Corner feature—are extracted. Then, a support vector machine (SVM) is used to distinguish synthetic fingerprints from real fingerprints. The experiments demonstrate that this method can achieve a recognition accuracy rate of over 98% for two discrete synthetic fingerprint databases as well as a mixed database. Furthermore, a performance factor that can evaluate the SVM's accuracy and efficiency is presented, and a quantitative optimization strategy is established for the first time. After the optimization of our synthetic fingerprint discrimination task, the polynomial kernel with a training sample proportion of 5% is the optimized value when the minimum accuracy requirement is 95%. The radial basis function (RBF) kernel with a training sample proportion of 15% is a more suitable choice when the minimum accuracy requirement is 98%. PMID:25347063

  12. Surface temperature/heat transfer measurement using a quantitative phosphor thermography system

    NASA Technical Reports Server (NTRS)

    Buck, G. M.

    1991-01-01

    A relative-intensity phosphor thermography technique developed for surface heating studies in hypersonic wind tunnels is described. A direct relationship between relative emission intensity and phosphor temperature is used for quantitative surface temperature measurements in time. The technique provides global surface temperature-time histories using a 3-CCD (Charge Coupled Device) video camera and digital recording system. A current history of technique development at Langley is discussed. Latest developments include a phosphor mixture for a greater range of temperature sensitivity and use of castable ceramics for inexpensive test models. A method of calculating surface heat-transfer from thermal image data in blowdown wind tunnels is included in an appendix, with an analysis of material thermal heat-transfer properties. Results from tests in the Langley 31-Inch Mach 10 Tunnel are presented for a ceramic orbiter configuration and a four-inch diameter hemisphere model. Data include windward heating for bow-shock/wing-shock interactions on the orbiter wing surface, and a comparison with prediction for hemisphere heating distribution.
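The relative-intensity step can be sketched as a calibrated lookup: a monotonic curve relates the emission-intensity ratio to phosphor temperature, and each image pixel is converted by interpolation. The calibration points below are illustrative, not Langley calibration data.

```python
import numpy as np

# Monotonic calibration: relative emission-intensity ratio -> temperature (K)
cal_ratio = np.array([0.10, 0.25, 0.45, 0.70, 0.90])
cal_temp_K = np.array([300.0, 330.0, 360.0, 390.0, 420.0])

def ratio_to_temperature(ratio_image):
    """Convert a per-pixel intensity-ratio image to temperature (K)."""
    return np.interp(ratio_image, cal_ratio, cal_temp_K)

# a tiny 2x2 "image"; applying this frame by frame to the 3-CCD video
# yields the global surface temperature-time histories described above
pixels = np.array([[0.10, 0.175],
                   [0.45, 0.90]])
T = ratio_to_temperature(pixels)
```

Heat transfer is then recovered from the resulting temperature-time histories using the material thermal properties, as described in the paper's appendix.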

  13. Application of differential evolution algorithm on self-potential data.

    PubMed

    Li, Xiangtao; Yin, Minghao

    2012-01-01

Differential evolution (DE) is a population-based evolutionary algorithm widely used for solving multidimensional global optimization problems over continuous spaces, and has been successfully applied to several kinds of problems. In this paper, differential evolution is used for the quantitative interpretation of self-potential data in geophysics. Six parameters are estimated, including the electrical dipole moment, the depth of the source, the distance from the origin, the polarization angle and the regional coefficients. This study considers three kinds of data from Turkey: noise-free data, contaminated synthetic data, and a field example. The evolution of the model parameters is tracked as a function of the number of generations. Then, we show the variation of the parameters in the vicinity of the low-misfit area. Moreover, we show how the frequency distribution of each parameter is related to the number of DE iterations. Experimental results show that DE can solve the quantitative interpretation of self-potential data more efficiently than previous methods.
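A minimal DE/rand/1/bin minimizer illustrates the algorithm. The sphere objective stands in for the self-potential misfit function, and the control settings (population size, F, CR) are illustrative, not the paper's.

```python
import numpy as np

def differential_evolution(f, bounds, pop=30, gens=200, F=0.7, CR=0.9, seed=0):
    """Minimal DE/rand/1/bin minimizer over box-constrained parameters."""
    rng = np.random.default_rng(seed)
    bounds = np.asarray(bounds, float)
    d = len(bounds)
    lo, hi = bounds[:, 0], bounds[:, 1]
    X = rng.uniform(lo, hi, size=(pop, d))            # initial population
    fit = np.array([f(x) for x in X])
    for _ in range(gens):
        for i in range(pop):
            idx = [j for j in range(pop) if j != i]
            a, b, c = X[rng.choice(idx, size=3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)  # mutation
            cross = rng.random(d) < CR                 # binomial crossover
            cross[rng.integers(d)] = True              # keep >= 1 mutant gene
            trial = np.where(cross, mutant, X[i])
            f_trial = f(trial)
            if f_trial <= fit[i]:                      # greedy selection
                X[i], fit[i] = trial, f_trial
    best = int(np.argmin(fit))
    return X[best], fit[best]

# toy misfit with known minimum at (1.5, 1.5, 1.5, 1.5)
sphere = lambda x: float(np.sum((x - 1.5) ** 2))
x_best, f_best = differential_evolution(sphere, [(-5.0, 5.0)] * 4)
```

Swapping `sphere` for a forward-model misfit (observed minus predicted self-potential anomaly over the six source parameters) turns this sketch into the kind of quantitative interpretation the paper performs.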

  14. Application of Differential Evolution Algorithm on Self-Potential Data

    PubMed Central

    Li, Xiangtao; Yin, Minghao

    2012-01-01

Differential evolution (DE) is a population-based evolutionary algorithm widely used for solving multidimensional global optimization problems over continuous spaces, and has been successfully applied to several kinds of problems. In this paper, differential evolution is used for the quantitative interpretation of self-potential data in geophysics. Six parameters are estimated, including the electrical dipole moment, the depth of the source, the distance from the origin, the polarization angle and the regional coefficients. This study considers three kinds of data from Turkey: noise-free data, contaminated synthetic data, and a field example. The evolution of the model parameters is tracked as a function of the number of generations. Then, we show the variation of the parameters in the vicinity of the low-misfit area. Moreover, we show how the frequency distribution of each parameter is related to the number of DE iterations. Experimental results show that DE can solve the quantitative interpretation of self-potential data more efficiently than previous methods. PMID:23240004

  15. Global Multiculturalism in Undergraduate Sociology Course: An Analysis of Introductory Textbooks in the U.S.

    ERIC Educational Resources Information Center

    Shin, Kyoung-Ho

    2014-01-01

Focusing on global multiculturalism, this study offers a content review of undergraduate introductory sociology textbooks that are being used in the United States. Through both quantitative and qualitative review of the texts, it is found that there is considerable variation in the way in which concepts and latent meanings are embedded within textbook…

  16. Public Constructs of Energy Values and Behaviors in Implementing Taiwan's "Energy-Conservation/Carbon-Reduction" Declarations

    ERIC Educational Resources Information Center

    Chiu, Mei-Shiu; Yeh, Huei-Ming; Spangler, Jonathan

    2016-01-01

    The emergent crisis of global warming calls for energy education for people of all ages and social groups. The Taiwanese government has publicized 10 declarations on energy conservation and carbon reduction as public behavior guidelines to mitigate global warming. This study uses interviews with quantitative assessment to explore the values and…

  17. "An Inconvenient Truth"--Is It Still Effective at Familiarizing Students with Global Warming?

    ERIC Educational Resources Information Center

    Griep, Mark A.; Reimer, Kaitlin

    2016-01-01

    Chemistry courses for nonscience majors emphasize chemical concepts and the relationship of chemical knowledge to everyday life while teaching the utility of quantitative analysis. As an introduction to the topic of global warming, the first half of "An Inconvenient Truth," released in 2006, has been shown annually since 2008 in the…

  18. A Case-Controlled Investigation of Tactile Reactivity in Young Children with and without Global Developmental Delay

    ERIC Educational Resources Information Center

    Barney, Chantel C.; Tervo, Raymond; Wilcox, George L.; Symons, Frank J.

    2017-01-01

    Assessing tactile function among children with intellectual, motor, and communication impairments remains a clinical challenge. A case control design was used to test whether children with global developmental delays (GDD; n = 20) would be more/less reactive to a modified quantitative sensory test (mQST) compared to controls (n = 20). Reactivity…

  19. "Going Glocal": A Qualitative and Quantitative Analysis of Global Citizenship Education at a Dutch Liberal Arts and Sciences College

    ERIC Educational Resources Information Center

    Sklad, M.; Friedman, J.; Park, E.; Oomen, B.

    2016-01-01

    Over the past decades, more and more institutions of higher learning have developed programs destined to educate students for global citizenship. Such efforts pose considerable challenges: conceptually, pedagogically and from the perspective of impact assessment. Conceptually, it is of utmost importance to pay attention to both structural…

  20. Quantifying and Comparing Effects of Climate Engineering Methods on the Earth System

    NASA Astrophysics Data System (ADS)

    Sonntag, Sebastian; Ferrer González, Miriam; Ilyina, Tatiana; Kracher, Daniela; Nabel, Julia E. M. S.; Niemeier, Ulrike; Pongratz, Julia; Reick, Christian H.; Schmidt, Hauke

    2018-02-01

    To contribute to a quantitative comparison of climate engineering (CE) methods, we assess atmosphere-, ocean-, and land-based CE measures with respect to Earth system effects consistently within one comprehensive model. We use the Max Planck Institute Earth System Model (MPI-ESM) with a prognostic carbon cycle to compare solar radiation management (SRM) by stratospheric sulfur injection and two carbon dioxide removal methods: afforestation and ocean alkalinization. The CE model experiments are designed to offset the effect of fossil-fuel burning on global mean surface air temperature under the RCP8.5 scenario, so that it follows or approaches the RCP4.5 scenario. Our results show the importance of feedbacks in the CE effects. For example, as a response to SRM the land carbon uptake is enhanced by 92 Gt by the year 2100 compared to the reference RCP8.5 scenario due to reduced soil respiration, thus reducing atmospheric CO2. Furthermore, we show that normalizations allow for a better comparability of different CE methods. For example, we find that, due to compensating processes such as the biogeophysical effects of afforestation, more carbon needs to be removed from the atmosphere by afforestation than by alkalinization to reach the same global warming reduction. Overall, we illustrate how different CE methods affect the components of the Earth system, identify challenges arising in a CE comparison, and thereby contribute to developing a framework for a comparative assessment of CE.

  1. Ionospheric Simulation System for Satellite Observations and Global Assimilative Modeling Experiments (ISOGAME)

    NASA Technical Reports Server (NTRS)

    Pi, Xiaoqing; Mannucci, Anthony J.; Verkhoglyadova, Olga P.; Stephens, Philip; Wilson, Brian D.; Akopian, Vardan; Komjathy, Attila; Lijima, Byron A.

    2013-01-01

    ISOGAME is designed and developed to assess quantitatively the impact of new observation systems on the capability of imaging and modeling the ionosphere. With ISOGAME, one can perform observation system simulation experiments (OSSEs). A typical OSSE using ISOGAME would involve: (1) simulating various ionospheric conditions on global scales; (2) simulating ionospheric measurements made from a constellation of low-Earth-orbiters (LEOs), particularly Global Navigation Satellite System (GNSS) radio occultation data, and from ground-based global GNSS networks; (3) conducting ionospheric data assimilation experiments with the Global Assimilative Ionospheric Model (GAIM); and (4) analyzing modeling results with visualization tools. ISOGAME can provide quantitative assessment of the accuracy of assimilative modeling with the observation system of interest. Observation systems other than those based on GNSS can also be analyzed. The system is composed of a suite of software that combines the GAIM, including a 4D first-principles ionospheric model and data assimilation modules, an International Reference Ionosphere (IRI) model that has been developed by international ionospheric research communities, an observation simulator, visualization software, and orbit design, simulation, and optimization software. The core GAIM model used in ISOGAME is based on the GAIM++ code (written in C++), which includes a new high-fidelity geomagnetic field representation (multi-dipole). New visualization tools and analysis algorithms for the OSSEs are now part of ISOGAME.

  2. Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast.

    PubMed

    Pang, Wei; Coghill, George M

    2015-05-01

    In this paper we demonstrate how Morven, a computational framework that can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First the Morven framework itself is briefly introduced in terms of the model formalism employed and the output format. We then built a qualitative model of the biophysical process of osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on an existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally, the future development of the Morven framework for modelling dynamic biological systems is discussed. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  3. Scout-view Assisted Interior Micro-CT

    PubMed Central

    Sen Sharma, Kriti; Holzner, Christian; Vasilescu, Dragoş M.; Jin, Xin; Narayanan, Shree; Agah, Masoud; Hoffman, Eric A.; Yu, Hengyong; Wang, Ge

    2013-01-01

    Micro computed tomography (micro-CT) is a widely-used imaging technique. A challenge of micro-CT is to quantitatively reconstruct a sample larger than the field-of-view (FOV) of the detector. This scenario is characterized by truncated projections and associated image artifacts. However, for such truncated scans, a low-resolution scout scan with an increased FOV is frequently acquired so as to position the sample properly. This study shows that the otherwise discarded scout scans can provide sufficient additional information to uniquely and stably reconstruct the interior region of interest. Two interior reconstruction methods are designed to utilize the multi-resolution data without significant computational overhead. While most previous studies used numerically truncated global projections as interior data, this study uses truly hybrid scans in which global and interior scans were carried out at different resolutions. Additionally, owing to the lack of standard interior micro-CT phantoms, we designed and fabricated novel interior micro-CT phantoms for this study to provide a means of validation for our algorithms. Finally, two characteristic samples from separate studies were scanned to show the effect of our reconstructions. The presented methods show significant improvements over existing reconstruction algorithms. PMID:23732478

  4. Measuring accessibility of sustainable transportation using space syntax in Bojonggede area

    NASA Astrophysics Data System (ADS)

    Suryawinata, B. A.; Mariana, Y.; Wijaksono, S.

    2017-12-01

    Changes in the physical structure of regional space, resulting from the growth of planned and unplanned settlements in the Bojonggede area, have an impact on the road network pattern and, in turn, on the permeability of the area. Permeability measures the extent to which a road network pattern provides options in traveling: as permeability increases, travel distances decrease and the number of route choices increases, creating an access system that is easy to use and physically integrated. This study aims to identify the relationship between the physical characteristics of residential areas and road network patterns and the level of spatial permeability in the Bojonggede area, and can serve as a reference for the arrangement of circulation, accessibility, and land use in the vicinity of Bojonggede. The research uses a quantitative method and the space syntax method to measure global integration and local integration, which serve as the parameters of permeability. The results show that the parts of Bojonggede with high global and local permeability are those whose road networks follow a grid pattern.

  5. Relationship of various factors affecting the sustainable private forest management at Pajangan District, Special Regions Yogyakarta, Indonesia

    NASA Astrophysics Data System (ADS)

    Widayanto, B.; Karsidi, R.; Kusnandar; Sutrisno, J.

    2018-03-01

    Forests play a role in providing a good atmosphere with a stable oxygen content and in stabilizing the global climate. Well-managed forests provide stable climatic conditions under global climate change and a sustainable environment. This study aims to analyze the relationships among various factors affecting the sustainability of private forest management. It is a quantitative study using the survey method, with the sample area determined by purposive sampling and respondents selected by multi-stage cluster sampling (60 samples). The results show that successful sustainable private forest management is influenced by factors such as group dynamics, stakeholder support, community institutions, and farmer participation. The continuity of private forest management is determined by the fulfillment of economic, social, and environmental dimensions. The most interesting finding is that group dynamics are very good: the sense of togetherness within the community managing private forests under limited resources is very strong. This sense of togetherness fostered creativity in diversifying businesses and thus reduced the pressure to exploit the forest. Some community members regard managing the people's forest as part of their culture, which helps make its existence more sustainable.

  6. Quantitative analysis of the effect of climate change and human activities on runoff in the Liujiang River Basin

    NASA Astrophysics Data System (ADS)

    LI, X.

    2017-12-01

    Abstract: As basic and strategic natural resources for humans, water resources face unprecedented challenges under global climate change. Analyzing the variation characteristics of runoff and the effects of climate change and human activities on runoff can provide a basis for the rational utilization and management of water resources. Taking the Liujiang River Basin as the research object, and using discharge data from the basin's hydrological station and meteorological data from 24 meteorological stations in Guangxi Province, the variation characteristics of runoff and precipitation in the basin were analyzed, and the effects of climate change and human activities on runoff were quantified. The results show that runoff and precipitation in the Liujiang River Basin had an increasing trend from 1964 to 2006. Using the accumulative anomaly method and the orderly cluster method, the runoff series was divided into a base period and a change period. A BP-ANN model and the sensitivity coefficient method were used to quantify the influences of climate change and human activities on runoff. We found that the most important factor behind the increasing trend of discharge in the Liujiang River Basin was precipitation. Human activities were also an important factor influencing the intra-annual distribution of runoff. Precipitation had a more sensitive influence on runoff variation than potential evaporation in the Liujiang River Basin. Key words: Liujiang River Basin, climate change, human activities, BP-ANN, sensitivity coefficient method
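    The accumulative anomaly technique used above to split the runoff series into a base period and a change period has a compact generic form: an extremum in the cumulative departure from the long-term mean marks a candidate change point. A minimal sketch on synthetic data (the series and all names are invented, not taken from the study):

```python
import numpy as np

def accumulative_anomaly(runoff):
    """Cumulative departure from the series mean; an extremum in this
    curve marks a candidate change point between base and change periods."""
    runoff = np.asarray(runoff, dtype=float)
    return np.cumsum(runoff - runoff.mean())

# Synthetic annual runoff: the mean shifts upward after year index 24.
rng = np.random.default_rng(0)
series = np.concatenate([rng.normal(100, 5, 25), rng.normal(115, 5, 18)])

curve = accumulative_anomaly(series)
change_point = int(np.argmin(curve))  # the minimum precedes the upward shift
```

    Because an upward mean shift turns the cumulative anomaly from steadily decreasing to steadily increasing, the curve's minimum falls near the true change point; for a downward shift one would look at the maximum instead.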

  7. Short-range quantitative precipitation forecasting using Deep Learning approaches

    NASA Astrophysics Data System (ADS)

    Akbari Asanjan, A.; Yang, T.; Gao, X.; Hsu, K. L.; Sorooshian, S.

    2017-12-01

    Predicting short-range quantitative precipitation is very important for flood forecasting, early flood warning, and other hydrometeorological purposes. This study aims to improve precipitation forecasting skill using a recently developed and advanced machine learning technique named Long Short-Term Memory (LSTM). The proposed LSTM learns the changing patterns of clouds from Cloud-Top Brightness Temperature (CTBT) images, retrieved from the infrared channel of the Geostationary Operational Environmental Satellite (GOES), using a sophisticated and effective learning method. After learning the dynamics of clouds, the LSTM model predicts the upcoming rainy CTBT events. The proposed model is then merged with a precipitation estimation algorithm termed Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks (PERSIANN) to provide precipitation forecasts. The results of the LSTM merged with PERSIANN are compared to those of an Elman-type Recurrent Neural Network (RNN) merged with PERSIANN and to the Final Analysis of the Global Forecast System model over the states of Oklahoma, Florida, and Oregon. The performance of each model is investigated during three storm events, each located over one of the study regions. The results indicate that the merged LSTM forecasts outperform the numerical and statistical baselines in terms of Probability of Detection (POD), False Alarm Ratio (FAR), Critical Success Index (CSI), RMSE, and correlation coefficient, especially in convective systems. The proposed method thus shows superior capabilities in short-term forecasting over the compared methods.

  8. Automated classification and quantitative analysis of arterial and venous vessels in fundus images

    NASA Astrophysics Data System (ADS)

    Alam, Minhaj; Son, Taeyoon; Toslak, Devrim; Lim, Jennifer I.; Yao, Xincheng

    2018-02-01

    It is known that retinopathies may affect arteries and veins differently. Therefore, reliable differentiation of arteries and veins is essential for computer-aided analysis of fundus images. The purpose of this study is to validate an automated method for robust classification of arteries and veins (A-V) in digital fundus images. We combine optical density ratio (ODR) analysis and a blood vessel tracking algorithm to classify arteries and veins. A matched filtering method is used to enhance retinal blood vessels. Bottom-hat filtering and global thresholding are used to segment the vessels and skeletonize individual blood vessels. The vessel tracking algorithm is used to locate the optic disk and to identify source nodes of blood vessels in the optic disk area. Each node can be identified as vein or artery using ODR information. Using the source nodes as starting points, each whole vessel trace is then tracked and classified as vein or artery using vessel curvature and angle information. Fifty color fundus images from diabetic retinopathy patients were used to test the algorithm. Sensitivity, specificity, and accuracy metrics were measured to assess the validity of the proposed classification method against ground truths created by two independent observers. The algorithm demonstrated 97.52% accuracy in identifying blood vessels as vein or artery. A quantitative analysis based on the A-V classification showed that the average A-V width ratio for NPDR subjects with hypertension decreased significantly (by 43.13%).

  9. Spatial organization of RNA polymerase II inside a mammalian cell nucleus revealed by reflected light-sheet superresolution microscopy.

    PubMed

    Zhao, Ziqing W; Roy, Rahul; Gebhardt, J Christof M; Suter, David M; Chapman, Alec R; Xie, X Sunney

    2014-01-14

    Superresolution microscopy based on single-molecule centroid determination has been widely applied to cellular imaging in recent years. However, quantitative imaging of the mammalian nucleus has been challenging due to the lack of 3D optical sectioning methods for normal-sized cells, as well as the inability to accurately count the absolute copy numbers of biomolecules in highly dense structures. Here we report a reflected light-sheet superresolution microscopy method capable of imaging inside the mammalian nucleus with superior signal-to-background ratio as well as molecular counting with single-copy accuracy. Using reflected light-sheet superresolution microscopy, we probed the spatial organization of transcription by RNA polymerase II (RNAP II) molecules and quantified their global extent of clustering inside the mammalian nucleus. Spatiotemporal clustering analysis that leverages the blinking photophysics of specific organic dyes showed that the majority (>70%) of the transcription foci originate from single RNAP II molecules, and no significant clustering between RNAP II molecules was detected within the length scale of the reported diameter of "transcription factories." Colocalization measurements of RNAP II molecules equally labeled by two spectrally distinct dyes confirmed the primarily unclustered distribution, arguing against a prevalent existence of transcription factories in the mammalian nucleus as previously proposed. The methods developed in our study pave the way for quantitative mapping and stoichiometric characterization of key biomolecular species deep inside mammalian cells.

  10. Barriers to global health development: An international quantitative survey.

    PubMed

    Weiss, Bahr; Pollack, Amie Alley

    2017-01-01

    Global health's goal of reducing low-and-middle-income country (LMIC) versus high-income country health disparities faces complex challenges. Although there have been discussions of barriers, there has not been a broad-based, quantitative survey of such barriers. 432 global health professionals were invited via email to participate in an online survey, with 268 (62%) participating. The survey assessed participants' (A) demographic and global health background, (B) perceptions of the seriousness of 66 barriers, (C) detailed ratings of the barriers designated most serious, and (D) potential solutions. Thirty-four of the 66 barriers were seen as moderately or more serious, highlighting the widespread, significant challenges global health development faces. Perceived barrier seriousness differed significantly across domains: Resource Limitations mean = 2.47 (0-4 Likert scale), Priority Selection mean = 2.20, Corruption, Lack of Competence mean = 1.87, Social and Cultural Barriers mean = 1.68. Some system-level predictors showed significant but relatively limited relations. For instance, for Global Health Domain, HIV and Mental Health had higher levels of perceived Social and Cultural Barriers than other GH Domains. Individual-level global health experience predictors had small but significant effects: the seriousness of (a) Corruption, Lack of Competence and (b) Priority Selection barriers correlated positively with how LMIC-oriented a respondent was (e.g., weeks/year spent in LMICs), whereas Academic Global Health Achievement (e.g., number of global health publications) correlated negatively with overall barrier seriousness. That comparatively few system-level predictors (e.g., Organization Type) were significant suggests these barriers may be relatively fundamental at the system level.
Individual-level and system-level effects do have policy implications; e.g., Priority Selection barriers were among the most serious, yet effects on seriousness of how LMIC-oriented a professional was versus level of academic global health achievement ran in opposite directions, suggesting increased discussion of priorities between LMIC-based and other professionals may be useful. It is hoped the 22 suggested solutions will provide useful ideas for addressing global health barriers.

  11. Bone-marrow densitometry: Assessment of marrow space of human vertebrae by single energy high resolution-quantitative computed tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peña, Jaime A.; Damm, Timo; Bastgen, Jan

    Purpose: Accurate noninvasive assessment of vertebral bone marrow fat fraction is important for diagnostic assessment of a variety of disorders and therapies known to affect marrow composition. Moreover, it provides a means to correct fat-induced bias of single energy quantitative computed tomography (QCT) based bone mineral density (BMD) measurements. The authors developed new segmentation and calibration methods to obtain quantitative surrogate measures of marrow-fat density in the axial skeleton. Methods: The authors developed and tested two high resolution-QCT (HR-QCT) based methods which permit segmentation of bone voids in between trabeculae, hypothesizing that they are representative of bone marrow space. The methods permit calculation of marrow content in units of mineral equivalent marrow density (MeMD). The first method is based on global thresholding and peeling (GTP) to define a volume of interest away from the transition between trabecular bone and marrow. The second method, morphological filtering (MF), uses spherical elements of different radii (0.1–1.2 mm) and automatically places them in between trabeculae to identify regions with large trabecular interspace, the bone-void space. To determine their performance, data were compared ex vivo to high-resolution peripheral CT (HR-pQCT) images as the gold standard. The performance of the methods was tested on a set of excised human vertebrae with intact bone marrow tissue representative of an elderly population with low BMD. Results: 86% (GTP) and 87% (MF) of the voxels identified as true marrow space on HR-pQCT images were correctly identified on HR-QCT images and thus these volumes of interest can be considered to be representative of true marrow space. Within this volume, MeMD was estimated with residual errors of 4.8 mg/cm³, corresponding to accuracy errors in fat fraction on the order of 5% for both the GTP and MF methods. 
Conclusions: The GTP and MF methods on HR-QCT images permit noninvasive localization and densitometric assessment of marrow fat with residual accuracy errors sufficient to study disorders and therapies known to affect bone marrow composition. Additionally, the methods can be used to correct BMD for fat-induced bias. Application and testing in vivo and in longitudinal studies are warranted to determine the clinical performance and value of these methods.

  12. Rapid Protein Global Fold Determination Using Ultrasparse Sampling, High-Dynamic Range Artifact Suppression, and Time-Shared NOESY

    PubMed Central

    Coggins, Brian E.; Werner-Allen, Jonathan W.; Yan, Anthony; Zhou, Pei

    2012-01-01

    In structural studies of large proteins by NMR, global fold determination plays an increasingly important role in providing a first look at a target’s topology and reducing assignment ambiguity in NOESY spectra of fully-protonated samples. In this work, we demonstrate the use of ultrasparse sampling, a new data processing algorithm, and a 4-D time-shared NOESY experiment (1) to collect all NOEs in 2H/13C/15N-labeled protein samples with selectively-protonated amide and ILV methyl groups at high resolution in only four days, and (2) to calculate global folds from this data using fully automated resonance assignment. The new algorithm, SCRUB, incorporates the CLEAN method for iterative artifact removal, but applies an additional level of iteration, permitting real signals to be distinguished from noise and allowing nearly all artifacts generated by real signals to be eliminated. In simulations with 1.2% of the data required by Nyquist sampling, SCRUB achieves a dynamic range over 10000:1 (250× better artifact suppression than CLEAN) and completely quantitative reproduction of signal intensities, volumes, and lineshapes. Applied to 4-D time-shared NOESY data, SCRUB processing dramatically reduces aliasing noise from strong diagonal signals, enabling the identification of weak NOE crosspeaks with intensities 100× less than diagonal signals. Nearly all of the expected peaks for interproton distances under 5 Å were observed. The practical benefit of this method is demonstrated with structure calculations for 23 kDa and 29 kDa test proteins using the automated assignment protocol of CYANA, in which unassigned 4-D time-shared NOESY peak lists produce accurate and well-converged global fold ensembles, whereas 3-D peak lists either fail to converge or produce significantly less accurate folds. The approach presented here succeeds with an order of magnitude less sampling than required by alternative methods for processing sparse 4-D data. PMID:22946863
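    The CLEAN procedure that SCRUB builds on is a classic iterative deconvolution loop: locate the strongest peak in the residual, subtract a gain-scaled copy of the point-spread function (PSF) centered there, and record the removed flux as a model component. The sketch below shows only this basic 1-D loop on toy data; it does not reproduce SCRUB's additional iteration level or any NMR-specific detail, and all names and values are illustrative:

```python
import numpy as np

def clean_1d(dirty, psf, gain=0.1, n_iter=500, threshold=1e-3):
    """Classic CLEAN: repeatedly find the strongest residual peak,
    subtract a gain-scaled, shifted copy of the PSF, and accumulate
    the removed flux as model components."""
    residual = np.asarray(dirty, dtype=float).copy()
    model = np.zeros_like(residual)
    center = len(psf) // 2
    for _ in range(n_iter):
        k = int(np.argmax(np.abs(residual)))
        peak = residual[k]
        if abs(peak) < threshold:
            break  # remaining structure is below the noise floor
        model[k] += gain * peak
        # Subtract the PSF centered on the peak position.
        for j in range(len(residual)):
            p = j - k + center
            if 0 <= p < len(psf):
                residual[j] -= gain * peak * psf[p]
    return model, residual

# Toy data: two point signals blurred by a sinc-like PSF whose sidelobes
# play the role of sampling artifacts.
n = 128
psf = np.sinc(np.linspace(-4, 4, 33))
truth = np.zeros(n)
truth[40], truth[90] = 1.0, 0.4
dirty = np.convolve(truth, psf, mode="same")

model, residual = clean_1d(dirty, psf)
```

    With an exactly known PSF the loop recovers both point components and drives the sidelobe artifacts below the stopping threshold; SCRUB's contribution, per the abstract, is an outer iteration that separates real signals from noise before this kind of subtraction.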

  13. [Local Regression Algorithm Based on Net Analyte Signal and Its Application in Near Infrared Spectral Analysis].

    PubMed

    Zhang, Hong-guang; Lu, Jian-gang

    2016-02-01

    To overcome the problems of significant differences among samples and nonlinearity between the properties and spectra of samples in spectral quantitative analysis, a local regression algorithm is proposed in this paper. In this algorithm, the net analyte signal (NAS) method is first used to obtain the net analyte signals of the calibration samples and the unknown samples; the Euclidean distance between the net analyte signal of an unknown sample and those of the calibration samples is then calculated and used as a similarity index. According to this similarity index, a local calibration set is selected individually for each unknown sample. Finally, a local PLS regression model is built on each local calibration set for each unknown sample. The proposed method was applied to a set of near-infrared spectra of meat samples. The results demonstrate that the prediction precision and model complexity of the proposed method are superior to those of the global PLS regression method and of a conventional local regression algorithm based on spectral Euclidean distance.

  14. Rapid screening and determination of 11 new psychoactive substances by direct analysis in real time mass spectrometry and liquid chromatography/quadrupole time-of-flight mass spectrometry.

    PubMed

    Nie, Honggang; Li, Xianjiang; Hua, Zhendong; Pan, Wei; Bai, Yanping; Fu, Xiaofang

    2016-08-01

    With the amounts and types of new psychoactive substances (NPSs) increasing rapidly in recent years, a high-throughput method for the analysis of these compounds is urgently needed. In this article, a rapid screening method and a quantitative analysis method for 11 NPSs are described and compared. A simple direct analysis in real time mass spectrometry (DART-MS) method was developed for the analysis of 11 NPSs spanning three categories of substances present on the global market: four cathinones, one phenylethylamine, and six synthetic cannabinoids. In order to analyze these compounds quantitatively with better accuracy and sensitivity, another rapid analytical method with a low limit of detection (LOD) was also developed using liquid chromatography/electrospray ionization quadrupole time-of-flight mass spectrometry (LC/QTOFMS). The 11 NPSs could be determined within 0.5 min by DART-MS; they could also be separated and determined within 5 min by the LC/QTOFMS method. Both methods showed good linearity, with correlation coefficients (r(2)) higher than 0.99. The LODs for these target NPSs ranged from 5 to 40 ng mL(-1) by DART-MS and from 0.1 to 1 ng mL(-1) by LC/QTOFMS. Confiscated samples, named "music vanilla" and "bath salt", and 11 spiked samples were first screened by DART-MS and then determined by LC/QTOFMS. The identification of NPSs in confiscated materials was successfully achieved, and the proposed analytical methodology offers rapid screening and accurate analysis results. Copyright © 2016 John Wiley & Sons, Ltd.

  15. Measuring global water security towards sustainable development goals

    NASA Astrophysics Data System (ADS)

    Gain, Animesh K.; Giupponi, Carlo; Wada, Yoshihide

    2016-12-01

    Water plays an important role in underpinning equitable, stable and productive societies and ecosystems. Hence, the United Nations recognized ensuring water security as one (Goal 6) of the seventeen sustainable development goals (SDGs). Many international river basins are likely to experience ‘low water security’ over the coming decades. Water security is rooted not only in the physical availability of freshwater resources relative to water demand, but also in social and economic factors (e.g. sound water planning and management approaches, institutional capacity to provide water services, sustainable economic policies). Advanced tools and methods are available for the assessment of water scarcity; however, quantitative and integrated (physical and socio-economic) approaches for the spatial analysis of water security at the global level are not yet available. In this study, we present a spatial multi-criteria analysis framework to provide a global assessment of water security. The selected indicators are based on Goal 6 of the SDGs. The term ‘security’ is conceptualized as a function of ‘availability’, ‘accessibility to services’, ‘safety and quality’, and ‘management’. The proposed global water security index (GWSI) is calculated by aggregating indicator values on a pixel-by-pixel basis, using the ordered weighted average method, which allows for the exploration of the sensitivity of the final maps to different attitudes of hypothetical policy makers. Our assessment suggests that countries of Africa, South Asia and the Middle East experience very low water security. Other areas of high water scarcity, such as parts of the United States, Australia and Southern Europe, show better GWSI values, due to good performance on management, safety and quality, and accessibility. The GWSI maps show the areas of the world in which integrated strategies are needed to achieve the water-related targets of the SDGs, particularly in the African and Asian continents.
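    The ordered weighted average (OWA) operator used for the pixel-by-pixel aggregation sorts the indicator scores before weighting them, so the weight vector encodes a decision attitude rather than an attachment to particular indicators. A minimal sketch with invented scores and weights (the actual GWSI indicator values and weight choices are not reproduced here):

```python
import numpy as np

def owa(values, weights):
    """Ordered weighted average: sort scores in descending order, then
    take the dot product with a normalized weight vector. The weight
    shape encodes the decision attitude: weight on the top of the
    ordering is optimistic, weight on the bottom is pessimistic."""
    v = np.sort(np.asarray(values, dtype=float))[::-1]
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return float(v @ w)

# Four dimension scores for one pixel (availability, accessibility to
# services, safety and quality, management), normalized to [0, 1];
# the values are invented for illustration.
pixel = [0.8, 0.3, 0.6, 0.5]

neutral     = owa(pixel, [1, 1, 1, 1])  # plain mean
pessimistic = owa(pixel, [0, 0, 1, 3])  # emphasizes weakest dimensions
optimistic  = owa(pixel, [3, 1, 0, 0])  # emphasizes strongest dimensions
```

    Running the same indicator map through several weight vectors like these is exactly the kind of sensitivity exploration across "attitudes of hypothetical policy makers" that the abstract describes.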

  16. Measuring Global Water Security Towards Sustainable Development Goals

    NASA Technical Reports Server (NTRS)

    Gain, Animesh K.; Giupponi, Carlo; Wada, Yoshihide

    2016-01-01

    Water plays an important role in underpinning equitable, stable and productive societies and ecosystems. Hence, the United Nations recognized ensuring water security as one (Goal 6) of the seventeen sustainable development goals (SDGs). Many international river basins are likely to experience 'low water security' over the coming decades. Water security is rooted not only in the physical availability of freshwater resources relative to water demand, but also in social and economic factors (e.g. sound water planning and management approaches, institutional capacity to provide water services, sustainable economic policies). Advanced tools and methods are available for the assessment of water scarcity; however, quantitative and integrated (physical and socio-economic) approaches for the spatial analysis of water security at the global level are not yet available. In this study, we present a spatial multi-criteria analysis framework to provide a global assessment of water security. The selected indicators are based on Goal 6 of the SDGs. The term 'security' is conceptualized as a function of 'availability', 'accessibility to services', 'safety and quality', and 'management'. The proposed global water security index (GWSI) is calculated by aggregating indicator values on a pixel-by-pixel basis, using the ordered weighted average method, which allows for the exploration of the sensitivity of the final maps to different attitudes of hypothetical policy makers. Our assessment suggests that countries of Africa, South Asia and the Middle East experience very low water security. Other areas of high water scarcity, such as parts of the United States, Australia and Southern Europe, show better GWSI values, due to good performance on management, safety and quality, and accessibility. The GWSI maps show the areas of the world in which integrated strategies are needed to achieve the water-related targets of the SDGs, particularly in the African and Asian continents.

  17. Probing Mantle Heterogeneity Across Spatial Scales

    NASA Astrophysics Data System (ADS)

    Hariharan, A.; Moulik, P.; Lekic, V.

    2017-12-01

    Inferences of mantle heterogeneity in terms of temperature, composition, grain size, melt and crystal structure may vary across local, regional and global scales. Probing these scale-dependent effects requires quantitative comparisons and reconciliation of tomographic models that vary in their regional scope, parameterization, regularization and observational constraints. While a range of techniques like radial correlation functions and spherical harmonic analyses have revealed global features like the dominance of long-wavelength variations in mantle heterogeneity, they have limited applicability for specific regions of interest like subduction zones and continental cratons. Moreover, issues like discrepant 1-D reference Earth models and related baseline corrections have impeded the reconciliation of heterogeneity between various regional and global models. We implement a new wavelet-based approach that allows for structure to be filtered simultaneously in both the spectral and spatial domains, allowing us to characterize heterogeneity on a range of scales and in different geographical regions. Our algorithm extends a recent method that expanded lateral variations into the wavelet domain constructed on a cubed sphere. The isolation of reference velocities in the wavelet scaling function facilitates comparisons between models constructed with arbitrary 1-D reference Earth models. The wavelet transformation allows us to quantify the scale-dependent consistency between tomographic models in a region of interest and investigate the fits to data afforded by heterogeneity at various dominant wavelengths. We find substantial and spatially varying differences in the spectrum of heterogeneity between two representative global Vp models constructed using different data and methodologies.
Applying the orthonormality of the wavelet expansion, we isolate detailed variations in velocity from models and evaluate additional fits to data afforded by adding such complexities to long-wavelength variations. Our method provides a way to probe and evaluate localized features in a multi-scale description of mantle heterogeneity.
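    The study's wavelets live on a cubed sphere, but the core idea — separating a field into a coarse approximation plus detail coefficients at successively finer scales — can be illustrated with a minimal 1-D Haar decomposition. This is a simplified stand-in, not the authors' algorithm:

```python
def haar_levels(signal):
    """Repeated one-level Haar transform: returns the detail
    coefficients at each scale (finest first) and the final coarse
    approximation, which plays the role of the scaling function that
    absorbs the reference (mean) value."""
    details = []
    approx = list(signal)
    while len(approx) > 1:
        nxt, det = [], []
        for a, b in zip(approx[0::2], approx[1::2]):
            nxt.append((a + b) / 2)   # smooth part passed up a scale
            det.append((a - b) / 2)   # detail retained at this scale
        details.append(det)
        approx = nxt
    return details, approx[0]

# A length-4 toy "velocity profile": two detail scales plus the mean.
details, coarse = haar_levels([1.0, 3.0, 5.0, 7.0])
```

Because the mean ends up isolated in the coarse term, two profiles built against different baselines can be compared scale by scale on their detail coefficients alone — the same property the abstract exploits to reconcile models with different 1-D reference Earth models.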

  18. The Effects of the Project Champion's Leadership Style on Global Information Technology User Acceptance and Use

    ERIC Educational Resources Information Center

    Ekiko, Mbong C.

    2014-01-01

    The research problem was the lack of knowledge about the effect of leadership style of the project champion on global information technology (IT) project outcomes, resulting in a high failure rate of IT projects accompanied by significant waste of resources. The purpose of this quantitative, nonexperimental study was to evaluate the relationship…

  19. Forward Looking: Structural Change and Institutions in Highest-income Countries and Globally

    ERIC Educational Resources Information Center

    Ahamer, Gilbert; Mayer, Johannes

    2013-01-01

    Purpose: Structural economic shifts are a key sign of development at all stages globally, and these shifts may also result in changing roles for institutions. The purpose of this paper is to quantitatively analyse trends that may be used for so-called forward looking and to make use of them to recommend strategies for reorganising institutions.…

  20. Comparison of low- and ultralow-dose computed tomography protocols for quantitative lung and airway assessment.

    PubMed

    Hammond, Emily; Sloan, Chelsea; Newell, John D; Sieren, Jered P; Saylor, Melissa; Vidal, Craig; Hogue, Shayna; De Stefano, Frank; Sieren, Alexa; Hoffman, Eric A; Sieren, Jessica C

    2017-09-01

    Quantitative computed tomography (CT) measures are increasingly being developed and used to characterize lung disease. With recent advances in CT technologies, we sought to evaluate the quantitative accuracy of lung imaging at low- and ultralow-radiation doses with the use of iterative reconstruction (IR), tube current modulation (TCM), and spectral shaping. We investigated the effect of five independent CT protocols reconstructed with IR on quantitative airway measures and global lung measures using an in vivo large animal model as a human subject surrogate. A control protocol (NIH-SPIROMICS + TCM) was chosen, along with five independent protocols investigating TCM, low- and ultralow-radiation dose, and spectral shaping. For all scans, quantitative global parenchymal measurements (mean, median and standard deviation of the parenchymal HU, along with measures of emphysema) and global airway measurements (number of segmented airways and pi10) were generated. In addition, selected individual airway measurements (minor and major inner diameter, wall thickness, inner and outer area, inner and outer perimeter, wall area fraction, and inner equivalent circle diameter) were evaluated. Comparisons were made between control and target protocols using difference and repeatability measures. Estimated CT volume dose index (CTDIvol) across all protocols ranged from 7.32 mGy to 0.32 mGy. Low- and ultralow-dose protocols required more manual editing and resolved fewer airway branches; yet, comparable pi10 whole lung measures were observed across all protocols. Similar trends in acquired parenchymal and airway measurements were observed across all protocols, with increased measurement differences using the ultralow-dose protocols. However, for small airways (1.9 ± 0.2 mm) and medium airways (5.7 ± 0.4 mm), the measurement differences across all protocols were comparable to the control protocol repeatability across breath holds.
Diameters, wall thickness, wall area fraction, and equivalent diameter had smaller measurement differences than area and perimeter measurements. In conclusion, the use of IR with low- and ultralow-dose CT protocols with CT volume dose indices down to 0.32 mGy maintains selected quantitative parenchymal and airway measurements relevant to pulmonary disease characterization. © 2017 American Association of Physicists in Medicine.
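    The comparison logic described — judging a protocol's measurement difference against the control protocol's own breath-hold repeatability — can be sketched as follows. All numeric values are hypothetical, not measurements from the study:

```python
def within_repeatability(control_a, control_b, target, tol=0.0):
    """True if a target protocol's deviation from the control mean is
    no larger than the control's breath-hold repeatability.

    control_a, control_b: the same airway measured by the control
    protocol on two breath holds; target: the candidate protocol's
    measurement of that airway."""
    repeatability = abs(control_a - control_b)
    difference = abs(target - (control_a + control_b) / 2)
    return difference <= repeatability + tol

# Hypothetical inner-diameter measurements (mm) of one medium airway.
ok = within_repeatability(control_a=5.70, control_b=5.78, target=5.75)
```

Under this criterion a low-dose protocol "passes" for an airway when its disagreement with the control is indistinguishable from the control's own scan-to-scan noise.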

  1. The Global Precipitation Mission

    NASA Technical Reports Server (NTRS)

    Braun, Scott; Kummerow, Christian

    2000-01-01

    The Global Precipitation Mission (GPM), expected to begin around 2006, is a follow-up to the Tropical Rainfall Measuring Mission (TRMM). Unlike TRMM, which primarily samples the tropics, GPM will sample both the tropics and mid-latitudes. The primary, or core, satellite will be a single, enhanced TRMM satellite that can quantify the 3-D spatial distributions of precipitation and its associated latent heat release. The core satellite will be complemented by a constellation of very small and inexpensive satellites carrying passive microwave instruments that will sample the rainfall with sufficient frequency to be not only of climate interest, but also to have local, short-term impacts by providing global rainfall coverage at approx. 3 h intervals. The data are expected to have substantial impact upon quantitative precipitation estimation/forecasting and data assimilation into global and mesoscale numerical models. Based upon previous studies of rainfall data assimilation, GPM is expected to lead to significant improvements in forecasts of extratropical and tropical cyclones. For example, GPM rainfall data can provide improved initialization of frontal systems over the Pacific and Atlantic Oceans. The purpose of this talk is to provide information about GPM to the USWRP (U.S. Weather Research Program) community and to discuss impacts on quantitative precipitation estimation/forecasting and data assimilation.

  2. Kinome-wide Decoding of Network-Attacking Mutations Rewiring Cancer Signaling

    PubMed Central

    Creixell, Pau; Schoof, Erwin M.; Simpson, Craig D.; Longden, James; Miller, Chad J.; Lou, Hua Jane; Perryman, Lara; Cox, Thomas R.; Zivanovic, Nevena; Palmeri, Antonio; Wesolowska-Andersen, Agata; Helmer-Citterich, Manuela; Ferkinghoff-Borg, Jesper; Itamochi, Hiroaki; Bodenmiller, Bernd; Erler, Janine T.; Turk, Benjamin E.; Linding, Rune

    2015-01-01

    Summary Cancer cells acquire pathological phenotypes through accumulation of mutations that perturb signaling networks. However, global analysis of these events is currently limited. Here, we identify six types of network-attacking mutations (NAMs), including changes in kinase and SH2 modulation, network rewiring, and the genesis and extinction of phosphorylation sites. We developed a computational platform (ReKINect) to identify NAMs and systematically interpreted the exomes and quantitative (phospho-)proteomes of five ovarian cancer cell lines and the global cancer genome repository. We identified and experimentally validated several NAMs, including PKCγ M501I and PKD1 D665N, which encode specificity switches analogous to the appearance of kinases de novo within the kinome. We discover mutant molecular logic gates, a drift toward phospho-threonine signaling, weakening of phosphorylation motifs, and kinase-inactivating hotspots in cancer. Our method pinpoints functional NAMs, scales with the complexity of cancer genomes and cell signaling, and may enhance our capability to therapeutically target tumor-specific networks. PMID:26388441

  3. Overview of Supersonic Aerodynamics Measurement Techniques in the NASA Langley Unitary Plan Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Erickson, Gary E.

    2007-01-01

    An overview is given of selected measurement techniques used in the NASA Langley Research Center (NASA LaRC) Unitary Plan Wind Tunnel (UPWT) to determine the aerodynamic characteristics of aerospace vehicles operating at supersonic speeds. A broad definition of a measurement technique is adopted in this paper and is any qualitative or quantitative experimental approach that provides information leading to the improved understanding of the supersonic aerodynamic characteristics. On-surface and off-surface measurement techniques used to obtain discrete (point) and global (field) measurements and planar and global flow visualizations are described, and examples of all methods are included. The discussion is limited to recent experiences in the UPWT and is, therefore, not an exhaustive review of existing experimental techniques. The diversity and high quality of the measurement techniques and the resultant data illustrate the capabilities of a ground-based experimental facility and the key role that it plays in the advancement of our understanding, prediction, and control of supersonic aerodynamics.

  4. Ptaquiloside, the major carcinogen of bracken fern, in the pooled raw milk of healthy sheep and goats: an underestimated, global concern of food safety.

    PubMed

    Virgilio, Antonella; Sinisi, Annamaria; Russo, Valeria; Gerardo, Salvatore; Santoro, Adriano; Galeone, Aldo; Taglialatela-Scafati, Orazio; Roperto, Franco

    2015-05-20

    Bracken fern (Pteridium aquilinum) is a worldwide plant containing toxic substances, which represent an important chemical hazard for animals, including humans. Ptaquiloside, 1, a norsesquiterpenoid glucoside, is the major carcinogen of bracken detected in the food chain, particularly in the milk from farm animals. To date, ptaquiloside has been shown in the milk of cows feeding on a diet containing bracken fern. This is the first study that shows the systematic detection of ptaquiloside, 1, and reports its direct quantitation in pooled raw milk of healthy sheep and goats grazing on bracken. Ptaquiloside, 1, was detected by a sensitive method based on the chemical conversion of ptaquiloside, 1, into bromopterosine, 4, following gas chromatography-mass spectrometry (GC-MS) analysis. The presence of ptaquiloside, 1, possibly carcinogenic to humans, in the milk of healthy animals is an unknown potential health risk, thus representing a harmful and potential global concern of food safety.

  5. "Mad or bad?": burden on caregivers of patients with personality disorders.

    PubMed

    Bauer, Rita; Döring, Antje; Schmidt, Tanja; Spießl, Hermann

    2012-12-01

    The burden on caregivers of patients with personality disorders is often greatly underestimated or completely disregarded. Possibilities for caregiver support have rarely been assessed. Thirty interviews were conducted with caregivers of such patients to assess illness-related burden. Responses were analyzed with a mixed method of qualitative and quantitative analysis in a sequential design. Patient and caregiver data, including sociodemographic and disease-related variables, were evaluated with regression analysis and regression trees. Caregiver statements (n = 404) were summarized into 44 global statements. The most frequent global statements were worries about the burden on other family members (70.0%), poor cooperation with clinical centers and other institutions (60.0%), financial burden (56.7%), worry about the patient's future (53.3%), and dissatisfaction with the patient's treatment and rehabilitation (53.3%). Linear regression and regression tree analysis identified predictors for more burdened caregivers. Caregivers of patients with personality disorders experience a variety of burdens, some disorder specific. Yet these caregivers often receive little attention or support.

  6. Long-term effects on nursing alumni: Assessing a course in public and global health.

    PubMed

    Palmer, Sheri P; Lundberg, Karen; de la Cruz, Karen; Corbett, Cheryl; Heaston, Sondra; Reed, Shelly; Williams, Mary

    The impact of a cultural awareness course may affect nursing students for years to come. Cultural awareness can be taught via many methods, often requiring study abroad and/or a large investment of time, money and effort. There is little research on the sustained effects of such a course on nursing alumni. The purpose of this descriptive survey study was to 1) determine the long-term outcomes of a cultural awareness course and 2) compare the long-term effects between alumni who went abroad and those who chose to complete the course locally. One hundred and twenty-one nursing alumni completed the International Education Survey (IES) (Zorn, 1996) with additional open-ended questions. Quantitative and qualitative results indicated that: 1) nursing alumni were influenced long term by a course dedicated to public and global health and 2) all alumni had statistically significant IES scores, but alumni who studied abroad had the greatest increase. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. Multifractal and wavelet analysis of epileptic seizures

    NASA Astrophysics Data System (ADS)

    Dick, Olga E.; Mochovikova, Irina A.

    The aim of the study is to develop quantitative parameters of human electroencephalographic (EEG) recordings with epileptic seizures. We used long-lasting recordings from subjects with epilepsy obtained as part of their clinical investigation. The continuous wavelet transform of the EEG segments and the wavelet-transform modulus maxima method enable us to evaluate the energy spectra of the segments, to find lines of local maxima, to obtain the scaling exponents and to construct the singularity spectra. We have shown that a significant increase of the global energy with respect to background and a redistribution of the energy over the frequency range are observed in the patterns involving epileptic activity. The singularity spectra expand so that the degree of inhomogeneity and multifractality of the patterns increases. Comparing the results obtained for the patterns during different functional probes, such as open and closed eyes or hyperventilation, we demonstrate the high sensitivity of the analyzed parameters (the maximal global energy, the width and asymmetry of the singularity spectrum) for detecting epileptic patterns.

  8. Evaluation of Daily Extreme Precipitation Derived From Long-term Global Satellite Quantitative Precipitation Estimates (QPEs)

    NASA Astrophysics Data System (ADS)

    Prat, O. P.; Nelson, B. R.; Nickl, E.; Ferraro, R. R.

    2017-12-01

    This study evaluates the ability of different satellite-based precipitation products to capture daily precipitation extremes over the entire globe. The satellite products considered are the datasets belonging to the Reference Environmental Data Records (REDRs) program (PERSIANN-CDR, GPCP, CMORPH, AMSU-A,B, Hydrologic bundle). Those products provide long-term global records of daily adjusted Quantitative Precipitation Estimates (QPEs) that range from 20-year (CMORPH-CDR) to 35-year (PERSIANN-CDR, GPCP) record of daily adjusted global precipitation. The AMSU-A,B, Hydro-bundle is an 11-year record of daily rain rate over land and ocean, snow cover and surface temperature over land, and sea ice concentration, cloud liquid water, and total precipitable water over ocean among others. The aim of this work is to evaluate the ability of the different satellite QPE products to capture daily precipitation extremes. This evaluation will also include comparison with in-situ data sets at the daily scale from the Global Historical Climatology Network (GHCN-Daily), the Global Precipitation Climatology Centre (GPCC) gridded full data daily product, and the US Climate Reference Network (USCRN). In addition, while the products mentioned above only provide QPEs, the AMSU-A,B hydro-bundle provides additional hydrological information (precipitable water, cloud liquid water, snow cover, sea ice concentration). We will also present an analysis of those additional variables available from global satellite measurements and their relevance and complementarity in the context of long-term hydrological and climate studies.
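    Daily precipitation extremes in such evaluations are commonly defined by a high wet-day percentile (e.g. the 95th percentile of days with at least 1 mm). The nearest-rank sketch below is illustrative of that kind of threshold, not the study's exact metric:

```python
import math

def wet_day_percentile(daily_mm, q=95, wet_min=1.0):
    """Nearest-rank q-th percentile of wet-day (>= wet_min mm) totals,
    a common cutoff for flagging a day as 'extreme'."""
    wet = sorted(v for v in daily_mm if v >= wet_min)
    if not wet:
        raise ValueError("no wet days in series")
    k = max(0, math.ceil(q / 100 * len(wet)) - 1)
    return wet[k]

# Hypothetical daily totals (mm): dry days are excluded by wet_min.
threshold = wet_day_percentile([0.2, 2.0, 4.0, 6.0, 8.0, 10.0], q=50)
```

Computing such thresholds per product and comparing the exceedance days against gauge networks (GHCN-Daily, GPCC, USCRN) is one concrete way the satellite QPEs' handling of extremes can be benchmarked.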

  9. Emerging systems biology approaches in nanotoxicology: Towards a mechanism-based understanding of nanomaterial hazard and risk

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Costa, Pedro M.; Fadeel, Bengt, E-mail: Bengt.Fade

    Engineered nanomaterials are being developed for a variety of technological applications. However, the increasing use of nanomaterials in society has led to concerns about their potential adverse effects on human health and the environment. During the first decade of nanotoxicological research, the realization has emerged that effective risk assessment of the multitudes of new nanomaterials would benefit from a comprehensive understanding of their toxicological mechanisms, which is difficult to achieve with traditional, low-throughput, single end-point oriented approaches. Therefore, systems biology approaches are being progressively applied within the nano(eco)toxicological sciences. This novel paradigm implies that the study of biological systems should be integrative, resulting in quantitative and predictive models of nanomaterial behaviour in a biological system. To this end, global 'omics' approaches with which to assess changes in genes, proteins, metabolites, etc. are deployed, allowing for computational modelling of the biological effects of nanomaterials. Here, we highlight omics and systems biology studies in nanotoxicology, aiming towards the implementation of systems nanotoxicology and mechanism-based risk assessment of nanomaterials. - Highlights: • Systems nanotoxicology is a multi-disciplinary approach to quantitative modelling. • Transcriptomics, proteomics and metabolomics remain the most common methods. • Global "omics" techniques should be coupled to computational modelling approaches. • The discovery of nano-specific toxicity pathways and biomarkers is a prioritized goal. • Overall, experimental nanosafety research must strive for reproducibility and relevance.

  10. Alcohol Use Among Female Sex Workers and Male Clients: An Integrative Review of Global Literature

    PubMed Central

    Li, Qing; Li, Xiaoming; Stanton, Bonita

    2010-01-01

    Aims: To review the patterns, contexts and impacts of alcohol use associated with commercial sex reported in the global literature. Methods: We identified peer-reviewed English-language articles from 1980 to 2008 reporting alcohol consumption among female sex workers (FSWs) or male clients. We retrieved 70 articles describing 76 studies, of which 64 were quantitative (52 for FSWs, 12 for male clients) and 12 qualitative. Results: Studies increased over the past three decades, with geographic concentration of the research in Asia and North America. Alcohol use was prevalent among FSWs and clients. Integrating quantitative and qualitative studies, multilevel contexts of alcohol use in the sex work environment were identified, including workplace and occupation-related use, the use of alcohol to facilitate the transition into and practice of commercial sex among both FSWs and male clients, and self-medication among FSWs. Alcohol use was associated with adverse physical health, illicit drug use, mental health problems, and sexual violence victimization, although its associations with HIV/sexually transmitted infections and unprotected sex among FSWs were inconclusive. Conclusions: Alcohol use in the context of commercial sex is prevalent, harmful among FSWs and male clients, but under-researched. Research in this area in more diverse settings and with standardized measures is required. The review underscores the importance of integrated intervention for alcohol use and related problems in multilevel contexts and with multiple components in order to effectively reduce alcohol use and its harmful effects among FSWs and their clients. PMID:20089544

  11. A CT-based software tool for evaluating compensator quality in passively scattered proton therapy

    NASA Astrophysics Data System (ADS)

    Li, Heng; Zhang, Lifei; Dong, Lei; Sahoo, Narayan; Gillin, Michael T.; Zhu, X. Ronald

    2010-11-01

    We have developed a quantitative computed tomography (CT)-based quality assurance (QA) tool for evaluating the accuracy of manufactured compensators used in passively scattered proton therapy. The thickness of a manufactured compensator was measured from its CT images and compared with the planned thickness defined by the treatment planning system. The difference between the measured and planned thicknesses was calculated with use of the Euclidean distance transformation and the kd-tree search method. Compensator accuracy was evaluated by examining several parameters including mean distance, maximum distance, global thickness error and central axis shifts. Two rectangular phantoms were used to validate the performance of the QA tool. Nine patients and 20 compensators were included in this study. We found that mean distances, global thickness errors and central axis shifts were all within 1 mm for all compensators studied, with maximum distances ranging from 1.1 to 3.8 mm. Although all compensators passed manual verification at selected points, about 5% of the pixels still had maximum distances of >2 mm, most of which correlated with large depth gradients. The correlation between the mean depth gradient of the compensator and the percentage of pixels with mean distance <1 mm is -0.93 with p < 0.001, which suggests that the mean depth gradient is a good indicator of compensator complexity. These results demonstrate that the CT-based compensator QA tool can be used to quantitatively evaluate manufactured compensators.
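    The distance computation described (Euclidean distance transform plus k-d tree search) can be approximated with a brute-force nearest-neighbor sketch; a k-d tree only changes the search cost, not the result. The point sets below are hypothetical, not compensator data:

```python
import math

def surface_distances(measured, planned):
    """For each measured surface point, the distance to the nearest
    planned point (brute force here; a k-d tree scales better)."""
    return [min(math.dist(m, p) for p in planned) for m in measured]

# Hypothetical compensator surface points (x, y, thickness in mm).
measured = [(0.0, 0.0, 10.2), (1.0, 0.0, 10.9)]
planned = [(0.0, 0.0, 10.0), (1.0, 0.0, 11.0)]

d = surface_distances(measured, planned)
mean_d, max_d = sum(d) / len(d), max(d)
```

Summarizing the per-point distances as a mean and a maximum mirrors the mean-distance and maximum-distance QA parameters the abstract reports.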

  12. Identifiability and Identification of Trace Continuous Pollutant Source

    PubMed Central

    Qu, Hongquan; Liu, Shouwen; Pang, Liping; Hu, Tao

    2014-01-01

    Accidental pollution events often threaten people's health and lives, and prompt identification of the pollutant source is necessary so that remedial actions can be taken. In this paper, a trace continuous pollutant source identification method is developed to identify a sudden continuous-emission pollutant source in an enclosed space. First, the location probability model is set up; the identification method is then realized by searching for the global optimal objective value of the location probability. To discuss the identifiability performance of the presented method, the concept of a synergy degree of velocity fields is introduced to quantitatively analyze the impact of the velocity field on the identification performance. Based on this concept, several simulation cases were conducted. The application conditions of the method are obtained from the simulation studies. To verify the presented method, we designed an experiment and identified an unknown source appearing in the experimental space. The result showed that the method can identify a sudden trace continuous source when the studied situation satisfies the application conditions. PMID:24892041
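    The search for the global optimum of the location probability can be sketched as a grid argmax. The probability function below is a hypothetical smooth stand-in for the paper's location probability model:

```python
def locate_source(prob, nx, ny):
    """Exhaustive search over an nx-by-ny grid for the cell with the
    highest location probability (the global optimum, in miniature)."""
    return max(((i, j) for i in range(nx) for j in range(ny)),
               key=lambda cell: prob(*cell))

# Hypothetical location probability peaked at grid cell (3, 4).
def p(i, j):
    return 1.0 / (1.0 + (i - 3) ** 2 + (j - 4) ** 2)

best = locate_source(p, nx=8, ny=8)
```

In practice the probability field would come from the flow model, and the synergy degree of the velocity field determines whether this maximum is sharp enough for the source to be identifiable at all.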

  13. Motion-compensated speckle tracking via particle filtering

    NASA Astrophysics Data System (ADS)

    Liu, Lixin; Yagi, Shin-ichi; Bian, Hongyu

    2015-07-01

    Recently, an improved motion compensation method that uses the sum of absolute differences (SAD) has been applied to frame persistence in conventional ultrasonic imaging because of its high accuracy and relative simplicity of implementation. However, high time consumption remains a significant drawback of this space-domain method. To develop a faster motion compensation method and verify whether conventional traversal correlation can be eliminated, motion-compensated speckle tracking between two temporally adjacent B-mode frames based on particle filtering is discussed. The optimal initial density of particles, the minimum number of iterations, and the optimal transition radius of the second iteration are analyzed from simulation results in order to evaluate the proposed method quantitatively. The speckle tracking results obtained using the optimized parameters indicate that the proposed method is capable of tracking the micromotion of speckle throughout a region of interest (ROI) superposed with global motion. The computational cost of the proposed method is reduced by 25% compared with that of the previous algorithm, and further improvement is necessary.
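    The SAD-based traversal matching that the particle filter aims to replace can be shown in one dimension: slide a reference block along a search row and keep the shift with minimum SAD. The signals are hypothetical, not ultrasound data:

```python
def sad(a, b):
    """Sum of absolute differences between two equal-length blocks."""
    return sum(abs(x - y) for x, y in zip(a, b))

def best_shift(block, row):
    """Exhaustive (traversal) matching: returns the shift along `row`
    that minimizes SAD against `block`."""
    n = len(block)
    return min(range(len(row) - n + 1),
               key=lambda s: sad(block, row[s:s + n]))

# A block cut from position 3 of a hypothetical scan line should be
# matched back to position 3.
row = [0, 1, 0, 5, 9, 5, 0, 1, 0]
shift = best_shift(row[3:6], row)
```

The cost of this exhaustive search over every candidate shift (and every pixel, in 2-D) is exactly what motivates replacing it with a particle filter that samples only promising displacements.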

  14. Mass Spectrometry Analysis of Spatial Protein Networks by Colocalization Analysis (COLA).

    PubMed

    Mardakheh, Faraz K

    2017-01-01

    A major challenge in systems biology is comprehensive mapping of protein interaction networks. Crucially, such interactions are often dynamic in nature, necessitating methods that can rapidly mine the interactome across varied conditions and treatments to reveal change in the interaction networks. Recently, we described a fast mass spectrometry-based method to reveal functional interactions in mammalian cells on a global scale, by revealing spatial colocalizations between proteins (COLA) (Mardakheh et al., Mol Biosyst 13:92-105, 2017). As protein localization and function are inherently linked, significant colocalization between two proteins is a strong indication for their functional interaction. COLA uses rapid complete subcellular fractionation, coupled with quantitative proteomics to generate a subcellular localization profile for each protein quantified by the mass spectrometer. Robust clustering is then applied to reveal significant similarities in protein localization profiles, indicative of colocalization.
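    The core of COLA-style analysis is comparing subcellular localization profiles: proteins whose profiles are highly similar are candidate colocalizers. A minimal similarity sketch follows; the compartment names and profile values are hypothetical, and COLA itself uses robust clustering rather than a single pairwise score:

```python
import math

def cosine(u, v):
    """Cosine similarity between two localization profiles."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

# Hypothetical fraction-per-compartment profiles
# (cytosol, membrane, nucleus, mitochondria) for three proteins.
p1 = [0.70, 0.10, 0.10, 0.10]
p2 = [0.65, 0.15, 0.10, 0.10]  # similar to p1 -> likely colocalized
p3 = [0.05, 0.10, 0.80, 0.05]  # nuclear -> different compartment
```

Thresholding such similarities (or clustering the full profile matrix) turns per-protein fractionation data into a map of spatial, and hence putative functional, interactions.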

  15. A fully-automated multiscale kernel graph cuts based particle localization scheme for temporal focusing two-photon microscopy

    NASA Astrophysics Data System (ADS)

    Huang, Xia; Li, Chunqiang; Xiao, Chuan; Sun, Wenqing; Qian, Wei

    2017-03-01

    The temporal focusing two-photon microscope (TFM) was developed to perform depth-resolved wide-field fluorescence imaging by capturing frames sequentially. However, due to strong, non-negligible noise and diffraction rings surrounding particles, further research is extremely difficult without a precise particle localization technique. In this paper, we developed a fully-automated scheme to locate particle positions with high noise tolerance. Our scheme includes the following procedures: noise reduction using a hybrid Kalman filter method, particle segmentation based on a multiscale kernel graph cuts global and local segmentation algorithm, and a kinematic-estimation-based particle tracking method. Both isolated and partially overlapped particles can be accurately identified with removal of unrelated pixels. Based on our quantitative analysis, 96.22% of isolated particles and 84.19% of partially overlapped particles were successfully detected.

  16. Advancements in the use of speleothems as climate archives

    NASA Astrophysics Data System (ADS)

    Wong, Corinne I.; Breecker, Daniel O.

    2015-11-01

    Speleothems have become a cornerstone of the approach to better understanding Earth's climatic teleconnections due to their precise absolute chronologies, their continuous or semicontinuous deposition and their global terrestrial distribution. We review the last decade of speleothem-related research, building off a similar review by McDermott (2004), in three themes - i) investigation of global teleconnections using speleothem-based climate reconstructions, ii) refinement of climate interpretations from speleothem proxies through cave monitoring, and iii) novel, technical methods of speleothem-based climate reconstructions. Speleothem records have enabled critical insight into the response of global hydroclimate to large climate changes. This includes the relevant forcings and sequence of climatic responses involved in glacial terminations and recognition of a global monsoon response to climate changes on orbital and millennial time scales. We review advancements in understanding of the processes that control speleothem δ13C values and introduce the idea of a direct atmospheric pCO2 influence. We discuss progress in understanding kinetic isotope fractionation, which, with further advances, may help quantify paleoclimate changes despite non-equilibrium formation of speleothems. This feeds into the potential of proxy system modeling to consider climatic, hydrological and biogeochemical processes with the objective of quantitatively interpreting speleothem proxies. Finally, we provide an overview of emerging speleothem proxies and novel approaches using existing proxies. Most recently, technical advancements made in the measurement of fluid inclusions are now yielding reliable determinations of paleotemperatures.

  17. Barriers and Delays in Tuberculosis Diagnosis and Treatment Services: Does Gender Matter?

    PubMed Central

    Yang, Wei-Teng; Gounder, Celine R.; Akande, Tokunbo; De Neve, Jan-Walter; McIntire, Katherine N.; Chandrasekhar, Aditya; de Lima Pereira, Alan; Gummadi, Naveen; Samanta, Santanu; Gupta, Amita

    2014-01-01

    Background. Tuberculosis (TB) remains a global public health problem with known gender-related disparities. We reviewed the quantitative evidence for gender-related differences in accessing TB services from symptom onset to treatment initiation. Methods. Following a systematic review process, we: searched 12 electronic databases; included quantitative studies assessing gender differences in accessing TB diagnostic and treatment services; abstracted data; and assessed study validity. We defined barriers and delays at the individual and provider/system levels using a conceptual framework of the TB care continuum and examined gender-related differences. Results. Among 13,448 articles, 137 were included: many assessed individual-level barriers (52%) and delays (42%), 76% surveyed persons presenting for care with diagnosed or suspected TB, 24% surveyed community members, and two-thirds were from African and Asian regions. Many studies reported no gender differences. Among studies reporting disparities, women faced greater barriers (financial: 64% versus 36%; physical: 100% versus 0%; stigma: 85% versus 15%; health literacy: 67% versus 33%; and provider-/system-level: 100% versus 0%) and longer delays (presentation to diagnosis: 45% versus 0%) than men. Conclusions. Many studies found no quantitative gender-related differences in barriers and delays limiting access to TB services. When differences were identified, women experienced greater barriers and longer delays than men. PMID:24876956

  19. Integrated genomics and molecular breeding approaches for dissecting the complex quantitative traits in crop plants.

    PubMed

    Kujur, Alice; Saxena, Maneesha S; Bajaj, Deepak; Laxmi; Parida, Swarup K

    2013-12-01

    The enormous population growth, climate change and global warming are now considered major threats to agriculture and the world's food security. To improve the productivity and sustainability of agriculture, the development of high-yielding, durable, abiotic and biotic stress-tolerant and/or climate-resilient crop cultivars is essential. Hence, understanding the molecular mechanisms and dissecting the complex quantitative yield and stress tolerance traits are prime objectives in current agricultural biotechnology research. In recent years, tremendous progress has been made in plant genomics and molecular breeding research pertaining to conventional and next-generation whole genome, transcriptome and epigenome sequencing efforts, the generation of huge genomic, transcriptomic and epigenomic resources, and the development of modern genomics-assisted breeding approaches in diverse crop genotypes with contrasting yield and abiotic stress tolerance traits. Unfortunately, the detailed molecular mechanisms and gene regulatory networks controlling such complex quantitative traits are not yet well understood in crop plants. Therefore, we propose an integrated strategy involving the enormous and diverse traditional and modern -omics (structural, functional, comparative and epigenomics) approaches/resources and genomics-assisted breeding methods available, which agricultural biotechnologists can adopt/utilize to dissect and decode the molecular and gene regulatory networks involved in the complex quantitative yield and stress tolerance traits of crop plants. This would provide clues and much-needed inputs for the rapid selection of novel, functionally relevant molecular tags regulating such complex traits, to expedite traditional and modern marker-assisted genetic enhancement studies in target crop species for developing high-yielding, stress-tolerant varieties.

  20. A Bayesian method to quantify azimuthal anisotropy model uncertainties: application to global azimuthal anisotropy in the upper mantle and transition zone

    NASA Astrophysics Data System (ADS)

    Yuan, K.; Beghein, C.

    2018-04-01

    Seismic anisotropy is a powerful tool to constrain mantle deformation, but its existence in the deep upper mantle and topmost lower mantle is still uncertain. Recent results from higher mode Rayleigh waves have, however, revealed the presence of 1 per cent azimuthal anisotropy between 300 and 800 km depth, and changes in azimuthal anisotropy across the mantle transition zone boundaries. This has important consequences for our understanding of mantle convection patterns and deformation of deep mantle material. Here, we propose a Bayesian method to model depth variations in azimuthal anisotropy and to obtain quantitative uncertainties on the fast seismic direction and anisotropy amplitude from phase velocity dispersion maps. We applied this new method to existing global fundamental and higher mode Rayleigh wave phase velocity maps to assess the likelihood of azimuthal anisotropy in the deep upper mantle and to determine whether previously detected changes in anisotropy at the transition zone boundaries are robustly constrained by those data. Our results confirm that deep upper-mantle azimuthal anisotropy is favoured and well constrained by the higher mode data employed. The fast seismic directions are in agreement with our previously published model. The data favour a model characterized, on average, by changes in azimuthal anisotropy at the top and bottom of the transition zone. However, this change in fast axes is not a global feature as there are regions of the model where the azimuthal anisotropy direction is unlikely to change across depths in the deep upper mantle. We were, however, unable to detect any clear pattern or connection with surface tectonics. Future studies will be needed to further improve the lateral resolution of this type of model at transition zone depths.

  1. A Targeted Quantitative Proteomics Strategy for Global Kinome Profiling of Cancer Cells and Tissues*

    PubMed Central

    Xiao, Yongsheng; Guo, Lei; Wang, Yinsheng

    2014-01-01

    Kinases are among the most intensively pursued enzyme superfamilies as targets for anti-cancer drugs. Large data sets on inhibitor potency and selectivity for more than 400 human kinases became available recently, offering the opportunity to rationally design novel kinase-based anti-cancer therapies. However, the expression levels and activities of kinases are highly heterogeneous among different types of cancer and even among different stages of the same cancer. The lack of an effective strategy for profiling the global kinome hampers the development of kinase-targeted cancer chemotherapy. Here, we introduced a novel global kinome profiling method, based on our recently developed isotope-coded ATP-affinity probe and a targeted proteomic method using multiple-reaction monitoring (MRM), for simultaneously assessing the expression of more than 300 kinases in human cells and tissues. This MRM-based assay displayed much better sensitivity, reproducibility, and accuracy than the discovery-based shotgun proteomic method. Approximately 250 kinases could be routinely detected in the lysate of a single cell line. Additionally, the incorporation of iRT into the MRM kinome library rendered our MRM kinome assay easily transferable across different instrument platforms and laboratories. We further employed this approach for profiling kinase expression in two melanoma cell lines, which revealed substantial kinome reprogramming during cancer progression and demonstrated an excellent correlation between the anti-proliferative effects of kinase inhibitors and the expression levels of their target kinases. Therefore, this facile and accurate kinome profiling assay, together with the kinome-inhibitor interaction map, could provide invaluable knowledge to predict the effectiveness of kinase inhibitor drugs and offer the opportunity for individualized cancer chemotherapy. PMID:24520089

  2. Integrative eQTL analysis of tumor and host omics data in individuals with bladder cancer.

    PubMed

    Pineda, Silvia; Van Steen, Kristel; Malats, Núria

    2017-09-01

    Integrative analyses of several omics data types are emerging. The data are usually generated from the same source material (i.e., the tumor sample), representing one level of regulation. However, integrating different regulatory levels (i.e., blood) with those from the tumor may also reveal important knowledge about the human genetic architecture. To model this multilevel structure, an integrative expression quantitative trait loci (eQTL) analysis applying two-stage regression (2SR) was proposed. This approach first regresses tumor gene expression levels on tumor markers; the residuals from that model are then regressed on the germline genotypes measured in blood. Previously, we demonstrated that penalized regression methods in combination with a permutation-based MaxT method (Global-LASSO) are a promising tool to address some of the challenges that high-throughput omics data analysis imposes. Here, we assessed whether Global-LASSO can also be applied when tumor and blood omics data are integrated. We further compared our strategy with two 2SR approaches, one using multiple linear regression (2SR-MLR) and the other using LASSO (2SR-LASSO). We applied the three models to integrate genomic, epigenomic, and transcriptomic data from tumor tissue with blood germline genotypes from 181 individuals with bladder cancer included in the TCGA Consortium. Global-LASSO provided a larger list of eQTLs than the 2SR methods, identified a previously reported eQTL in prostate stem cell antigen (PSCA), and provided further clues on the complexity of the APOBEC3B locus, with a minimal false-positive rate not achieved by 2SR-MLR. It also represents an important contribution to omics integrative analysis because it is easy to apply and adaptable to any type of data. © 2017 WILEY PERIODICALS, INC.
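
    The two-stage regression (2SR) scheme summarized above can be sketched on synthetic data. The variables, effect sizes, and plain ordinary-least-squares solver below are illustrative assumptions, not the paper's implementation (Global-LASSO uses penalized regression):

```python
# Sketch of two-stage regression (2SR) for eQTL analysis with synthetic data.
# Stage 1 removes the tumor-level effect from expression; stage 2 regresses
# the residuals on the germline genotype. All numbers are invented.
import numpy as np

rng = np.random.default_rng(0)
n = 100
tumor_marker = rng.normal(size=(n, 1))        # e.g. a tumor methylation level
genotype = rng.integers(0, 3, size=(n, 1))    # germline SNP coded 0/1/2
# expression depends on both the tumor marker and the germline genotype
expression = 2.0 * tumor_marker[:, 0] + 0.8 * genotype[:, 0] \
    + rng.normal(scale=0.3, size=n)

def ols_residuals(y, X):
    """Residuals of y after regressing on X (with intercept)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return y - X1 @ beta

# Stage 1: remove the tumor-level effect from expression.
resid = ols_residuals(expression, tumor_marker)
# Stage 2: regress the residuals on the germline genotype.
X2 = np.column_stack([np.ones(n), genotype])
beta2, *_ = np.linalg.lstsq(X2, resid, rcond=None)
print(round(float(beta2[1]), 2))  # estimated genotype effect, close to 0.8
```

    Because the genotype is independent of the tumor marker here, the stage-2 estimate recovers the simulated genotype effect.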

  3. Segmentation and detection of fluorescent 3D spots.

    PubMed

    Ram, Sundaresh; Rodríguez, Jeffrey J; Bosco, Giovanni

    2012-03-01

    The 3D spatial organization of genes and other genetic elements within the nucleus is important for regulating gene expression. Understanding how this spatial organization is established and maintained throughout the life of a cell is key to elucidating the many layers of gene regulation. Quantitative methods for studying nuclear organization will lead to insights into the molecular mechanisms that maintain gene organization as well as serve as diagnostic tools for pathologies caused by loss of nuclear structure. However, biologists currently lack automated and high throughput methods for quantitative and qualitative global analysis of 3D gene organization. In this study, we use confocal microscopy and fluorescence in-situ hybridization (FISH) as a cytogenetic technique to detect and localize the presence of specific DNA sequences in 3D. FISH uses probes that bind to specific targeted locations on the chromosomes, appearing as fluorescent spots in 3D images obtained using fluorescence microscopy. In this article, we propose an automated algorithm for segmentation and detection of 3D FISH spots. The algorithm is divided into two stages: spot segmentation and spot detection. Spot segmentation consists of 3D anisotropic smoothing to reduce the effect of noise, top-hat filtering, and intensity thresholding, followed by 3D region-growing. Spot detection uses a Bayesian classifier with spot features such as volume, average intensity, texture, and contrast to detect and classify the segmented spots as either true or false spots. Quantitative assessment of the proposed algorithm demonstrates improved segmentation and detection accuracy compared to other techniques. Copyright © 2012 International Society for Advancement of Cytometry.
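
    As a rough illustration of the segmentation stage (denoising, top-hat filtering, intensity thresholding, connected components), here is a minimal 2D sketch using scipy.ndimage on a synthetic image. The published pipeline is 3D, uses anisotropic smoothing and region growing, and adds a Bayesian spot classifier; none of that is reproduced here.

```python
# Minimal 2D sketch of spot segmentation: smooth, top-hat filter to remove
# slowly varying background, threshold, then label connected components.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(1)
img = rng.normal(loc=10.0, scale=1.0, size=(64, 64))    # background + noise
img[20:24, 30:34] += 25.0                               # one bright "spot"

smoothed = ndimage.gaussian_filter(img, sigma=1.0)      # denoise
tophat = ndimage.white_tophat(smoothed, size=9)         # remove background
mask = tophat > tophat.mean() + 5 * tophat.std()        # intensity threshold
labels, n_spots = ndimage.label(mask)                   # connected components
print(n_spots)
```

    The threshold of mean + 5 standard deviations is an illustrative choice; the paper instead classifies candidate spots with features such as volume, intensity, texture, and contrast.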

  4. Computer-based assessment of right ventricular regional ejection fraction in patients with repaired Tetralogy of Fallot

    NASA Astrophysics Data System (ADS)

    Teo, S.-K.; Wong, S. T.; Tan, M. L.; Su, Y.; Zhong, L.; Tan, Ru-San

    2015-03-01

    After surgical repair of Tetralogy of Fallot (TOF), most patients experience long-term complications as the right ventricle (RV) undergoes progressive remodeling that eventually affects heart function. Thus, post-repair surgery is required to prevent further deterioration of RV function that may result in malignant ventricular arrhythmias and mortality. The timing of such post-repair surgery therefore depends crucially on the quantitative assessment of RV function. Current clinical indices for such functional assessment measure global properties such as RV volumes and ejection fraction. However, these indices are less than ideal, as regional variations and anomalies are obscured. Therefore, we sought to (i) develop a quantitative method to assess RV regional function using regional ejection fraction (REF) based on a 13-segment model, and (ii) evaluate the effectiveness of REF in discriminating between 6 repaired TOF patients and 6 normal controls based on cardiac magnetic resonance (CMR) imaging. We observed that the REF for the individual segments in the patient group is significantly lower compared to the control group (P < 0.05 using a two-tailed Student's t-test). In addition, we also observed that the aggregated REF at the basal, mid-cavity and apical regions for the patient group is significantly lower compared to the control group (P < 0.001 using a two-tailed Student's t-test). The results suggest that REF could potentially be used as a quantitative index for assessing RV regional function. The computational time per data set is approximately 60 seconds, which demonstrates our method's clinical potential as a real-time cardiac assessment tool.
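
    The regional ejection fraction itself is a simple per-segment computation, REF = (EDV − ESV) / EDV, where EDV and ESV are the end-diastolic and end-systolic volumes of one segment. The sketch below uses invented segment volumes for a 13-segment model, not patient data:

```python
# Regional ejection fraction per segment of a 13-segment RV model.
# Volumes (mL) are made-up illustrative numbers, not patient data.
end_diastolic = [12.0, 10.5, 11.0, 9.0, 8.5, 10.0, 7.5,
                 8.0, 9.5, 6.0, 5.5, 6.5, 7.0]
end_systolic  = [7.0, 6.0, 6.5, 5.5, 5.0, 6.0, 4.5,
                 5.0, 6.0, 3.5, 3.0, 4.0, 4.5]

# REF_i = (EDV_i - ESV_i) / EDV_i for each of the 13 segments
ref = [(edv - esv) / edv for edv, esv in zip(end_diastolic, end_systolic)]

# Global EF from the summed volumes, for comparison with regional values.
edv_total = sum(end_diastolic)
esv_total = sum(end_systolic)
global_ef = (edv_total - esv_total) / edv_total
print(len(ref), round(global_ef, 2))
```

    A single global EF can hide a poorly contracting segment; comparing the 13 REF values against normal ranges is what exposes the regional anomalies the abstract describes.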

  5. Diagnostic testing for pandemic influenza in Singapore: a novel dual-gene quantitative real-time RT-PCR for the detection of influenza A/H1N1/2009.

    PubMed

    Lee, Hong Kai; Lee, Chun Kiat; Loh, Tze Ping; Tang, Julian Wei-Tze; Chiu, Lily; Tambyah, Paul A; Sethi, Sunil K; Koay, Evelyn Siew-Chuan

    2010-09-01

    With the relative global lack of immunity to the pandemic influenza A/H1N1/2009 virus that emerged in April 2009 as well as the sustained susceptibility to infection, rapid and accurate diagnostic assays are essential to detect this novel influenza A variant. Among the molecular diagnostic methods that have been developed to date, most are in tandem monoplex assays targeting either different regions of a single viral gene segment or different viral gene segments. We describe a dual-gene (duplex) quantitative real-time RT-PCR method selectively targeting pandemic influenza A/H1N1/2009. The assay design includes a primer-probe set specific to only the hemagglutinin (HA) gene of this novel influenza A variant and a second set capable of detecting the nucleoprotein (NP) gene of all swine-origin influenza A virus. In silico analysis of the specific HA oligonucleotide sequence used in the assay showed that it targeted only the swine-origin pandemic strain; there was also no cross-reactivity against a wide spectrum of noninfluenza respiratory viruses. The assay has a diagnostic sensitivity and specificity of 97.7% and 100%, respectively, a lower detection limit of 50 viral gene copies/PCR, and can be adapted to either a qualitative or quantitative mode. It was first applied to 3512 patients with influenza-like illnesses at a tertiary hospital in Singapore, during the containment phase of the pandemic (May to July 2009).

  6. Seasonal variation of food security among the Batwa of Kanungu, Uganda.

    PubMed

    Patterson, Kaitlin; Berrang-Ford, Lea; Lwasa, Shuaib; Namanya, Didacus B; Ford, James; Twebaze, Fortunate; Clark, Sierra; Donnelly, Blánaid; Harper, Sherilee L

    2017-01-01

    Climate change is projected to increase the burden of food insecurity (FI) globally, particularly among populations that depend on subsistence agriculture. The impacts of climate change will have disproportionate effects on populations with higher existing vulnerability. Indigenous people consistently experience higher levels of FI than their non-Indigenous counterparts and are more likely to be dependent upon land-based resources. The present study aimed to understand the sensitivity of the food system of an Indigenous African population, the Batwa of Kanungu District, Uganda, to seasonal variation. A concurrent, mixed methods (quantitative and qualitative) design was used. Six cross-sectional retrospective surveys, conducted between January 2013 and April 2014, provided quantitative data to examine the seasonal variation of self-reported household FI. This was complemented by qualitative data from focus group discussions and semi-structured interviews collected between June and August 2014. The setting comprised ten rural Indigenous communities in Kanungu District, Uganda; FI data were collected from 130 Indigenous Batwa Pygmy households. Qualitative methods involved Batwa community members, local key informants, health workers and governmental representatives. The dry season was associated with increased FI among the Batwa in both the quantitative surveys and the qualitative interviews. During the dry season, the majority of Batwa households reported greater difficulty in acquiring sufficient quantities and quality of food. However, the qualitative data indicated that the effect of seasonal variation on FI was modified by employment, wealth and community location. These findings highlight the role social factors play in mediating seasonal impacts on FI and support calls to treat climate associations with health outcomes as non-stationary and mediated by social sensitivity.

  7. Evaluation of normalization methods in mammalian microRNA-Seq data

    PubMed Central

    Garmire, Lana Xia; Subramaniam, Shankar

    2012-01-01

    Simple total tag count normalization is inadequate for microRNA sequencing data generated by next-generation sequencing technology. However, a systematic evaluation of normalization methods on microRNA sequencing data has so far been lacking. We comprehensively evaluate seven commonly used normalization methods: global normalization, Lowess normalization, the trimmed mean of M-values method (TMM), quantile normalization, scaling normalization, variance stabilization, and the invariant method. We assess these methods on two individual experimental data sets with the empirical statistical metrics of mean square error (MSE) and the Kolmogorov-Smirnov (K-S) statistic. Additionally, we evaluate the methods against results from quantitative PCR validation. Our results consistently show that Lowess normalization and quantile normalization perform the best, whereas TMM, a method developed for RNA-Seq normalization, performs the worst. The poor performance of TMM normalization is further evidenced by abnormal results from tests of differential expression (DE) on microRNA-Seq data. Compared with the choice of model used for DE, the choice of normalization method is the primary factor affecting the results of DE. In summary, Lowess normalization and quantile normalization are recommended for normalizing microRNA-Seq data, whereas the TMM method should be used with caution. PMID:22532701
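
    Of the methods compared, quantile normalization is the easiest to sketch: each sample's value at a given rank is replaced by the mean across samples at that rank, forcing all samples onto an identical distribution. A minimal version (illustrative, not the authors' code):

```python
# Minimal quantile normalization: replace each sample's sorted values by
# the mean across samples at the same rank.
import numpy as np

def quantile_normalize(counts):
    """counts: (n_features, n_samples) array; returns normalized array."""
    order = np.argsort(counts, axis=0)        # per-sample value ordering
    sorted_vals = np.sort(counts, axis=0)
    rank_means = sorted_vals.mean(axis=1)     # mean value at each rank
    out = np.empty_like(counts, dtype=float)
    for j in range(counts.shape[1]):
        out[order[:, j], j] = rank_means      # put rank means back in order
    return out

x = np.array([[5.0, 4.0],
              [2.0, 1.0],
              [3.0, 6.0]])
xn = quantile_normalize(x)
print(xn)  # each column now has the same sorted values: 1.5, 3.5, 5.5
```

    After normalization both columns share the same distribution while each feature keeps its within-sample rank, which is exactly the property the evaluated method enforces.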

  8. Infectious helminth ova in wastewater and sludge: A review on public health issues and current quantification practices.

    PubMed

    Gyawali, P

    2018-02-01

    Raw and partially treated wastewater has been widely used to meet global water demand. The presence of viable helminth ova and larvae in wastewater raises significant public health concerns, especially when it is used for agriculture and aquaculture. Depending on the prevalence of helminth infections in communities, up to 1.0 × 10³ ova/larvae can be present per litre of wastewater and per 4 g (dry weight) of sludge. Multi-barrier approaches including pathogen reduction, risk assessment, and exposure reduction have been suggested by health regulators to minimise the potential health risk. However, in the absence of a sensitive and specific method for the quantitative detection of viable helminth ova in wastewater, an accurate health risk assessment is difficult to achieve. As a result, helminth infections remain difficult to eliminate from communities despite two decades of global effort (mass drug administration). Molecular methods can be more sensitive and specific than the currently adopted culture-based and vital-stain methods. Molecular methods, however, require further and more thorough investigation of their ability to accurately quantify viable helminth ova/larvae in wastewater and sludge samples. Understanding the different cell stages and their corresponding gene copy numbers is pivotal for accurate quantification of helminth ova/larvae in wastewater samples. Identifying specific markers, including proteins, lipids, and metabolites, using a multiomics approach could yield cheap, rapid, sensitive, specific, point-of-care detection tools for helminth ova and larvae in wastewater.

  9. Facial asymmetry quantitative evaluation in oculoauriculovertebral spectrum.

    PubMed

    Manara, Renzo; Schifano, Giovanni; Brotto, Davide; Mardari, Rodica; Ghiselli, Sara; Gerunda, Antonio; Ghirotto, Cristina; Fusetti, Stefano; Piacentile, Katherine; Scienza, Renato; Ermani, Mario; Martini, Alessandro

    2016-03-01

    Facial asymmetries in oculoauriculovertebral spectrum (OAVS) patients might require surgical corrections that are mostly based on a qualitative approach and the surgeon's experience. The present study aimed to develop a quantitative 3D CT imaging-based procedure suitable for maxillo-facial surgery planning in OAVS patients. Thirteen OAVS patients (mean age 3.5 ± 4.0 years; range 0.2-14.2, 6 females) and 13 controls (mean age 7.1 ± 5.3 years; range 0.6-15.7, 5 females) who underwent head CT examination were retrospectively enrolled. Eight bilateral anatomical facial landmarks were defined on 3D CT images (porion, orbitale, most anterior point of the frontozygomatic suture, most superior point of the temporozygomatic suture, most posterior-lateral point of the maxilla, gonion, condylion, mental foramen) and distance from orthogonal planes (in millimeters) was used to evaluate the asymmetry on each axis and to calculate a global asymmetry index for each anatomical landmark. Mean asymmetry values and relative confidence intervals were obtained from the control group. OAVS patients showed 2.5 ± 1.8 landmarks above the confidence interval when considering the global asymmetry values; 12 patients (92%) showed at least one pathologically asymmetric landmark. Considering each axis, the mean number of pathologically asymmetric landmarks increased to 5.5 ± 2.6 (p = 0.002) and all patients presented at least one significant landmark asymmetry. Modern CT-based 3D reconstructions allow accurate assessment of facial bone asymmetries in patients affected by OAVS. The evaluation as a global score and in different orthogonal axes provides precise quantitative data suitable for maxillo-facial surgical planning. CT-based 3D reconstruction might allow a quantitative approach for planning and following up maxillo-facial surgery in OAVS patients.
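
    The landmark-based measurement can be illustrated schematically: reflect one side's landmark across the midsagittal plane, take the per-axis offsets from its contralateral partner, and combine them into a global index. The coordinates and the Euclidean combination below are hypothetical; the paper's exact index may differ.

```python
# Hypothetical sketch of a bilateral landmark asymmetry index.
# Coordinates (mm) are invented; the midsagittal plane is x = 0.
import math

right = (42.0, 10.0, -5.0)        # right-side landmark (x, y, z)
left = (-40.5, 11.2, -4.1)        # left-side counterpart

mirrored_left = (-left[0], left[1], left[2])   # reflect across x = 0
per_axis = [abs(r - m) for r, m in zip(right, mirrored_left)]
global_index = math.sqrt(sum(d * d for d in per_axis))
print([round(d, 1) for d in per_axis], round(global_index, 2))
```

    A perfectly symmetric landmark pair would give zero on every axis; per-axis values localize the asymmetry while the combined index gives a single score per landmark.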

  10. A simulation of orientation dependent, global changes in camera sensitivity in ECT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bieszk, J.A.; Hawman, E.G.; Malmin, R.E.

    1984-01-01

    ECT promises the abilities to: 1) observe radioisotope distributions in a patient without the summation of overlying activity reducing contrast, and 2) measure these distributions quantitatively to assess organ function further and more accurately. Ideally, camera-based ECT systems should have a performance that is independent of camera orientation or gantry angle. This study is concerned with ECT quantitation errors that can arise from angle-dependent variations of camera sensitivity. Using simulated phantoms representative of heart and liver sections, the effects of sensitivity changes on reconstructed images were assessed both visually and quantitatively based on ROI sums. The sinogram for each test image was simulated with 128 linear samples and 180 angular views. The global orientation-dependent sensitivity was modelled by applying an angular sensitivity dependence to the sinograms of the test images. Four sensitivity variations were studied: amplitudes of 0% (as a reference), 5%, 10%, and 25% with a cos θ dependence, as well as a cos 2θ dependence with a 5% amplitude. Simulations were done with and without Poisson noise to: 1) determine trends in the quantitative effects as a function of the magnitude of the variation, and 2) see how these effects are manifested in studies having statistics comparable to clinical cases. For the most realistic sensitivity variation (cos θ, 5% amplitude), the ROIs chosen in the present work indicated changes of <0.5% in the noiseless case and <5% for the case with Poisson noise. The effects of statistics appear to dominate any effects due to global, sinusoidal, orientation-dependent sensitivity changes in the cases studied.
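
    The modelling step, applying an angular sensitivity dependence to a sinogram, amounts to multiplying every bin of each angular view by a factor such as 1 + a·cos θ. A toy sketch with illustrative numbers (uniform sinogram, 5% amplitude), not the study's phantom data:

```python
# Apply an orientation-dependent sensitivity 1 + a*cos(theta) to a sinogram.
# 128 radial bins x 180 views mirrors the dimensions quoted above; the
# uniform counts are an illustrative stand-in for a phantom sinogram.
import numpy as np

n_views, n_bins = 180, 128
sino = np.full((n_views, n_bins), 50.0)        # ideal, uniform sinogram
theta = np.deg2rad(np.arange(n_views))         # gantry angle per view
amplitude = 0.05                               # 5% sensitivity variation
sens = 1.0 + amplitude * np.cos(theta)
modulated = sino * sens[:, None]               # scale each view by sens(theta)
print(round(float(modulated.max() / modulated.min()), 3))
```

    Reconstructing such a modulated sinogram (versus the unmodulated one) is what isolates the quantitation error attributable to the sensitivity variation alone.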

  11. Global Epidemiology of Amyotrophic Lateral Sclerosis: a Systematic Review of the Published Literature

    PubMed Central

    Chiò, A; Logroscino, G; Traynor, BJ; Collins, J; Simeone, JC; Goldstein, LA; White, LA

    2014-01-01

    Background Amyotrophic lateral sclerosis (ALS) is relatively rare, yet the economic and social burden is substantial. Having accurate incidence and prevalence estimates would facilitate efficient allocation of healthcare resources. Objective To provide a comprehensive and critical review of the epidemiologic literature on ALS. Methods MEDLINE and EMBASE (1995–2011) databases of population-based studies on ALS incidence and prevalence reporting quantitative data were analyzed. Data extracted included study location and time, design and data sources, case ascertainment methods, and incidence and/or prevalence rates. Medians and inter-quartile ranges (IQRs) were calculated, and ALS case estimates derived using 2010 population estimates. Results In all, 37 articles met inclusion criteria. In Europe, the median (IQR) incidence rate (/100,000 population) was 2.08 (1.47–2.43), corresponding to an estimated 15,355 (10,852–17,938) cases. Median (IQR) prevalence (/100,000 population) was 5.40 (4.06–7.89), or 39,863 (29,971–58,244) prevalent cases. Conclusions Disparity in rates among ALS incidence and prevalence studies may be due to differences in study design or true variations in population demographics, such as age, and geography, including environmental factors and genetic predisposition. Additional large-scale studies that use standardized case ascertainment methods are needed to more accurately assess the true global burden of ALS. PMID:23860588
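
    The reported case counts follow from applying the median rates to 2010 population figures. The sketch below assumes a European population of about 738 million (an assumption, back-calculated from the abstract's numbers), which approximately reproduces the published estimates:

```python
# Arithmetic behind the European ALS case estimates: cases = rate per
# 100,000 x population / 100,000. The population figure is an assumption.
population = 738_000_000            # assumed European population, 2010
incidence_per_100k = 2.08           # median incidence from the review
prevalence_per_100k = 5.40          # median prevalence from the review

incident_cases = incidence_per_100k * population / 100_000
prevalent_cases = prevalence_per_100k * population / 100_000
# Close to the reported 15,355 incident and 39,863 prevalent cases.
print(round(incident_cases), round(prevalent_cases))
```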

  12. Fast Measurement and Reconstruction of Large Workpieces with Freeform Surfaces by Combining Local Scanning and Global Position Data

    PubMed Central

    Chen, Zhe; Zhang, Fumin; Qu, Xinghua; Liang, Baoqiu

    2015-01-01

    In this paper, we propose a new approach for the measurement and reconstruction of large workpieces with freeform surfaces. The system consists of a handheld laser scanning sensor and a position sensor. The laser scanning sensor is used to acquire the surface and geometry information, and the position sensor is utilized to unify the scanning sensor's measurements into a global coordinate system. The measurement process includes data collection, multi-sensor data fusion and surface reconstruction. With the multi-sensor data fusion, errors accumulated during the image alignment and registration process are minimized, and the measuring precision is significantly improved. After the dense, accurate acquisition of the three-dimensional (3-D) coordinates, the surface is reconstructed using a commercial software package based on Non-Uniform Rational B-Spline (NURBS) surfaces. The system has been evaluated, both qualitatively and quantitatively, using reference measurements provided by a commercial laser scanning sensor. The method has been applied to the reconstruction of a large gear rim, with an accuracy of up to 0.0963 mm. The results prove that this new combined method is promising for measuring and reconstructing large-scale objects with complex surface geometry. Compared with reported methods of large-scale shape measurement, it offers high freedom of motion, high precision and high measurement speed over a wide measurement range. PMID:26091396

  13. The community structure of the global corporate network.

    PubMed

    Vitali, Stefania; Battiston, Stefano

    2014-01-01

    We investigate the community structure of the global ownership network of transnational corporations. We find a pronounced organization in communities that cannot be explained by randomness. Despite the global character of this network, communities reflect first of all the geographical location of firms, while the industrial sector plays only a marginal role. We also analyze the meta-network in which the nodes are the communities and the links are obtained by aggregating the links among firms belonging to pairs of communities. We analyze the network centrality of the top 50 communities and we provide a quantitative assessment of the financial sector role in connecting the global economy.
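
    The meta-network construction described, aggregating firm-level links into community-level links, can be sketched on a toy graph. The firms, community assignments, and edges below are invented, not the actual ownership data:

```python
# Build a community meta-network: nodes are communities, and a meta-link's
# weight counts the firm-level links between the two communities.
from collections import Counter

community = {"A": 0, "B": 0, "C": 1, "D": 1, "E": 2}   # firm -> community
edges = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"), ("D", "E")]

meta = Counter()
for u, v in edges:
    cu, cv = community[u], community[v]
    if cu != cv:                               # keep only cross-community links
        meta[tuple(sorted((cu, cv)))] += 1
print(dict(meta))
```

    Centrality measures can then be computed on this much smaller weighted graph, which is how the roles of the top communities (e.g. the financial sector's) are assessed.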

  15. A mixed methods evaluation of an international service learning program in the Dominican Republic.

    PubMed

    Curtin, Alicia J; Martins, Diane C; Schwartz-Barcott, Donna

    2015-01-01

    To examine the impact of an international service learning (ISL) experience using a quantitative and qualitative approach. A descriptive study was used to explore the impact of an ISL experience on global awareness and professional and personal growth with 11 baccalaureate nursing students in the Dominican Republic. Students participated in a three-credit ISL program in the Dominican Republic which included pre- and post-experience seminars and a 2-week, on-site immersion experience. The International Education Survey (IES) was used as the quantitative measure. Content analysis of Critical Reflective Inquiry (CRI) narratives was used as the qualitative method. Students reported a high overall impact (M = 5.9) on the IES, with high means for the Professional Student Nurse Role (M = 6.10, SD: 0.74), Personal Development (M = 6.08, SD: 0.76), and International Perspectives (M = 6.03, SD: 0.71), and a lower mean for Intellectual Development (M = 5.40, SD: 0.69). CRI narratives revealed specific areas of impact, for example, increased empathy and ability to communicate effectively with patients from life situations very different from their own. Further exploration of the usefulness of various evaluation tools and methodological designs is warranted to understand this type of pedagogy and its impact on student learning outcomes, short- and long-term. © 2014 Wiley Periodicals, Inc.

  16. PET Image Reconstruction Incorporating 3D Mean-Median Sinogram Filtering

    NASA Astrophysics Data System (ADS)

    Mokri, S. S.; Saripan, M. I.; Rahni, A. A. Abd; Nordin, A. J.; Hashim, S.; Marhaban, M. H.

    2016-02-01

    Positron Emission Tomography (PET) projection data, or sinograms, suffer from poor counting statistics and randomness that produce noisy PET images. To improve the reconstructed image, we propose a pre-reconstruction sinogram filtering step based on a 3D mean-median filter. The proposed filter is designed with three aims: to minimise angular blurring artifacts, to smooth flat regions, and to preserve edges in the reconstructed PET image. The performance of pre-reconstruction sinogram filtering prior to three established reconstruction methods, namely filtered backprojection (FBP), ordered-subset expectation maximization (OSEM), and OSEM with median root prior (OSEM-MRP), is investigated using simulated NCAT phantom PET sinograms generated by the PET Analytical Simulator (ASIM). The improvement in the quality of the reconstructed images with and without sinogram filtering is assessed by visual as well as quantitative evaluation based on global signal-to-noise ratio (SNR), local SNR, contrast-to-noise ratio (CNR), and edge-preservation capability. Further analysis of the achieved improvement is also carried out for the iterative OSEM and OSEM-MRP reconstruction methods, with and without pre-reconstruction filtering, in terms of the contrast recovery curve (CRC) versus noise trade-off, normalised mean square error versus iteration, local CNR versus iteration, and lesion detectability. Overall, satisfactory results are obtained from both visual and quantitative evaluations.
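A minimal sketch of a 3D mean-median filter of the kind described above: a median pass that suppresses impulsive noise while preserving edges, followed by a mean pass that smooths flat regions. The window size, border handling (borders left unfiltered), and test volume are illustrative assumptions, not details taken from the paper:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def mean_median_filter_3d(sino, size=3):
    """Hybrid 3D filter: median pass (removes impulsive noise, keeps
    edges) followed by a mean pass (smooths flat regions). Border
    voxels are left unchanged; only the interior is filtered."""
    out = sino.astype(float)
    win = sliding_window_view(sino, (size, size, size))
    out[1:-1, 1:-1, 1:-1] = np.median(win, axis=(-3, -2, -1))
    out2 = out.copy()
    win2 = sliding_window_view(out, (size, size, size))
    out2[1:-1, 1:-1, 1:-1] = win2.mean(axis=(-3, -2, -1))
    return out2

# A single impulsive spike in an otherwise constant sinogram is removed:
s = np.ones((5, 5, 5))
s[2, 2, 2] = 100.0                 # one noisy sinogram bin
f = mean_median_filter_3d(s)
print(f[2, 2, 2])                  # restored close to 1.0
```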

  17. Lipidomics by ultrahigh performance liquid chromatography-high resolution mass spectrometry and its application to complex biological samples.

    PubMed

    Triebl, Alexander; Trötzmüller, Martin; Hartler, Jürgen; Stojakovic, Tatjana; Köfeler, Harald C

    2017-05-15

    An improved approach for selective and sensitive identification and quantitation of lipid molecular species using reversed phase chromatography coupled to high resolution mass spectrometry was developed. The method is applicable to a wide variety of biological matrices using a simple liquid-liquid extraction procedure. This approach combines multiple selectivity criteria: Reversed phase chromatography separates lipids according to their acyl chain length and degree of unsaturation and is capable of resolving positional isomers of lysophospholipids, as well as structural isomers of diacyl phospholipids and glycerolipids. Orbitrap mass spectrometry delivers the elemental composition of both positive and negative ions with high mass accuracy. Finally, automatically generated tandem mass spectra provide structural insight into numerous glycerolipids, phospholipids, and sphingolipids within a single run. Calibration showed linearity over more than four orders of magnitude, good values for accuracy and precision at biologically relevant concentration levels, and limits of quantitation of a few femtomoles on column. Hundreds of lipid molecular species were detected and quantified in three different biological matrices, which cover well the wide variety and complexity of various model organisms in lipidomic research. Together with a software package, this method is a prime choice for global lipidomic analysis of even the most complex biological samples. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Impact of Hydrogeological Uncertainty on Estimation of Environmental Risks Posed by Hydrocarbon Transportation Networks

    NASA Astrophysics Data System (ADS)

    Ciriello, V.; Lauriola, I.; Bonvicini, S.; Cozzani, V.; Di Federico, V.; Tartakovsky, Daniel M.

    2017-11-01

    Ubiquitous hydrogeological uncertainty undermines the veracity of quantitative predictions of soil and groundwater contamination due to accidental hydrocarbon spills from onshore pipelines. Such predictions, therefore, must be accompanied by quantification of predictive uncertainty, especially when they are used for environmental risk assessment. We quantify the impact of parametric uncertainty on quantitative forecasts of the temporal evolution of two key risk indices: the volumes of unsaturated and saturated soil contaminated by a surface spill of light nonaqueous-phase liquids. This is accomplished by treating the relevant uncertain parameters as random variables and deploying two alternative probabilistic models to estimate their effect on predictive uncertainty. A physics-based model is solved with a stochastic collocation method and is supplemented by a global sensitivity analysis. A second model represents the quantities of interest as polynomials of random inputs and has a virtually negligible computational cost, which enables one to explore any number of risk-related contamination scenarios. For a typical oil-spill scenario, our method can be used to identify key flow and transport parameters affecting the risk indices, to elucidate texture-dependent behavior of different soils, and to evaluate, with a degree of confidence specified by the decision-maker, the extent of contamination and the corresponding remediation costs.
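The second, surrogate-based model can be illustrated as follows: once a quantity of interest is represented as a polynomial of the random inputs, any number of scenarios can be sampled at negligible cost and risk statements read off at a confidence level chosen by the decision-maker. The polynomial coefficients and input distributions below are invented for illustration; in practice they would be fitted to the physics-based model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical polynomial surrogate: contaminated soil volume [m^3] as a
# low-order polynomial of two standardised random inputs (e.g. scaled
# log-conductivity xi1 and porosity deviation xi2). Coefficients are
# made up; fitting them to the physics-based model is the expensive step.
def surrogate_volume(xi1, xi2):
    return 120.0 + 35.0 * xi1 - 12.0 * xi2 + 6.0 * xi1**2 + 4.0 * xi1 * xi2

# The surrogate is cheap, so a large number of scenarios is affordable.
xi = rng.standard_normal((2, 100_000))
vol = surrogate_volume(xi[0], xi[1])

# Risk statement at a decision-maker-specified confidence level:
v95 = np.quantile(vol, 0.95)
print(f"95% of realisations contaminate less than {v95:.0f} m^3")
```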

  19. Holocene Changes in Land Cover and Greenhouse-gas Concentrations: Rethinking Natural vs Anthropogenic Causation

    NASA Astrophysics Data System (ADS)

    Roberts, C.

    2008-12-01

    The Holocene has witnessed a switch from a nature-dominated to a human-dominated Earth system. Although globally-significant human impacts (wildfire, megafaunal extinctions) occurred during the late Pleistocene, it was the advent of agriculture that led to the progressive transformation of land cover, and which distinguishes the Holocene from previous interglacial periods. A wide array of data provide clear evidence of local-to-regional human disturbance from ~5 ka BP, in some cases earlier. There is more uncertainty about when the anthropogenic "footprint" became detectable at a global scale, and there has consequently been debate about how much of the pre-industrial increase in atmospheric greenhouse gas concentrations is attributable to human causation, linked to processes such as deforestation (CO2) and wet rice cultivation (CH4). Although there has been recent progress in developing quantitative methods for translating pollen data into palaeo-land cover, such as the REVEALS model of Sugita (Holocene 2007) coupled to GIS, this has yet to be widely applied to existing data bases, and most pollen-based land-use reconstructions remain qualitative or semi-quantitative. Lake trophic status, sediment flux / soil erosion, and microcharcoal records of biomass burning provide alternative proxies that integrate regional-scale landscape disturbance. These proxy data along with documentary sources imply that globally-significant changes in land cover occurred prior to ~250 BP which must have altered atmospheric greenhouse gas concentrations by this time. The polarised debate for and against early anthropogenic impact on global carbon cycling mirrors our industrial-era division between nature and society, both conceptually (e.g. Cartesian dualism) and on the ground (e.g. demarcating land between monoculture agriculture and wilderness). 
However, for the period before ~1750 AD, this likely represents a false dichotomy, because pre-industrial societies more often formed part of the natural world, while at the same time modifying and transforming it. Attempts to partition carbon emissions between natural and anthropogenic sources during the Holocene may therefore be misplaced. Many landscapes, such as savannas, are the result of synergistic - and in some cases contingent - relationships between people, other animals, plants and other components of nature. The issue is thus not whether early humans altered carbon cycling (they did), but rather at what point it became detectable at a global scale, and what form it took.

  20. An integrated approach coupling physically based models and probabilistic method to assess quantitatively landslide susceptibility at different scale: application to different geomorphological environments

    NASA Astrophysics Data System (ADS)

    Vandromme, Rosalie; Thiéry, Yannick; Sedan, Olivier; Bernardie, Séverine

    2016-04-01

    Landslide hazard assessment is the estimation of a target area where landslides of a particular type, volume, runout, and intensity may occur within a given period. The first step in analyzing landslide hazard is to assess the spatial and temporal failure probability (when the information is available, i.e., susceptibility assessment). Two types of approach are generally recommended to achieve this goal: (i) qualitative approaches (i.e., inventory-based and knowledge-driven methods) and (ii) quantitative approaches (i.e., data-driven methods or deterministic physically based methods). Among quantitative approaches, deterministic physically based methods (PBMs) are generally used at local and/or site-specific scales (1:5,000-1:25,000 and >1:5,000, respectively). The main advantage of these methods is the calculation of the probability of failure (safety factor) under specific environmental conditions; for some models it is possible to integrate land-use and climatic change. The major drawbacks are the large amounts of reliable and detailed data required (especially material types, their thicknesses, and the heterogeneity of geotechnical parameters over a large area) and the fact that only shallow landslides are taken into account; this is why they are often used at site-specific scales (>1:5,000). Thus, to take into account (i) material heterogeneity, (ii) the spatial variation of physical parameters, and (iii) different landslide types, the French Geological Survey (BRGM) has developed a physically based model (PBM) implemented in a GIS environment. This PBM couples a global hydrological model (GARDENIA®), including a transient unsaturated/saturated hydrological component, with a physically based model computing the stability of slopes (ALICE®, Assessment of Landslides Induced by Climatic Events) based on the Morgenstern-Price method for any slip surface. The variability of mechanical parameters is handled by a Monte Carlo approach. 
The probability of obtaining a safety factor below 1 represents the probability of occurrence of a landslide for a given triggering event, and the dispersion of the distribution gives the uncertainty of the result. Finally, a map is created displaying a probability of occurrence for each computing cell of the studied area. To take land-use change into account, a complementary module integrating the effects of vegetation on soil properties has recently been developed. In recent years, the model has been applied at different scales in different geomorphological environments: (i) at regional scale (1:50,000-1:25,000) in the French West Indies and French Polynesian islands; (ii) at local scale (1:10,000) for two complex mountainous areas; and (iii) at site-specific scale (1:2,000) for one landslide. For each study, the 3D geotechnical model was adapted. These studies have made it possible (i) to assess the different factors included in the model, especially the initial 3D geotechnical models; (ii) to refine the location of probable failures under different hydrological scenarios; and (iii) to test the effects of climatic change and land use on slopes in two cases. In this way, future changes in temperature, precipitation, and vegetation cover can be analyzed, making it possible to address the impacts of global change on landslides. Finally, the results show that it is possible to obtain reliable information about future slope failures at different working scales and for different scenarios with an integrated approach. The final information about landslide susceptibility (i.e., probability of failure) can be integrated into landslide hazard assessment and could be an essential information source for future land planning. 
As performed in the ANR project SAMCO (Society Adaptation for coping with Mountain risks in a global change COntext), this analysis constitutes a first step in the risk-assessment chain for different climate and economic development scenarios, used to evaluate the resilience of mountainous areas.
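The core probabilistic step, estimating the probability that the safety factor falls below 1 by Monte Carlo sampling of uncertain mechanical parameters, can be sketched with a simple infinite-slope stability model. This stands in for, and is far simpler than, the Morgenstern-Price computation used in ALICE®, and every parameter value below is illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 50_000

# Illustrative infinite-slope model (dry case):
#   FS = (c + gamma*z*cos(b)^2 * tan(phi)) / (gamma*z*sin(b)*cos(b))
slope = np.radians(30.0)       # slope angle b
z = 2.0                        # failure-surface depth [m]
gamma = 19_000.0               # soil unit weight [N/m^3]

# Uncertain strength parameters, handled by Monte Carlo sampling:
phi = np.radians(rng.normal(28.0, 3.0, n))    # friction angle
c = rng.lognormal(np.log(5_000.0), 0.4, n)    # cohesion [Pa]

fs = (c + gamma * z * np.cos(slope) ** 2 * np.tan(phi)) / (
    gamma * z * np.sin(slope) * np.cos(slope)
)
p_fail = np.mean(fs < 1.0)     # probability of a safety factor below 1
print(f"P(FS < 1) = {p_fail:.3f}")
```

Repeating this per computing cell, with cell-specific geometry and hydrology, yields the probability-of-occurrence map described above.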

  1. Measuring changes in transmission of neglected tropical diseases, malaria, and enteric pathogens from quantitative antibody levels

    PubMed Central

    van der Laan, Mark J.; Hubbard, Alan E.; Steel, Cathy; Kubofcik, Joseph; Hamlin, Katy L.; Moss, Delynn M.; Nutman, Thomas B.; Priest, Jeffrey W.; Lammie, Patrick J.

    2017-01-01

    Background Serological antibody levels are a sensitive marker of pathogen exposure, and advances in multiplex assays have created enormous potential for large-scale, integrated infectious disease surveillance. Most methods to analyze antibody measurements reduce quantitative antibody levels to seropositive and seronegative groups, but this can be difficult for many pathogens and may provide lower resolution information than quantitative levels. Analysis methods have predominantly maintained a single disease focus, yet integrated surveillance platforms would benefit from methodologies that work across diverse pathogens included in multiplex assays. Methods/Principal findings We developed an approach to measure changes in transmission from quantitative antibody levels that can be applied to diverse pathogens of global importance. We compared age-dependent immunoglobulin G curves in repeated cross-sectional surveys between populations with differences in transmission for multiple pathogens, including: lymphatic filariasis (Wuchereria bancrofti) measured before and after mass drug administration on Mauke, Cook Islands, malaria (Plasmodium falciparum) before and after a combined insecticide and mass drug administration intervention in the Garki project, Nigeria, and enteric protozoans (Cryptosporidium parvum, Giardia intestinalis, Entamoeba histolytica), bacteria (enterotoxigenic Escherichia coli, Salmonella spp.), and viruses (norovirus groups I and II) in children living in Haiti and the USA. Age-dependent antibody curves fit with ensemble machine learning followed a characteristic shape across pathogens that aligned with predictions from basic mechanisms of humoral immunity. Differences in pathogen transmission led to shifts in fitted antibody curves that were remarkably consistent across pathogens, assays, and populations. 
Mean antibody levels correlated strongly with traditional measures of transmission intensity, such as the entomological inoculation rate for P. falciparum (Spearman’s rho = 0.75). In both high- and low-transmission settings, mean antibody curves revealed changes in population mean antibody levels that were masked by seroprevalence measures because changes took place above or below the seropositivity cutoff. Conclusions/Significance Age-dependent antibody curves and summary means provided a robust and sensitive measure of changes in transmission, with greatest sensitivity among young children. The method generalizes to pathogens that can be measured in high-throughput, multiplex serological assays, and scales to surveillance activities that require high spatiotemporal resolution. Our results suggest quantitative antibody levels will be particularly useful to measure differences in exposure for pathogens that elicit a transient antibody response, or for monitoring populations with very high or very low transmission, when seroprevalence is less informative. The approach represents a new opportunity to conduct integrated serological surveillance for neglected tropical diseases, malaria, and other infectious diseases with well-defined antigen targets. PMID:28542223
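The key quantitative point, that seroprevalence can mask changes occurring above the seropositivity cutoff while mean antibody levels record them, is easy to demonstrate with simulated data. The distributions and cutoff below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated log-antibody levels before and after an intervention that
# lowers exposure. Cutoff and distribution parameters are invented.
cutoff = 1.0                              # seropositivity threshold
before = rng.normal(3.0, 0.8, 5_000)      # nearly everyone seropositive
after = rng.normal(2.2, 0.8, 5_000)       # lower levels, mostly still positive

sp_before, sp_after = np.mean(before > cutoff), np.mean(after > cutoff)
mu_before, mu_after = before.mean(), after.mean()

# Seroprevalence barely moves (the shift happens above the cutoff),
# while the mean antibody level clearly records the drop in transmission.
print(f"seroprevalence: {sp_before:.3f} -> {sp_after:.3f}")
print(f"mean level:     {mu_before:.2f} -> {mu_after:.2f}")
```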

  2. Quantitative and qualitative approaches to identifying migration chronology in a continental migrant

    USGS Publications Warehouse

    Beatty, William S.; Kesler, Dylan C.; Webb, Elisabeth B.; Raedeke, Andrew H.; Naylor, Luke W.; Humburg, Dale D.

    2013-01-01

    The degree to which extrinsic factors influence migration chronology in North American waterfowl has not been quantified, particularly for dabbling ducks. Previous studies have examined waterfowl migration using various methods; however, quantitative approaches to define avian migration chronology over broad spatio-temporal scales are limited, and the implications for using different approaches have not been assessed. We used movement data from 19 adult female mallards (Anas platyrhynchos) equipped with solar-powered global positioning system satellite transmitters to evaluate two individual-level approaches for quantifying migration chronology. The first approach defined migration based on individual movements among geopolitical boundaries (state, provincial, international), whereas the second method modeled net displacement as a function of time using nonlinear models. Differences in migration chronologies identified by each of the approaches were examined with analysis of variance. The geopolitical method identified mean autumn migration midpoints at 15 November 2010 and 13 November 2011, whereas the net displacement method identified midpoints at 15 November 2010 and 14 November 2011. The mean midpoints for spring migration were 3 April 2011 and 20 March 2012 using the geopolitical method and 31 March 2011 and 22 March 2012 using the net displacement method. The duration, initiation date, midpoint, and termination date for both autumn and spring migration did not differ between the two individual-level approaches. Although we did not detect differences in migration parameters between the different approaches, the net displacement metric offers broad potential to address questions in movement ecology for migrating species. Ultimately, an objective definition of migration chronology will allow researchers to obtain a comprehensive understanding of the extrinsic factors that drive migration at the individual and population levels. 
As a result, targeted conservation plans can be developed to support planning for habitat management and evaluation of long-term climate effects.
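The net displacement approach can be sketched as follows. Here the migration midpoint is taken as the day the displacement curve crosses half its final value, a simplified stand-in for the nonlinear-model fit used in the study, and the tracking data are invented:

```python
import numpy as np

# Net displacement [km] from the breeding site, sampled every 5 days
# during autumn (values are made up to illustrate the approach).
day = np.arange(0, 60, 5)
disp = np.array([3, 5, 4, 20, 180, 600, 1100, 1350, 1430, 1450, 1460, 1455.0])

half = disp[-1] / 2.0
i = np.argmax(disp >= half)        # first sample at or past halfway

# Migration midpoint by linear interpolation between bracketing samples:
t_mid = np.interp(half, [disp[i - 1], disp[i]], [day[i - 1], day[i]])
print(f"migration midpoint ~ day {t_mid:.1f}")
```

A full nonlinear fit (e.g. a logistic curve of displacement versus time) would additionally yield initiation, termination, and duration estimates.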

  3. Quantitative and qualitative approaches to identifying migration chronology in a continental migrant.

    PubMed

    Beatty, William S; Kesler, Dylan C; Webb, Elisabeth B; Raedeke, Andrew H; Naylor, Luke W; Humburg, Dale D

    2013-01-01

    The degree to which extrinsic factors influence migration chronology in North American waterfowl has not been quantified, particularly for dabbling ducks. Previous studies have examined waterfowl migration using various methods; however, quantitative approaches to define avian migration chronology over broad spatio-temporal scales are limited, and the implications for using different approaches have not been assessed. We used movement data from 19 adult female mallards (Anas platyrhynchos) equipped with solar-powered global positioning system satellite transmitters to evaluate two individual-level approaches for quantifying migration chronology. The first approach defined migration based on individual movements among geopolitical boundaries (state, provincial, international), whereas the second method modeled net displacement as a function of time using nonlinear models. Differences in migration chronologies identified by each of the approaches were examined with analysis of variance. The geopolitical method identified mean autumn migration midpoints at 15 November 2010 and 13 November 2011, whereas the net displacement method identified midpoints at 15 November 2010 and 14 November 2011. The mean midpoints for spring migration were 3 April 2011 and 20 March 2012 using the geopolitical method and 31 March 2011 and 22 March 2012 using the net displacement method. The duration, initiation date, midpoint, and termination date for both autumn and spring migration did not differ between the two individual-level approaches. Although we did not detect differences in migration parameters between the different approaches, the net displacement metric offers broad potential to address questions in movement ecology for migrating species. Ultimately, an objective definition of migration chronology will allow researchers to obtain a comprehensive understanding of the extrinsic factors that drive migration at the individual and population levels. 
As a result, targeted conservation plans can be developed to support planning for habitat management and evaluation of long-term climate effects.

  4. Qualitative and Quantitative Research on the Knowledge of Global Issues, International Attitudes, and Skills of Negotiation among Secondary School Students.

    ERIC Educational Resources Information Center

    Torney-Purta, Judith

    For decades, there has been considerable interest in understanding how attitudes toward other nations and cultures are formed and how people differ in their knowledge and attitudes toward global issues. However, the available research on the topic is sparse and disconnected. This paper suggests a model for looking at the research and presents…

  5. Local Patterns to Global Architectures: Influences of Network Topology on Human Learning.

    PubMed

    Karuza, Elisabeth A; Thompson-Schill, Sharon L; Bassett, Danielle S

    2016-08-01

    A core question in cognitive science concerns how humans acquire and represent knowledge about their environments. To this end, quantitative theories of learning processes have been formalized in an attempt to explain and predict changes in brain and behavior. We connect here statistical learning approaches in cognitive science, which are rooted in the sensitivity of learners to local distributional regularities, and network science approaches to characterizing global patterns and their emergent properties. We focus on innovative work that describes how learning is influenced by the topological properties underlying sensory input. The confluence of these theoretical approaches and this recent empirical evidence motivate the importance of scaling-up quantitative approaches to learning at both the behavioral and neural levels. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Review and Future Research Directions about Major Monitoring Method of Soil Erosion

    NASA Astrophysics Data System (ADS)

    LI, Yue; Bai, Xiaoyong; Tian, Yichao; Luo, Guangjie

    2017-05-01

    Soil erosion is a serious ecological problem that occurs worldwide. Hence, scientific methods for accurate monitoring are needed to obtain soil erosion data. At present, numerous soil erosion monitoring methods are in use internationally. In this paper, we present a systematic classification of these methods based on their date of establishment and type of approach. This classification comprises five categories: the runoff plot method, the erosion pin method, the radionuclide tracer method, model estimation, and the combined 3S-technology method. The backgrounds of their establishment are briefly introduced, the history of their development is reviewed, and the conditions for their application are enumerated. Their respective advantages and disadvantages are compared and analysed, and future prospects for their development are discussed. We conclude that over the past 100 years, soil erosion monitoring methods have constantly evolved to meet the needs of the time. Based on the historical progress of soil erosion monitoring technology, we predict that the field will move toward quantitative, precise, and composite methods. This report serves as a valuable reference for scientific and technological workers globally, especially those engaged in soil erosion research.

  7. Epidemic spreading on adaptively weighted scale-free networks.

    PubMed

    Sun, Mengfeng; Zhang, Haifeng; Kang, Huiyan; Zhu, Guanghu; Fu, Xinchu

    2017-04-01

    We introduce three modified SIS models on scale-free networks that take into account variable population size, nonlinear infectivity, adaptive weights, behavior inertia and time delay, so as to better characterize the actual spread of epidemics. We develop new mathematical methods and techniques to study the dynamics of the models, including the basic reproduction number, and the global asymptotic stability of the disease-free and endemic equilibria. We show the disease-free equilibrium cannot undergo a Hopf bifurcation. We further analyze the effects of local information of diseases and various immunization schemes on epidemic dynamics. We also perform some stochastic network simulations which yield quantitative agreement with the deterministic mean-field approach.
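A plain degree-based mean-field SIS model on a scale-free degree distribution, without any of the paper's extensions (variable population size, nonlinear infectivity, adaptive weights, behavior inertia, time delay), can be sketched as follows; all rates and the degree range are illustrative:

```python
import numpy as np

# Degree classes and a scale-free degree distribution P(k) ~ k^-3.
k = np.arange(3, 101)
pk = k ** -3.0
pk /= pk.sum()
lam, mu = 0.08, 0.1                        # transmission / recovery rates

# rho[k]: infected fraction in degree class k; iterate the mean-field
# ODE d(rho_k)/dt = -mu*rho_k + lam*k*(1-rho_k)*theta to equilibrium,
# where theta is the probability that a randomly followed link points
# to an infective node.
rho = np.full_like(k, 0.01, dtype=float)
for _ in range(20_000):                    # forward-Euler steps, dt = 0.1
    theta = (k * pk * rho).sum() / (k * pk).sum()
    rho += 0.1 * (-mu * rho + lam * k * (1 - rho) * theta)

prevalence = (pk * rho).sum()              # endemic equilibrium prevalence
print(f"endemic prevalence = {prevalence:.4f}")
```

With these rates the basic reproduction number exceeds one, so the iteration settles on a positive endemic equilibrium rather than the disease-free state.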

  8. Systems genetics approaches to understand complex traits

    PubMed Central

    Civelek, Mete; Lusis, Aldons J.

    2014-01-01

    Systems genetics is an approach to understand the flow of biological information that underlies complex traits. It uses a range of experimental and statistical methods to quantitate and integrate intermediate phenotypes, such as transcript, protein or metabolite levels, in populations that vary for traits of interest. Systems genetics studies have provided the first global view of the molecular architecture of complex traits and are useful for the identification of genes, pathways and networks that underlie common human diseases. Given the urgent need to understand how the thousands of loci that have been identified in genome-wide association studies contribute to disease susceptibility, systems genetics is likely to become an increasingly important approach to understanding both biology and disease. PMID:24296534

  9. Comparative Evaluation of Quantitative Test Methods for Gases on a Hard Surface

    DTIC Science & Technology

    2017-02-01

    COMPARATIVE EVALUATION OF QUANTITATIVE TEST METHODS FOR GASES ON A HARD SURFACE. ECBC-TR-1426. Vipin Rastogi... 1. INTRODUCTION. Members of the U.S. Environmental... 2.4 Experimental Design. Each quantitative method was performed three times on three consecutive days. For the CD runs, three

  10. Synthesizing qualitative and quantitative evidence on non-financial access barriers: implications for assessment at the district level.

    PubMed

    O'Connell, Thomas S; Bedford, K Juliet A; Thiede, Michael; McIntyre, Di

    2015-06-09

    A key element of the global drive to universal health coverage is ensuring access to needed health services for everyone, and to pursue this goal in an equitable way. This requires concerted efforts to reduce disparities in access through understanding and acting on barriers facing communities with the lowest utilisation levels. Financial barriers dominate the empirical literature on health service access. Unless the full range of access barriers is investigated, efforts to promote equitable access to health care are unlikely to succeed. This paper therefore focuses on exploring the nature and extent of non-financial access barriers. We draw upon two structured literature reviews on barriers to access and utilization of maternal, newborn and child health services in Ghana, Bangladesh, Vietnam and Rwanda. One review analyses access barriers identified in published literature using qualitative research methods; the other in published literature using quantitative analysis of household survey data. We then synthesised the key qualitative and quantitative findings through a conjoint iterative analysis. Five dominant themes on non-financial access barriers were identified: ethnicity; religion; physical accessibility; decision-making, gender and autonomy; and knowledge, information and education. The analysis highlighted that non-financial factors pose considerable barriers to access, many of which relate to the acceptability dimension of access and are challenging to address. Another key finding is that quantitative research methods, while yielding important findings, are inadequate for understanding non-financial access barriers in sufficient detail to develop effective responses. Qualitative research is critical in filling this gap. The analysis also indicates that the nature of non-financial access barriers varies considerably, not only between countries but also between different communities within individual countries. 
To adequately understand access barriers as a basis for developing effective strategies to address them, mixed-methods approaches are required. From an equity perspective, communities with the lowest utilisation levels should be prioritised and the access barriers specific to that community identified. It is, therefore, critical to develop approaches that can be used at the district level to diagnose and act upon access barriers if we are to pursue an equitable path to universal health coverage.

  11. Comparison of High Resolution Quantitative Extreme Precipitation Estimation from GPM Dual-frequency Radar and S-band Radar Observation over Southern China

    NASA Astrophysics Data System (ADS)

    Zhang, A.; Chen, S.; Fan, S.; Min, C.

    2017-12-01

    Precipitation is one of the basic elements of regional and global climate change. Not only does precipitation have a great impact on the earth's hydrosphere, it also plays a crucial role in the global energy balance. S-band ground-based dual-polarization radar performs excellently at identifying the different phase states of precipitation, which can dramatically improve the accuracy of hail identification and quantitative precipitation estimation (QPE). However, ground-based radar cannot measure precipitation in mountains, sparsely populated plateaus, deserts, and oceans because of gaps in ground-based radar coverage. The United States National Aeronautics and Space Administration (NASA) and the Japan Aerospace Exploration Agency (JAXA) launched the Global Precipitation Measurement (GPM) mission almost three years ago. GPM is equipped with a GPM Microwave Imager (GMI) and a Dual-frequency (Ku- and Ka-band) Precipitation Radar (DPR) that covers the globe between 65°S and 65°N. Because the main parameters and the detection method of the DPR differ from those of ground-based radars, the DPR's reliability and capability need to be investigated and evaluated against ground-based radar. This study compares precipitation derived from ground-based radar measurements to that derived from DPR observations. The ground-based radar is an S-band dual-polarization radar deployed near an airport in the west of Zhuhai city. The ground-based quantitative precipitation estimates have a high resolution of 1 km × 1 km × 6 min, and the radar covers the whole Pearl River Delta of China, including Hong Kong and Macao. To quantify the DPR's precipitation quantification capability relative to the S-band radar, the statistical metrics used in this study are the difference (Dif) between the DPR and S-band radar observations, the root-mean-squared error (RMSE), and the correlation coefficient (CC). 
Additionally, the Probability of Detection (POD) and False Alarm Ratio (FAR) are used to further evaluate the DPR's rain-detection capability. The comparisons performed between the DPR and the S-band radar are expected to provide a useful reference not only for algorithm developers but also for end users in hydrology, ecology, weather forecast services, and so on.
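The comparison metrics named above (Dif, RMSE, CC, POD, FAR) can be computed on matched DPR and S-band rain-rate pixels as follows; the sample values and the rain/no-rain threshold are illustrative only:

```python
import numpy as np

# Toy matched rain-rate pixels [mm/h]; S-band radar is the reference.
sband = np.array([0.0, 0.0, 1.2, 4.5, 10.0, 22.0, 0.8, 0.0])
dpr   = np.array([0.0, 0.5, 1.0, 5.0,  8.5, 25.0, 0.0, 0.0])
thr = 0.1                                    # assumed rain/no-rain threshold

dif = np.mean(dpr - sband)                   # Dif: mean difference (bias)
rmse = np.sqrt(np.mean((dpr - sband) ** 2))  # root-mean-squared error
cc = np.corrcoef(dpr, sband)[0, 1]           # correlation coefficient

# Contingency-table scores against the reference radar:
rain_ref, rain_est = sband > thr, dpr > thr
hits = np.sum(rain_ref & rain_est)
misses = np.sum(rain_ref & ~rain_est)
false_alarms = np.sum(~rain_ref & rain_est)
pod = hits / (hits + misses)                 # Probability of Detection
far = false_alarms / (hits + false_alarms)   # False Alarm Ratio

print(f"Dif={dif:.2f} RMSE={rmse:.2f} CC={cc:.3f} POD={pod:.2f} FAR={far:.2f}")
```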

  12. Chronic Obstructive Pulmonary Disease Exacerbations in the COPDGene Study: Associated Radiologic Phenotypes

    PubMed Central

    Kazerooni, Ella A.; Lynch, David A.; Liu, Lyrica X.; Murray, Susan; Curtis, Jeffrey L.; Criner, Gerard J.; Kim, Victor; Bowler, Russell P.; Hanania, Nicola A.; Anzueto, Antonio R.; Make, Barry J.; Hokanson, John E.; Crapo, James D.; Silverman, Edwin K.; Martinez, Fernando J.; Washko, George R.

    2011-01-01

    Purpose: To test the hypothesis—given the increasing emphasis on quantitative computed tomographic (CT) phenotypes of chronic obstructive pulmonary disease (COPD)—that a relationship exists between COPD exacerbation frequency and quantitative CT measures of emphysema and airway disease. Materials and Methods: This research protocol was approved by the institutional review board of each participating institution, and all participants provided written informed consent. One thousand two subjects who were enrolled in the COPDGene Study and met the GOLD (Global Initiative for Chronic Obstructive Lung Disease) criteria for COPD with quantitative CT analysis were included. Total lung emphysema percentage was measured by using the attenuation mask technique with a −950-HU threshold. An automated program measured the mean wall thickness and mean wall area percentage in six segmental bronchi. The frequency of COPD exacerbation in the prior year was determined by using a questionnaire. Statistical analysis was performed to examine the relationship of exacerbation frequency with lung function and quantitative CT measurements. Results: In a multivariate analysis adjusted for lung function, bronchial wall thickness and total lung emphysema percentage were associated with COPD exacerbation frequency. Each 1-mm increase in bronchial wall thickness was associated with a 1.84-fold increase in annual exacerbation rate (P = .004). For patients with 35% or greater total emphysema, each 5% increase in emphysema was associated with a 1.18-fold increase in this rate (P = .047). Conclusion: Greater lung emphysema and airway wall thickness were associated with COPD exacerbations, independent of the severity of airflow obstruction. Quantitative CT can help identify subgroups of patients with COPD who experience exacerbations for targeted research and therapy development for individual phenotypes. 
© RSNA, 2011 Supplemental material: http://radiology.rsna.org/lookup/suppl/doi:10.1148/radiol.11110173/-/DC1 PMID:21788524
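The two quantitative CT measures used in this record are straightforward to express in code. The sketch below assumes the lung has already been segmented; the attenuation-mask emphysema measure is the fraction of lung voxels below −950 HU, and airway wall area percentage follows from outer and lumen cross-sectional areas. Function names and inputs are illustrative, not the COPDGene pipeline.

```python
import numpy as np

def emphysema_percent(lung_hu, threshold=-950):
    """Percentage of lung voxels below the attenuation threshold:
    the low-attenuation-area emphysema measure. `lung_hu` is a
    hypothetical array of Hounsfield-unit values for voxels already
    segmented as lung."""
    lung_hu = np.asarray(lung_hu, float)
    return 100.0 * np.mean(lung_hu < threshold)

def wall_area_percent(outer_area, lumen_area):
    """Airway wall area percentage from total (outer) and lumen
    cross-sectional areas, in any consistent units."""
    return 100.0 * (outer_area - lumen_area) / outer_area
```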

  13. Making predictions of mangrove deforestation: a comparison of two methods in Kenya.

    PubMed

    Rideout, Alasdair J R; Joshi, Neha P; Viergever, Karin M; Huxham, Mark; Briers, Robert A

    2013-11-01

    Deforestation of mangroves is of global concern given their importance for carbon storage, biogeochemical cycling and the provision of other ecosystem services, but the links between rates of loss and potential drivers or risk factors are rarely evaluated. Here, we identified key drivers of mangrove loss in Kenya and compared two different approaches to predicting risk. Risk factors tested included various possible predictors of anthropogenic deforestation, related to population, suitability for land use change and accessibility. Two approaches were taken to modelling risk: a quantitative statistical approach and a qualitative categorical ranking approach. A quantitative model linking rates of loss to risk factors was constructed based on generalized least squares regression and using mangrove loss data from 1992 to 2000. Population density, soil type and proximity to roads were the most important predictors. In order to validate this model it was used to generate a map of losses of Kenyan mangroves predicted to have occurred between 2000 and 2010. The qualitative categorical model was constructed using data from the same selection of variables, with the coincidence of different risk factors in particular mangrove areas used in an additive manner to create a relative risk index which was then mapped. Quantitative predictions of loss were significantly correlated with the actual loss of mangroves between 2000 and 2010 and the categorical risk index values were also highly correlated with the quantitative predictions. Hence, in this case the relatively simple categorical modelling approach was of similar predictive value to the more complex quantitative model of mangrove deforestation. The advantages and disadvantages of each approach are discussed, and the implications for mangroves are outlined. © 2013 Blackwell Publishing Ltd.
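The additive categorical approach described above is simple enough to sketch directly: each risk factor is pre-classified into an ordinal rank and the ranks are summed into a relative risk index. The factor set and the 1-3 scale below are illustrative assumptions, not the paper's exact coding.

```python
def risk_index(pop_density_rank, soil_rank, road_rank):
    """Additive relative-risk index for a mangrove area. Each risk
    factor is pre-ranked into an ordinal class (assumed here:
    1 = low, 2 = medium, 3 = high risk) and the ranks are summed,
    so co-occurring risk factors push the index upward."""
    return pop_density_rank + soil_rank + road_rank
```

An area with dense nearby population, vulnerable soil and good road access scores the maximum index; the resulting values can then be mapped and compared against quantitative predictions, as the study did.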

  14. From themes to hypotheses: following up with quantitative methods.

    PubMed

    Morgan, David L

    2015-06-01

    One important category of mixed-methods research designs consists of quantitative studies that follow up on qualitative research. In this case, the themes that serve as the results from the qualitative methods generate hypotheses for testing through the quantitative methods. That process requires operationalization to translate the concepts from the qualitative themes into quantitative variables. This article illustrates these procedures with examples that range from simple operationalization to the evaluation of complex models. It concludes with an argument for not only following up qualitative work with quantitative studies but also the reverse, and doing so by going beyond integrating methods within single projects to include broader mutual attention from qualitative and quantitative researchers who work in the same field. © The Author(s) 2015.

  15. Long Term Baseline Atmospheric Monitoring

    ERIC Educational Resources Information Center

    Goldman, Mark A.

    1975-01-01

    Describes a program designed to measure the normal concentrations of certain chemical and physical parameters of the atmosphere so that quantitative estimates can be made of local, regional, and global pollution. (GS)

  16. Are globals for health, well-being and quality of life interchangeable? A mixed methods study in ankylosing spondylitis patients and controls.

    PubMed

    van Tubergen, Astrid; Gulpen, Anouk; Landewé, Robert; Boonen, Annelies

    2018-05-19

    Patients' experience of overall health is often assessed through a single-item global question. Here, we evaluated among patients with AS and population controls whether single-item questions on the constructs health, well-being and quality of life (QoL) are interchangeable. In a mixed quantitative and qualitative approach, all subjects scored the three single-item globals on a numeric rating scale (0-10, best). Next, they indicated for each of the questions which aspects they had been considering when scoring. After forced reflection, globals were scored again. Dissimilarities in scores among constructs, between patients and controls, and before or after reflection were tested using mixed linear models. Themes identified per construct in the qualitative part were linked to the International Classification of Functioning, Disability and Health. The type of themes per construct was compared between patients and controls. Sixty-eight AS patients and 84 controls completed the questionnaire. Patients scored significantly worse on each global than controls (mean 6.1-6.3 vs 7.2-7.6, all P < 0.01). Within groups, however, no significant differences in scores on each construct, or in scores before or after forced reflection were found. Health-related themes were relevant to each construct for patients, but were less relevant for controls when considering well-being and QoL. Emotional functions were relevant to well-being in all participants. Social roles and financial situation were more frequently related to well-being and QoL in controls. While patients and controls identified content-related dissimilarities between the three constructs studied, this was not reflected in different scores of the globals.

  17. Quantitative Proteomics of Sleep-Deprived Mouse Brains Reveals Global Changes in Mitochondrial Proteins

    PubMed Central

    Li, Tie-Mei; Zhang, Ju-en; Lin, Rui; Chen, She; Luo, Minmin; Dong, Meng-Qiu

    2016-01-01

    Sleep is a ubiquitous, tightly regulated, and evolutionarily conserved behavior observed in almost all animals. Prolonged sleep deprivation can be fatal, indicating that sleep is a physiological necessity. However, little is known about its core function. To gain insight into this mystery, we used advanced quantitative proteomics technology to survey the global changes in brain protein abundance. Aiming to gain a comprehensive profile, our proteomics workflow included filter-aided sample preparation (FASP), which increased the coverage of membrane proteins; tandem mass tag (TMT) labeling, for relative quantitation; and high resolution, high mass accuracy, high throughput mass spectrometry (MS). In total, we obtained the relative abundance ratios of 9888 proteins encoded by 6070 genes. Interestingly, we observed significant enrichment for mitochondrial proteins among the differentially expressed proteins. This finding suggests that sleep deprivation strongly affects signaling pathways that govern either energy metabolism or responses to mitochondrial stress. Additionally, the differentially-expressed proteins are enriched in pathways implicated in age-dependent neurodegenerative diseases, including Parkinson’s, Huntington’s, and Alzheimer’s, hinting at possible connections between sleep loss, mitochondrial stress, and neurodegeneration. PMID:27684481

  18. Advancing the study of violence against women using mixed methods: integrating qualitative methods into a quantitative research program.

    PubMed

    Testa, Maria; Livingston, Jennifer A; VanZile-Tamsen, Carol

    2011-02-01

    A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women's sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative research are provided.

  19. From observational to analytical morphology of the stratum corneum: progress avoiding hazardous animal and human testings

    PubMed Central

    Piérard, Gérald E; Courtois, Justine; Ritacco, Caroline; Humbert, Philippe; Fanian, Ferial; Piérard-Franchimont, Claudine

    2015-01-01

    Background In cosmetic science, noninvasive sampling of the upper part of the stratum corneum is conveniently performed using strippings with adhesive-coated discs (SACD) and cyanoacrylate skin surface strippings (CSSSs). Methods Under controlled conditions, it is possible to scrutinize SACD and CSSS with objectivity using appropriate methods of analytical morphology. These procedures apply to a series of clinical conditions including xerosis grading, comedometry, corneodynamics, corneomelametry, corneosurfametry, corneoxenometry, and dandruff assessment. Results With any of the analytical evaluations, SACD and CSSS provide specific salient information that is useful in the field of cosmetology. In particular, both methods appear valuable and complementary in assessing the human skin compatibility of personal skincare products. Conclusion A set of quantitative analytical methods applicable to the minimally invasive and low-cost SACD and CSSS procedures allow for a sound assessment of cosmetic effects on the stratum corneum. Under regular conditions, both methods are painless and do not induce adverse events. Globally, CSSS appears more precise and informative than the regular SACD stripping. PMID:25767402

  20. Rapid comparison of properties on protein surface

    PubMed Central

    Sael, Lee; La, David; Li, Bin; Rustamov, Raif; Kihara, Daisuke

    2008-01-01

    The mapping of physicochemical characteristics onto the surface of a protein provides crucial insights into its function and evolution. This information can be further used in the characterization and identification of similarities within protein surface regions. We propose a novel method which quantitatively compares global and local properties on the protein surface. We have tested the method on comparison of electrostatic potentials and hydrophobicity. The method is based on 3D Zernike descriptors, which provide a compact representation of a given property defined on a protein surface. Compactness and rotational invariance of this descriptor enable fast comparison suitable for database searches. The usefulness of this method is exemplified by studying several protein families including globins, thermophilic and mesophilic proteins, and active sites of TIM β/α barrel proteins. In all the cases studied, the descriptor is able to cluster proteins into functionally relevant groups. The proposed approach can also be easily extended to other surface properties. This protein surface-based approach will add a new way of viewing and comparing proteins to conventional methods, which compare proteins in terms of their primary sequence or tertiary structure. PMID:18618695

  1. Rapid comparison of properties on protein surface.

    PubMed

    Sael, Lee; La, David; Li, Bin; Rustamov, Raif; Kihara, Daisuke

    2008-10-01

    The mapping of physicochemical characteristics onto the surface of a protein provides crucial insights into its function and evolution. This information can be further used in the characterization and identification of similarities within protein surface regions. We propose a novel method which quantitatively compares global and local properties on the protein surface. We have tested the method on comparison of electrostatic potentials and hydrophobicity. The method is based on 3D Zernike descriptors, which provide a compact representation of a given property defined on a protein surface. Compactness and rotational invariance of this descriptor enable fast comparison suitable for database searches. The usefulness of this method is exemplified by studying several protein families including globins, thermophilic and mesophilic proteins, and active sites of TIM beta/alpha barrel proteins. In all the cases studied, the descriptor is able to cluster proteins into functionally relevant groups. The proposed approach can also be easily extended to other surface properties. This protein surface-based approach will add a new way of viewing and comparing proteins to conventional methods, which compare proteins in terms of their primary sequence or tertiary structure.
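The practical payoff of rotational invariance, as the abstract notes, is that descriptor vectors can be compared by plain vector distance, with no structural alignment step. A minimal sketch of such a database search, with hypothetical protein names and a toy descriptor length rather than real 3D Zernike moments:

```python
import numpy as np

def descriptor_distance(d1, d2):
    """Euclidean distance between two descriptor vectors. Because
    rotationally invariant descriptors need no alignment, this single
    vector operation is what makes database-scale search fast."""
    d1, d2 = np.asarray(d1, float), np.asarray(d2, float)
    return float(np.linalg.norm(d1 - d2))

def rank_database(query, database):
    """Rank (name, descriptor) entries by distance to a query
    descriptor; names and vector length are illustrative."""
    return sorted(database, key=lambda item: descriptor_distance(query, item[1]))
```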

  2. Evaluation of integrated assessment model hindcast experiments: a case study of the GCAM 3.0 land use module

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snyder, Abigail C.; Link, Robert P.; Calvin, Katherine V.

    Hindcasting experiments (conducting a model forecast for a time period in which observational data are available) are being undertaken increasingly often by the integrated assessment model (IAM) community, across many scales of models. When they are undertaken, the results are often evaluated using global aggregates or otherwise highly aggregated skill scores that mask deficiencies. We select a set of deviation-based measures that can be applied on different spatial scales (regional versus global) to make evaluating the large number of variable–region combinations in IAMs more tractable. We also identify performance benchmarks for these measures, based on the statistics of the observational dataset, that allow a model to be evaluated in absolute terms rather than relative to the performance of other models at similar tasks. An ideal evaluation method for hindcast experiments in IAMs would feature both absolute measures for evaluation of a single experiment for a single model and relative measures to compare the results of multiple experiments for a single model or the same experiment repeated across multiple models, such as in community intercomparison studies. The performance benchmarks highlight the use of this scheme for model evaluation in absolute terms, providing information about the reasons a model may perform poorly on a given measure and therefore identifying opportunities for improvement. To demonstrate the use of and types of results possible with the evaluation method, the measures are applied to the results of a past hindcast experiment focusing on land allocation in the Global Change Assessment Model (GCAM) version 3.0. The question of how to more holistically evaluate models as complex as IAMs is an area for future research. We find quantitative evidence that global aggregates alone are not sufficient for evaluating IAMs that require global supply to equal global demand at each time period, such as GCAM. 
The results of this work indicate it is unlikely that a single evaluation measure for all variables in an IAM exists, and therefore sector-by-sector evaluation may be necessary.

  3. Evaluation of integrated assessment model hindcast experiments: a case study of the GCAM 3.0 land use module

    DOE PAGES

    Snyder, Abigail C.; Link, Robert P.; Calvin, Katherine V.

    2017-11-29

    Hindcasting experiments (conducting a model forecast for a time period in which observational data are available) are being undertaken increasingly often by the integrated assessment model (IAM) community, across many scales of models. When they are undertaken, the results are often evaluated using global aggregates or otherwise highly aggregated skill scores that mask deficiencies. We select a set of deviation-based measures that can be applied on different spatial scales (regional versus global) to make evaluating the large number of variable–region combinations in IAMs more tractable. We also identify performance benchmarks for these measures, based on the statistics of the observational dataset, that allow a model to be evaluated in absolute terms rather than relative to the performance of other models at similar tasks. An ideal evaluation method for hindcast experiments in IAMs would feature both absolute measures for evaluation of a single experiment for a single model and relative measures to compare the results of multiple experiments for a single model or the same experiment repeated across multiple models, such as in community intercomparison studies. The performance benchmarks highlight the use of this scheme for model evaluation in absolute terms, providing information about the reasons a model may perform poorly on a given measure and therefore identifying opportunities for improvement. To demonstrate the use of and types of results possible with the evaluation method, the measures are applied to the results of a past hindcast experiment focusing on land allocation in the Global Change Assessment Model (GCAM) version 3.0. The question of how to more holistically evaluate models as complex as IAMs is an area for future research. We find quantitative evidence that global aggregates alone are not sufficient for evaluating IAMs that require global supply to equal global demand at each time period, such as GCAM. 
The results of this work indicate it is unlikely that a single evaluation measure for all variables in an IAM exists, and therefore sector-by-sector evaluation may be necessary.

  4. Evaluation of integrated assessment model hindcast experiments: a case study of the GCAM 3.0 land use module

    NASA Astrophysics Data System (ADS)

    Snyder, Abigail C.; Link, Robert P.; Calvin, Katherine V.

    2017-11-01

    Hindcasting experiments (conducting a model forecast for a time period in which observational data are available) are being undertaken increasingly often by the integrated assessment model (IAM) community, across many scales of models. When they are undertaken, the results are often evaluated using global aggregates or otherwise highly aggregated skill scores that mask deficiencies. We select a set of deviation-based measures that can be applied on different spatial scales (regional versus global) to make evaluating the large number of variable-region combinations in IAMs more tractable. We also identify performance benchmarks for these measures, based on the statistics of the observational dataset, that allow a model to be evaluated in absolute terms rather than relative to the performance of other models at similar tasks. An ideal evaluation method for hindcast experiments in IAMs would feature both absolute measures for evaluation of a single experiment for a single model and relative measures to compare the results of multiple experiments for a single model or the same experiment repeated across multiple models, such as in community intercomparison studies. The performance benchmarks highlight the use of this scheme for model evaluation in absolute terms, providing information about the reasons a model may perform poorly on a given measure and therefore identifying opportunities for improvement. To demonstrate the use of and types of results possible with the evaluation method, the measures are applied to the results of a past hindcast experiment focusing on land allocation in the Global Change Assessment Model (GCAM) version 3.0. The question of how to more holistically evaluate models as complex as IAMs is an area for future research. We find quantitative evidence that global aggregates alone are not sufficient for evaluating IAMs that require global supply to equal global demand at each time period, such as GCAM. 
The results of this work indicate it is unlikely that a single evaluation measure for all variables in an IAM exists, and therefore sector-by-sector evaluation may be necessary.

  5. Studying learning in the healthcare setting: the potential of quantitative diary methods.

    PubMed

    Ciere, Yvette; Jaarsma, Debbie; Visser, Annemieke; Sanderman, Robbert; Snippe, Evelien; Fleer, Joke

    2015-08-01

    Quantitative diary methods are longitudinal approaches that involve the repeated measurement of aspects of peoples' experience of daily life. In this article, we outline the main characteristics and applications of quantitative diary methods and discuss how their use may further research in the field of medical education. Quantitative diary methods offer several methodological advantages, such as measuring aspects of learning with great detail, accuracy and authenticity. Moreover, they enable researchers to study how and under which conditions learning in the health care setting occurs and in which way learning can be promoted. Hence, quantitative diary methods may contribute to theory development and the optimization of teaching methods in medical education.

  6. Quantitative analysis of ecological effects for land use planning based on ecological footprint method: a case research in Nanyang City

    NASA Astrophysics Data System (ADS)

    Zhang, Jing; Liu, Yaolin; Chen, Xinming

    2008-10-01

    Research on the coordinated development of land use and ecological construction is a new problem arising with the growth of the national economy; its aim is to advance economic development while protecting the eco-environment, so as to achieve regional sustainable development. Evaluating human effects on the ecosystem with a comprehensive, scientific and quantitative method is a critical issue in general land use planning. Ecological footprint methodology, an accessible tool applicable to global issues, is well suited to quantifying humanity's consumption of natural capital, to overall assessments of human impact on the Earth, and to general land use planning. However, quantitative studies on the development trends of ecological footprint (EF) time series and biological capacity (BC) time series in a given region are still rare. Taking Nanyang City as a case study, this paper presents two quantitative indices defined over a time scale, the change rate and the scissors difference, to quantitatively analyze the trends of EF and BC from 1997 to 2004 and to evaluate the ecological effects of the general land use plan from 1997 to 2010. The results showed that: (1) in Nanyang City, the per capita EF and BC trended in opposite directions, and the ecological deficit grew from 1997 to 2010; (2) the gap between the two trends widened rapidly, so the conflict between EF and BC was aggravated from 1997 to 2010; and (3) the general land use plan (1997-2010) of Nanyang City produced some positive effects on the local ecosystem, but the biological capacity expected for 2010 can hardly be realized if this trend continues. 
Therefore, this paper introduces a "trinity" land use model guided by an environment-friendly land use pattern and based on the actual situation of Nanyang City, taking the systemic integration of urban, village and suburban land use as the principal part, and land development and reorganization together with ecological environment construction as the key points.
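The two indices named in this record can be given a concrete, hedged reading: the change rate as a trend slope normalized by the series mean, and the scissors difference as the angle between the fitted EF and BC trend lines. Both definitions are plausible interpretations for illustration only; the paper's exact formulas are not given in the abstract.

```python
import numpy as np

def trend_slope(years, series):
    """Least-squares linear trend (units of the series per year)."""
    return np.polyfit(years, series, 1)[0]

def change_rate(years, series):
    """Relative change rate: trend slope normalized by the series
    mean (an assumed reading of the paper's index)."""
    return trend_slope(years, series) / np.mean(series)

def scissors_difference(years, ef, bc):
    """'Scissors difference' read as the angle, in degrees, between
    the fitted per capita EF and BC trend lines; diverging trends
    open the 'scissors' wider."""
    a_ef = np.degrees(np.arctan(trend_slope(years, ef)))
    a_bc = np.degrees(np.arctan(trend_slope(years, bc)))
    return abs(a_ef - a_bc)
```

Under this reading, an EF series rising while BC falls at the same rate gives the maximal 90° scissors opening for unit slopes, matching the paper's picture of a widening conflict between the two curves.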

  7. Sensitivity analysis of infectious disease models: methods, advances and their application

    PubMed Central

    Wu, Jianyong; Dhingra, Radhika; Gambhir, Manoj; Remais, Justin V.

    2013-01-01

    Sensitivity analysis (SA) can aid in identifying influential model parameters and optimizing model structure, yet infectious disease modelling has yet to adopt advanced SA techniques that are capable of providing considerable insights over traditional methods. We investigate five global SA methods—scatter plots, the Morris and Sobol’ methods, Latin hypercube sampling-partial rank correlation coefficient and the sensitivity heat map method—and detail their relative merits and pitfalls when applied to a microparasite (cholera) and macroparasite (schistosomiasis) transmission model. The methods investigated yielded similar results with respect to identifying influential parameters, but offered specific insights that vary by method. The classical methods differed in their ability to provide information on the quantitative relationship between parameters and model output, particularly over time. The heat map approach provides information about the group sensitivity of all model state variables, and the parameter sensitivity spectrum obtained using this method reveals the sensitivity of all state variables to each parameter over the course of the simulation period, especially valuable for expressing the dynamic sensitivity of a microparasite epidemic model to its parameters. A summary comparison is presented to aid infectious disease modellers in selecting appropriate methods, with the goal of improving model performance and design. PMID:23864497
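Of the five methods surveyed, Latin hypercube sampling with partial rank correlation coefficients (LHS-PRCC) is the most compact to sketch: sample each parameter once per stratum, run the model, rank-transform everything, then correlate each parameter's rank residuals with the output's rank residuals after regressing out the other parameters. The sketch below uses a toy deterministic "model" in place of a transmission model.

```python
import numpy as np
from scipy import stats

def latin_hypercube(n_samples, n_params, rng):
    """Basic Latin hypercube sample on [0, 1): one random point per
    stratum, independently permuted for each parameter."""
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_params))) / n_samples
    out = np.empty_like(u)
    for j in range(n_params):
        out[:, j] = rng.permutation(u[:, j])
    return out

def prcc(x, y):
    """Partial rank correlation of each column of x with y:
    rank-transform, regress the other columns out of both the
    parameter and the output, then correlate the residuals."""
    rx = np.apply_along_axis(stats.rankdata, 0, x)
    ry = stats.rankdata(y)
    n, k = rx.shape
    coeffs = []
    for j in range(k):
        others = np.column_stack([np.ones(n), np.delete(rx, j, axis=1)])
        beta_x, *_ = np.linalg.lstsq(others, rx[:, j], rcond=None)
        beta_y, *_ = np.linalg.lstsq(others, ry, rcond=None)
        res_x = rx[:, j] - others @ beta_x
        res_y = ry - others @ beta_y
        coeffs.append(stats.pearsonr(res_x, res_y)[0])
    return np.array(coeffs)
```

A parameter that drives the output strongly and monotonically gets a PRCC near ±1; an inert parameter stays near 0, which is the basis for ranking parameter influence in a transmission model.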

  8. Oceanic Fluxes of Mass, Heat and Freshwater: A Global Estimate and Perspective

    NASA Technical Reports Server (NTRS)

    MacDonald, Alison Marguerite

    1995-01-01

    Data from fifteen globally distributed, modern, high resolution, hydrographic oceanic transects are combined in an inverse calculation using large scale box models. The models provide estimates of the global meridional heat and freshwater budgets and are used to examine the sensitivity of the global circulation, both inter- and intra-basin exchange rates, to a variety of external constraints provided by estimates of Ekman, boundary current and throughflow transports. A solution is found which is consistent with both the model physics and the global data set, despite a twenty five year time span and a lack of seasonal consistency among the data. The overall pattern of the global circulation suggested by the models is similar to that proposed in previously published local studies and regional reviews. However, significant qualitative and quantitative differences exist. These differences are due both to the model definition and to the global nature of the data set.

  9. A trend analysis of surgical operations under a global payment system in Tehran, Iran (2005–2015)

    PubMed Central

    Goudari, Faranak Behzadi; Rashidian, Arash; Arab, Mohammad; Mahmoudi, Mahmood

    2018-01-01

    Background The global payment system is an early example of a per-case payment system; in Iran it covers 60 commonly used surgical operations for which payment is based on the average cost per case. Objective The aim of the study was to determine whether the trends of global operations decreased, increased or remained unchanged. Methods In this retrospective longitudinal study, data on the 60 primary global surgery codes were gathered from the Tehran Health Insurance Organization for each month of the ten-year period 2005–2015. Of the 60 surgery codes, acceptable data were available for only 46, based on the insurance documents sent by medical centers. A quantitative time-series analysis using a regression model was performed with STATA software v.11. Results Some global surgery codes trended upward and some downward. Of the N codes, N83, N20, N28, N63 and N93 trended upward (p<0.05), while N32, N43, N81 and N90 showed a significant downward trend (p<0.05). Similarly, all H codes except H18 had a significant upward trend (p<0.000). The K codes K45, K56 and K81 likewise increased. The S codes showed both increasing and decreasing trends, whereas none of the O codes changed over time. Other global surgical codes, including C61, E07, M51, L60 and J98 (p<0.000), I84 (p<0.031) and I86 (p<0.000), showed upward or downward trends. The overall trend of global surgeries was significantly upward (B=24.26109, p<0.000). Conclusion The varying trends of global surgeries can partly reflect the behavior of service providers seeking to increase their profits and minimize their costs. PMID:29765576
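The per-code trend test in this record is an ordinary least-squares regression of monthly counts against time, reporting a slope (B) and its p-value. A minimal sketch with a hypothetical monthly series (the study's actual model and covariates are not specified in the abstract):

```python
import numpy as np
from scipy import stats

def monthly_trend(counts):
    """OLS trend of a monthly surgery-count series: returns the
    slope (surgeries per month) and its p-value, the two quantities
    reported per surgery code."""
    t = np.arange(len(counts))               # month index 0, 1, 2, ...
    result = stats.linregress(t, counts)
    return result.slope, result.pvalue
```

A steadily rising series yields a positive slope with a very small p-value, matching the paper's reporting style (e.g. "B=24.26109, p<0.000" for the aggregate trend).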

  10. Towards a Quantitative Use of Satellite Remote Sensing in Crop Growth Models for Large Scale Agricultural Production Estimate (Invited)

    NASA Astrophysics Data System (ADS)

    Defourny, P.

    2013-12-01

    The development of better agricultural monitoring capabilities is clearly considered a critical step for strengthening food production information and market transparency, thanks to timely information about crop status, crop area and yield forecasts. The documentation of global production will help tackle price volatility by allowing local, national and international operators to make decisions and anticipate market trends with reduced uncertainty. Several operational agricultural monitoring systems are currently operating at national and international scales. Most are based on methods derived from pioneering experiences completed some decades ago, and use remote sensing to qualitatively compare one year to the others to estimate the risks of deviation from a normal year. The GEO Agricultural Monitoring Community of Practice described the current monitoring capabilities at the national and global levels. An overall diagram summarized the diverse relationships between satellite EO and agriculture information. There is now a large gap between the current operational large-scale systems and the scientific state of the art in crop remote sensing, probably because the latter has mainly focused on local studies. The poor availability of suitable in-situ and satellite data over extended areas hampers large-scale demonstrations, preventing the much-needed upscaling research effort. For the cropland extent, this paper reports a recent research achievement using the full ENVISAT MERIS 300 m archive in the context of the ESA Climate Change Initiative. A flexible combination of classification methods, depending on the region of the world, allows mapping the land cover as well as the global croplands at 300 m for the period 2008-2012. This wall-to-wall product is then compared with the FP7 Geoland-2 results obtained using a Landsat-based sampling strategy over the IGADD countries. 
On the other hand, the vegetation indices and biophysical variables such as the Green Area Index (GAI), fAPAR and fCover, usually retrieved from MODIS, MERIS and SPOT-Vegetation, describe the quality of green vegetation development. The GLOBAM (Belgium) and EU FP-7 MOCCCASIN (Russia) projects improved the standard products and demonstrated them at large scale. The GAI retrieved from MODIS time series using a purity index criterion successfully depicted the inter-annual variability. Furthermore, the quantitative assimilation of these GAI time series into a crop growth model improved the yield estimates across years. These results showed that GAI assimilation works best at the district or provincial level. In the context of the GEO Ag., the Joint Experiment for Crop Assessment and Monitoring (JECAM) was designed to enable the global agricultural monitoring community to compare such methods and results over a variety of regional cropping systems. For a network of test sites around the world, satellite and field measurements are currently being collected and will be made available for collaborative efforts. This experiment should facilitate international standards for data products and reporting, eventually supporting the development of a global system of systems for agricultural crop assessment and monitoring.

  11. Validation of Ocean Color Satellite Data Products in Under Sampled Marine Areas. Chapter 6

    NASA Technical Reports Server (NTRS)

    Subramaniam, Ajit; Hood, Raleigh R.; Brown, Christopher W.; Carpenter, Edward J.; Capone, Douglas G.

    2001-01-01

    The planktonic marine cyanobacterium, Trichodesmium sp., is broadly distributed throughout the oligotrophic marine tropical and sub-tropical oceans. Trichodesmium, which typically occurs in macroscopic bundles or colonies, is noteworthy for its ability to form large surface aggregations and to fix dinitrogen gas. The latter is important because primary production supported by N2 fixation can result in a net export of carbon from the surface waters to deep ocean and may therefore play a significant role in the global carbon cycle. However, information on the distribution and density of Trichodesmium from shipboard measurements through the oligotrophic oceans is very sparse. Such estimates are required to quantitatively estimate total global rates of N2 fixation. As a result current global rate estimates are highly uncertain. Thus in order to understand the broader biogeochemical importance of Trichodesmium and N2 fixation in the oceans, we need better methods to estimate the global temporal and spatial variability of this organism. One approach that holds great promise is satellite remote sensing. Satellite ocean color sensors are ideal instruments for estimating global phytoplankton biomass, especially that due to episodic blooms, because they provide relatively high frequency synoptic information over large areas. Trichodesmium has a combination of specific ultrastructural and biochemical features that lend themselves to identification of this organism by remote sensing. Specifically, these features are high backscatter due to the presence of gas vesicles, and absorption and fluorescence of phycoerythrin. The resulting optical signature is relatively unique and should be detectable with satellite ocean color sensors such as the Sea-Viewing Wide Field-of-view Sensor (SeaWiFS).

  12. Detecting and Cataloging Global Explosive Volcanism Using the IMS Infrasound Network

    NASA Astrophysics Data System (ADS)

    Matoza, R. S.; Green, D. N.; LE Pichon, A.; Fee, D.; Shearer, P. M.; Mialle, P.; Ceranna, L.

    2015-12-01

    Explosive volcanic eruptions are among the most powerful sources of infrasound observed on earth, with recordings routinely made at ranges of hundreds to thousands of kilometers. These eruptions can also inject large volumes of ash into heavily travelled aviation corridors, thus posing a significant societal and economic hazard. Detecting and counting the global occurrence of explosive volcanism helps with progress toward several goals in earth sciences and has direct applications in volcanic hazard mitigation. This project aims to build a quantitative catalog of global explosive volcanic activity using the International Monitoring System (IMS) infrasound network. We are developing methodologies to search systematically through IMS infrasound array detection bulletins to identify signals of volcanic origin. We combine infrasound signal association and source location using a brute-force, grid-search, cross-bearings approach. The algorithm corrects for a background prior rate of coherent infrasound signals in a global grid. When volcanic signals are identified, we extract metrics such as location, origin time, acoustic intensity, signal duration, and frequency content, compiling the results into a catalog. We are testing and validating our method on several well-known case studies, including the 2009 eruption of Sarychev Peak, Kuriles, the 2010 eruption of Eyjafjallajökull, Iceland, and the 2015 eruption of Calbuco, Chile. This work represents a step toward the goal of integrating IMS data products into global volcanic eruption early warning and notification systems. Additionally, a better characterization of volcanic signal detection helps improve understanding of operational event detection, discrimination, and association capabilities of the IMS network.
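
    The brute-force, grid-search, cross-bearings idea described above can be sketched as follows. The flat-grid bearing approximation, the station layout, and the 5-degree tolerance are illustrative assumptions, not the authors' implementation, which operates on a global grid with a background-rate correction:

```python
import math

def backazimuth(station, source):
    """Bearing from a station to a candidate source, approximated on a
    flat lat/lon grid (illustrative only; real systems use great circles)."""
    dlat = source[0] - station[0]
    dlon = source[1] - station[1]
    return math.degrees(math.atan2(dlon, dlat)) % 360.0

def locate(detections, lat_grid, lon_grid, tol_deg=5.0):
    """Brute-force cross-bearings association: score each grid node by how
    many station backazimuths point at it within tol_deg, and return the
    best-supported node. detections is a list of ((lat, lon), observed_baz)."""
    best, best_score = None, -1
    for lat in lat_grid:
        for lon in lon_grid:
            score = sum(
                1 for st, baz in detections
                if abs((backazimuth(st, (lat, lon)) - baz + 180) % 360 - 180) <= tol_deg
            )
            if score > best_score:
                best, best_score = (lat, lon), score
    return best, best_score
```

A real catalog builder would additionally extract origin time, acoustic intensity, duration, and frequency content for each located event.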

  13. Quantitative prediction of drug side effects based on drug-related features.

    PubMed

    Niu, Yanqing; Zhang, Wen

    2017-09-01

    Unexpected side effects of drugs are a great concern in drug development, and the identification of side effects is an important task. Recently, machine learning methods have been proposed to predict the presence or absence of side effects of interest for drugs, but it is difficult to make accurate predictions for all of them. In this paper, we transform the side effect profiles of drugs into quantitative scores by summing up their side effects with weights. The quantitative scores may measure the dangers of drugs, and thus help to compare the risk of different drugs. Here, we attempt to predict the quantitative scores of drugs, namely quantitative prediction. Specifically, we explore a variety of drug-related features and evaluate their discriminative power for quantitative prediction. Then, we consider several feature combination strategies (direct combination, average scoring ensemble combination) to integrate three informative features: chemical substructures, targets, and treatment indications. Finally, the average scoring ensemble model, which produces the better performance, is used as the final quantitative prediction model. Since the weights for side effects are empirical values, we randomly generate different weights in the simulation experiments. The experimental results show that the quantitative method is robust to different weights and produces satisfying results. Although other state-of-the-art methods cannot make quantitative predictions directly, their prediction results can be transformed into quantitative scores. By indirect comparison, the proposed method produces much better results than benchmark methods in quantitative prediction. In conclusion, the proposed method is promising for the quantitative prediction of side effects, and may work cooperatively with existing state-of-the-art methods to reveal the dangers of drugs.
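
    The scoring idea of the record above — a weighted sum over a drug's side effects, plus an averaging ensemble over feature views — can be sketched as below. The side-effect names and weights are made up for illustration:

```python
def quantitative_score(profile, weights):
    """Collapse a binary side-effect profile into one risk score: the
    weighted sum of the side effects the drug exhibits. profile maps
    side-effect names to 0/1 presence; weights encode their severity."""
    return sum(w for effect, w in weights.items() if profile.get(effect, 0))

def average_scoring_ensemble(view_scores):
    """Average the scores predicted from several feature views
    (e.g. chemical substructures, targets, treatment indications)."""
    return sum(view_scores) / len(view_scores)
```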

  14. ADVANCING THE STUDY OF VIOLENCE AGAINST WOMEN USING MIXED METHODS: INTEGRATING QUALITATIVE METHODS INTO A QUANTITATIVE RESEARCH PROGRAM

    PubMed Central

    Testa, Maria; Livingston, Jennifer A.; VanZile-Tamsen, Carol

    2011-01-01

    A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women’s sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative research are provided. PMID:21307032

  15. Quantitative geomorphologic studies from spaceborne platforms

    NASA Technical Reports Server (NTRS)

    Williams, R. S., Jr.

    1985-01-01

    Although LANDSAT images of our planet represent a quantum improvement in the availability of a global image-data set for independent or comparative regional geomorphic studies of landforms, such images have several limitations which restrict their suitability for quantitative geomorphic investigations. The three most serious deficiencies are: (1) photogrammetric inaccuracies, (2) two-dimensional nature of the data, and (3) spatial resolution. These deficiencies are discussed, as well as the use of stereoscopic images and laser altimeter data.

  16. [Parameter sensitivity of simulating net primary productivity of Larix olgensis forest based on BIOME-BGC model].

    PubMed

    He, Li-hong; Wang, Hai-yan; Lei, Xiang-dong

    2016-02-01

    Models based on vegetation ecophysiological processes contain many parameters, and reasonable parameter values greatly improve simulation ability. Sensitivity analysis, as an important method for screening out sensitive parameters, can comprehensively analyze how model parameters affect simulation results. In this paper, we conducted a parameter sensitivity analysis of the BIOME-BGC model with a case study simulating the net primary productivity (NPP) of a Larix olgensis forest in Wangqing, Jilin Province. First, through comparative analysis of field measurement data and simulation results, we tested the BIOME-BGC model's capability to simulate the NPP of L. olgensis forest. Then, the Morris and EFAST sensitivity methods were used to screen the sensitive parameters that strongly influence NPP. On this basis, we quantitatively estimated the sensitivity of the screened parameters, and calculated the global, first-order and second-order sensitivity indices. The results showed that the BIOME-BGC model could simulate the NPP of L. olgensis forest in the sample plot well. The Morris sensitivity method provided a reliable parameter sensitivity analysis result with a relatively small sample size. The EFAST sensitivity method could quantitatively measure the impact of a single parameter on the simulation result, as well as the interactions between parameters in the BIOME-BGC model. The most influential sensitive parameters for L. olgensis forest NPP were the new stem carbon to new leaf carbon allocation ratio and the leaf carbon to nitrogen ratio; the effect of their interaction was significantly greater than the other parameters' interaction effects.
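
    A minimal sketch of Morris-style screening, the first of the two methods named above. This is not the BIOME-BGC workflow; the normalized-parameter perturbation, trajectory count, and step size are simplifying assumptions:

```python
import random

def morris_elementary_effects(model, bounds, n_trajectories=20, delta=0.25, seed=0):
    """Minimal Morris screening: for each parameter, average the absolute
    change in model output when that parameter alone is stepped by delta
    in normalized [0, 1] space. A larger mu* marks a more sensitive parameter."""
    rng = random.Random(seed)
    k = len(bounds)
    mu_star = [0.0] * k
    for _ in range(n_trajectories):
        # Random base point, kept away from the upper edge so the step fits.
        x = [rng.uniform(0, 1 - delta) for _ in range(k)]
        base = model([lo + xi * (hi - lo) for xi, (lo, hi) in zip(x, bounds)])
        for i in range(k):
            xp = list(x)
            xp[i] += delta  # perturb one parameter at a time
            y = model([lo + xi * (hi - lo) for xi, (lo, hi) in zip(xp, bounds)])
            mu_star[i] += abs(y - base) / delta
    return [m / n_trajectories for m in mu_star]
```

EFAST additionally decomposes the output variance into first-order and interaction terms, which this one-at-a-time sketch cannot do.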

  17. Larval transport modeling of deep-sea invertebrates can aid the search for undiscovered populations.

    PubMed

    Yearsley, Jon M; Sigwart, Julia D

    2011-01-01

    Many deep-sea benthic animals occur in patchy distributions separated by thousands of kilometres, yet because deep-sea habitats are remote, little is known about their larval dispersal. Our novel method simulates dispersal by combining data from the Argo array of autonomous oceanographic probes, deep-sea ecological surveys, and comparative invertebrate physiology. The predicted particle tracks allow quantitative, testable predictions about the dispersal of benthic invertebrate larvae in the south-west Pacific. In a test case presented here, using non-feeding, non-swimming (lecithotrophic trochophore) larvae of polyplacophoran molluscs (chitons), we show that the likely dispersal pathways in a single generation are significantly shorter than the distances between the three known population centres in our study region. The large-scale density of chiton populations throughout our study region is potentially much greater than present survey data suggest, with intermediate 'stepping stone' populations yet to be discovered. We present a new method that is broadly applicable to studies of the dispersal of deep-sea organisms. This test case demonstrates the power and potential applications of our new method, in generating quantitative, testable hypotheses at multiple levels to solve the mismatch between observed and expected distributions: probabilistic predictions of locations of intermediate populations, potential alternative dispersal mechanisms, and expected population genetic structure. The global Argo data have never previously been used to address benthic biology, and our method can be applied to any non-swimming larvae of the deep-sea, giving information upon dispersal corridors and population densities in habitats that remain intrinsically difficult to assess.
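
    The particle-track idea can be caricatured with a forward-Euler advection step. The velocity-field interface and degrees-per-day units are assumptions for illustration, not the authors' Argo-derived circulation model:

```python
def advect(start, velocity_field, pld_days, dt_days=1.0):
    """Forward-Euler Lagrangian particle track: a passive larva is carried
    by the local current for its pelagic larval duration (PLD).
    velocity_field(lat, lon) returns (v_north, v_east) in degrees/day."""
    lat, lon = start
    t = 0.0
    while t < pld_days:
        v_n, v_e = velocity_field(lat, lon)
        lat += v_n * dt_days
        lon += v_e * dt_days
        t += dt_days
    return lat, lon
```

Running many such tracks from known population centres, with PLD set by larval physiology, yields the dispersal-distance distributions the study compares against observed population spacing.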

  18. Larval Transport Modeling of Deep-Sea Invertebrates Can Aid the Search for Undiscovered Populations

    PubMed Central

    Yearsley, Jon M.; Sigwart, Julia D.

    2011-01-01

    Background Many deep-sea benthic animals occur in patchy distributions separated by thousands of kilometres, yet because deep-sea habitats are remote, little is known about their larval dispersal. Our novel method simulates dispersal by combining data from the Argo array of autonomous oceanographic probes, deep-sea ecological surveys, and comparative invertebrate physiology. The predicted particle tracks allow quantitative, testable predictions about the dispersal of benthic invertebrate larvae in the south-west Pacific. Principal Findings In a test case presented here, using non-feeding, non-swimming (lecithotrophic trochophore) larvae of polyplacophoran molluscs (chitons), we show that the likely dispersal pathways in a single generation are significantly shorter than the distances between the three known population centres in our study region. The large-scale density of chiton populations throughout our study region is potentially much greater than present survey data suggest, with intermediate ‘stepping stone’ populations yet to be discovered. Conclusions/Significance We present a new method that is broadly applicable to studies of the dispersal of deep-sea organisms. This test case demonstrates the power and potential applications of our new method, in generating quantitative, testable hypotheses at multiple levels to solve the mismatch between observed and expected distributions: probabilistic predictions of locations of intermediate populations, potential alternative dispersal mechanisms, and expected population genetic structure. The global Argo data have never previously been used to address benthic biology, and our method can be applied to any non-swimming larvae of the deep-sea, giving information upon dispersal corridors and population densities in habitats that remain intrinsically difficult to assess. PMID:21857992

  19. Qualitative versus quantitative methods in psychiatric research.

    PubMed

    Razafsha, Mahdi; Behforuzi, Hura; Azari, Hassan; Zhang, Zhiqun; Wang, Kevin K; Kobeissy, Firas H; Gold, Mark S

    2012-01-01

    Qualitative studies are gaining credibility after a period of being misinterpreted as "not being quantitative." Qualitative method is a broad umbrella term for research methodologies that describe and explain individuals' experiences, behaviors, interactions, and social contexts. In-depth interviews, focus groups, and participant observation are among the qualitative methods of inquiry commonly used in psychiatry. Researchers measure the frequency of occurring events using quantitative methods; however, qualitative methods provide a broader understanding and a more thorough reasoning behind the event. Hence, they are considered to be of special importance in psychiatry. Besides hypothesis generation in the earlier phases of research, qualitative methods can be employed in questionnaire design, establishment of diagnostic criteria, feasibility studies, and studies of attitudes and beliefs. Animal models are another area in which qualitative methods can be employed, especially when naturalistic observation of animal behavior is important. However, since qualitative results can reflect the researcher's own view, they need to be statistically confirmed with quantitative methods. The tendency to combine qualitative and quantitative methods as complementary approaches has emerged over recent years. By applying both methods of research, scientists can take advantage of the interpretative characteristics of qualitative methods as well as the experimental dimensions of quantitative methods.

  20. Habitats and Natural Areas--Some Applications of the 1995-96 Forest Survey of Arkansas on the Conservation of Biodiversity in Arkansas

    Treesearch

    Douglas Zollner

    2001-01-01

    The conservation status and trend of rare species groups should be better in landscapes with more forest cover due to the presence of quantitatively more habitat, and in the case of aquatic species, qualitatively better habitat. Arkansas provides habitat for 97 species of plants and animals considered critically imperiled globally or imperiled globally. These 97 species...

  1. Quantitative analysis of global veterinary human resources.

    PubMed

    Kouba, V

    2003-12-01

    This analysis of global veterinary personnel was based on the available quantitative data reported by individual countries to international organisations. The analysis begins with a time series of globally reported numbers of veterinarians, starting in the year 1959 (140,391). In 2000 this number reached 691,379. Of this total, 27.77% of veterinarians were working as government officials, 15.38% were working in laboratories, universities and training institutions and 46.33% were working as private practitioners. The ratio of veterinarians to technicians was 1:0.63. The global average of resources serviced by each veterinarian was as follows: 8,760 inhabitants; 189 km2 of land area and 20 km2 of arable land; 1,925 cattle, 242 buffaloes, 87 horses, 1,309 pigs, 1,533 sheep and 20,714 chickens; in abattoirs: 401 slaughtered cattle, 699 slaughtered sheep and 1,674 slaughtered pigs; the production of 336 tonnes (t) of meat, 708 t cow milk and 74 t hen eggs; in international trade: 12 cattle, 23 sheep, 22 pigs, 1 horse, 1,086 chickens, 33 t meat and meat products; 2,289 units of livestock (50 minutes of annual veterinary working time for each unit). These averages were also analysed according to employment categories. The author also discusses factors influencing veterinary personnel analyses and planning.
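
    The reported averages are simple ratios of global totals to the veterinarian count, so a few of them can be reconstructed directly. The implied world-population figure below is back-calculated from the record, not independently sourced:

```python
n_vets = 691_379             # veterinarians reported worldwide in 2000
inhabitants_per_vet = 8_760  # global average from the record above

# Each per-veterinarian average implies the underlying global total:
implied_world_population = n_vets * inhabitants_per_vet  # roughly 6 billion

# Employment shares translate back into head counts:
government_officials = n_vets * 0.2777    # 27.77% in government service
private_practitioners = n_vets * 0.4633   # 46.33% in private practice
```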

  2. A computational interactome for prioritizing genes associated with complex agronomic traits in rice (Oryza sativa).

    PubMed

    Liu, Shiwei; Liu, Yihui; Zhao, Jiawei; Cai, Shitao; Qian, Hongmei; Zuo, Kaijing; Zhao, Lingxia; Zhang, Lida

    2017-04-01

    Rice (Oryza sativa) is one of the most important staple foods for more than half of the global population. Many rice traits are quantitative, complex and controlled by multiple interacting genes. Thus, a full understanding of genetic relationships will be critical to systematically identify genes controlling agronomic traits. We developed a genome-wide rice protein-protein interaction network (RicePPINet, http://netbio.sjtu.edu.cn/riceppinet) using machine learning with structural relationship and functional information. RicePPINet contained 708 819 predicted interactions for 16 895 non-transposable element related proteins. The power of the network for discovering novel protein interactions was demonstrated through comparison with other publicly available protein-protein interaction (PPI) prediction methods, and by experimentally determined PPI data sets. Furthermore, global analysis of domain-mediated interactions revealed RicePPINet accurately reflects PPIs at the domain level. Our studies showed the efficiency of the RicePPINet-based method in prioritizing candidate genes involved in complex agronomic traits, such as disease resistance and drought tolerance, was approximately 2-11 times better than random prediction. RicePPINet provides an expanded landscape of computational interactome for the genetic dissection of agronomically important traits in rice. © 2017 The Authors The Plant Journal © 2017 John Wiley & Sons Ltd.

  3. Nanoscale steady-state temperature gradients within polymer nanocomposites undergoing continuous-wave photothermal heating from gold nanorods.

    PubMed

    Maity, Somsubhra; Wu, Wei-Chen; Tracy, Joseph B; Clarke, Laura I; Bochinski, Jason R

    2017-08-17

    Anisotropically-shaped metal nanoparticles act as nanoscale heaters via excitation of a localized surface plasmon resonance, utilizing a photothermal effect which converts the optical energy into local heat. Steady-state temperatures within a polymer matrix embedded with gold nanorods undergoing photothermal heating using continuous-wave excitation are measured in the immediate spatial vicinity of the nanoparticle (referred to as the local temperature) from observing the rate of physical rotation of the asymmetric nanoparticles within the locally created polymer melt. Average temperatures across the entire (mostly solid) sample (referred to as the global temperature) are simultaneously observed using a fluorescence method from randomly dispersed molecular emitters. Comparing these two independent measurements in films having varying concentrations of nanorods reveals the interplay between the local and global temperatures, clearly demonstrating the capability of these material samples to sustain large steady-state spatial temperature gradients when experiencing continuous-wave excitation photothermal heating. These results are discussed quantitatively. Illustrative imaging studies of nanofibers under photothermal heating also support the presence of a large temperature gradient. Photothermal heating in this manner has potential utility in creating unique thermal processing conditions for outcomes such as driving chemical reactions, inducing crystallinity changes, or enhancing degradation processes in a manner unachievable by conventional heating methods.

  4. Left-ventricle segmentation in real-time 3D echocardiography using a hybrid active shape model and optimal graph search approach

    NASA Astrophysics Data System (ADS)

    Zhang, Honghai; Abiose, Ademola K.; Campbell, Dwayne N.; Sonka, Milan; Martins, James B.; Wahle, Andreas

    2010-03-01

    Quantitative analysis of the left ventricular shape and motion patterns associated with left ventricular mechanical dyssynchrony (LVMD) is essential for diagnosis and treatment planning in congestive heart failure. Real-time 3D echocardiography (RT3DE) used for LVMD analysis is frequently limited by heavy speckle noise or partially incomplete data, thus a segmentation method utilizing learned global shape knowledge is beneficial. In this study, the endocardial surface of the left ventricle (LV) is segmented using a hybrid approach combining active shape model (ASM) with optimal graph search. The latter is used to achieve landmark refinement in the ASM framework. Optimal graph search translates the 3D segmentation into the detection of a minimum-cost closed set in a graph and can produce a globally optimal result. Various types of information (gradient, intensity distributions, and regional-property terms) are used to define the costs for the graph search. The developed method was tested on 44 RT3DE datasets acquired from 26 LVMD patients. The segmentation accuracy was assessed by surface positioning error and volume overlap measured for the whole LV as well as 16 standard LV regions. The segmentation produced very good results that were not achievable using ASM or graph search alone.

  5. Advanced Mass Spectrometric Methods for the Rapid and Quantitative Characterization of Proteomes

    DOE PAGES

    Smith, Richard D.

    2002-01-01

    Progress is reviewed towards the development of a global strategy that aims to extend the sensitivity, dynamic range, comprehensiveness and throughput of proteomic measurements based upon the use of high performance separations and mass spectrometry. The approach uses high accuracy mass measurements from Fourier transform ion cyclotron resonance mass spectrometry (FTICR) to validate peptide ‘accurate mass tags’ (AMTs) produced by global protein enzymatic digestions for a specific organism, tissue or cell type from ‘potential mass tags’ tentatively identified using conventional tandem mass spectrometry (MS/MS). This provides the basis for subsequent measurements without the need for MS/MS. High resolution capillary liquid chromatography separations combined with high sensitivity, high resolution, accurate FTICR measurements are shown to be capable of characterizing peptide mixtures of more than 10(5) components. The strategy has been initially demonstrated using the microorganisms Saccharomyces cerevisiae and Deinococcus radiodurans. Advantages of the approach include the high confidence of protein identification, its broad proteome coverage, high sensitivity, and the capability for stable-isotope labeling methods for precise relative protein abundance measurements. Abbreviations: LC, liquid chromatography; FTICR, Fourier transform ion cyclotron resonance; AMT, accurate mass tag; PMT, potential mass tag; MMA, mass measurement accuracy; MS, mass spectrometry; MS/MS, tandem mass spectrometry; ppm, parts per million.
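
    The core of the AMT lookup is a mass match within a parts-per-million tolerance. A minimal sketch with a made-up two-entry database; the peptide names, masses, and 1 ppm tolerance are illustrative, not values from the study:

```python
def match_amt(observed_mass, amt_database, tol_ppm=1.0):
    """Match an observed peptide mass against a database of accurate mass
    tags (AMTs), accepting hits whose mass error is within tol_ppm.
    amt_database maps peptide identifiers to theoretical masses (Da)."""
    hits = []
    for peptide, theoretical in amt_database.items():
        ppm_error = (observed_mass - theoretical) / theoretical * 1e6
        if abs(ppm_error) <= tol_ppm:
            hits.append((peptide, ppm_error))
    return hits
```

The tighter the achievable mass measurement accuracy, the fewer candidate peptides survive the tolerance window, which is what lets the AMT strategy skip MS/MS on subsequent runs.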

  6. Cenozoic Planktonic Marine Diatom Diversity and Correlation to Climate Change

    PubMed Central

    Lazarus, David; Barron, John; Renaudie, Johan; Diver, Patrick; Türke, Andreas

    2014-01-01

    Marine planktonic diatoms export carbon to the deep ocean, playing a key role in the global carbon cycle. Although commonly thought to have diversified over the Cenozoic as global oceans cooled, only two conflicting quantitative reconstructions exist, both from the Neptune deep-sea microfossil occurrences database. Total diversity shows Cenozoic increase but is sample size biased; conventional subsampling shows little net change. We calculate diversity from a separately compiled new diatom species range catalog, and recalculate Neptune subsampled-in-bin diversity using new methods to correct for increasing Cenozoic geographic endemism and decreasing Cenozoic evenness. We find coherent, substantial Cenozoic diversification in both datasets. Many living cold water species, including species important for export productivity, originate only in the latest Miocene or younger. We make a first quantitative comparison of diatom diversity to the global Cenozoic benthic ∂18O (climate) and carbon cycle records (∂13C, and 20-0 Ma pCO2). Warmer climates are strongly correlated with lower diatom diversity (raw: rho = .92, p<.001; detrended, r = .6, p = .01). Diatoms were 20% less diverse in the early late Miocene, when temperatures and pCO2 were only moderately higher than today. Diversity is strongly correlated to both ∂13C and pCO2 over the last 15 my (for both: r>.9, detrended r>.6, all p<.001), but only weakly over the earlier Cenozoic, suggesting increasingly strong linkage of diatom and climate evolution in the Neogene. Our results suggest that many living marine planktonic diatom species may be at risk of extinction in future warm oceans, with an unknown but potentially substantial negative impact on the ocean biologic pump and oceanic carbon sequestration. We cannot however extrapolate our my-scale correlations with generic climate proxies to anthropogenic time-scales of warming without additional species-specific information on proximate ecologic controls. PMID:24465441
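
    The raw-versus-detrended comparison reported above can be illustrated with plain Pearson correlation plus first-difference detrending. First-differencing is one common choice; the paper's exact detrending procedure may differ:

```python
def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def detrended_correlation(x, y):
    """Correlate first differences, damping a shared long-term trend
    that would otherwise inflate the raw correlation."""
    dx = [x[i + 1] - x[i] for i in range(len(x) - 1)]
    dy = [y[i + 1] - y[i] for i in range(len(y) - 1)]
    return pearson(dx, dy)
```

A raw correlation that survives detrending, as in the diatom-climate comparison over the last 15 My, argues that the two series co-vary bin by bin rather than merely sharing a drift.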

  7. Inference of quantitative models of bacterial promoters from time-series reporter gene data.

    PubMed

    Stefan, Diana; Pinel, Corinne; Pinhal, Stéphane; Cinquemani, Eugenio; Geiselmann, Johannes; de Jong, Hidde

    2015-01-01

    The inference of regulatory interactions and quantitative models of gene regulation from time-series transcriptomics data has been extensively studied and applied to a range of problems in drug discovery, cancer research, and biotechnology. The application of existing methods is commonly based on implicit assumptions on the biological processes under study. First, the measurements of mRNA abundance obtained in transcriptomics experiments are taken to be representative of protein concentrations. Second, the observed changes in gene expression are assumed to be solely due to transcription factors and other specific regulators, while changes in the activity of the gene expression machinery and other global physiological effects are neglected. While convenient in practice, these assumptions are often not valid and bias the reverse engineering process. Here we systematically investigate, using a combination of models and experiments, the importance of this bias and possible corrections. We measure in real time and in vivo the activity of genes involved in the FliA-FlgM module of the E. coli motility network. From these data, we estimate protein concentrations and global physiological effects by means of kinetic models of gene expression. Our results indicate that correcting for the bias of commonly-made assumptions improves the quality of the models inferred from the data. Moreover, we show by simulation that these improvements are expected to be even stronger for systems in which protein concentrations have longer half-lives and the activity of the gene expression machinery varies more strongly across conditions than in the FliA-FlgM module. The approach proposed in this study is broadly applicable when using time-series transcriptome data to learn about the structure and dynamics of regulatory networks. In the case of the FliA-FlgM module, our results demonstrate the importance of global physiological effects and the active regulation of FliA and FlgM half-lives for the dynamics of FliA-dependent promoters.

  8. Global and Regional Brain Assessment with Quantitative MR Imaging in Patients with Prior Exposure to Linear Gadolinium-based Contrast Agents.

    PubMed

    Kuno, Hirofumi; Jara, Hernán; Buch, Karen; Qureshi, Muhammad Mustafa; Chapman, Margaret N; Sakai, Osamu

    2017-04-01

    Purpose To assess the association of global and regional brain relaxation times with prior exposure to linear gadolinium-based contrast agents (GBCAs). Materials and Methods The institutional review board approved this cross-sectional study. Thirty-five patients (nine who had previously received GBCA gadopentetate dimeglumine injections [one to eight times] and 26 who had not) who underwent brain magnetic resonance (MR) imaging with a mixed fast spin-echo pulse sequence were assessed. The whole brain was segmented into white and gray matter by using a dual-clustering algorithm. In addition, regions of interest were measured in the globus pallidus, dentate nucleus, thalamus, and pons. The Mann-Whitney U test was used to assess the difference between groups. Multiple regression analysis was performed to assess the association of T1 and T2 with prior GBCA exposure. Results T1 values of gray matter were significantly shorter for patients with than for patients without prior GBCA exposure (P = .022). T1 of the gray matter of the whole brain (P < .001), globus pallidus (P = .002), dentate nucleus (P = .046), and thalamus (P = .026) and T2 of the whole brain (P = .004), dentate nucleus (P = .023), and thalamus (P = .002) showed a significant correlation with the accumulated dose of previous GBCA administration. There was no significant correlation between T1 and the accumulated dose of previous GBCA injections in the white matter (P = .187). Conclusion Global and regional quantitative assessments of T1 and T2 demonstrated an association with prior GBCA exposure, especially for gray matter structures. The results of this study support previous findings of gadolinium deposition distributed widely throughout the brain. © RSNA, 2016 Online supplemental material is available for this article.
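
    The dose-response finding can be illustrated with a univariate least-squares slope of relaxation time against accumulated dose. This is a stand-in for the article's multiple regression, and the numbers in the usage below are fabricated, not patient data:

```python
def ols_slope(doses, t1_values):
    """Ordinary least-squares slope of relaxation time versus accumulated
    GBCA dose: a negative slope means T1 shortens as exposure accumulates."""
    n = len(doses)
    mx = sum(doses) / n
    my = sum(t1_values) / n
    sxy = sum((d - mx) * (t - my) for d, t in zip(doses, t1_values))
    sxx = sum((d - mx) ** 2 for d in doses)
    return sxy / sxx
```

For example, fabricated T1 values of 1000, 990, 980, 970 ms at 0-3 prior injections give a slope of -10 ms per injection.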

  9. Cenozoic planktonic marine diatom diversity and correlation to climate change.

    PubMed

    Lazarus, David; Barron, John; Renaudie, Johan; Diver, Patrick; Türke, Andreas

    2014-01-01

    Marine planktonic diatoms export carbon to the deep ocean, playing a key role in the global carbon cycle. Although commonly thought to have diversified over the Cenozoic as global oceans cooled, only two conflicting quantitative reconstructions exist, both from the Neptune deep-sea microfossil occurrences database. Total diversity shows Cenozoic increase but is sample size biased; conventional subsampling shows little net change. We calculate diversity from a separately compiled new diatom species range catalog, and recalculate Neptune subsampled-in-bin diversity using new methods to correct for increasing Cenozoic geographic endemism and decreasing Cenozoic evenness. We find coherent, substantial Cenozoic diversification in both datasets. Many living cold water species, including species important for export productivity, originate only in the latest Miocene or younger. We make a first quantitative comparison of diatom diversity to the global Cenozoic benthic ∂(18)O (climate) and carbon cycle records (∂(13)C, and 20-0 Ma pCO2). Warmer climates are strongly correlated with lower diatom diversity (raw: rho = .92, p<.001; detrended, r = .6, p = .01). Diatoms were 20% less diverse in the early late Miocene, when temperatures and pCO2 were only moderately higher than today. Diversity is strongly correlated to both ∂(13)C and pCO2 over the last 15 my (for both: r>.9, detrended r>.6, all p<.001), but only weakly over the earlier Cenozoic, suggesting increasingly strong linkage of diatom and climate evolution in the Neogene. Our results suggest that many living marine planktonic diatom species may be at risk of extinction in future warm oceans, with an unknown but potentially substantial negative impact on the ocean biologic pump and oceanic carbon sequestration. We cannot however extrapolate our my-scale correlations with generic climate proxies to anthropogenic time-scales of warming without additional species-specific information on proximate ecologic controls.

  10. Topex/Poseidon: A United States/France mission. Oceanography from space: The oceans and climate

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The TOPEX/POSEIDON space mission, sponsored by NASA and France's space agency, the Centre National d'Etudes Spatiales (CNES), will give new observations of the Earth from space to gain a quantitative understanding of the role of ocean currents in climate change. Rising atmospheric concentrations of carbon dioxide and other 'greenhouse gases' produced as a result of human activities could generate a global warming, followed by an associated rise in sea level. The satellite will use radar altimetry to measure sea-surface height and will be tracked by three independent systems to yield accurate topographic maps over the dimensions of entire ocean basins. The satellite data, together with the Tropical Ocean and Global Atmosphere (TOGA) program and the World Ocean Circulation Experiment (WOCE) measurements, will be analyzed by an international scientific team. By merging the satellite observations with TOGA and WOCE findings, the scientists will establish the extensive data base needed for the quantitative description and computer modeling of ocean circulation. The ocean models will eventually be coupled with atmospheric models to lay the foundation for predictions of global climate change.

  11. Spatial organization of RNA polymerase II inside a mammalian cell nucleus revealed by reflected light-sheet superresolution microscopy

    PubMed Central

    Zhao, Ziqing W.; Roy, Rahul; Gebhardt, J. Christof M.; Suter, David M.; Chapman, Alec R.; Xie, X. Sunney

    2014-01-01

    Superresolution microscopy based on single-molecule centroid determination has been widely applied to cellular imaging in recent years. However, quantitative imaging of the mammalian nucleus has been challenging due to the lack of 3D optical sectioning methods for normal-sized cells, as well as the inability to accurately count the absolute copy numbers of biomolecules in highly dense structures. Here we report a reflected light-sheet superresolution microscopy method capable of imaging inside the mammalian nucleus with superior signal-to-background ratio as well as molecular counting with single-copy accuracy. Using reflected light-sheet superresolution microscopy, we probed the spatial organization of transcription by RNA polymerase II (RNAP II) molecules and quantified their global extent of clustering inside the mammalian nucleus. Spatiotemporal clustering analysis that leverages the blinking photophysics of specific organic dyes showed that the majority (>70%) of the transcription foci originate from single RNAP II molecules, and no significant clustering between RNAP II molecules was detected within the length scale of the reported diameter of “transcription factories.” Colocalization measurements of RNAP II molecules equally labeled by two spectrally distinct dyes confirmed the primarily unclustered distribution, arguing against the prevalent existence of transcription factories in the mammalian nucleus as previously proposed. The methods developed in our study pave the way for quantitative mapping and stoichiometric characterization of key biomolecular species deep inside mammalian cells. PMID:24379392

  12. Automatic and Objective Assessment of Alternating Tapping Performance in Parkinson's Disease

    PubMed Central

    Memedi, Mevludin; Khan, Taha; Grenholm, Peter; Nyholm, Dag; Westin, Jerker

    2013-01-01

    This paper presents the development and evaluation of a method for enabling quantitative and automatic scoring of alternating tapping performance of patients with Parkinson's disease (PD). Ten healthy elderly subjects and 95 patients in different clinical stages of PD have utilized a touch-pad handheld computer to perform alternate tapping tests in their home environments. First, a neurologist used a web-based system to visually assess impairments in four tapping dimensions (‘speed’, ‘accuracy’, ‘fatigue’ and ‘arrhythmia’) and a global tapping severity (GTS). Second, tapping signals were processed with time series analysis and statistical methods to derive 24 quantitative parameters. Third, principal component analysis was used to reduce the dimensions of these parameters and to obtain scores for the four dimensions. Finally, a logistic regression classifier was trained using a 10-fold stratified cross-validation to map the reduced parameters to the corresponding visually assessed GTS scores. Results showed that the computed scores correlated well to visually assessed scores and were significantly different across Unified Parkinson's Disease Rating Scale scores of upper limb motor performance. In addition, they had good internal consistency, had good ability to discriminate between healthy elderly and patients in different disease stages, had good sensitivity to treatment interventions and could reflect the natural disease progression over time. In conclusion, the automatic method can be useful to objectively assess the tapping performance of PD patients and can be included in telemedicine tools for remote monitoring of tapping. PMID:24351667
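
    The dimensionality-reduction step in the pipeline above (24 tapping parameters mapped to a small set of dimension scores) can be sketched with a plain SVD-based PCA. The data here are random placeholders; only the shapes (105 subjects by 24 parameters, reduced to 4 scores) mirror the study's description.

```python
import numpy as np

def pca_scores(X, n_components):
    # Principal component analysis via SVD of the centered data matrix;
    # returns each subject's scores on the top components
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

rng = np.random.default_rng(0)
X = rng.normal(size=(105, 24))   # placeholder: 105 subjects, 24 parameters
scores = pca_scores(X, 4)        # four reduced dimension scores
```

    In the study's pipeline, scores like these would then feed a logistic regression classifier evaluated with 10-fold stratified cross-validation against the visually assessed GTS labels.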

  13. Accurate computer-aided quantification of left ventricular parameters: experience in 1555 cardiac magnetic resonance studies from the Framingham Heart Study.

    PubMed

    Hautvast, Gilion L T F; Salton, Carol J; Chuang, Michael L; Breeuwer, Marcel; O'Donnell, Christopher J; Manning, Warren J

    2012-05-01

    Quantitative analysis of short-axis functional cardiac magnetic resonance images can be performed using automatic contour detection methods. The resulting myocardial contours must be reviewed and possibly corrected, which can be time-consuming, particularly when performed across all cardiac phases. We quantified the impact of manual contour corrections on both analysis time and quantitative measurements obtained from left ventricular short-axis cine images acquired from 1555 participants of the Framingham Heart Study Offspring cohort using computer-aided contour detection methods. The total analysis time for a single case was 7.6 ± 1.7 min for an average of 221 ± 36 myocardial contours per participant. This included 4.8 ± 1.6 min for manual contour correction of 2% of all automatically detected endocardial contours and 8% of all automatically detected epicardial contours. However, the impact of these corrections on global left ventricular parameters was limited, introducing differences of 0.4 ± 4.1 mL for end-diastolic volume, -0.3 ± 2.9 mL for end-systolic volume, 0.7 ± 3.1 mL for stroke volume, and 0.3 ± 1.8% for ejection fraction. We conclude that left ventricular functional parameters can be obtained in under 5 min from short-axis functional cardiac magnetic resonance images using automatic contour detection methods. Manual correction more than doubles analysis time, with minimal impact on left ventricular volumes and ejection fraction. Copyright © 2011 Wiley Periodicals, Inc.
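
    As a reminder of how the reported global parameters relate: stroke volume and ejection fraction follow directly from the end-diastolic and end-systolic volumes. The numbers below are illustrative, not values from the study.

```python
def lv_parameters(edv_ml, esv_ml):
    # Stroke volume = EDV - ESV; ejection fraction = SV / EDV * 100
    sv = edv_ml - esv_ml
    ef = 100.0 * sv / edv_ml
    return sv, ef

sv, ef = lv_parameters(edv_ml=120.0, esv_ml=50.0)   # illustrative volumes
```

    This dependence explains why small contour corrections that shift EDV and ESV by a few mL translate into only fraction-of-a-percent changes in ejection fraction.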

  14. Short-term techniques for monitoring coral reefs: Review, results, and recommendations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mann, G.S.; Hunte, W.

    1994-12-31

    The health of coral reefs is in question on a global scale. The degradation of reefs has been attributed to both natural (e.g., El Niño, crown-of-thorns starfish, and hurricanes) and anthropogenic (e.g., sedimentation, nutrient overloading, oil spills, and thermal pollution) factors. Demonstrating the deleterious effects of lethal factors has not been difficult. However, it has been more difficult to quantitatively link those factors which do not cause rapid coral mortality to reef degradation. Classic techniques, such as cross-transplantation and x-ray analysis of growth bands, have proven to be successful bioassessments of chronic exposure to stressful conditions. The resolution of these techniques generally limits their usefulness, as only long-term exposure (months to years) can provide quantitative differences between impacted and control conditions. Short-term monitoring techniques using corals have received relatively little attention from researchers. Two short-term methods have been successfully used to discriminate polluted from less-polluted sites in Barbados. The first is based on adult growth in several coral species. The second focuses on growth and survival of newly-settled juvenile corals. Both methods allowed discrimination in less than two weeks. These methods and others need to be evaluated and standardized in order to permit better, more efficient monitoring of the world's reefs. Recommendations will be made on what life-history characteristics should be considered when choosing a coral species for use in bioassessment studies.

  15. Automatic and objective assessment of alternating tapping performance in Parkinson's disease.

    PubMed

    Memedi, Mevludin; Khan, Taha; Grenholm, Peter; Nyholm, Dag; Westin, Jerker

    2013-12-09

    This paper presents the development and evaluation of a method for enabling quantitative and automatic scoring of alternating tapping performance of patients with Parkinson's disease (PD). Ten healthy elderly subjects and 95 patients in different clinical stages of PD have utilized a touch-pad handheld computer to perform alternate tapping tests in their home environments. First, a neurologist used a web-based system to visually assess impairments in four tapping dimensions ('speed', 'accuracy', 'fatigue' and 'arrhythmia') and a global tapping severity (GTS). Second, tapping signals were processed with time series analysis and statistical methods to derive 24 quantitative parameters. Third, principal component analysis was used to reduce the dimensions of these parameters and to obtain scores for the four dimensions. Finally, a logistic regression classifier was trained using a 10-fold stratified cross-validation to map the reduced parameters to the corresponding visually assessed GTS scores. Results showed that the computed scores correlated well to visually assessed scores and were significantly different across Unified Parkinson's Disease Rating Scale scores of upper limb motor performance. In addition, they had good internal consistency, had good ability to discriminate between healthy elderly and patients in different disease stages, had good sensitivity to treatment interventions and could reflect the natural disease progression over time. In conclusion, the automatic method can be useful to objectively assess the tapping performance of PD patients and can be included in telemedicine tools for remote monitoring of tapping.

  16. Global Profiling of Various Metabolites in Platycodon grandiflorum by UPLC-QTOF/MS.

    PubMed

    Lee, Jae Won; Ji, Seung-Heon; Kim, Geum-Soog; Song, Kyung-Sik; Um, Yurry; Kim, Ok Tae; Lee, Yi; Hong, Chang Pyo; Shin, Dong-Ho; Kim, Chang-Kug; Lee, Seung-Eun; Ahn, Young-Sup; Lee, Dae-Young

    2015-11-09

    In this study, a method of metabolite profiling based on UPLC-QTOF/MS was developed to analyze Platycodon grandiflorum. In the optimal UPLC, various metabolites, including major platycosides, were separated well in 15 min. The metabolite extraction protocols were also optimized by selecting a solvent for use in the study, the ratio of solvent to sample and sonication time. This method was used to profile two different parts of P. grandiflorum, i.e., the roots of P. grandiflorum (PR) and the stems and leaves of P. grandiflorum (PS), in the positive and negative ion modes. As a result, PR and PS showed qualitatively and quantitatively different metabolite profiles. Furthermore, their metabolite compositions differed according to individual plant samples. These results indicate that the UPLC-QTOF/MS-based profiling method is a good tool to analyze various metabolites in P. grandiflorum. This metabolomics approach can also be applied to evaluate the overall quality of P. grandiflorum, as well as to discriminate the cultivars for the medicinal plant industry.

  17. Global Profiling of Various Metabolites in Platycodon grandiflorum by UPLC-QTOF/MS

    PubMed Central

    Lee, Jae Won; Ji, Seung-Heon; Kim, Geum-Soog; Song, Kyung-Sik; Um, Yurry; Kim, Ok Tae; Lee, Yi; Hong, Chang Pyo; Shin, Dong-Ho; Kim, Chang-Kug; Lee, Seung-Eun; Ahn, Young-Sup; Lee, Dae-Young

    2015-01-01

    In this study, a method of metabolite profiling based on UPLC-QTOF/MS was developed to analyze Platycodon grandiflorum. In the optimal UPLC, various metabolites, including major platycosides, were separated well in 15 min. The metabolite extraction protocols were also optimized by selecting a solvent for use in the study, the ratio of solvent to sample and sonication time. This method was used to profile two different parts of P. grandiflorum, i.e., the roots of P. grandiflorum (PR) and the stems and leaves of P. grandiflorum (PS), in the positive and negative ion modes. As a result, PR and PS showed qualitatively and quantitatively different metabolite profiles. Furthermore, their metabolite compositions differed according to individual plant samples. These results indicate that the UPLC-QTOF/MS-based profiling method is a good tool to analyze various metabolites in P. grandiflorum. This metabolomics approach can also be applied to evaluate the overall quality of P. grandiflorum, as well as to discriminate the cultivars for the medicinal plant industry. PMID:26569219

  18. Integration of Network Topological and Connectivity Properties for Neuroimaging Classification

    PubMed Central

    Jie, Biao; Gao, Wei; Wang, Qian; Wee, Chong-Yaw

    2014-01-01

    Rapid advances in neuroimaging techniques have provided an efficient and noninvasive way for exploring the structural and functional connectivity of the human brain. Quantitative measurement of abnormality of brain connectivity in patients with neurodegenerative diseases, such as mild cognitive impairment (MCI) and Alzheimer’s disease (AD), has also been widely reported, especially at a group level. Recently, machine learning techniques have been applied to the study of AD and MCI, i.e., to identify the individuals with AD/MCI from the healthy controls (HCs). However, most existing methods focus on using only a single property of a connectivity network, although multiple network properties, such as local connectivity and global topological properties, can potentially be used. In this paper, by employing a multikernel-based approach, we propose a novel connectivity-based framework to integrate multiple properties of a connectivity network for improving the classification performance. Specifically, two different types of kernels (i.e., vector-based kernel and graph kernel) are used to quantify two different yet complementary properties of the network, i.e., local connectivity and global topological properties. Then, the multikernel learning (MKL) technique is adopted to fuse these heterogeneous kernels for neuroimaging classification. We test the performance of our proposed method on two different data sets. First, we test it on the functional connectivity networks of 12 MCI and 25 HC subjects. The results show that our method achieves significant performance improvement over those using only one type of network property. Specifically, our method achieves a classification accuracy of 91.9%, which is 10.8% better than that of single-network-property-based methods. Then, we test our method for gender classification on a large set of functional connectivity networks with 133 infants scanned at birth, 1 year, and 2 years, also demonstrating very promising results. PMID:24108708
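
    The kernel-fusion idea, a weighted combination of a vector-based kernel with a second, complementary kernel, can be sketched as follows. The kernels, weights, and data shapes are illustrative stand-ins for the MKL formulation in the paper; in particular, an RBF kernel takes the place of the graph kernel here for brevity.

```python
import numpy as np

def linear_kernel(X):
    # Vector-based (linear) kernel on feature vectors
    return X @ X.T

def rbf_kernel(X, gamma=0.5):
    # Gaussian RBF kernel; stands in for a graph kernel in this sketch
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def fuse_kernels(kernels, betas):
    # Multikernel fusion: convex combination of base kernel matrices
    betas = np.asarray(betas, float)
    betas = betas / betas.sum()
    return sum(b * K for b, K in zip(betas, kernels))

rng = np.random.default_rng(1)
X = rng.normal(size=(37, 10))   # e.g. 37 subjects, 10 network features
K = fuse_kernels([linear_kernel(X), rbf_kernel(X)], [0.6, 0.4])
```

    Because a convex combination of positive semidefinite kernels is itself positive semidefinite, the fused matrix K can be handed directly to any kernel classifier, with the weights learned or tuned by cross-validation.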

  19. Re-assessing the relationship between sporozoite dose and incubation period in Plasmodium vivax malaria: a systematic re-analysis.

    PubMed

    Lover, Andrew A; Coker, Richard J

    2014-05-01

    Infections with the malaria parasite Plasmodium vivax are noteworthy for potentially very long incubation periods (6-9 months), which present a major barrier to disease elimination. Increased sporozoite challenge has been reported to be associated with both shorter incubation and pre-patent periods in a range of human challenge studies. However, this evidence base has scant empirical foundation, as the historical analyses were limited by the analytic methods then available and provide no quantitative estimates of effect size. Following a comprehensive literature search, we re-analysed all identified studies using survival and/or logistic models plus contingency tables. We found very weak evidence for dose-dependence at entomologically plausible inoculum levels. These results strongly suggest that sporozoite dosage is not an important driver of long latency. The evidence presented suggests that parasite strain and vector species have quantitatively greater impacts, and that a dose threshold may exist in the human response to sporozoites. Greater consideration of the complex interplay between these aspects of vectors and parasites is important for human challenge experiments, vaccine trials, and epidemiology towards global malaria elimination.

  20. Quantitative brain tissue oximetry, phase spectroscopy and imaging the range of homeostasis in piglet brain.

    PubMed

    Chance, Britton; Ma, Hong Yan; Nioka, Shoko

    2003-01-01

    The quantification of tissue oxygen by frequency or time domain methods has been discussed in a number of prior publications where the meaning of the tissue hemoglobin oxygen saturation was unclear and where the CW instruments were unsuitable for proper quantitative measurements [1, 2]. The development of the IQ Phase Meter has greatly simplified and made reliable the difficult determination of precise phase and amplitude signals from brain. This contribution reports on the calibration of the instrument in model systems and the use of the instrument to measure tissue saturation (StO2) in a small animal model. In addition, a global interpretation of the meaning of tissue oxygen has been formulated based on the idea that autoregulation will maintain tissue oxygen at a fixed value over a range of arterial and venous oxygen values over the range of autoregulation. Beyond that range, the tissue oxygen is still correctly measured but, as expected, approaches the arterial saturation at low metabolic rates and the venous saturation at high metabolic rates of mitochondria.

  1. Incorporating an Interactive Statistics Workshop into an Introductory Biology Course-Based Undergraduate Research Experience (CURE) Enhances Students’ Statistical Reasoning and Quantitative Literacy Skills †

    PubMed Central

    Olimpo, Jeffrey T.; Pevey, Ryan S.; McCabe, Thomas M.

    2018-01-01

    Course-based undergraduate research experiences (CUREs) provide an avenue for student participation in authentic scientific opportunities. Within the context of such coursework, students are often expected to collect, analyze, and evaluate data obtained from their own investigations. Yet, limited research has been conducted that examines mechanisms for supporting students in these endeavors. In this article, we discuss the development and evaluation of an interactive statistics workshop that was expressly designed to provide students with an open platform for graduate teaching assistant (GTA)-mentored data processing, statistical testing, and synthesis of their own research findings. Mixed methods analyses of pre/post-intervention survey data indicated a statistically significant increase in students’ reasoning and quantitative literacy abilities in the domain, as well as enhancement of student self-reported confidence in and knowledge of the application of various statistical metrics to real-world contexts. Collectively, these data reify an important role for scaffolded instruction in statistics in preparing emergent scientists to be data-savvy researchers in a globally expansive STEM workforce. PMID:29904549

  2. Incorporating an Interactive Statistics Workshop into an Introductory Biology Course-Based Undergraduate Research Experience (CURE) Enhances Students' Statistical Reasoning and Quantitative Literacy Skills.

    PubMed

    Olimpo, Jeffrey T; Pevey, Ryan S; McCabe, Thomas M

    2018-01-01

    Course-based undergraduate research experiences (CUREs) provide an avenue for student participation in authentic scientific opportunities. Within the context of such coursework, students are often expected to collect, analyze, and evaluate data obtained from their own investigations. Yet, limited research has been conducted that examines mechanisms for supporting students in these endeavors. In this article, we discuss the development and evaluation of an interactive statistics workshop that was expressly designed to provide students with an open platform for graduate teaching assistant (GTA)-mentored data processing, statistical testing, and synthesis of their own research findings. Mixed methods analyses of pre/post-intervention survey data indicated a statistically significant increase in students' reasoning and quantitative literacy abilities in the domain, as well as enhancement of student self-reported confidence in and knowledge of the application of various statistical metrics to real-world contexts. Collectively, these data reify an important role for scaffolded instruction in statistics in preparing emergent scientists to be data-savvy researchers in a globally expansive STEM workforce.

  3. Asia's changing role in global climate change.

    PubMed

    Siddiqi, Toufiq A

    2008-10-01

    Asia's role in global climate change has evolved significantly since the time when the Kyoto Protocol was being negotiated. Emissions of carbon dioxide, the principal greenhouse gas, from energy use in Asian countries now exceed those from the European Union or North America. Three of the top five emitters (China, India, and Japan) are Asian countries. Any meaningful global effort to address climate change requires the active cooperation of these and other large Asian countries if it is to succeed. Issues of equity between countries, within countries, and between generations need to be tackled. Some current and historical quantitative data are provided to illustrate the difficulties involved, and one approach to making progress is suggested.

  4. Efficient correction of wavefront inhomogeneities in X-ray holographic nanotomography by random sample displacement

    NASA Astrophysics Data System (ADS)

    Hubert, Maxime; Pacureanu, Alexandra; Guilloud, Cyril; Yang, Yang; da Silva, Julio C.; Laurencin, Jerome; Lefebvre-Joud, Florence; Cloetens, Peter

    2018-05-01

    In X-ray tomography, ring-shaped artifacts present in the reconstructed slices are an inherent problem degrading the global image quality and hindering the extraction of quantitative information. To overcome this issue, we propose a strategy for suppression of ring artifacts originating from the coherent mixing of the incident wave and the object. We discuss the limits of validity of the empty beam correction in the framework of a simple formalism. We then deduce a correction method based on two-dimensional random sample displacement, with minimal cost in terms of spatial resolution, acquisition, and processing time. The method is demonstrated on bone tissue and on a hydrogen electrode of a ceramic-metallic solid oxide cell. Compared to the standard empty beam correction, we obtain high quality nanotomography images revealing detailed object features. The resulting absence of artifacts allows straightforward segmentation and posterior quantification of the data.

  5. A Subpixel Classification of Multispectral Satellite Imagery for Interpretation of Tundra-Taiga Ecotone Vegetation (Case Study on Tuliok River Valley, Khibiny, Russia)

    NASA Astrophysics Data System (ADS)

    Mikheeva, A. I.; Tutubalina, O. V.; Zimin, M. V.; Golubeva, E. I.

    2017-12-01

    The tundra-taiga ecotone plays a significant role in northern ecosystems. Owing to global climatic changes, the vegetation of the ecotone is a key object of many remote-sensing studies. The interpretation of vegetation and non-vegetation objects of the tundra-taiga ecotone in satellite imagery of moderate resolution is complicated by the difficulty of extracting these objects from the spectral and spatial mixtures within a pixel. This article describes a method for the subpixel classification of a Terra ASTER satellite image for vegetation mapping of the tundra-taiga ecotone in the Tuliok River valley, Khibiny Mountains, Russia. It was demonstrated that this method makes it possible to determine the position of the boundaries of ecotone objects and their abundance on the basis of quantitative criteria, which provides a more accurate characterization of ecotone vegetation than the per-pixel approach to automatic imagery interpretation.
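
    A common way to realize such subpixel analysis is linear spectral unmixing: each pixel spectrum is modeled as a mixture of endmember spectra, and the per-pixel fractions are recovered by least squares. The sketch below uses made-up 4-band endmember spectra and is not tied to the ASTER processing actually used in the study.

```python
import numpy as np

def unmix(pixel, endmembers):
    # Linear spectral unmixing: solve pixel ~= endmembers @ fractions
    # by least squares, then clip negatives and renormalize to sum to 1
    fractions, *_ = np.linalg.lstsq(endmembers, pixel, rcond=None)
    fractions = np.clip(fractions, 0.0, None)
    return fractions / fractions.sum()

# Hypothetical 4-band reflectances for two endmembers
# (columns: vegetation, bare rock)
E = np.array([[0.05, 0.60],
              [0.08, 0.55],
              [0.45, 0.50],
              [0.50, 0.45]])
mixed = 0.7 * E[:, 0] + 0.3 * E[:, 1]   # a 70/30 mixed pixel
f = unmix(mixed, E)
```

    The recovered fractions give the subpixel abundance of each cover type, which is the quantitative criterion the per-pixel approach cannot provide.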

  6. High-throughput sequencing methods to study neuronal RNA-protein interactions.

    PubMed

    Ule, Jernej

    2009-12-01

    UV-cross-linking and RNase protection, combined with high-throughput sequencing, have provided global maps of RNA sites bound by individual proteins or ribosomes. Using a stringent purification protocol, UV-CLIP (UV-cross-linking and immunoprecipitation) was able to identify intronic and exonic sites bound by splicing regulators in mouse brain tissue. Ribosome profiling has been used to quantify ribosome density on budding yeast mRNAs under different environmental conditions. Post-transcriptional regulation in neurons requires high spatial and temporal precision, as is evident from the role of localized translational control in synaptic plasticity. It remains to be seen if the high-throughput methods can be applied quantitatively to study the dynamics of RNP (ribonucleoprotein) remodelling in specific neuronal populations during the neurodegenerative process. It is certain, however, that applications of new biochemical techniques followed by high-throughput sequencing will continue to provide important insights into the mechanisms of neuronal post-transcriptional regulation.

  7. Modeling of Convective-Stratiform Precipitation Processes: Sensitivity to Partitioning Methods

    NASA Technical Reports Server (NTRS)

    Lang, S. E.; Tao, W.-K.; Simpson, J.; Ferrier, B.; Starr, David OC. (Technical Monitor)

    2001-01-01

    Six different convective-stratiform separation techniques, including a new technique that utilizes the ratio of vertical and terminal velocities, are compared and evaluated using two-dimensional numerical simulations of a tropical [Tropical Ocean Global Atmosphere Coupled Ocean Atmosphere Response Experiment (TOGA COARE)] and midlatitude continental [Preliminary Regional Experiment for STORM-Central (PRESTORM)] squall line. Comparisons are made in terms of rainfall, cloud coverage, mass fluxes, apparent heating and moistening, mean hydrometeor profiles, CFADs (Contoured Frequency with Altitude Diagrams), microphysics, and latent heating retrieval. Overall, it was found that the different separation techniques produced results that qualitatively agreed. However, the quantitative differences were significant. Observational comparisons were unable to conclusively evaluate the performance of the techniques. Latent heating retrieval was shown to be sensitive to the use of separation technique mainly due to the stratiform region for methods that found very little stratiform rain.
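
    The new technique mentioned above classifies columns by the ratio of vertical air velocity to hydrometeor terminal velocity. A minimal sketch of that idea follows; the 0.5 threshold and the sample velocities are arbitrary illustrations, not values from the paper.

```python
import numpy as np

def partition_by_velocity_ratio(w, v_t, threshold=0.5):
    # Flag a column as convective when the updraft magnitude is a large
    # fraction of the hydrometeor terminal fall speed; otherwise call it
    # stratiform. The threshold value here is illustrative only.
    ratio = np.abs(w) / np.maximum(v_t, 1e-6)   # guard against v_t == 0
    return np.where(ratio > threshold, "convective", "stratiform")

w = np.array([3.0, 0.2, 5.5, 0.05])    # vertical velocities (m/s), made up
v_t = np.array([4.0, 5.0, 6.0, 4.5])   # terminal velocities (m/s), made up
labels = partition_by_velocity_ratio(w, v_t)
```

    The intuition is that in convective regions updrafts are strong enough to suspend or loft hydrometeors, while in stratiform regions particles fall much faster than the ambient air rises.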

  8. The Analysis of Duocentric Social Networks: A Primer.

    PubMed

    Kennedy, David P; Jackson, Grace L; Green, Harold D; Bradbury, Thomas N; Karney, Benjamin R

    2015-02-01

    Marriages and other intimate partnerships are facilitated or constrained by the social networks within which they are embedded. To date, methods used to assess the social networks of couples have been limited to global ratings of social network characteristics or network data collected from each partner separately. In the current article, the authors offer new tools for expanding on the existing literature by describing methods of collecting and analyzing duocentric social networks, that is, the combined social networks of couples. They provide an overview of the key considerations for measuring duocentric networks, such as how and why to combine separate network interviews with partners into one shared duocentric network, the number of network members to assess, and the implications of different network operationalizations. They illustrate these considerations with analyses of social network data collected from 57 low-income married couples, presenting visualizations and quantitative measures of network composition and structure.
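
    Constructing a duocentric network amounts to merging the two partners' separate ego networks into a single shared edge set. A toy sketch with invented names:

```python
def combine_ego_networks(net_a, net_b):
    # Duocentric network: the union of the two partners' ego networks.
    # Each network is a set of undirected edges stored as frozensets,
    # so ("wife", "husband") and ("husband", "wife") are the same edge.
    return net_a | net_b

# Hypothetical ego networks elicited from each partner separately
wife = {frozenset(e) for e in [("wife", "husband"),
                               ("wife", "mom"),
                               ("wife", "ann")]}
husband = {frozenset(e) for e in [("husband", "wife"),
                                  ("husband", "bob"),
                                  ("husband", "ann")]}

duo = combine_ego_networks(wife, husband)
members = set().union(*duo)   # everyone named in the combined network
```

    Measures of composition (e.g., the share of network members known to both partners, like "ann" here) and structure can then be computed on the combined network rather than on each partner's network alone.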

  9. Variability of Protein Structure Models from Electron Microscopy.

    PubMed

    Monroe, Lyman; Terashi, Genki; Kihara, Daisuke

    2017-04-04

    An increasing number of biomolecular structures are solved by electron microscopy (EM). However, the quality of structure models determined from EM maps varies substantially. To understand to what extent structure models are supported by information embedded in EM maps, we used two computational structure refinement methods to examine how much structures can be refined, using a dataset of 49 maps with accompanying structure models. The extent of structure modification, as well as the disagreement between refinement models produced by the two computational methods, scaled inversely with the global and local map resolutions. A general quantitative estimate of structural deviation at particular map resolutions is provided. Our results indicate that the observed discrepancy between the deposited map and the refined models is due to the lack of structural information present in EM maps, and thus these annotations must be used with caution for further applications. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. Comparisons of ionospheric electron density distributions reconstructed by GPS computerized tomography, backscatter ionograms, and vertical ionograms

    NASA Astrophysics Data System (ADS)

    Zhou, Chen; Lei, Yong; Li, Bofeng; An, Jiachun; Zhu, Peng; Jiang, Chunhua; Zhao, Zhengyu; Zhang, Yuannong; Ni, Binbin; Wang, Zemin; Zhou, Xuhua

    2015-12-01

    Global Positioning System (GPS) computerized ionosphere tomography (CIT) and ionospheric sky-wave ground backscatter radar are both capable of measuring large-scale, two-dimensional (2-D) distributions of ionospheric electron density (IED). Here we report the spatial and temporal electron density results obtained by GPS CIT and backscatter ionogram (BSI) inversion for three individual experiments. Both the GPS CIT and BSI inversion techniques demonstrate the capability and consistency of reconstructing large-scale IED distributions. To validate the results, electron density profiles obtained from GPS CIT and BSI inversion are quantitatively compared to vertical ionosonde data, which clearly demonstrates that both methods yield accurate ionospheric electron density information and thereby provide reliable approaches to ionospheric sounding. Our study can improve current understanding of the capabilities and limitations of these two methods for large-scale IED reconstruction.

  11. Working in the global world: looking for more modern workplace overseas

    NASA Astrophysics Data System (ADS)

    Joko Pitoyo, Agus

    2018-04-01

    International labor migration overseas is very complex and involves multiple dimensions, ranging from economic to social, political, and cultural issues. A very long history of migration has affected the lives of households, which have depended on it heavily, not only in economic terms but also in broader aspects. This research aims to understand the history and process of international labor migration from the District of Ponorogo. The research used a mixed-methods design, combining quantitative and qualitative approaches. First, a survey was conducted involving about a thousand households, followed by a qualitative phase applying in-depth interviews and FGDs with sub-samples of households and some key informants. The results show that intergenerational migration has become established in Ponorogo. There has been an expansion of destination countries, with the establishment of many international migration routes. Migrant workers from Ponorogo have traveled to very distant continents and to more modern countries.

  12. Paradoxes of the comparative analysis of ground-based and satellite geodetic measurements in recent geodynamics

    NASA Astrophysics Data System (ADS)

    Kuzmin, Yu. O.

    2017-11-01

    The comparative analysis of the Earth's surface deformations measured by ground-based and satellite geodetic methods on the regional and zonal measurement scales is carried out. The displacement velocities and strain rates are compared in active regions such as the Turkmenian-Iranian zone of interaction of the Arabian and Eurasian lithospheric plates and the Kamchatka segment of the subduction of the Pacific Plate beneath the Okhotsk Plate. The comparison yields a paradoxical result. With qualitatively identical kinematics of the motion, the quantitative characteristics of the displacement velocities and strain rates revealed by observations using the global navigation satellite system (GNSS) are 1-2 orders of magnitude higher than those estimated by the more accurate methods of ground-based geodesy. Resolving the revealed paradoxes will require special studies devoted to the joint analysis of ground-based and satellite geodetic data from combined observation sites.

  13. Development of Next Generation Lifetime PSP Imaging Systems

    NASA Technical Reports Server (NTRS)

    Watkins, A. Neal; Jordan, Jeffrey D.; Leighty, Bradley D.; Ingram, JoAnne L.; Oglesby, Donald M.

    2002-01-01

    This paper describes a lifetime PSP system that has recently been developed using pulsed light-emitting diode (LED) lamps and a new interline transfer CCD camera technology. This system alleviates noise sources associated with lifetime PSP systems that use either flash-lamp or laser excitation sources and intensified CCD cameras for detection. Calibration curves have been acquired for a variety of PSP formulations using this system, and a validation test was recently completed in the Subsonic Aerodynamic Research Laboratory (SARL) at Wright-Patterson Air Force Base (WPAFB). In this test, global surface pressure distributions were recovered using both a standard intensity-based method and the new lifetime system. Results from the lifetime system agree both qualitatively and quantitatively with those measured using the intensity-based method. Finally, an advanced lifetime imaging technique capable of measuring temperature and pressure simultaneously is introduced and initial results are presented.

  14. Development of Nested Socioeconomic Storylines for Climate Change IAV Applications (Invited)

    NASA Astrophysics Data System (ADS)

    Preston, B. L.; Absar, M.

    2013-12-01

    Socioeconomic scenarios are important for understanding the future societal consequences of climate and weather. The global shared socioeconomic pathways (SSPs) represent a new opportunity for coordinated development and application of such scenarios to improve the representation of alternative societal development pathways within climate change consequence analysis. However, capitalizing on this opportunity necessitates bridging the scale disparity between the global SSPs and the regional/local context in which many impact, adaptation, and vulnerability (IAV) studies are conducted. To this end, we adopted the Factor, Actor, and Sector methodology to develop a set of qualitative national and sub-national socioeconomic storylines for the United States and the U.S. Southeast, using the global SSPs as boundary conditions. In particular, our study sought to develop storylines to explore alternative socioeconomic futures for the U.S. Southeast and their implications for the adaptive capacity of the region's energy, water, and agricultural sectors. These storylines subsequently serve as the foundation for a range of downstream IAV applications. These include qualitative vulnerability analysis to explore interactions between energy, water, and agriculture in a changing climate, as well as quantitative impact assessment in which regional storylines are used to establish modeling parameters within a biophysical crop model. Such methods and applications illustrate potentially useful opportunities for routinizing the use of SSP-based storylines in IAV studies.

  15. Projected Heat Wave Characteristics over the Korean Peninsula During the Twenty-First Century

    NASA Astrophysics Data System (ADS)

    Shin, Jongsoo; Olson, Roman; An, Soon-Il

    2018-02-01

    Climate change is expected to increase temperatures globally, and consequently more frequent, longer, and hotter heat waves are likely to occur. Ambiguity in defining heat waves appropriately makes it difficult to compare changes in heat wave events over time. This study provides a quantitative definition of a heat wave and makes probabilistic heat wave projections for the Korean Peninsula under two global warming scenarios. Changes to heat waves under global warming are investigated using the representative concentration pathway 4.5 (RCP4.5) and 8.5 (RCP8.5) experiments from 30 coupled models participating in Phase 5 of the Coupled Model Intercomparison Project (CMIP5). Probabilistic climate projections from multi-model ensembles have been constructed using both simple and weighted averaging. Results from both methods are similar and show that heat waves will become more intense, frequent, and longer lasting. These trends are more apparent under the RCP8.5 scenario than under the RCP4.5 scenario. Under the RCP8.5 scenario, typical heat waves are projected to become stronger than any heat wave experienced in the recent measurement record. Furthermore, under this scenario, it cannot be ruled out that Korea will experience heat wave conditions spanning almost an entire summer before the end of the 21st century.
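
    The abstract does not reproduce the study's exact heat wave definition, but the common run-length form (at least N consecutive days with daily maximum temperature above a fixed threshold) can be sketched as follows; the threshold, minimum duration, and sample temperatures below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def heat_wave_events(tmax, threshold, min_days=3):
    """Identify heat waves as runs of >= min_days consecutive days whose
    daily maximum temperature exceeds threshold.
    Returns a list of (start_index, length) pairs."""
    hot = np.asarray(tmax) > threshold
    events, start = [], None
    for i, h in enumerate(hot):
        if h and start is None:
            start = i                      # a hot spell begins
        elif not h and start is not None:
            if i - start >= min_days:      # spell long enough to count
                events.append((start, i - start))
            start = None
    if start is not None and len(hot) - start >= min_days:
        events.append((start, len(hot) - start))
    return events

# Two 3-day events in a 10-day illustrative record (degrees C):
tmax = [30, 34, 35, 36, 31, 33, 34, 34, 34, 29]
print(heat_wave_events(tmax, threshold=33))  # → [(1, 3), (6, 3)]
```

    Counting such events per summer in each CMIP5 ensemble member would then give the frequency and duration statistics the abstract describes.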

  16. [Human-robot global Simulink modeling and analysis for an end-effector upper limb rehabilitation robot].

    PubMed

    Liu, Yali; Ji, Linhong

    2018-02-01

    Robot-assisted rehabilitation has become a primary therapy method for meeting the urgent rehabilitation demands of patients paralyzed after a stroke. Training parameters such as the range of the training, which should be adjustable according to each participant's functional ability, are key factors influencing the effectiveness of rehabilitation therapy. Therapists design rehabilitation programs based on semiquantitative functional assessment scales and their own experience, but such experience-based prescriptions cannot be transferred directly to robot rehabilitation therapy. This paper models the global human-robot system in Simulink in order to analyze the relationship between the parameters of robot rehabilitation therapy and patients' movement abilities. We compared the shoulder and elbow angles calculated by simulation with the angles recorded by a motion capture system while healthy subjects completed the simulated action. The results showed a remarkable correlation between the simulation data and the experimental data, which verified the validity of the global human-robot Simulink model. In addition, the relationship between the circle radius in the drawing tasks used in robot rehabilitation training and the active range of motion of the shoulder and elbow was well fitted by a linear function, also with a high fitting coefficient. This fitted linear relationship can serve as a quantitative reference for setting robot rehabilitation training parameters.
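
    The linear fit between drawing-task circle radius and active range of motion can be sketched with an ordinary least-squares line; the radii and joint-angle values below are invented for illustration (the paper's data are not given in the abstract):

```python
import numpy as np

# Hypothetical measurements: circle radius of the drawing task (cm)
# versus active shoulder range of motion (degrees). Values are illustrative.
radius_cm = np.array([5.0, 10.0, 15.0, 20.0, 25.0])
shoulder_rom_deg = np.array([12.0, 23.0, 35.0, 44.0, 57.0])

# Least-squares linear fit: ROM ≈ slope * radius + intercept
slope, intercept = np.polyfit(radius_cm, shoulder_rom_deg, 1)

# Pearson correlation as the "fitting coefficient" the abstract refers to
r = np.corrcoef(radius_cm, shoulder_rom_deg)[0, 1]
```

    A therapist-facing tool could invert such a fit to choose a circle radius matching a patient's measured active range of motion.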

  17. A theory for protein dynamics: Global anisotropy and a normal mode approach to local complexity

    NASA Astrophysics Data System (ADS)

    Copperman, Jeremy; Romano, Pablo; Guenza, Marina

    2014-03-01

    We propose a novel Langevin equation description of the dynamics of biological macromolecules in which the solvent and all atomic degrees of freedom are projected onto a set of coarse-grained sites at the single-residue level. We utilize a multi-scale approach in which molecular dynamics simulations are performed to obtain equilibrium structural correlations, which serve as input to a modified Rouse-Zimm description that can be solved analytically. The normal mode solution provides a minimal basis set that accounts for important properties of biological polymers, such as the anisotropic global structure and internal motion on a complex free-energy surface. This multi-scale modeling method predicts the dynamics of both global rotational diffusion and constrained internal motion from the picosecond to the nanosecond regime, and is quantitative when compared to both simulation trajectories and NMR relaxation times. Utilizing non-equilibrium sampling techniques and an explicit treatment of the free-energy barriers in the mode coordinates, the model is extended to include biologically important fluctuations in the microsecond regime, such as bubble and fork formation in nucleic acids and protein domain motion. This work was supported by the NSF under the Graduate STEM Fellows in K-12 Education (GK-12) program, grant DGE-0742540, and NSF grant DMR-0804145, with computational support from XSEDE and ACISS.

  18. GPHMM: an integrated hidden Markov model for identification of copy number alteration and loss of heterozygosity in complex tumor samples using whole genome SNP arrays

    PubMed Central

    Li, Ao; Liu, Zongzhi; Lezon-Geyda, Kimberly; Sarkar, Sudipa; Lannin, Donald; Schulz, Vincent; Krop, Ian; Winer, Eric; Harris, Lyndsay; Tuck, David

    2011-01-01

    There is an increasing interest in using single nucleotide polymorphism (SNP) genotyping arrays for profiling chromosomal rearrangements in tumors, as they allow simultaneous detection of copy number and loss of heterozygosity with high resolution. Critical issues such as signal baseline shift due to aneuploidy, normal cell contamination, and the presence of GC content bias have been reported to dramatically alter SNP array signals and complicate accurate identification of aberrations in cancer genomes. To address these issues, we propose a novel Global Parameter Hidden Markov Model (GPHMM) to unravel tangled genotyping data generated from tumor samples. In contrast to other HMM methods, a distinct feature of GPHMM is that the issues mentioned above are quantitatively modeled by global parameters and integrated within the statistical framework. We developed an efficient EM algorithm for parameter estimation. We evaluated performance on three data sets and show that GPHMM can correctly identify chromosomal aberrations in tumor samples containing as few as 10% cancer cells. Furthermore, we demonstrated that the estimation of global parameters in GPHMM provides information about the biological characteristics of tumor samples and the quality of genotyping signal from SNP array experiments, which is helpful for data quality control and outlier detection in cohort studies. PMID:21398628

  19. Markerless attenuation correction for carotid MRI surface receiver coils in combined PET/MR imaging

    NASA Astrophysics Data System (ADS)

    Eldib, Mootaz; Bini, Jason; Robson, Philip M.; Calcagno, Claudia; Faul, David D.; Tsoumpas, Charalampos; Fayad, Zahi A.

    2015-06-01

    The purpose of the study was to evaluate the effect of attenuation of MR coils on quantitative carotid PET/MR exams. Additionally, an automated attenuation correction method for flexible carotid MR coils was developed and evaluated. The attenuation of the carotid coil was measured by imaging a uniform water phantom injected with 37 MBq of 18F-FDG in a combined PET/MR scanner for 24 min with and without the coil. In the same session, an ultra-short echo time (UTE) image of the coil on top of the phantom was acquired. Using a combination of rigid and non-rigid registration, a CT-based attenuation map was registered to the UTE image of the coil for attenuation and scatter correction. After phantom validation, the effect of the carotid coil attenuation and the attenuation correction method were evaluated in five subjects. Phantom studies indicated that the overall loss of PET counts due to the coil was 6.3% with local region-of-interest (ROI) errors reaching up to 18.8%. Our registration method to correct for attenuation from the coil decreased the global error and local error (ROI) to 0.8% and 3.8%, respectively. The proposed registration method accurately captured the location and shape of the coil with a maximum spatial error of 2.6 mm. Quantitative analysis in human studies correlated with the phantom findings, but was dependent on the size of the ROI used in the analysis. MR coils result in significant error in PET quantification and thus attenuation correction is needed. The proposed strategy provides an operator-free method for attenuation and scatter correction for a flexible MRI carotid surface coil for routine clinical use.

  20. Quantitative Characterization of Spurious Gibbs Waves in 45 CMIP5 Models

    NASA Astrophysics Data System (ADS)

    Geil, K. L.; Zeng, X.

    2014-12-01

    Gibbs oscillations appear in global climate models when representing fields, such as orography, that contain discontinuities or sharp gradients. It has been known for decades that the oscillations are associated with the transformation of the truncated spectral representation of a field to physical space, and that the oscillations can also be present in global models that do not use spectral methods. The spurious oscillations are potentially detrimental to model simulations (e.g., over ocean), and this work provides a quantitative characterization of the Gibbs oscillations that appear across the Coupled Model Intercomparison Project Phase 5 (CMIP5) models. An ocean transect running through the South Pacific High toward the Andes is used to characterize the oscillations in ten different variables. These oscillations are found to be stationary and hence are not caused by (physical) waves in the atmosphere. We quantify the oscillation amplitude using the root mean square difference (RMSD) between the transect of a variable and its running mean (rather than the constant mean across the transect). We also compute the RMSD-to-interannual-variability (IAV) ratio, which provides a relative measure of the oscillation amplitude. Of the variables examined, the largest RMSD values exist in the surface pressure field of spectral models, while the smallest RMSD values within the surface pressure field come from models that use finite-difference (FD) techniques. Many spectral models have a surface pressure RMSD that is 2 to 15 times greater than IAV over the transect and an RMSD:IAV ratio greater than one for many other variables, including surface temperature, incoming shortwave radiation at the surface, incoming longwave radiation at the surface, and total cloud fraction. In general, the FD models outperform the spectral models, but not all the spectral models have large-amplitude oscillations, and there are a few FD models where the oscillations do appear. Finally, we present a brief comparison of the numerical methods of a select few models to better understand their Gibbs oscillations.

  1. Quantification of Global Diastolic Function by Kinematic Modeling-based Analysis of Transmitral Flow via the Parametrized Diastolic Filling Formalism

    PubMed Central

    Mossahebi, Sina; Zhu, Simeng; Chen, Howard; Shmuylovich, Leonid; Ghosh, Erina; Kovács, Sándor J.

    2014-01-01

    Quantitative cardiac function assessment remains a challenge for physiologists and clinicians. Although historically invasive methods have comprised the only means available, the development of noninvasive imaging modalities (echocardiography, MRI, CT) having high temporal and spatial resolution provide a new window for quantitative diastolic function assessment. Echocardiography is the agreed upon standard for diastolic function assessment, but indexes in current clinical use merely utilize selected features of chamber dimension (M-mode) or blood/tissue motion (Doppler) waveforms without incorporating the physiologic causal determinants of the motion itself. The recognition that all left ventricles (LV) initiate filling by serving as mechanical suction pumps allows global diastolic function to be assessed based on laws of motion that apply to all chambers. What differentiates one heart from another are the parameters of the equation of motion that governs filling. Accordingly, development of the Parametrized Diastolic Filling (PDF) formalism has shown that the entire range of clinically observed early transmitral flow (Doppler E-wave) patterns are extremely well fit by the laws of damped oscillatory motion. This permits analysis of individual E-waves in accordance with a causal mechanism (recoil-initiated suction) that yields three (numerically) unique lumped parameters whose physiologic analogues are chamber stiffness (k), viscoelasticity/relaxation (c), and load (xo). The recording of transmitral flow (Doppler E-waves) is standard practice in clinical cardiology and, therefore, the echocardiographic recording method is only briefly reviewed. Our focus is on determination of the PDF parameters from routinely recorded E-wave data. 
As the highlighted results indicate, once the PDF parameters have been obtained from a suitable number of load-varying E-waves, the investigator is free to use the parameters or construct indexes from them (such as stored energy (1/2)kxo², maximum A-V pressure gradient kxo, the load-independent index of diastolic function, etc.) and select the aspect of physiology or pathophysiology to be quantified. PMID:25226101
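
    Under the PDF formalism described above, the E-wave contour is the velocity of a damped harmonic oscillator released from rest at displacement xo. A minimal sketch of the underdamped solution, assuming unit inertia (the parameter values in the test are illustrative, not clinical):

```python
import numpy as np

def pdf_e_wave(t, k, c, xo):
    """PDF-model transmitral velocity: v(t) for the damped oscillator
    x'' + c x' + k x = 0 with x(0) = -xo, v(0) = 0, in the underdamped
    regime (4k > c^2). k ~ chamber stiffness, c ~ viscoelasticity/
    relaxation, xo ~ load."""
    w = np.sqrt(k - c ** 2 / 4.0)                 # damped angular frequency
    return (k * xo / w) * np.exp(-c * t / 2.0) * np.sin(w * t)

# In practice the three parameters (k, c, xo) are recovered by
# least-squares fitting of this expression to the digitized Doppler
# E-wave envelope, e.g. with scipy.optimize.curve_fit; derived indexes
# such as the stored energy 0.5 * k * xo**2 then follow directly.
```

    The contour starts at zero velocity, rises to a peak, and decays, which is the characteristic Doppler E-wave shape the formalism fits.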

  2. Quantification of global diastolic function by kinematic modeling-based analysis of transmitral flow via the parametrized diastolic filling formalism.

    PubMed

    Mossahebi, Sina; Zhu, Simeng; Chen, Howard; Shmuylovich, Leonid; Ghosh, Erina; Kovács, Sándor J

    2014-09-01

    Quantitative cardiac function assessment remains a challenge for physiologists and clinicians. Although historically invasive methods have comprised the only means available, the development of noninvasive imaging modalities (echocardiography, MRI, CT) having high temporal and spatial resolution provide a new window for quantitative diastolic function assessment. Echocardiography is the agreed upon standard for diastolic function assessment, but indexes in current clinical use merely utilize selected features of chamber dimension (M-mode) or blood/tissue motion (Doppler) waveforms without incorporating the physiologic causal determinants of the motion itself. The recognition that all left ventricles (LV) initiate filling by serving as mechanical suction pumps allows global diastolic function to be assessed based on laws of motion that apply to all chambers. What differentiates one heart from another are the parameters of the equation of motion that governs filling. Accordingly, development of the Parametrized Diastolic Filling (PDF) formalism has shown that the entire range of clinically observed early transmitral flow (Doppler E-wave) patterns are extremely well fit by the laws of damped oscillatory motion. This permits analysis of individual E-waves in accordance with a causal mechanism (recoil-initiated suction) that yields three (numerically) unique lumped parameters whose physiologic analogues are chamber stiffness (k), viscoelasticity/relaxation (c), and load (xo). The recording of transmitral flow (Doppler E-waves) is standard practice in clinical cardiology and, therefore, the echocardiographic recording method is only briefly reviewed. Our focus is on determination of the PDF parameters from routinely recorded E-wave data. 
As the highlighted results indicate, once the PDF parameters have been obtained from a suitable number of load-varying E-waves, the investigator is free to use the parameters or construct indexes from them (such as stored energy (1/2)kxo², maximum A-V pressure gradient kxo, the load-independent index of diastolic function, etc.) and select the aspect of physiology or pathophysiology to be quantified.

  3. Impact of TRMM and SSM/I Rainfall Assimilation on Global Analysis and QPF

    NASA Technical Reports Server (NTRS)

    Hou, Arthur; Zhang, Sara; Reale, Oreste

    2002-01-01

    Evaluation of QPF skill requires quantitatively accurate precipitation analyses. We show that assimilation of surface rain rates derived from the Tropical Rainfall Measuring Mission (TRMM) Microwave Imager and the Special Sensor Microwave/Imager (SSM/I) improves quantitative precipitation estimates (QPE) and many aspects of global analyses. Short-range forecasts initialized with analyses that incorporate satellite rainfall data generally yield significantly higher QPF threat scores and better storm track predictions. These results were obtained using a variational procedure that minimizes the difference between the observed and model rain rates by correcting the moist physics tendency of the forecast model over a 6-h assimilation window. In two case studies, of Hurricanes Bonnie and Floyd, synoptic analysis shows that this procedure produces initial conditions with better-defined tropical storm features and stronger precipitation intensity associated with the storm.

  4. How structurally stable are global socioeconomic systems?

    PubMed Central

    Saavedra, Serguei; Rohr, Rudolf P.; Gilarranz, Luis J.; Bascompte, Jordi

    2014-01-01

    The stability analysis of socioeconomic systems has been centred on answering whether small perturbations, when a system is in a given quantitative state, will push the system permanently to a different quantitative state. However, the quantitative state of socioeconomic systems is typically subject to constant change. Therefore, a key stability question that has been under-investigated is how strongly the conditions of a system itself can change before the system moves to a qualitatively different behaviour, i.e. how structurally stable the system is. Here, we introduce a framework to investigate the structural stability of socioeconomic systems formed by a network of interactions among agents competing for resources. We measure the structural stability of the system as the range of conditions in the distribution and availability of resources compatible with the qualitative behaviour in which all the constituent agents can be self-sustained across time. To illustrate our framework, we study an empirical representation of the global socioeconomic system formed by countries sharing and competing for multinational companies, used as a proxy for resources. We demonstrate that the structural stability of the system is inversely associated with the level of competition and the level of heterogeneity in the distribution of resources. Importantly, we show that the qualitative behaviour of the observed global socioeconomic system is highly sensitive to changes in the distribution of resources. We believe that this work provides a methodological basis to develop sustainable strategies for socioeconomic systems subject to constantly changing conditions. PMID:25165600

  5. A protein-dependent side-chain rotamer library.

    PubMed

    Bhuyan, Md Shariful Islam; Gao, Xin

    2011-12-14

    The protein side-chain packing problem has remained one of the key open problems in bioinformatics. The three main components of protein side-chain prediction methods are a rotamer library, an energy function, and a search algorithm. Rotamer libraries quantitatively summarize the existing knowledge from experimentally determined structures. Depending on how much contextual information is encoded, there are backbone-independent rotamer libraries and backbone-dependent rotamer libraries. Backbone-independent libraries encode only sequential information, whereas backbone-dependent libraries encode both sequential and locally structural information. However, side-chain conformations are determined by spatially local information rather than sequentially local information. Since the backbone structure is given in the side-chain prediction problem, spatially local information should ideally be encoded into the rotamer libraries. In this paper, we propose a new type of backbone-dependent rotamer library, which encodes structural information of all the spatially neighboring residues; we call these protein-dependent rotamer libraries. Given any rotamer library and a protein backbone structure, we first model the protein structure as a Markov random field. Then the marginal distributions are estimated by inference algorithms, without global optimization or search. The rotamers from the given library are then re-ranked and associated with the updated probabilities. Experimental results demonstrate that the proposed protein-dependent libraries significantly outperform the widely used backbone-dependent libraries in terms of side-chain prediction accuracy and rotamer ranking ability. Furthermore, without global optimization/search, the side-chain prediction power of the protein-dependent library is still comparable to that of global-search-based side-chain prediction methods.

  6. Does quantitative left ventricular regional wall motion change after fibrous tissue resection in endomyocardial fibrosis?

    PubMed

    Salemi, Vera Maria Cury; Fernandes, Fabio; Sirvente, Raquel; Nastari, Luciano; Rosa, Leonardo Vieira; Ferreira, Cristiano A; Pena, José Luiz Barros; Picard, Michael H; Mady, Charles

    2009-01-01

    We compared left ventricular regional wall motion, the global left ventricular ejection fraction, and the New York Heart Association (NYHA) functional class pre- and postoperatively. Endomyocardial fibrosis is characterized by fibrous tissue deposition in the endomyocardium of the apex and/or inflow tract of one or both ventricles. Although left ventricular global systolic function is preserved, patients exhibit wall motion abnormalities in the apical and inferoapical regions. Fibrous tissue resection in NYHA functional class III and IV endomyocardial fibrosis patients has been shown to decrease morbidity and mortality. We prospectively studied 30 patients (20 female, aged 30+/-10 years) before and 5+/-8 months after surgery. The left ventricular ejection fraction was determined using the area-length method. Regional left ventricular wall motion was measured by the centerline method. Five left ventricular segments were analyzed pre- and postoperatively. Abnormality was expressed in units of standard deviation from the mean motion in a normal reference population. Left ventricular wall motion in the five regions did not differ between pre- and postoperative measurements. Additionally, the left ventricular ejection fraction did not change after surgery (0.45+/-0.13 vs. 0.43+/-0.12 pre- and postoperatively, respectively). The NYHA functional class improved to class I in 40% and class II in 43% of patients postoperatively (p<0.05). Although endomyocardial fibrosis patients have improved clinical symptoms after surgery, the global left ventricular ejection fraction and regional wall motion do not change. This finding suggests that other explanations, such as improvements in diastolic function, may be operational.
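
    The area-length ejection fraction computation referenced above can be sketched with the standard single-plane formula V = 8A²/(3πL); the areas and lengths below are illustrative values, not patient data from the study:

```python
import math

def area_length_volume(area_cm2, length_cm):
    """Single-plane area-length LV volume estimate, V = 8*A^2 / (3*pi*L),
    with A the traced chamber area (cm^2) and L the long-axis length (cm).
    Returns volume in mL."""
    return 8.0 * area_cm2 ** 2 / (3.0 * math.pi * length_cm)

def ejection_fraction(edv_ml, esv_ml):
    """Ejection fraction as a fraction of end-diastolic volume."""
    return (edv_ml - esv_ml) / edv_ml

# Illustrative traces: end-diastolic area 30 cm^2 / length 8 cm,
# end-systolic area 22 cm^2 / length 7 cm.
edv = area_length_volume(30.0, 8.0)
esv = area_length_volume(22.0, 7.0)
ef = ejection_fraction(edv, esv)
```

    Repeating the calculation pre- and postoperatively, as in the study, lets the two ejection fractions be compared directly on the same 0-1 scale.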

  7. Global Auroral Energy Deposition during Substorm Onset Compared with Local Time and Solar Wind IMF Conditions

    NASA Technical Reports Server (NTRS)

    Spann, J. F.; Brittnacher, M.; Fillingim, M. O.; Germany, G. A.; Parks, G. K.

    1998-01-01

    The global images made by the Ultraviolet Imager (UVI) aboard the ISTP/Polar satellite are used to derive the global auroral energy deposited in the ionosphere by electron precipitation. During a substorm onset, the deposited energy and its location in local time are compared to the solar wind IMF conditions. Previously, only in situ measurements of precipitating particles along the tracks of low-orbiting satellites and global images of the auroral zone, without the ability to quantify energy parameters, have been available. The high temporal, spatial, and spectral resolution of consecutive UVI images enables quantitative measurement of the energy deposited in the ionosphere on a global scale, which was not previously possible. Data over an extended period beginning in January 1997 will be presented.

  8. Applying quantitative benefit-risk analysis to aid regulatory decision making in diagnostic imaging: methods, challenges, and opportunities.

    PubMed

    Agapova, Maria; Devine, Emily Beth; Bresnahan, Brian W; Higashi, Mitchell K; Garrison, Louis P

    2014-09-01

    Health agencies making regulatory marketing-authorization decisions use qualitative and quantitative approaches to assess the expected benefits and expected risks associated with medical interventions. There is, however, no universal standard approach that regulatory agencies consistently use to conduct benefit-risk assessment (BRA) for pharmaceuticals or medical devices, including imaging technologies. Economics, health services research, and health outcomes research use quantitative approaches to elicit stakeholder preferences, identify priorities, and model health conditions and health intervention effects. Challenges to BRA in medical devices are outlined, highlighting additional barriers in radiology. Three quantitative methods--multi-criteria decision analysis, health outcomes modeling, and stated-choice surveys--are assessed using criteria that are important in balancing benefits and risks of medical devices and imaging technologies. To be useful in regulatory BRA, quantitative methods need to aggregate multiple benefits and risks, incorporate qualitative considerations, account for uncertainty, and make clear whose preferences/priorities are being used. Each quantitative method performs differently across these criteria, and little is known about how BRA estimates and conclusions vary by approach. While no specific quantitative method is likely to be the strongest in all of the important areas, quantitative methods may have a place in BRA of medical devices and radiology. Quantitative BRA approaches have been more widely applied to medicines, with fewer BRAs of devices. Despite substantial differences in the characteristics of pharmaceuticals and devices, BRA methods may be as applicable to medical devices and imaging technologies as they are to pharmaceuticals. Further research to guide the development and selection of quantitative BRA methods for medical devices and imaging technologies is needed. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.

  9. Multiscale geomorphometric modeling of Mercury

    NASA Astrophysics Data System (ADS)

    Florinsky, I. V.

    2018-02-01

    Topography is one of the key characteristics of a planetary body. Geomorphometry deals with quantitative modeling and analysis of the topographic surface and of relationships between topography and other natural components of landscapes. The surface of Mercury has been systematically studied by interpretation of images acquired during the MErcury Surface, Space ENvironment, GEochemistry, and Ranging (MESSENGER) mission. However, the Mercurian surface is still little explored by the methods of geomorphometry. In this paper, we evaluate the Mercury MESSENGER Global DEM MSGR_DEM_USG_SC_I_V02, a global digital elevation model (DEM) of Mercury with a resolution of 0.015625°, as a source for geomorphometric modeling of this planet. The study was performed at three spatial scales: global, regional (the Caloris basin), and local (the Pantheon Fossae area). As the initial data, we used three DEMs of these areas with resolutions of 0.25°, 0.0625°, and 0.015625°, respectively. The DEMs were extracted from the MESSENGER Global DEM. From the DEMs, we derived digital models of several fundamental morphometric variables: slope gradient, horizontal curvature, vertical curvature, minimal curvature, maximal curvature, catchment area, and dispersive area. The morphometric maps obtained represent peculiarities of the Mercurian topography in different ways, according to the physical and mathematical meaning of each particular variable. Geomorphometric models are a rich source of information on the Mercurian surface. These data can be utilized to study the evolution and internal structure of the planet, for example, to visualize and quantify regional topographic differences as well as to refine geological boundaries.
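
    As an illustration of the simplest of the listed morphometric variables, slope gradient can be derived from a DEM by finite differences; the sketch below is a generic implementation, not the software used in the study:

```python
import numpy as np

def slope_gradient(dem, cell_size=1.0):
    """Slope gradient in degrees from a 2-D elevation array.
    Elevation derivatives are estimated by central differences
    (one-sided at the grid edges) via np.gradient."""
    dzdy, dzdx = np.gradient(np.asarray(dem, dtype=float), cell_size)
    # slope = arctan of the magnitude of the surface gradient vector
    return np.degrees(np.arctan(np.hypot(dzdx, dzdy)))

# A plane rising 1 unit of elevation per 1 unit of cell size has a
# uniform 45-degree slope everywhere.
tilted = np.outer(np.arange(5.0), np.ones(5))
```

    The curvature variables named in the abstract are computed analogously from the second-order derivatives of the same elevation grid.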

  10. Design and package of a {sup 14}CO{sub 2} field analyzer The Global Monitor Platform (GMP)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bright, Michelle; Marino, Bruno D.V.; Gronniger, Glen

    2011-08-01

    Carbon Capture and Sequestration (CCS) is widely accepted as a means to reduce and eliminate fossil fuel CO{sub 2} (ff-CO{sub 2}) emissions from coal-fired power plants. Success of CCS depends on near-zero leakage rates over decadal time scales. Currently, no commercial methods are available to determine leakage of ff-CO{sub 2}. The Global Monitor Platform (GMP) field analyzer provides high-precision analysis of CO{sub 2} isotopes [{sup 12}C (99%), {sup 13}C (<1%), {sup 14}C (1.2x10{sup -10}%)] that can differentiate between fossil and biogenic CO{sub 2} emissions. Fossil fuels contain no {sup 14}C; their combustion should lower atmospheric {sup 14}C amounts on local to global scales. There is a clear mandate for monitoring, verification and accounting (MVA) of CCS systems nationally and globally to verify CCS integrity, support treaty verification (Kyoto Protocol) and characterize the nuclear fuel cycle. Planetary Emissions Management (PEM), working with the National Secure Manufacturing Center (NSMC), has the goal of designing, ruggedizing and packaging the GMP for field deployment. The system will first conduct atmospheric monitoring and will then be adapted to monitor water and soil. Measuring {sup 14}CO{sub 2} in real time will provide quantitative concentration data for ff-CO{sub 2} in the atmosphere and enable CCS leakage detection. Initial results will be discussed along with design changes for improved detection sensitivity and manufacturability.

  11. In situ structure and dynamics of DNA origami determined through molecular dynamics simulations

    PubMed Central

    Yoo, Jejoong; Aksimentiev, Aleksei

    2013-01-01

    The DNA origami method permits folding of long single-stranded DNA into complex 3D structures with subnanometer precision. Transmission electron microscopy, atomic force microscopy, and recently cryo-EM tomography have been used to characterize the properties of such DNA origami objects; however, their microscopic structures and dynamics have remained unknown. Here, we report the results of all-atom molecular dynamics simulations that characterized the structural and mechanical properties of DNA origami objects in unprecedented microscopic detail. When simulated in an aqueous environment, the structures of DNA origami objects depart from their idealized targets as a result of steric, electrostatic, and solvent-mediated forces. Whereas the global structural features of such relaxed conformations conform to the target designs, local deformations are abundant and vary in magnitude along the structures. In contrast to their free-solution conformation, the Holliday junctions in the DNA origami structures adopt a left-handed antiparallel conformation. We find that the DNA origami structures undergo considerable temporal fluctuations on both local and global scales. Analysis of such structural fluctuations reveals the local mechanical properties of the DNA origami objects. The lattice type of the structures considerably affects global mechanical properties such as bending rigidity. Our study demonstrates the potential of all-atom molecular dynamics simulations to play a considerable role in future development of the DNA origami field by providing accurate, quantitative assessment of local and global structural and mechanical properties of DNA origami objects. PMID:24277840

  12. In situ structure and dynamics of DNA origami determined through molecular dynamics simulations.

    PubMed

    Yoo, Jejoong; Aksimentiev, Aleksei

    2013-12-10

    The DNA origami method permits folding of long single-stranded DNA into complex 3D structures with subnanometer precision. Transmission electron microscopy, atomic force microscopy, and recently cryo-EM tomography have been used to characterize the properties of such DNA origami objects; however, their microscopic structures and dynamics have remained unknown. Here, we report the results of all-atom molecular dynamics simulations that characterized the structural and mechanical properties of DNA origami objects in unprecedented microscopic detail. When simulated in an aqueous environment, the structures of DNA origami objects depart from their idealized targets as a result of steric, electrostatic, and solvent-mediated forces. Whereas the global structural features of such relaxed conformations conform to the target designs, local deformations are abundant and vary in magnitude along the structures. In contrast to their free-solution conformation, the Holliday junctions in the DNA origami structures adopt a left-handed antiparallel conformation. We find that the DNA origami structures undergo considerable temporal fluctuations on both local and global scales. Analysis of such structural fluctuations reveals the local mechanical properties of the DNA origami objects. The lattice type of the structures considerably affects global mechanical properties such as bending rigidity. Our study demonstrates the potential of all-atom molecular dynamics simulations to play a considerable role in future development of the DNA origami field by providing accurate, quantitative assessment of local and global structural and mechanical properties of DNA origami objects.

  13. Can we use ground-based measurements of HCFCs and HFCs to derive their emissions, lifetimes, and the global OH abundance?

    NASA Astrophysics Data System (ADS)

    Liang, Q.; Chipperfield, M.; Daniel, J. S.; Burkholder, J. B.; Rigby, M. L.; Velders, G. J. M.

    2015-12-01

    The hydroxyl radical (OH) is the major oxidant in the atmosphere. Reaction with OH is the primary removal process for many non-CO2 greenhouse gases (GHGs), ozone-depleting substances (ODSs) and their replacements, e.g. hydrochlorofluorocarbons (HCFCs) and hydrofluorocarbons (HFCs). Traditionally, the global OH abundance is inferred from the observed atmospheric rate of change of methyl chloroform (MCF). Owing to regulation under the Montreal Protocol, the atmospheric abundance of MCF has been decreasing rapidly to near-zero values, so it is becoming critical to find an alternative reference compound that can continue to provide quantitative information on the global OH abundance. Our model analysis using the NASA 3-D GEOS-5 Chemistry Climate Model suggests that the inter-hemispheric gradients (IHGs) of the HCFCs and HFCs show a strong linear correlation with their global emissions. It is therefore possible to (i) use the observed IHGs of HCFCs and HFCs to estimate their global emissions, and (ii) use the derived emissions and the observed long-term trends to calculate their lifetimes and to infer the global OH abundance. Preliminary analysis using a simple global two-box model (one box for each hemisphere) and information from the global 3-D model suggests that the quantitative relationship between IHG and global emissions varies slightly among individual compounds, depending on their lifetime, their emissions history and the emission fractions from the two hemispheres. While each compound shows a different sensitivity to these quantities, the combined suite of HCFCs and HFCs provides a means to derive the global OH abundance and the corresponding atmospheric lifetimes of long-lived gases with respect to OH (tOH). The fact that the OH partial lifetimes of these compounds are highly correlated, with the ratio of tOH values equal to the inverse ratio of their OH reaction rate coefficients at 272 K, provides an additional constraint that can greatly reduce the uncertainty in the OH abundance and tOH estimates. We will use the observed IHGs and long-term trends of three major HCFCs and six major HFCs in the two-box model to derive their global emissions and atmospheric lifetimes as well as the global OH abundance. The derived global OH abundance between 2000 and 2014 will be compared with that derived using MCF for consistency.
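    The two-box model mentioned above can be sketched as follows: each hemisphere is one well-mixed box with constant emissions, first-order OH loss, and inter-hemispheric exchange. The lifetimes and the 90/10 hemispheric emission split are illustrative assumptions, not values from the study:

```python
# Two-box (one box per hemisphere) sketch showing that the steady-state
# inter-hemispheric gradient (IHG) is linear in global emissions.
# tau_oh (OH lifetime), tau_ex (exchange time) and the emission split
# are placeholder values.

def steady_ihg(e_total, frac_nh=0.9, tau_oh=10.0, tau_ex=1.0, dt=0.01, years=200):
    """Integrate the two-box model to steady state; return NH - SH burden."""
    e_n, e_s = e_total * frac_nh, e_total * (1 - frac_nh)
    n = s = 0.0
    for _ in range(int(years / dt)):
        flux = (n - s) / tau_ex                 # inter-hemispheric exchange
        n += dt * (e_n - n / tau_oh - flux)     # NH: emission, OH loss, export
        s += dt * (e_s - s / tau_oh + flux)     # SH: emission, OH loss, import
    return n - s

# Doubling global emissions doubles the IHG (the linear relationship).
ihg1, ihg2 = steady_ihg(100.0), steady_ihg(200.0)
print(ihg2 / ihg1)
```

    At steady state the IHG equals (E_NH - E_SH)/(1/tau_OH + 2/tau_ex), so for a fixed hemispheric emission split it scales linearly with total emissions, which is the correlation the abstract exploits.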

  14. Relationship between Plaque Echo, Thickness and Neovascularization Assessed by Quantitative and Semi-quantitative Contrast-Enhanced Ultrasonography in Different Stenosis Groups.

    PubMed

    Song, Yan; Feng, Jun; Dang, Ying; Zhao, Chao; Zheng, Jie; Ruan, Litao

    2017-12-01

    The aim of this study was to determine the relationship between plaque echo, thickness and neovascularization in different stenosis groups using quantitative and semi-quantitative contrast-enhanced ultrasound (CEUS) in patients with carotid atherosclerotic plaque. A total of 224 plaques were divided into mild stenosis (<50%; 135 plaques, 60.27%), moderate stenosis (50%-69%; 39 plaques, 17.41%) and severe stenosis (70%-99%; 50 plaques, 22.32%) groups. Quantitative and semi-quantitative methods were used to assess plaque neovascularization and determine its relationship with plaque echo and thickness. Correlation analysis revealed no relationship of neovascularization with plaque echo in any group using either the quantitative or the semi-quantitative method. Furthermore, there was no correlation of neovascularization with plaque thickness using the semi-quantitative method. The ratio of areas under the curve (RAUC) was negatively correlated with plaque thickness (r = -0.317, p = 0.001) in the mild stenosis group. With the quartile method, plaque thickness of the mild stenosis group was divided into four groups, with significant differences between the 1.5-2.2 mm and ≥3.5 mm groups (p = 0.002), the 2.3-2.8 mm and ≥3.5 mm groups (p < 0.001) and the 2.9-3.4 mm and ≥3.5 mm groups (p < 0.001). The semi-quantitative and quantitative CEUS methods for characterizing plaque neovascularization are thus equivalent with respect to assessing relationships between neovascularization, echogenicity and thickness. However, the quantitative method can fail for plaques <3.5 mm thick because of motion artifacts. Copyright © 2017 World Federation for Ultrasound in Medicine and Biology. Published by Elsevier Inc. All rights reserved.

  15. Protocol for process evaluation of a randomised controlled trial of family-led rehabilitation post stroke (ATTEND) in India

    PubMed Central

    Liu, Hueiming; Lindley, Richard; Alim, Mohammed; Felix, Cynthia; Gandhi, Dorcas B C; Verma, Shweta J; Tugnawat, Deepak Kumar; Syrigapu, Anuradha; Ramamurthy, Ramaprabhu Krishnappa; Pandian, Jeyaraj D; Walker, Marion; Forster, Anne; Anderson, Craig S; Langhorne, Peter; Murthy, Gudlavalleti Venkata Satyanarayana; Shamanna, Bindiganavale Ramaswamy; Hackett, Maree L; Maulik, Pallab K; Harvey, Lisa A; Jan, Stephen

    2016-01-01

    Introduction We are undertaking a randomised controlled trial (fAmily led rehabiliTaTion aftEr stroke in INDia, ATTEND) evaluating training a family carer to enable maximal rehabilitation of patients with stroke-related disability, as a potentially affordable, culturally acceptable and effective intervention for use in India. A process evaluation is needed to understand how and why this complex intervention may be effective, and to capture important barriers and facilitators to its implementation. We describe the protocol for our process evaluation to encourage the development of in-process evaluation methodology and transparency in reporting. Methods and analysis The realist and RE-AIM (Reach, Effectiveness, Adoption, Implementation and Maintenance) frameworks informed the design. Mixed methods include semistructured interviews with health providers, patients and their carers; analysis of quantitative process data describing the fidelity and dose of the intervention; observations of trial set-up and implementation; and analysis of cost data from the perspective of patients and their families and of programme budgets. These qualitative and quantitative data will be analysed iteratively prior to knowing the quantitative outcomes of the trial, and then triangulated with the results from the primary outcome evaluation. Ethics and dissemination The process evaluation has received ethical approval for all sites in India. In low-income and middle-income countries, the available human capital can form an approach to reducing the evidence-practice gap, compared with the high-cost alternatives available in established market economies. This process evaluation will provide insights into how such a programme can be implemented in practice and brought to scale. Through local stakeholder engagement and dissemination of findings globally, we hope to build on patient-centred, cost-effective and sustainable models of stroke rehabilitation. 
Trial registration number CTRI/2013/04/003557. PMID:27633636

  16. Influence of neighbourhood information on 'Local Climate Zone' mapping in heterogeneous cities

    NASA Astrophysics Data System (ADS)

    Verdonck, Marie-Leen; Okujeni, Akpona; van der Linden, Sebastian; Demuzere, Matthias; De Wulf, Robert; Van Coillie, Frieke

    2017-10-01

    Local climate zone (LCZ) mapping is an emerging field in urban climate research. LCZs potentially provide an objective framework to assess urban form and function worldwide. The scheme is currently being used to map LCZs globally as part of the World Urban Database and Access Portal Tools (WUDAPT) initiative. So far, most LCZ maps lack proper quantitative assessment, challenging the generic character of the WUDAPT workflow. Using the standard method introduced by the WUDAPT community, difficulties arose for the built zones owing to high levels of heterogeneity. To overcome this problem, a contextual classifier was adopted in the mapping process. This paper quantitatively assesses the influence of neighbourhood information on the LCZ mapping result for three cities in Belgium: Antwerp, Brussels and Ghent. Overall accuracies for the standard maps were 85.7 ± 0.5, 79.6 ± 0.9 and 90.2 ± 0.4%, respectively. The approach presented here yields overall accuracies of 93.6 ± 0.2, 92.6 ± 0.3 and 95.6 ± 0.3% for Antwerp, Brussels and Ghent. The results thus indicate a positive influence of neighbourhood information for all study areas, with increases in overall accuracy of 7.9, 13.0 and 5.4 percentage points. This paper reaches two main conclusions. Firstly, it introduces evidence on the relevance of a quantitative accuracy assessment in LCZ mapping, showing that the accuracies reported in previous papers are not easily achieved. Secondly, the method presented here proves highly effective for Belgian cities and, given its open character, shows promise for application in other heterogeneous cities worldwide.
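    A simple way to inject neighbourhood information into a per-pixel classification is a majority (mode) filter. This generic post-filter is only a stand-in for the contextual classifier actually used in the study, and the toy label map is illustrative:

```python
from collections import Counter

# Minimal sketch of adding neighbourhood information to a per-pixel LCZ map:
# a 3x3 majority (mode) filter relabels each pixel with the most common
# class in its neighbourhood, suppressing isolated misclassified pixels.

def majority_filter(labels):
    rows, cols = len(labels), len(labels[0])
    out = [row[:] for row in labels]
    for i in range(rows):
        for j in range(cols):
            # Clip the 3x3 window at the map edges.
            window = [labels[r][c]
                      for r in range(max(0, i - 1), min(rows, i + 2))
                      for c in range(max(0, j - 1), min(cols, j + 2))]
            out[i][j] = Counter(window).most_common(1)[0][0]
    return out

# An isolated 'salt' pixel (class 9) inside a homogeneous zone (class 2)
# is smoothed away.
lcz = [[2, 2, 2],
       [2, 9, 2],
       [2, 2, 2]]
print(majority_filter(lcz))  # [[2, 2, 2], [2, 2, 2], [2, 2, 2]]
```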

  17. A Novel Pulse-Chase SILAC Strategy Measures Changes in Protein Decay and Synthesis Rates Induced by Perturbation of Proteostasis with an Hsp90 Inhibitor

    PubMed Central

    Fierro-Monti, Ivo; Racle, Julien; Hernandez, Celine; Waridel, Patrice; Hatzimanikatis, Vassily; Quadroni, Manfredo

    2013-01-01

    Standard proteomics methods allow the relative quantitation of the levels of thousands of proteins in two or more samples. While such methods are invaluable for defining the variations in protein concentrations that follow the perturbation of a biological system, they do not offer information on the mechanisms underlying such changes. Expanding on previous work [1], we developed a pulse-chase (pc) variant of SILAC (stable isotope labeling by amino acids in cell culture). pcSILAC can, in a single experiment and for two conditions, quantitate the relative levels of proteins newly synthesized within a given time window as well as the relative levels of remaining preexisting proteins. We validated the method by studying the drug-mediated inhibition of the Hsp90 molecular chaperone, which is known to lead to increased synthesis of stress-response proteins as well as increased decay of Hsp90 “clients”. We showed that pcSILAC can give information on changes in global cellular proteostasis induced by treatment with the inhibitor that are normally not captured by standard relative quantitation techniques. Furthermore, we have developed a mathematical model and computational framework that uses pcSILAC data to determine degradation constants kd and synthesis rates Vs for proteins in both control and drug-treated cells. The results show that Hsp90 inhibition induced a generalized slowdown of protein synthesis and an increase in protein decay. Treatment with the inhibitor also resulted in widespread protein-specific changes in relative synthesis rates, together with variations in protein decay rates. The latter were more restricted to individual proteins or protein families than the variations in synthesis. Our results establish pcSILAC as a viable workflow for the mechanistic dissection of proteome changes that follow perturbations. Data are available via ProteomeXchange with identifier PXD000538. PMID:24312217
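    The kd/Vs notation above corresponds to a standard first-order turnover model: the preexisting pool decays as P0·exp(-kd·t) while the newly synthesized pool grows toward Vs/kd. A minimal sketch with illustrative numbers (not data or code from the paper):

```python
import math

# First-order turnover model consistent with the kd (decay) and Vs (synthesis)
# notation in the abstract. Time points and abundances are illustrative.

def remaining_preexisting(p0, kd, t):
    """Preexisting (pre-chase) pool after chase time t: P0 * exp(-kd t)."""
    return p0 * math.exp(-kd * t)

def newly_synthesized(vs, kd, t):
    """Newly synthesized pool under constant synthesis Vs and decay kd."""
    return (vs / kd) * (1.0 - math.exp(-kd * t))

def estimate_kd(p0, p_t, t):
    """Recover kd from the fraction of preexisting protein remaining at t."""
    return -math.log(p_t / p0) / t

kd_true, vs, p0, t = 0.1, 5.0, 100.0, 8.0   # arbitrary units
p_t = remaining_preexisting(p0, kd_true, t)
print(round(estimate_kd(p0, p_t, t), 6))  # 0.1
```

    In the real workflow the remaining fraction comes from light/heavy reporter intensities per protein, and Vs follows from the measured new-pool level once kd is known.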

  18. Reliability of electromagnetic induction data in near surface application

    NASA Astrophysics Data System (ADS)

    Nüsch, A.; Werban, U.; Sauer, U.; Dietrich, P.

    2012-12-01

    The electromagnetic induction (EMI) method for measuring electrical conductivity is widespread in applied geosciences, since the method is easy to perform and its signal is influenced by several soil parameters. The vast range of applications of EMI measurements, at different spatial resolutions and for the derivation of different soil parameters, necessitates a unified handling of EMI data: the requirements on the method have shifted from a qualitative overview to quantitative use of the data. Quantitative treatment of the data is, however, limited by the available instruments, which were designed only for qualitative use. Nevertheless, the limitations of the method can be pushed back by taking a few conditions into account. In this study, we introduce possibilities for enhancing the quality of EMI data with regard to large-scale investigations. In a set of systematic investigations, we show which aspects have to be taken into account when using a commercially available instrument, relating to long-term stability, comparability and repeatability. In-depth knowledge of the instruments used, concerning aspects such as their calibration procedure, long-term stability, battery life and thermal behaviour, is an essential prerequisite before starting the measurement process. A further aspect highlighted is quality control during measurements and, if necessary, subsequent data correction, which is a prerequisite for quantitative analysis of the data. Quality control during the measurement process is crucial. Before a measurement starts, it is recommended that a short-term test be carried out on-site to check environmental noise; the signal-to-noise ratio is a decisive factor in whether the method is applicable at the chosen field site. A measurement also needs to be monitored for possible drifts. This can be achieved with different accuracies, ranging from a quality check with the help of reference lines up to quantitative control with reference points. Furthermore, global reference lines are necessary if measurements take place at the landscape scale. In some cases, it is possible to eliminate drifts by applying a data correction based on binding lines. The suggested procedure can raise the explanatory power of the data enormously, and artefacts caused by drifts or inadequate handling are minimized. This work was supported by iSOIL - Interactions between soil related sciences - Linking geophysics, soil science and digital soil mapping, which is a Collaborative Project (Grant Agreement number 211386) co-funded by the Research DG of the European Commission within the RTD activities of the FP7 Thematic Priority Environment; iSOIL is one member of the SOIL TECHNOLOGY CLUSTER of Research Projects funded by the EC.
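    The drift monitoring and correction described above can be sketched generically: a reference point is re-measured during the survey, a linear drift is fitted to those repeats, and its time-dependent part is subtracted from the survey data. The procedure and values below are a synthetic illustration, not the iSOIL protocol:

```python
# Generic linear drift correction from repeated reference-point measurements.
# All values are synthetic.

def fit_line(ts, ys):
    """Ordinary least-squares fit y = a + b*t (pure Python, no numpy)."""
    n = len(ts)
    mt, my = sum(ts) / n, sum(ys) / n
    b = sum((t - mt) * (y - my) for t, y in zip(ts, ys)) / \
        sum((t - mt) ** 2 for t in ts)
    return my - b * mt, b

def drift_correct(survey, ref_times, ref_values):
    """Subtract the time-dependent drift fitted at the reference point."""
    a, b = fit_line(ref_times, ref_values)
    return [(t, y - b * t) for t, y in survey]

# Reference point re-measured at t = 0, 1, 2 h shows 0.5 mS/m per hour drift.
ref_t, ref_y = [0.0, 1.0, 2.0], [10.0, 10.5, 11.0]
survey = [(0.5, 20.25), (1.5, 30.75)]   # (time, apparent conductivity)
print(drift_correct(survey, ref_t, ref_y))  # [(0.5, 20.0), (1.5, 30.0)]
```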

  19. 78 FR 70059 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-22

    ... (as opposed to quantitative statistical methods). In consultation with research experts, we have... qualitative interviews (as opposed to quantitative statistical methods). In consultation with research experts... utilization of qualitative interviews (as opposed to quantitative statistical methods). In consultation with...

  20. The impact of economic, political and social globalization on overweight and obesity in the 56 low and middle income countries.

    PubMed

    Goryakin, Yevgeniy; Lobstein, Tim; James, W Philip T; Suhrcke, Marc

    2015-05-01

    Anecdotal and descriptive evidence has led to the claim that globalization plays a major role in inducing overweight and obesity in developing countries, but robust quantitative evidence is scarce. We undertook extensive econometric analyses of several datasets, using a series of new proxies for different dimensions of globalization potentially affecting overweight in up to 887,000 women aged 15-49 living in 56 countries between 1991 and 2009. After controlling for relevant individual and country level factors, globalization as a whole is substantially and significantly associated with an increase in the individual propensity to be overweight among women. Surprisingly, political and social globalization dominate the influence of the economic dimension. Hence, more consideration needs to be given to the forms of governance required to shape a more health-oriented globalization process. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  1. The impact of economic, political and social globalization on overweight and obesity in the 56 low and middle income countries

    PubMed Central

    Goryakin, Yevgeniy; Lobstein, Tim; James, W. Philip T.; Suhrcke, Marc

    2015-01-01

    Anecdotal and descriptive evidence has led to the claim that globalization plays a major role in inducing overweight and obesity in developing countries, but robust quantitative evidence is scarce. We undertook extensive econometric analyses of several datasets, using a series of new proxies for different dimensions of globalization potentially affecting overweight in up to 887,000 women aged 15–49 living in 56 countries between 1991 and 2009. After controlling for relevant individual and country level factors, globalization as a whole is substantially and significantly associated with an increase in the individual propensity to be overweight among women. Surprisingly, political and social globalization dominate the influence of the economic dimension. Hence, more consideration needs to be given to the forms of governance required to shape a more health-oriented globalization process. PMID:25841097

  2. A quantitative PCR approach for quantification of functional genes involved in the degradation of polycyclic aromatic hydrocarbons in contaminated soils.

    PubMed

    Shahsavari, Esmaeil; Aburto-Medina, Arturo; Taha, Mohamed; Ball, Andrew S

    2016-01-01

    Polycyclic aromatic hydrocarbons (PAHs) are major pollutants globally, and due to their carcinogenic and mutagenic properties their clean-up is paramount. Bioremediation, the use of PAH-degrading microorganisms (mainly bacteria) to degrade the pollutants, represents a cheap, effective method. These PAH degraders harbor functional genes that allow the microorganisms to use PAHs as a source of food and energy. Most probable number (MPN) and plate-counting methods are widely used for counting PAH degraders; however, as culture-based methods count only a small fraction (<1%) of the microorganisms capable of PAH degradation, culture-independent methodologies are desirable.•This protocol presents a robust, rapid and sensitive qPCR method for the quantification of functional genes involved in the degradation of PAHs in soil samples.•This protocol enables screening of a vast number of PAH-contaminated soil samples in a few hours.•This protocol provides valuable information about the natural attenuation potential of contaminated soil and can be used to monitor the bioremediation process.
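    Absolute quantification by qPCR conventionally relies on a standard curve in which the quantification cycle Ct is linear in log10(copy number). The sketch below uses synthetic dilution-series values, not data from this protocol:

```python
import math

# Generic qPCR standard-curve sketch for quantifying a functional gene:
# Ct is linear in log10(copies); unknowns are read off the fitted line and
# amplification efficiency comes from the slope. All values are synthetic.

def fit_standard_curve(copies, cts):
    """Least-squares line Ct = slope * log10(copies) + intercept."""
    xs = [math.log10(c) for c in copies]
    n = len(xs)
    mx, my = sum(xs) / n, sum(cts) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, cts)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

def copies_from_ct(ct, slope, intercept):
    """Invert the standard curve to get copy number for an unknown sample."""
    return 10 ** ((ct - intercept) / slope)

# Ten-fold dilution series of a cloned gene standard (perfectly linear here).
standard_copies = [1e7, 1e6, 1e5, 1e4, 1e3]
standard_cts    = [15.0, 18.3, 21.6, 24.9, 28.2]
slope, intercept = fit_standard_curve(standard_copies, standard_cts)
efficiency = 10 ** (-1 / slope) - 1   # ~1.0 means ~100% efficient
print(round(copies_from_ct(20.0, slope, intercept), 0))
```

    A slope near -3.32 corresponds to roughly 100% amplification efficiency (one doubling per cycle); curves far from that flag inhibition or pipetting problems.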

  3. Analytical challenges in drug counterfeiting and falsification-The NMR approach.

    PubMed

    Holzgrabe, Ulrike; Malet-Martino, Myriam

    2011-06-25

    Counterfeiting of products is a global problem. As long as clothes, clocks, leather wear, etc. are faked, there is no danger, but when it comes to drugs, counterfeiting can be life-threatening. In recent years, sub-standard active pharmaceutical ingredients (APIs) have been found more often, even though the quality-ensuring methods of the international pharmacopoeias should have detected additional impurities and a low content of the API. Methods orthogonal to the separation methods used in the pharmacopoeias are necessary to find counterfeits. Besides Raman and NIR spectroscopies as well as powder X-ray analysis, NMR spectroscopy, being a primary ratio method of measurement, is highly suitable to identify and quantify a drug and its related substances as well as to recognize a drug of sub-standard quality. DOSY experiments are suitable to identify the ingredients of formulations and therefore to detect wrong and/or additional ingredients. This review gives an overview of the application of quantitative NMR spectroscopy and DOSY NMR in anticounterfeiting. Copyright © 2010 Elsevier B.V. All rights reserved.

  4. Parenting and childhood obesity research: a quantitative content analysis of published research 2009-2015.

    PubMed

    Gicevic, S; Aftosmes-Tobio, A; Manganello, J A; Ganter, C; Simon, C L; Newlan, S; Davison, K K

    2016-08-01

    A quantitative content analysis of research on parenting and childhood obesity was conducted to describe the recent literature and to identify gaps to address in future research. Studies were identified from multiple databases and screened according to an a priori defined protocol. Eligible studies included non-intervention studies, published in English (January 2009-December 2015) that focused on parenting and childhood obesity and included parent participants. Studies eligible for inclusion (N = 667) focused on diet (57%), physical activity (23%) and sedentary behaviours (12%). The vast majority of studies used quantitative methods (80%) and a cross-sectional design (86%). Few studies focused exclusively on fathers (1%) or included non-residential (1%), non-biological (4%), indigenous (1%), immigrant (7%), ethnic/racial minority (15%) or low-socioeconomic status (19%) parents. While results illustrate that parenting in the context of childhood obesity is a robust, global and multidisciplinary area of inquiry, it is also evident that the vast majority of studies are conducted among Caucasian, female, biological caregivers living in westernized countries. Expansion of study foci and design is recommended to capture a wider range of caregiver types and obesity-related parenting constructs, improve the validity and generalizability of findings and inform the development of culture-specific childhood obesity prevention interventions and policies. © 2016 World Obesity.

  5. Mapping complex traits as a dynamic system

    PubMed Central

    Sun, Lidan; Wu, Rongling

    2017-01-01

    Despite increasing emphasis on the genetic study of quantitative traits, we are still far from being able to chart a clear picture of their genetic architecture, given the inherent complexity of trait formation. A competing theory for studying such complex traits has emerged that views their phenotypic formation as a “system” in which a high-dimensional group of interconnected components act and interact across different levels of biological organization, from molecules through cells to whole organisms. This system is initiated by a machinery of DNA sequences that regulate a cascade of biochemical pathways to synthesize endophenotypes and further assemble these endophenotypes into the end-point phenotype through various developmental changes. This review focuses on a conceptual framework for genetic mapping of complex traits by which to delineate the underlying components, interactions and mechanisms that govern the system according to biological principles, and to understand how these components function synergistically under the control of quantitative trait loci (QTLs) to comprise a unified whole. This framework is built on a system of differential equations that quantifies how alterations of different components lead to global changes in trait development and function, and it provides a quantitative and testable platform for assessing the multiscale interplay between QTLs and development. The method will enable geneticists to shed light on the genetic complexity of any biological system and to predict, alter or engineer its physiological and pathological states. PMID:25772476

  6. Understanding Factors that Shape Gender Attitudes in Early Adolescence Globally: A Mixed-Methods Systematic Review

    PubMed Central

    Gibbs, Susannah; Blum, Robert Wm; Moreau, Caroline; Chandra-Mouli, Venkatraman; Herbert, Ann; Amin, Avni

    2016-01-01

    Background Early adolescence (ages 10–14) is a period of increased expectations for boys and girls to adhere to socially constructed and often stereotypical norms that perpetuate gender inequalities. The endorsement of such gender norms is closely linked to poor adolescent sexual, reproductive and other health-related outcomes, yet little is known about the factors that influence young adolescents’ personal gender attitudes. Objectives To explore factors that shape gender attitudes in early adolescence across different cultural settings globally. Methods A mixed-methods systematic review was conducted of the peer-reviewed literature in 12 databases from 1984 to 2014. Four reviewers screened the titles and abstracts of articles and reviewed full-text articles in duplicate. Data extraction and quality assessments were conducted using standardized templates by study design. Thematic analysis was used to synthesize quantitative and qualitative data organized by the social-ecological framework (individual, interpersonal and community/societal-level factors influencing gender attitudes). Results Eighty-two studies (46 quantitative, 31 qualitative, 5 mixed-methods) spanning 29 countries were included. Ninety percent of studies were from North America or Western Europe. The review findings indicate that young adolescents, across cultural settings, commonly express stereotypical or inequitable gender attitudes, and such attitudes appear to vary by individual sociodemographic characteristics (sex, race/ethnicity and immigration, social class, and age). Findings highlight that interpersonal influences (family and peers) are central to young adolescents’ construction of gender attitudes, and that these gender socialization processes differ for boys and girls. The role of community factors (e.g. media) is less clear, though there is some evidence that schools may reinforce stereotypical gender attitudes among young adolescents. 
    Conclusions The findings from this review suggest that young adolescents in different cultural settings commonly endorse norms that perpetuate gender inequalities, and that parents and peers are especially central in shaping such attitudes. Programs to promote equitable gender attitudes thus need to move beyond a focus on individuals to target their interpersonal relationships and wider social environments. Such programs need to start early and be tailored to the unique needs of subpopulations of boys and girls. Longitudinal studies, particularly from low- and middle-income countries, are needed to better understand how gender attitudes unfold in adolescence and to identify the key points for intervention. PMID:27341206

  7. Principal components and iterative regression analysis of geophysical series: Application to Sunspot number (1750 2004)

    NASA Astrophysics Data System (ADS)

    Nordemann, D. J. R.; Rigozo, N. R.; de Souza Echer, M. P.; Echer, E.

    2008-11-01

    We present here an implementation of a least-squares iterative regression method applied to the sine functions embedded in the principal components extracted from geophysical time series. This method appears to be a useful improvement for the quantitative analysis of periodicities in non-stationary time series. The principal components determination followed by the least-squares iterative regression method was implemented in an algorithm written in the Scilab (2006) language. The main result of the method is the set of sine functions embedded in the analyzed series, in decreasing order of significance: from the most important ones, likely to represent the physical processes involved in the generation of the series, to the least important ones, which represent noise components. Taking into account the need for a deeper knowledge of the Sun's past history and its implications for global climate change, the method was applied to the Sunspot Number series (1750-2004). With the threshold and parameter values used here, the application of the method leads to a total of 441 explicit sine functions, among which 65 were considered significant and were used for a reconstruction that gave a normalized mean squared error of 0.146.
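The iterative scheme lends itself to a compact illustration. The sketch below is not the authors' Scilab code; it assumes a simple frequency-grid search with a closed-form least-squares fit of one sine/cosine pair at a time, subtracting each fitted component from the residual so that components emerge in decreasing order of explained variance:

```python
import math

def fit_sine_at(t, y, f):
    """Closed-form least-squares fit of a*sin(2*pi*f*t) + b*cos(2*pi*f*t).
    Returns (a, b, residual sum of squares), or None if the fit is degenerate."""
    s = [math.sin(2 * math.pi * f * ti) for ti in t]
    c = [math.cos(2 * math.pi * f * ti) for ti in t]
    ss = sum(v * v for v in s)
    cc = sum(v * v for v in c)
    sc = sum(u * v for u, v in zip(s, c))
    sy = sum(u * v for u, v in zip(s, y))
    cy = sum(u * v for u, v in zip(c, y))
    det = ss * cc - sc * sc          # normal-equations determinant
    if abs(det) < 1e-12:
        return None
    a = (sy * cc - cy * sc) / det
    b = (cy * ss - sy * sc) / det
    resid = sum((yi - a * si - b * ci) ** 2 for yi, si, ci in zip(y, s, c))
    return a, b, resid

def iterative_sine_regression(t, y, freq_grid, n_components):
    """Greedy decomposition: on each pass, fit the grid frequency that best
    reduces the residual, subtract that sine, and repeat on the remainder."""
    residual = list(y)
    components = []
    for _ in range(n_components):
        best_f, best_fit = None, None
        for f in freq_grid:
            fit = fit_sine_at(t, residual, f)
            if fit is not None and (best_fit is None or fit[2] < best_fit[2]):
                best_f, best_fit = f, fit
        a, b, _ = best_fit
        components.append((best_f, a, b))
        residual = [ri - a * math.sin(2 * math.pi * best_f * ti)
                       - b * math.cos(2 * math.pi * best_f * ti)
                    for ti, ri in zip(t, residual)]
    return components
```

On a synthetic signal made of two sines, the stronger component is extracted first, mirroring the "decreasing order of significance" described in the abstract.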

  8. "Quitting like a Turk:" How political priority developed for tobacco control in Turkey.

    PubMed

    Hoe, Connie; Rodriguez, Daniela C; Üzümcüoğlu, Yeşim; Hyder, Adnan A

    2016-09-01

    In recent years, tobacco control emerged as a political priority in Turkey and today the country is widely regarded as one of the global leaders in tackling tobacco use. Although political priority is considered a facilitating factor to the success of addressing public health issues, there is a paucity of research to help us understand how it is developed in middle-income countries. The primary aim of this study is to understand the process and determinants of how tobacco control became a political priority in Turkey using the Multiple Streams Framework. A mixed-methods case study approach was used whereby data were gathered from three different sources: in-depth interviews (N = 19), document reviews (N = 216), and online self-administered surveys (N = 61). Qualitative data were collected for the purpose of understanding the processes and determinants that led to political prioritization of tobacco control and were analyzed using deductive and inductive coding. Quantitative data were collected to examine the actors and were analyzed using descriptive statistics and network nominations. Data were triangulated. Findings revealed that tobacco control achieved political priority in Turkey as a result of the development and convergence of multiple streams, including a fourth, separate global stream. Findings also shed light on the importance of Turkey's foreign policy in the transformation of the political stream. The country's desire for European Union accession and global visibility helped generate a political environment that was receptive to global norms for tobacco control. A diverse but cohesive network of actors joined forces with global allies to capitalize on this opportunity. 
Results suggest (1) the importance of global-agenda setting activities on political priority development, (2) the utility of aligning public health and foreign policy goals and (3) the need to build a strong global incentive structure to help entice governments to take action on public health issues. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. A Piecewise Local Partial Least Squares (PLS) Method for the Quantitative Analysis of Plutonium Nitrate Solutions

    DOE PAGES

    Lascola, Robert; O'Rourke, Patrick E.; Kyser, Edward A.

    2017-10-05

    Here, we have developed a piecewise local (PL) partial least squares (PLS) analysis method for total plutonium measurements by absorption spectroscopy in nitric acid-based nuclear material processing streams. Instead of using a single PLS model that covers all expected solution conditions, the method selects one of several local models based on an assessment of solution absorbance, acidity, and Pu oxidation state distribution. The local models match the global model for accuracy against the calibration set, but were observed in several instances to be more robust to variations associated with measurements in the process. The improvements are attributed to the relative parsimony of the local models. Not all of the sources of spectral variation are uniformly present at each part of the calibration range. Thus, the global model is locally overfitting and susceptible to increased variance when presented with new samples. A second set of models quantifies the relative concentrations of Pu(III), (IV), and (VI). Standards containing a mixture of these species were not at equilibrium due to a disproportionation reaction. Therefore, a separate principal component analysis is used to estimate the concentrations of the individual oxidation states in these standards in the absence of independent confirmatory analysis. The PL analysis approach is generalizable to other systems where the analysis of chemically complicated systems can be aided by rational division of the overall range of solution conditions into simpler sub-regions.
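The model-selection step can be sketched generically. The code below is a hypothetical dispatcher, not the authors' implementation: the local "models" are placeholder linear predictors standing in for calibrated PLS regressions, and the acidity threshold and condition keys are invented purely to illustrate screening on solution conditions:

```python
class PiecewiseLocalModel:
    """Dispatch to one of several local calibration models based on an
    assessment of solution conditions (a simplified stand-in for the
    absorbance/acidity/oxidation-state screening described above)."""

    def __init__(self):
        # (predicate over conditions, local model) pairs, checked in order.
        self.regions = []

    def add_region(self, predicate, model):
        self.regions.append((predicate, model))

    def predict(self, spectrum, conditions):
        for predicate, model in self.regions:
            if predicate(conditions):
                return model(spectrum)
        raise ValueError("no local model covers these solution conditions")

# Hypothetical local models: each simple linear predictor stands in for a
# PLS regression calibrated on its own sub-region of solution conditions.
low_acid = lambda spec: 0.8 * sum(spec)
high_acid = lambda spec: 1.1 * sum(spec)

model = PiecewiseLocalModel()
model.add_region(lambda c: c["acidity_M"] < 4.0, low_acid)
model.add_region(lambda c: c["acidity_M"] >= 4.0, high_acid)
```

The design point is that each local model only ever sees samples from its own sub-region, so it never has to absorb sources of spectral variation that occur elsewhere in the calibration range.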

  10. Convergence of neural networks for programming problems via a nonsmooth Lojasiewicz inequality.

    PubMed

    Forti, Mauro; Nistri, Paolo; Quincampoix, Marc

    2006-11-01

    This paper considers a class of neural networks (NNs) for solving linear programming (LP) problems, convex quadratic programming (QP) problems, and nonconvex QP problems where an indefinite quadratic objective function is subject to a set of affine constraints. The NNs are characterized by constraint neurons modeled by ideal diodes with vertical segments in their characteristic, which make it possible to implement an exact penalty method. A new method is exploited to address convergence of trajectories, based on a nonsmooth Lojasiewicz inequality for the generalized gradient vector field describing the NN dynamics. The method makes it possible to prove that each forward trajectory of the NN has finite length, and as a consequence converges toward a singleton. Furthermore, by means of a quantitative evaluation of the Lojasiewicz exponent at the equilibrium points, the following results on the convergence rate of trajectories are established: (1) for nonconvex QP problems, each trajectory is either exponentially convergent, or convergent in finite time, toward a singleton belonging to the set of constrained critical points; (2) for convex QP problems, the same result as in (1) holds; moreover, the singleton belongs to the set of global minimizers; and (3) for LP problems, each trajectory converges in finite time to a singleton belonging to the set of global minimizers. These results, which improve previous results obtained via the Lyapunov approach, hold independently of the nature of the set of equilibrium points, and in particular they hold even when the NN possesses infinitely many nonisolated equilibrium points.
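The analytical device named here can be stated compactly. A generic nonsmooth Lojasiewicz inequality for an energy function E with generalized gradient ∂E, written in notation assumed for illustration rather than taken from the paper, reads:

```latex
% Nonsmooth Lojasiewicz inequality near an equilibrium x^*:
% there exist c > 0, a neighborhood U of x^*, and an exponent
% \theta \in [0, 1) such that
\|\xi\| \;\ge\; c \,\bigl| E(x) - E(x^*) \bigr|^{\theta}
\qquad \text{for all } x \in U \text{ and all } \xi \in \partial E(x).
```

Integrating this bound along the generalized gradient flow bounds the trajectory length by a function of the energy gap, which is what forces each forward trajectory to have finite length and hence to converge to a single equilibrium; evaluating the exponent θ at the equilibrium is then what separates the exponential and finite-time convergence regimes reported in the abstract.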

  11. A Piecewise Local Partial Least Squares (PLS) Method for the Quantitative Analysis of Plutonium Nitrate Solutions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lascola, Robert; O'Rourke, Patrick E.; Kyser, Edward A.

    Here, we have developed a piecewise local (PL) partial least squares (PLS) analysis method for total plutonium measurements by absorption spectroscopy in nitric acid-based nuclear material processing streams. Instead of using a single PLS model that covers all expected solution conditions, the method selects one of several local models based on an assessment of solution absorbance, acidity, and Pu oxidation state distribution. The local models match the global model for accuracy against the calibration set, but were observed in several instances to be more robust to variations associated with measurements in the process. The improvements are attributed to the relative parsimony of the local models. Not all of the sources of spectral variation are uniformly present at each part of the calibration range. Thus, the global model is locally overfitting and susceptible to increased variance when presented with new samples. A second set of models quantifies the relative concentrations of Pu(III), (IV), and (VI). Standards containing a mixture of these species were not at equilibrium due to a disproportionation reaction. Therefore, a separate principal component analysis is used to estimate the concentrations of the individual oxidation states in these standards in the absence of independent confirmatory analysis. The PL analysis approach is generalizable to other systems where the analysis of chemically complicated systems can be aided by rational division of the overall range of solution conditions into simpler sub-regions.

  12. Migration of Nurses from Sub-Saharan Africa: A Review of Issues and Challenges

    PubMed Central

    Dovlo, Delanyo

    2007-01-01

    Objective To assess the impact of out-migration of nurses on the health systems in sub-Saharan Africa (SSA). Setting The countries of SSA. Design and Methods Review of secondary sources: existing publications and country documents on the health workforce; documents prepared for the Joint Learning Initiative Global Human Resources for Health report, the World Health Organization (AFRO) synthesis on migration, and the International Council of Nurses series on the global nursing situation. Analysis of associated data. Principal Findings The state of nursing practice in SSA appears to have been impacted negatively by migration. Available (though inadequate) quantitative data on stocks and flows, qualitative information on migration issues and trends, and on the main strategies being employed in both source and recipient countries indicate that the problem is likely to grow over the next 5–10 years. Conclusions Multiple actions are needed at various policy levels in both source and receiving countries to moderate negative effects of nurse emigration in developing countries in Africa; however, critically, source countries must establish more effective policies and strategies. PMID:17489920

  13. Mapping Residency Global Health Experiences to the ACGME Family Medicine Milestones.

    PubMed

    Grissom, Maureen O; Iroku-Malize, Tochi; Peila, Rita; Perez, Marco; Philippe, Neubert

    2017-07-01

    Global health (GH) experiences are a unique part of family medicine (FM) training that offer an opportunity for residents to demonstrate development across a multitude of the milestones recently implemented by the Accreditation Council for Graduate Medical Education (ACGME). The GH experience presents an opportunity for resident development, and including a component of written reflection can provide tangible evidence of development in areas that can be difficult to assess. A mixed methods approach was used to integrate quantitative (frequency) data with qualitative content from the written reflections of 12 of our FM residents who participated in GH experiences. Written reflections touched on each of the 22 milestones, although some milestones were noted more frequently than others. The most commonly identified milestones fell within the competency areas of systems-based practice, professionalism, and practice-based learning and improvement. Our qualitative approach allowed us to gain an appreciation of the unique experiences that demonstrated growth across the various milestones. We conclude that any program that offers GH experiences should incorporate some form of written reflection to maximize resident growth and offer evaluative faculty a window into that development.

  14. Sensitivity analysis of Repast computational ecology models with R/Repast.

    PubMed

    Prestes García, Antonio; Rodríguez-Patón, Alfonso

    2016-12-01

    Computational ecology is an emerging interdisciplinary discipline founded mainly on modeling and simulation methods for studying ecological systems. Among the existing modeling formalisms, individual-based modeling is particularly well suited for capturing the complex temporal and spatial dynamics, as well as the nonlinearities, arising in ecosystems, communities, or populations due to individual variability. In addition, being a bottom-up approach, it is useful for providing new insights into the local mechanisms generating observed global dynamics. Of course, no conclusions about model results can be taken seriously if they are based on a single model execution and are not analyzed carefully. Therefore, a sound methodology should always be used to underpin the interpretation of model results. Sensitivity analysis is a methodology for quantitatively assessing the effect of input uncertainty on the simulation output, and it should be an obligatory part of any work based on an in-silico experimental setup. In this article, we present R/Repast, a GNU R package for running and analyzing Repast Simphony models, accompanied by two worked examples of how to perform global sensitivity analysis and how to interpret the results.
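As an illustration of the kind of global sensitivity screening such a package supports, here is a generic Morris-style elementary-effects sketch in plain Python; it is not the R/Repast API, and the sampling scheme is deliberately simplified:

```python
import random

def elementary_effects(model, bounds, delta=0.1, n_repeats=50, seed=1):
    """Average absolute elementary effect of each input on the model output:
    perturb one input at a time from random base points and record the
    normalized output change (a crude Morris screening sketch)."""
    rng = random.Random(seed)
    k = len(bounds)
    mu_star = [0.0] * k
    for _ in range(n_repeats):
        x = [lo + rng.random() * (hi - lo) for lo, hi in bounds]
        base = model(x)
        for i, (lo, hi) in enumerate(bounds):
            step = delta * (hi - lo)
            xi = list(x)
            xi[i] = min(xi[i] + step, hi)   # stay inside the bounds
            if xi[i] == x[i]:               # base point sat on the boundary
                xi[i] = x[i] - step
            mu_star[i] += abs((model(xi) - base) / (xi[i] - x[i]))
    return [m / n_repeats for m in mu_star]
```

For a linear model the elementary effect of each input equals its coefficient, so the screening directly ranks inputs by influence; for a nonlinear simulation the averaging over base points is what makes the measure "global" rather than local.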

  15. A Computational Approach to Estimate Interorgan Metabolic Transport in a Mammal

    PubMed Central

    Cui, Xiao; Geffers, Lars; Eichele, Gregor; Yan, Jun

    2014-01-01

    In multicellular organisms metabolism is distributed across different organs, each of which has specific requirements to perform its own specialized task. But different organs also have to support the metabolic homeostasis of the organism as a whole by interorgan metabolite transport. Recent studies have successfully reconstructed global metabolic networks in tissues and cell types and attempts have been made to connect organs with interorgan metabolite transport. Instead of these complicated approaches to reconstruct global metabolic networks, we proposed in this study a novel approach to study interorgan metabolite transport focusing on transport processes mediated by solute carrier (Slc) transporters and their couplings to cognate enzymatic reactions. We developed a computational approach to identify and score potential interorgan metabolite transports based on the integration of metabolism and transports in different organs in the adult mouse from quantitative gene expression data. This allowed us to computationally estimate the connectivity between 17 mouse organs via metabolite transport. Finally, by applying our method to circadian metabolism, we showed that our approach can shed new light on the current understanding of interorgan metabolite transport at a whole-body level in mammals. PMID:24971892
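A toy version of such a transport score might look as follows; the transporter names, the expression table, and the min-based scoring rule are all invented for illustration and are not the paper's actual scoring function:

```python
def transport_score(expr, exporter, importer, organ_a, organ_b):
    """Hypothetical connectivity score for metabolite transport from
    organ_a to organ_b: both the exporting and the importing Slc
    transporter must be expressed, and the weaker of the two levels
    limits the strength of the link."""
    out_level = expr.get((organ_a, exporter), 0.0)
    in_level = expr.get((organ_b, importer), 0.0)
    return min(out_level, in_level)

# Toy (organ, gene) -> normalized expression table; all values invented.
expr = {
    ("liver", "SlcX_export"): 0.9,
    ("brain", "SlcX_import"): 0.4,
    ("muscle", "SlcX_import"): 0.0,
}
```

Scoring every ordered organ pair this way yields a weighted directed graph over organs, which is the kind of connectivity estimate the abstract describes for 17 mouse organs.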

  16. Factors influencing the usage of the global distribution system

    NASA Astrophysics Data System (ADS)

    Budiasa, I. M.; Suparta, I. K.; Nadra, N. M.

    2018-01-01

    The advancement of tourism is supported by Information and Communication Technology (ICT) innovation and change. The use of a GDS (Global Distribution System), i.e. Amadeus, Galileo, Sabre, or Worldspan, in the tourism industry can increase the availability, frequency, and speed of communication among the companies providing services to potential tourists. This research investigates the factors that influence the actual use of GDSs in the tourism industry, especially among travel agents, airlines, and hotels in Bali. The research employed a mixed method of quantitative and qualitative approaches. Field surveys were conducted, and 80 valid questionnaires were received and analyzed using SPSS 17.0; descriptive, correlation, factor analysis, and regression tests were conducted. The variables used are Perceived Ease of Use and Perceived Usefulness (from the Technology Acceptance Model); Awareness, Perceived Risk, and Communication Channels are also examined. The research revealed that Perceived Ease of Use, Perceived Usefulness, Awareness, and Communication Channels influence the behavioural intention to use a GDS, whereas Perceived Risk was found not to significantly influence GDS use. These findings enable travel agent, airline, and hotel companies to make provision decisions with respect to the actual use of GDSs.

  17. Spliceosome Profiling Visualizes Operations of a Dynamic RNP at Nucleotide Resolution.

    PubMed

    Burke, Jordan E; Longhurst, Adam D; Merkurjev, Daria; Sales-Lee, Jade; Rao, Beiduo; Moresco, James J; Yates, John R; Li, Jingyi Jessica; Madhani, Hiten D

    2018-05-03

    Tools to understand how the spliceosome functions in vivo have lagged behind advances in the structural biology of the spliceosome. Here, methods are described to globally profile spliceosome-bound pre-mRNA, intermediates, and spliced mRNA at nucleotide resolution. These tools are applied to three yeast species that span 600 million years of evolution. The sensitivity of the approach enables the detection of canonical and non-canonical events, including interrupted, recursive, and nested splicing. This application of statistical modeling uncovers independent roles for the size and position of the intron and the number of introns per transcript in substrate progression through the two catalytic stages. These include species-specific inputs suggestive of spliceosome-transcriptome coevolution. Further investigations reveal the ATP-dependent discard of numerous endogenous substrates after spliceosome assembly in vivo and connect this discard to intron retention, a form of splicing regulation. Spliceosome profiling is a quantitative, generalizable global technology used to investigate an RNP central to eukaryotic gene expression. Copyright © 2018 Elsevier Inc. All rights reserved.

  18. Reduction of cognitive conflict and learning style impact towards student-teacher's misconception load

    NASA Astrophysics Data System (ADS)

    A'yun, Kurroti; Suyono, Poedjiastoeti, Sri; Bin-Tahir, Saidna Zulfiqar

    2017-08-01

    One of the most crucial issues in education is misconception, rooted in the prior conceptions of the students themselves. Therefore, this study offers a way to improve the quality of chemistry teaching in schools through the remediation of misconceptions among chemistry teacher candidates. The study employed a mixed-method approach using a concurrent embedded design, weighted toward qualitative research but relying on quantitative methods to assess the learning impact. The results show that students with higher levels of cognitive conflict can still carry high misconception (MC) loads, possibly due to their learning style, the sequential-global balanced type. To accommodate both the cognitive-conflict character and the sequential-global balanced learning style, the researchers created an integrated worksheet of conceptual change with peer learning (WCCPL). The peer learning undertaken in the last stages of conceptual change in the WCCPL can significantly increase the resistance of students' conceptions in the category of knowing the concept, but this should be examined in an in-depth study of long-term memory.

  19. A note on the stability and discriminability of graph-based features for classification problems in digital pathology

    NASA Astrophysics Data System (ADS)

    Cruz-Roa, Angel; Xu, Jun; Madabhushi, Anant

    2015-01-01

    Nuclear architecture, or the spatial arrangement of individual cancer nuclei on histopathology images, has been shown to be associated with different grades and differential risk for a number of solid tumors such as breast, prostate, and oropharyngeal cancers. Graph-based representations of individual nuclei (nuclei representing the graph nodes) allow for the mining of quantitative metrics to describe tumor morphology. These graph features can be broadly categorized into global and local depending on the type of graph construction method. While a number of local graph (e.g. Cell Cluster Graph) and global graph (e.g. Voronoi, Delaunay Triangulation, Minimum Spanning Tree) features have been shown to be associated with cancer grade, risk, and outcome for different cancer types, the sensitivity of the preceding segmentation algorithms in identifying individual nuclei can have a significant bearing on the discriminability of the resultant features. This begs the question as to which features, while being discriminative of cancer grade and aggressiveness, are also the most resilient to segmentation errors. These properties are particularly desirable in the context of digital pathology images, where the method of slide preparation, the staining, and the type of nuclear segmentation algorithm employed can all dramatically affect the quality of the nuclear graphs and the corresponding features. In this paper we evaluated the trade-off between discriminability and stability of both global and local graph-based features in conjunction with a few different segmentation algorithms and in the context of two different histopathology image datasets of breast cancer from whole-slide images (WSI) and tissue microarrays (TMA). 
Specifically, we investigate several performance measures, including stability, discriminability, and the stability-versus-discriminability trade-off, all of which are based on p-values from the Kruskal-Wallis one-way analysis of variance for local and global graph features. Apart from identifying the set of local and global features that satisfied the trade-off between stability and discriminability, our most interesting finding was that a simple segmentation method was sufficient to identify the most discriminant features for invasive tumour detection in TMAs, whereas for tumour grading in WSI, the graph-based features were more sensitive to the accuracy of the segmentation algorithm employed.
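As a concrete example of a global graph feature of the kind discussed above, a minimum-spanning-tree edge-length profile can be computed from nuclear centroids with Prim's algorithm; this is a generic sketch, not the paper's implementation:

```python
import math

def mst_edge_lengths(points):
    """Prim's algorithm on the complete Euclidean graph of nuclear
    centroids; returns the edge lengths of the minimum spanning tree,
    whose distribution (mean, variance, ...) is one of the global
    graph features used to describe tumor morphology."""
    n = len(points)
    in_tree = [False] * n
    dist = [math.inf] * n   # cheapest connection of each node to the tree
    dist[0] = 0.0
    edges = []
    for step in range(n):
        # Pull in the cheapest node not yet in the tree.
        u = min((i for i in range(n) if not in_tree[i]), key=lambda i: dist[i])
        in_tree[u] = True
        if step > 0:
            edges.append(dist[u])
        # Relax connections through the newly added node.
        for v in range(n):
            if not in_tree[v]:
                d = math.dist(points[u], points[v])
                if d < dist[v]:
                    dist[v] = d
    return edges
```

Because the MST is determined entirely by the centroid positions, any segmentation error that shifts or drops nuclei perturbs these edge lengths, which is exactly the stability-versus-discriminability tension the paper measures.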

  20. Behavioral economics and empirical public policy.

    PubMed

    Hursh, Steven R; Roma, Peter G

    2013-01-01

    The application of economics principles to the analysis of behavior has yielded novel insights on value and choice across contexts ranging from laboratory animal research to clinical populations to national trends of global impact. Recent innovations in demand curve methods provide a credible means of quantitatively comparing qualitatively different reinforcers as well as quantifying the choice relations between concurrently available reinforcers. The potential of the behavioral economic approach to inform public policy is illustrated with examples from basic research, pre-clinical behavioral pharmacology, and clinical drug abuse research as well as emerging applications to public transportation and social behavior. Behavioral Economics can serve as a broadly applicable conceptual, methodological, and analytical framework for the development and evaluation of empirical public policy. © Society for the Experimental Analysis of Behavior.
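The demand-curve methods mentioned here are commonly operationalized with the exponential demand equation of Hursh and Silberberg (2008); the parameter values below are illustrative only:

```python
import math

def demand(cost, q0, alpha, k=2.0):
    """Exponential demand equation (Hursh & Silberberg, 2008):
        log10(Q) = log10(Q0) + k * (exp(-alpha * Q0 * C) - 1)
    where Q0 is consumption at zero cost, alpha indexes the rate of
    decline in consumption (demand elasticity), and k sets the range
    of the consumption data in log units. Returns predicted
    consumption Q at unit cost C."""
    log_q = math.log10(q0) + k * (math.exp(-alpha * q0 * cost) - 1.0)
    return 10.0 ** log_q
```

Because alpha is scaled by Q0, it can be compared across qualitatively different reinforcers, which is what allows the quantitative cross-commodity comparisons described in the abstract.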

  1. Pleistocene Lake Bonneville and Eberswalde Crater of Mars: Quantitative Methods for Recognizing Poorly Developed Lacustrine Shorelines

    NASA Astrophysics Data System (ADS)

    Jewell, P. W.

    2014-12-01

    The ability to quantify shoreline features on Earth has been aided by advances in the acquisition of high-resolution topography through laser imaging and photogrammetry. Well-defined and well-documented features such as the Bonneville, Provo, and Stansbury shorelines of Late Pleistocene Lake Bonneville are recognizable to the untrained eye and easily mappable on aerial photos. The continuity and correlation of lesser shorelines must rely on quantitative algorithms for processing high-resolution data in order to gain widespread scientific acceptance. Using Savitzky-Golay filters and the geomorphic methods and criteria described by Hare et al. [2001], minor, transgressive, erosional shorelines of Lake Bonneville have been identified and correlated across the basin with varying degrees of statistical confidence. Results solve one of the key paradoxes of Lake Bonneville first described by G. K. Gilbert in the late 19th century and point the way toward understanding climatically driven oscillations of the Last Glacial Maximum in the Great Basin of the United States. Similar techniques have been applied to the Eberswalde Crater area of Mars using HiRISE DEMs (1 m horizontal resolution), where a paleolake is hypothesized to have existed. Results illustrate the challenges of identifying shorelines where long-term aeolian processes have degraded the shorelines and field validation is not possible. The work illustrates the promises and challenges of identifying remnants of a global ocean elsewhere on the red planet.
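A minimal sketch of the filtering step: below is a 5-point quadratic Savitzky-Golay smoother using the standard coefficients (-3, 12, 17, 12, -3)/35, followed by a crude detector that flags low-slope samples on an elevation profile. The thresholding rule is a simplification invented here, not the criteria of Hare et al. [2001]:

```python
def savitzky_golay5(y):
    """5-point quadratic/cubic Savitzky-Golay smoothing with coefficients
    (-3, 12, 17, 12, -3)/35; the two samples at each end are left unsmoothed."""
    c = (-3.0, 12.0, 17.0, 12.0, -3.0)
    out = list(y)
    for i in range(2, len(y) - 2):
        out[i] = sum(ci * y[i + j - 2] for j, ci in enumerate(c)) / 35.0
    return out

def bench_indices(elevation, dx, slope_threshold):
    """Flag profile samples whose central-difference slope magnitude falls
    below a threshold after smoothing -- a crude stand-in for detecting
    shoreline benches on a high-resolution topographic profile."""
    z = savitzky_golay5(elevation)
    return [i for i in range(1, len(z) - 1)
            if abs((z[i + 1] - z[i - 1]) / (2 * dx)) < slope_threshold]
```

A useful property to keep in mind: the quadratic Savitzky-Golay filter reproduces polynomials up to degree 3 exactly, so it suppresses high-frequency noise without flattening the smooth benches it is meant to help detect.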

  2. A Bibliometric Analysis of Climate Engineering Research

    NASA Astrophysics Data System (ADS)

    Belter, C. W.; Seidel, D. J.

    2013-12-01

    The past five years have seen a dramatic increase in the number of media and scientific publications on the topic of climate engineering, or geoengineering, and some scientists are increasingly calling for more research on climate engineering as a possible supplement to climate change mitigation and adaptation strategies. In this context, understanding the current state of climate engineering research can help inform policy discussions and guide future research directions. Bibliometric analysis - the quantitative analysis of publications - is particularly applicable to fields with large bodies of literature that are difficult to summarize by traditional review methods. The multidisciplinary nature of the published literature on climate engineering makes it an ideal candidate for bibliometric analysis. Publications on climate engineering are found to be relatively recent (more than half of all articles during 1988-2011 were published since 2008), to include a higher than average percentage of non-research articles (30% compared with 8-15% in related scientific disciplines), and to be predominantly produced by English-speaking countries in the Northern Hemisphere. The majority of this literature focuses on land-based methods of carbon sequestration, ocean iron fertilization, and solar radiation management, and is produced with little collaboration among research groups. This study provides a summary of existing publications on climate engineering, a perspective on the scientific underpinnings of the global dialogue on climate engineering, and a baseline for quantitatively monitoring the development of climate engineering research in the future.

  3. World-size global markets lead to economic instability.

    PubMed

    Louzoun, Yoram; Solomon, Sorin; Goldenberg, Jacob; Mazursky, David

    2003-01-01

    Economic and cultural globalization is one of the most important processes humankind has been undergoing lately. This process is assumed to be leading the world into a wealthy society with a better life. However, the current trend of globalization is not unprecedented in human history, and has had some severe consequences in the past. By applying a quantitative analysis through a microscopic representation we show that globalization, besides being unfair (with respect to wealth distribution), is also unstable and potentially dangerous as one event may lead to a collapse of the system. It is proposed that the optimal solution in controlling the unwanted aspects and enhancing the advantageous ones lies in limiting competition to large subregions, rather than making it worldwide.

  4. A Framework for Mixing Methods in Quantitative Measurement Development, Validation, and Revision: A Case Study

    ERIC Educational Resources Information Center

    Luyt, Russell

    2012-01-01

    A framework for quantitative measurement development, validation, and revision that incorporates both qualitative and quantitative methods is introduced. It extends and adapts Adcock and Collier's work, and thus, facilitates understanding of quantitative measurement development, validation, and revision as an integrated and cyclical set of…

  5. Computational methods for global/local analysis

    NASA Technical Reports Server (NTRS)

    Ransom, Jonathan B.; Mccleary, Susan L.; Aminpour, Mohammad A.; Knight, Norman F., Jr.

    1992-01-01

    Computational methods for global/local analysis of structures which include both uncoupled and coupled methods are described. In addition, global/local analysis methodology for automatic refinement of incompatible global and local finite element models is developed. Representative structural analysis problems are presented to demonstrate the global/local analysis methods.

  6. Optical properties of volcanic ash: improving remote sensing observations.

    NASA Astrophysics Data System (ADS)

    Whelley, Patrick; Colarco, Peter; Aquila, Valentina; Krotkov, Nickolay; Bleacher, Jake; Garry, Brent; Young, Kelsey; Rocha Lima, Adriana; Martins, Vanderlei; Carn, Simon

    2016-04-01

    Many times each year, explosive volcanic eruptions loft ash into the atmosphere. Global travel and trade rely on aircraft vulnerable to encounters with airborne ash. Volcanic ash advisory centers (VAACs) rely on dispersion forecasts and satellite data to issue timely warnings. To improve ash forecasts, model developers and satellite data providers need realistic information about volcanic ash microphysical and optical properties. In anticipation of future large eruptions, we can study smaller events to improve our remote sensing and modeling skills, so that when the next Pinatubo 1991 or larger eruption occurs, ash can confidently be tracked in a quantitative way. At distances >100 km from their sources, drifting ash plumes, often above meteorological clouds, are not easily detected from conventional remote sensing platforms, let alone characterized quantitatively in terms of properties such as mass density. Quantitative interpretation of these observations depends on a priori knowledge of the spectral optical properties of the ash at UV (>0.3μm) and TIR wavelengths (>10μm). Incorrect assumptions about the optical properties result in large errors in inferred column mass loading and size distribution, which misguide operational ash forecasts. Similarly, simulating ash properties in global climate models also requires some knowledge of optical properties to improve aerosol speciation.

  7. Reconstructing Student Conceptions of Climate Change; An Inquiry Approach

    NASA Astrophysics Data System (ADS)

    McClelland, J. Collin

    No other environmental issue today has as much potential to alter life on Earth as does global climate change. Scientific evidence continues to grow; indicating that climate change is occurring now, and that change is a result of human activities (National Research Council [NRC], 2010). The need for climate literacy in society has become increasingly urgent. Unfortunately, understanding the concepts necessary for climate literacy remains a challenge for most individuals. A growing research base has identified a number of common misconceptions people have about climate literacy concepts (Leiserowitz, Smith, & Marlon 2011; Shepardson, Niyogi, Choi, & Charusombat, 2009). However, few have explored this understanding in high school students. This sequential mixed methods study explored the changing conceptions of global climate change in 90 sophomore biology students through the course of their participation in an eight-week inquiry-based global climate change unit. The study also explored changes in students' attitudes over the course of the study unit, contemplating possible relationships between students' conceptual understanding of and attitudes toward global climate change. Phase I of the mixed methods study included quantitative analysis of pre-post content knowledge and attitude assessment data. Content knowledge gains were statistically significant and over 25% of students in the study shifted from an expressed belief of denial or uncertainty about global warming to one of belief in it. Phase II used an inductive approach to explore student attitudes and conceptions. Conceptually, very few students grew to a scientifically accurate understanding of the greenhouse effect or the relationship between global warming and climate change. However, they generally made progress in their conceptual understanding by adding more specific detail to explain their understanding. 
Phase III employed a case study approach with eight purposefully selected student cases, identifying five common conceptual and five common attitude-based themes. Findings suggest similar misconceptions revealed in prior research also occurred in this study group. Some examples include connecting global warming to the hole in the ozone layer, and falsely linking unrelated environmental issues like littering to climate change. Data about students' conceptual understanding of energy may also have implications for education research and curriculum development. While no statistical relationship between students' attitudes about global climate change and overall conceptual understanding emerged, some data suggested that climate change skeptics may perceive the concept of evidence differently than non-skeptics. One-way ANOVA data comparing skeptics with other students on evidence-based assessment items was significant. This study offers teachers insights into potential barriers students face when trying to conceptualize global climate change concepts. More importantly, it reinforces the idea that students generally find value in learning about global climate change in the classroom.

  8. Global Albedo

    Atmospheric Science Data Center

    2013-04-19

    ... reflected by the Earth's surface at various wavelengths. A quantitative measure of this reflected sunlight is described by the albedo, ... parts of the planet, and for monthly as well as seasonal time increments. These and other surface and vegetation products from the MISR ...

  9. Lipid Informed Quantitation and Identification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kevin Crowell, PNNL

    2014-07-21

    LIQUID (Lipid Informed Quantitation and Identification) is a software program that has been developed to enable users to conduct both informed and high-throughput global liquid chromatography-tandem mass spectrometry (LC-MS/MS)-based lipidomics analysis. This newly designed desktop application can quickly identify and quantify lipids from LC-MS/MS datasets while providing a friendly graphical user interface for users to fully explore the data. Informed data analysis simply involves the user specifying an electrospray ionization mode, lipid common name (e.g. PE(16:0/18:2)), and associated charge carrier. A stemplot of the isotopic profile and a line plot of the extracted ion chromatogram are also provided to show the MS-level evidence of the identified lipid. In addition to plots, other information such as intensity, mass measurement error, and elution time are also provided. Typically, a global analysis for 15,000 lipid targets ...

  10. Quantification of local and global benefits from air pollution control in Mexico City.

    PubMed

    McKinley, Galen; Zuk, Miriam; Höjer, Morten; Avalos, Montserrat; González, Isabel; Iniestra, Rodolfo; Laguna, Israel; Martínez, Miguel A; Osnaya, Patricia; Reynales, Luz M; Valdés, Raydel; Martínez, Julia

    2005-04-01

    Complex sociopolitical, economic, and geographical realities cause the 20 million residents of Mexico City to suffer from some of the worst air pollution conditions in the world. Greenhouse gas emissions from the city are also substantial, and opportunities for joint local-global air pollution control are being sought. Although a plethora of measures to improve local air quality and reduce greenhouse gas emissions have been proposed for Mexico City, resources are not available for implementation of all proposed controls and thus prioritization must occur. Yet policy makers often do not conduct comprehensive quantitative analyses to inform these decisions. We reanalyze a subset of currently proposed control measures, and derive cost and health benefit estimates that are directly comparable. This study illustrates that improved quantitative analysis can change implementation prioritization for air pollution and greenhouse gas control measures in Mexico City.

  11. Measuring changes in transmission of neglected tropical diseases, malaria, and enteric pathogens from quantitative antibody levels.

    PubMed

    Arnold, Benjamin F; van der Laan, Mark J; Hubbard, Alan E; Steel, Cathy; Kubofcik, Joseph; Hamlin, Katy L; Moss, Delynn M; Nutman, Thomas B; Priest, Jeffrey W; Lammie, Patrick J

    2017-05-01

    Serological antibody levels are a sensitive marker of pathogen exposure, and advances in multiplex assays have created enormous potential for large-scale, integrated infectious disease surveillance. Most methods to analyze antibody measurements reduce quantitative antibody levels to seropositive and seronegative groups, but this can be difficult for many pathogens and may provide lower resolution information than quantitative levels. Analysis methods have predominantly maintained a single disease focus, yet integrated surveillance platforms would benefit from methodologies that work across diverse pathogens included in multiplex assays. We developed an approach to measure changes in transmission from quantitative antibody levels that can be applied to diverse pathogens of global importance. We compared age-dependent immunoglobulin G curves in repeated cross-sectional surveys between populations with differences in transmission for multiple pathogens, including: lymphatic filariasis (Wuchereria bancrofti) measured before and after mass drug administration on Mauke, Cook Islands, malaria (Plasmodium falciparum) before and after a combined insecticide and mass drug administration intervention in the Garki project, Nigeria, and enteric protozoans (Cryptosporidium parvum, Giardia intestinalis, Entamoeba histolytica), bacteria (enterotoxigenic Escherichia coli, Salmonella spp.), and viruses (norovirus groups I and II) in children living in Haiti and the USA. Age-dependent antibody curves fit with ensemble machine learning followed a characteristic shape across pathogens that aligned with predictions from basic mechanisms of humoral immunity. Differences in pathogen transmission led to shifts in fitted antibody curves that were remarkably consistent across pathogens, assays, and populations. Mean antibody levels correlated strongly with traditional measures of transmission intensity, such as the entomological inoculation rate for P. falciparum (Spearman's rho = 0.75). 
In both high- and low-transmission settings, mean antibody curves revealed changes in population mean antibody levels that were masked by seroprevalence measures because changes took place above or below the seropositivity cutoff. Age-dependent antibody curves and summary means provided a robust and sensitive measure of changes in transmission, with greatest sensitivity among young children. The method generalizes to pathogens that can be measured in high-throughput, multiplex serological assays, and scales to surveillance activities that require high spatiotemporal resolution. Our results suggest quantitative antibody levels will be particularly useful to measure differences in exposure for pathogens that elicit a transient antibody response or for monitoring populations with very high or very low transmission, when seroprevalence is less informative. The approach represents a new opportunity to conduct integrated serological surveillance for neglected tropical diseases, malaria, and other infectious diseases with well-defined antigen targets.

  12. Evaluation of background parenchymal enhancement on breast MRI: a systematic review

    PubMed Central

    Signori, Alessio; Valdora, Francesca; Rossi, Federica; Calabrese, Massimo; Durando, Manuela; Mariscotto, Giovanna; Tagliafico, Alberto

    2017-01-01

    Objective: To perform a systematic review of the methods used for background parenchymal enhancement (BPE) evaluation on breast MRI. Methods: Studies dealing with BPE assessment on breast MRI were retrieved from major medical libraries independently by four reviewers up to 6 October 2015. The keywords used for database searching were “background parenchymal enhancement”, “parenchymal enhancement”, “MRI” and “breast”. The studies were included if qualitative and/or quantitative methods for BPE assessment were described. Results: Of the 420 studies identified, a total of 52 articles were included in the systematic review. 28 studies performed only a qualitative assessment of BPE, 13 studies performed only a quantitative assessment and 11 studies performed both qualitative and quantitative assessments. A wide heterogeneity was found in the MRI sequences and in the quantitative methods used for BPE assessment. Conclusion: A wide variability exists in the quantitative evaluation of BPE on breast MRI. More studies focused on a reliable and comparable method for quantitative BPE assessment are needed. Advances in knowledge: More studies focused on a quantitative BPE assessment are needed. PMID:27925480

  13. 12 CFR 614.4362 - Loan and lease concentration risk mitigation policy.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... include: (1) A purpose and objective; (2) Clearly defined and consistently used terms; (3) Quantitative... exceptions and reporting requirements. (b) Quantitative methods. (1) At a minimum, the quantitative methods...

  14. 12 CFR 614.4362 - Loan and lease concentration risk mitigation policy.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... include: (1) A purpose and objective; (2) Clearly defined and consistently used terms; (3) Quantitative... exceptions and reporting requirements. (b) Quantitative methods. (1) At a minimum, the quantitative methods...

  15. 12 CFR 614.4362 - Loan and lease concentration risk mitigation policy.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... include: (1) A purpose and objective; (2) Clearly defined and consistently used terms; (3) Quantitative... exceptions and reporting requirements. (b) Quantitative methods. (1) At a minimum, the quantitative methods...

  16. Comparison of culture-based, vital stain and PMA-qPCR methods for the quantitative detection of viable hookworm ova.

    PubMed

    Gyawali, P; Sidhu, J P S; Ahmed, W; Jagals, P; Toze, S

    2017-06-01

    Accurate quantitative measurement of viable hookworm ova from environmental samples is the key to controlling hookworm re-infections in the endemic regions. In this study, the accuracy of three quantitative detection methods [culture-based, vital stain and propidium monoazide-quantitative polymerase chain reaction (PMA-qPCR)] was evaluated by enumerating 1,000 ± 50 Ancylostoma caninum ova in the laboratory. The culture-based method was able to quantify an average of 397 ± 59 viable hookworm ova. Similarly, vital stain and PMA-qPCR methods quantified 644 ± 87 and 587 ± 91 viable ova, respectively. The numbers of viable ova estimated by the culture-based method were significantly (P < 0.05) lower than vital stain and PMA-qPCR methods. Therefore, both PMA-qPCR and vital stain methods appear to be suitable for the quantitative detection of viable hookworm ova. However, PMA-qPCR would be preferable over the vital stain method in scenarios where ova speciation is needed.
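
    As a rough illustration of how the reported means translate into recovery against the seeded dose, the snippet below recomputes each method's recovery fraction (the counts are taken from the abstract; the helper function name is ours, not the authors'):

    ```python
    def recovery_rate(detected_mean, seeded=1000):
        """Fraction of seeded ova a method recovers as viable."""
        return detected_mean / seeded

    # Mean viable counts reported for the three methods (1,000 +/- 50 ova seeded):
    for name, mean in [("culture", 397), ("vital stain", 644), ("PMA-qPCR", 587)]:
        print(f"{name}: {recovery_rate(mean):.1%}")
    ```

    Seen this way, the culture-based method recovers well under half of the seeded ova, which is consistent with the abstract's conclusion that it significantly underestimates viability relative to the other two methods.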

  17. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method

    PubMed Central

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-01-01

    Objective: To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. Methods: TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Results: Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. Conclusions: The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil. PMID:25182282

  18. Critically appraising qualitative research: a guide for clinicians more familiar with quantitative techniques.

    PubMed

    Kisely, Stephen; Kendall, Elizabeth

    2011-08-01

    Papers using qualitative methods are increasingly common in psychiatric journals. This overview is an introduction to critically appraising a qualitative paper for clinicians who are more familiar with quantitative methods. Qualitative research uses data from interviews (semi-structured or unstructured), focus groups, observations or written materials. Data analysis is inductive, allowing meaning to emerge from the data, rather than the more deductive, hypothesis-centred approach of quantitative research. This overview compares and contrasts quantitative and qualitative research methods. Quantitative concepts such as reliability, validity, statistical power, bias and generalisability have qualitative equivalents. These include triangulation, trustworthiness, saturation, reflexivity and applicability. Reflexivity also shares features of transference. Qualitative approaches include: ethnography, action-assessment, grounded theory, case studies and mixed methods. Qualitative research can complement quantitative approaches. An understanding of both is useful in critically appraising the psychiatric literature.

  19. Challenges and Opportunities for Integrating Social Science Perspectives into Climate and Global Change Assessments

    NASA Astrophysics Data System (ADS)

    Larson, E. K.; Li, J.; Zycherman, A.

    2017-12-01

    Integration of social science into climate and global change assessments is fundamental for improving understanding of the drivers, impacts and vulnerability of climate change, and the social, cultural and behavioral challenges related to climate change responses. This requires disciplinary and interdisciplinary knowledge as well as integrational and translational tools for linking this knowledge with the natural and physical sciences. The USGCRP's Social Science Coordinating Committee (SSCC) is tasked with this challenge and is working to integrate relevant social, economic and behavioral knowledge into processes like sustained assessments. This presentation will discuss outcomes from a recent SSCC workshop, "Social Science Perspectives on Climate Change" and their applications to sustained assessments. The workshop brought academic social scientists from four disciplines - anthropology, sociology, geography and archaeology - together with federal scientists and program managers to discuss three major research areas relevant to the USGCRP and climate assessments: (1) innovative tools, methods, and analyses to clarify the interactions of human and natural systems under climate change, (2) understanding of factors contributing to differences in social vulnerability between and within communities under climate change, and (3) social science perspectives on drivers of global climate change. These disciplines, collectively, emphasize the need to consider socio-cultural, political, economic, geographic, and historic factors, and their dynamic interactions, to understand climate change drivers, social vulnerability, and mitigation and adaptation responses. They also highlight the importance of mixed quantitative and qualitative methods to explain impacts, vulnerability, and responses at different time and spatial scales. This presentation will focus on major contributions of the social sciences to climate and global change research. 
We will discuss future directions for sustained assessments that integrate and reflect the social science understanding of the complex relationships between social and natural worlds in a changing climate, and factors that impact effective mitigation and adaptation strategies that address risks and vulnerabilities of climate change.

  20. Global patterns and impacts of El Niño events on coral reefs: A meta-analysis.

    PubMed

    Claar, Danielle C; Szostek, Lisa; McDevitt-Irwin, Jamie M; Schanze, Julian J; Baum, Julia K

    2018-01-01

    Impacts of global climate change on coral reefs are being amplified by pulse heat stress events, including El Niño, the warm phase of the El Niño Southern Oscillation (ENSO). Despite reports of extensive coral bleaching and up to 97% coral mortality induced by El Niño events, a quantitative synthesis of the nature, intensity, and drivers of El Niño and La Niña impacts on corals is lacking. Herein, we first present a global meta-analysis of studies quantifying the effects of El Niño/La Niña-warming on corals, surveying studies from both the primary literature and International Coral Reef Symposium (ICRS) Proceedings. Overall, the strongest signal for El Niño/La Niña-associated coral bleaching was long-term mean temperature; bleaching decreased with decreasing long-term mean temperature (n = 20 studies). Additionally, coral cover losses during El Niño/La Niña were shaped by localized maximum heat stress and long-term mean temperature (n = 28 studies). Second, we present a method for quantifying coral heat stress which, for any coral reef location in the world, allows extraction of remotely-sensed degree heating weeks (DHW) for any date (since 1982), quantification of the maximum DHW, and the time lag since the maximum DHW. Using this method, we show that the 2015/16 El Niño event instigated unprecedented global coral heat stress across the world's oceans. With El Niño events expected to increase in frequency and severity this century, it is imperative that we gain a clear understanding of how these thermal stress anomalies impact different coral species and coral reef regions. We therefore finish with recommendations for future coral bleaching studies that will foster improved syntheses, as well as predictive and adaptive capacity to extreme warming events.
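
    The abstract does not spell out how degree heating weeks are computed; NOAA Coral Reef Watch conventionally accumulates SST exceedances of at least 1 °C above the climatological maximum monthly mean (MMM) over a rolling 12-week window, in °C-weeks. A minimal sketch under that assumption (function names, window handling, and the example series are illustrative, not the authors' implementation):

    ```python
    def degree_heating_weeks(sst_weekly, mmm, window=12):
        """Rolling accumulation of weekly HotSpots (SST - MMM) in degC-weeks.
        Only exceedances of at least 1 degC count toward the accumulation."""
        hotspots = [max(0.0, t - mmm) for t in sst_weekly]
        counted = [h if h >= 1.0 else 0.0 for h in hotspots]
        return [sum(counted[max(0, i - window + 1): i + 1])
                for i in range(len(counted))]

    # Four consecutive weeks at 2 degC above a 28 degC MMM accumulate
    # 4 weeks x 2 degC = 8 degC-weeks by the final week:
    series = [28.0] * 8 + [30.0] * 4
    print(degree_heating_weeks(series, mmm=28.0)[-1])
    ```

    The maximum of this series and the time since that maximum are then the two summary quantities the authors extract for any reef location.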

  1. Global patterns and impacts of El Niño events on coral reefs: A meta-analysis

    PubMed Central

    Szostek, Lisa; McDevitt-Irwin, Jamie M.; Schanze, Julian J.; Baum, Julia K.

    2018-01-01

    Impacts of global climate change on coral reefs are being amplified by pulse heat stress events, including El Niño, the warm phase of the El Niño Southern Oscillation (ENSO). Despite reports of extensive coral bleaching and up to 97% coral mortality induced by El Niño events, a quantitative synthesis of the nature, intensity, and drivers of El Niño and La Niña impacts on corals is lacking. Herein, we first present a global meta-analysis of studies quantifying the effects of El Niño/La Niña-warming on corals, surveying studies from both the primary literature and International Coral Reef Symposium (ICRS) Proceedings. Overall, the strongest signal for El Niño/La Niña-associated coral bleaching was long-term mean temperature; bleaching decreased with decreasing long-term mean temperature (n = 20 studies). Additionally, coral cover losses during El Niño/La Niña were shaped by localized maximum heat stress and long-term mean temperature (n = 28 studies). Second, we present a method for quantifying coral heat stress which, for any coral reef location in the world, allows extraction of remotely-sensed degree heating weeks (DHW) for any date (since 1982), quantification of the maximum DHW, and the time lag since the maximum DHW. Using this method, we show that the 2015/16 El Niño event instigated unprecedented global coral heat stress across the world's oceans. With El Niño events expected to increase in frequency and severity this century, it is imperative that we gain a clear understanding of how these thermal stress anomalies impact different coral species and coral reef regions. We therefore finish with recommendations for future coral bleaching studies that will foster improved syntheses, as well as predictive and adaptive capacity to extreme warming events. PMID:29401493

  2. Partial volume correction and image segmentation for accurate measurement of standardized uptake value of grey matter in the brain.

    PubMed

    Bural, Gonca; Torigian, Drew; Basu, Sandip; Houseni, Mohamed; Zhuge, Ying; Rubello, Domenico; Udupa, Jayaram; Alavi, Abass

    2015-12-01

    Our aim was to explore a novel quantitative method [based upon an MRI-based image segmentation that allows actual calculation of grey matter, white matter and cerebrospinal fluid (CSF) volumes] for overcoming the difficulties associated with conventional techniques for measuring actual metabolic activity of the grey matter. We included four patients with normal brain MRI and fluorine-18 fluorodeoxyglucose ((18)F-FDG)-PET scans (two women and two men; mean age 46±14 years) in this analysis. The time interval between the two scans was 0-180 days. We calculated the volumes of grey matter, white matter and CSF by using a novel segmentation technique applied to the MRI images. We measured the mean standardized uptake value (SUV) representing the whole metabolic activity of the brain from the (18)F-FDG-PET images. We also calculated the white matter SUV from the upper transaxial slices (centrum semiovale) of the (18)F-FDG-PET images. The whole brain volume was calculated by summing up the volumes of the white matter, grey matter and CSF. The global cerebral metabolic activity was calculated by multiplying the mean SUV with total brain volume. The whole brain white matter metabolic activity was calculated by multiplying the mean SUV for the white matter by the white matter volume. The global cerebral metabolic activity only reflects those of the grey matter and the white matter, whereas that of the CSF is zero. We subtracted the global white matter metabolic activity from that of the whole brain, resulting in the global grey matter metabolism alone. We then divided the grey matter global metabolic activity by grey matter volume to accurately calculate the SUV for the grey matter alone. The brain volumes ranged between 1546 and 1924 ml. The mean SUV for total brain was 4.8-7. Total metabolic burden of the brain ranged from 5565 to 9617. The mean SUV for white matter was 2.8-4.1. On the basis of these measurements we generated the grey matter SUV, which ranged from 8.1 to 11.3. 
The accurate metabolic activity of the grey matter can be calculated using the novel segmentation technique that we applied to MRI. By combining these quantitative data with those generated from (18)F-FDG-PET images we were able to calculate the accurate metabolic activity of the grey matter. These types of measurements will be of great value in accurate analysis of the data from patients with neuropsychiatric disorders.
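
    The subtraction workflow described in this record reduces to simple arithmetic; a minimal sketch follows (variable names and the example values are illustrative, chosen to fall within the reported ranges, and are not the study's actual data):

    ```python
    def grey_matter_suv(mean_suv_brain, wm_suv, gm_vol, wm_vol, csf_vol):
        """Estimate grey-matter SUV by subtracting white-matter metabolic
        activity from whole-brain activity; CSF contributes zero activity."""
        total_vol = gm_vol + wm_vol + csf_vol          # whole brain volume (ml)
        global_activity = mean_suv_brain * total_vol   # total metabolic burden
        wm_activity = wm_suv * wm_vol                  # white-matter contribution
        gm_activity = global_activity - wm_activity    # grey matter only
        return gm_activity / gm_vol                    # grey-matter SUV

    # Illustrative inputs: mean brain SUV 5.5, white-matter SUV 3.0,
    # volumes (ml) for grey matter, white matter and CSF respectively.
    print(round(grey_matter_suv(5.5, 3.0, 700, 600, 300), 2))
    ```

    With these inputs the grey-matter SUV comes out near the 8.1-11.3 range the record reports, which is the expected behaviour: removing the lower-uptake white matter raises the estimate above the whole-brain mean.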

  3. Revisiting the Quantitative-Qualitative Debate: Implications for Mixed-Methods Research

    PubMed Central

    SALE, JOANNA E. M.; LOHFELD, LYNNE H.; BRAZIL, KEVIN

    2015-01-01

    Health care research includes many studies that combine quantitative and qualitative methods. In this paper, we revisit the quantitative-qualitative debate and review the arguments for and against using mixed-methods. In addition, we discuss the implications stemming from our view, that the paradigms upon which the methods are based have a different view of reality and therefore a different view of the phenomenon under study. Because the two paradigms do not study the same phenomena, quantitative and qualitative methods cannot be combined for cross-validation or triangulation purposes. However, they can be combined for complementary purposes. Future standards for mixed-methods research should clearly reflect this recommendation. PMID:26523073

  4. Distance-based microfluidic quantitative detection methods for point-of-care testing.

    PubMed

    Tian, Tian; Li, Jiuxing; Song, Yanling; Zhou, Leiji; Zhu, Zhi; Yang, Chaoyong James

    2016-04-07

    Equipment-free devices with quantitative readout are of great significance to point-of-care testing (POCT), which provides real-time readout to users and is especially important in low-resource settings. Among various equipment-free approaches, distance-based visual quantitative detection methods rely on reading the visual signal length for corresponding target concentrations, thus eliminating the need for sophisticated instruments. The distance-based methods are low-cost, user-friendly and can be integrated into portable analytical devices. Moreover, such methods enable quantitative detection of various targets by the naked eye. In this review, we first introduce the concept and history of distance-based visual quantitative detection methods. Then, we summarize the main methods for translation of molecular signals to distance-based readout and discuss different microfluidic platforms (glass, PDMS, paper and thread) in terms of applications in biomedical diagnostics, food safety monitoring, and environmental analysis. Finally, the potential and future perspectives are discussed.

  5. [Simultaneous quantitative analysis of five alkaloids in Sophora flavescens by multi-components assay by single marker].

    PubMed

    Chen, Jing; Wang, Shu-Mei; Meng, Jiang; Sun, Fei; Liang, Sheng-Wang

    2013-05-01

    To establish a new method for quality evaluation and validate its feasibility by simultaneous quantitative assay of five alkaloids in Sophora flavescens. The new quality evaluation method, quantitative analysis of multi-components by single marker (QAMS), was established and validated with S. flavescens. Five main alkaloids, oxymatrine, sophocarpine, matrine, oxysophocarpine and sophoridine, were selected as analytes to evaluate the quality of the rhizome of S. flavescens, and the relative correction factor showed good repeatability. Their contents in 21 batches of samples, collected from different areas, were determined by both the external standard method and QAMS. The method was evaluated by comparison of the quantitative results between the external standard method and QAMS. No significant differences were found in the quantitative results of the five alkaloids in 21 batches of S. flavescens determined by the external standard method and QAMS. It is feasible and suitable to evaluate the quality of the rhizome of S. flavescens by QAMS.
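
    QAMS rests on a relative correction factor estimated once from reference standards and then reused to quantify every analyte against the single marker's calibration. Conventions for defining the factor vary between papers; the sketch below uses one common formulation and is a hedged illustration, not the authors' exact procedure (all names and numbers are ours):

    ```python
    def relative_correction_factor(marker_area, marker_conc,
                                   analyte_area, analyte_conc):
        """f = (A_s / C_s) / (A_i / C_i), estimated from reference standards
        of the marker (s) and the analyte (i)."""
        return (marker_area / marker_conc) / (analyte_area / analyte_conc)

    def qams_concentration(analyte_area, marker_area, marker_conc, f):
        """Concentration of analyte i inferred from the marker's calibration:
        C_i = f * C_s * A_i / A_s."""
        return f * marker_conc * analyte_area / marker_area

    # Calibration: marker gives area 100 at 1.0 mg/mL; analyte gives 80 at 1.0 mg/mL.
    f = relative_correction_factor(100.0, 1.0, 80.0, 1.0)
    # Sample run: marker area 90 at a known 0.9 mg/mL, analyte area 72.
    print(round(qams_concentration(72.0, 90.0, 0.9, f), 3))
    ```

    The appeal of the approach, as the record notes, is that only the marker standard is needed at analysis time; the correction factors for the other four alkaloids are established in advance.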

  6. Sexual Violence against Men Who Have Sex with Men and Transgender Women in Mongolia: A Mixed-Methods Study of Scope and Consequences.

    PubMed

    Peitzmeier, Sarah M; Yasin, Faiza; Stephenson, Rob; Wirtz, Andrea L; Delegchoimbol, Altanchimeg; Dorjgotov, Myagmardorj; Baral, Stefan

    2015-01-01

    The role of sexual violence in health and human rights-related outcomes, including HIV, is receiving increasing attention globally, yet the prevalence, patterns, and correlates of sexual violence have been little-studied among men who have sex with men (MSM) and transgender women in low and middle income countries. A mixed-methods study with quantitative and qualitative phases was conducted among MSM and transgender women in Ulaanbaatar, Mongolia. Methods included respondent-driven sampling (RDS) with structured socio-behavioral surveys (N = 313) as well as qualitative methods including 30 in-depth interviews and 2 focus group discussions. Forced sex in the last three years was reported by 14.7% of respondents (RDS-weighted estimate, 95%CI: 9.4-20.1; crude estimate 16.1%, 49/307) in the quantitative phase. A descriptive typology of common scenarios was constructed based on the specific incidents of sexual violence shared by respondents in the qualitative phase (37 incidents across 28 interviews and 2 focus groups). Eight major types of sexual violence were identified, most frequent of which were bias-motivated street violence and alcohol-involved party-related violence. Many vulnerabilities to and consequences of sexual violence described during the qualitative phase were also independently associated with forced sex, including alcohol use at least once per week (AOR = 3.39, 95% CI:1.69-6.81), and having received payment for sex (AOR = 2.77, 95% CI:1.14-6.75). Building on the promising strategies used in other settings to prevent and respond to sexual violence, similar strengthening of legal and social sector responses may provide much needed support to survivors and prevent future sexual violence.

  7. Sexual Violence against Men Who Have Sex with Men and Transgender Women in Mongolia: A Mixed-Methods Study of Scope and Consequences

    PubMed Central

    Peitzmeier, Sarah M.; Yasin, Faiza; Stephenson, Rob; Wirtz, Andrea L.; Delegchoimbol, Altanchimeg; Dorjgotov, Myagmardorj; Baral, Stefan

    2015-01-01

    The role of sexual violence in health and human rights-related outcomes, including HIV, is receiving increasing attention globally, yet the prevalence, patterns, and correlates of sexual violence have been little-studied among men who have sex with men (MSM) and transgender women in low and middle income countries. A mixed-methods study with quantitative and qualitative phases was conducted among MSM and transgender women in Ulaanbaatar, Mongolia. Methods included respondent-driven sampling (RDS) with structured socio-behavioral surveys (N = 313) as well as qualitative methods including 30 in-depth interviews and 2 focus group discussions. Forced sex in the last three years was reported by 14.7% of respondents (RDS-weighted estimate, 95%CI: 9.4–20.1; crude estimate 16.1%, 49/307) in the quantitative phase. A descriptive typology of common scenarios was constructed based on the specific incidents of sexual violence shared by respondents in the qualitative phase (37 incidents across 28 interviews and 2 focus groups). Eight major types of sexual violence were identified, most frequent of which were bias-motivated street violence and alcohol-involved party-related violence. Many vulnerabilities to and consequences of sexual violence described during the qualitative phase were also independently associated with forced sex, including alcohol use at least once per week (AOR = 3.39, 95% CI:1.69–6.81), and having received payment for sex (AOR = 2.77, 95% CI:1.14–6.75). Building on the promising strategies used in other settings to prevent and respond to sexual violence, similar strengthening of legal and social sector responses may provide much needed support to survivors and prevent future sexual violence. PMID:26431311

  8. The feasibility of community level interventions for pre-eclampsia in South Asia and Sub-Saharan Africa: a mixed-methods design.

    PubMed

    Khowaja, Asif Raza; Qureshi, Rahat Najam; Sawchuck, Diane; Oladapo, Olufemi T; Adetoro, Olalekan O; Orenuga, Elizabeth A; Bellad, Mrutyunjaya; Mallapur, Ashalata; Charantimath, Umesh; Sevene, Esperança; Munguambe, Khátia; Boene, Helena Edith; Vidler, Marianne; Bhutta, Zulfiqar A; von Dadelszen, Peter

    2016-06-08

    Globally, pre-eclampsia and eclampsia are major contributors to maternal and perinatal mortality, of which the vast majority of deaths occur in less developed countries. In addition, a disproportionate number of morbidities and mortalities occur due to delayed access to health services. The Community Level Interventions for Pre-eclampsia (CLIP) Trial aims to task-shift to community health workers the identification and emergency management of pre-eclampsia and eclampsia to improve access and timely care. The literature revealed a paucity of published feasibility assessments prior to initiating large-scale community-based interventions. Arguably, well-conducted feasibility studies can provide valuable information about the potential success of clinical trials prior to implementation. Failure to fully understand the study context risks the effective implementation of the intervention and limits the likelihood of post-trial scale-up. Therefore, it was imperative to conduct community-level feasibility assessments for a trial of this magnitude. A mixed methods design guided by normalization process theory was used for this study in Nigeria, Mozambique, Pakistan, and India to explore enabling and impeding factors for the CLIP Trial implementation. Qualitative data were collected through participant observation, document review, focus group discussion and in-depth interviews with diverse groups of community members, key informants at community level, healthcare providers, and policy makers. Quantitative data were collected through health facility assessments, self-administered community health worker surveys, and household demographic and health surveillance. Refer to CLIP Trial feasibility publications in the current and/or forthcoming supplement. Feasibility assessments for community level interventions, particularly those involving task-shifting across diverse regions, require an appropriate theoretical framework and careful selection of research methods. 
The use of qualitative and quantitative methods increased the data richness to better understand the community contexts. NCT01911494.

  9. How to perform Subjective Global Nutritional assessment in children.

    PubMed

    Secker, Donna J; Jeejeebhoy, Khursheed N

    2012-03-01

    Subjective Global Assessment (SGA) is a method for evaluating nutritional status based on a practitioner's clinical judgment rather than objective, quantitative measurements. Encompassing historical, symptomatic, and physical parameters, SGA aims to identify an individual's initial nutrition state and consider the interplay of factors influencing the progression or regression of nutrition abnormalities. SGA has been widely used for more than 25 years to assess the nutritional status of adults in both clinical and research settings. Perceiving multiple benefits of its use in children, we recently adapted and validated the SGA tool for use in a pediatric population, demonstrating its ability to identify the nutritional status of children undergoing surgery and their risk of developing nutrition-associated complications postoperatively. Objective measures of nutritional status, on the other hand, showed no association with outcomes. The purpose of this article is to describe in detail the methods used in conducting nutrition-focused physical examinations and the medical history components of a pediatric Subjective Global Nutritional Assessment tool. Guidelines are given for performing and interpreting physical examinations that look for evidence of loss of subcutaneous fat, muscle wasting, and/or edema in children of different ages. Age-related questionnaires are offered to guide history taking and the rating of growth, weight changes, dietary intake, gastrointestinal symptoms, functional capacity, and any metabolic stress. Finally, the associated rating form is provided, along with direction for how to consider all components of a physical exam and history in the context of each other, to assign an overall rating of normal/well nourished, moderate malnutrition, or severe malnutrition. 
With this information, interested health professionals will be able to perform Subjective Global Nutritional Assessment to determine a global rating of nutritional status for infants, children, and adolescents, and use this rating to guide decision making about what nutrition-related attention is necessary. Dietetics practitioners and other clinicians are encouraged to incorporate physical examination for signs of protein-energy depletion when assessing the nutritional status of children. Copyright © 2012 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.

  10. Review of life-cycle approaches coupled with data envelopment analysis: launching the CFP + DEA method for energy policy making.

    PubMed

    Vázquez-Rowe, Ian; Iribarren, Diego

    2015-01-01

    Life-cycle (LC) approaches play a significant role in energy policy making to determine the environmental impacts associated with the choice of energy source. Data envelopment analysis (DEA) can be combined with LC approaches to provide quantitative benchmarks that orientate the performance of energy systems towards environmental sustainability, with different implications depending on the selected LC + DEA method. The present paper examines currently available LC + DEA methods and develops a novel method combining carbon footprinting (CFP) and DEA. Thus, the CFP + DEA method is proposed, a five-step structure including data collection for multiple homogenous entities, calculation of target operating points, evaluation of current and target carbon footprints, and result interpretation. As the current context for energy policy implies an anthropocentric perspective with focus on the global warming impact of energy systems, the CFP + DEA method is foreseen to be the most consistent LC + DEA approach to provide benchmarks for energy policy making. The fact that this method relies on the definition of operating points with optimised resource intensity helps to moderate the concerns about the omission of other environmental impacts. Moreover, the CFP + DEA method benefits from CFP specifications in terms of flexibility, understanding, and reporting.
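    The benchmarking step at the heart of any LC + DEA method can be illustrated with a deliberately simplified sketch. Under the classic input-oriented CCR model with a single input (here, carbon footprint) and a single output (energy delivered), each entity's efficiency reduces to the ratio of its productivity to the best observed productivity, and its "target operating point" is the input it would need to become efficient at its current output. The entity data below are hypothetical; real CFP + DEA studies handle multiple inputs/outputs via linear programming:

    ```python
    def ccr_efficiency(inputs, outputs):
        """Input-oriented CCR efficiency scores for the single-input,
        single-output case: each unit is scored against the best
        observed output/input ratio (efficient units score 1.0)."""
        best = max(y / x for x, y in zip(inputs, outputs))
        return [(y / x) / best for x, y in zip(inputs, outputs)]

    def target_input(inputs, outputs):
        """Target operating points: the reduced input each unit would
        need in order to be efficient at its current output level."""
        effs = ccr_efficiency(inputs, outputs)
        return [x * e for x, e in zip(inputs, effs)]

    # Hypothetical entities: carbon footprint (kt CO2e) vs energy (GWh)
    footprints = [10.0, 25.0, 18.0]
    energy     = [100.0, 150.0, 180.0]
    scores = ccr_efficiency(footprints, energy)   # unit 2 is inefficient
    ```

    Here the second entity scores 0.6, so its target carbon footprint is 15 kt CO2e for the same energy output; comparing current and target footprints is exactly the "evaluation of current and target carbon footprints" step of the proposed five-step structure.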

  12. Challenges for modeling global gene regulatory networks during development: insights from Drosophila.

    PubMed

    Wilczynski, Bartek; Furlong, Eileen E M

    2010-04-15

    Development is regulated by dynamic patterns of gene expression, which are orchestrated through the action of complex gene regulatory networks (GRNs). Substantial progress has been made in modeling transcriptional regulation in recent years, ranging from qualitative "coarse-grain" models operating at the gene level to very "fine-grain" quantitative models operating at the biophysical transcription factor-DNA level. Recent advances in genome-wide studies have revealed an enormous increase in the size and complexity of GRNs. Even relatively simple developmental processes can involve hundreds of regulatory molecules, with extensive interconnectivity and cooperative regulation. This leads to an explosion in the number of regulatory functions, effectively impeding Boolean-based qualitative modeling approaches. At the same time, the lack of information on the biophysical properties of the majority of transcription factors within a global network restricts quantitative approaches. In this review, we explore the current challenges in moving from modeling medium-scale, well-characterized networks to more poorly characterized global networks. We suggest integrating coarse- and fine-grain approaches to model gene regulatory networks in cis. We focus on two very well-studied examples from Drosophila, which likely represent typical developmental regulatory modules across metazoans. Copyright (c) 2009 Elsevier Inc. All rights reserved.
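    The "explosion in the number of regulatory functions" noted in this abstract is concrete combinatorics: a gene with k Boolean regulators admits 2^(2^k) possible update rules, so the rule space becomes astronomically large even for modest connectivity. A minimal sketch of synchronous Boolean-network updating, using a hypothetical two-gene toy network (the gene names and rules below are illustrative, not from the paper):

    ```python
    def boolean_functions(k):
        """Number of distinct Boolean update rules a gene with k
        regulators can have: one truth-table output per input pattern."""
        return 2 ** (2 ** k)

    def step(state, rules):
        """One synchronous update of a Boolean GRN: every gene's next
        state is its rule applied to the current network state."""
        return {gene: rule(state) for gene, rule in rules.items()}

    # Hypothetical two-gene toy network: "a" is activated by "b";
    # "b" is activated by "a" but repressed by itself.
    rules = {
        "a": lambda s: s["b"],
        "b": lambda s: s["a"] and not s["b"],
    }
    state = {"a": True, "b": False}
    state = step(state, rules)   # -> {"a": False, "b": True}
    ```

    With k = 10 regulators, boolean_functions(10) = 2^1024 already exceeds 10^308, which is one way to see why exhaustive Boolean enumeration breaks down for densely connected global networks.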

  13. Role of platinum DNA damage-induced transcriptional inhibition in chemotherapy-induced neuronal atrophy and peripheral neurotoxicity.

    PubMed

    Yan, Fang; Liu, Johnson J; Ip, Virginia; Jamieson, Stephen M F; McKeage, Mark J

    2015-12-01

    Platinum-based anticancer drugs cause peripheral neurotoxicity by damaging sensory neurons within the dorsal root ganglia (DRG), but the mechanisms are incompletely understood. The roles of platinum DNA binding, transcription inhibition and altered cell size were investigated in primary cultures of rat DRG cells. Click chemistry quantitative fluorescence imaging of RNA-incorporated 5-ethynyluridine showed high, but wide ranging, global levels of transcription in individual neurons that correlated with their cell body size. Treatment with platinum drugs reduced neuronal transcription and cell body size to an extent that corresponded to the amount of preceding platinum DNA binding, but without any loss of neuronal cells. The effects of platinum drugs on neuronal transcription and cell body size were inhibited by blocking platinum DNA binding with sodium thiosulfate, and mimicked by treatment with a model transcriptional inhibitor, actinomycin D. In vivo oxaliplatin treatment depleted the total RNA content of DRG tissue concurrently with altering DRG neuronal size. These findings point to a mechanism of chemotherapy-induced peripheral neurotoxicity, whereby platinum DNA damage induces global transcriptional arrest leading in turn to neuronal atrophy. DRG neurons may be particularly vulnerable to this mechanism of toxicity because of their requirements for high basal levels of global transcriptional activity. Findings point to a new stepwise mechanism of chemotherapy-induced peripheral neurotoxicity, whereby platinum DNA damage induces global transcriptional arrest leading in turn to neuronal atrophy. Dorsal root ganglion neurons may be particularly vulnerable to this neurotoxicity because of their high global transcriptional outputs, demonstrated in this study by click chemistry quantitative fluorescence imaging. © 2015 International Society for Neurochemistry.

  14. Validation of PCR methods for quantitation of genetically modified plants in food.

    PubMed

    Hübner, P; Waiblinger, H U; Pietsch, K; Brodmann, P

    2001-01-01

    For enforcement of the recently introduced labeling threshold for genetically modified organisms (GMOs) in food ingredients, quantitative detection methods such as quantitative competitive PCR (QC-PCR) and real-time PCR are applied by official food control laboratories. The experiences of 3 European food control laboratories in validating such methods were compared to describe realistic performance characteristics of quantitative PCR detection methods. The limit of quantitation (LOQ) of GMO-specific, real-time PCR was experimentally determined to reach 30-50 target molecules, which is close to theoretical prediction. Starting PCR with 200 ng genomic plant DNA, the LOQ depends primarily on the genome size of the target plant and ranges from 0.02% for rice to 0.7% for wheat. The precision of quantitative PCR detection methods, expressed as relative standard deviation (RSD), varied from 10 to 30%. Using test samples containing Bt176 corn and applying Bt176-specific QC-PCR, mean values deviated from true values by -7 to 18%, with an average of 2 +/- 10%. Ruggedness of real-time PCR detection methods was assessed in an interlaboratory study analyzing commercial, homogeneous food samples. Roundup Ready soybean DNA contents were determined in the range of 0.3 to 36%, relative to soybean DNA, with RSDs of about 25%. Taking the precision of quantitative PCR detection methods into account, suitable sample plans and sample sizes for GMO analysis are suggested. Because quantitative GMO detection methods measure GMO contents of samples in relation to reference material (calibrants), high priority must be given to international agreements and standardization on certified reference materials.
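    The genome-size dependence of the LOQ described above follows from simple copy-number arithmetic: a fixed number of detectable target molecules is a larger fraction of the genome copies present when each genome is large. The sketch below illustrates this relationship. The genome sizes and the 40-copy LOQ are rough illustrative assumptions, so the resulting percentages only approximate the 0.02-0.7% range reported (the exact figures also depend on ploidy and the assay's true molecular LOQ):

    ```python
    AVOGADRO = 6.022e23
    BP_MASS_G_PER_MOL = 650.0  # average molar mass of one DNA base pair

    def genome_copies(dna_ng, genome_size_bp):
        """Haploid genome copies contained in a given mass of DNA."""
        return dna_ng * 1e-9 * AVOGADRO / (genome_size_bp * BP_MASS_G_PER_MOL)

    def loq_percent(dna_ng, genome_size_bp, loq_copies=40):
        """Smallest quantifiable GMO fraction (in %) when the assay
        needs at least `loq_copies` target molecules in the reaction."""
        return 100.0 * loq_copies / genome_copies(dna_ng, genome_size_bp)

    # Illustrative haploid genome sizes (base pairs) -- assumed values
    RICE_BP  = 4.3e8   # ~0.43 Gb
    WHEAT_BP = 1.6e10  # ~16 Gb (hexaploid)
    ```

    With 200 ng of template, this gives an LOQ on the order of 0.01% for rice but roughly 0.3-0.4% for wheat, reproducing the qualitative point of the abstract: the larger the genome, the fewer copies per nanogram, and the higher the relative LOQ.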

  15. Combining Multidisciplinary Science, Quantitative Reasoning and Social Context to Teach Global Sustainability and Prepare Students for 21st Century Grand Challenges

    NASA Astrophysics Data System (ADS)

    Myers, J. D.

    2011-12-01

    The Earth's seven billion humans are consuming a growing proportion of the world's ecosystem products and services. Human activity has also wrought changes that rival the scale of many natural geologic processes, e.g. erosion, transport and deposition, leading to recognition of a new geological epoch, the Anthropocene. Because of these impacts, several natural systems have been pushed beyond the planetary boundaries that made the Holocene favorable for the expansion of humanity. Given these human-induced stresses on natural systems, global citizens will face an increasing number of grand challenges. Unfortunately, traditional discipline-based introductory science courses do little to prepare students for these complex, scientifically-based and technologically-centered challenges. With NSF funding, an introductory, integrated science course stressing quantitative reasoning and social context has been created at UW. The course (GEOL1600: Global Sustainability: Managing the Earth's Resources) is a lower division course designed around the energy-water-climate (EWC) nexus and integrating biology, chemistry, Earth science and physics. It melds lectures, lecture activities, reading questionnaires and labs to create a learning environment that examines the EWC nexus from a global through regional context. The focus on the EWC nexus, while important socially and intended to motivate students, also provides a coherent framework for identifying which disciplinary scientific principles and concepts to include in the course: photosynthesis and deep time (fossil fuels), biogeochemical cycles (climate), chemical reactions (combustion), electromagnetic radiation (solar power), nuclear physics (nuclear power), phase changes and diagrams (water and climate), etc. Lecture activities are used to give students the practice they need to make quantitative skills routine and automatic. 
Laboratory exercises on energy (coal, petroleum, nuclear power), water (in Bangladesh), energy production and water (shale gas hydrofracing and oil sand production) and climate (scientific modeling, carbon emission management) address EWC issues in international, national and regional contexts while reflecting the news headlines of the day.

  16. Development and application of absolute quantitative detection by duplex chamber-based digital PCR of genetically modified maize events without pretreatment steps.

    PubMed

    Zhu, Pengyu; Fu, Wei; Wang, Chenguang; Du, Zhixin; Huang, Kunlun; Zhu, Shuifang; Xu, Wentao

    2016-04-15

    The possibility of the absolute quantitation of GMO events by digital PCR was recently reported. However, most absolute quantitation methods based on digital PCR require pretreatment steps. Moreover, singleplex detection cannot meet the demands of absolute quantitation of GMO events, which is based on the ratio of foreign fragments to reference genes. Thus, to promote the absolute quantitative detection of different GMO events by digital PCR, we developed a quantitative detection method based on duplex digital PCR without pretreatment. Moreover, we tested 7 GMO events in our study to evaluate the fitness of our method. The optimized combination of foreign and reference primers, limit of quantitation (LOQ), limit of detection (LOD) and specificity were validated. The results showed that the LOQ of our method for different GMO events was 0.5%, while the LOD was 0.1%. Additionally, we found that duplex digital PCR could achieve the detection results with lower RSD compared with singleplex digital PCR. In summary, the duplex digital PCR detection system is a simple and stable way to achieve the absolute quantitation of different GMO events. Moreover, the LOQ and LOD indicated that this method is suitable for the daily detection and quantitation of GMO events. Copyright © 2016 Elsevier B.V. All rights reserved.
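    The absolute quantitation underlying a chamber-based duplex digital PCR readout is standard Poisson statistics: the fraction of positive partitions for each target yields the mean template copies per partition, and the GMO content is the copy ratio of the event-specific target to the endogenous reference gene. A minimal sketch of that calculation (the partition counts in the usage example are hypothetical, not data from the paper):

    ```python
    import math

    def copies_per_partition(n_positive, n_total):
        """Poisson-corrected mean template copies per partition,
        accounting for partitions that received more than one copy."""
        p = n_positive / n_total
        if p >= 1.0:
            raise ValueError("all partitions positive: above quantifiable range")
        return -math.log(1.0 - p)

    def gmo_percent(event_positive, ref_positive, n_total):
        """GMO content as the copy ratio of the event-specific target
        to the reference gene, in percent (assumes both targets are
        read from the same partitions of a duplex reaction)."""
        lam_event = copies_per_partition(event_positive, n_total)
        lam_ref = copies_per_partition(ref_positive, n_total)
        return 100.0 * lam_event / lam_ref
    ```

    For example, 100 event-positive and 10,000 reference-positive partitions out of 20,000 total give a GMO content of about 0.72%, comfortably above the 0.5% LOQ reported in the abstract.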

  17. Learning from developing countries in strengthening health systems: an evaluation of personal and professional impact among global health volunteers at Addis Ababa University's Tikur Anbessa Specialized Hospital (Ethiopia).

    PubMed

    Busse, Heidi; Aboneh, Ephrem A; Tefera, Girma

    2014-09-05

    The positive impact of global health activities by volunteers from the United States in low- and middle-income countries has been recognized. Most existing global health partnerships evaluate what knowledge, ideas, and activities the US institution transferred to the low- or middle-income country. However, what this fails to capture are what kinds of change happen to US-based partners due to engagement in global health partnerships, both at the individual and institutional levels. "Reverse innovation" is the term that is used in global health literature to describe this type of impact. The objectives of this study were to identify what kinds of impact global partnerships have on health volunteers from developed countries, advance this emerging body of knowledge, and improve understanding of methods and indicators for assessing reverse innovation. The study population consisted of 80 US, Canada, and South Africa-based health care professionals who volunteered at Tikur Anbessa Specialized Hospital in Ethiopia. Surveys were web-based and included multiple choice and open-ended questions to assess global health competencies. The data were analyzed using IBM SPSS® version 21 for quantitative analysis; the open-ended responses were coded using constant comparative analysis to identify themes. Of the 80 volunteers, 63 responded (79 percent response rate). Fifty-two percent of the respondents were male, and over 60 percent were 40 years of age and older. Eighty-three percent reported they accomplished their trip objectives, 95 percent would participate in future activities and 96 percent would recommend participation to other colleagues. Eighty-nine percent reported personal impact and 73 percent reported change on their professional development. Previous global health experience, multiple prior trips, and the desire for career advancement were associated with positive impact on professional development. 
Professionally and personally meaningful learning happens often during global health outreach. Understanding this impact has important policy, economic, and programmatic implications. With the aid of improved monitoring and evaluation frameworks, the simple act of attempting to measure "reverse innovation" may represent a shift in how global health partnerships are perceived, drawing attention to the two-way learning and benefits that occur and improving effectiveness in global health partnership spending.

  18. MONITORING ECOSYSTEMS FROM SPACE: THE GLOBAL FIDUCIALS PROGRAM

    EPA Science Inventory

    Images from satellites provide valuable insights to changes in land-cover and ecosystems. Long- term monitoring of ecosystem change using historical satellite imagery can provide quantitative measures of ecological processes and allows for estimation of future ecosystem condition...

  19. Integrating real-time GIS and social media for qualitative transportation data collection.

    DOT National Transportation Integrated Search

    2016-12-26

    New technologies such as global positioning system, smartphone, and social media are changing the way we move around. Traditional : transportation research has overwhelmingly emphasized the collection of quantitative data for modeling, without much c...

  20. Embedding Quantitative Methods by Stealth in Political Science: Developing a Pedagogy for Psephology

    ERIC Educational Resources Information Center

    Gunn, Andrew

    2017-01-01

    Student evaluations of quantitative methods courses in political science often reveal they are characterised by aversion, alienation and anxiety. As a solution to this problem, this paper describes a pedagogic research project with the aim of embedding quantitative methods by stealth into the first-year undergraduate curriculum. This paper…
