Sample records for comprehensive quantitative evaluation

  1. An Evaluation of Two Approaches for Teaching Reading Comprehension Strategies in the Primary Years Using Science Information Texts

    ERIC Educational Resources Information Center

    Reutzel, D. Ray; Smith, John A.; Fawson, Parker C.

    2005-01-01

    There are few research studies on the effects of teaching comprehension strategies to young children in the primary grades. Using a Dominant-Less Dominant Mixed Model design employing both qualitative and quantitative data collection, we evaluated two approaches for teaching comprehension strategies to 7- and 8-year-old children in four…

  2. Models Matter--The Final Report of the National Longitudinal Evaluation of Comprehensive School Reform

    ERIC Educational Resources Information Center

    Aladjem, Daniel K.; LeFloch, Kerstin Carlson; Zhang, Yu; Kurki, Anja; Boyle, Andrea; Taylor, James E.; Herrmann, Suzannah; Uekawa, Kazuaki; Thomsen, Kerri; Fashola, Olatokunbo

    2006-01-01

    The National Longitudinal Evaluation of Comprehensive School Reform (NLECSR) is a quantitative and qualitative study of behavior, decisions, processes, and outcomes. It employs a quasi-experimental design with matched treatment and comparison schools. NLECSR seeks to determine the effects of CSR models on student achievement in about 650…

  3. [Correspondence analysis between traditional commercial specifications and quantitative quality indices of Notopterygii Rhizoma et Radix].

    PubMed

    Jiang, Shun-Yuan; Sun, Hong-Bing; Sun, Hui; Ma, Yu-Ying; Chen, Hong-Yu; Zhu, Wen-Tao; Zhou, Yi

    2016-03-01

    This paper explores a comprehensive assessment method that combines traditional Chinese medicinal material specifications with quantitative quality indicators. Seventy-six samples of Notopterygii Rhizoma et Radix were collected from markets and producing areas. Traditional commercial specifications were described and assigned, and 10 chemical components and the volatile oils were determined for each sample. Cluster analysis, Fisher discriminant analysis and correspondence analysis were used to establish the relationship between the traditional qualitative commercial specifications and the quantitative chemical indices, allowing comprehensive evaluation of the quality of the medicinal material and quantitative classification of commercial and quality grades. A herb quality index (HQI) incorporating traditional commercial specifications and chemical components was established for quantitative grade classification, and corresponding discriminant functions were derived for precise determination of the quality grade and sub-grade of Notopterygii Rhizoma et Radix. The results showed that notopterol, isoimperatorin and volatile oil were the major components determining chemical quality, and their dividing values were specified for every grade and sub-grade of the commercial material. The essential relationship between traditional medicinal indicators, qualitative commercial specifications, and quantitative chemical composition indicators can thus be examined by K-means clustering, Fisher discriminant analysis and correspondence analysis, which provides a new method for comprehensive quantitative evaluation of traditional Chinese medicine quality that integrates traditional commodity specifications with modern quantitative chemical indices. Copyright© by the Chinese Pharmaceutical Association.
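
    A minimal Python sketch of the grade-classification workflow this record describes (unsupervised clustering of chemical indices followed by a Fisher discriminant function); the sample values, the three chosen components and the number of grades are illustrative assumptions, not the study's data:

```python
# Hedged sketch: quality-grade classification from chemical indices via
# K-means clustering plus Fisher (linear) discriminant analysis.
# All numbers below are invented stand-ins for the 76 market samples.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
# Columns: notopterol, isoimperatorin, volatile oil (arbitrary units).
X = rng.gamma(shape=2.0, scale=1.0, size=(76, 3))

# Step 1: group samples into candidate quality grades.
grade = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# Step 2: fit a discriminant function so new samples can be graded
# from their chemical indices alone.
lda = LinearDiscriminantAnalysis().fit(X, grade)

new_sample = np.array([[1.8, 0.9, 2.4]])   # hypothetical measurements
print("predicted grade:", lda.predict(new_sample)[0])
print("discriminant coefficients:", lda.coef_.round(3))
```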

  4. Making a Game out of It: Using Web-Based Competitive Quizzes for Quantitative Analysis Content Review

    ERIC Educational Resources Information Center

    Grinias, James P.

    2017-01-01

    Online student-response systems provide instructors with an easy-to-use tool to instantly evaluate student comprehension. For comprehensive content review, turning this evaluation into a competitive game where students can compete against each other was found to be helpful and enjoyable for participating students. One specific online resource,…

  5. Comprehensive evaluation of direct injection mass spectrometry for the quantitative profiling of volatiles in food samples

    PubMed Central

    2016-01-01

    Although qualitative strategies based on direct injection mass spectrometry (DIMS) have recently emerged as an alternative for the rapid classification of food samples, the potential of these approaches in quantitative tasks has scarcely been addressed to date. In this paper, the applicability of different multivariate regression procedures to data collected by DIMS from simulated mixtures has been evaluated. The most relevant factors affecting quantitation, such as random noise, the number of calibration samples, type of validation, mixture complexity and similarity of mass spectra, were also considered and comprehensively discussed. Based on the conclusions drawn from simulated data, and as an example of application, experimental mass spectral fingerprints collected by direct thermal desorption coupled to mass spectrometry were used for the quantitation of major volatiles in Thymus zygis subsp. zygis chemotypes. The results obtained, validated with the direct thermal desorption coupled to gas chromatography–mass spectrometry method here used as a reference, show the potential of DIMS approaches for the fast and precise quantitative profiling of volatiles in foods. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644978

  6. Study on quantitative risk assessment model of the third party damage for natural gas pipelines based on fuzzy comprehensive assessment

    NASA Astrophysics Data System (ADS)

    Qiu, Zeyang; Liang, Wei; Wang, Xue; Lin, Yang; Zhang, Meng

    2017-05-01

    As an important part of the national energy supply system, natural gas transmission pipelines can cause serious environmental pollution and loss of life and property in the event of an accident. Third-party damage is one of the most significant causes of natural gas pipeline system accidents, so establishing an effective quantitative risk assessment model for third-party damage is important for reducing the number of pipeline operation accidents. Because third-party damage accidents are diverse, complex and uncertain, this paper establishes a quantitative risk assessment model for third-party damage based on the Analytic Hierarchy Process (AHP) and Fuzzy Comprehensive Evaluation (FCE). Firstly, the risk sources of third-party damage are identified; the weights of the factors are then determined via an improved AHP, and finally the importance of each factor is calculated with the fuzzy comprehensive evaluation model. The results show that the quantitative risk assessment model is suitable for third-party damage to natural gas pipelines, and that improvement measures to avoid accidents can be put forward based on the importance of each factor.
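
    A minimal Python sketch of the AHP-plus-FCE evaluation chain described in this record; the pairwise comparison matrix, the factor names and the membership matrix are invented for illustration and do not come from the study:

```python
# Hedged sketch of AHP weighting followed by fuzzy comprehensive evaluation.
import numpy as np

# AHP pairwise comparison of three hypothetical third-party-damage factors
# (e.g. excavation activity, pipeline marking, patrol frequency).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
eigvals, eigvecs = np.linalg.eig(A)
k = int(np.argmax(eigvals.real))
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                  # AHP weight vector

# Consistency check (CR < 0.1 is the usual acceptance threshold).
n = A.shape[0]
CI = (eigvals.real[k] - n) / (n - 1)
CR = CI / 0.58                                # RI = 0.58 for a 3x3 matrix
print("weights:", w.round(3), "CR:", round(CR, 3))

# FCE: membership of each factor in risk levels low / medium / high.
R = np.array([[0.1, 0.3, 0.6],
              [0.5, 0.3, 0.2],
              [0.6, 0.3, 0.1]])
B = w @ R                                     # weighted-average fuzzy operator
levels = ["low", "medium", "high"]
print("overall risk:", levels[int(np.argmax(B))], B.round(3))
```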

  7. Evaluating comprehensiveness in children's healthcare.

    PubMed

    Diniz, Suênia Gonçalves de Medeiros; Damasceno, Simone Soares; Coutinho, Simone Elizabeth Duarte; Toso, Beatriz Rosana Gonçalves de Oliveira; Collet, Neusa

    2016-12-15

    To evaluate the presence and extent of comprehensiveness in children's healthcare in the context of the Family Health Strategy. Evaluative, quantitative, cross-sectional study conducted with 344 family members of children at the Family Health Units of João Pessoa, PB, Brazil. Data were collected using the PCATool Brazil - child version and analysed using descriptive and exploratory statistics. The attribute of comprehensiveness did not obtain satisfactory scores in the two evaluated dimensions, namely "available services" and "provided services". The low scores reveal that the attribute of comprehensiveness is not employed as expected in a primary care unit and point to issues that must be addressed. It was concluded that the services should be restructured to ensure cross-sector performance in the provision of child care. It is also important to improve relations between professionals and users to promote comprehensive and effective care.

  8. Quantitative Evaluation of Heavy Duty Machine Tools Remanufacturing Based on Modified Catastrophe Progression Method

    NASA Astrophysics Data System (ADS)

    Shunhe, Li; Jianhua, Rao; Lin, Gui; Weimin, Zhang; Degang, Liu

    2017-11-01

    The result of a remanufacturing evaluation is the basis for judging whether a heavy duty machine tool can be remanufactured at the end-of-life (EOL) stage of its lifecycle. The objectivity and accuracy of the evaluation are key to the evaluation method. In this paper, the catastrophe progression method is introduced into the quantitative evaluation of heavy duty machine tools' remanufacturing, and the results are modified by the comprehensive adjustment method, which brings the evaluation results into line with conventional human judgement. The catastrophe progression method is used to establish a quantitative evaluation model for heavy duty machine tools and to evaluate the remanufacturing of a retired TK6916 CNC floor milling-boring machine. The evaluation process is simple and highly quantitative, and the result is objective.
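
    A minimal Python sketch of the catastrophe progression aggregation the record refers to; the indicator hierarchy, the normalised values and the use of the averaging ("complementary") rule are illustrative assumptions rather than the paper's model:

```python
# Hedged sketch of the catastrophe progression method: normalised indicators
# enter as successive roots (x**(1/2), x**(1/3), x**(1/4), ...) following the
# cusp / swallowtail / butterfly normalisation formulas, then are aggregated
# bottom-up through the hierarchy.
import numpy as np

def catastrophe(values):
    """Aggregate indicators of one subsystem, most important first."""
    roots = [v ** (1.0 / (i + 2)) for i, v in enumerate(values)]
    return float(np.mean(roots))   # complementary criterion: average

# Hypothetical normalised indicators for a retired machine tool.
technical  = catastrophe([0.72, 0.60, 0.55])   # swallowtail subsystem
economic   = catastrophe([0.80, 0.65])         # cusp subsystem
ecological = catastrophe([0.50, 0.45])         # cusp subsystem

overall = catastrophe([technical, economic, ecological])
print("remanufacturability score:", round(overall, 3))
```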

  9. An economic analysis methodology for project evaluation and programming.

    DOT National Transportation Integrated Search

    2013-08-01

    Economic analysis is a critical component of a comprehensive project or program evaluation methodology that considers all key quantitative and qualitative impacts of highway investments. It allows highway agencies to identify, quantify, and value t...

  10. The Evaluation of Effectiveness of Reciprocal Teaching Strategies on Comprehension of Expository Texts

    ERIC Educational Resources Information Center

    Pilten, Gulhiz

    2016-01-01

    The purpose of the present research is to investigate the effects of reciprocal teaching on the comprehension of expository texts. The research was designed with a mixed method. The quantitative dimension of the present research was designed in accordance with a pre-test/post-test control group experimental model. The quantitative dimension of the present…

  11. Evaluating the Impact of an Urban Comprehensive School Reform: An Illustration of the Need for Mixed Methods

    ERIC Educational Resources Information Center

    Sondergeld, Toni A.; Koskey, Kristin L. K.

    2011-01-01

    An abundance of comprehensive school reform (CSR) literature exists illustrating CSRs are effective in improving student outcomes. However, much of this research reports on top-down reforms, focuses on academic outcomes, and uses quantitative methods alone. Many educational researchers have argued for the use of mixed methods for providing a…

  12. Evaluating Comprehensive School Reform Models at Scale: Focus on Implementation

    ERIC Educational Resources Information Center

    Vernez, Georges; Karam, Rita; Mariano, Louis T.; DeMartini, Christine

    2006-01-01

    This study was designed to fill the "implementation measurement" gap. A methodology to quantitatively measure the level of Comprehensive School Reform (CSR) implementation that can be used across a variety of CSR models was developed, and then applied to measure actual implementation of four different CSR models in a large number of schools. The…

  13. Investigation and comprehensive evaluation of the litter pollution on the Heishijiao beach in Dalian

    NASA Astrophysics Data System (ADS)

    Han, Mengdi; Zhao, Kaiyuan; Zhang, Yan; Sui, Chuanguo

    2018-02-01

    From November 2015 to August 2016, this study investigated and classified the litter on the Heishijiao beach in Dalian and made a comprehensive evaluation of the litter pollution on the beach in different seasons. According to the results, the litter on the Heishijiao beach mainly comes from human offshore activities and other wastes, and spring is the season with the largest quantity of litter resulting from these activities. Most of the fragmental wastes are glass, plastic and paper, while metal, rubber and wooden products are few. Most of the fragmental litter on the beach is small, followed by medium and large pieces; outsized wastes are rare. The quantitative density of litter is highest in winter (9.0 items/m2), with an average quantitative density of 4.6 items/m2; the qualitative density of litter is highest in spring (8 g/m2), with an average of 6.0 g/m2. The comprehensive evaluation shows that the litter pollution on the Heishijiao beach lies between "Average" and "Unsatisfactory".

  14. CASTIN: a system for comprehensive analysis of cancer-stromal interactome.

    PubMed

    Komura, Daisuke; Isagawa, Takayuki; Kishi, Kazuki; Suzuki, Ryohei; Sato, Reiko; Tanaka, Mariko; Katoh, Hiroto; Yamamoto, Shogo; Tatsuno, Kenji; Fukayama, Masashi; Aburatani, Hiroyuki; Ishikawa, Shumpei

    2016-11-09

    The cancer microenvironment plays a vital role in cancer development and progression, and cancer-stromal interactions have been recognized as important targets for cancer therapy. However, identifying relevant and druggable cancer-stromal interactions is challenging due to the lack of quantitative methods for analyzing the whole cancer-stromal interactome. We present CASTIN (CAncer-STromal INteractome analysis), a novel framework for the evaluation of the cancer-stromal interactome from RNA-Seq data using cancer xenograft models. For each ligand-receptor interaction derived from a curated protein-protein interaction database, CASTIN summarizes the gene expression profiles of cancer and stroma into three evaluation indices. These indices provide quantitative evaluation and comprehensive visualization of the interactome, and thus enable the identification of critical cancer-microenvironment interactions that would be potential drug targets. We applied CASTIN to a pancreatic ductal adenocarcinoma dataset, successfully characterized the individual cancer in terms of cancer-stromal relationships, and identified both well-known and less-characterized druggable interactions. CASTIN provides a comprehensive view of the cancer-stromal interactome and is useful for identifying critical interactions that may serve as potential drug targets in the cancer microenvironment. CASTIN is available at: http://github.com/tmd-gpat/CASTIN .

  15. Comprehensive evaluation of impacts of distributed generation integration in distribution network

    NASA Astrophysics Data System (ADS)

    Peng, Sujiang; Zhou, Erbiao; Ji, Fengkun; Cao, Xinhui; Liu, Lingshuang; Liu, Zifa; Wang, Xuyang; Cai, Xiaoyu

    2018-04-01

    Distributed generation (DG), as a supplement to the centralized utilization of renewable energy, is becoming the focus of renewable energy development. With the increasing proportion of DG in distribution networks, the network power structure, power flow distribution, operation plans and protection are all affected to some extent. Based on the main impacts of DG, a comprehensive evaluation model for distribution networks with DG is proposed in this paper. A comprehensive evaluation index system covering 7 aspects, along with the corresponding index calculation methods, is established for quantitative analysis. The indices under different DG access capacities in the distribution network are calculated based on the IEEE RBTS-Bus 6 system, and the evaluation result is obtained by the analytic hierarchy process (AHP). The proposed model and method are verified to be effective and valid through a case study.

  16. Development of a quality instrument for assessing the spontaneous reports of ADR/ADE using Delphi method in China.

    PubMed

    Chen, Lixun; Jiang, Ling; Shen, Aizong; Wei, Wei

    2016-09-01

    The frequently low quality of submitted spontaneous reports is of increasing concern; to our knowledge, no validated instrument exists for assessing the quality of case reports comprehensively enough. This work was conducted to develop such a quality instrument for assessing spontaneous reports of adverse drug reactions (ADR)/adverse drug events (ADE) in China. Initial evaluation indicators were generated using systematic and literature data analysis. Final indicators and their weights were identified using the Delphi method. The final quality instrument was developed by adopting the synthetic scoring method. A consensus was reached after four rounds of Delphi survey. The developed quality instrument consists of 6 first-rank indicators, 18 second-rank indicators, and 115 third-rank indicators, and each indicator has been weighted. It evaluates the quality of spontaneous reports of ADR/ADE comprehensively and quantitatively on six parameters: authenticity, duplication, regulatory, completeness, vigilance level, and reporting time frame. The developed instrument was tested with good reliability and validity, and can be used to comprehensively and quantitatively assess submitted spontaneous reports of ADR/ADE in China.

  17. The Comprehensive Benefit Evaluation of Take Shared Bicycles as Connecting to Public Transit

    NASA Astrophysics Data System (ADS)

    Zhang, J. Y.; Sun, H.; Li, P. F.; Li, C. C.

    2017-10-01

    Shared bicycles are an important way of connecting to public transport, yet little literature evaluates their effectiveness. Taking Beijing as an example, this paper evaluates six types of travel combinations that are commonly used by citizens. The authors select four quantitative indicators: fuel consumption, economic cost, total time spent, and CO2 emissions; and two qualitative indicators: degree of comfort and convenience. The relative efficiency of the quantitative indicators is obtained by data envelopment analysis (DEA) and fuzzification, and a fuzzy synthetic evaluation is then performed together with the qualitative indicators. It was found that the combination of shared bicycle + subway + shared bicycle, as well as shared bicycles alone, has good comprehensive benefits for medium-distance travel. The findings also suggest that shared bicycle + subway + shared bicycle is the best choice for utilitarian trips. The conclusions not only provide suggestions for travellers selecting travel modes, but also show how the relevant factors can be adjusted to increase the proportion of green travel.

  18. Comprehensive Evaluation and Analysis of China's Mainstream Online Map Service Websites

    NASA Astrophysics Data System (ADS)

    Zhang, H.; Jiang, J.; Huang, W.; Wang, Q.; Gu, X.

    2012-08-01

    With the flourishing development of China's Internet market, user demand for map services of all kinds is rising continually and carries tremendous commercial interest. Many Internet giants have become involved in the field of online map services and define them as important strategic products of their companies. The main purpose of this research is to evaluate these online map service websites comprehensively with a model, and to analyse the problems revealed by the evaluation results. Corresponding solutions are then proposed, providing theoretical and practical guidance for the future development of the fiercely competitive online map websites. The research consists of three stages: (a) the mainstream online map service websites in China are introduced and their present situation is analysed through visits, investigation, consultation, analysis and research; (b) a comprehensive evaluation quota system for online map service websites is built, covering functions, layout, interaction design, color, positioning and so on, combined with data indices such as time efficiency, accuracy, objectivity and authority; (c) a comprehensive evaluation of these online map service websites is carried out based on a fuzzy mathematical evaluation model, solving the difficulty of measuring map websites quantitatively.

  19. Study on the Application of TOPSIS Method to the Introduction of Foreign Players in CBA Games

    NASA Astrophysics Data System (ADS)

    Zhongyou, Xing

    The TOPSIS method is a multiple attribute decision-making method. This paper introduces the current situation regarding the introduction of foreign players in CBA games, presents the principles and calculation steps of the TOPSIS method in detail, and applies it to the quantitative evaluation of comprehensive competitive ability when introducing foreign players. Through analysis of practical applications, we found that the TOPSIS method has relatively high rationality and applicability when used to evaluate the comprehensive competitive ability of foreign players being introduced.
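
    A minimal Python sketch of the TOPSIS calculation steps mentioned in this record; the decision matrix, attribute names and weights are invented for illustration:

```python
# Hedged sketch of TOPSIS: weighted vector normalisation, ideal/anti-ideal
# solutions, and relative closeness as the ranking score.
import numpy as np

X = np.array([[25.1,  8.2, 4.5, 2.1],    # rows: candidate players
              [18.7, 11.0, 6.2, 1.4],    # cols: e.g. points, rebounds,
              [22.3,  6.5, 7.8, 2.8]])   #       assists, turnovers
w = np.array([0.4, 0.25, 0.25, 0.1])     # attribute weights
benefit = np.array([True, True, True, False])   # turnovers: cost attribute

V = w * X / np.linalg.norm(X, axis=0)    # weighted, normalised matrix
ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))

d_plus  = np.linalg.norm(V - ideal, axis=1)
d_minus = np.linalg.norm(V - anti,  axis=1)
closeness = d_minus / (d_plus + d_minus)

print("closeness to ideal:", closeness.round(3))
print("ranking (best first):", (np.argsort(-closeness) + 1).tolist())
```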

  20. Analysis of strategic plans to assess planning for sustainability of comprehensive community initiatives.

    PubMed

    Sridharan, Sanjeev; Go, Sodam; Zinzow, Heidi; Gray, Aracelis; Barrett, Melissa Gutierrez

    2007-02-01

    In order to achieve the intended impact on a community, comprehensive community initiatives must sustain programs once they have been implemented. However, planning for sustainability is challenging and is rarely incorporated in the planning process of an initiative. The current study examined 19 5-year plans developed during the planning phase of the Comprehensive Strategy for Serious, Violent and Chronic Juvenile Offenders. Quantitative and qualitative methods were employed to assess the extent to which the construct of sustainability was incorporated. The plan analysis was supplemented with results from other components of the complex evaluation design implemented as part of the process evaluation of Comprehensive Strategy. Results suggested that sustainability was not accounted for during the planning phase of this initiative. The implications of these findings, including the importance of planning for sustainability in order to achieve sustainability, are discussed.

  1. Quantitative Appearance Inspection for Film Coated Tablets.

    PubMed

    Yoshino, Hiroyuki; Yamashita, Kazunari; Iwao, Yasunori; Noguchi, Shuji; Itai, Shigeru

    2016-01-01

    The decision criteria for the physical appearance of pharmaceutical products are subjective and qualitative means of evaluation that are based entirely on human interpretation. In this study, we have developed a comprehensive method for the quantitative analysis of the physical appearance of film coated tablets. Three different kinds of film coated tablets with considerable differences in their physical appearances were manufactured as models, and their surface roughness, contact angle, color measurements and physicochemical properties were investigated as potential characteristics for the quantitative analysis of their physical appearance. All of these characteristics were useful for the quantitative evaluation of the physical appearances of the tablets, and could potentially be used to establish decision criteria to assess the quality of tablets. In particular, the analysis of the surface roughness and film coating properties of the tablets by terahertz spectroscopy allowed for an effective evaluation of the tablets' properties. These results indicated the possibility of inspecting the appearance of tablets during the film coating process.

  2. 75 FR 9488 - Basel Comprehensive Quantitative Impact Study

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-02

    ... DEPARTMENT OF THE TREASURY Office of Thrift Supervision Basel Comprehensive Quantitative Impact... Quantitative Impact Study. OMB Number: 1550-0NEW. Form Numbers: N/A. Regulation requirement: 12 CFR Part 567... Basel II Capital Accord, the Basel Committee will conduct a quantitative impact study (QIS) to assess...

  3. A multi-method approach to evaluate health information systems.

    PubMed

    Yu, Ping

    2010-01-01

    Systematic evaluation of the introduction and impact of health information systems (HIS) is a challenging task. As implementation is a dynamic process, with diverse issues emerging at various stages of system introduction, it is challenging to weigh the contribution of the various factors and differentiate the critical ones. A conceptual framework is helpful in guiding the evaluation effort; otherwise data collection may not be comprehensive and accurate, which may in turn lead to inadequate interpretation of the phenomena under study. Based on comprehensive literature research and the author's own practice of evaluating health information systems, the author proposes a multimethod approach that incorporates both quantitative and qualitative measurement and is centered around the DeLone and McLean Information System Success Model. This approach aims to quantify the performance of HIS and its impact, and to provide comprehensive and accurate explanations of the causal relationships among the different factors. It will provide decision makers with accurate and actionable information for improving the performance of the introduced HIS.

  4. Evaluation of empirical rule of linearly correlated peptide selection (ERLPS) for proteotypic peptide-based quantitative proteomics.

    PubMed

    Liu, Kehui; Zhang, Jiyang; Fu, Bin; Xie, Hongwei; Wang, Yingchun; Qian, Xiaohong

    2014-07-01

    Precise protein quantification is essential in comparative proteomics. Currently, quantification bias is inevitable when using a proteotypic peptide-based quantitative proteomics strategy because of differences in peptide measurability. To improve quantification accuracy, we proposed an "empirical rule for linearly correlated peptide selection (ERLPS)" in quantitative proteomics in our previous work. However, a systematic evaluation of the general application of ERLPS in quantitative proteomics under diverse experimental conditions still needed to be conducted. In this study, the practical workflow of ERLPS is explicitly illustrated; different experimental variables, such as MS systems, sample complexities, sample preparations, elution gradients, matrix effects, loading amounts, and other factors, were comprehensively investigated to evaluate the applicability, reproducibility, and transferability of ERLPS. The results demonstrated that ERLPS is highly reproducible and transferable within appropriate loading amounts and that linearly correlated response peptides should be selected for each specific experiment. ERLPS was applied to proteome samples from yeast to mouse and human, and to quantitative methods from label-free to 18O/16O-labeled and SILAC analysis, and enabled accurate measurements for all proteotypic peptide-based quantitative proteomics over a large dynamic range. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. The effects of infographics and several quantitative versus qualitative formats for cardiovascular disease risk, including heart age, on people's risk understanding.

    PubMed

    Damman, Olga C; Vonk, Suzanne I; van den Haak, Maaike J; van Hooijdonk, Charlotte M J; Timmermans, Danielle R M

    2018-03-11

    To study how comprehension of cardiovascular disease (CVD) risk is influenced by: (1) infographics presenting qualitative risk information, with or without risk numbers; (2) which qualitative risk dimension is emphasized; (3) a heart age versus a traditional risk format. For aim 1, a 2 (infographic versus text) x 2 (risk number versus no risk number) between-subjects design was used. For aim 2, three pieces of information were tested within-subjects. Aim 3 used a simple comparison group. Participants (45-65 years old) were recruited through an online access panel; people with low education were oversampled. They received hypothetical risk information (20%/61 yrs). Primary outcomes: recall, risk appraisals, subjective/objective risk comprehension; other outcomes: behavioral intentions, information evaluations. Infographics of qualitative risk dimensions negatively affected recall, subjective risk comprehension and information evaluations. No effect of the type of risk dimension was found on risk perception. Heart age influenced recall, comprehension, evaluations and affective risk appraisals. Infographics of hypothetical CVD risk information had detrimental effects on measures related to risk perception/comprehension, but the effects were mainly seen in less educated participants. Heart age influenced perceptions/comprehension of hypothetical risk in a way that seemed to support understanding. Heart age seems a fruitful risk communication approach for disease risk calculators. Copyright © 2018 Elsevier B.V. All rights reserved.

  6. [Thinking about vertigo effectiveness evaluation methods in clinical research of Chinese medicine].

    PubMed

    Liu, Hong-mei; Li, Tao

    2014-10-01

    Vertigo is a subjective feeling of patients, and its severity is closely related to many factors. However, a well-accepted quantitative evaluation method capable of accurately and comprehensively evaluating vertigo in the clinic is lacking. Reducing the onset of vertigo, enhancing the recovery of equilibrium function, and improving vertigo patients' quality of life should be the focus when evaluating therapeutic effects. To establish a Chinese medical effectiveness evaluation system for vertigo, we believe different "diseases" should first be distinguished; vertigo can be roughly identified as systemic or non-systemic. For systemic vertigo, efficacy could be comprehensively evaluated using the UCLA vertigo questionnaire or the dizziness handicap inventory combined with equilibrium function testing indices. For non-systemic vertigo, efficacy could be comprehensively evaluated by taking the UCLA vertigo questionnaire or the dizziness handicap inventory as the main efficacy indices. Secondly, we should analyze the different causes of vertigo, choose symptoms and signs in line with vertigo features as well as with Chinese medical theory, and formulate corresponding syndrome effectiveness standards for different diseases. Syndrome diagnosis standards should not simply be taken as efficacy evaluation standards.

  7. Primary Health Care Evaluation: the view of clients and professionals about the Family Health Strategy.

    PubMed

    da Silva, Simone Albino; Baitelo, Tamara Cristina; Fracolli, Lislaine Aparecida

    2015-01-01

    To evaluate the attributes of primary health care (access, longitudinality, comprehensiveness, coordination, family counseling and community counseling) in the Family Health Strategy, triangulating and comparing the views of stakeholders involved in the care process. This was an evaluative study with a quantitative approach and cross-sectional design. Data were collected using the Primary Care Assessment Tool in interviews with 527 adult clients, 34 health professionals, and 330 parents of children up to two years old, linked to 33 family health teams in eleven municipalities. Analysis was conducted in the Statistical Package for the Social Sciences, with a confidence interval of 95% and error of 0.1. The three groups gave low scores to first-contact access (accessibility). Professionals rated the other attributes highly. Clients gave low scores to the attributes community counseling, family counseling, comprehensiveness - services rendered, and comprehensiveness - available services. The quality of performance self-reported by the professionals of the Family Health Strategy is not perceived or valued by clients, and the actions and services may have been developed inappropriately or insufficiently to be apprehended through the clients' experience.

  8. Equity in Assistance? Usability of a U.S. Government Food Assistance Application

    ERIC Educational Resources Information Center

    Saal, Leah Katherine

    2016-01-01

    This article focuses on the quantitative phase of a multiphase mixed methods study investigating adults' and families' access to government food assistance. The research evaluates participants' comprehension of, and ability to, adequately complete authentic complex texts--national food assistance application documents. Summative usability testing…

  9. A Meta-Analysis of the Taped Problems Intervention

    ERIC Educational Resources Information Center

    Kleinert, Whitney L.; Codding, Robin S.; Minami, Takuya; Gould, Kaitlin

    2018-01-01

    Taped problems is an intervention strategy for addressing mathematics fluency that has been evaluated in multiple single-case design studies. Although its efficacy has been supported in individual studies, no comprehensive quantitative synthesis has been conducted on taped problems. The purpose of this study was to synthesize the literature that…

  10. Evaluation of Enhanced Comprehensive 2-D Gas Chromatography-Time-Of-Flight Mass Spectrometry for the Separation of Recalcitrant Polychlorinated Biphenyl Isomers

    EPA Science Inventory

    The separation of some recalcitrant polychlorinated biphenyl (PCB) isomers in extracts from environmental compartments has been a daunting task for environmental chemists. Summed quantitation values for coeluting PCB isomers are often reported. This composite data obscures the ac...

  11. Evaluating Sustainable Development Solutions Quantitatively: Competence Modelling for GCE and ESD

    ERIC Educational Resources Information Center

    Böhm, Marko; Eggert, Sabina; Barkmann, Jan; Bögeholz, Susanne

    2016-01-01

    To comprehensively address global environmental challenges such as biodiversity loss, citizens need an understanding of the socio-economic fundamentals of human behaviour in relation to natural resources. We argue that Global Citizenship Education and Education for Sustainable Development provide a core set of socio-economic competencies that can…

  12. Quantitative evaluation methods of skin condition based on texture feature parameters.

    PubMed

    Pang, Hui; Chen, Tianhua; Wang, Xiaoyi; Chang, Zhineng; Shao, Siqi; Zhao, Jing

    2017-03-01

    In order to quantitatively evaluate the improvement of the skin condition after using skin care products or beauty treatments, a quantitative evaluation method for the skin surface state and texture is presented that is convenient, fast and non-destructive. Human skin images were collected by image sensors. Firstly, a median filter with a 3 × 3 window is applied, and the hairy pixels on the skin are then accurately detected according to the gray mean value and color information. Bilinear interpolation is used to modify the gray value of the hairy pixels in order to eliminate the negative effect of noise and tiny hairs on the texture. After this pretreatment, the gray level co-occurrence matrix (GLCM) is calculated. On this basis, four characteristic parameters, namely the second moment, contrast, entropy and correlation, and their mean values are calculated at 45° intervals. A quantitative evaluation model of skin texture based on the GLCM is established, which can calculate comprehensive parameters of the skin condition. Experiments show that evaluations of skin condition obtained with this method are consistent both with evaluation methods based on biochemical indicators and with human visual experience. The method overcomes the skin damage and long waiting time of the biochemical evaluation method, as well as the subjectivity and fuzziness of visual evaluation, achieving a non-destructive, rapid and quantitative evaluation of skin condition. It can be used for health assessment or classification of the skin condition, and can quantitatively evaluate subtle improvements in skin condition after using skin care products or beauty treatments.
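
    A minimal Python sketch of the GLCM texture features named in this record (second moment, contrast, entropy, correlation, averaged over four directions at 45° intervals); a synthetic image stands in for the pre-processed skin image, and the median-filter and hair-removal steps are omitted:

```python
# Hedged sketch of GLCM texture-feature extraction with scikit-image
# (the functions are named greycomatrix/greycoprops in older releases).
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(0)
skin = rng.integers(0, 256, size=(128, 128), dtype=np.uint8)  # stand-in image

angles = [0, np.pi / 4, np.pi / 2, 3 * np.pi / 4]             # 45-degree intervals
glcm = graycomatrix(skin, distances=[1], angles=angles,
                    levels=256, symmetric=True, normed=True)

second_moment = graycoprops(glcm, "ASM").mean()
contrast      = graycoprops(glcm, "contrast").mean()
correlation   = graycoprops(glcm, "correlation").mean()
p = glcm[:, :, 0, :]                                          # distance index 0
entropy = float(np.mean(-np.sum(p * np.log2(p + 1e-12), axis=(0, 1))))

print(dict(second_moment=round(float(second_moment), 4),
           contrast=round(float(contrast), 2),
           entropy=round(entropy, 2),
           correlation=round(float(correlation), 4)))
# A comprehensive skin-condition score could then be a weighted combination
# of these (normalised) features; the weighting itself is not shown here.
```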

  13. Impact evaluation of a health promotion-focused organisational development strategy on a health service's capacity to deliver comprehensive primary health care.

    PubMed

    Costello, Michelle; Taylor, Jane; O'Hara, Lily

    2015-01-01

    A comprehensive primary health care approach is required to address complex health issues and reduce inequities. However, there has been limited uptake of this approach by health services nationally or internationally. Reorienting health services towards becoming more health promoting provides a mechanism to support the delivery of comprehensive primary health care. The aim of this study was to determine the impact of a health promotion-focused organisational development strategy on the capacity of a primary health care service to deliver comprehensive primary health care. A questionnaire and semistructured individual interviews were used to collect quantitative and qualitative impact evaluation data, respectively, from 13 health service staff across three time points with regard to 37 indicators of organisational capacity. There were significant increases in mean scores for 31 indicators, with effect sizes ranging from moderate to nearly perfect. A range of key enablers and barriers to support the delivery of comprehensive primary health care was identified. In conclusion, an organisational development strategy to reorient health services towards becoming more health promoting may increase the capacity to deliver comprehensive primary health care.

  14. Developing a multi-joint upper limb exoskeleton robot for diagnosis, therapy, and outcome evaluation in neurorehabilitation.

    PubMed

    Ren, Yupeng; Kang, Sang Hoon; Park, Hyung-Soon; Wu, Yi-Ning; Zhang, Li-Qun

    2013-05-01

    Arm impairments in patients post stroke involve the shoulder, elbow and wrist simultaneously. It is not very clear how patients develop spasticity and reduced range of motion (ROM) at the multiple joints and the abnormal couplings among the multiple joints and the multiple degrees-of-freedom (DOF) during passive movement. It is also not clear how they lose independent control of individual joints/DOFs and coordination among the joints/DOFs during voluntary movement. An upper limb exoskeleton robot, the IntelliArm, which can control the shoulder, elbow, and wrist, was developed, aiming to support clinicians and patients with the following integrated capabilities: 1) quantitative, objective, and comprehensive multi-joint neuromechanical pre-evaluation capabilities aiding multi-joint/DOF diagnosis for individual patients; 2) strenuous and safe passive stretching of hypertonic/deformed arm for loosening up muscles/joints based on the robot-aided diagnosis; 3) (assistive/resistive) active reaching training after passive stretching for regaining/improving motor control ability; and 4) quantitative, objective, and comprehensive neuromechanical outcome evaluation at the level of individual joints/DOFs, multiple joints, and whole arm. Feasibility of the integrated capabilities was demonstrated through experiments with stroke survivors and healthy subjects.

  15. Automated quantitative assessment of proteins' biological function in protein knowledge bases.

    PubMed

    Mayr, Gabriele; Lepperdinger, Günter; Lackner, Peter

    2008-01-01

    Primary protein sequence data are archived in databases together with information regarding the corresponding biological functions. In this respect, UniProt/Swiss-Prot is currently the most comprehensive collection, and it is routinely cross-examined when trying to unravel the biological role of hypothetical proteins. Bioscientists frequently extract single entries and evaluate them on a subjective basis. In lieu of a standardized procedure for scoring the existing knowledge regarding individual proteins, we report a computer-assisted method that we applied to score the present knowledge about any given Swiss-Prot entry. Applying this quantitative score allows proteins to be compared with respect to their sequences while highlighting how well their functional data are understood. pfs analysis may also be applied for quality control of individual entries or for database management, in order to rank entry listings.

  16. Impact of format and content of visual display of data on comprehension, choice and preference: a systematic review.

    PubMed

    Hildon, Zoe; Allwood, Dominique; Black, Nick

    2012-02-01

    Displays comparing the performance of healthcare providers are largely based on common sense. To review the literature on the impact of the compositional format and content of quantitative data displays on people's comprehension, choice and preference. Ovid databases, expert recommendations and snowballing techniques. Evaluations of the impact of different formats (bar charts, tables and pictographs) and content (ordering, explanatory visual cues, etc.) of quantitative data displays meeting defined quality criteria. Data extraction: type of decision; decision-making domains; audiences; formats; content; methodology; findings. Most of the 30 studies used quantitative (n = 26) methods with patients or public groups (n = 28) rather than with professionals (n = 2). Bar charts were the most frequent format, followed by pictographs and tables. As regards format, tables and pictographs appeared better understood than bar charts, despite the latter being preferred. Although accessible to less numerate and older populations, pictographs tended to lead to more risk avoidance. Tables appeared accessible to all. Aspects of content that enhanced the impact of data displays included giving visual explanatory cues and contextual information while still attempting simplicity ('less is more'); ordering data; and consistency. Icons rather than numbers were more user-friendly but could lead to over-estimation of risk. Uncertainty was not widely understood, nor well represented. Though heterogeneous and limited in scope, there is sufficient research evidence to inform the presentation of quantitative data that compares the performance of healthcare providers. The impact of new formats, such as funnel plots, needs to be evaluated.

  17. A comprehensively quantitative method of evaluating the impact of drought on crop yield using daily multi-scale SPEI and crop growth process model.

    PubMed

    Wang, Qianfeng; Wu, Jianjun; Li, Xiaohan; Zhou, Hongkui; Yang, Jianhua; Geng, Guangpo; An, Xueli; Liu, Leizhen; Tang, Zhenghong

    2017-04-01

    The quantitative evaluation of the impact of drought on crop yield is one of the most important aspects of agricultural water resource management. To assess the impact of drought on wheat yield, the Environmental Policy Integrated Climate (EPIC) crop growth model and the daily Standardized Precipitation Evapotranspiration Index (SPEI), which is based on daily meteorological data, are adopted in the Huang Huai Hai Plain. Winter wheat yields were estimated at 28 stations, after calibrating the cultivar coefficients on experimental site data, and SPEI data were taken 11 times across the growing seasons from 1981 to 2010. The relationship between the estimated yield and the multi-scale SPEI was analyzed, and the optimum SPEI time scale for monitoring drought during the crop growth period was determined. The reference yield was determined by averaging the yields of numerous non-drought years. From these elements, we propose a comprehensive quantitative method to predict the impact of drought on wheat yields by combining the daily multi-scale SPEI and a crop growth process model, and we test it in the Huang Huai Hai Plain. The results suggest that the calibrated EPIC model is a good predictor of crop yield in the Huang Huai Hai Plain, with a low RMSE (15.4%) between estimated and observed yield at six agrometeorological stations. Soil moisture at planting time is affected by the precipitation and evapotranspiration of the previous 90 days (about 3 months) in the Huang Huai Hai Plain. SPEI G90 was therefore adopted as the optimum time scale for identifying drought and non-drought years, and it identified 2000 as a drought year. The water deficit in 2000 was significant, and the rate of crop yield reduction did not completely correspond to the volume of the water deficit. The proposed comprehensive method for quantitatively evaluating the impact of drought on crop yield is reliable. The results of this study further our understanding of why adopting countermeasures against drought is important and can direct farmers towards drought-resistant crops.
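
    A much-simplified Python sketch of a multi-scale, SPEI-style drought index on a 90-day ("G90") window; the daily series are synthetic, and a plain z-score stands in for the log-logistic standardisation used by the actual SPEI:

```python
# Hedged sketch: 90-day cumulative climatic water balance (P - PET),
# standardised against its own history as a rough SPEI-like index.
import numpy as np

rng = np.random.default_rng(0)
days = 30 * 365
precip = rng.gamma(shape=0.8, scale=3.0, size=days)            # mm/day, synthetic
pet = (2.0 + 1.5 * np.sin(np.arange(days) * 2 * np.pi / 365)
       + rng.normal(0, 0.3, size=days))                        # mm/day, synthetic

balance = precip - pet
window = 90                                                    # "G90" time scale
d90 = np.convolve(balance, np.ones(window), mode="valid")      # rolling 90-day sum

spei_like = (d90 - d90.mean()) / d90.std()                     # z-score stand-in
print("three driest 90-day windows (indices):", np.argsort(spei_like)[:3].tolist())
print("latest value (planting-time proxy):", round(float(spei_like[-1]), 2))
```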

  18. 75 FR 3966 - Agency Information Collection Activities: Proposed Information Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-25

    ... Comprehensive Quantitative Impact Study.'' DATES: You should submit comments by March 26, 2010. ADDRESSES... requesting approval of the following new information collection: Title: Basel Comprehensive Quantitative... quantitative impact study (QIS) to assess the impact of the proposed revisions that were published by the Basel...

  19. Defining an Analytic Framework to Evaluate Quantitative MRI Markers of Traumatic Axonal Injury: Preliminary Results in a Mouse Closed Head Injury Model

    PubMed Central

    Sadeghi, N.; Namjoshi, D.; Irfanoglu, M. O.; Wellington, C.; Diaz-Arrastia, R.

    2017-01-01

    Diffuse axonal injury (DAI) is a hallmark of traumatic brain injury (TBI) pathology. Recently, the Closed Head Injury Model of Engineered Rotational Acceleration (CHIMERA) was developed to generate an experimental model of DAI in a mouse. The characterization of DAI using diffusion tensor magnetic resonance imaging (MRI; diffusion tensor imaging, DTI) may provide a useful set of outcome measures for preclinical and clinical studies. The objective of this study was to identify the complex neurobiological underpinnings of DTI features following DAI using a comprehensive and quantitative evaluation of DTI and histopathology in the CHIMERA mouse model. A consistent neuroanatomical pattern of pathology in specific white matter tracts was identified across ex vivo DTI maps and photomicrographs of histology. These observations were confirmed by voxelwise and regional analysis of DTI maps, demonstrating reduced fractional anisotropy (FA) in distinct regions such as the optic tract. Similar regions were identified by quantitative histology and exhibited axonal damage as well as robust gliosis. Additional analysis using a machine-learning algorithm was performed to identify regions and metrics important for injury classification in a manner free from potential user bias. This analysis found that diffusion metrics were able to identify injured brains almost with the same degree of accuracy as the histology metrics. Good agreement between regions detected as abnormal by histology and MRI was also found. The findings of this work elucidate the complexity of cellular changes that give rise to imaging abnormalities and provide a comprehensive and quantitative evaluation of the relative importance of DTI and histological measures to detect brain injury. PMID:28966972

  20. Safety evaluation methodology for advanced coal extraction systems

    NASA Technical Reports Server (NTRS)

    Zimmerman, W. F.

    1981-01-01

    Qualitative and quantitative evaluation methods for coal extraction systems were developed. The analysis examines the soundness of the design, whether or not the major hazards have been eliminated or reduced, and how the reduction would be accomplished. The quantitative methodology establishes the approximate impact of hazards on injury levels. The results are weighted by peculiar geological elements, specialized safety training, peculiar mine environmental aspects, and reductions in labor force. The outcome is compared with injury level requirements based on similar, safer industries to get a measure of the new system's success in reducing injuries. This approach provides a more detailed and comprehensive analysis of hazards and their effects than existing safety analyses.

  1. Automatized image processing of bovine blastocysts produced in vitro for quantitative variable determination

    NASA Astrophysics Data System (ADS)

    Rocha, José Celso; Passalia, Felipe José; Matos, Felipe Delestro; Takahashi, Maria Beatriz; Maserati, Marc Peter, Jr.; Alves, Mayra Fernanda; de Almeida, Tamie Guibu; Cardoso, Bruna Lopes; Basso, Andrea Cristina; Nogueira, Marcelo Fábio Gouveia

    2017-12-01

    There is currently no objective, real-time and non-invasive method for evaluating the quality of mammalian embryos. In this study, we processed images of in vitro produced bovine blastocysts to obtain a deeper comprehension of the embryonic morphological aspects that are related to the standard evaluation of blastocysts. Information was extracted from 482 digital images of blastocysts. The resulting imaging data were individually evaluated by three experienced embryologists who graded their quality. To avoid evaluation bias, each image was related to the modal value of the evaluations. Automated image processing produced 36 quantitative variables for each image. The images, the modal and individual quality grades, and the variables extracted could potentially be used in the development of artificial intelligence techniques (e.g., evolutionary algorithms and artificial neural networks), multivariate modelling and the study of defined structures of the whole blastocyst.

  2. DWI at MR Enterography for Evaluating Bowel Inflammation in Crohn Disease.

    PubMed

    Park, Seong Ho

    2016-07-01

    The purpose of this article is to provide a comprehensive review regarding DWI enterography used for evaluating Crohn disease and to summarize the relevant evidence. Active bowel inflammation in Crohn disease causes restricted diffusion on MR enterography with DWI. Enterographic DWI to evaluate Crohn disease is increasingly drawing attention for both academic research and clinical practice and has shown potential as a quantitative tool for assessing bowel inflammation. DWI enterography also has multiple unresolved issues and limitations.

  3. Hawaii Demonstration Project to Avert Unintended Teenage Pregnancy: 1978-1982. Final Report. Executive Summary.

    ERIC Educational Resources Information Center

    Levitt-Merin, Marta; Sutter, Sharon Kingdon

    This final report provides a descriptive overview of three approaches which the Hawaii Demonstration Project initiated to reduce unintended teenage pregnancies. Project evaluation findings are summarized; both qualitative and quantitative data are presented for a comprehensive picture of the project and its input. Project limitations and successes…

  4. Quality evaluation of Shenmaidihuang Pills based on the chromatographic fingerprints and simultaneous determination of seven bioactive constituents.

    PubMed

    Liu, Sifei; Zhang, Guangrui; Qiu, Ying; Wang, Xiaobo; Guo, Lihan; Zhao, Yanxin; Tong, Meng; Wei, Lan; Sun, Lixin

    2016-12-01

    In this study, we aimed to establish a comprehensive and practical quality evaluation system for Shenmaidihuang pills. A simple and reliable high-performance liquid chromatography method coupled with photodiode array detection was developed for both fingerprint analysis and quantitative determination. In the fingerprint analysis, relative retention time and relative peak area were used to identify the common peaks in the 18 samples under investigation. Twenty-one peaks were selected as the common peaks to evaluate the similarities of 18 Shenmaidihuang pills samples with different manufacture dates, and similarity analysis was applied to evaluate the similarity of the samples. Hierarchical cluster analysis and principal component analysis were also performed to evaluate the variation of Shenmaidihuang pills. In the quantitative analysis, linear regression, injection precision, recovery, repeatability and sample stability were all tested, and good results were obtained for the simultaneous determination of the seven identified compounds, namely 5-hydroxymethylfurfural, morroniside, loganin, paeonol, paeoniflorin, psoralen and isopsoralen, in Shenmaidihuang pills. The contents of some analytes in different batches of samples showed significant differences, especially for 5-hydroxymethylfurfural. It was therefore concluded that the chromatographic fingerprint method obtained by high-performance liquid chromatography coupled with photodiode array detection, associated with the determination of multiple compounds, is a powerful and meaningful tool for comprehensively conducting the quality control of Shenmaidihuang pills. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. [Study on the quantitative evaluation on the degree of TCM basic syndromes often encountered in patients with primary liver cancer].

    PubMed

    Li, Dong-tao; Ling, Chang-quan; Zhu, De-zeng

    2007-07-01

    To establish a quantitative model for evaluating the degree of the TCM basic syndromes often encountered in patients with primary liver cancer (PLC). Medical literature concerning the clinical investigation and TCM syndromes of PLC was collected and analyzed using an expert symposium method, and 100-millimeter scaling was applied in combination with symptom degree scoring to establish a quantitative criterion for classifying the degree of symptoms and signs in patients with PLC. Two models, an additive model and an additive-multiplicative model, were established using the analytic hierarchy process (AHP) as the mathematical tool, with specialists estimating the weights of the criteria for evaluating the basic syndromes at the various layers. The two models were then verified in clinical practice and the outcomes compared with the fuzzy evaluations made by specialists. Verification on 459 case-times of PLC showed that the coincidence rate between the specialists' outcomes and the additive model was 84.53%, and that with the additive-multiplicative model was 62.75%; the difference between the two was statistically significant (P<0.01). The additive model is therefore the model suitable for quantitative evaluation of the degree of TCM basic syndromes in patients with PLC.
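
    A minimal Python sketch contrasting an additive aggregation of weighted symptom scores with a multiplicative (weighted geometric-mean) variant standing in for the paper's additive-multiplicative model; the symptom names, weights and 100-mm scale scores are invented:

```python
# Hedged sketch: two ways of aggregating weighted symptom scores into a
# syndrome-degree value. Weights would come from AHP in the actual study.
symptoms = {                       # scores on a 0-100 scale (invented)
    "hypochondriac pain": 60,
    "abdominal distension": 40,
    "fatigue": 70,
}
weights = {                        # AHP-style weights, summing to 1 (invented)
    "hypochondriac pain": 0.5,
    "abdominal distension": 0.3,
    "fatigue": 0.2,
}

# Additive model: weighted sum of normalised scores.
additive = sum(weights[s] * symptoms[s] / 100 for s in symptoms)

# Multiplicative variant: weighted geometric mean of normalised scores.
multiplicative = 1.0
for s in symptoms:
    multiplicative *= (symptoms[s] / 100) ** weights[s]

print("additive syndrome degree:", round(additive, 3))
print("multiplicative syndrome degree:", round(multiplicative, 3))
```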

  6. 75 FR 18571 - Agency Information Collection Activities: Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-12

    ... Comprehensive Quantitative Impact Study.'' The OCC has also given notice that it has sent this collection to OMB... following new information collection: Title: Basel Comprehensive Quantitative Impact Study. OMB Control No... the Basel II Capital Accord, the Basel Committee will conduct a quantitative impact study (QIS) to...

  7. Tracking Patient Education Documentation across Time and Care Settings

    PubMed Central

    Janousek, Lisa; Heermann, Judith; Eilers, June

    2005-01-01

    Results of a formative evaluation of a patient education documentation system will be presented. Both quantitative and qualitative approaches to data collection are being used. The goal of integrating patient education documentation into the electronic patient record is to facilitate seamless, multidisciplinary patient/family education across time and settings. The system is being piloted by oncology services at The Nebraska Medical Center. The evaluation addresses the usability and comprehensiveness of the system. PMID:16779280

  8. Simultaneous fingerprint, quantitative analysis and anti-oxidative based screening of components in Rhizoma Smilacis Glabrae using liquid chromatography coupled with Charged Aerosol and Coulometric array Detection.

    PubMed

    Yang, Guang; Zhao, Xin; Wen, Jun; Zhou, Tingting; Fan, Guorong

    2017-04-01

    An analytical approach including fingerprint analysis, quantitative analysis and rapid screening of anti-oxidative components was established and successfully applied to the comprehensive quality control of Rhizoma Smilacis Glabrae (RSG), a well-known Traditional Chinese Medicine used both as medicine and food. Thirteen components were tentatively identified based on their retention behavior, UV absorption and MS fragmentation patterns. Chemometric analysis based on the coulometric array data was performed to evaluate the similarity and variation among fifteen batches. Eight discriminating components were quantified using single-compound calibration. The unit responses of those components in coulometric array detection were calculated and compared with those of several compounds reported to possess antioxidant activity, and four of them were tentatively identified as main contributors to the total anti-oxidative activity. The main advantage of the proposed approach is that it realizes simultaneous fingerprint analysis, quantitative analysis and screening of anti-oxidative components, providing comprehensive information for the quality assessment of RSG. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Comprehensive benefit evaluation of direct power-purchase for large consumers

    NASA Astrophysics Data System (ADS)

    Liu, D. N.; Li, Z. H.; Zhou, H. M.; Zhao, Q.; Xu, X. F.

    2017-06-01

    Based on the 2015 "Several Opinions of the CPC Central Committee and the State Council on Further Deepening the Reform of the Electric Power System", this paper analyses the influence of direct power-purchase by large consumers on the operating benefit of the power grid. An index system is proposed covering three aspects: economic benefit, cleaning (environmental) benefit and social benefit, in which the benefits of saving coal consumption, reducing carbon emissions and reducing pollutant emissions are quantitatively calculated. The subjective and objective weights and index scores are then derived through the analytic hierarchy process, the entropy weight method and the interval number method. Finally, the comprehensive benefit is evaluated in a case study, and some suggestions are made.
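
    As a rough illustration of the weighting scheme described above, the sketch below derives subjective weights from an assumed AHP pairwise-comparison matrix (row geometric-mean approximation), objective weights from the entropy method, and fuses the two; the judgement matrix, indicator data and fusion rule are illustrative assumptions, not the paper's values:

      import numpy as np

      def ahp_weights(pairwise):
          """Approximate AHP weights with the row geometric-mean method."""
          gm = np.prod(pairwise, axis=1) ** (1.0 / pairwise.shape[1])
          return gm / gm.sum()

      def entropy_weights(data):
          """Objective weights from the entropy method (rows = schemes, cols = indicators)."""
          p = data / data.sum(axis=0)
          k = 1.0 / np.log(data.shape[0])
          entropy = -k * (p * np.log(p + 1e-12)).sum(axis=0)
          divergence = 1.0 - entropy
          return divergence / divergence.sum()

      # Three benefit indicators: economic, cleaning (environmental), social.
      pairwise = np.array([[1.0, 3.0, 2.0], [1/3, 1.0, 1/2], [1/2, 2.0, 1.0]])
      data = np.array([[0.8, 0.6, 0.7], [0.5, 0.9, 0.4], [0.7, 0.7, 0.9]])  # rows = schemes

      ws, wo = ahp_weights(pairwise), entropy_weights(data)
      combined = ws * wo / (ws * wo).sum()   # one common way to fuse both weight sets
      print(combined, data @ combined)       # fused weights and per-scheme benefit scores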

  10. Improving Transition Behaviors in Students with Autism Spectrum Disorders: A Comprehensive Evaluation of Interventions in Educational Settings

    ERIC Educational Resources Information Center

    Lequia, Jenna; Wilkerson, Kimber L.; Kim, Sunyoung; Lyons, Gregory L.

    2015-01-01

    Students with autism spectrum disorders (ASD) often exhibit rigidity, which can lead to difficulties with transitions. Such difficulties can explain why students with ASD are placed in more restrictive educational environments. This review offers a quantitative synthesis of effects of interventions aimed to improve transitions of students with ASD…

  11. Assessment of Scheduling and Plan Execution of Apollo 14 Lunar Surface Operations

    NASA Technical Reports Server (NTRS)

    Marquez, Jessica J.

    2010-01-01

    Although over forty years have passed since first landing on the Moon, there is not yet a comprehensive, quantitative assessment of Apollo extravehicular activities (EVAs). Quantitatively evaluating lunar EVAs will provide a better understanding of the challenges involved with surface operations. This first evaluation of a surface EVA centers on comparing the planned and the as-run timelines, specifically collecting data on discrepancies between durations that were estimated versus executed. Differences were summarized by task category in order to gain insight into the types of surface operation activities that were most challenging. One Apollo 14 EVA was assessed utilizing the described methodology. Selected metrics and task categorizations were effective, and limitations to this process were identified.

  12. Influences of the current density on the performances of the chrome-plated layer in deterministic electroplating repair

    NASA Astrophysics Data System (ADS)

    Xia, H.; Shen, X. M.; Yang, X. C.; Xiong, Y.; Jiang, G. L.

    2018-01-01

    Deterministic electroplating repair is a novel method for rapidly repairing worn parts. Through qualitative and quantitative comparisons, the influences of the current density on the performance of the chrome-plated layer were determined in this study. Chrome-plated layers were fabricated under different current densities while the other parameters were kept constant. The hardness, thickness and composition, surface morphology and roughness, and wearability of the chrome-plated layers were measured with a Vickers hardness tester, a scanning electron microscope with an energy dispersive X-ray detector, a digital microscope in 3D imaging mode, and a ball-milling instrument with a profilograph, respectively. In order to evaluate each factor scientifically, the experimental data were normalized. A comprehensive evaluation model was established to quantitatively analyse the influence of the current density, based on the analytic hierarchy process and the weighted evaluation method. The calculated comprehensive evaluation indexes corresponding to current densities of 40A/dm2, 45A/dm2, 50A/dm2, 55A/dm2, 60A/dm2, and 65A/dm2 were 0.2246, 0.4850, 0.4799, 0.4922, 0.8672, and 0.1381, respectively. The experimental results indicate that the optimal option was 60A/dm2, and the priority order was 60A/dm2, 55A/dm2, 45A/dm2, 50A/dm2, 40A/dm2, and 65A/dm2.
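
    A minimal sketch of the normalize-then-weight step described above, assuming min-max normalization with benefit-type and cost-type metrics and illustrative AHP-style weights (none of the numbers are the paper's measurements):

      import numpy as np

      def normalize(column, benefit=True):
          """Min-max normalise one metric; invert it when smaller values are better."""
          lo, hi = column.min(), column.max()
          x = (column - lo) / (hi - lo)
          return x if benefit else 1.0 - x

      # Rows = current densities, columns = hardness, thickness, roughness, wear loss.
      raw = np.array([
          [820, 12.0, 0.42, 3.1],
          [905, 14.5, 0.38, 2.6],
          [960, 15.1, 0.35, 2.2],
          [930, 14.0, 0.40, 2.8],
      ])
      benefit = [True, True, False, False]          # roughness and wear loss are cost-type
      weights = np.array([0.35, 0.20, 0.20, 0.25])  # assumed AHP-style weights

      norm = np.column_stack([normalize(raw[:, j], benefit[j]) for j in range(raw.shape[1])])
      index = norm @ weights                        # comprehensive evaluation index per option
      print(index.round(4), "best option:", int(index.argmax()))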

  13. Occupational hazard evaluation model underground coal mine based on unascertained measurement theory

    NASA Astrophysics Data System (ADS)

    Deng, Quanlong; Jiang, Zhongan; Sun, Yaru; Peng, Ya

    2017-05-01

    In order to study how to comprehensively evaluate the influence of several occupational hazards on miners' physical and mental health, an occupational hazard evaluation indicator system was established based on unascertained measurement theory to support quantitative and qualitative analysis. Indicator weights were determined by information entropy and the occupational hazard level was estimated with the credible degree recognition criterion; the evaluation model was programmed in Visual Basic and applied to the comprehensive evaluation of occupational hazards at six posts in an underground coal mine, and the occupational hazard degree was graded. The evaluation results are consistent with the actual situation. The results show that dust and noise are the most prominent occupational hazard factors in the coal mine. Excavation face support workers are the most affected, followed by heading machine drivers, coal cutter drivers and coalface move support workers; the occupational hazard degree of these four types of workers is level II (mild). The occupational hazard degree of ventilation workers and safety inspection workers is level I. The evaluation model can evaluate an underground coal mine objectively and accurately, and can be employed in actual engineering.
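
    The credible degree recognition step mentioned above can be sketched as follows; the membership vector and the threshold lambda (commonly set to 0.6) are illustrative assumptions rather than the paper's data:

      import numpy as np

      def credible_degree_class(membership, lam=0.6):
          """Return the hazard grade whose cumulative membership first reaches lam.

          `membership` lists the weighted unascertained measures of one post over
          ordered grades (e.g. I mild ... IV severe) and should sum to 1.
          """
          cumulative = np.cumsum(membership)
          return int(np.argmax(cumulative >= lam))   # index of first grade meeting lam

      # Hypothetical post: weighted single-indicator measures over grades I-IV.
      post = np.array([0.18, 0.47, 0.25, 0.10])
      print("grade index:", credible_degree_class(post))  # -> 1, i.e. grade II (mild)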

  14. Industrial ecology: Quantitative methods for exploring a lower carbon future

    NASA Astrophysics Data System (ADS)

    Thomas, Valerie M.

    2015-03-01

    Quantitative methods for environmental and cost analyses of energy, industrial, and infrastructure systems are briefly introduced and surveyed, with the aim of encouraging broader utilization and development of quantitative methods in sustainable energy research. Material and energy flow analyses can provide an overall system overview. The methods of engineering economics and cost benefit analysis, such as net present values, are the most straightforward approach for evaluating investment options, with the levelized cost of energy being a widely used metric in electricity analyses. Environmental lifecycle assessment has been extensively developed, with both detailed process-based and comprehensive input-output approaches available. Optimization methods provide an opportunity to go beyond engineering economics to develop detailed least-cost or least-impact combinations of many different choices.
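
    As a small illustration of the engineering-economics metrics mentioned above, the sketch below computes a net present value and a levelized cost of energy from assumed cash flows and plant parameters (all numbers are illustrative):

      # Net present value and levelized cost of energy, computed from assumed inputs.
      def npv(rate, cash_flows):
          """Net present value of cash flows indexed from year 0."""
          return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

      def lcoe(capex, annual_opex, annual_energy_kwh, rate, years):
          """Levelized cost of energy: discounted costs divided by discounted output."""
          costs = capex + sum(annual_opex / (1 + rate) ** t for t in range(1, years + 1))
          energy = sum(annual_energy_kwh / (1 + rate) ** t for t in range(1, years + 1))
          return costs / energy

      print(npv(0.05, [-1000, 300, 300, 300, 300]))        # about 63.8
      print(lcoe(2_000_000, 40_000, 3_500_000, 0.06, 25))  # roughly 0.056 $/kWh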

  15. The Structure of an Early Reading Test in Grade 1: In Search of a Relationship with Reading in Spanish

    ERIC Educational Resources Information Center

    Lara, Monica

    2010-01-01

    This study examined the "Tejas LEE or El Inventario de Lectura en Espanol de Tejas" (Grade 1) to determine if a relationship existed between reading comprehension in Spanish and the tested skills on the diagnostic assessment. This quantitative research design evaluated the psychometric characteristics of the Tejas LEE and followed customary…

  16. Correlation of Oxygenated Hemoglobin Concentration and Psychophysical Amount on Speech Recognition

    NASA Astrophysics Data System (ADS)

    Nozawa, Akio; Ide, Hideto

    Subjective understanding in an oral language comprehension task is quantitatively evaluated from the fluctuation of oxygenated hemoglobin concentration measured by near-infrared spectroscopy. An English listening comprehension test, which consists of two difficulty levels, was taken by four subjects during the measurement. A significant correlation was found between the subjective understanding and the fluctuation of oxygenated hemoglobin concentration.

  17. Nonlinear optical microscopy and ultrasound imaging of human cervical structure

    NASA Astrophysics Data System (ADS)

    Reusch, Lisa M.; Feltovich, Helen; Carlson, Lindsey C.; Hall, Gunnsteinn; Campagnola, Paul J.; Eliceiri, Kevin W.; Hall, Timothy J.

    2013-03-01

    The cervix softens and shortens as its collagen microstructure rearranges in preparation for birth, but premature change may lead to premature birth. The global preterm birth rate has not decreased despite decades of research, likely because cervical microstructure is poorly understood. Our group has developed a multilevel approach to evaluating the human cervix. We are developing quantitative ultrasound (QUS) techniques for noninvasive interrogation of cervical microstructure and corroborating those results with high-resolution images of microstructure from second harmonic generation imaging (SHG) microscopy. We obtain ultrasound measurements from hysterectomy specimens, prepare the tissue for SHG, and stitch together several hundred images to create a comprehensive view of large areas of cervix. The images are analyzed for collagen orientation and alignment with curvelet transform, and registered with QUS data, facilitating multiscale analysis in which the micron-scale SHG images and millimeter-scale ultrasound data interpretation inform each other. This novel combination of modalities allows comprehensive characterization of cervical microstructure in high resolution. Through a detailed comparative study, we demonstrate that SHG imaging both corroborates the quantitative ultrasound measurements and provides further insight. Ultimately, a comprehensive understanding of specific microstructural cervical change in pregnancy should lead to novel approaches to the prevention of preterm birth.

  18. [Urban ecological land in Changsha City: its quantitative analysis and optimization].

    PubMed

    Li, Xiao-Li; Zeng, Guang-Ming; Shi, Lin; Liang, Jie; Cai, Qing

    2010-02-01

    In this paper, a hierarchy index system suitable for the catastrophe progression method was constructed to comprehensively analyze and evaluate the status of ecological land construction in Changsha City in 2007. Based on the evaluation results, the irrationalities of the distribution pattern of Changsha urban ecological land were discussed. With the support of a geographic information system (GIS), ecological corridors for the urban ecological land were constructed using 'least-cost' modeling, and, combined with conflict analysis, an optimal scheme for the urban ecological land was put forward, forming an integrated evaluation system. The results indicated that the ecological efficiency of urban ecological land in Changsha in 2007 was at a medium level, with an evaluation value of 0.9416; the quantitative index was relatively high but the coordination index relatively low. Analysis and verification with the software Fragstats showed that the ecological efficiency of the urban ecological land after optimization was higher, with an evaluation value of 0.9618, and the SHDI, CONTAG, and other indices were also improved.
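
    The 'least-cost' corridor idea referred to above can be sketched as a shortest-path search over a resistance raster; the synthetic raster and 4-connected Dijkstra search below are illustrative, not the GIS workflow used in the study:

      import heapq
      import numpy as np

      def least_cost_path(cost, start, goal):
          """4-connected least-cost path over a positive cost raster."""
          rows, cols = cost.shape
          dist = np.full(cost.shape, np.inf)
          prev = {}
          dist[start] = cost[start]
          heap = [(cost[start], start)]
          while heap:
              d, (r, c) = heapq.heappop(heap)
              if (r, c) == goal:
                  break
              if d > dist[r, c]:
                  continue
              for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                  if 0 <= nr < rows and 0 <= nc < cols:
                      nd = d + cost[nr, nc]
                      if nd < dist[nr, nc]:
                          dist[nr, nc] = nd
                          prev[(nr, nc)] = (r, c)
                          heapq.heappush(heap, (nd, (nr, nc)))
          path, node = [goal], goal
          while node != start:
              node = prev[node]
              path.append(node)
          return path[::-1], dist[goal]

      rng = np.random.default_rng(1)
      resistance = rng.integers(1, 10, size=(20, 20)).astype(float)   # land-use resistance
      path, total = least_cost_path(resistance, (0, 0), (19, 19))
      print("corridor length:", len(path), "accumulated cost:", total)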

  19. Quantitative detection of caffeine in human skin by confocal Raman spectroscopy--A systematic in vitro validation study.

    PubMed

    Franzen, Lutz; Anderski, Juliane; Windbergs, Maike

    2015-09-01

    For the rational development and evaluation of dermal drug delivery, knowledge of the rate and extent of substance penetration into human skin is essential. However, current analytical procedures are destructive, labor-intensive and lack a defined spatial resolution. In this context, confocal Raman microscopy bears the potential to overcome current limitations in drug depth profiling. Confocal Raman microscopy has already proved its suitability for the acquisition of qualitative penetration profiles, but a comprehensive investigation regarding its suitability for quantitative measurements inside human skin is still missing. In this work, we present a systematic validation study to deploy confocal Raman microscopy for quantitative drug depth profiling in human skin. After validating our Raman microscopic setup, we successfully established an experimental procedure that allows correlating the Raman signal of a model drug with its controlled concentration in human skin. To overcome current drawbacks in drug depth profiling, we evaluated different modes of peak correlation for quantitative Raman measurements and offer a suitable operating procedure for quantitative drug depth profiling in human skin. In conclusion, we successfully demonstrate the potential of confocal Raman microscopy for quantitative drug depth profiling in human skin as a valuable alternative to destructive state-of-the-art techniques. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. [Comprehensive evaluation of eco-tourism resources in Yichun forest region of Northeast China].

    PubMed

    Huang, Maozhu; Hu, Haiqing; Zhang, Jie; Chen, Lijun

    2006-11-01

    By using the analytical hierarchy process (AHP) and the Delphi method, a total of 30 representative evaluation factors covering tourism resource quantity, environmental quality, tourism conditions, and tourism functions were chosen to build a comprehensive quantitative evaluation model for the eco-tourism resources of the Yichun forest region in Northeast China. The results showed that in the Yichun forest region, the natural eco-tourism resources were superior to the humanity (cultural) resources. Regarding the regional distribution of favorable-level eco-tourism resources, four sites were prominent: north (Jiayin), center (Yichun), east (Jinshantun) and south (Tieli). As for the distribution of eco-tourism resource types, the sequence was basically north (Jiayin, Tangwang River, Wuying), center (Yichun, Shangganling), east (Jinshantun, Meixi) and south (Tieli, Dailing). Based on the above analyses, the Yichun forest region could be divided into four tourism areas: the south, the east, the central, and the north. Initial development directions were proposed according to the special features of each area.

  1. Transitioning from Targeted to Comprehensive Mass Spectrometry Using Genetic Algorithms.

    PubMed

    Jaffe, Jacob D; Feeney, Caitlin M; Patel, Jinal; Lu, Xiaodong; Mani, D R

    2016-11-01

    Targeted proteomic assays are becoming increasingly popular because of their robust quantitative applications enabled by internal standardization, and they can be routinely executed on high performance mass spectrometry instrumentation. However, these assays are typically limited to 100s of analytes per experiment. Considerable time and effort are often expended in obtaining and preparing samples prior to targeted analyses. It would be highly desirable to detect and quantify 1000s of analytes in such samples using comprehensive mass spectrometry techniques (e.g., SWATH and DIA) while retaining a high degree of quantitative rigor for analytes with matched internal standards. Experimentally, it is facile to port a targeted assay to a comprehensive data acquisition technique. However, data analysis challenges arise from this strategy concerning agreement of results from the targeted and comprehensive approaches. Here, we present the use of genetic algorithms to overcome these challenges in order to configure hybrid targeted/comprehensive MS assays. The genetic algorithms are used to select precursor-to-fragment transitions that maximize the agreement in quantification between the targeted and the comprehensive methods. We find that the algorithm we used provided across-the-board improvement in the quantitative agreement between the targeted assay data and the hybrid comprehensive/targeted assay that we developed, as measured by parameters of linear models fitted to the results. We also found that the algorithm could perform at least as well as an independently-trained mass spectrometrist in accomplishing this task. We hope that this approach will be a useful tool in the development of quantitative approaches for comprehensive proteomics techniques.

  2. Transitioning from Targeted to Comprehensive Mass Spectrometry Using Genetic Algorithms

    NASA Astrophysics Data System (ADS)

    Jaffe, Jacob D.; Feeney, Caitlin M.; Patel, Jinal; Lu, Xiaodong; Mani, D. R.

    2016-11-01

    Targeted proteomic assays are becoming increasingly popular because of their robust quantitative applications enabled by internal standardization, and they can be routinely executed on high performance mass spectrometry instrumentation. However, these assays are typically limited to 100s of analytes per experiment. Considerable time and effort are often expended in obtaining and preparing samples prior to targeted analyses. It would be highly desirable to detect and quantify 1000s of analytes in such samples using comprehensive mass spectrometry techniques (e.g., SWATH and DIA) while retaining a high degree of quantitative rigor for analytes with matched internal standards. Experimentally, it is facile to port a targeted assay to a comprehensive data acquisition technique. However, data analysis challenges arise from this strategy concerning agreement of results from the targeted and comprehensive approaches. Here, we present the use of genetic algorithms to overcome these challenges in order to configure hybrid targeted/comprehensive MS assays. The genetic algorithms are used to select precursor-to-fragment transitions that maximize the agreement in quantification between the targeted and the comprehensive methods. We find that the algorithm we used provided across-the-board improvement in the quantitative agreement between the targeted assay data and the hybrid comprehensive/targeted assay that we developed, as measured by parameters of linear models fitted to the results. We also found that the algorithm could perform at least as well as an independently-trained mass spectrometrist in accomplishing this task. We hope that this approach will be a useful tool in the development of quantitative approaches for comprehensive proteomics techniques.
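
    A toy sketch of the genetic-algorithm idea described in the two records above: a binary chromosome selects fragment transitions, and fitness is the agreement (here, Pearson correlation) between the resulting comprehensive-mode quantities and the targeted-assay values. The simulated data, population size and operators are illustrative assumptions, not the authors' implementation:

      import numpy as np
      rng = np.random.default_rng(0)

      n_transitions, n_samples = 8, 12
      targeted = rng.uniform(1, 10, n_samples)                      # targeted-assay values per sample
      noise = rng.normal(0, 0.5, (n_transitions, n_samples))
      dia = np.outer(rng.uniform(0.2, 1.5, n_transitions), targeted) + noise
      dia[5:] += rng.uniform(3, 6, (3, n_samples))                  # three interfered transitions

      def fitness(mask):
          if mask.sum() == 0:
              return -1.0
          quant = dia[mask.astype(bool)].sum(axis=0)
          return float(np.corrcoef(quant, targeted)[0, 1])          # agreement with targeted assay

      pop = rng.integers(0, 2, (30, n_transitions))
      for _ in range(60):
          scores = np.array([fitness(ind) for ind in pop])
          parents = pop[np.argsort(scores)[-10:]]                   # keep the fittest ten
          children = []
          while len(children) < len(pop):
              a, b = parents[rng.integers(10)], parents[rng.integers(10)]
              cut = rng.integers(1, n_transitions)
              child = np.concatenate([a[:cut], b[cut:]])            # one-point crossover
              flip = rng.random(n_transitions) < 0.05               # mutation
              children.append(np.where(flip, 1 - child, child))
          pop = np.array(children)

      best = max(pop, key=fitness)
      print("selected transitions:", np.flatnonzero(best), "r =", round(fitness(best), 3))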

  3. The Effectiveness of Psychotherapeutic Interventions for Bereaved Persons: A Comprehensive Quantitative Review

    ERIC Educational Resources Information Center

    Currier, Joseph M.; Neimeyer, Robert A.; Berman, Jeffrey S.

    2008-01-01

    Previous quantitative reviews of research on psychotherapeutic interventions for bereaved persons have yielded divergent findings and have not included many of the available controlled outcome studies. This meta-analysis summarizes results from 61 controlled studies to offer a more comprehensive integration of this literature. This review examined…

  4. Screening of groundwater remedial alternatives for brownfield sites: a comprehensive method integrated MCDA with numerical simulation.

    PubMed

    Li, Wei; Zhang, Min; Wang, Mingyu; Han, Zhantao; Liu, Jiankai; Chen, Zhezhou; Liu, Bo; Yan, Yan; Liu, Zhu

    2018-06-01

    Brownfield site pollution and remediation is an urgent environmental issue worldwide. The screening and assessment of remedial alternatives is especially complex owing to the multiple criteria involved, spanning technique, economy, and policy. To help decision-makers select remedial alternatives efficiently, the criteria framework developed by the U.S. EPA is improved and a comprehensive method that integrates multiple criteria decision analysis (MCDA) with numerical simulation is presented in this paper. The criteria framework is modified and classified into three categories: qualitative, semi-quantitative, and quantitative criteria. The MCDA method AHP-PROMETHEE (analytical hierarchy process-preference ranking organization method for enrichment evaluation) is used to determine the priority ranking of the remedial alternatives, and solute transport simulation is conducted to assess the remedial efficiency. A case study is presented to demonstrate the screening method at a brownfield site in Cangzhou, northern China. The results show that the systematic method provides a reliable way to quantify the priority of the remedial alternatives.
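
    A minimal PROMETHEE II sketch in the spirit of the AHP-PROMETHEE ranking described above, using the 'usual' preference function and illustrative scores and weights (not the case-study data); AHP would normally supply the weights:

      import numpy as np

      def promethee_ii(scores, weights):
          """Rows = alternatives, columns = benefit-type criteria. Returns net flows."""
          n = scores.shape[0]
          pi = np.zeros((n, n))
          for a in range(n):
              for b in range(n):
                  if a != b:
                      pref = (scores[a] > scores[b]).astype(float)   # 'usual' preference function
                      pi[a, b] = np.dot(weights, pref)
          phi_plus = pi.sum(axis=1) / (n - 1)
          phi_minus = pi.sum(axis=0) / (n - 1)
          return phi_plus - phi_minus                                # net outranking flows

      scores = np.array([[0.7, 0.6, 0.9],     # alternative A: technique, economy, policy
                         [0.8, 0.5, 0.6],     # alternative B
                         [0.6, 0.9, 0.7]])    # alternative C
      weights = np.array([0.5, 0.3, 0.2])
      net = promethee_ii(scores, weights)
      print(net, "ranking:", np.argsort(-net))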

  5. A systematic review of health economic evaluation in adjuvant breast radiotherapy: Quality counted by numbers.

    PubMed

    Monten, Chris; Veldeman, Liv; Verhaeghe, Nick; Lievens, Yolande

    2017-11-01

    Evolving practice in adjuvant breast radiotherapy inevitably impacts healthcare budgets. This is reflected in a rise of health economic evaluations (HEE) in this domain. The available HEE literature was analysed qualitatively and quantitatively, using available instruments. HEEs published between 1/1/2000 and 31/10/2016 were retrieved through a systematic search in Medline, Cochrane and Embase. A quality-assessment using CHEERS (Consolidated Health Economic Evaluation Reporting Standards) was translated into a quantitative score and compared with Tufts Medical Centre CEA registry and Quality of Health Economic Studies (QHES) results. Twenty cost-effectiveness analyses (CEA) and thirteen cost comparisons (CC) were analysed. In qualitative evaluation, valuation or justification of data sources, population heterogeneity and discussion on generalizability, in addition to declaration on funding, were often absent or incomplete. After quantification, the average CHEERS-scores were 74% (CI 66.9-81.1%) and 75.6% (CI 70.7-80.5%) for CEAs and CCs respectively. CEA-scores did not differ significantly from Tufts and QHES-scores. Quantitative CHEERS evaluation is feasible and yields comparable results to validated instruments. HEE in adjuvant breast radiotherapy is of acceptable quality, however, further efforts are needed to improve comprehensive reporting of all data, indispensable for assessing relevance, reliability and generalizability of results. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Risk assessment of supply chain for pharmaceutical excipients with AHP-fuzzy comprehensive evaluation.

    PubMed

    Li, Maozhong; Du, Yunai; Wang, Qiyue; Sun, Chunmeng; Ling, Xiang; Yu, Boyang; Tu, Jiasheng; Xiong, Yerong

    2016-01-01

    As essential components of formulations, pharmaceutical excipients directly affect the safety, efficacy, and stability of drugs. Recently, safety incidents involving pharmaceutical excipients that posed serious threats to patients have highlighted the necessity of controlling the potential risks. Hence, it is indispensable for the industry to establish an effective risk assessment system for the supply chain. In this study, an AHP-fuzzy comprehensive evaluation model was developed based on the analytic hierarchy process and fuzzy mathematical theory, which quantitatively assessed the risks of the supply chain. Taking polysorbate 80 as the example for model analysis, it was concluded that polysorbate 80 for injection is a higher-risk ingredient in the supply chain than that for oral use; to achieve safe clinical application, measures should be taken to control and minimize those risks.

  7. Risk assessment of supply chain for pharmaceutical excipients with AHP-fuzzy comprehensive evaluation.

    PubMed

    Li, Maozhong; Du, Yunai; Wang, Qiyue; Sun, Chunmeng; Ling, Xiang; Yu, Boyang; Tu, Jiasheng; Xiong, Yerong

    2016-04-01

    As essential components of formulations, pharmaceutical excipients directly affect the safety, efficacy, and stability of drugs. Recently, safety incidents involving pharmaceutical excipients that posed serious threats to patients have highlighted the necessity of controlling the potential risks. Hence, it is indispensable for the industry to establish an effective risk assessment system for the supply chain. In this study, an AHP-fuzzy comprehensive evaluation model was developed based on the analytic hierarchy process and fuzzy mathematical theory, which quantitatively assessed the risks of the supply chain. Taking polysorbate 80 as the example for model analysis, it was concluded that polysorbate 80 for injection is a higher-risk ingredient in the supply chain than that for oral use; to achieve safe clinical application, measures should be taken to control and minimize those risks.
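
    The AHP-fuzzy composition step described in the two records above can be sketched as a weight vector multiplied by a fuzzy membership matrix, with the grade chosen by the maximum-membership principle; the factors, weights and memberships below are illustrative, not the published values:

      import numpy as np

      weights = np.array([0.4, 0.3, 0.2, 0.1])        # assumed AHP weights of four risk factors
      # Rows = risk factors (e.g. supplier, transport, storage, regulation);
      # columns = membership of each factor in the grades {low, medium, high}.
      membership = np.array([[0.1, 0.3, 0.6],
                             [0.2, 0.5, 0.3],
                             [0.5, 0.3, 0.2],
                             [0.3, 0.4, 0.3]])

      result = weights @ membership                    # fuzzy composition (weighted average operator)
      grades = ["low", "medium", "high"]
      print(result, "->", grades[int(result.argmax())])  # maximum-membership principle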

  8. 2D Hydrodynamic Based Logic Modeling Tool for River Restoration Decision Analysis: A Quantitative Approach to Project Prioritization

    NASA Astrophysics Data System (ADS)

    Bandrowski, D.; Lai, Y.; Bradley, N.; Gaeuman, D. A.; Murauskas, J.; Som, N. A.; Martin, A.; Goodman, D.; Alvarez, J.

    2014-12-01

    In the field of river restoration sciences there is a growing need for analytical modeling tools and quantitative processes to help identify and prioritize project sites. 2D hydraulic models have become more common in recent years and, with the availability of robust data sets and computing technology, it is now possible to evaluate large river systems at the reach scale. The Trinity River Restoration Program is now analyzing a 40 mile segment of the Trinity River to determine priority and implementation sequencing for its Phase II rehabilitation projects. A comprehensive approach and quantitative tool, referred to as 2D-Hydrodynamic Based Logic Modeling (2D-HBLM), has recently been developed to analyze this complex river system. This tool utilizes various hydraulic output parameters combined with biological, ecological, and physical metrics at user-defined spatial scales. These metrics and their associated algorithms are the underpinnings of the 2D-HBLM habitat module used to evaluate geomorphic characteristics, riverine processes, and habitat complexity. The habitat metrics are further integrated into a comprehensive Logic Model framework to perform statistical analyses to assess project prioritization. The Logic Model will analyze various potential project sites by evaluating connectivity using principal component methods. The 2D-HBLM tool will help inform management and decision makers by using a quantitative process to optimize desired response variables while balancing important limiting factors in determining the highest priority locations within the river corridor to implement restoration projects. Effective river restoration prioritization starts with well-crafted goals that identify the biological objectives, address underlying causes of habitat change, and recognize that social, economic, and land use limiting factors may constrain restoration options (Beechie et al., 2008). Applying natural resource management actions, like restoration prioritization, is essential for successful project implementation (Conroy and Peterson, 2013). Evaluating tradeoffs and examining alternatives to improve fish habitat through optimization modeling is not just a trend but rather a scientific strategy that managers need to embrace and apply in their decision framework.

  9. Quantitative evaluation of software packages for single-molecule localization microscopy.

    PubMed

    Sage, Daniel; Kirshner, Hagai; Pengo, Thomas; Stuurman, Nico; Min, Junhong; Manley, Suliana; Unser, Michael

    2015-08-01

    The quality of super-resolution images obtained by single-molecule localization microscopy (SMLM) depends largely on the software used to detect and accurately localize point sources. In this work, we focus on the computational aspects of super-resolution microscopy and present a comprehensive evaluation of localization software packages. Our philosophy is to evaluate each package as a whole, thus maintaining the integrity of the software. We prepared synthetic data that represent three-dimensional structures modeled after biological components, taking excitation parameters, noise sources, point-spread functions and pixelation into account. We then asked developers to run their software on our data; most responded favorably, allowing us to present a broad picture of the methods available. We evaluated their results using quantitative and user-interpretable criteria: detection rate, accuracy, quality of image reconstruction, resolution, software usability and computational resources. These metrics reflect the various tradeoffs of SMLM software packages and help users to choose the software that fits their needs.

  10. Quantitative Imaging Biomarkers of NAFLD

    PubMed Central

    Kinner, Sonja; Reeder, Scott B.

    2016-01-01

    Conventional imaging modalities, including ultrasonography (US), computed tomography (CT), and magnetic resonance (MR), play an important role in the diagnosis and management of patients with nonalcoholic fatty liver disease (NAFLD) by allowing noninvasive diagnosis of hepatic steatosis. However, conventional imaging modalities are limited as biomarkers of NAFLD for various reasons. Multi-parametric quantitative MRI techniques overcome many of the shortcomings of conventional imaging and allow comprehensive and objective evaluation of NAFLD. MRI can provide unconfounded biomarkers of hepatic fat, iron, and fibrosis in a single examination—a virtual biopsy has become a clinical reality. In this article, we will review the utility and limitation of conventional US, CT, and MR imaging for the diagnosis NAFLD. Recent advances in imaging biomarkers of NAFLD are also discussed with an emphasis in multi-parametric quantitative MRI. PMID:26848588

  11. Strength and Comprehensiveness of District School Wellness Policies Predict Policy Implementation at the School Level

    PubMed Central

    Henderson, Kathryn E; Falbe, Jennifer; Novak, Sarah A.; Wharton, Christopher; Long, Michael; O'Connell, Meghan L.; Fiore, Susan S.

    2013-01-01

    Background In 2006, all local education agencies in the United States participating in federal school meal programs were required to establish school wellness policies. The aim of this study was to document the strength and comprehensiveness of one state's written district policies using a quantitative coding tool, and test whether the strength and comprehensiveness of the written policy predicted school level implementation and practices. Methods School wellness policies from 151 Connecticut districts were evaluated using a quantitative coding system. In each district, school principal surveys were collected before and after the writing and expected implementation of wellness policies. Socio-demographic variables were assessed for each district, including enrollment, population density, political climate, racial composition and socio-economic status. Changes in school-level policy implementation before and after the federal wellness policy requirement were compared across districts by wellness policy strength, and policies were compared based on district-level demographic factors. Results Statewide, fuller implementation of nutrition and physical activity policies at the school level was reported after adoption of written policies in 2006. Districts with stronger, more comprehensive policies were more successful in implementing those policies at the school level. Some socio-demographic characteristics predicted the strength of wellness policies; larger, urban districts and districts with a greater ratio of registered Democrats to Republicans wrote stronger policies. Conclusions Written school wellness policies have the potential to promote significant improvements in the school environment. Future regulation of school wellness policies should focus on the importance of writing strong and comprehensive policies. PMID:22568461

  12. Quantitative micro-CT based coronary artery profiling using interactive local thresholding and cylindrical coordinates.

    PubMed

    Panetta, Daniele; Pelosi, Gualtiero; Viglione, Federica; Kusmic, Claudia; Terreni, Marianna; Belcari, Nicola; Guerra, Alberto Del; Athanasiou, Lambros; Exarchos, Themistoklis; Fotiadis, Dimitrios I; Filipovic, Nenad; Trivella, Maria Giovanna; Salvadori, Piero A; Parodi, Oberdan

    2015-01-01

    Micro-CT is an established imaging technique for high-resolution non-destructive assessment of vascular samples, which is gaining growing interest for investigations of atherosclerotic arteries both in humans and in animal models. However, there is still a lack in the definition of micro-CT image metrics suitable for comprehensive evaluation and quantification of features of interest in the field of experimental atherosclerosis (ATS). A novel approach to micro-CT image processing for profiling of coronary ATS is described, providing comprehensive visualization and quantification of contrast agent-free 3D high-resolution reconstruction of full-length artery walls. Accelerated coronary ATS has been induced by high fat cholesterol-enriched diet in swine and left coronary artery (LCA) harvested en bloc for micro-CT scanning and histologic processing. A cylindrical coordinate system has been defined on the image space after curved multiplanar reformation of the coronary vessel for the comprehensive visualization of the main vessel features such as wall thickening and calcium content. A novel semi-automatic segmentation procedure based on 2D histograms has been implemented and the quantitative results validated by histology. The potentiality of attenuation-based micro-CT at low kV to reliably separate arterial wall layers from adjacent tissue as well as identify wall and plaque contours and major tissue components has been validated by histology. Morphometric indexes from histological data corresponding to several micro-CT slices have been derived (double observer evaluation at different coronary ATS stages) and highly significant correlations (R2 > 0.90) evidenced. Semi-automatic morphometry has been validated by double observer manual morphometry of micro-CT slices and highly significant correlations were found (R2 > 0.92). The micro-CT methodology described represents a handy and reliable tool for quantitative high resolution and contrast agent free full length coronary wall profiling, able to assist atherosclerotic vessels morphometry in a preclinical experimental model of coronary ATS and providing a link between in vivo imaging and histology.

  13. Characterization and quantitation of polyolefin microplastics in personal-care products using high-temperature gel-permeation chromatography.

    PubMed

    Hintersteiner, Ingrid; Himmelsbach, Markus; Buchberger, Wolfgang W

    2015-02-01

    In recent years, the development of reliable methods for the quantitation of microplastics in different samples, including evaluating the particles' adverse effects in the marine environment, has become a great concern. Because polyolefins are the most prevalent type of polymer in personal-care products containing microplastics, this study presents a novel approach for their quantitation. The method is suitable for aqueous and hydrocarbon-based products, and includes a rapid sample clean-up involving twofold density separation and a subsequent quantitation with high-temperature gel-permeation chromatography. In contrast with previous procedures, both errors caused by weighing after insufficient separation of plastics and matrix and time-consuming visual sorting are avoided. In addition to reliable quantitative results, in this investigation a comprehensive characterization of the polymer particles isolated from the product matrix, covering size, shape, molecular weight distribution and stabilization, is provided. Results for seven different personal-care products are presented. Recoveries of this method were in the range of 92-96 %.

  14. Main predators of insect pests: screening and evaluation through comprehensive indices.

    PubMed

    Yang, Tingbang; Liu, Jie; Yuan, Longyu; Zhang, Yang; Peng, Yu; Li, Daiqin; Chen, Jian

    2017-11-01

    Predatory natural enemies play key functional roles in integrated pest management. However, the screening and evaluation of the main predators of insect pests has seldom been reported in the field. Here, we employed comprehensive indices for evaluating the predation of a common pest (Ectropis obliqua) by nine common spider species in Chinese tea plantations. We established the relative dominance of the spider species and their phenological overlap with the pest species, and analyzed DNA from the nine spider species using targeted real-time quantitative polymerase chain reaction to identify the residual DNA of E. obliqua. The predation rates and predation numbers per predator were estimated by the positive rates of target fragments and the residual minimum number of E. obliqua in predators' guts, respectively. The results showed that only four spider species preyed on E. obliqua, and the order of potential of the spiders to control E. obliqua from greatest to smallest was Neoscona mellotteei, Xysticus ephippiatus, Evarcha albaria and Coleosoma octomaculatum by the Z-score method. The orb-weaving spider N. mellotteei has the maximum potential as a biological control agent of E. obliqua in an integrated pest management strategy. An approach of screening and evaluating main predators of insect pests through comprehensive indices was preliminarily established. © 2017 Society of Chemical Industry.

  15. Comprehensive Comparison of Self-Administered Questionnaires for Measuring Quantitative Autistic Traits in Adults

    ERIC Educational Resources Information Center

    Nishiyama, Takeshi; Suzuki, Masako; Adachi, Katsunori; Sumi, Satoshi; Okada, Kensuke; Kishino, Hirohisa; Sakai, Saeko; Kamio, Yoko; Kojima, Masayo; Suzuki, Sadao; Kanne, Stephen M.

    2014-01-01

    We comprehensively compared all available questionnaires for measuring quantitative autistic traits (QATs) in terms of reliability and construct validity in 3,147 non-clinical and 60 clinical subjects with normal intelligence. We examined four full-length forms, the Subthreshold Autism Trait Questionnaire (SATQ), the Broader Autism Phenotype…

  16. Comprehensive evaluation of environmental and economic benefits of China's urban underground transportation construction projects.

    PubMed

    Yang, Xiaobin; Chen, Zhilong; Guo, Dongjun

    2015-07-01

    Urban underground transportation projects are introduced to address problems of scarce green land and traffic pollution. As construction of urban underground transportation is still in its infancy in China, there is no definite quantitative measure of whether the construction is beneficial and what influence it will have on the region. This study constructs a comprehensive evaluation method for the social, economic and environmental benefits of urban underground transportation projects and proposes the concept, role and principles for the evaluation of environmental and economic benefits. It clarifies the relationship between the environment and the factors of city development, summarizes three relevant groups of factors (transportation, biophysical and socio-economic), and works out indicators to evaluate the influence of urban underground transportation construction. Based on the Contingent Valuation Method (CVM), the Cost of Illness Approach (CIA) and the Human Capital Approach (HCA), this paper constructs 13 monetization models for social, economic and environmental benefits covering seven aspects, namely reducing noise pollution, reducing air pollution, using land efficiently, improving traffic safety, reducing traffic congestion, saving shipping time and minimizing transportation costs.

  17. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation): Comprehensive 3-year progress report for the period January 15, 1986-January 14, 1989

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beck, R.N.; Cooper, M.D.

    1988-06-01

    This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. 58 refs., 15 figs., 4 tabs.

  18. Recommendations for a mixed methods approach to evaluating the patient-centered medical home.

    PubMed

    Goldman, Roberta E; Parker, Donna R; Brown, Joanna; Walker, Judith; Eaton, Charles B; Borkan, Jeffrey M

    2015-03-01

    There is a strong push in the United States to evaluate whether the patient-centered medical home (PCMH) model produces desired results. The explanatory and contextually based questions of how and why PCMH succeeds in different practice settings are often neglected. We report the development of a comprehensive, mixed qualitative-quantitative evaluation set for researchers, policy makers, and clinician groups. To develop an evaluation set, the Brown Primary Care Transformation Initiative convened a multidisciplinary group of PCMH experts, reviewed the PCMH literature and evaluation strategies, developed key domains for evaluation, and selected or created methods and measures for inclusion. The measures and methods in the evaluation set (survey instruments, PCMH meta-measures, patient outcomes, quality measures, qualitative interviews, participant observation, and process evaluation) are meant to be used together. PCMH evaluation must be sufficiently comprehensive to assess and explain both the context of transformation in different primary care practices and the experiences of diverse stakeholders. In addition to commonly assessed patient outcomes, quality, and cost, it is critical to include PCMH components integral to practice culture transformation: patient and family centeredness, authentic patient activation, mutual trust among practice employees and patients, and transparency, joy, and collaboration in delivering and receiving care in a changing environment. This evaluation set offers a comprehensive methodology to enable understanding of how PCMH transformation occurs in different practice settings. This approach can foster insights about how transformation affects critical outcomes to achieve meaningful, patient-centered, high-quality, and cost-effective sustainable change among diverse primary care practices. © 2015 Annals of Family Medicine, Inc.

  19. Condenser: a statistical aggregation tool for multi-sample quantitative proteomic data from Matrix Science Mascot Distiller™.

    PubMed

    Knudsen, Anders Dahl; Bennike, Tue; Kjeldal, Henrik; Birkelund, Svend; Otzen, Daniel Erik; Stensballe, Allan

    2014-05-30

    We describe Condenser, a freely available, comprehensive open-source tool for merging multidimensional quantitative proteomics data from the Matrix Science Mascot Distiller Quantitation Toolbox into a common format ready for subsequent bioinformatic analysis. A number of different relative quantitation technologies, such as metabolic (15)N and amino acid stable isotope incorporation, label-free and chemical-label quantitation are supported. The program features multiple options for curative filtering of the quantified peptides, allowing the user to choose data quality thresholds appropriate for the current dataset, and ensure the quality of the calculated relative protein abundances. Condenser also features optional global normalization, peptide outlier removal, multiple testing and calculation of t-test statistics for highlighting and evaluating proteins with significantly altered relative protein abundances. Condenser provides an attractive addition to the gold-standard quantitative workflow of Mascot Distiller, allowing easy handling of larger multi-dimensional experiments. Source code, binaries, test data set and documentation are available at http://condenser.googlecode.com/. Copyright © 2014 Elsevier B.V. All rights reserved.

  20. Analyzing Remodeling of Cardiac Tissue: A Comprehensive Approach Based on Confocal Microscopy and 3D Reconstructions

    PubMed Central

    Sachse, F. B.

    2015-01-01

    Microstructural characterization of cardiac tissue and its remodeling in disease is a crucial step in many basic research projects. We present a comprehensive approach for three-dimensional characterization of cardiac tissue at the submicrometer scale. We developed a compression-free mounting method as well as labeling and imaging protocols that facilitate acquisition of three-dimensional image stacks with scanning confocal microscopy. We evaluated the approach with normal and infarcted ventricular tissue. We used the acquired image stacks for segmentation, quantitative analysis and visualization of important tissue components. In contrast to conventional mounting, compression-free mounting preserved cell shapes, capillary lumens and extracellular laminas. Furthermore, the new approach and imaging protocols resulted in high signal-to-noise ratios at depths up to 60 μm. This allowed extensive analyses revealing major differences in volume fractions and distribution of cardiomyocytes, blood vessels, fibroblasts, myofibroblasts and extracellular space in control versus infarct border zone. Our results show that the developed approach yields comprehensive data on microstructure of cardiac tissue and its remodeling in disease. In contrast to other approaches, it allows quantitative assessment of all major tissue components. Furthermore, we suggest that the approach will provide important data for physiological models of cardiac tissue at the submicrometer scale. PMID:26399990

  1. Nonlinear optical microscopy and ultrasound imaging of human cervical structure

    PubMed Central

    Reusch, Lisa M.; Feltovich, Helen; Carlson, Lindsey C.; Hall, Gunnsteinn; Campagnola, Paul J.; Eliceiri, Kevin W.

    2013-01-01

    The cervix softens and shortens as its collagen microstructure rearranges in preparation for birth, but premature change may lead to premature birth. The global preterm birth rate has not decreased despite decades of research, likely because cervical microstructure is poorly understood. Our group has developed a multilevel approach to evaluating the human cervix. We are developing quantitative ultrasound (QUS) techniques for noninvasive interrogation of cervical microstructure and corroborating those results with high-resolution images of microstructure from second harmonic generation imaging (SHG) microscopy. We obtain ultrasound measurements from hysterectomy specimens, prepare the tissue for SHG, and stitch together several hundred images to create a comprehensive view of large areas of cervix. The images are analyzed for collagen orientation and alignment with curvelet transform, and registered with QUS data, facilitating multiscale analysis in which the micron-scale SHG images and millimeter-scale ultrasound data interpretation inform each other. This novel combination of modalities allows comprehensive characterization of cervical microstructure in high resolution. Through a detailed comparative study, we demonstrate that SHG imaging both corroborates the quantitative ultrasound measurements and provides further insight. Ultimately, a comprehensive understanding of specific microstructural cervical change in pregnancy should lead to novel approaches to the prevention of preterm birth. PMID:23412434

  2. Building Capacity in the Public Utility Sectors of Basra, Iraq

    DTIC Science & Technology

    2007-10-01

    has a broad range of academic and personal interests in fields that use the creative application of quantitative analysis and systems thinking to solve ... comprehensive study by Tokyo Engineering Consultants Co., Ltd. [5] references a joint United Nations Development Program (UNDP) Iraqi Central Organization...creative solutions to solve this problem. We then develop a means to evaluate and rank these potential solutions using the stakeholder values as the

  3. Peer support for parents of children with chronic disabling conditions: a systematic review of quantitative and qualitative studies.

    PubMed

    Shilling, Val; Morris, Christopher; Thompson-Coon, Jo; Ukoumunne, Obioha; Rogers, Morwenna; Logan, Stuart

    2013-07-01

    To review the qualitative and quantitative evidence of the benefits of peer support for parents of children with disabling conditions in the context of health, well-being, impact on family, and economic and service implications. We comprehensively searched multiple databases. Eligible studies evaluated parent-to-parent support and reported on the psychological health and experience of giving or receiving support. There were no limits on the child's condition, study design, language, date, or setting. We sought to aggregate quantitative data; findings of qualitative studies were combined using thematic analysis. Qualitative and quantitative data were brought together in a narrative synthesis. Seventeen papers were included: nine qualitative studies, seven quantitative studies, and one mixed-methods evaluation. Four themes were identified from qualitative studies: (1) shared social identity, (2) learning from the experiences of others, (3) personal growth, and (4) supporting others. Some quantitative studies reported a positive effect of peer support on psychological health and other outcomes; however, this was not consistently confirmed. It was not possible to aggregate data across studies. No costing data were identified. Qualitative studies strongly suggest that parents perceive benefit from peer support programmes, an effect seen across different types of support and conditions. However, quantitative studies provide inconsistent evidence of positive effects. Further research should explore whether this dissonance is substantive or an artefact of how outcomes have been measured. © The Authors. Developmental Medicine & Child Neurology © 2013 Mac Keith Press.

  4. Integration of PKPD relationships into benefit–risk analysis

    PubMed Central

    Bellanti, Francesco; van Wijk, Rob C; Danhof, Meindert; Della Pasqua, Oscar

    2015-01-01

    Aim Despite the continuous endeavour to achieve high standards in medical care through effectiveness measures, a quantitative framework for the assessment of the benefit–risk balance of new medicines is lacking prior to regulatory approval. The aim of this short review is to summarise the approaches currently available for benefit–risk assessment. In addition, we propose the use of pharmacokinetic–pharmacodynamic (PKPD) modelling as the pharmacological basis for evidence synthesis and evaluation of novel therapeutic agents. Methods A comprehensive literature search has been performed using MESH terms in PubMed, in which articles describing benefit–risk assessment and modelling and simulation were identified. In parallel, a critical review of multi-criteria decision analysis (MCDA) is presented as a tool for characterising a drug's safety and efficacy profile. Results A definition of benefits and risks has been proposed by the European Medicines Agency (EMA), in which qualitative and quantitative elements are included. However, in spite of the value of MCDA as a quantitative method, decisions about benefit–risk balance continue to rely on subjective expert opinion. By contrast, a model-informed approach offers the opportunity for a more comprehensive evaluation of benefit–risk balance before extensive evidence is generated in clinical practice. Conclusions Benefit–risk balance should be an integral part of the risk management plan and as such considered before marketing authorisation. Modelling and simulation can be incorporated into MCDA to support the evidence synthesis as well as evidence generation, taking into account the underlying correlations between favourable and unfavourable effects. In addition, it represents a valuable tool for the optimization of protocol design in effectiveness trials. PMID:25940398

  5. Integration of PKPD relationships into benefit-risk analysis.

    PubMed

    Bellanti, Francesco; van Wijk, Rob C; Danhof, Meindert; Della Pasqua, Oscar

    2015-11-01

    Despite the continuous endeavour to achieve high standards in medical care through effectiveness measures, a quantitative framework for the assessment of the benefit-risk balance of new medicines is lacking prior to regulatory approval. The aim of this short review is to summarise the approaches currently available for benefit-risk assessment. In addition, we propose the use of pharmacokinetic-pharmacodynamic (PKPD) modelling as the pharmacological basis for evidence synthesis and evaluation of novel therapeutic agents. A comprehensive literature search has been performed using MESH terms in PubMed, in which articles describing benefit-risk assessment and modelling and simulation were identified. In parallel, a critical review of multi-criteria decision analysis (MCDA) is presented as a tool for characterising a drug's safety and efficacy profile. A definition of benefits and risks has been proposed by the European Medicines Agency (EMA), in which qualitative and quantitative elements are included. However, in spite of the value of MCDA as a quantitative method, decisions about benefit-risk balance continue to rely on subjective expert opinion. By contrast, a model-informed approach offers the opportunity for a more comprehensive evaluation of benefit-risk balance before extensive evidence is generated in clinical practice. Benefit-risk balance should be an integral part of the risk management plan and as such considered before marketing authorisation. Modelling and simulation can be incorporated into MCDA to support the evidence synthesis as well as evidence generation, taking into account the underlying correlations between favourable and unfavourable effects. In addition, it represents a valuable tool for the optimization of protocol design in effectiveness trials. © 2015 The British Pharmacological Society.

  6. Attention Process Training-3 to improve reading comprehension in mild aphasia: A single-case experimental design study.

    PubMed

    Lee, Jaime B; Sohlberg, McKay Moore; Harn, Beth; Horner, Robert; Cherney, Leora R

    2018-06-04

    People with aphasia frequently present with nonlinguistic deficits, in addition to their compromised language abilities, which may contribute to their problems with reading comprehension. Treatment of attention, working memory and executive control may improve reading comprehension in individuals with aphasia, particularly those with mild reading problems. This single-case experimental design study evaluated the efficacy of Attention Process Training-3, an intervention combining direct attention training and metacognitive facilitation, for improving reading comprehension in individuals with mild aphasia. A multiple baseline design across six participants was used to evaluate treatment effects. The primary outcome measure was a maze reading task. Cognitive measures were administered pre- and post-treatment. Visual inspection of graphed maze reading performance data indicated a basic effect between APT-3 and improved maze reading for three of the six participants. Quantitative analyses, using Tau-U, corroborated findings identified through visual analysis. The overall effect size was significant (Tau = .48, p = .01). Results suggest that APT-3 has the potential to improve reading in individuals with aphasia, but that it may be more efficacious under certain conditions. Treatment and participant variables, including intensity of treatment and metacognitive strategy usage, are discussed as potential influences on participants' responsiveness to APT-3.

  7. Comprehensive analysis and evaluation of big data for main transformer equipment based on PCA and Apriori

    NASA Astrophysics Data System (ADS)

    Guo, Lijuan; Yan, Haijun; Hao, Yongqi; Chen, Yun

    2018-01-01

    As urban power grids develop toward high-reliability power supply, it is necessary to adopt appropriate methods for the comprehensive evaluation of existing equipment. Considering the breadth and multi-dimensionality of power system data, big data mining is used to explore the potential laws and value of power system equipment data. Based on main transformer monitoring data and records of defects and faults, this paper integrates data on the power grid equipment environment. Apriori is used as an association identification algorithm to extract the frequent correlation factors of the main transformer, and the potential dependences in the big data are analyzed through support and confidence. The integrated data are then analyzed by PCA, and a comprehensive quantitative scoring model is constructed. The evaluation algorithm and scheme are validated on a test set and shown to be effective. This paper provides a new idea for smart grid data fusion and a reference for the further evaluation of power grid equipment big data.
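
    The support/confidence idea behind Apriori can be sketched as follows; for brevity only item pairs are mined, and the transformer condition records and thresholds are invented for illustration:

      from itertools import combinations

      # Hypothetical condition records for a main transformer fleet.
      records = [
          {"oil_temp_high", "load_heavy", "defect"},
          {"oil_temp_high", "humid_env", "defect"},
          {"load_heavy", "humid_env"},
          {"oil_temp_high", "load_heavy", "defect"},
          {"humid_env"},
      ]

      def support(itemset):
          """Fraction of records that contain every item in the itemset."""
          return sum(itemset <= r for r in records) / len(records)

      min_support, min_confidence = 0.4, 0.7
      items = sorted({i for r in records for i in r})
      for a, b in combinations(items, 2):
          s = support({a, b})
          if s >= min_support:
              conf = s / support({a})            # confidence of the rule a -> b
              if conf >= min_confidence:
                  print(f"{a} -> {b}: support={s:.2f}, confidence={conf:.2f}")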

  8. Three-Dimensional Registration for Handheld Profiling Systems Based on Multiple Shot Structured Light

    PubMed Central

    Ayaz, Shirazi Muhammad; Kim, Min Young

    2018-01-01

    In this article, a multi-view registration approach for the 3D handheld profiling system based on the multiple shot structured light technique is proposed. The multi-view registration approach is categorized into coarse registration and point cloud refinement using the iterative closest point (ICP) algorithm. Coarse registration of multiple point clouds was performed using relative orientation and translation parameters estimated via homography-based visual navigation. The proposed system was evaluated using an artificial human skull and a paper box object. For the quantitative evaluation of the accuracy of a single 3D scan, a paper box was reconstructed, and the mean errors in its height and breadth were found to be 9.4 μm and 23 μm, respectively. A comprehensive quantitative evaluation and comparison of the proposed algorithm was performed with other variants of ICP. The root mean square error for the ICP algorithm to register a pair of point clouds of the skull object was also found to be less than 1 mm. PMID:29642552
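
    The ICP refinement step described above alternates nearest-neighbour matching with a least-squares rigid fit. The sketch below is a minimal point-to-point ICP in NumPy on synthetic data, intended only to illustrate the algorithm; it is not the authors' implementation.

```python
# Minimal point-to-point ICP sketch (nearest neighbours + SVD rigid fit).
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # avoid reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

def icp(src, dst, iters=30):
    cur = src.copy()
    for _ in range(iters):
        # brute-force nearest-neighbour correspondences
        d = np.linalg.norm(cur[:, None, :] - dst[None, :, :], axis=2)
        matched = dst[d.argmin(axis=1)]
        R, t = best_rigid_transform(cur, matched)
        cur = cur @ R.T + t
    rmse = np.sqrt(np.mean(np.sum((cur - matched) ** 2, axis=1)))
    return cur, rmse

rng = np.random.default_rng(0)
dst = rng.random((200, 3))
theta = 0.1
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
src = (dst - 0.02) @ R_true.T        # rotated and shifted copy of dst
aligned, rmse = icp(src, dst)
print("post-registration RMSE:", rmse)
```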

  9. Reference condition approach to restoration planning

    USGS Publications Warehouse

    Nestler, J.M.; Theiling, C.H.; Lubinski, S.J.; Smith, D.L.

    2010-01-01

    Ecosystem restoration planning requires quantitative rigor to evaluate alternatives, define end states, report progress and perform environmental benefits analysis (EBA). Unfortunately, existing planning frameworks are, at best, semi-quantitative. In this paper, we: (1) describe a quantitative restoration planning approach based on a comprehensive, but simple mathematical framework that can be used to effectively apply knowledge and evaluate alternatives, (2) use the approach to derive a simple but precisely defined lexicon based on the reference condition concept and allied terms and (3) illustrate the approach with an example from the Upper Mississippi River System (UMRS) using hydrologic indicators. The approach supports the development of a scalable restoration strategy that, in theory, can be expanded to ecosystem characteristics such as hydraulics, geomorphology, habitat and biodiversity. We identify three reference condition types, best achievable condition (ABAC), measured magnitude (MMi, which can be determined at one or many times and places) and desired future condition (ADFC) that, when used with the mathematical framework, provide a complete system of accounts useful for goal-oriented system-level management and restoration. Published in 2010 by John Wiley & Sons, Ltd.

  10. Research and Application of an Air Quality Early Warning System Based on a Modified Least Squares Support Vector Machine and a Cloud Model.

    PubMed

    Wang, Jianzhou; Niu, Tong; Wang, Rui

    2017-03-02

    The worsening atmospheric pollution increases the necessity of air quality early warning systems (EWSs). Although EWSs have been investigated extensively in both theory and practice, studies concerning the quantification of uncertain information and comprehensive evaluation are still lacking, which impedes further development in the area. In this paper, a comprehensive warning system is first proposed, consisting of two indispensable modules: effective forecasting and scientific evaluation. For the forecasting module, a novel hybrid model combining the theory of data preprocessing and numerical optimization is developed to implement effective forecasting of air pollutant concentrations. In particular, to further enhance the accuracy and robustness of the warning system, interval forecasting is implemented to quantify the uncertainties generated by the forecasts, providing significant risk signals for decision-makers beyond point forecasts. For the evaluation module, a cloud model, based on probability and fuzzy set theory, is developed to perform comprehensive evaluations of air quality, which can realize the transformation between qualitative concepts and quantitative data. To verify the effectiveness and efficiency of the warning system, extensive simulations based on air pollutant data from Dalian, China, were implemented; they illustrate that the warning system is not only high-performing but also widely applicable.

  11. Research and Application of an Air Quality Early Warning System Based on a Modified Least Squares Support Vector Machine and a Cloud Model

    PubMed Central

    Wang, Jianzhou; Niu, Tong; Wang, Rui

    2017-01-01

    The worsening atmospheric pollution increases the necessity of air quality early warning systems (EWSs). Although EWSs have been investigated extensively in both theory and practice, studies concerning the quantification of uncertain information and comprehensive evaluation are still lacking, which impedes further development in the area. In this paper, a comprehensive warning system is first proposed, consisting of two indispensable modules: effective forecasting and scientific evaluation. For the forecasting module, a novel hybrid model combining the theory of data preprocessing and numerical optimization is developed to implement effective forecasting of air pollutant concentrations. In particular, to further enhance the accuracy and robustness of the warning system, interval forecasting is implemented to quantify the uncertainties generated by the forecasts, providing significant risk signals for decision-makers beyond point forecasts. For the evaluation module, a cloud model, based on probability and fuzzy set theory, is developed to perform comprehensive evaluations of air quality, which can realize the transformation between qualitative concepts and quantitative data. To verify the effectiveness and efficiency of the warning system, extensive simulations based on air pollutant data from Dalian, China, were implemented; they illustrate that the warning system is not only high-performing but also widely applicable. PMID:28257122
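
    The cloud model used in the evaluation module maps a qualitative grade to quantitative "cloud drops" through three numerical characteristics: expectation (Ex), entropy (En), and hyper-entropy (He). The sketch below shows a forward normal cloud generator with illustrative parameters; it is a generic textbook construction, not the paper's specific grading scheme.

```python
# Forward normal cloud generator (Ex, En, He): the basic device a cloud
# model uses to turn a qualitative grade into quantitative "cloud drops".
# The grade parameters below are illustrative, not values from the paper.
import numpy as np

def normal_cloud_drops(ex, en, he, n=1000, seed=0):
    rng = np.random.default_rng(seed)
    en_prime = rng.normal(en, he, n)                    # per-drop entropy
    x = rng.normal(ex, np.abs(en_prime))                # drop positions
    mu = np.exp(-(x - ex) ** 2 / (2 * en_prime ** 2))   # certainty degree
    return x, mu

# e.g. an "excellent air quality" concept centred on an index value of 25
x, mu = normal_cloud_drops(ex=25.0, en=8.0, he=1.5)
print(x[:3], mu[:3])
```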

  12. How to Perform a Systematic Review and Meta-analysis of Diagnostic Imaging Studies.

    PubMed

    Cronin, Paul; Kelly, Aine Marie; Altaee, Duaa; Foerster, Bradley; Petrou, Myria; Dwamena, Ben A

    2018-05-01

    A systematic review is a comprehensive search, critical evaluation, and synthesis of all the relevant studies on a specific (clinical) topic that can be applied to the evaluation of diagnostic and screening imaging studies. It can be a qualitative or a quantitative (meta-analysis) review of available literature. A meta-analysis uses statistical methods to combine and summarize the results of several studies. In this review, a 12-step approach to performing a systematic review (and meta-analysis) is outlined under the four domains: (1) Problem Formulation and Data Acquisition, (2) Quality Appraisal of Eligible Studies, (3) Statistical Analysis of Quantitative Data, and (4) Clinical Interpretation of the Evidence. This review is specifically geared toward the performance of a systematic review and meta-analysis of diagnostic test accuracy (imaging) studies. Copyright © 2018 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.

  13. To label or not to label: applications of quantitative proteomics in neuroscience research.

    PubMed

    Filiou, Michaela D; Martins-de-Souza, Daniel; Guest, Paul C; Bahn, Sabine; Turck, Christoph W

    2012-02-01

    Proteomics has provided researchers with a sophisticated toolbox of labeling-based and label-free quantitative methods. These are now being applied in neuroscience research where they have already contributed to the elucidation of fundamental mechanisms and the discovery of candidate biomarkers. In this review, we evaluate and compare labeling-based and label-free quantitative proteomic techniques for applications in neuroscience research. We discuss the considerations required for the analysis of brain and central nervous system specimens, the experimental design of quantitative proteomic workflows as well as the feasibility, advantages, and disadvantages of the available techniques for neuroscience-oriented questions. Furthermore, we assess the use of labeled standards as internal controls for comparative studies in humans and review applications of labeling-based and label-free mass spectrometry approaches in relevant model organisms and human subjects. Providing a comprehensive guide of feasible and meaningful quantitative proteomic methodologies for neuroscience research is crucial not only for overcoming current limitations but also for gaining useful insights into brain function and translating proteomics from bench to bedside. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. A Comprehensive Histological Assessment of Osteoarthritis Lesions in Mice

    PubMed Central

    McNulty, Margaret A.; Loeser, Richard F.; Davey, Cynthia; Callahan, Michael F.; Ferguson, Cristin M.; Carlson, Cathy S.

    2011-01-01

    Objective: Accurate histological assessment of osteoarthritis (OA) is critical in studies evaluating the effects of interventions on disease severity. The purpose of the present study was to develop a histological grading scheme that comprehensively and quantitatively assesses changes in multiple tissues that are associated with OA of the stifle joint in mice. Design: Two representative midcoronal sections from 158 stifle joints, including naturally occurring and surgically induced OA, were stained with H&E and Safranin-O stains. All slides were evaluated to characterize the changes present. A grading scheme that includes both measurements and semiquantitative scores was developed, and principal components analysis (PCA) was applied to the resulting data from the medial tibial plateaus. A subset of 30 tibial plateaus representing a wide range of severity was then evaluated by 4 observers. Reliability of the results was evaluated using intraclass correlation coefficients (ICCs) and area under the receiver operating characteristic (ROC) curve. Results: Five factors were retained by PCA, accounting for 74% of the total variance. Interobserver and intraobserver reproducibilities for evaluations of articular cartilage and subchondral bone were acceptable. The articular cartilage integrity and chondrocyte viability factor scores were able to distinguish severe OA from normal, minimal, mild, and moderate disease. Conclusion: This newly developed grading scheme and resulting factors characterize a range of joint changes in mouse stifle joints that are associated with OA. Overall, the newly developed scheme is reliable and reproducible, characterizes changes in multiple tissues, and provides comprehensive information regarding a specific site in the stifle joint. PMID:26069594

  15. Teaching Epidemiology at the Undergraduate Level: Considerations and Approaches.

    PubMed

    Goldmann, Emily; Stark, James H; Kapadia, Farzana; McQueen, Matthew B

    2018-06-01

    The rapid growth in undergraduate public health education has offered training in epidemiology to an increasing number of undergraduate students. Epidemiology courses introduce undergraduate students to a population health perspective and provide opportunities for these students to build essential skills and competencies such as ethical reasoning, teamwork, comprehension of scientific methods, critical thinking, quantitative and information literacy, ability to analyze public health information, and effective writing and oral communication. Taking a varied approach and incorporating active learning and assessment strategies can help engage students in the material, improve comprehension of key concepts, and further develop key competencies. In this commentary, we present examples of how epidemiology may be taught in the undergraduate setting. Evaluation of these approaches and others would be a valuable next step.

  16. Quantitative fingerprinting by headspace--two-dimensional comprehensive gas chromatography-mass spectrometry of solid matrices: some challenging aspects of the exhaustive assessment of food volatiles.

    PubMed

    Nicolotti, Luca; Cordero, Chiara; Cagliero, Cecilia; Liberto, Erica; Sgorbini, Barbara; Rubiolo, Patrizia; Bicchi, Carlo

    2013-10-10

    The study proposes an investigation strategy that simultaneously provides detailed profiling and quantitative fingerprinting of food volatiles, through a "comprehensive" analytical platform that includes sample preparation by Headspace Solid Phase Microextraction (HS-SPME), separation by two-dimensional comprehensive gas chromatography coupled with mass spectrometry detection (GC×GC-MS) and data processing using advanced fingerprinting approaches. Experiments were carried out on roasted hazelnuts and on Gianduja pastes (sugar, vegetable oil, hazelnuts, cocoa, nonfat dried milk, vanilla flavorings) and demonstrated that the information potential of each analysis can better be exploited if suitable quantitation methods are applied. Quantitation approaches through Multiple Headspace Extraction and Standard Addition were compared in terms of performance parameters (linearity, precision, accuracy, Limit of Detection and Limit of Quantitation) under headspace linearity conditions. The results on 19 key analytes, potent odorants, and technological markers, and more than 300 fingerprint components, were used for further processing to obtain information concerning the effect of the matrix on volatile release, and to produce an informative chemical blueprint for use in sensomics and flavoromics. The importance of quantitation approaches in headspace analysis of solid matrices of complex composition, and the advantages of MHE, are also critically discussed. Copyright © 2013 Elsevier B.V. All rights reserved.
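
    The standard-addition quantitation compared above recovers the native analyte amount from the x-intercept of the response-versus-added-amount regression, assuming headspace linearity. The numbers in the sketch below are illustrative only.

```python
# Standard-addition sketch: the unspiked concentration is the magnitude of
# the x-intercept of the response-vs-added-amount line. Illustrative data.
import numpy as np

added = np.array([0.0, 0.5, 1.0, 1.5, 2.0])      # spiked analyte, ug/g
response = np.array([2.1, 3.1, 4.05, 5.0, 6.1])  # detector response (a.u.)

slope, intercept = np.polyfit(added, response, 1)
c_unknown = intercept / slope                     # x-intercept magnitude
print(f"estimated native concentration: {c_unknown:.2f} ug/g")
```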

  17. Restorative dentistry productivity of senior students engaged in comprehensive care.

    PubMed

    Blalock, John S; Callan, Richard S; Lazarchik, David A; Frank Caughman, W; Looney, Stephen

    2012-12-01

    In dental education, various clinical delivery models are used to educate dental students. The quantitative and qualitative measures used to assess the outcomes of these models are varied. Georgia Health Sciences University College of Dental Medicine has adopted a version of a general dentistry comprehensive care dental education hybrid model. Outcome assessments were developed to evaluate the effectiveness of this delivery model. The aim of this study was to compare the number of restorative procedures performed by senior dental students under a discipline-based model versus senior student productivity engaged in comprehensive care as part of a hybrid model. The rate of senior students' productivity in performing various restorative procedures was tracked over four years, and a comparison was made. In the first two years, the seniors operated in a discipline-based model, while in the last two years the seniors operated in a comprehensive care hybrid model. The results showed that there was a significant increase in productivity by the students in terms of direct and indirect restorations. This increase in productivity may indicate that the comprehensive care model may be a more productive model, thereby enhancing clinical experiences for the students, improving operating efficiency for the schools, and ultimately increasing clinical income.

  18. Mixed methods for telehealth research.

    PubMed

    Caffery, Liam J; Martin-Khan, Melinda; Wade, Victoria

    2017-10-01

    Mixed methods research is important to health services research because the integrated qualitative and quantitative investigation can give a more comprehensive understanding of complex interventions such as telehealth than can a single-method study. Further, mixed methods research is applicable to translational research and program evaluation. Study designs relevant to telehealth research are described and supported by examples. Quality assessment tools, frameworks to assist in the reporting and review of mixed methods research, and related methodologies are also discussed.

  19. Rapid Delivery of Cyber Capabilities: Evaluation of the Requirement for a Rapid Cyber Acquisition Process

    DTIC Science & Technology

    2012-06-01

    record (PoR) to give both a quantitative and qualitative perspective on the rapid cyber acquisitions framework. It also investigates if cyber operations...acquisition is a complex topic that does not yet have a solidified framework. To scope this research, a comprehensive review of past, present and...for AT&L is working with the DoD cyberspace community to develop a common framework for Services and Agencies to acquire capabilities for cyberspace

  20. Large-Scale and Deep Quantitative Proteome Profiling Using Isobaric Labeling Coupled with Two-Dimensional LC-MS/MS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gritsenko, Marina A.; Xu, Zhe; Liu, Tao

    Comprehensive, quantitative information on abundances of proteins and their post-translational modifications (PTMs) can potentially provide novel biological insights into disease pathogenesis and therapeutic intervention. Herein, we introduce a quantitative strategy utilizing isobaric stable isotope-labelling techniques combined with two-dimensional liquid chromatography-tandem mass spectrometry (2D-LC-MS/MS) for large-scale, deep quantitative proteome profiling of biological samples or clinical specimens such as tumor tissues. The workflow includes isobaric labeling of tryptic peptides for multiplexed and accurate quantitative analysis, basic reversed-phase LC fractionation and concatenation for reduced sample complexity, and nano-LC coupled to high resolution and high mass accuracy MS analysis for high confidence identification and quantification of proteins. This proteomic analysis strategy has been successfully applied for in-depth quantitative proteomic analysis of tumor samples, and can also be used for integrated proteome and PTM characterization, as well as comprehensive quantitative proteomic analysis across samples from large clinical cohorts.

  1. Large-Scale and Deep Quantitative Proteome Profiling Using Isobaric Labeling Coupled with Two-Dimensional LC-MS/MS.

    PubMed

    Gritsenko, Marina A; Xu, Zhe; Liu, Tao; Smith, Richard D

    2016-01-01

    Comprehensive, quantitative information on abundances of proteins and their posttranslational modifications (PTMs) can potentially provide novel biological insights into disease pathogenesis and therapeutic intervention. Herein, we introduce a quantitative strategy utilizing isobaric stable isotope-labeling techniques combined with two-dimensional liquid chromatography-tandem mass spectrometry (2D-LC-MS/MS) for large-scale, deep quantitative proteome profiling of biological samples or clinical specimens such as tumor tissues. The workflow includes isobaric labeling of tryptic peptides for multiplexed and accurate quantitative analysis, basic reversed-phase LC fractionation and concatenation for reduced sample complexity, and nano-LC coupled to high resolution and high mass accuracy MS analysis for high confidence identification and quantification of proteins. This proteomic analysis strategy has been successfully applied for in-depth quantitative proteomic analysis of tumor samples and can also be used for integrated proteome and PTM characterization, as well as comprehensive quantitative proteomic analysis across samples from large clinical cohorts.

  2. Examining the Effects of Classroom Discussion on Students' Comprehension of Text: A Meta-Analysis

    ERIC Educational Resources Information Center

    Murphy, P. Karen; Wilkinson, Ian A. G.; Soter, Anna O.; Hennessey, Maeghan N.; Alexander, John F.

    2009-01-01

    The role of classroom discussions in comprehension and learning has been the focus of investigations since the early 1960s. Despite this long history, no syntheses have quantitatively reviewed the vast body of literature on classroom discussions for their effects on students' comprehension and learning. This comprehensive meta-analysis of…

  3. The Effects of Meta-Cognitive Instruction on Students' Reading Comprehension in Computerized Reading Contexts: A Quantitative Meta-Analysis

    ERIC Educational Resources Information Center

    Lan, Yi-Chin; Lo, Yu-Ling; Hsu, Ying-Shao

    2014-01-01

    Comprehension is the essence of reading. Finding appropriate and effective reading strategies to support students' reading comprehension has always been a critical issue for educators. This article presents findings from a meta-analysis of 17 studies of metacognitive strategy instruction on students' reading comprehension in computerized…

  4. Understanding online health information: Evaluation, tools, and strategies.

    PubMed

    Beaunoyer, Elisabeth; Arsenault, Marianne; Lomanowska, Anna M; Guitton, Matthieu J

    2017-02-01

    Considering the status of the Internet as a prominent source of health information, assessing online health material has become a central issue in patient education. We describe the strategies available to evaluate the characteristics of online health information, including readability, emotional content, understandability, and usability. Popular tools used in assessment of readability, emotional content and comprehensibility of online health information were reviewed. Tools designed to evaluate both printed and online material were considered. Readability tools are widely used in online health material evaluation and are highly covariant. Assessment of emotional content of online health-related communications via sentiment analysis tools is becoming more popular. Understandability and usability tools have been developed specifically for health-related material, but each tool has important limitations and has been tested on a limited number of health issues. Despite the availability of numerous assessment tools, their overall reliability differs between readability (high) and understandability (low). Approaches combining multiple assessment tools and involving both quantitative and qualitative observations would optimize assessment strategies. Effective assessment of online health information should rely on mixed strategies combining quantitative and qualitative evaluations. Assessment tools should be selected according to their functional properties and compatibility with target material. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
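
    Most readability tools referenced above are built on formulas such as the Flesch Reading Ease score, which combines average sentence length and average syllables per word. The sketch below uses a naive vowel-group syllable counter for illustration; production tools rely on more careful tokenization and dictionaries.

```python
# Illustrative Flesch Reading Ease computation with a naive syllable counter.
import re

def count_syllables(word):
    # crude heuristic: each run of vowels counts as one syllable
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z]+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (len(words) / sentences) - 84.6 * (syllables / len(words))

sample = "Take one tablet twice a day. Call your doctor if the pain continues."
print(round(flesch_reading_ease(sample), 1))
```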

  5. Toward Quantitative Small Animal Pinhole SPECT: Assessment of Quantitation Accuracy Prior to Image Compensations

    PubMed Central

    Chen, Chia-Lin; Wang, Yuchuan; Lee, Jason J. S.; Tsui, Benjamin M. W.

    2011-01-01

    Purpose We assessed the quantitation accuracy of small animal pinhole single photon emission computed tomography (SPECT) under the current preclinical settings, where image compensations are not routinely applied. Procedures The effects of several common image-degrading factors and imaging parameters on quantitation accuracy were evaluated using Monte-Carlo simulation methods. Typical preclinical imaging configurations were modeled, and quantitative analyses were performed based on image reconstructions without compensating for attenuation, scatter, and limited system resolution. Results Using mouse-sized phantom studies as examples, attenuation effects alone degraded quantitation accuracy by up to −18% (Tc-99m or In-111) or −41% (I-125). The inclusion of scatter effects changed the above numbers to −12% (Tc-99m or In-111) and −21% (I-125), respectively, indicating the significance of scatter in quantitative I-125 imaging. Region-of-interest (ROI) definitions have greater impacts on regional quantitation accuracy for small sphere sources as compared to attenuation and scatter effects. For the same ROI, SPECT acquisitions using pinhole apertures of different sizes could significantly affect the outcome, whereas the use of different radii-of-rotation yielded negligible differences in quantitation accuracy for the imaging configurations simulated. Conclusions We have systematically quantified the influence of several factors affecting the quantitation accuracy of small animal pinhole SPECT. In order to consistently achieve accurate quantitation within 5% of the truth, comprehensive image compensation methods are needed. PMID:19048346

  6. Methodological reporting in qualitative, quantitative, and mixed methods health services research articles.

    PubMed

    Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A

    2012-04-01

    Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. All empirical articles (n = 1,651) published between 2003 and 2007 from four top-ranked health services journals. All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed method reports). We used chi-square tests to evaluate differences between article types for each component. Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ2(1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ2(1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Few published health services research articles use mixed methods. The frequency of key methodological components is variable. Suggestions are provided to increase the transparency of mixed methods studies and the presence of key methodological components in published reports. © Health Research and Educational Trust.

  7. Methodological Reporting in Qualitative, Quantitative, and Mixed Methods Health Services Research Articles

    PubMed Central

    Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A

    2012-01-01

    Objectives Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. Data Sources All empirical articles (n = 1,651) published between 2003 and 2007 from four top-ranked health services journals. Study Design All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed method reports). We used chi-square tests to evaluate differences between article types for each component. Principal Findings Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ2(1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ2(1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Conclusion Few published health services research articles use mixed methods. The frequency of key methodological components is variable. Suggestions are provided to increase the transparency of mixed methods studies and the presence of key methodological components in published reports. PMID:22092040
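
    The 2x2 chi-square comparisons reported above test whether the proportion of articles containing a given methodological component differs between article types. The sketch below reproduces that kind of test on invented counts (assuming SciPy is available); the figures are not the study's data.

```python
# Hedged sketch of a 2x2 chi-square comparison of component proportions;
# the counts are invented, not the study's data.
from scipy.stats import chi2_contingency

#                  component present, component absent
table = [[ 90, 320],    # mixed methods reports (hypothetical counts)
         [470, 530]]    # quantitative reports (hypothetical counts)

chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.4f}")
```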

  8. Neural differences in the processing of true and false sentences: insights into the nature of 'truth' in language comprehension.

    PubMed

    Marques, J Frederico; Canessa, Nicola; Cappa, Stefano

    2009-06-01

    The inquiry on the nature of truth in language comprehension has a long history of opposite perspectives. These perspectives either consider that there are qualitative differences in the processing of true and false statements, or that these processes are fundamentally the same and only differ in quantitative terms. The present study evaluated the processing nature of true and false statements in terms of patterns of brain activity using event-related functional magnetic resonance imaging (fMRI). We show that when true and false concept-feature statements are controlled for relation strength/ambiguity, their processing is associated with qualitatively different processes. Verifying true statements activates the left inferior parietal cortex and the caudate nucleus, a neural correlate compatible with an extended search and matching process for particular stored information. In contrast, verifying false statements activates the fronto-polar cortex and is compatible with a reasoning process of finding and evaluating a contradiction between the sentence information and stored knowledge.

  9. a Comprehensive Review of Pansharpening Algorithms for GÖKTÜRK-2 Satellite Images

    NASA Astrophysics Data System (ADS)

    Kahraman, S.; Ertürk, A.

    2017-11-01

    In this paper, a comprehensive review and performance evaluation of pansharpening algorithms for GÖKTÜRK-2 images is presented. GÖKTÜRK-2 is the first high resolution remote sensing satellite of Turkey which was designed and built in Turkey, by The Ministry of Defence, TUBITAK-UZAY and Turkish Aerospace Industry (TUSAŞ) collectively. GÖKTÜRK-2 was launched on 18 December 2012 from Jiuquan, China and provides 2.5 meter panchromatic (PAN) and 5 meter multispectral (MS) spatial resolution satellite images. In this study, a large number of pansharpening algorithms are implemented and evaluated for performance on multiple GÖKTÜRK-2 satellite images. Quality assessments are conducted both qualitatively through visual results and quantitatively using Root Mean Square Error (RMSE), Correlation Coefficient (CC), Spectral Angle Mapper (SAM), Erreur Relative Globale Adimensionnelle de Synthèse (ERGAS), Peak Signal to Noise Ratio (PSNR), Structural Similarity Index (SSIM) and Universal Image Quality Index (UIQI).
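
    Two of the quality indices listed above, RMSE and the spectral angle mapper (SAM), can be computed directly from a reference image and its pansharpened counterpart. The sketch below uses random arrays in place of real GÖKTÜRK-2 data and is meant only to show the definitions.

```python
# RMSE and SAM (spectral angle mapper) between a reference and a fused image.
# Random arrays stand in for real multispectral data.
import numpy as np

def rmse(reference, fused):
    return np.sqrt(np.mean((reference.astype(float) - fused.astype(float)) ** 2))

def sam(reference, fused, eps=1e-12):
    """Mean spectral angle (radians) over all pixels; bands on the last axis."""
    ref = reference.reshape(-1, reference.shape[-1]).astype(float)
    fus = fused.reshape(-1, fused.shape[-1]).astype(float)
    cos = np.sum(ref * fus, axis=1) / (
        np.linalg.norm(ref, axis=1) * np.linalg.norm(fus, axis=1) + eps)
    return np.mean(np.arccos(np.clip(cos, -1.0, 1.0)))

rng = np.random.default_rng(1)
reference = rng.integers(0, 255, size=(64, 64, 4))
fused = np.clip(reference + rng.normal(0, 5, reference.shape), 0, 255)
print("RMSE:", rmse(reference, fused), "SAM (rad):", sam(reference, fused))
```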

  10. Language Skills in Classical Chinese Text Comprehension

    ERIC Educational Resources Information Center

    Lau, Kit-ling

    2018-01-01

    This study used both quantitative and qualitative methods to explore the role of lower- and higher-level language skills in classical Chinese (CC) text comprehension. A CC word and sentence translation test, text comprehension test, and questionnaire were administered to 393 Secondary Four students; and 12 of these were randomly selected to…

  11. Alleviating Comprehension Problems in Movies. Working Paper.

    ERIC Educational Resources Information Center

    Tatsuki, Donna

    This paper describes the various barriers to comprehension that learners may encounter when viewing feature films in a second language. Two clusters of interfacing factors that may contribute to comprehension hot spots emerged from a quantitative analysis of problems noted in student logbooks. One cluster had a strong acoustic basis, whereas the…

  12. Evolution of Quantitative Measures in NMR: Quantum Mechanical qHNMR Advances Chemical Standardization of a Red Clover (Trifolium pratense) Extract

    PubMed Central

    2017-01-01

    Chemical standardization, along with morphological and DNA analysis ensures the authenticity and advances the integrity evaluation of botanical preparations. Achievement of a more comprehensive, metabolomic standardization requires simultaneous quantitation of multiple marker compounds. Employing quantitative 1H NMR (qHNMR), this study determined the total isoflavone content (TIfCo; 34.5–36.5% w/w) via multimarker standardization and assessed the stability of a 10-year-old isoflavone-enriched red clover extract (RCE). Eleven markers (nine isoflavones, two flavonols) were targeted simultaneously, and outcomes were compared with LC-based standardization. Two advanced quantitative measures in qHNMR were applied to derive quantities from complex and/or overlapping resonances: a quantum mechanical (QM) method (QM-qHNMR) that employs 1H iterative full spin analysis, and a non-QM method that uses linear peak fitting algorithms (PF-qHNMR). A 10 min UHPLC-UV method provided auxiliary orthogonal quantitation. This is the first systematic evaluation of QM and non-QM deconvolution as qHNMR quantitation measures. It demonstrates that QM-qHNMR can account successfully for the complexity of 1H NMR spectra of individual analytes and how QM-qHNMR can be built for mixtures such as botanical extracts. The contents of the main bioactive markers were in good agreement with earlier HPLC-UV results, demonstrating the chemical stability of the RCE. QM-qHNMR advances chemical standardization by its inherent QM accuracy and the use of universal calibrants, avoiding the impractical need for identical reference materials. PMID:28067513

  13. The emerging science of quantitative imaging biomarkers terminology and definitions for scientific studies and regulatory submissions.

    PubMed

    Kessler, Larry G; Barnhart, Huiman X; Buckler, Andrew J; Choudhury, Kingshuk Roy; Kondratovich, Marina V; Toledano, Alicia; Guimaraes, Alexander R; Filice, Ross; Zhang, Zheng; Sullivan, Daniel C

    2015-02-01

    The development and implementation of quantitative imaging biomarkers has been hampered by the inconsistent and often incorrect use of terminology related to these markers. Sponsored by the Radiological Society of North America, an interdisciplinary group of radiologists, statisticians, physicists, and other researchers worked to develop a comprehensive terminology to serve as a foundation for quantitative imaging biomarker claims. Where possible, this working group adapted existing definitions derived from national or international standards bodies rather than invent new definitions for these terms. This terminology also serves as a foundation for the design of studies that evaluate the technical performance of quantitative imaging biomarkers and for studies of algorithms that generate the quantitative imaging biomarkers from clinical scans. This paper provides examples of research studies and quantitative imaging biomarker claims that use terminology consistent with these definitions as well as examples of the rampant confusion in this emerging field. We provide recommendations for appropriate use of quantitative imaging biomarker terminological concepts. It is hoped that this document will assist researchers and regulatory reviewers who examine quantitative imaging biomarkers and will also inform regulatory guidance. More consistent and correct use of terminology could advance regulatory science, improve clinical research, and provide better care for patients who undergo imaging studies. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.

  14. Economic geology of lunar Helium-3

    NASA Technical Reports Server (NTRS)

    Schmitt, Harrison H.

    1988-01-01

    Economic geology evaluation of lunar He-3 should answer the question: Can lunar He-3 be sold on Earth with sufficient profit margins and low enough risk to attract capital investment in the enterprise. Concepts that relate to economic geology of recovering He-3 from the lunar maria are not new to human experience. A parametric cost and technology evaluation scheme, based on existing and future data, is required to qualitatively and quantitatively assess the comprehensive economic feasibility and return on investment of He-3 recovery from the lunar maria. There are also many political issues which must be considered as a result of nuclear fusion and lunar mining.

  15. [Quality evaluation of rhubarb dispensing granules based on multi-component simultaneous quantitative analysis and bioassay].

    PubMed

    Tan, Peng; Zhang, Hai-Zhu; Zhang, Ding-Kun; Wu, Shan-Na; Niu, Ming; Wang, Jia-Bo; Xiao, Xiao-He

    2017-07-01

    This study attempts to evaluate the quality of Chinese formula granules by combined use of multi-component simultaneous quantitative analysis and bioassay. The rhubarb dispensing granules were used as the model drug for demonstrative study. The ultra-high performance liquid chromatography (UPLC) method was adopted for simultaneously quantitative determination of the 10 anthraquinone derivatives (such as aloe emodin-8-O-β-D-glucoside) in rhubarb dispensing granules; purgative biopotency of different batches of rhubarb dispensing granules was determined based on compound diphenoxylate tablets-induced mouse constipation model; blood activating biopotency of different batches of rhubarb dispensing granules was determined based on in vitro rat antiplatelet aggregation model; SPSS 22.0 statistical software was used for correlation analysis between 10 anthraquinone derivatives and purgative biopotency, blood activating biopotency. The results of multi-component simultaneous quantitative analysis showed that there was a great difference in chemical characterization and certain differences in purgative biopotency and blood activating biopotency among 10 batches of rhubarb dispensing granules. The correlation analysis showed that the intensity of purgative biopotency was significantly correlated with the content of conjugated anthraquinone glycosides (P<0.01), and the intensity of blood activating biopotency was significantly correlated with the content of free anthraquinone (P<0.01). In summary, the combined use of multi-component simultaneous quantitative analysis and bioassay can achieve objective quantification and more comprehensive reflection on overall quality difference among different batches of rhubarb dispensing granules. Copyright© by the Chinese Pharmaceutical Association.

  16. Development and application of a new grey dynamic hierarchy analysis system (GDHAS) for evaluating urban ecological security.

    PubMed

    Shao, Chaofeng; Tian, Xiaogang; Guan, Yang; Ju, Meiting; Xie, Qiang

    2013-05-21

    Selecting indicators based on the characteristics and development trends of a given study area is essential for building a framework for assessing urban ecological security. However, few studies have focused on how to select the representative indicators systematically, and quantitative research is lacking. We developed an innovative quantitative modeling approach called the grey dynamic hierarchy analytic system (GDHAS) for both the procedures of indicator selection and quantitative assessment of urban ecological security. Next, a systematic methodology based on the GDHAS is developed to assess urban ecological security comprehensively and dynamically. This assessment includes indicator selection, driving force-pressure-state-impact-response (DPSIR) framework building, and quantitative evaluation. We applied this systematic methodology to assess the urban ecological security of Tianjin, which is a typical coastal super-megalopolis and industrial base in China. This case study highlights the key features of our approach. First, 39 representative indicators are selected for the evaluation index system from 62 alternative ones available through the GDHAS. Second, the DPSIR framework is established based on the indicators selected, and the quantitative assessment of the eco-security of Tianjin is conducted. The results illustrate the following: the urban ecological security of Tianjin in 2008 was at the alert level but not very stable; the driving force and pressure subsystems were in good condition, but the eco-security levels of the remainder of the subsystems were relatively low; the pressure subsystem was the key to urban ecological security; and 10 indicators are defined as the key indicators for five subsystems. These results can be used as the basis for urban eco-environmental management.
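
    Although the GDHAS procedure itself is not detailed in the abstract, grey-system evaluations of this kind typically rest on grey relational analysis, in which a normalized indicator vector is compared against an ideal reference sequence. The sketch below shows that standard building block with invented indicator values; it is not the authors' exact algorithm.

```python
# Grey relational grade between an ideal reference sequence and one set of
# normalized indicator scores; a standard grey-system building block with
# invented values, not the GDHAS itself.
import numpy as np

def grey_relational_grade(reference, candidate, rho=0.5):
    delta = np.abs(np.asarray(reference, float) - np.asarray(candidate, float))
    coeff = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
    return coeff.mean()

ideal = np.array([1.0, 1.0, 1.0, 1.0])          # best achievable indicator values
year_2008 = np.array([0.62, 0.80, 0.55, 0.71])  # normalized indicator scores
print(round(grey_relational_grade(ideal, year_2008), 3))
```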

  17. Cost and Efficacy Assessment of an Alternative Medication Compliance Urine Drug Testing Strategy.

    PubMed

    Doyle, Kelly; Strathmann, Frederick G

    2017-02-01

    This study investigates the frequency at which quantitative results provide additional clinical benefit compared to qualitative results alone. A comparison between alternative urine drug screens and conventional screens including the assessment of cost-to-payer differences, accuracy of prescription compliance or polypharmacy/substance abuse was also included. In a reference laboratory evaluation of urine specimens from across the United States, 213 urine specimens with provided prescription medication information (302 prescriptions) were analyzed by two testing algorithms: 1) conventional immunoassay screen with subsequent reflexive testing of positive results by quantitative mass spectrometry; and 2) a combined immunoassay/qualitative mass-spectrometry screen that substantially reduced the need for subsequent testing. The qualitative screen was superior to immunoassay with reflex to mass spectrometry in confirming compliance per prescription (226/302 vs 205/302), and identifying non-prescription abuse (97 vs 71). Pharmaceutical impurities and inconsistent drug metabolite patterns were detected in only 3.8% of specimens, suggesting that quantitative results have limited benefit. The percentage difference between the conventional testing algorithm and the alternative screen was projected to be 55%, and a 2-year evaluation of test utilization as a measure of test order volume follows an exponential trend for alternative screen test orders over conventional immunoassay screens that require subsequent confirmation testing. Alternative, qualitative urine drug screens provide a less expensive, faster, and more comprehensive evaluation of patient medication compliance and drug abuse. The vast majority of results were interpretable with qualitative results alone indicating a reduced need to automatically reflex to quantitation or provide quantitation for the majority of patients. This strategy highlights a successful approach using an alternative strategy for both the laboratory and physician to align clinical needs while being mindful of costs.

  18. Development and Application of a New Grey Dynamic Hierarchy Analysis System (GDHAS) for Evaluating Urban Ecological Security

    PubMed Central

    Shao, Chaofeng; Tian, Xiaogang; Guan, Yang; Ju, Meiting; Xie, Qiang

    2013-01-01

    Selecting indicators based on the characteristics and development trends of a given study area is essential for building a framework for assessing urban ecological security. However, few studies have focused on how to select the representative indicators systematically, and quantitative research is lacking. We developed an innovative quantitative modeling approach called the grey dynamic hierarchy analytic system (GDHAS) for both the procedures of indicator selection and quantitative assessment of urban ecological security. Next, a systematic methodology based on the GDHAS is developed to assess urban ecological security comprehensively and dynamically. This assessment includes indicator selection, driving force-pressure-state-impact-response (DPSIR) framework building, and quantitative evaluation. We applied this systematic methodology to assess the urban ecological security of Tianjin, which is a typical coastal super-megalopolis and industrial base in China. This case study highlights the key features of our approach. First, 39 representative indicators are selected for the evaluation index system from 62 alternative ones available through the GDHAS. Second, the DPSIR framework is established based on the indicators selected, and the quantitative assessment of the eco-security of Tianjin is conducted. The results illustrate the following: the urban ecological security of Tianjin in 2008 was at the alert level but not very stable; the driving force and pressure subsystems were in good condition, but the eco-security levels of the remainder of the subsystems were relatively low; the pressure subsystem was the key to urban ecological security; and 10 indicators are defined as the key indicators for five subsystems. These results can be used as the basis for urban eco-environmental management. PMID:23698700

  19. A Novel Health Evaluation Strategy for Multifunctional Self-Validating Sensors

    PubMed Central

    Shen, Zhengguang; Wang, Qi

    2013-01-01

    The performance evaluation of sensors is very important in actual application. In this paper, a theory based on multi-variable information fusion is studied to evaluate the health level of multifunctional sensors. A novel conception of health reliability degree (HRD) is defined to indicate a quantitative health level, which is different from traditional so-called qualitative fault diagnosis. To evaluate the health condition from both local and global perspectives, the HRD of a single sensitive component at multiple time points and the overall multifunctional sensor at a single time point are defined, respectively. The HRD methodology is emphasized by using multi-variable data fusion technology coupled with a grey comprehensive evaluation method. In this method, to acquire the distinct importance of each sensitive unit and the sensitivity of different time points, the information entropy and analytic hierarchy process method are used, respectively. In order to verify the feasibility of the proposed strategy, a health evaluating experimental system for multifunctional self-validating sensors was designed. Five different health level situations are discussed. The results show that the proposed method is feasible, that the HRD can be used to quantitatively indicate the health level, and that it responds quickly to performance changes of multifunctional sensors. PMID:23291576
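
    The information-entropy weighting mentioned above assigns larger weights to indicators whose values vary more across the units being compared. The sketch below shows the standard entropy weight calculation on an invented decision matrix (rows are sensitive units, columns are indicators); it is a generic illustration, not the paper's full HRD procedure.

```python
# Standard entropy weight method on an invented non-negative decision matrix.
import numpy as np

def entropy_weights(X):
    X = np.asarray(X, float)
    P = X / X.sum(axis=0)                          # column-wise proportions
    n = X.shape[0]
    with np.errstate(divide="ignore", invalid="ignore"):
        plogp = np.where(P > 0, P * np.log(P), 0.0)
    e = -plogp.sum(axis=0) / np.log(n)             # entropy per indicator
    d = 1.0 - e                                    # degree of divergence
    return d / d.sum()                             # normalized weights

X = [[0.91, 0.40, 0.75],
     [0.88, 0.55, 0.30],
     [0.93, 0.35, 0.62]]
print(entropy_weights(X))
```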

  20. Hey girlfriend: an evaluation of AIDS prevention among women in the sex industry.

    PubMed

    Dorfman, L E; Derish, P A; Cohen, J B

    1992-01-01

    Increasingly, acquired immunodeficiency syndrome (AIDS) prevention programs have been developed to reach and influence street-based populations. Standard methods of evaluation do not fit the conditions of such programs. This article describes a process and outcome evaluation of an AIDS prevention program for sex workers in which qualitative and quantitative methods were combined in order to mediate research problems endemic to street-based populations. Methods included epidemiological questionnaires, open-ended interviews with participants, and ethnographic field notes. Process evaluation findings show that field staff who were indigenous to the neighborhood and population readily gained access to the community of sex workers and simultaneously became role models for positive behavior change. Outcome findings show that sex workers do feel at risk for AIDS, but usually from clients rather than from husbands or boyfriends. Accordingly, they use condoms more frequently with clients than with steady partners. Increasing condom use among sex workers with their steady partners remains an important challenge for AIDS prevention. Combining qualitative and quantitative research data provided a more comprehensive assessment of how to reach sex workers with effective AIDS risk reduction messages than either method could have provided alone.

  1. Assessing healthcare professionals' experiences of integrated care: do surveys tell the full story?

    PubMed

    Stephenson, Matthew D; Campbell, Jared M; Lisy, Karolina; Aromataris, Edoardo C

    2017-09-01

    Integrated care is the combination of different healthcare services with the goal to provide comprehensive, seamless, effective and efficient patient care. Assessing the experiences of healthcare professionals (HCPs) is an important aspect when evaluating integrated care strategies. The aim of this rapid review was to investigate if quantitative surveys used to assess HCPs' experiences with integrated care capture all the aspects highlighted as being important in qualitative research, with a view to informing future survey development. The review considered all types of health professionals in primary care, and hospital and specialist services, with a specific focus on the provision of integrated care aimed at improving the patient journey. PubMed, CINAHL and grey literature sources were searched for relevant surveys/program evaluations and qualitative research studies. Full text articles deemed to be of relevance to the review were appraised for methodological quality using abridged critical appraisal instruments from the Joanna Briggs Institute. Data were extracted from included studies using standardized data extraction templates. Findings from included studies were grouped into domains based on similarity of meaning. Similarities and differences in the domains covered in quantitative surveys and those identified as being important in qualitative research were explored. A total of 37 studies (19 quantitative surveys, 14 qualitative studies and four mixed-method studies) were included in the review. A range of healthcare professions participated in the included studies, the majority being primary care providers. Common domains identified from quantitative surveys and qualitative studies included Communication, Agreement on Clear Roles and Responsibilities, Facilities, Information Systems, and Coordination of Care and Access. Qualitative research highlighted domains identified by HCPs as being relevant to their experiences with integrated care that have not routinely been surveyed, including Workload, Clear Leadership/Decision-Making, Management, Flexibility of Integrated Care Model, Engagement, Usefulness of Integrated Care and Collaboration, and Positive Impact/Clinical Benefits/Practice Level Benefits. There were several domains identified from qualitative research that are not routinely included in quantitative surveys to assess health professionals' experiences of integrated care. In addition, the qualitative findings suggest that the experiences of HCPs are often impacted by deeper aspects than those measured by existing surveys. Incorporation of targeted items within these domains in the design of surveys should enhance the capture of data that are relevant to the experiences of HCPs with integrated care, which may assist in more comprehensive evaluation and subsequent improvement of integrated care programs.

  2. Clinical applications of advanced magnetic resonance imaging techniques for arthritis evaluation

    PubMed Central

    Martín Noguerol, Teodoro; Luna, Antonio; Gómez Cabrera, Marta; Riofrio, Alexie D

    2017-01-01

    Magnetic resonance imaging (MRI) has allowed a comprehensive evaluation of articular disease, increasing the detection of early cartilage involvement, bone erosions, and edema in soft tissue and bone marrow compared to other imaging techniques. In the era of functional imaging, new advanced MRI sequences are being successfully applied for articular evaluation in cases of inflammatory, infectious, and degenerative arthropathies. Diffusion weighted imaging, new fat suppression techniques such as DIXON, dynamic contrast enhanced-MRI, and specific T2 mapping cartilage sequences allow a better understanding of the physiopathological processes that underlie these different arthropathies. They provide valuable quantitative information that aids in their differentiation and can be used as potential biomarkers of articular disease course and treatment response. PMID:28979849

  3. Teaching Reading Comprehension to Learners with Autism Spectrum Disorder: Predictors of Teacher Self-Efficacy and Outcome Expectancy

    ERIC Educational Resources Information Center

    Accardo, Amy L.; Finnegan, Elizabeth G.; Gulkus, Steven P.; Papay, Clare K.

    2017-01-01

    Learners with autism spectrum disorder (ASD) often exhibit difficulty in the area of reading comprehension. Research connecting the learning needs of individuals with ASD, existing effective practices, teacher training, and teacher perceptions of their own ability to teach reading comprehension is scarce. Quantitative survey methodology and…

  4. Supporting Social Studies Reading Comprehension with an Electronic Pop-Up Dictionary

    ERIC Educational Resources Information Center

    Fry, Sara Winstead; Gosky, Ross

    2008-01-01

    This study investigated how middle school students' comprehension was impacted by reading social studies texts online with a pop-up dictionary function for every word in the text. A quantitative counterbalance design was used to determine how 129 middle school students' reading comprehension test scores for the pop-up dictionary reading differed…

  5. Evaluation of Rice Resistance to Southern Rice Black-Streaked Dwarf Virus and Rice Ragged Stunt Virus through Combined Field Tests, Quantitative Real-Time PCR, and Proteome Analysis.

    PubMed

    Wang, Zhenchao; Yu, Lu; Jin, Linhong; Wang, Wenli; Zhao, Qi; Ran, Longlu; Li, Xiangyang; Chen, Zhuo; Guo, Rong; Wei, Yongtian; Yang, Zhongcheng; Liu, Enlong; Hu, Deyu; Song, Baoan

    2017-02-22

    Diseases caused by southern rice black-streaked dwarf virus (SRBSDV) and rice ragged stunt virus (RRSV) considerably decrease grain yield. Therefore, determining rice cultivars with high resistance to SRBSDV and RRSV is necessary. In this study, rice cultivars with high resistance to SRBSDV and RRSV were evaluated through field trials in Shidian and Mangshi county, Yunnan province, China. SYBR Green I-based quantitative real-time polymerase chain reaction (qRT-PCR) analysis was used to quantitatively detect virus gene expression levels in different rice varieties. The following parameters were applied to evaluate rice resistance: acre yield (A.Y.), incidence of infected plants (I.I.P.), virus load (V.L.), disease index (D.I.), and insect quantity (I.Q.) per 100 clusters. Zhongzheyou1 (Z1) and Liangyou2186 (L2186) were considered the most suitable varieties with integrated higher A.Y., lower I.I.P., V.L., D.I. and I.Q. In order to investigate the mechanism of rice resistance, comparative label-free shotgun liquid chromatography tandem-mass spectrometry (LC-MS/MS) proteomic approaches were applied to comprehensively describe the proteomics of rice varieties' SRBSDV tolerance. Systemic acquired resistance (SAR)-related proteins in Z1 and L2186 may result in the superior resistance of these varieties compared with Fengyouxiangzhan (FYXZ).

  6. Alloplastic total temporomandibular joint replacements: do they perform like natural joints? Prospective cohort study with a historical control.

    PubMed

    Wojczyńska, A; Leiggener, C S; Bredell, M; Ettlin, D A; Erni, S; Gallo, L M; Colombo, V

    2016-10-01

    The aim of this study was to qualitatively and quantitatively describe the biomechanics of existing total alloplastic reconstructions of temporomandibular joints (TMJ). Fifteen patients with unilateral or bilateral TMJ total joint replacements and 15 healthy controls were evaluated via dynamic stereometry technology. This non-invasive method combines three-dimensional imaging of the subject's anatomy with jaw tracking. It provides an insight into the patient's jaw joint movements in real time and provides a quantitative evaluation. The patients were also evaluated clinically for jaw opening, protrusive and laterotrusive movements, pain, interference with eating, and satisfaction with the joint replacements. The qualitative assessment revealed that condyles of bilateral total joint replacements displayed similar basic motion patterns to those of unilateral prostheses. Quantitatively, mandibular movements of artificial joints during opening, protrusion, and laterotrusion were all significantly shorter than those of controls. A significantly restricted mandibular range of motion in replaced joints was also observed clinically. Fifty-three percent of patients suffered from chronic pain at rest and 67% reported reduced chewing function. Nonetheless, patients declared a high level of satisfaction with the replacement. This study shows that in order to gain a comprehensive understanding of complex therapeutic measures, a multidisciplinary approach is needed. Copyright © 2016 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  7. Qualitative and quantitative evaluation of human dental enamel after bracket debonding: a noncontact three-dimensional optical profilometry analysis.

    PubMed

    Ferreira, Fabiano G; Nouer, Darcy F; Silva, Nelson P; Garbui, Ivana U; Correr-Sobrinho, Lourenço; Nouer, Paulo R A

    2014-09-01

    The aim of this study was to undertake a qualitative and quantitative evaluation of changes on enamel surfaces after debonding of brackets followed by finishing procedures, using a high-resolution three-dimensional optical profiler and to investigate the accuracy of the technique. The labial surfaces of 36 extracted upper central incisors were examined. Before bonding, the enamel surfaces were subjected to profilometry, recording four amplitude parameters. Brackets were then bonded using two types of light-cured orthodontic adhesive: composite resin and resin-modified glass ionomer cement. Finishing was performed by three different methods: pumice on a rubber cup, fine and ultrafine aluminum oxide discs, and microfine diamond cups followed by silicon carbide brushes. The samples were subsequently re-analyzed by profilometry. Wilcoxon signed-rank test, Kruskal-Wallis test (p < 0.05) and a posteriori Mann-Whitney U test with Bonferroni correction (p < 0.0167) revealed a significant reduction of enamel roughness when diamond cups followed by silicon carbide brushes were used to finish surfaces that had remnants of resin-modified glass ionomer adhesive and when pumice was used to finish surfaces that had traces of composite resin. Enamel loss was minimal. The 3D optical profilometry technique was able to provide accurate qualitative and quantitative assessment of changes on the enamel surface after debonding. Morphological changes in the topography of dental surfaces, especially if related to enamel loss and roughness, are of considerable clinical importance. The quantitative evaluation method used herein enables a more comprehensive understanding of the effects of orthodontic bonding on teeth.
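
    A minimal sketch of the nonparametric workflow stated above (Kruskal-Wallis across the three finishing methods, then pairwise Mann-Whitney U tests at the Bonferroni-corrected threshold p < 0.0167), written in Python with SciPy; the roughness values are hypothetical, not the study's data.

    # Hypothetical post-debonding enamel roughness values for three finishing methods;
    # the test sequence mirrors the one described in the abstract.
    from itertools import combinations
    from scipy.stats import kruskal, mannwhitneyu

    groups = {
        "pumice":          [0.42, 0.39, 0.45, 0.41, 0.44],
        "aluminum_oxide":  [0.47, 0.50, 0.46, 0.49, 0.48],
        "diamond_silicon": [0.33, 0.35, 0.31, 0.36, 0.34],
    }

    h, p = kruskal(*groups.values())
    print(f"Kruskal-Wallis: H = {h:.2f}, p = {p:.4f}")

    alpha_bonferroni = 0.05 / 3   # three pairwise comparisons -> 0.0167
    for (name_a, a), (name_b, b) in combinations(groups.items(), 2):
        u, p_pair = mannwhitneyu(a, b, alternative="two-sided")
        flag = "significant" if p_pair < alpha_bonferroni else "ns"
        print(f"{name_a} vs {name_b}: U = {u:.1f}, p = {p_pair:.4f} ({flag})")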

  8. Shot noise-limited Cramér-Rao bound and algorithmic sensitivity for wavelength shifting interferometry

    NASA Astrophysics Data System (ADS)

    Chen, Shichao; Zhu, Yizheng

    2017-02-01

    Sensitivity is a critical index of the temporal fluctuation of the retrieved optical pathlength in a quantitative phase imaging system. However, an accurate and comprehensive analysis for sensitivity evaluation is still lacking in the current literature. In particular, previous theoretical studies of fundamental sensitivity based on Gaussian noise models are not applicable to modern cameras and detectors, which are dominated by shot noise. In this paper, we derive two shot noise-limited theoretical sensitivities, the Cramér-Rao bound and the algorithmic sensitivity, for wavelength shifting interferometry, which is a major category of on-axis interferometry techniques in quantitative phase imaging. Based on the derivations, we show that the shot noise-limited model permits accurate estimation of theoretical sensitivities directly from measured data. These results can provide important insights into fundamental constraints on system performance and can be used to guide system design and optimization. The same concepts can be generalized to other quantitative phase imaging techniques as well.
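
    For orientation only: the paper's wavelength-shifting derivations are not reproduced here, but the familiar shot-noise scaling for an ideal two-beam phase measurement, written in LaTeX with N the total number of detected photoelectrons and V the fringe visibility, is

        \sigma_{\phi} \;\gtrsim\; \frac{1}{V\sqrt{N}},
        \qquad
        \sigma_{L} = \frac{\lambda}{2\pi}\,\sigma_{\phi} \;\gtrsim\; \frac{\lambda}{2\pi V \sqrt{N}}
        \quad\text{with}\quad \phi = \frac{2\pi L}{\lambda},

    so the pathlength sensitivity improves as the square root of the collected photon budget; the paper's Cramér-Rao and algorithmic expressions are specific to the wavelength shifting acquisition and are not reproduced here.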

  9. Nuclear medicine and quantitative imaging research (quantitative studies in radiopharmaceutical science): Comprehensive progress report, April 1, 1986-December 31, 1988

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cooper, M.D.; Beck, R.N.

    1988-06-01

    This document describes several years of research to improve PET imaging and diagnostic techniques in man. This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. The reports in the study were processed separately for the data bases. (TEM)

  10. A preliminary study of DTI Fingerprinting on stroke analysis.

    PubMed

    Ma, Heather T; Ye, Chenfei; Wu, Jun; Yang, Pengfei; Chen, Xuhui; Yang, Zhengyi; Ma, Jingbo

    2014-01-01

    DTI (Diffusion Tensor Imaging) is a well-known MRI (Magnetic Resonance Imaging) technique which provides useful structural information about the human brain. However, quantitative measurement of the physiological variation among subtypes of ischemic stroke is not available. An automated quantitative method for DTI analysis would enhance the application of DTI in clinics. In this study, we proposed a DTI Fingerprinting technology to quantitatively analyze white matter tissue, which was applied to stroke classification. The TBSS (Tract Based Spatial Statistics) method was employed to generate masks automatically. To evaluate the clustering performance of the automatic method, lesion ROIs (Regions of Interest) were manually drawn on the DWI images as a reference. The results from DTI Fingerprinting were compared with those obtained from the reference ROIs. They indicate that DTI Fingerprinting could identify different states of ischemic stroke and has promising potential to provide a more comprehensive measure of the DTI data. Further development should be carried out to improve DTI Fingerprinting technology for clinical use.

  11. Quantitative magnetic resonance (MR) neurography for evaluation of peripheral nerves and plexus injuries

    PubMed Central

    Barousse, Rafael; Socolovsky, Mariano; Luna, Antonio

    2017-01-01

    Traumatic conditions of peripheral nerves and plexus have classically been evaluated by morphological imaging techniques and electrophysiological tests. New magnetic resonance imaging (MRI) studies based on 3D fat-suppressed techniques are providing high accuracy for peripheral nerve injury evaluation from a qualitative point of view. However, these techniques do not provide quantitative information. Diffusion weighted imaging (DWI) and diffusion tensor imaging (DTI) are functional MRI techniques that are able to evaluate and quantify the movement of water molecules within different biological structures. These techniques have been successfully applied in other anatomical areas, especially in the assessment of the central nervous system, and are now being adopted, with promising results, for peripheral nerve and plexus evaluation. DWI and DTI allow qualitative and quantitative peripheral nerve analysis, providing valuable pathophysiological information about the functional integrity of these structures. In the field of trauma and peripheral nerve or plexus injury, several parameters derived from DWI and DTI studies, such as the apparent diffusion coefficient (ADC) or fractional anisotropy (FA) among others, can be used as potential biomarkers of neural damage, providing information about fiber organization, axonal flow or myelin integrity. A proper knowledge of the physical basis of these techniques and their limitations is important for an optimal interpretation of the imaging findings and derived data. In this paper, a comprehensive review of the potential applications of DWI and DTI neurographic studies is performed with a focus on traumatic conditions, including the main nerve entrapment syndromes of both peripheral nerves and the brachial or lumbar plexus. PMID:28932698
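
    The two tensor-derived parameters named above have standard textbook definitions in terms of the diffusion tensor eigenvalues (lambda_1, lambda_2, lambda_3); in LaTeX, with MD the mean diffusivity (the tensor-based counterpart of the ADC):

        \mathrm{MD} = \frac{\lambda_1 + \lambda_2 + \lambda_3}{3},
        \qquad
        \mathrm{FA} = \sqrt{\frac{1}{2}}\,
        \sqrt{\frac{(\lambda_1-\lambda_2)^2 + (\lambda_2-\lambda_3)^2 + (\lambda_3-\lambda_1)^2}
                   {\lambda_1^2 + \lambda_2^2 + \lambda_3^2}}

    FA ranges from 0 (isotropic diffusion) to 1 (diffusion along a single direction), which is why a drop in FA is read as a loss of fiber organization or myelin integrity.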

  12. Best performing definition of accelerated knee osteoarthritis: data from the Osteoarthritis Initiative

    PubMed Central

    Driban, Jeffrey B.; Stout, Alina C.; Lo, Grace H.; Eaton, Charles B.; Price, Lori Lyn; Lu, Bing; Barbe, Mary F.; McAlindon, Timothy E.

    2016-01-01

    Background: We evaluated agreement among several definitions of accelerated knee osteoarthritis (AKOA) and construct validity by comparing their individual associations with injury, age, obesity, and knee pain. Methods: We selected knees from the Osteoarthritis Initiative that had no radiographic knee osteoarthritis [Kellgren–Lawrence (KL) 0 or 1] at baseline and had high-quality quantitative medial joint space width (JSW) measures on two or more consecutive visits (n = 1655 knees, 1143 participants). Quantitative medial JSW was based on a semi-automated method and was location specific (x = 0.25). We compared six definitions of AKOA: stringent JSW (averaged): average JSW loss greater than 1.05 mm/year over 4 years; stringent JSW (consistent): JSW loss greater than 1.05 mm/year for at least 2 years; lenient JSW (averaged): average JSW loss greater than 0.25 mm/year over 4 years; lenient JSW (consistent): JSW loss greater than 0.25 mm/year for at least 2 years; comprehensive KL based: progression from no radiographic osteoarthritis to advanced-stage osteoarthritis (KL 3 or 4; development of a definite osteophyte and joint space narrowing) within 4 years; and lenient KL based: an increase of at least two KL grades within 4 years. Results: Over 4 years the incidence rate of AKOA was 0.4%, 0.8%, 15.5%, 22.1%, 12.4%, and 7.2% based on the stringent JSW (averaged and consistent), lenient JSW (averaged and consistent), lenient KL-based, and comprehensive KL-based definitions. All but one knee that met the stringent JSW definition also met the comprehensive KL-based definition. There was fair to substantial agreement between the lenient JSW (averaged), lenient KL-based, and comprehensive KL-based definitions. A comprehensive KL-based definition led to larger effect sizes for injury, age, body mass index, and average pain over 4 years. Conclusions: A comprehensive KL-based definition of AKOA may be ideal because it represents a broader definition of joint deterioration compared with those focused on just joint space or osteophytes alone. PMID:27721902
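
    A minimal Python sketch of the four JSW-based definitions quoted above, using the stated thresholds (1.05 and 0.25 mm/year). The yearly-loss series is hypothetical, the abstract does not say whether the "at least 2 years" must be consecutive (any two years are counted here), and the KL-based definitions, which need radiographic grades, are not coded.

    # Classify one knee against the four JSW-loss definitions of accelerated knee
    # osteoarthritis; the example yearly losses (mm/year over four years) are made up.
    def jsw_definitions(yearly_loss_mm):
        average = sum(yearly_loss_mm) / len(yearly_loss_mm)

        def years_above(threshold):
            return sum(loss > threshold for loss in yearly_loss_mm)

        return {
            "stringent_averaged":   average > 1.05,
            "stringent_consistent": years_above(1.05) >= 2,
            "lenient_averaged":     average > 0.25,
            "lenient_consistent":   years_above(0.25) >= 2,
        }

    print(jsw_definitions([0.30, 0.40, 1.20, 1.10]))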

  13. Utility of Gram Staining for Evaluation of the Quality of Cystic Fibrosis Sputum Samples

    PubMed Central

    Nair, Bindu; Stapp, Jenny; Stapp, Lynn; Bugni, Linda; Van Dalfsen, Jill; Burns, Jane L.

    2002-01-01

    The microscopic examination of Gram-stained sputum specimens is very helpful in the evaluation of patients with community-acquired pneumonia and has also been recommended for use in cystic fibrosis (CF) patients. This study was undertaken to evaluate that recommendation. One hundred one sputum samples from CF patients were cultured for gram-negative bacilli and examined by Gram staining for both sputum adequacy (using the quality [Q] score) and bacterial morphology. Subjective evaluation of adequacy was also performed and categorized. Based on Q score evaluation, 41% of the samples would have been rejected despite a subjective appearance of purulence. Only three of these rejected samples were culture negative for gram-negative CF pathogens. Correlation between culture results and quantitative Gram stain examination was also poor. These data suggest that subjective evaluation combined with comprehensive bacteriology is superior to Gram staining in identifying pathogens in CF sputum. PMID:12149331

  14. Utility of gram staining for evaluation of the quality of cystic fibrosis sputum samples.

    PubMed

    Nair, Bindu; Stapp, Jenny; Stapp, Lynn; Bugni, Linda; Van Dalfsen, Jill; Burns, Jane L

    2002-08-01

    The microscopic examination of Gram-stained sputum specimens is very helpful in the evaluation of patients with community-acquired pneumonia and has also been recommended for use in cystic fibrosis (CF) patients. This study was undertaken to evaluate that recommendation. One hundred one sputum samples from CF patients were cultured for gram-negative bacilli and examined by Gram staining for both sputum adequacy (using the quality [Q] score) and bacterial morphology. Subjective evaluation of adequacy was also performed and categorized. Based on Q score evaluation, 41% of the samples would have been rejected despite a subjective appearance of purulence. Only three of these rejected samples were culture negative for gram-negative CF pathogens. Correlation between culture results and quantitative Gram stain examination was also poor. These data suggest that subjective evaluation combined with comprehensive bacteriology is superior to Gram staining in identifying pathogens in CF sputum.

  15. Looking at CIRC through Quantitative Lenses: Can It Improve the Reading Comprehension of Filipino ESL Learners?

    ERIC Educational Resources Information Center

    Gonzales, Wilkinson Daniel Wong; Torres, Patrisha Lliane

    2015-01-01

    Several pressing issues in reading instruction have prompted educators world-wide to conduct researches on how to best improve reading comprehension skills. Although there is a wealth of reading comprehension researches done with EFL learners and native English speakers, there seems to be limited published studies conducted in the Philippine ESL…

  16. Effect of Music on Reading Comprehension of Junior High School Students

    ERIC Educational Resources Information Center

    Anderson, Stacey A.; Fuller, Gerald B.

    2010-01-01

    This quantitative study was an investigation of the effect of lyrical music on reading comprehension by adolescents. Existing research has produced results that range from concluding such distraction may be detrimental to finding it could be helpful. The reading comprehension subtest of the Gates-MacGinitie Reading Tests, 4th edition (MacGinitie,…

  17. Spectral Domain Optical Coherence Tomography in Glaucoma: Qualitative and Quantitative Analysis of the Optic Nerve Head and Retinal Nerve Fiber Layer (An AOS Thesis)

    PubMed Central

    Chen, Teresa C.

    2009-01-01

    Purpose: To demonstrate that video-rate spectral domain optical coherence tomography (SDOCT) can qualitatively and quantitatively evaluate optic nerve head (ONH) and retinal nerve fiber layer (RNFL) glaucomatous structural changes. To correlate quantitative SDOCT parameters with disc photography and visual fields. Methods: SDOCT images from 4 glaucoma eyes (4 patients) with varying stages of open-angle glaucoma (i.e., early, moderate, late) were qualitatively contrasted with 2 age-matched normal eyes (2 patients). Of 61 other consecutive patients recruited in an institutional setting, 53 eyes (33 patients) met inclusion/exclusion criteria for quantitative studies. Images were obtained using two experimental SDOCT systems, one utilizing a superluminescent diode and the other a titanium:sapphire laser source, with axial resolutions of about 6 μm and 3 μm, respectively. Results: Classic glaucomatous ONH and RNFL structural changes were seen in SDOCT images. An SDOCT reference plane 139 μm above the retinal pigment epithelium yielded cup-disc ratios that best correlated with masked physician disc photography cup-disc ratio assessments. The minimum distance band, a novel SDOCT neuroretinal rim parameter, showed good correlation with physician cup-disc ratio assessments, visual field mean deviation, and pattern standard deviation (P value range, .0003–.024). RNFL and retinal thickness maps correlated well with disc photography and visual field testing. Conclusions: To our knowledge, this thesis presents the first comprehensive qualitative and quantitative evaluation of SDOCT images of the ONH and RNFL in glaucoma. This pilot study provides a basis for developing more automated quantitative SDOCT-specific glaucoma algorithms needed for future prospective multicenter national trials. PMID:20126502

  18. [Current situation of toxicity classification of Chinese materia medica and its research thoughts].

    PubMed

    Sun, Wenyan; Hou, Xiujuan; Wang, Bin; Zhu, Yuelan; Zhang, Shuofeng; Chang, Hongsheng; Sun, Jianning

    2012-08-01

    Toxicity of Chinese materia medica (CMM) is an important part of Chinese herbal nature theory. In clinical application, the dosage, time limitation, and compatibility of CMM are mainly determined by toxicity. At present, there is no uniform toxicity classification standard for the evaluation of Chinese herbal toxicity. Therefore, research on the toxicity classification of CMM is significant. The current situation of toxicity classification of CMM is reviewed in this paper, and the proposed research thoughts are as follows: the measurement of toxicity parameters, the confirmation of poisoning target organs, the investigation of toxic mechanisms by serum pharmacology and toxicokinetics, and the comprehensive evaluation of toxicity based on quantitative theory.

  19. Exploring Techniques for Vision Based Human Activity Recognition: Methods, Systems, and Evaluation

    PubMed Central

    Xu, Xin; Tang, Jinshan; Zhang, Xiaolong; Liu, Xiaoming; Zhang, Hong; Qiu, Yimin

    2013-01-01

    With the wide applications of vision based intelligent systems, image and video analysis technologies have attracted the attention of researchers in the computer vision field. In image and video analysis, human activity recognition is an important research direction. By interpreting and understanding human activities, we can recognize and predict the occurrence of crimes and help the police or other agencies react immediately. In the past, a large number of papers have been published on human activity recognition in video and image sequences. In this paper, we provide a comprehensive survey of the recent development of the techniques, including methods, systems, and quantitative evaluation of the performance of human activity recognition. PMID:23353144

  20. Systems-Level Analysis of Innate Immunity

    PubMed Central

    Zak, Daniel E.; Tam, Vincent C.; Aderem, Alan

    2014-01-01

    Systems-level analysis of biological processes strives to comprehensively and quantitatively evaluate the interactions between the relevant molecular components over time, thereby enabling development of models that can be employed to ultimately predict behavior. Rapid development in measurement technologies (omics), when combined with the accessible nature of the cellular constituents themselves, is allowing the field of innate immunity to take significant strides toward this lofty goal. In this review, we survey exciting results derived from systems biology analyses of the immune system, ranging from gene regulatory networks to influenza pathogenesis and systems vaccinology. PMID:24655298

  1. Uniform modeling of bacterial colony patterns with varying nutrient and substrate

    NASA Astrophysics Data System (ADS)

    Schwarcz, Deborah; Levine, Herbert; Ben-Jacob, Eshel; Ariel, Gil

    2016-04-01

    Bacteria develop complex patterns depending on growth conditions. For example, Bacillus subtilis exhibits five different patterns depending on substrate hardness and nutrient concentration. We present a unified integro-differential model that reproduces the entire experimentally observed morphology diagram at varying nutrient concentrations and substrate hardness. The model allows a comprehensive and quantitative comparison between experimental and numerical variables and parameters, such as colony growth rate, nutrient concentration and diffusion constants. As a result, the role of the different physical mechanisms underlying and regulating the growth of the colony can be evaluated.

  2. Etiologic effects and optimal intakes of foods and nutrients for risk of cardiovascular diseases and diabetes: Systematic reviews and meta-analyses from the Nutrition and Chronic Diseases Expert Group (NutriCoDE)

    PubMed Central

    Peñalvo, Jose L.; Khatibzadeh, Shahab; Singh, Gitanjali M.; Rao, Mayuree; Fahimi, Saman; Powles, John; Mozaffarian, Dariush

    2017-01-01

    Background Dietary habits are major contributors to coronary heart disease, stroke, and diabetes. However, comprehensive evaluation of etiologic effects of dietary factors on cardiometabolic outcomes, their quantitative effects, and corresponding optimal intakes are not well-established. Objective To systematically review the evidence for effects of dietary factors on cardiometabolic diseases, including comprehensive assessment of evidence for causality; estimation of magnitudes of etiologic effects; evaluation of heterogeneity and potential for bias in these etiologic effects; and determination of optimal population intake levels. Methods We utilized Bradford-Hill criteria to assess probable or convincing evidence for causal effects of multiple diet-cardiometabolic disease relationships. Etiologic effects were quantified from published or de novo meta-analyses of prospective studies or randomized clinical trials, incorporating standardized units, dose-response estimates, and heterogeneity by age and other characteristics. Potential for bias was assessed in validity analyses. Optimal intakes were determined by levels associated with lowest disease risk. Results We identified 10 foods and 7 nutrients with evidence for causal cardiometabolic effects, including protective effects of fruits, vegetables, beans/legumes, nuts/seeds, whole grains, fish, yogurt, fiber, seafood omega-3s, polyunsaturated fats, and potassium; and harms of unprocessed red meats, processed meats, sugar-sweetened beverages, glycemic load, trans-fats, and sodium. Proportional etiologic effects declined with age, but did not generally vary by sex. Established optimal population intakes were generally consistent with observed national intakes and major dietary guidelines. In validity analyses, the identified effects of individual dietary components were similar to quantified effects of dietary patterns on cardiovascular risk factors and hard endpoints. Conclusions These novel findings provide a comprehensive summary of causal evidence, quantitative etiologic effects, heterogeneity, and optimal intakes of major dietary factors for cardiometabolic diseases, informing disease impact estimation and policy planning and priorities. PMID:28448503

  3. Comprehensive Quantitative Analysis on Privacy Leak Behavior

    PubMed Central

    Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan

    2013-01-01

    Privacy information is prone to be leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue of cyber security. However, existing approaches can only qualitatively analyze privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely, possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on Privacy Petri Net (PPN). In order to compare the privacy leak behavior among different software, we further propose a comprehensive metric, namely, overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects. PMID:24066046

  4. Comprehensive quantitative analysis on privacy leak behavior.

    PubMed

    Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan

    2013-01-01

    Privacy information is prone to be leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue of cyber security. However, existing approaches can only qualitatively analyze privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely, possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on Privacy Petri Net (PPN). In order to compare the privacy leak behavior among different software, we further propose a comprehensive metric, namely, overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects.
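
    A purely illustrative Python sketch: the abstract names the four metrics and an overall leak degree but does not give the Privacy Petri Net combination rule, so the metric values, the equal weights, and the weighted-sum aggregation below are assumptions, not the paper's formulation.

    # Hypothetical aggregation of the four privacy-leak metrics into an overall leak degree.
    def overall_leak_degree(metrics, weights=None):
        """metrics: dict of possibility, severity, crypticity, manipulability in [0, 1]."""
        names = ("possibility", "severity", "crypticity", "manipulability")
        weights = weights or {name: 0.25 for name in names}
        return sum(weights[name] * metrics[name] for name in names)

    software_samples = {
        "app_A": {"possibility": 0.8, "severity": 0.6, "crypticity": 0.4, "manipulability": 0.7},
        "app_B": {"possibility": 0.3, "severity": 0.2, "crypticity": 0.9, "manipulability": 0.1},
    }
    for name, metrics in software_samples.items():
        print(name, round(overall_leak_degree(metrics), 3))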

  5. A study on reliability of power customer in distribution network

    NASA Astrophysics Data System (ADS)

    Liu, Liyuan; Ouyang, Sen; Chen, Danling; Ma, Shaohua; Wang, Xin

    2017-05-01

    The existing power supply reliability index system is oriented toward the power system and does not consider the actual availability of electricity on the customer side. In addition, it cannot reflect outages or customer equipment shutdowns caused by instantaneous interruptions and power quality problems. This paper therefore presents a systematic study of the reliability of power customers. By comparison with power supply reliability, customer reliability is defined and its evaluation requirements are identified. An index system, consisting of seven customer indexes and two contrast indexes, is designed to describe customer reliability in terms of continuity and availability. In order to comprehensively and quantitatively evaluate customer reliability in distribution networks, a reliability evaluation method is proposed based on an improved entropy method and the punishment weighting principle. Practical application has shown that the proposed reliability index system and evaluation method are reasonable and effective.
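
    The entropy method mentioned above is a standard objective-weighting technique; the Python sketch below shows the usual entropy-weight calculation on a hypothetical customer-indicator matrix. The paper's specific improvement to the method and its punishment weighting principle are not described in the abstract and are not reproduced.

    import numpy as np

    # Rows: evaluation objects (e.g., customers or feeders); columns: reliability
    # indicators, already oriented so that larger is better. Values are hypothetical.
    X = np.array([
        [0.98, 0.95, 120.0],
        [0.92, 0.90,  80.0],
        [0.99, 0.97, 150.0],
    ])

    P = X / X.sum(axis=0)                                       # 1. column-wise proportions
    n = X.shape[0]
    logs = np.where(P > 0, np.log(np.where(P > 0, P, 1.0)), 0.0)
    E = -(P * logs).sum(axis=0) / np.log(n)                     # 2. entropy of each indicator
    d = 1.0 - E                                                 # 3. degree of divergence
    w = d / d.sum()                                             #    entropy weights
    scores = (P * w).sum(axis=1)                                # 4. composite score per object

    print("weights:", np.round(w, 3))
    print("scores:", np.round(scores, 3))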

  6. Geographical classification of Epimedium based on HPLC fingerprint analysis combined with multi-ingredients quantitative analysis.

    PubMed

    Xu, Ning; Zhou, Guofu; Li, Xiaojuan; Lu, Heng; Meng, Fanyun; Zhai, Huaqiang

    2017-05-01

    A reliable and comprehensive method for identifying the origin and assessing the quality of Epimedium has been developed. The method is based on analysis of HPLC fingerprints, combined with similarity analysis, hierarchical cluster analysis (HCA), principal component analysis (PCA) and multi-ingredient quantitative analysis. Nineteen batches of Epimedium, collected from different areas in the western regions of China, were used to establish the fingerprints and 18 peaks were selected for the analysis. Similarity analysis, HCA and PCA all classified the 19 areas into three groups. Simultaneous quantification of the five major bioactive ingredients in the Epimedium samples was also carried out to confirm the consistency of the quality tests. These methods were successfully used to identify the geographical origin of the Epimedium samples and to evaluate their quality. Copyright © 2016 John Wiley & Sons, Ltd.
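
    A minimal Python sketch of the chemometric steps named above (hierarchical cluster analysis and principal component analysis on a batch-by-peak area matrix); the 19 x 18 matrix is randomly generated here in place of the real fingerprints, and the three-group structure is imposed only so the example runs end to end.

    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    # 19 hypothetical batches x 18 common fingerprint peaks (relative peak areas).
    peak_areas = np.vstack([
        rng.normal(loc=center, scale=0.1, size=(n, 18))
        for center, n in [(1.0, 7), (1.5, 6), (2.2, 6)]
    ])

    X = StandardScaler().fit_transform(peak_areas)

    Z = linkage(X, method="ward")                        # hierarchical cluster analysis
    hca_groups = fcluster(Z, t=3, criterion="maxclust")  # cut the tree into three groups

    pca_scores = PCA(n_components=2).fit_transform(X)    # 2-D overview for plotting

    print("HCA group labels:", hca_groups)
    print("PCA score matrix shape:", pca_scores.shape)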

  7. Evaluation of a method based on liquid chromatography-diode array detector-tandem mass spectrometry for a rapid and comprehensive characterization of the fat-soluble vitamin and carotenoid profile of selected plant foods.

    PubMed

    Gentili, Alessandra; Caretti, Fulvia

    2011-02-04

    The feasibility of using reversed-phase liquid chromatography/diode array/tandem mass spectrometry (LC-DAD-MS/MS) for a rapid and comprehensive profiling of fat-soluble vitamins and pigments in some foods of plant origin (maize flour, green and golden kiwi) was evaluated. The instrumental approach was planned for obtaining two main outcomes within the same chromatographic run: (i) the quantitative analysis of ten target analytes, whose standards are commercially available; (ii) the screening of pigments occurring in the selected matrices. The quantitative analysis was performed simultaneously for four carotenoids (lutein, zeaxanthin, β-cryptoxanthin, and β-carotene) and six compounds with fat-soluble activity (α-tocopherol, δ-tocopherol, γ-tocopherol, ergocalciferol, phylloquinone and menaquinone-4), separated on a C30 reversed-phase column and detected by atmospheric pressure chemical ionization (APCI) tandem mass spectrometry, operating in Selected Reaction Monitoring (SRM) mode. The extraction procedure was based on matrix solid-phase dispersion, with recoveries of all compounds under study exceeding 78 and 60% from maize flour and kiwi, respectively. The method's intra-day precision ranged between 3 and 7%, while the inter-day precision was below 12%. The mild isolation conditions precluded artefact creation, such as cis-isomerization phenomena for carotenoids. During the quantitative LC-SRM determination of the ten target analytes, the identification power of the diode array detector combined with that of the triple quadrupole (QqQ) allowed the tentative identification of several pigments (chlorophylls and carotenoids), without the aid of standards, on the basis of: (i) the UV-vis spectra recorded in the range of 200-700 nm; (ii) the expected retention time; (iii) the two SRM transitions, chosen for the target carotenoids but also common to many of the isomeric carotenoids occurring in the selected foods. Copyright © 2010 Elsevier B.V. All rights reserved.

  8. Design of an impact evaluation using a mixed methods model--an explanatory assessment of the effects of results-based financing mechanisms on maternal healthcare services in Malawi.

    PubMed

    Brenner, Stephan; Muula, Adamson S; Robyn, Paul Jacob; Bärnighausen, Till; Sarker, Malabika; Mathanga, Don P; Bossert, Thomas; De Allegri, Manuela

    2014-04-22

    In this article we present a study design to evaluate the causal impact of providing supply-side performance-based financing incentives in combination with a demand-side cash transfer component on equitable access to and quality of maternal and neonatal healthcare services. This intervention is introduced to selected emergency obstetric care facilities and catchment area populations in four districts in Malawi. We here describe and discuss our study protocol with regard to the research aims, the local implementation context, and our rationale for selecting a mixed methods explanatory design with a quasi-experimental quantitative component. The quantitative research component consists of a controlled pre- and post-test design with multiple post-test measurements. This allows us to quantitatively measure 'equitable access to healthcare services' at the community level and 'healthcare quality' at the health facility level. Guided by a theoretical framework of causal relationships, we determined a number of input, process, and output indicators to evaluate both intended and unintended effects of the intervention. Overall causal impact estimates will result from a difference-in-difference analysis comparing selected indicators across intervention and control facilities/catchment populations over time.To further explain heterogeneity of quantitatively observed effects and to understand the experiential dimensions of financial incentives on clients and providers, we designed a qualitative component in line with the overall explanatory mixed methods approach. This component consists of in-depth interviews and focus group discussions with providers, service user, non-users, and policy stakeholders. In this explanatory design comprehensive understanding of expected and unexpected effects of the intervention on both access and quality will emerge through careful triangulation at two levels: across multiple quantitative elements and across quantitative and qualitative elements. Combining a traditional quasi-experimental controlled pre- and post-test design with an explanatory mixed methods model permits an additional assessment of organizational and behavioral changes affecting complex processes. Through this impact evaluation approach, our design will not only create robust evidence measures for the outcome of interest, but also generate insights on how and why the investigated interventions produce certain intended and unintended effects and allows for a more in-depth evaluation approach.
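
    The overall impact estimate described above has the usual difference-in-difference form; writing Y-bar for the mean of an outcome indicator in intervention (I) or control (C) facilities/catchment populations before and after the program, the estimator in LaTeX is

        \hat{\delta}_{\mathrm{DiD}}
        = \left(\bar{Y}^{\mathrm{post}}_{I} - \bar{Y}^{\mathrm{pre}}_{I}\right)
        - \left(\bar{Y}^{\mathrm{post}}_{C} - \bar{Y}^{\mathrm{pre}}_{C}\right),

    equivalently the coefficient \delta on the interaction term in the regression
    Y_{it} = \beta_0 + \beta_1\,\mathrm{Treat}_i + \beta_2\,\mathrm{Post}_t + \delta\,(\mathrm{Treat}_i \times \mathrm{Post}_t) + \varepsilon_{it}.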

  9. A comprehensive review on the quasi-induced exposure technique.

    PubMed

    Jiang, Xinguo; Lyles, Richard W; Guo, Runhua

    2014-04-01

    The goal is to comprehensively examine the state-of-the-art applications and methodological development of quasi-induced exposure and consequently pinpoint the future research directions in terms of implementation guidelines, limitations, and validity tests. The paper conducts a comprehensive review on approximately 45 published papers relevant to quasi-induced exposure regarding four key topics of interest: applications, responsibility assignment, validation of assumptions, and methodological development. Specific findings include that: (1) there is no systematic data screening procedure in place and how the eliminated crash data will impact the responsibility assignment is generally unknown; (2) there is a lack of necessary efforts to assess the validity of assumptions prior to its application and the validation efforts are mostly restricted to the aggregated levels due to the limited availability of exposure truth; and (3) there is a deficiency of quantitative analyses to evaluate the magnitude and directions of bias as a result of injury risks and crash avoidance ability. The paper points out the future research directions and insights in terms of the validity tests and implementation guidelines. Copyright © 2013 Elsevier Ltd. All rights reserved.
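
    As background (this is the standard formulation of the technique, not a result of the review): quasi-induced exposure treats not-at-fault drivers in two-vehicle crashes as a proxy for exposure, and the relative accident involvement ratio for driver group i is commonly written in LaTeX as

        R_i = \frac{A_i / \sum_j A_j}{N_i / \sum_j N_j},

    where A_i and N_i are the counts of at-fault (responsible) and not-at-fault drivers of group i; R_i > 1 suggests over-involvement relative to the group's estimated exposure.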

  10. Identification of reference genes for quantitative expression analysis using large-scale RNA-seq data of Arabidopsis thaliana and model crop plants.

    PubMed

    Kudo, Toru; Sasaki, Yohei; Terashima, Shin; Matsuda-Imai, Noriko; Takano, Tomoyuki; Saito, Misa; Kanno, Maasa; Ozaki, Soichi; Suwabe, Keita; Suzuki, Go; Watanabe, Masao; Matsuoka, Makoto; Takayama, Seiji; Yano, Kentaro

    2016-10-13

    In quantitative gene expression analysis, normalization using a reference gene as an internal control is frequently performed for appropriate interpretation of the results. Efforts have been devoted to exploring superior novel reference genes using microarray transcriptomic data and to evaluating commonly used reference genes by targeting analysis. However, because the number of specifically detectable genes is totally dependent on probe design in the microarray analysis, exploration using microarray data may miss some of the best choices for the reference genes. Recently emerging RNA sequencing (RNA-seq) provides an ideal resource for comprehensive exploration of reference genes since this method is capable of detecting all expressed genes, in principle including even unknown genes. We report the results of a comprehensive exploration of reference genes using public RNA-seq data from plants such as Arabidopsis thaliana (Arabidopsis), Glycine max (soybean), Solanum lycopersicum (tomato) and Oryza sativa (rice). To select reference genes suitable for the broadest experimental conditions possible, candidates were surveyed by the following four steps: (1) evaluation of the basal expression level of each gene in each experiment; (2) evaluation of the expression stability of each gene in each experiment; (3) evaluation of the expression stability of each gene across the experiments; and (4) selection of top-ranked genes, after ranking according to the number of experiments in which the gene was expressed stably. Employing this procedure, 13, 10, 12 and 21 top candidates for reference genes were proposed in Arabidopsis, soybean, tomato and rice, respectively. Microarray expression data confirmed that the expression of the proposed reference genes under broad experimental conditions was more stable than that of commonly used reference genes. These novel reference genes will be useful for analyzing gene expression profiles across experiments carried out under various experimental conditions.
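
    A minimal Python sketch of the four-step survey described above, run on hypothetical per-experiment expression matrices; the basal-expression and stability cutoffs are placeholders (the abstract does not state the thresholds), and steps 2-3 are collapsed into a single within-experiment coefficient-of-variation check for brevity.

    import numpy as np

    rng = np.random.default_rng(1)
    genes = [f"gene_{i}" for i in range(50)]
    # Hypothetical RNA-seq expression (e.g., TPM): one genes-by-samples matrix per experiment.
    experiments = {f"exp_{k}": rng.lognormal(mean=3.0, sigma=0.5, size=(len(genes), 6))
                   for k in range(5)}

    MIN_EXPRESSION = 5.0   # step 1: basal expression cutoff (placeholder)
    MAX_CV = 0.25          # steps 2-3: stability cutoff as coefficient of variation (placeholder)

    stable_counts = dict.fromkeys(genes, 0)
    for expr in experiments.values():
        means = expr.mean(axis=1)
        cvs = expr.std(axis=1) / means
        for gene, mean, cv in zip(genes, means, cvs):
            # count the experiment if the gene is both expressed and stable in it
            if mean >= MIN_EXPRESSION and cv <= MAX_CV:
                stable_counts[gene] += 1

    # step 4: rank genes by the number of experiments in which they were stably expressed
    candidates = sorted(stable_counts.items(), key=lambda kv: -kv[1])[:10]
    print(candidates)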

  11. Exact comprehensive equations for the photon management properties of silicon nanowire

    PubMed Central

    Li, Yingfeng; Li, Meicheng; Li, Ruike; Fu, Pengfei; Wang, Tai; Luo, Younan; Mbengue, Joseph Michel; Trevor, Mwenya

    2016-01-01

    Unique photon management (PM) properties of silicon nanowire (SiNW) make it an attractive building block for a host of nanowire photonic devices including photodetectors, chemical and gas sensors, waveguides, optical switches, solar cells, and lasers. However, the lack of efficient equations for the quantitative estimation of the SiNW’s PM properties limits the rational design of such devices. Herein, we establish comprehensive equations to evaluate several important performance features for the PM properties of SiNW, based on theoretical simulations. Firstly, the relationships between the resonant wavelengths (RW), where SiNW can harvest light most effectively, and the size of SiNW are formulated. Then, equations for the light-harvesting efficiency at RW, which determines the single-frequency performance limit of SiNW-based photonic devices, are established. Finally, equations for the light-harvesting efficiency of SiNW over the full spectrum, which are of great significance in photovoltaics, are established. Furthermore, using these equations, we have derived four extra formulas to estimate the optimal size of SiNW for light harvesting. These equations can reproduce the majority of the reported experimental and theoretical results with deviations of only ~5%. Our study fills a gap in quantitatively predicting the SiNW’s PM properties, which will contribute significantly to its practical applications. PMID:27103087

  12. Comprehensive and quantitative proteomic analyses of zebrafish plasma reveals conserved protein profiles between genders and between zebrafish and human.

    PubMed

    Li, Caixia; Tan, Xing Fei; Lim, Teck Kwang; Lin, Qingsong; Gong, Zhiyuan

    2016-04-13

    Omic approaches have been increasingly used in the zebrafish model for holistic understanding of molecular events and mechanisms of tissue functions. However, plasma is rarely used for omic profiling because of the technical challenges in collecting sufficient blood. In this study, we employed two mass spectrometric (MS) approaches for a comprehensive characterization of the zebrafish plasma proteome, i.e., conventional shotgun liquid chromatography-tandem mass spectrometry (LC-MS/MS) for an overview study and quantitative SWATH (Sequential Window Acquisition of all THeoretical fragment-ion spectra) for comparison between genders. A total of 959 proteins were identified in the shotgun profiling, with estimated concentrations spanning almost five orders of magnitude. Other than the presence of a few highly abundant female egg yolk precursor proteins (vitellogenins), the proteomic profiles of male and female plasmas were very similar in both number and abundance, and there were basically no other highly gender-biased proteins. The types of plasma proteins based on IPA (Ingenuity Pathway Analysis) classification and tissue sources of production were also very similar. Furthermore, the zebrafish plasma proteome shares significant similarities with the human plasma proteome, in particular among the most abundant proteins, including apolipoproteins and complements. Thus, the current study provided a valuable dataset for future evaluation of plasma proteins in zebrafish.

  13. Comprehensive and quantitative proteomic analyses of zebrafish plasma reveals conserved protein profiles between genders and between zebrafish and human

    PubMed Central

    Li, Caixia; Tan, Xing Fei; Lim, Teck Kwang; Lin, Qingsong; Gong, Zhiyuan

    2016-01-01

    Omic approaches have been increasingly used in the zebrafish model for holistic understanding of molecular events and mechanisms of tissue functions. However, plasma is rarely used for omic profiling because of the technical challenges in collecting sufficient blood. In this study, we employed two mass spectrometric (MS) approaches for a comprehensive characterization of the zebrafish plasma proteome, i.e., conventional shotgun liquid chromatography-tandem mass spectrometry (LC-MS/MS) for an overview study and quantitative SWATH (Sequential Window Acquisition of all THeoretical fragment-ion spectra) for comparison between genders. A total of 959 proteins were identified in the shotgun profiling, with estimated concentrations spanning almost five orders of magnitude. Other than the presence of a few highly abundant female egg yolk precursor proteins (vitellogenins), the proteomic profiles of male and female plasmas were very similar in both number and abundance, and there were basically no other highly gender-biased proteins. The types of plasma proteins based on IPA (Ingenuity Pathway Analysis) classification and tissue sources of production were also very similar. Furthermore, the zebrafish plasma proteome shares significant similarities with the human plasma proteome, in particular among the most abundant proteins, including apolipoproteins and complements. Thus, the current study provided a valuable dataset for future evaluation of plasma proteins in zebrafish. PMID:27071722

  14. Late paleozoic fusulinoidean gigantism driven by atmospheric hyperoxia.

    PubMed

    Payne, Jonathan L; Groves, John R; Jost, Adam B; Nguyen, Thienan; Moffitt, Sarah E; Hill, Tessa M; Skotheim, Jan M

    2012-09-01

    Atmospheric hyperoxia, with pO(2) in excess of 30%, has long been hypothesized to account for late Paleozoic (360-250 million years ago) gigantism in numerous higher taxa. However, this hypothesis has not been evaluated statistically because comprehensive size data have not been compiled previously at sufficient temporal resolution to permit quantitative analysis. In this study, we test the hyperoxia-gigantism hypothesis by examining the fossil record of fusulinoidean foraminifers, a dramatic example of protistan gigantism with some individuals exceeding 10 cm in length and exceeding their relatives by six orders of magnitude in biovolume. We assembled and examined comprehensive regional and global, species-level datasets containing 270 and 1823 species, respectively. A statistical model of size evolution forced by atmospheric pO(2) is conclusively favored over alternative models based on random walks or a constant tendency toward size increase. Moreover, the ratios of volume to surface area in the largest fusulinoideans are consistent in magnitude and trend with a mathematical model based on oxygen transport limitation. We further validate the hyperoxia-gigantism model through an examination of modern foraminiferal species living along a measured gradient in oxygen concentration. These findings provide the first quantitative confirmation of a direct connection between Paleozoic gigantism and atmospheric hyperoxia. © 2012 The Author(s). Evolution© 2012 The Society for the Study of Evolution.
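
    A purely illustrative Python sketch of the model-comparison logic (an O2-forced trend versus an unbiased random walk), using synthetic numbers and a simple AIC comparison on per-step size changes; it is not the evolutionary model actually fitted in the paper.

    import numpy as np

    rng = np.random.default_rng(2)
    # Synthetic series: atmospheric pO2 (%) and mean log10 test size per time bin.
    po2 = np.linspace(21, 32, 40) + rng.normal(0, 0.5, 40)
    log_size = 0.08 * po2 + rng.normal(0, 0.1, 40)

    # Work on per-step changes so both models describe the same quantity.
    d_size, d_po2 = np.diff(log_size), np.diff(po2)
    n = len(d_size)

    def aic(rss, k):
        return n * np.log(rss / n) + 2 * k

    # Model 1: size change forced by the change in pO2 (ordinary least squares).
    slope, intercept = np.polyfit(d_po2, d_size, 1)
    rss_forced = np.sum((d_size - (slope * d_po2 + intercept)) ** 2)

    # Model 2: random walk, i.e., changes scatter around a constant drift.
    rss_walk = np.sum((d_size - d_size.mean()) ** 2)

    print("AIC, pO2-forced model:", round(aic(rss_forced, 2), 1))
    print("AIC, random-walk model:", round(aic(rss_walk, 1), 1))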

  15. Advances in metabolome information retrieval: turning chemistry into biology. Part I: analytical chemistry of the metabolome.

    PubMed

    Tebani, Abdellah; Afonso, Carlos; Bekri, Soumeya

    2018-05-01

    Metabolites are small molecules produced by enzymatic reactions in a given organism. Metabolomics or metabolic phenotyping is a well-established omics aimed at comprehensively assessing metabolites in biological systems. These comprehensive analyses use analytical platforms, mainly nuclear magnetic resonance spectroscopy and mass spectrometry, along with associated separation methods to gather qualitative and quantitative data. Metabolomics holistically evaluates biological systems in an unbiased, data-driven approach that may ultimately support generation of hypotheses. The approach inherently allows the molecular characterization of a biological sample with regard to both internal (genetics) and environmental (exosome, microbiome) influences. Metabolomics workflows are based on whether the investigator knows a priori what kind of metabolites to assess. Thus, a targeted metabolomics approach is defined as a quantitative analysis (absolute concentrations are determined) or a semiquantitative analysis (relative intensities are determined) of a set of metabolites that are possibly linked to common chemical classes or a selected metabolic pathway. An untargeted metabolomics approach is a semiquantitative analysis of the largest possible number of metabolites contained in a biological sample. This is part I of a review intending to give an overview of the state of the art of major metabolic phenotyping technologies. Furthermore, their inherent analytical advantages and limits regarding experimental design, sample handling, standardization and workflow challenges are discussed.

  16. A combined impact-process evaluation of a program promoting active transport to school: understanding the factors that shaped program effectiveness.

    PubMed

    Crawford, S; Garrard, J

    2013-01-01

    This mixed methods study was a comprehensive impact-process evaluation of the Ride2School program in metropolitan and regional areas in Victoria, Australia. The program aimed to promote transport to school for primary school children. Qualitative and quantitative data were collected at baseline and followup from two primary schools involved in the pilot phase of the program and two matched comparison schools, and a further 13 primary schools that participated in the implementation phase of the program. Classroom surveys, structured and unstructured observations, and interviews with Ride2School program staff were used to evaluate the pilot program. For the 13 schools in the second phase of the program, parents and students completed questionnaires at baseline (N = 889) and followup (N = 761). Based on the quantitative data, there was little evidence of an overall increase in active transport to school across participating schools, although impacts varied among individual schools. Qualitative data in the form of observations, interviews, and focus group discussions with students, school staff, and program staff provided insight into the reasons for variable program impacts. This paper highlights the benefits of undertaking a mixed methods approach to evaluating active transport to school programs that enables both measurement and understanding of program impacts.

  17. A Combined Impact-Process Evaluation of a Program Promoting Active Transport to School: Understanding the Factors That Shaped Program Effectiveness

    PubMed Central

    Crawford, S.; Garrard, J.

    2013-01-01

    This mixed methods study was a comprehensive impact-process evaluation of the Ride2School program in metropolitan and regional areas in Victoria, Australia. The program aimed to promote transport to school for primary school children. Qualitative and quantitative data were collected at baseline and followup from two primary schools involved in the pilot phase of the program and two matched comparison schools, and a further 13 primary schools that participated in the implementation phase of the program. Classroom surveys, structured and unstructured observations, and interviews with Ride2School program staff were used to evaluate the pilot program. For the 13 schools in the second phase of the program, parents and students completed questionnaires at baseline (N = 889) and followup (N = 761). Based on the quantitative data, there was little evidence of an overall increase in active transport to school across participating schools, although impacts varied among individual schools. Qualitative data in the form of observations, interviews, and focus group discussions with students, school staff, and program staff provided insight into the reasons for variable program impacts. This paper highlights the benefits of undertaking a mixed methods approach to evaluating active transport to school programs that enables both measurement and understanding of program impacts. PMID:23606865

  18. Clinical Pedodontics: An Approach Based on Comprehensive Care.

    ERIC Educational Resources Information Center

    And Others; Bennett, Carroll G.

    1981-01-01

    The University of Florida uses a comprehensive care system to teach clinical pedodontics. Several block clinics permit further experience with children. Details of the program are described, and quantitative results of patient treatment are compared with those of other clinical pedodontics programs. (MSE)

  19. Evaluation of patient centered medical home practice transformation initiatives.

    PubMed

    Crabtree, Benjamin F; Chase, Sabrina M; Wise, Christopher G; Schiff, Gordon D; Schmidt, Laura A; Goyzueta, Jeanette R; Malouin, Rebecca A; Payne, Susan M C; Quinn, Michael T; Nutting, Paul A; Miller, William L; Jaén, Carlos Roberto

    2011-01-01

    The patient-centered medical home (PCMH) has become a widely cited solution to the deficiencies in primary care delivery in the United States. To achieve the magnitude of change being called for in primary care, quality improvement interventions must focus on whole-system redesign, and not just isolated parts of medical practices. Investigators participating in 9 different evaluations of Patient Centered Medical Home implementation shared experiences, methodological strategies, and evaluation challenges for evaluating primary care practice redesign. A year-long iterative process of sharing and reflecting on experiences produced consensus on 7 recommendations for future PCMH evaluations: (1) look critically at models being implemented and identify aspects requiring modification; (2) include embedded qualitative and quantitative data collection to detail the implementation process; (3) capture details concerning how different PCMH components interact with one another over time; (4) understand and describe how and why physician and staff roles do, or do not, evolve; (5) identify the effectiveness of individual PCMH components and how they are used; (6) capture how primary care practices interface with other entities such as specialists, hospitals, and referral services; and (7) measure resources required for initiating and sustaining innovations. Broad-based longitudinal, mixed-methods designs that provide for shared learning among practice participants, program implementers, and evaluators are necessary to evaluate the novelty and promise of the PCMH model. All PCMH evaluations should be as comprehensive as possible and, at a minimum, should include a combination of brief observations and targeted qualitative interviews along with quantitative measures.

  20. Impact of specific language impairment and type of school on different language subsystems.

    PubMed

    Puglisi, Marina Leite; Befi-Lopes, Debora Maria

    2016-01-01

    This study aimed to explore quantitative and qualitative effects of type of school and specific language impairment (SLI) on different language abilities. A total of 204 Brazilian children aged 4 to 6 years participated in the study. Children were selected to form three groups: 1) 63 typically developing children studying in private schools (TDPri); 2) 102 typically developing children studying in state schools (TDSta); and 3) 39 children with SLI studying in state schools (SLISta). All individuals were assessed regarding expressive vocabulary, number morphology and morphosyntactic comprehension. All language subsystems were vulnerable to both environmental (type of school) and biological (SLI) effects. The relationship between the three language measures was exactly the same for all groups: vocabulary growth correlated with age and with the development of morphological abilities and morphosyntactic comprehension. Children with SLI showed atypical errors in the comprehension test at the age of 4, but presented a pattern of errors that gradually resembled typical development. The effect of type of school was marked by quantitative differences, while the effect of SLI was characterised by both quantitative and qualitative differences.

  1. Required attention for synthesized speech perception for three levels of linguistic redundancy

    NASA Technical Reports Server (NTRS)

    Simpson, C. A.; Hart, S. G.

    1977-01-01

    The study evaluates the attention required for synthesized speech perception with reference to three levels of linguistic redundancy. Twelve commercial airline pilots were individually tested on 16 cockpit warning messages, eight of which consisted of two monosyllabic key words and eight of which consisted of two polysyllabic key words. Three levels of linguistic redundancy were identified: monosyllabic words, polysyllabic words, and sentences. The experiment contained a message familiarization phase and a message recognition phase. It was found that: (1) when the messages are part of a previously learned and recently heard set, and the subject is familiar with the phrasing, the attention needed to recognize the message is not a function of the level of linguistic redundancy, and (2) there is a quantitative and qualitative difference between recognition and comprehension processes; only in the case of active comprehension does additional redundancy reduce attention requirements.

  2. NDE research efforts at the FAA Center for Aviation Systems Reliability

    NASA Technical Reports Server (NTRS)

    Thompson, Donald O.; Brasche, Lisa J. H.

    1992-01-01

    The Federal Aviation Administration-Center for Aviation Systems Reliability (FAA-CASR), a part of the Institute for Physical Research and Technology at Iowa State University, began operation in the Fall of 1990 with funding from the FAA. The mission of the FAA-CASR is to develop quantitative nondestructive evaluation (NDE) methods for aircraft structures and materials including prototype instrumentation, software, techniques, and procedures and to develop and maintain comprehensive education and training programs in aviation specific inspection procedures and practices. To accomplish this mission, FAA-CASR brings together resources from universities, government, and industry to develop a comprehensive approach to problems specific to the aviation industry. The problem areas are targeted by the FAA, aviation manufacturers, the airline industry and other members of the aviation business community. This consortium approach ensures that the focus of the efforts is on relevant problems and also facilitates effective transfer of the results to industry.

  3. Enhancing psychosocial and spiritual palliative care: Four-year results of the program of comprehensive care for people with advanced illnesses and their families in Spain.

    PubMed

    Gómez-Batiste, Xavier; Mateo-Ortega, Dolors; Lasmarías, Cristina; Novellas, Anna; Espinosa, Jose; Beas, Elba; Ela, Sara; Barbero, Javier

    2017-02-01

    We aimed to describe the overall quantitative and qualitative results of a "La Caixa" Foundation and World Health Organization Collaborating Center Program entitled "Comprehensive Care for Patients with Advanced Illnesses and their Families" after four years of experience. Qualitative and quantitative methods were employed to assess the program. Quasiexperimental, prospective, multicenter, single-group, and pretest/posttest methods were utilized to assess the quantitative data. The effectiveness of psychosocial interventions was assessed at baseline (visit 1) and after four follow-up visits. The following dimensions were assessed: mood state, discomfort, anxiety, degree of adjustment or adaptation to disease, and suffering. We also assessed the four dimensions of the spiritual pain scale: faith or spiritual beliefs, valuable faith or spiritual beliefs, meaning in life, and peace of mind/forgiveness. Qualitative analyses were performed via surveys to evaluate stakeholder satisfaction. We built 29 psychosocial support teams involving 133 professionals-mainly psychologists and social workers. During the study period, 8,964 patients and 11,810 family members attended. Significant improvements were observed in the psychosocial and spiritual dimensions assessed. Patients, family members, and stakeholders all showed high levels of satisfaction. This model of psychosocial care could serve as an example for other countries that wish to improve psychosocial and spiritual support. Our results confirm that specific psychosocial interventions delivered by well-trained experts can help to ease suffering and discomfort in end-of-life and palliative care patients, particularly those with high levels of pain or emotional distress.

  4. A quantitative systematic review of the efficacy of mobile phone interventions to improve medication adherence.

    PubMed

    Park, Linda G; Howie-Esquivel, Jill; Dracup, Kathleen

    2014-09-01

To evaluate the characteristics and efficacy of mobile phone interventions to improve medication adherence. Secondary aims are to explore participants' acceptability and satisfaction with mobile phone interventions and to evaluate the selected studies in terms of study rigour, impact, cost and resource feasibility, generalizability and implications for nursing practice and research. Medication non-adherence is a major global challenge. Mobile phones are the most commonly used form of technology worldwide and have the potential to promote medication adherence. Guidelines from the Centre for Reviews and Dissemination were followed for this systematic review. A comprehensive search of databases (PubMed, Web of Science, CINAHL, PsycInfo, Google Scholar and Cochrane) and bibliographies from related articles was performed from January 2002-January 2013 to identify the included studies. A quantitative systematic review without meta-analysis was conducted and the selected studies were critically evaluated to extract and summarize pertinent characteristics and outcomes. The literature search produced 29 quantitative research studies related to mobile phones and medication adherence. The studies were conducted for prevention purposes as well as management of acute and chronic illnesses. All of the studies used text messaging. Eighteen studies found significant improvement in medication adherence. While the majority of investigators found improvement in medication adherence, long-term studies characterized by rigorous research methodologies, appropriate statistical and economic analyses and the test of theory-based interventions are needed to determine the efficacy of mobile phones to influence medication adherence. © 2014 John Wiley & Sons Ltd.

  5. A Method for Comprehensive Glycosite-Mapping and Direct Quantitation of Serum Glycoproteins.

    PubMed

    Hong, Qiuting; Ruhaak, L Renee; Stroble, Carol; Parker, Evan; Huang, Jincui; Maverakis, Emanual; Lebrilla, Carlito B

    2015-12-04

    A comprehensive glycan map was constructed for the top eight abundant glycoproteins in plasma using both specific and nonspecific enzyme digestions followed by nano liquid chromatography (LC)-chip/quadrupole time-of-flight mass spectrometry (MS) analysis. Glycopeptides were identified using an in-house software tool, GPFinder. A sensitive and reproducible multiple reaction monitoring (MRM) technique on a triple quadrupole MS was developed and applied to quantify immunoglobulins G, A, M, and their site-specific glycans simultaneously and directly from human serum/plasma without protein enrichments. A total of 64 glycopeptides and 15 peptides were monitored for IgG, IgA, and IgM in a 20 min ultra high performance (UP)LC gradient. The absolute protein contents were quantified using peptide calibration curves. The glycopeptide ion abundances were normalized to the respective protein abundances to separate protein glycosylation from protein expression. This technique yields higher method reproducibility and less sample loss when compared with the quantitation method that involves protein enrichments. The absolute protein quantitation has a wide linear range (3-4 orders of magnitude) and low limit of quantitation (femtomole level). This rapid and robust quantitation technique, which provides quantitative information for both proteins and glycosylation, will further facilitate disease biomarker discoveries.
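
    The normalization step described above (dividing each glycopeptide's ion abundance by the abundance of its parent protein, so that changes in glycosylation are not confounded with changes in protein expression) can be expressed in a few lines. A minimal sketch with invented identifiers and signal values, not the study's data:

```python
# Toy normalization of site-specific glycopeptide signals to parent-protein abundance,
# so that glycosylation changes are separated from protein-expression changes.
# All identifiers and numbers below are illustrative.
protein_abundance = {"IgG1": 5.2e6, "IgA1": 1.1e6}   # e.g. from peptide-level MRM signals
glycopeptide_abundance = {
    ("IgG1", "glycopeptide_H3N4F1"): 2.6e5,
    ("IgG1", "glycopeptide_H4N4F1"): 1.3e5,
    ("IgA1", "glycopeptide_H5N4"): 4.4e4,
}

normalized = {key: signal / protein_abundance[key[0]]
              for key, signal in glycopeptide_abundance.items()}
for (protein, glycoform), value in normalized.items():
    print(protein, glycoform, round(value, 4))
```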

  6. The Functional Neuroanatomy of Male Psychosexual and Physiosexual Arousal: A Quantitative Meta-Analysis

    PubMed Central

    Poeppl, Timm B.; Langguth, Berthold; Laird, Angela R.; Eickhoff, Simon B.

    2016-01-01

    Reproductive behavior is mandatory for conservation of species and mediated by a state of sexual arousal (SA), involving both complex mental processes and bodily reactions. An early neurobehavioral model of SA proposes cognitive, emotional, motivational, and autonomic components. In a comprehensive quantitative meta-analysis on previous neuroimaging findings, we provide here evidence for distinct brain networks underlying psychosexual and physiosexual arousal. Psychosexual (i.e., mental sexual) arousal recruits brain areas crucial for cognitive evaluation, top-down modulation of attention and exteroceptive sensory processing, relevance detection and affective evaluation, as well as regions implicated in the representation of urges and in triggering autonomic processes. In contrast, physiosexual (i.e., physiological sexual) arousal is mediated by regions responsible for regulation and monitoring of initiated autonomic processes and emotions and for somatosensory processing. These circuits are interconnected by subcortical structures (putamen and claustrum) that provide exchange of sensorimotor information and crossmodal processing between and within the networks. Brain deactivations may imply attenuation of introspective processes and social cognition, but be necessary to release intrinsic inhibition of SA. PMID:23674246

  7. Readability Approaches: Implications for Turkey

    ERIC Educational Resources Information Center

    Ulusoy, Mustafa

    2006-01-01

    Finding the right fit between students' reading ability and textbooks is very important for comprehension. Readability studies aim to analyse texts to find the right fit between students and texts. In this literature review, readability studies are classified under quantitative, qualitative and combined quantitative-qualitative readability…

  8. Comprehensive Performance Evaluation for Hydrological and Nutrients Simulation Using the Hydrological Simulation Program–Fortran in a Mesoscale Monsoon Watershed, China

    PubMed Central

    Luo, Chuan; Jiang, Kaixia; Wan, Rongrong; Li, Hengpeng

    2017-01-01

The Hydrological Simulation Program–Fortran (HSPF) is a hydrological and water quality computer model that was developed by the United States Environmental Protection Agency. Comprehensive performance evaluations were carried out for hydrological and nutrient simulation using the HSPF model in the Xitiaoxi watershed in China. Streamflow simulation was calibrated from 1 January 2002 to 31 December 2007 and then validated from 1 January 2008 to 31 December 2010 using daily observed data, and nutrient simulation was calibrated and validated using monthly observed data during the period from July 2009 to July 2010. The results of the model performance evaluation showed that the streamflows were well simulated over the study period. The determination coefficient (R²) was 0.87, 0.77 and 0.63, and the Nash-Sutcliffe coefficient of efficiency (Ens) was 0.82, 0.76 and 0.65 for the streamflow simulation in annual, monthly and daily time-steps, respectively. Although limited to monthly observed data, satisfactory performance was still achieved during the quantitative evaluation for nutrients. The R² was 0.73, 0.82 and 0.92, and the Ens was 0.67, 0.74 and 0.86 for nitrate, ammonium and orthophosphate simulation, respectively. Some issues that may affect the application of HSPF, such as input data quality and parameter values, were also discussed. Overall, the HSPF model can be successfully used to describe streamflow and nutrient transport in the mesoscale watershed located in the East Asian monsoon climate area. This study is expected to serve as a comprehensive and systematic documentation for understanding the HSPF model, supporting wide application and avoiding possible misuses. PMID:29257117

  9. Comprehensive Performance Evaluation for Hydrological and Nutrients Simulation Using the Hydrological Simulation Program-Fortran in a Mesoscale Monsoon Watershed, China.

    PubMed

    Li, Zhaofu; Luo, Chuan; Jiang, Kaixia; Wan, Rongrong; Li, Hengpeng

    2017-12-19

The Hydrological Simulation Program-Fortran (HSPF) is a hydrological and water quality computer model that was developed by the United States Environmental Protection Agency. Comprehensive performance evaluations were carried out for hydrological and nutrient simulation using the HSPF model in the Xitiaoxi watershed in China. Streamflow simulation was calibrated from 1 January 2002 to 31 December 2007 and then validated from 1 January 2008 to 31 December 2010 using daily observed data, and nutrient simulation was calibrated and validated using monthly observed data during the period from July 2009 to July 2010. The results of the model performance evaluation showed that the streamflows were well simulated over the study period. The determination coefficient (R²) was 0.87, 0.77 and 0.63, and the Nash-Sutcliffe coefficient of efficiency (Ens) was 0.82, 0.76 and 0.65 for the streamflow simulation in annual, monthly and daily time-steps, respectively. Although limited to monthly observed data, satisfactory performance was still achieved during the quantitative evaluation for nutrients. The R² was 0.73, 0.82 and 0.92, and the Ens was 0.67, 0.74 and 0.86 for nitrate, ammonium and orthophosphate simulation, respectively. Some issues that may affect the application of HSPF, such as input data quality and parameter values, were also discussed. Overall, the HSPF model can be successfully used to describe streamflow and nutrient transport in the mesoscale watershed located in the East Asian monsoon climate area. This study is expected to serve as a comprehensive and systematic documentation for understanding the HSPF model, supporting wide application and avoiding possible misuses.
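
    The goodness-of-fit statistics quoted in both records above, the determination coefficient R² and the Nash-Sutcliffe efficiency (Ens), are standard and straightforward to compute from paired observed and simulated series. A minimal sketch with illustrative values, not the Xitiaoxi data:

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 is no better than the observed mean."""
    observed = np.asarray(observed, float)
    simulated = np.asarray(simulated, float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

def determination_coefficient(observed, simulated):
    """R² as the squared Pearson correlation between observed and simulated values."""
    r = np.corrcoef(observed, simulated)[0, 1]
    return r ** 2

# Illustrative monthly streamflow values (m³/s); not taken from the study.
obs = [12.1, 30.4, 55.2, 80.7, 60.3, 25.8]
sim = [14.0, 28.9, 50.1, 85.2, 57.7, 27.3]
print(nash_sutcliffe(obs, sim), determination_coefficient(obs, sim))
```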

  10. Review and Process Effects of Spontaneous Note-Taking on Text Comprehension.

    PubMed

    Slotte; Lonka

    1999-01-01

This study examines how quantitative and qualitative differences in spontaneously taken notes are related to text comprehension in combination with reviewing or not reviewing previously made notes. High school graduates (N = 226) were allowed to take notes in any way they desired while reading a philosophical text. Approximately half the participants were told that they could review their notes during writing tasks designed to measure the ability to define, compare, and evaluate text content. The other half of the participants answered the subsequent questions without their notes. The process of taking notes was rated on the basis of note quality and quantity. The results revealed significant review and process effects in spontaneous note-taking. Reviewing the notes during essay-writing generally resulted in good performance in an exam calling for deep-level text comprehension. However, this review effect was mainly limited to detailed learning rather than to making one's own inferences. Results pertaining to note quality indicated that participants who summarized the content of the text performed better on all tasks than those who produced notes following the text order or verbatim notes. The amount of note-taking was also positively related to text comprehension. The discussion focuses upon the situational appropriateness of note-taking effects that pose challenges to educators. Copyright 1999 Academic Press.

  11. An Overview of State Policies Supporting Worksite Health Promotion Programs.

    PubMed

    VanderVeur, Jennifer; Gilchrist, Siobhan; Matson-Koffman, Dyann

    2017-05-01

Worksite health promotion (WHP) programs can reduce the occurrence of cardiovascular disease risk factors. State law can encourage employers and employer-provided insurance companies to offer comprehensive WHP programs. This research examines state law authorizing WHP programs, using a quantitative content analysis of state law in effect in 2013 for worksites or workplaces across the United States and the District of Columbia, with the frequency and distribution of states with WHP laws as the primary measures. To determine the content of the laws for analysis and coding, we identified 18 policy elements, 12 from the Centers for Disease Control and Prevention's Worksite Health ScoreCard (HSC) and 6 additional supportive WHP strategies. We used these strategies as key words to search for laws authorizing WHP programs or select WHP elements. We calculated the number and type of WHP elements for each state with WHP laws and selected two case examples from states with comprehensive WHP laws. Twenty-four states authorized onsite WHP programs, 29 authorized WHP through employer-provided insurance plans, and 18 authorized both. Seven states had a comprehensive WHP strategy, addressing 8 or more of 12 HSC elements. The most common HSC elements were weight management, tobacco cessation, and physical activity. Most states had laws encouraging the adoption of WHP programs. Massachusetts and Maine are implementing comprehensive WHP laws, but studies evaluating their health impact are needed.

  12. Using GeoRePORT to report socio-economic potential for geothermal development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Young, Katherine R.; Levine, Aaron

The Geothermal Resource Portfolio Optimization and Reporting Tool (GeoRePORT, http://en.openei.org/wiki/GeoRePORT) was developed for reporting resource grades and project readiness levels, providing the U.S. Department of Energy a consistent and comprehensible means of evaluating projects. The tool helps funding organizations (1) quantitatively identify barriers, (2) develop measurable goals, (3) objectively evaluate proposals, including contribution to goals, (4) monitor progress, and (5) report portfolio performance. GeoRePORT assesses three categories: geological, technical, and socio-economic. Here, we describe GeoRePORT, then focus on the socio-economic assessment and its applications for assessing deployment potential in the U.S. Socio-economic attributes include land access, permitting, transmission, and market.

  13. Toward standardized quantitative image quality (IQ) assessment in computed tomography (CT): A comprehensive framework for automated and comparative IQ analysis based on ICRU Report 87.

    PubMed

    Pahn, Gregor; Skornitzke, Stephan; Schlemmer, Hans-Peter; Kauczor, Hans-Ulrich; Stiller, Wolfram

    2016-01-01

    Based on the guidelines from "Report 87: Radiation Dose and Image-quality Assessment in Computed Tomography" of the International Commission on Radiation Units and Measurements (ICRU), a software framework for automated quantitative image quality analysis was developed and its usability for a variety of scientific questions demonstrated. The extendable framework currently implements the calculation of the recommended Fourier image quality (IQ) metrics modulation transfer function (MTF) and noise-power spectrum (NPS), and additional IQ quantities such as noise magnitude, CT number accuracy, uniformity across the field-of-view, contrast-to-noise ratio (CNR) and signal-to-noise ratio (SNR) of simulated lesions for a commercially available cone-beam phantom. Sample image data were acquired with different scan and reconstruction settings on CT systems from different manufacturers. Spatial resolution is analyzed in terms of edge-spread function, line-spread-function, and MTF. 3D NPS is calculated according to ICRU Report 87, and condensed to 2D and radially averaged 1D representations. Noise magnitude, CT numbers, and uniformity of these quantities are assessed on large samples of ROIs. Low-contrast resolution (CNR, SNR) is quantitatively evaluated as a function of lesion contrast and diameter. Simultaneous automated processing of several image datasets allows for straightforward comparative assessment. The presented framework enables systematic, reproducible, automated and time-efficient quantitative IQ analysis. Consistent application of the ICRU guidelines facilitates standardization of quantitative assessment not only for routine quality assurance, but for a number of research questions, e.g. the comparison of different scanner models or acquisition protocols, and the evaluation of new technology or reconstruction methods. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
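
    Among the low-contrast metrics listed above, the contrast-to-noise ratio is conventionally the absolute difference between lesion and background mean signal divided by the image noise. A minimal sketch over ROI pixel values using that generic definition, not the framework's exact implementation:

```python
import numpy as np

def contrast_to_noise_ratio(lesion_roi, background_roi):
    """CNR = |mean(lesion) - mean(background)| / std(background)."""
    lesion = np.asarray(lesion_roi, float)
    background = np.asarray(background_roi, float)
    return abs(lesion.mean() - background.mean()) / background.std(ddof=1)

# Illustrative CT numbers (HU) sampled from a simulated low-contrast lesion and its surround.
rng = np.random.default_rng(0)
lesion = rng.normal(40.0, 8.0, 500)      # hypothetical lesion ROI
background = rng.normal(20.0, 8.0, 500)  # hypothetical background ROI
print(contrast_to_noise_ratio(lesion, background))
```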

  14. Dataglove measurement of joint angles in sign language handshapes

    PubMed Central

    Eccarius, Petra; Bour, Rebecca; Scheidt, Robert A.

    2012-01-01

    In sign language research, we understand little about articulatory factors involved in shaping phonemic boundaries or the amount (and articulatory nature) of acceptable phonetic variation between handshapes. To date, there exists no comprehensive analysis of handshape based on the quantitative measurement of joint angles during sign production. The purpose of our work is to develop a methodology for collecting and visualizing quantitative handshape data in an attempt to better understand how handshapes are produced at a phonetic level. In this pursuit, we seek to quantify the flexion and abduction angles of the finger joints using a commercial data glove (CyberGlove; Immersion Inc.). We present calibration procedures used to convert raw glove signals into joint angles. We then implement those procedures and evaluate their ability to accurately predict joint angle. Finally, we provide examples of how our recording techniques might inform current research questions. PMID:23997644
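
    Calibration of this kind maps raw bend-sensor readings to reference joint angles. As a rough illustration only, the sketch below fits a plain least-squares linear calibration; it is not the CyberGlove procedure developed in the paper, and all numbers are invented:

```python
import numpy as np

# Hypothetical calibration pairs: raw bend-sensor counts vs. reference joint angles (degrees)
# measured with an external device; values are illustrative only.
raw_counts = np.array([110.0, 135.0, 160.0, 188.0, 212.0])
ref_angles = np.array([0.0, 22.0, 45.0, 68.0, 90.0])

# Fit angle = gain * counts + offset by ordinary least squares.
gain, offset = np.polyfit(raw_counts, ref_angles, 1)

def counts_to_angle(counts):
    """Convert a raw sensor reading to an estimated joint angle (degrees)."""
    return gain * counts + offset

print(counts_to_angle(150.0))  # estimated flexion angle for a new reading
```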

  15. Ted Hall and the science of biological microprobe X-ray analysis: a historical perspective of methodology and biological dividends.

    PubMed

    Gupta, B L

    1991-06-01

    This review surveys the emergence of electron probe X-ray microanalysis as a quantitative method for measuring the chemical elements in situ. The extension of the method to the biological sciences under the influence of Ted Hall is reviewed. Some classical experiments by Hall and his colleagues in Cambridge, UK, previously unpublished, are described; as are some of the earliest quantitative results from the cryo-sections obtained in Cambridge and elsewhere. The progress of the methodology is critically evaluated from the earliest starts to the present state of the art. Particular attention has been focused on the application of the method in providing fresh insights into the role of ions in cell and tissue physiology and pathology. A comprehensive list of references is included for a further pursuit of the topics by the interested reader.

  16. [Landscape classification: research progress and development trend].

    PubMed

    Liang, Fa-Chao; Liu, Li-Ming

    2011-06-01

Landscape classification is the basis of research on landscape structure, process, and function, and also the prerequisite for landscape evaluation, planning, protection, and management, directly affecting the precision and practicability of landscape research. This paper reviewed the research progress on landscape classification systems, theory, and methodology, and summarized the key problems and deficiencies of current research. Some major landscape classification systems, e.g., LANMAP and MUFIC, were introduced and discussed. It was suggested that a qualitative and quantitative comprehensive classification based on the ideology of functional structure shape and on the integral consideration of landscape classification utility, landscape function, landscape structure, physiogeographical factors, and human disturbance intensity should be the major research direction in the future. The integration of mapping, 3S technology, quantitative mathematical modeling, computer artificial intelligence, and professional knowledge to enhance the precision of landscape classification would be the key issue and development trend in landscape classification research.

  17. Mixed methods research - the best of both worlds?

    PubMed

    van Griensven, Hubert; Moore, Ann P; Hall, Valerie

    2014-10-01

    There has been a bias towards quantitative research approaches within manual therapy, which may have resulted in a narrow understanding of manual therapy practice. The aim of this Masterclass is to make a contribution to the expansion of methodologies used in manual therapy enquiry by discussing mixed methods research (MMR), a methodology which utilises both qualitative and quantitative methods within a single study in order to provide more comprehensive insights. To review rationales for MMR, as well as some of the common design options and potential difficulties. The paper also discusses theoretical frameworks that have been used to underpin qualitative and quantitative research, and ongoing debates about the possibility of combining them. Complexities associated with health and manual therapy cannot always be investigated satisfactorily by using a single research method. Some issues require a more comprehensive understanding, which may be provided by combining the strengths of quantitative and qualitative methods in a mixed methods study. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. A mass spectrometry-based method for comprehensive quantitative determination of post-transcriptional RNA modifications: the complete chemical structure of Schizosaccharomyces pombe ribosomal RNAs

    PubMed Central

    Taoka, Masato; Nobe, Yuko; Hori, Masayuki; Takeuchi, Aiko; Masaki, Shunpei; Yamauchi, Yoshio; Nakayama, Hiroshi; Takahashi, Nobuhiro; Isobe, Toshiaki

    2015-01-01

    We present a liquid chromatography–mass spectrometry (LC-MS)-based method for comprehensive quantitative identification of post-transcriptional modifications (PTMs) of RNA. We incorporated an in vitro-transcribed, heavy isotope-labeled reference RNA into a sample RNA solution, digested the mixture with a number of RNases and detected the post-transcriptionally modified oligonucleotides quantitatively based on shifts in retention time and the MS signal in subsequent LC-MS. This allowed the determination and quantitation of all PTMs in Schizosaccharomyces pombe ribosomal (r)RNAs and generated the first complete PTM maps of eukaryotic rRNAs at single-nucleotide resolution. There were 122 modified sites, most of which appear to locate at the interface of ribosomal subunits where translation takes place. We also identified PTMs at specific locations in rRNAs that were altered in response to growth conditions of yeast cells, suggesting that the cells coordinately regulate the modification levels of RNA. PMID:26013808

  19. Evaluation of CAMEL - comprehensive areal model of earthquake-induced landslides

    USGS Publications Warehouse

    Miles, S.B.; Keefer, D.K.

    2009-01-01

A new comprehensive areal model of earthquake-induced landslides (CAMEL) has been developed to assist in planning decisions related to disaster risk reduction. CAMEL provides an integrated framework for modeling all types of earthquake-induced landslides using fuzzy logic systems and geographic information systems. CAMEL is designed to facilitate quantitative and qualitative representation of terrain conditions and knowledge about these conditions on the likely areal concentration of each landslide type. CAMEL has been empirically evaluated with respect to disrupted landslides (Category I) using a case study of the 1989 M = 6.9 Loma Prieta, CA earthquake. In this case, CAMEL performs best in comparison to disrupted slides and falls in soil. For disrupted rock fall and slides, CAMEL's performance was slightly poorer. The model predicted a low occurrence of rock avalanches, when none in fact occurred. A similar comparison with the Loma Prieta case study was also conducted using a simplified Newmark displacement model. The area under the curve method of evaluation was used in order to draw comparisons between both models, revealing improved performance with CAMEL. CAMEL should not, however, be viewed as a strict alternative to Newmark displacement models. CAMEL can be used to integrate Newmark displacements with other, previously incompatible, types of knowledge. © 2008 Elsevier B.V.
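
    The area-under-the-curve comparison mentioned above is a standard discrimination measure that can be computed by the trapezoidal rule from sorted operating points. The following is a generic illustration, not the CAMEL evaluation code, and all values are invented:

```python
import numpy as np

def area_under_curve(false_positive_rates, true_positive_rates):
    """Trapezoidal-rule area under a curve; points must be sorted by increasing FPR."""
    fpr = np.asarray(false_positive_rates, float)
    tpr = np.asarray(true_positive_rates, float)
    return float(np.sum(np.diff(fpr) * (tpr[1:] + tpr[:-1]) / 2.0))

# Illustrative operating points for two hypothetical landslide-concentration models.
fpr = [0.0, 0.1, 0.25, 0.5, 1.0]
model_a_tpr = [0.0, 0.45, 0.70, 0.88, 1.0]
model_b_tpr = [0.0, 0.30, 0.55, 0.80, 1.0]
print(area_under_curve(fpr, model_a_tpr), area_under_curve(fpr, model_b_tpr))
```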

  20. Underage alcohol policies across 50 California cities: an assessment of best practices

    PubMed Central

    2012-01-01

    Background We pursue two primary goals in this article: (1) to test a methodology and develop a dataset on U.S. local-level alcohol policy ordinances, and (2) to evaluate the presence, comprehensiveness, and stringency of eight local alcohol policies in 50 diverse California cities in relationship to recommended best practices in both public health literature and governmental recommendations to reduce underage drinking. Methods Following best practice recommendations from a wide array of authoritative sources, we selected eight local alcohol policy topics (e.g., conditional use permits, responsible beverage service training, social host ordinances, window/billboard advertising ordinances), and determined the presence or absence as well as the stringency (restrictiveness) and comprehensiveness (number of provisions) of each ordinance in each of the 50 cities in 2009. Following the alcohol policy literature, we created scores for each city on each type of ordinance and its associated components. We used these data to evaluate the extent to which recommendations for best practices to reduce underage alcohol use are being followed. Results (1) Compiling datasets of local-level alcohol policy laws and their comprehensiveness and stringency is achievable, even absent comprehensive, on-line, or other legal research tools. (2) We find that, with some exceptions, most of the 50 cities do not have high scores for presence, comprehensiveness, or stringency across the eight key policies. Critical policies such as responsible beverage service and deemed approved ordinances are uncommon, and, when present, they are generally neither comprehensive nor stringent. Even within policies that have higher adoption rates, central elements are missing across many or most cities’ ordinances. Conclusion This study demonstrates the viability of original legal data collection in the U.S. pertaining to local ordinances and of creating quantitative scores for each policy type to reflect comprehensiveness and stringency. Analysis of the resulting dataset reveals that, although the 50 cities have taken important steps to improve public health with regard to underage alcohol use and abuse, there is a great deal more that needs to be done to bring these cities into compliance with best practice recommendations. PMID:22734468

  1. Glosses, Comprehension, and Strategy Use

    ERIC Educational Resources Information Center

    Ko, Myong Hee

    2005-01-01

    This study, using both qualitative and quantitative measures, investigates how different types of gloss conditions affect Korean college students' reading comprehension. One hundred and six undergraduates at a university in Korea participated in the study. Twelve were assigned to think aloud, and the rest (ninety-four) took part in the main study…

  2. The Effect of Electronic Storybooks on Struggling Fourth-Graders' Reading Comprehension

    ERIC Educational Resources Information Center

    Ertem, Ihsan Seyit

    2010-01-01

    This quantitative research examined the differences in struggling readers' comprehension of storybooks according to the medium of presentation. Each student was randomly assigned with one of three conditions: (1) computer presentation of storybooks with animation; (2) computer presentation of storybooks without animation; and (3) traditional print…

  3. Widely-targeted quantitative lipidomics methodology by supercritical fluid chromatography coupled with fast-scanning triple quadrupole mass spectrometry.

    PubMed

    Takeda, Hiroaki; Izumi, Yoshihiro; Takahashi, Masatomo; Paxton, Thanai; Tamura, Shohei; Koike, Tomonari; Yu, Ying; Kato, Noriko; Nagase, Katsutoshi; Shiomi, Masashi; Bamba, Takeshi

    2018-05-03

    Lipidomics, the mass spectrometry-based comprehensive analysis of lipids, has attracted attention as an analytical approach to provide novel insight into lipid metabolism and to search for biomarkers. However, an ideal method for both comprehensive and quantitative analysis of lipids has not been fully developed. Herein, we have proposed a practical methodology for widely-targeted quantitative lipidome analysis using supercritical fluid chromatography fast-scanning triple-quadrupole mass spectrometry (SFC/QqQMS) and theoretically calculated a comprehensive lipid multiple reaction monitoring (MRM) library. Lipid classes can be separated by SFC with a normal phase diethylamine-bonded silica column with high-resolution, high-throughput, and good repeatability. Structural isomers of phospholipids can be monitored by mass spectrometric separation with fatty acyl-based MRM transitions. SFC/QqQMS analysis with an internal standard-dilution method offers quantitative information for both lipid class and individual lipid molecular species in the same lipid class. Additionally, data acquired using this method has advantages including reduction of misidentification and acceleration of data analysis. Using the SFC/QqQMS system, alteration of plasma lipid levels in myocardial infarction-prone rabbits to the supplementation of eicosapentaenoic acid was first observed. Our developed SFC/QqQMS method represents a potentially useful tool for in-depth studies focused on complex lipid metabolism and biomarker discovery. Published under license by The American Society for Biochemistry and Molecular Biology, Inc.

  4. Quantitative Assessment of Cancer Risk from Exposure to Diesel Engine Emissions

    EPA Science Inventory

Quantitative estimates of lung cancer risk from exposure to diesel engine emissions were developed using data from three chronic bioassays with Fischer 344 rats. Human target organ dose was estimated with the aid of a comprehensive dosimetry model. This model accounted for rat-hum...

  5. Using Quantitative Structure-Activity Relationship Modeling to Quantitatively Predict the Developmental Toxicity of Halogenated Azole compounds

    EPA Science Inventory

    Developmental toxicity is a relevant endpoint for the comprehensive assessment of human health risk from chemical exposure. However, animal developmental toxicity studies remain unavailable for many environmental contaminants due to the complexity and cost of these types of analy...

  6. Overlapping Genetic and Child-Specific Nonshared Environmental Influences on Listening Comprehension, Reading Motivation, and Reading Comprehension

    PubMed Central

    Schenker, Victoria J.; Petrill, Stephen A.

    2015-01-01

    This study investigated the genetic and environmental influences on observed associations between listening comprehension, reading motivation, and reading comprehension. Univariate and multivariate quantitative genetic models were conducted in a sample of 284 pairs of twins at a mean age of 9.81 years. Genetic and nonshared environmental factors accounted for statistically significant variance in listening and reading comprehension, and nonshared environmental factors accounted for variance in reading motivation. Furthermore, listening comprehension demonstrated unique genetic and nonshared environmental influences but also had overlapping genetic influences with reading comprehension. Reading motivation and reading comprehension each had unique and overlapping nonshared environmental contributions. Therefore, listening comprehension appears to be related to reading primarily due to genetic factors whereas motivation appears to affect reading via child-specific, nonshared environmental effects. PMID:26321677

  7. Overlapping genetic and child-specific nonshared environmental influences on listening comprehension, reading motivation, and reading comprehension.

    PubMed

    Schenker, Victoria J; Petrill, Stephen A

    2015-01-01

    This study investigated the genetic and environmental influences on observed associations between listening comprehension, reading motivation, and reading comprehension. Univariate and multivariate quantitative genetic models were conducted in a sample of 284 pairs of twins at a mean age of 9.81 years. Genetic and nonshared environmental factors accounted for statistically significant variance in listening and reading comprehension, and nonshared environmental factors accounted for variance in reading motivation. Furthermore, listening comprehension demonstrated unique genetic and nonshared environmental influences but also had overlapping genetic influences with reading comprehension. Reading motivation and reading comprehension each had unique and overlapping nonshared environmental contributions. Therefore, listening comprehension appears to be related to reading primarily due to genetic factors whereas motivation appears to affect reading via child-specific, nonshared environmental effects. Copyright © 2015 Elsevier Inc. All rights reserved.

  8. Evaluating variability and uncertainty separately in microbial quantitative risk assessment using two R packages.

    PubMed

    Pouillot, Régis; Delignette-Muller, Marie Laure

    2010-09-01

    Quantitative risk assessment has emerged as a valuable tool to enhance the scientific basis of regulatory decisions in the food safety domain. This article introduces the use of two new computing resources (R packages) specifically developed to help risk assessors in their projects. The first package, "fitdistrplus", gathers tools for choosing and fitting a parametric univariate distribution to a given dataset. The data may be continuous or discrete. Continuous data may be right-, left- or interval-censored as is frequently obtained with analytical methods, with the possibility of various censoring thresholds within the dataset. Bootstrap procedures then allow the assessor to evaluate and model the uncertainty around the parameters and to transfer this information into a quantitative risk assessment model. The second package, "mc2d", helps to build and study two dimensional (or second-order) Monte-Carlo simulations in which the estimation of variability and uncertainty in the risk estimates is separated. This package easily allows the transfer of separated variability and uncertainty along a chain of conditional mathematical and probabilistic models. The usefulness of these packages is illustrated through a risk assessment of hemolytic and uremic syndrome in children linked to the presence of Escherichia coli O157:H7 in ground beef. These R packages are freely available at the Comprehensive R Archive Network (cran.r-project.org). Copyright 2010 Elsevier B.V. All rights reserved.
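
    The separation of variability and uncertainty handled by the second package follows the usual two-dimensional (second-order) Monte Carlo structure: uncertain parameters are drawn in an outer loop and variable quantities in an inner loop. The sketch below shows that structure in Python with an invented toy model; it does not use or reproduce the mc2d API:

```python
import numpy as np

rng = np.random.default_rng(42)
n_uncertainty, n_variability = 200, 1000  # outer (uncertainty) and inner (variability) dimensions

risk_per_uncertainty_draw = []
for _ in range(n_uncertainty):
    # Outer loop: sample uncertain parameters (e.g. a prevalence estimated with error).
    prevalence = rng.beta(2, 50)                   # hypothetical uncertain parameter
    dose_response_slope = rng.normal(1e-3, 2e-4)   # hypothetical uncertain parameter
    # Inner loop: sample variability across individual servings/exposures.
    doses = rng.lognormal(mean=2.0, sigma=1.0, size=n_variability)
    p_ill = 1.0 - np.exp(-np.clip(dose_response_slope, 0, None) * doses)
    risk_per_uncertainty_draw.append(np.mean(prevalence * p_ill))

# Uncertainty about the population-average risk, summarized over the outer loop.
print(np.percentile(risk_per_uncertainty_draw, [2.5, 50, 97.5]))
```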

  9. Emergy evaluation of water utilization benefits in water-ecological-economic system based on water cycle process

    NASA Astrophysics Data System (ADS)

    Guo, X.; Wu, Z.; Lv, C.

    2017-12-01

Water utilization benefits arise from the material, energy, information, and value flows that accompany the whole water cycle, and they are reflected in the material circulation within the system. Most traditional evaluations of water utilization benefits, however, operate at the macro level: they consider only aggregate material inputs and outputs and overall energy conversion, and they do not characterize how benefits form along the water cycle process. In addition, most studies take a purely economic perspective, attending to overall economic output and investment in sewage treatment while neglecting the ecological function benefits of the water cycle. Therefore, from the perspective of internal material circulation within the whole system, and treating the water cycle as a process of material circulation and energy flow, this study describes the circulation of water together with other ecological and socio-economic elements, explores the composition of positive and negative water utilization benefits in the water-ecological-economic system, and analyzes the performance of each benefit. On this basis, an emergy-based calculation method for each benefit is proposed using quantitative emergy analysis, enabling a unified measurement and evaluation of water utilization benefits in the water-ecological-economic system. Taking Zhengzhou city as an example, the benefits associated with different links of the water cycle were calculated quantitatively using the emergy method; the results showed that the emergy evaluation method can unify the ecosystem and the economic system, provide consistent quantitative analysis, and comprehensively measure the true value of natural resources and human economic activities.

  10. Capillary nano-immunoassays: advancing quantitative proteomics analysis, biomarker assessment, and molecular diagnostics.

    PubMed

    Chen, Jin-Qiu; Wakefield, Lalage M; Goldstein, David J

    2015-06-06

    There is an emerging demand for the use of molecular profiling to facilitate biomarker identification and development, and to stratify patients for more efficient treatment decisions with reduced adverse effects. In the past decade, great strides have been made to advance genomic, transcriptomic and proteomic approaches to address these demands. While there has been much progress with these large scale approaches, profiling at the protein level still faces challenges due to limitations in clinical sample size, poor reproducibility, unreliable quantitation, and lack of assay robustness. A novel automated capillary nano-immunoassay (CNIA) technology has been developed. This technology offers precise and accurate measurement of proteins and their post-translational modifications using either charge-based or size-based separation formats. The system not only uses ultralow nanogram levels of protein but also allows multi-analyte analysis using a parallel single-analyte format for increased sensitivity and specificity. The high sensitivity and excellent reproducibility of this technology make it particularly powerful for analysis of clinical samples. Furthermore, the system can distinguish and detect specific protein post-translational modifications that conventional Western blot and other immunoassays cannot easily capture. This review will summarize and evaluate the latest progress to optimize the CNIA system for comprehensive, quantitative protein and signaling event characterization. It will also discuss how the technology has been successfully applied in both discovery research and clinical studies, for signaling pathway dissection, proteomic biomarker assessment, targeted treatment evaluation and quantitative proteomic analysis. Lastly, a comparison of this novel system with other conventional immuno-assay platforms is performed.

  11. Evaluation of Patient Centered Medical Home Practice Transformation Initiatives

    PubMed Central

    Crabtree, Benjamin F.; Chase, Sabrina M.; Wise, Christopher G.; Schiff, Gordon D.; Schmidt, Laura A.; Goyzueta, Jeanette R.; Malouin, Rebecca A.; Payne, Susan M. C.; Quinn, Michael T.; Nutting, Paul A.; Miller, William L.; Jaén, Carlos Roberto

    2011-01-01

Background The patient-centered medical home (PCMH) has become a widely cited solution to the deficiencies in primary care delivery in the United States. To achieve the magnitude of change being called for in primary care, quality improvement interventions must focus on whole-system redesign, and not just isolated parts of medical practices. Methods Investigators participating in 9 different evaluations of Patient Centered Medical Home implementation shared experiences, methodological strategies, and evaluation challenges for evaluating primary care practice redesign. Results A year-long iterative process of sharing and reflecting on experiences produced consensus on 7 recommendations for future PCMH evaluations: (1) look critically at models being implemented and identify aspects requiring modification; (2) include embedded qualitative and quantitative data collection to detail the implementation process; (3) capture details concerning how different PCMH components interact with one another over time; (4) understand and describe how and why physician and staff roles do, or do not, evolve; (5) identify the effectiveness of individual PCMH components and how they are used; (6) capture how primary care practices interface with other entities such as specialists, hospitals, and referral services; and (7) measure resources required for initiating and sustaining innovations. Conclusions Broad-based longitudinal, mixed-methods designs that provide for shared learning among practice participants, program implementers, and evaluators are necessary to evaluate the novelty and promise of the PCMH model. All PCMH evaluations should be as comprehensive as possible, and at a minimum should include a combination of brief observations and targeted qualitative interviews along with quantitative measures. PMID:21079525

  12. 75 FR 78252 - Agency Information Collection Activities; Submission for Office of Management and Budget Review...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-15

    ... research to explore issues of quantitative benefit information. They all described the collection of data... research will involve quantitative assessment of the comprehension of important information in the document... of experiences and varying degrees of satisfaction with information currently provided at the time...

  13. Performance evaluation of non-targeted peak-based cross-sample analysis for comprehensive two-dimensional gas chromatography-mass spectrometry data and application to processed hazelnut profiling.

    PubMed

    Kiefl, Johannes; Cordero, Chiara; Nicolotti, Luca; Schieberle, Peter; Reichenbach, Stephen E; Bicchi, Carlo

    2012-06-22

The continuing interest in non-targeted profiling has driven the development of tools for automated cross-sample analysis. Such tools were found to be either selective or not comprehensive, thus delivering a biased view of the qualitative/quantitative peak distribution across 2D sample chromatograms. Therefore, the performance of non-targeted approaches needs to be critically evaluated. This study focused on the development of a validation procedure for non-targeted, peak-based, GC×GC-MS data profiling. The procedure introduced performance parameters such as specificity, precision, accuracy, and uncertainty for a profiling method known as Comprehensive Template Matching. The performance was assessed by applying a three-week validation protocol based on CITAC/EURACHEM guidelines. Optimized ¹D and ²D retention-time search windows, the MS match factor threshold, the detection threshold, and the template threshold were evolved from two training sets by a semi-automated learning process. The effectiveness of the proposed settings in consistently matching 2D peak patterns was established by evaluating the rate of mismatched peaks and was expressed in terms of results accuracy. The study utilized 23 different 2D peak patterns providing the chemical fingerprints of raw and roasted hazelnuts (Corylus avellana L.) from different geographical origins, of diverse varieties and different roasting degrees. The validation results show that non-targeted peak-based profiling can be reliable with error rates lower than 10% independent of the degree of analytical variance. The optimized Comprehensive Template Matching procedure was employed to study hazelnut roasting profiles, in particular to find marker compounds strongly dependent on the thermal treatment, to establish the correlation of potential marker compounds to geographical origin and variety/cultivar, and finally to reveal the characteristic release of aroma-active compounds. Copyright © 2012 Elsevier B.V. All rights reserved.
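
    The optimized retention-time windows and MS match-factor threshold described above act as a gating test for whether a detected 2D peak corresponds to a template entry. Below is a toy sketch of that gating logic, with hypothetical field names and thresholds rather than the Comprehensive Template Matching implementation:

```python
def peak_matches_template(peak, template_entry,
                          rt1_window=5.0,     # hypothetical 1D retention-time window (s)
                          rt2_window=0.10,    # hypothetical 2D retention-time window (s)
                          min_match_factor=800):
    """Return True if a detected peak falls inside the template entry's retention-time
    windows and its spectral match factor exceeds the threshold."""
    return (abs(peak["rt1"] - template_entry["rt1"]) <= rt1_window
            and abs(peak["rt2"] - template_entry["rt2"]) <= rt2_window
            and peak["match_factor"] >= min_match_factor)

# Illustrative peak and template entry (all values are made up).
peak = {"rt1": 1212.0, "rt2": 1.48, "match_factor": 845}
entry = {"name": "hypothetical roasting marker", "rt1": 1210.0, "rt2": 1.52}
print(peak_matches_template(peak, entry))
```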

  14. A quantitative risk assessment model to evaluate effective border control measures for rabies prevention.

    PubMed

    Weng, Hsin-Yi; Wu, Pei-I; Yang, Ping-Cheng; Tsai, Yi-Lun; Chang, Chao-Chin

    2010-01-01

Border control is the primary method to prevent rabies emergence. This study developed a quantitative risk model incorporating stochastic processes to evaluate whether border control measures could efficiently prevent rabies introduction through importation of cats and dogs, using Taiwan as an example. Both legal importation and illegal smuggling were investigated. The impacts of a reduced quarantine and/or waiting period on the risk of rabies introduction were also evaluated. The results showed that Taiwan's current animal importation policy could effectively prevent rabies introduction through legal importation of cats and dogs. The median risk that a rabid animal would penetrate current border control measures and enter Taiwan was 5.33 × 10⁻⁸ (95th percentile: 3.20 × 10⁻⁷). However, illegal smuggling may expose Taiwan to a substantial risk of rabies emergence. Reduction of the quarantine and/or waiting period would affect the risk differently, depending on the applied assumptions, such as increased vaccination coverage, enforced customs checking, and/or changes in the number of legal importations. Although the changes in the estimated risk under the assumed alternatives were not substantial except for completely abolishing quarantine, the consequences of rabies introduction may yet be considered significant in a rabies-free area. Therefore, a comprehensive benefit-cost analysis needs to be conducted before recommending these alternative measures.
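
    As a rough illustration of the kind of stochastic calculation such a model performs, the sketch below propagates a few hypothetical probabilities (infection prevalence, vaccination failure, quarantine escape) through a Monte Carlo simulation; the distributions and numbers are invented and are not the study's parameters:

```python
import numpy as np

rng = np.random.default_rng(7)
n_iter = 100_000

# Hypothetical uncertain inputs, sampled per iteration (not the study's values).
prevalence = rng.beta(1, 20_000, n_iter)        # probability an imported animal is infected
vaccination_failure = rng.beta(2, 98, n_iter)   # probability vaccination does not protect
quarantine_escape = rng.beta(1, 200, n_iter)    # probability quarantine misses an infected animal

# Probability that a single imported animal is rabid and slips through all controls.
risk_per_animal = prevalence * vaccination_failure * quarantine_escape

print(np.median(risk_per_animal), np.percentile(risk_per_animal, 95))
```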

  15. [The functional independence of lexical numeric knowledge and the representation of magnitude: evidence from one case].

    PubMed

    Salguero-Alcañiz, M P; Lorca-Marín, J A; Alameda-Bailén, J R

    The ultimate purpose of cognitive neuropsychology is to find out how normal cognitive processes work. To this end, it studies subjects who have suffered brain damage but who, until their accident, were competent in the skills that are later to become the object of study. It is therefore necessary to study patients who have difficulty in processing numbers and in calculating in order to further our knowledge of these processes in the normal population. Our aim was to analyse the relationships between the different cognitive processes involved in numeric knowledge. We studied the case of a female patient who suffered an ischemic infarct in the perisylvian region, on both a superficial and deep level. She presented predominantly expressive mixed aphasia and predominantly brachial hemiparesis. Numeric processing and calculation were evaluated. The patient still had her lexical numeric knowledge but her quantitative numeric knowledge was impaired. These alterations in the quantitative numeric knowledge are evidenced by the difficulties the patient had in numeric comprehension tasks, as well as the severe impairments displayed in calculation. These findings allow us to conclude that quantitative numeric knowledge is functionally independent of lexical or non-quantitative numeric knowledge. From this functional autonomy, a possible structural independence can be inferred.

  16. Quantitative proteomic analysis of human lung tumor xenografts treated with the ectopic ATP synthase inhibitor citreoviridin.

    PubMed

    Wu, Yi-Hsuan; Hu, Chia-Wei; Chien, Chih-Wei; Chen, Yu-Ju; Huang, Hsuan-Cheng; Juan, Hsueh-Fen

    2013-01-01

    ATP synthase is present on the plasma membrane of several types of cancer cells. Citreoviridin, an ATP synthase inhibitor, selectively suppresses the proliferation and growth of lung cancer without affecting normal cells. However, the global effects of targeting ectopic ATP synthase in vivo have not been well defined. In this study, we performed quantitative proteomic analysis using isobaric tags for relative and absolute quantitation (iTRAQ) and provided a comprehensive insight into the complicated regulation by citreoviridin in a lung cancer xenograft model. With high reproducibility of the quantitation, we obtained quantitative proteomic profiling with 2,659 proteins identified. Bioinformatics analysis of the 141 differentially expressed proteins selected by their relative abundance revealed that citreoviridin induces alterations in the expression of glucose metabolism-related enzymes in lung cancer. The up-regulation of enzymes involved in gluconeogenesis and storage of glucose indicated that citreoviridin may reduce the glycolytic intermediates for macromolecule synthesis and inhibit cell proliferation. Using comprehensive proteomics, the results identify metabolic aspects that help explain the antitumorigenic effect of citreoviridin in lung cancer, which may lead to a better understanding of the links between metabolism and tumorigenesis in cancer therapy.

  17. Quantitative Proteomic Analysis of Human Lung Tumor Xenografts Treated with the Ectopic ATP Synthase Inhibitor Citreoviridin

    PubMed Central

    Wu, Yi-Hsuan; Hu, Chia-Wei; Chien, Chih-Wei; Chen, Yu-Ju; Huang, Hsuan-Cheng; Juan, Hsueh-Fen

    2013-01-01

    ATP synthase is present on the plasma membrane of several types of cancer cells. Citreoviridin, an ATP synthase inhibitor, selectively suppresses the proliferation and growth of lung cancer without affecting normal cells. However, the global effects of targeting ectopic ATP synthase in vivo have not been well defined. In this study, we performed quantitative proteomic analysis using isobaric tags for relative and absolute quantitation (iTRAQ) and provided a comprehensive insight into the complicated regulation by citreoviridin in a lung cancer xenograft model. With high reproducibility of the quantitation, we obtained quantitative proteomic profiling with 2,659 proteins identified. Bioinformatics analysis of the 141 differentially expressed proteins selected by their relative abundance revealed that citreoviridin induces alterations in the expression of glucose metabolism-related enzymes in lung cancer. The up-regulation of enzymes involved in gluconeogenesis and storage of glucose indicated that citreoviridin may reduce the glycolytic intermediates for macromolecule synthesis and inhibit cell proliferation. Using comprehensive proteomics, the results identify metabolic aspects that help explain the antitumorigenic effect of citreoviridin in lung cancer, which may lead to a better understanding of the links between metabolism and tumorigenesis in cancer therapy. PMID:23990911

  18. Prediction of skin anti-aging clinical benefits of an association of ingredients from marine and maritime origins: Ex vivo evaluation using a label-free quantitative proteomic and customized data processing approach.

    PubMed

    Hameury, Sebastien; Borderie, Laurent; Monneuse, Jean-Marc; Skorski, Gilbert; Pradines, Dominique

    2018-05-23

    The application of ingredients from marine and maritime origins is increasingly common in skin care products, driven by consumer expectations for natural ingredients. However, these ingredients are typically studied for a few isolated in vitro activities. The purpose of this study was to carry out a comprehensive evaluation of the activity on the skin of an association of ingredients from marine and maritime origins using label-free quantitative proteomic analysis, in order to predict the clinical benefits if used in a skin care product. An aqueous gel containing 6.1% of ingredients from marine and maritime origins (amino acid-enriched giant kelp extract, trace element-enriched seawater, dedifferentiated sea fennel cells) was topically applied on human skin explants. The skin explants' proteome was analyzed in a label-free manner by high-performance liquid nano-chromatography coupled with tandem mass spectrometry. A specific data processing pipeline (CORAVALID) providing an objective and comprehensive interpretation of the statistically relevant biological activities processed the results. Compared to untreated skin explants, 64 proteins were significantly regulated by the gel treatment (q-value ≤ 0.05). Computer data processing revealed an activity of the ingredients on the epidermis and the dermis. These significantly regulated proteins are involved in gene expression, cell survival and metabolism, inflammatory processes, dermal extracellular matrix synthesis, melanogenesis and keratinocyte proliferation, migration, and differentiation. These results suggest that the tested ingredients could help to preserve a healthy epidermis and dermis, and possibly to prevent the visible signs of skin aging. © 2018 The Authors. Journal of Cosmetic Dermatology Published by Wiley Periodicals, Inc.

  19. Evaluation of oral microbiology lab curriculum reform.

    PubMed

    Nie, Min; Gao, Zhen Y; Wu, Xin Y; Jiang, Chen X; Du, Jia H

    2015-12-07

In line with the updated concept of oral microbiology, the School of Stomatology, Wuhan University, has carried out oral microbiology teaching reforms during the last 5 years. Before 2009 there was only a theory course in oral microbiology and no lab curriculum. The school has implemented an innovative curriculum with oral medicine characteristics to strengthen students' grasp of the material, cultivate their scientific interest, develop their potential, and build their comprehensive abilities. This study was designed to evaluate the oral microbiology lab curriculum by analyzing student performance and perceptions regarding the curriculum from 2009 to 2013. The lab curriculum adopted modalities for cooperative learning. Students collected dental plaque from each other and isolated the cariogenic bacteria with selective medium plates. Then they purified the enrichment culture medium and identified the cariogenic strains by Gram stain and biochemical tests. Both quantitative and qualitative data for 5 years were analysed in this study. Part One of the current study assessed student performance in the lab from 2009 to 2013. Part Two used qualitative means to assess students' perceptions via an open questionnaire. The 271 study students' grades in oral microbiology improved during the lab curriculum: "A" grades rose from 60.5 to 81.2%, and "C" grades fell from 28.4 to 6.3%. All students considered the lab curriculum to be interesting and helpful. Quantitative and qualitative data converge to suggest that the lab curriculum has strengthened students' grasp of important microbiology-related theory, cultivated their scientific interest, and developed their potential and comprehensive abilities. Our student performance and perception data support the continued use of the innovative teaching system. As an extension and complement of the theory course, the oral microbiology lab curriculum appears to improve the quality of oral medicine education and help to cultivate high-quality, innovative medical talent.

  20. Are we effectively informing patients? A quantitative analysis of on-line patient education resources from the American Society of Neuroradiology.

    PubMed

    Hansberry, D R; Agarwal, N; Gonzales, S F; Baker, S R

    2014-07-01

The ubiquitous use of the Internet by the public in an attempt to better understand their health care requires that on-line resources be written at an appropriate level to maximize comprehension for the average user. The National Institutes of Health and the American Medical Association recommend on-line patient education resources written at a third-to-seventh grade level. We evaluated the readability of the patient education resources provided on the Web site of the American Society of Neuroradiology (http://www.asnr.org/patientinfo/). All patient education material from the ASNR Web site and the Society of Neurointerventional Surgery Web site was downloaded and evaluated with the computer software Readability Studio Professional Edition, using 10 quantitative readability scales: the Flesch Reading Ease, Flesch-Kincaid Grade Level, Simple Measure of Gobbledygook, Coleman-Liau Index, Gunning Fog Index, New Dale-Chall, FORCAST Formula, Fry Graph, Raygor Reading Estimate, and New Fog Count. An unpaired t test was used to compare the readability level of resources available on the American Society of Neuroradiology and the Society of Neurointerventional Surgery Web sites. The 20 individual patient education articles were written at a 13.9 ± 1.4 grade level, with only 5% written below an 11th-grade level. There was no statistical difference between the level of readability of the resources on the American Society of Neuroradiology and Society of Neurointerventional Surgery Web sites. The patient education resources on these Web sites fail to meet the guidelines of the National Institutes of Health and American Medical Association. Members of the public may fail to fully understand these resources and would benefit from revisions that result in more comprehensible information cast in simpler language. © 2014 by American Journal of Neuroradiology.
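
    Several of the readability scales listed are simple closed-form functions of word, sentence, and syllable counts; the Flesch-Kincaid grade level, for example, is 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59. A minimal sketch with a crude heuristic syllable counter (the study used dedicated software, not this code):

```python
import re

def count_syllables(word):
    """Crude heuristic: count groups of consecutive vowels (approximation only)."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text):
    """Flesch-Kincaid grade level: 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * len(words) / len(sentences) + 11.8 * syllables / len(words) - 15.59

# Illustrative patient-education style text, not taken from the evaluated Web sites.
sample = ("Magnetic resonance imaging uses strong magnetic fields to visualize "
          "internal anatomy. The procedure is painless and does not use radiation.")
print(round(flesch_kincaid_grade(sample), 1))
```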

  1. Quantitative analysis of essential oils in perfume using multivariate curve resolution combined with comprehensive two-dimensional gas chromatography.

    PubMed

    de Godoy, Luiz Antonio Fonseca; Hantao, Leandro Wang; Pedroso, Marcio Pozzobon; Poppi, Ronei Jesus; Augusto, Fabio

    2011-08-05

    The use of multivariate curve resolution (MCR) to build multivariate quantitative models using data obtained from comprehensive two-dimensional gas chromatography with flame ionization detection (GC×GC-FID) is presented and evaluated. The MCR algorithm presents some important features, such as the second-order advantage and the recovery of the instrumental response for each pure component after optimization by an alternating least squares (ALS) procedure. A model to quantify the essential oil of rosemary was built using a calibration set containing only known concentrations of the essential oil and cereal alcohol as solvent. A calibration curve correlating the concentration of the essential oil of rosemary and the instrumental response obtained from the MCR-ALS algorithm was obtained, and this calibration model was applied to predict the concentration of the oil in complex samples (mixtures of the essential oil, pineapple essence and commercial perfume). The values of the root mean square error of prediction (RMSEP) and of the root mean square error of the percentage deviation (RMSPD) obtained were 0.4% (v/v) and 7.2%, respectively. Additionally, a second model was built and used to evaluate the accuracy of the method. A model to quantify the essential oil of lemon grass was built and its concentration was predicted in the validation set and real perfume samples. The RMSEP and RMSPD obtained were 0.5% (v/v) and 6.9%, respectively, and the concentration of the essential oil of lemon grass in perfume agreed with the value stated by the manufacturer. The results indicate that the MCR algorithm is adequate to resolve the target chromatogram from the complex sample and to build multivariate models of GC×GC-FID data. Copyright © 2011 Elsevier B.V. All rights reserved.
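
    The two figures of merit reported above are easy to reproduce once predicted and nominal concentrations are available. The sketch below computes RMSEP and RMSPD for a hypothetical validation set; the concentration values are invented for illustration, and the exact percentage-deviation formula is assumed to be the root mean square of the relative errors.

```python
import numpy as np

def rmsep(y_true, y_pred):
    """Root mean square error of prediction, in concentration units (% v/v here)."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.sqrt(np.mean((y_pred - y_true) ** 2)))

def rmspd(y_true, y_pred):
    """Root mean square of the percentage deviations (assumed definition)."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.sqrt(np.mean((100.0 * (y_pred - y_true) / y_true) ** 2)))

# Hypothetical nominal vs. predicted essential-oil concentrations (% v/v)
nominal   = [2.0, 4.0, 6.0, 8.0, 10.0]
predicted = [2.2, 3.7, 6.3, 7.8, 10.4]
print(f"RMSEP = {rmsep(nominal, predicted):.2f} % (v/v)")
print(f"RMSPD = {rmspd(nominal, predicted):.1f} %")
```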

  2. Evidence-Based Practice Point-of-Care Resources: A Quantitative Evaluation of Quality, Rigor, and Content.

    PubMed

    Campbell, Jared M; Umapathysivam, Kandiah; Xue, Yifan; Lockwood, Craig

    2015-12-01

    Clinicians and other healthcare professionals need access to summaries of evidence-based information in order to provide effective care to their patients at the point-of-care. Evidence-based practice (EBP) point-of-care resources have been developed and are available online to meet this need. This study aimed to develop a comprehensive list of available EBP point-of-care resources and evaluate their processes and policies for the development of content, in order to provide a critical analysis based upon rigor, transparency and measures of editorial quality to inform healthcare providers and promote quality improvement amongst publishers of EBP resources. A comprehensive and systematic search (Pubmed, CINAHL, and Cochrane Central) was undertaken to identify available EBP point-of-care resources, defined as "web-based medical compendia specifically designed to deliver predigested, rapidly accessible, comprehensive, periodically updated, and evidence-based information (and possibly also guidance) to clinicians." A pair of investigators independently extracted information on general characteristics, content presentation, editorial quality, evidence-based methodology, and breadth and volume. Twenty-seven summary resources were identified, of which 22 met the predefined inclusion criteria for EBP point-of-care resources, and 20 could be accessed for description and assessment. Overall, the upper quartile of EBP point-of-care providers was assessed to be UpToDate, Nursing Reference Centre, Mosby's Nursing Consult, BMJ Best Practice, and JBI COnNECT+. The choice of which EBP point-of-care resources are suitable for an organization is a decision that depends heavily on the unique requirements of that organization and the resources it has available. However, the results presented in this study should enable healthcare providers to make that assessment in a clear, evidence-based manner, and provide a comprehensive list of the available options. © 2015 Sigma Theta Tau International.

  3. Fostering Self-Regulation of Students with Learning Disabilities: Insights from 30 Years of Reading Comprehension Intervention Research

    ERIC Educational Resources Information Center

    Berkeley, Sheri; Larsen, Anna

    2018-01-01

    Thirty years of intervention research on the effects of reading comprehension strategies for students with learning disabilities was reviewed in this quantitative synthesis. Specifically, researchers targeted studies that contained self-regulated learning components. A systematic search yielded 18 studies from nine research journals that met…

  4. The Influence of Textbook Format on Postsecondary Proficient and Remedial Readers: Designing Information Using Visual Language

    ERIC Educational Resources Information Center

    Tetlan, W. Lou

    2009-01-01

    This study examined whether the design of textbook material affects comprehension and memory of textbook material under certain cognitive conditions for proficient and remedial readers. Using quantitative and qualitative research methods, format was found to significantly affect comprehension and memory. Proficient Male scored significantly…

  5. The Effects of Audiobooks on EFL Students' Listening Comprehension

    ERIC Educational Resources Information Center

    Kartal, Galip; Simsek, Harun

    2017-01-01

    This study investigates the effects of audiobooks on listening comprehension skills of EFL Students, and their attitudes towards using audiobooks in a foreign language classroom. The participants are 66 first-year students of a state university in Turkey. The research follows a pre- post-test control group research design using quantitative and…

  6. Comprehending News Videotexts: The Influence of the Visual Content

    ERIC Educational Resources Information Center

    Cross, Jeremy

    2011-01-01

    Informed by dual coding theory, this study explores the role of the visual content in L2 listeners' comprehension of news videotexts. L1 research into the visual characteristics and comprehension of news videotexts is outlined, subsequently informing the quantitative analysis of audiovisual correspondence in the news videotexts used. In each of…

  7. Qualitative and quantitative estimation of comprehensive synaptic connectivity in short- and long-term cultured rat hippocampal neurons with new analytical methods inspired by Scatchard and Hill plots

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tanamoto, Ryo; Shindo, Yutaka; Niwano, Mariko

    2016-03-18

    To investigate comprehensive synaptic connectivity, we examined Ca²⁺ responses to quantitative electric current stimulation delivered through a transparent, highly conductive indium-tin-oxide (ITO) glass electrode. The number of neurons with Ca²⁺ responses was low during the application of stepwise increases of electric current in short-term cultured neurons (less than 17 days in vitro (DIV)). Neurons cultured over 17 DIV showed two types of responses, S-shaped (sigmoid) and monotonically saturating, and Scatchard plots clearly illustrated the difference between them. Furthermore, sigmoid-like neural network responses over 17 DIV were converted to monotonically saturating ones by application of a mixture of AP5 and CNQX, specific blockers of NMDA and AMPA receptors, respectively. This alteration was also characterized by a change in Hill coefficients. These findings indicate that a neural network with sigmoid-like responses has strong synergetic or cooperative synaptic connectivity via excitatory glutamate synapses. - Highlights: • We succeeded in evaluating the maturation of neural networks by Scatchard and Hill plots. • Long-term cultured neurons showed two types of responses: sigmoid and monotonous. • The sigmoid-like increase indicates the cooperativity of neural networks. • Excitatory glutamate synapses cause the cooperativity of neural networks.

  8. Evolutionary Glycomics: Characterization of Milk Oligosaccharides in Primates

    PubMed Central

    Tao, Nannan; Wu, Shuai; Kim, Jaehan; An, Hyun Joo; Hinde, Katie; Power, Michael L.; Gagneux, Pascal; German, J. Bruce; Lebrilla, Carlito B.

    2011-01-01

    Free oligosaccharides are abundant components of mammalian milk and have primary roles as prebiotic compounds, in immune defense, and in brain development. A mass spectrometry-based technique was applied to profile milk oligosaccharides from apes (chimpanzee, gorilla, and siamang), New World monkeys (golden lion tamarin and common marmoset), and an Old World monkey (rhesus). The purpose of this study was to evaluate the patterns of primate milk oligosaccharide composition from a phylogenetic perspective in order to assess the extent to which the composition of human milk oligosaccharides (hMOs) derives from ancestral primate patterns as opposed to more recent evolutionary events. Milk oligosaccharides were quantitated by nanoflow liquid chromatography on chip-based devices. The relative abundances of fucosylated and sialylated milk oligosaccharides in primates were also determined. For a systematic and comprehensive study of evolutionary patterns of milk oligosaccharides, cluster analysis of primate milk was performed using the chromatographic profiles. In general, the oligosaccharides in primate milk, including human milk, are more complex and exhibit greater diversity than those in non-primate milk. A detailed comparison of the oligosaccharides across evolution revealed a non-sequential developmental pattern; that is, primate milk oligosaccharides do not necessarily cluster according to primate phylogeny. This report represents the first comprehensive and quantitative effort to profile and elucidate the structures of free milk oligosaccharides so that they can be related to glycan function in different primates. PMID:21214271

  9. Simultaneous Qualitative and Quantitative Analyses of Triterpenoids in Ilex pubescens by Ultra-High-Performance Liquid Chromatography Coupled with Quadrupole Time-of-Flight Mass Spectrometry.

    PubMed

    Cao, Di; Wang, Qing; Jin, Jing; Qiu, Maosong; Zhou, Lian; Zhou, Xinghong; Li, Hui; Zhao, Zhongxiang

    2018-03-01

    Ilex pubescens Hook et Arn mainly contains triterpenoids that possess antithrombotic, anti-inflammatory and analgesic effects. Quantitative and qualitative analyses of the triterpenoids in I. pubescens can be useful for determining the authenticity and quality of raw materials and guiding its clinical preparation. The aim was to establish a method for rapid and comprehensive analysis of triterpenoids in I. pubescens using ultra-high-performance liquid chromatography coupled to electrospray ionisation and quadrupole time-of-flight mass spectrometry (UPLC-ESI-QTOF-MS), and to apply it to compare the contents of nine triterpenoids among the root, root heartwood and root bark of I. pubescens in order to judge the value of the root bark and avoid wastage. UPLC-ESI-QTOF-MS data from extracts of I. pubescens acquired in negative mode were analysed using Peakview and Masterview software, which provided molecular weight, mass errors, isotope pattern fit and MS/MS fragments for the identification of triterpenoids. The quantification of the nine investigated compounds of I. pubescens was accomplished using MultiQuant software. A total of 33 triterpenoids, five phenolic acids, two lignans and a flavonol were characterised in only 14 min. The total content of the nine compounds in the root bark was generally slightly higher than that of the root and root heartwood, which has not been reported before. The developed UPLC-ESI-QTOF-MS method was proven to be rapid and comprehensive for simultaneous qualitative and quantitative analyses of the characteristic triterpenoids in I. pubescens. The results may provide a basis for holistic quality control and metabolic studies of I. pubescens, as well as serve as a reference for the analysis of other Ilex plants. Copyright © 2017 John Wiley & Sons, Ltd.

  10. Quantitative OCT and MRI biomarkers for the differentiation of cartilage degeneration.

    PubMed

    Nebelung, Sven; Brill, Nicolai; Tingart, Markus; Pufe, Thomas; Kuhl, Christiane; Jahr, Holger; Truhn, Daniel

    2016-04-01

    To evaluate the usefulness of quantitative parameters obtained by optical coherence tomography (OCT) and magnetic resonance imaging (MRI) in the comprehensive assessment of human articular cartilage degeneration. Human osteochondral samples of variable degeneration (n = 45) were obtained from total knee replacements and assessed by MRI sequences measuring T1, T1ρ, T2 and T2* relaxivity and by OCT-based quantification of irregularity (OII, optical irregularity index), homogeneity (OHI, optical homogeneity index) and attenuation (OAI, optical attenuation index). Samples were also assessed macroscopically (Outerbridge classification) and histologically (Mankin classification) as grade-0 (Mankin scores 0-4)/grade-I (scores 5-8)/grade-II (scores 9-10)/grade-III (scores 11-14). After data normalisation, differences between Mankin grades and correlations between imaging parameters were assessed using ANOVA with Tukey's post-hoc test and Spearman's correlation coefficients, respectively. Sensitivities and specificities in the detection of Mankin grade-0 were calculated. Significant degeneration-related increases were found for T2 and OII and decreases for OAI, while T1, T1ρ, T2* and OHI did not reveal significant changes in relation to degeneration. A number of significant correlations between imaging parameters and histological (sub)scores were found, in particular for T2 and OII. Sensitivities and specificities in the detection of Mankin grade-0 were highest for OHI/T1 and OII/T1ρ, respectively. Quantitative OCT and MRI techniques seem to complement each other in the comprehensive assessment of cartilage degeneration. Sufficiently large structural and compositional changes in the extracellular matrix may thus be parameterized and quantified, while the detection of early degeneration remains challenging.

  11. Recommended health and safety guidelines for coal gasification pilot plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1978-01-01

    The greatest toxic hazards in coal conversion are presented by the known and suspected carcinogens, because they are hazardous at low levels of exposure, have delayed action with no immediate warning, and have grave consequences. As for routes of entry, various solids and liquids may reach the body by inhalation of particles, deposition of particles, or indirectly by contact with dirty surfaces. Other toxicants are most likely to enter the body by inhalation. The overall carcinogenic hazard cannot be precisely estimated from chemical analysis alone, because the possible interactions are far too complex. Further, the hazard cannot at present be quantitatively defined by available biological tests. The same limitations probably apply to toxic effects other than carcinogenesis, with the possible exception of some immediate responses (e.g., chemical asphyxia, primary respiratory irritation). It is not practical to recommend comprehensive workplace exposure limits on a basis similar to those for individual toxicants; however, a limit for one important kind of hazard (high-boiling suspected carcinogens) can be recommended. The carcinogenic hazards associated with airborne particles and surface contamination are the most crucial of the whole spectrum and offer a practical target for control, if not for quantitative evaluation. The only direct quantitative evidence now available is from epidemiology in analogous circumstances, and there are severe limitations on the comprehensiveness and reliability of such evidence. Some specific targets for control through industrial hygiene practices can be identified. The presence of any strong irritant of the respiratory mucosa, other mucous surfaces, and the skin should be regarded as a danger signal because of possible potentiation of carcinogens and other toxicants.

  12. Evaluation of bacterial pathogen diversity, abundance and health risks in urban recreational water by amplicon next-generation sequencing and quantitative PCR.

    PubMed

    Cui, Qijia; Fang, Tingting; Huang, Yong; Dong, Peiyan; Wang, Hui

    2017-07-01

    The microbial quality of urban recreational water is of great concern to public health. The monitoring of indicator organisms and several pathogens alone is not sufficient to accurately and comprehensively identify microbial risks. To assess the levels of bacterial pathogens and health risks in urban recreational water, we analyzed pathogen diversity and quantified four pathogens in 46 water samples collected from waterbodies in Beijing Olympic Forest Park over one year. The pathogen diversity revealed by 16S rRNA gene targeted next-generation sequencing (NGS) showed that 16 of 40 genera and 13 of 76 reference species were present. The most abundant species were Acinetobacter johnsonii, Mycobacterium avium and Aeromonas spp. Quantitative polymerase chain reaction (qPCR) of Escherichia coli (uidA), Aeromonas (aerA), M. avium (16S rRNA), Pseudomonas aeruginosa (oaa) and Salmonella (invA) showed that the aerA genes were the most abundant, occurring in all samples at concentrations of 10⁴-10⁶ genome copies/100 mL, followed by oaa, invA and M. avium. In total, 34.8% of the samples harbored all genes, indicating the prevalence of these pathogens in this recreational waterbody. Based on the qPCR results, a quantitative microbial risk assessment (QMRA) showed that the annual infection risks of Salmonella, M. avium and P. aeruginosa in five activities were mostly greater than the U.S. EPA risk limit for recreational contact, and children playing with water may be exposed to the greatest infection risk. Our findings provide a comprehensive understanding of bacterial pathogen diversity and pathogen abundance in urban recreational water obtained by applying both NGS and qPCR. Copyright © 2016. Published by Elsevier B.V.
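
    A minimal sketch of the kind of QMRA calculation described above is given below, assuming an exponential dose-response model and independent exposure events; the dose-response parameter, ingestion volume and pathogen concentration are purely illustrative and are not taken from the study.

```python
import math

def exponential_dose_response(dose, r):
    """Single-exposure infection probability under the exponential model: P = 1 - exp(-r * dose)."""
    return 1.0 - math.exp(-r * dose)

def annual_risk(per_event_risk, events_per_year):
    """Annual infection risk assuming independent exposure events."""
    return 1.0 - (1.0 - per_event_risk) ** events_per_year

# Illustrative inputs only (not taken from the study): pathogen concentration,
# ingested volume per recreation event, and exponential dose-response parameter r.
concentration = 50.0   # organisms per litre
volume = 0.01          # litres ingested per event
r = 0.00752            # illustrative dose-response parameter
p_event = exponential_dose_response(concentration * volume, r)
print(f"Per-event infection risk: {p_event:.2e}")
print(f"Annual risk over 30 events: {annual_risk(p_event, 30):.2e}")
```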

  13. Shear-wave elastography quantitative assessment of the male breast: added value to distinguish benign and malignant palpable masses.

    PubMed

    Crombé, Amandine; Hurtevent-Labrot, Gabrielle; Asad-Syed, Maryam; Palussière, Jean; MacGrogan, Gaetan; Kind, Michèle; Ferron, Stéphane

    2018-02-01

    To evaluate the ability of shear-wave elastography (SWE) to distinguish between benign and malignant palpable masses of the adult male breast. Clinical examination, mammography, B-mode and Doppler ultrasound findings and SWE quantitative parameters were compared in 50 benign lesions (including 40 gynaecomastias) and 15 malignant lesions (invasive ductal carcinomas) from 65 patients who were consecutively referred for specialized advice at our comprehensive cancer centre. Mean elasticity (El mean), maximum elasticity (El max), El mean of the surrounding fatty tissue and lesion-to-fat ratio (El ratio) were reported for each patient. Malignant masses displayed significantly higher El mean (p < 0.0001), El max (p < 0.0001) and El ratio (p < 0.0001) compared to benign masses, without overlap of values between the two groups. By adding SWE to clinical examination, mammography and ultrasound, all the lesions would have been retrospectively correctly diagnosed as benign or malignant. One false positive could have been downstaged, and 14/65 undetermined masses could have been correctly reclassified as 4 malignant and 10 benign lesions, for which biopsies could consequently have been avoided. Evaluation of male breast palpable masses by SWE demonstrates that malignant masses are significantly stiffer lesions, and SWE may improve diagnostic management when clinical examination, mammography and conventional ultrasound are doubtful. Advances in knowledge: Quantitative SWE is feasible in the male breast and could be of great interest to help classify doubtful lesions after standard clinical and radiological evaluations, probably because of the different anatomy and different tumour epidemiology compared with the female breast.

  14. Quantitative characterization of solid epoxy resins using comprehensive two dimensional liquid chromatography coupled with electrospray ionization-time of flight mass spectrometry.

    PubMed

    Julka, Samir; Cortes, Hernan; Harfmann, Robert; Bell, Bruce; Schweizer-Theobaldt, Andreas; Pursch, Matthias; Mondello, Luigi; Maynard, Shawn; West, David

    2009-06-01

    A comprehensive multidimensional liquid chromatography system coupled to Electrospray Ionization-Mass Spectrometry (LCxLC-ESI-MS) was developed for detailed characterization and quantitation of solid epoxy resin components. The two orthogonal modes of separation selected were size exclusion chromatography (SEC) in the first dimension and liquid chromatography at critical conditions (LCCC) in the second dimension. Different components present in the solid epoxy resins were separated and quantitated for the first time based on the functional groups and molecular weight heterogeneity. Coupling LCxLC separations with mass spectrometry enabled the identification of components resolved in the two-dimensional space. Several different functional group families of compounds were separated and identified, including epoxy-epoxy and epoxy-alpha-glycol functional oligomers, and their individual molecular weight ranges were determined. Repeatability obtained ranged from 0.5% for the main product to 21% for oligomers at the 0.4% concentration level.

  15. Automated volumetric segmentation of retinal fluid on optical coherence tomography

    PubMed Central

    Wang, Jie; Zhang, Miao; Pechauer, Alex D.; Liu, Liang; Hwang, Thomas S.; Wilson, David J.; Li, Dengwang; Jia, Yali

    2016-01-01

    We propose a novel automated volumetric segmentation method to detect and quantify retinal fluid on optical coherence tomography (OCT). The fuzzy level set method was introduced for identifying the boundaries of fluid filled regions on B-scans (x and y-axes) and C-scans (z-axis). The boundaries identified from three types of scans were combined to generate a comprehensive volumetric segmentation of retinal fluid. Then, artefactual fluid regions were removed using morphological characteristics and by identifying vascular shadowing with OCT angiography obtained from the same scan. The accuracy of retinal fluid detection and quantification was evaluated on 10 eyes with diabetic macular edema. Automated segmentation had good agreement with manual segmentation qualitatively and quantitatively. The fluid map can be integrated with OCT angiogram for intuitive clinical evaluation. PMID:27446676

  16. Evaluating the integration of cultural competence skills into health and physical assessment tools: a survey of Canadian schools of nursing.

    PubMed

    Chircop, Andrea; Edgecombe, Nancy; Hayward, Kathryn; Ducey-Gilbert, Cherie; Sheppard-Lemoine, Debbie

    2013-04-01

    Currently used audiovisual (AV) teaching tools to teach health and physical assessment reflect a Eurocentric bias using the biomedical model. The purpose of our study was to (a) identify commonly used AV teaching tools of Canadian schools of nursing and (b) evaluate the identified tools. A two-part descriptive quantitative method design was used. First, we surveyed schools of nursing across Canada. Second, the identified AV teaching tools were evaluated for content and modeling of cultural competence. The majority of the schools (67%) used publisher-produced videos associated with a physical assessment textbook. Major findings included minimal demonstration of negotiation with a client around cultural aspects of the interview including the need for an interpreter, modesty, and inclusion of support persons. Identification of culturally specific examples given during the videos was superficial and did not provide students with a comprehensive understanding of necessary culturally competent skills.

  17. The training for health equity network evaluation framework: a pilot study at five health professional schools.

    PubMed

    Ross, Simone J; Preston, Robyn; Lindemann, Iris C; Matte, Marie C; Samson, Rex; Tandinco, Filedito D; Larkins, Sarah L; Palsdottir, Bjorg; Neusy, Andre-Jacques

    2014-01-01

    The Training for Health Equity Network (THEnet), a group of diverse health professional schools aspiring toward social accountability, developed and pilot tested a comprehensive evaluation framework to assess progress toward socially accountable health professions education. The evaluation framework provides criteria for schools to assess their level of social accountability within their organization and planning; education, research and service delivery; and the direct and indirect impacts of the school and its graduates on the community and health system. This paper describes the pilot implementation and testing of the evaluation framework across five THEnet schools, and examines whether the evaluation framework was practical and feasible across contexts for the purposes of critical reflection and continuous improvement in terms of progress towards social accountability. In this pilot study, schools applied the evaluation framework using a mixed-method approach to data collection comprising workshops, qualitative interviews and focus group discussions, document review, and collation and analysis of existing quantitative data. The evaluation framework allowed each school to contextually gather evidence on how it was meeting the aspirational goals of social accountability across a range of school activities, and to identify strengths and areas for improvement and development. The evaluation framework pilot study demonstrated how social accountability can be assessed through a critically reflective and comprehensive process. As social accountability focuses on the relationship between health professions schools, the health system and population health outcomes, each school was able to demonstrate to students, health professionals, governments, accrediting bodies, communities and other stakeholders how current and future health care needs of populations are addressed in terms of education, research, and service learning.

  18. The Basic Shelf Experience: a comprehensive evaluation.

    PubMed

    Dewolfe, Judith A; Greaves, Gaye

    2003-01-01

    The Basic Shelf Experience is a program designed to assist people living on limited incomes to make better use of their food resources. The purpose of this research was to learn if the Basic Shelf Experience program helps such people to 1. utilize food resources more effectively and 2. cope, through group support, with poverty-associated stressors that influence food security. Both quantitative and qualitative methods were used to evaluate the program objectives. Participants completed a questionnaire at the beginning and end of the six-week program. The questionnaire asked about their food access, food security, and feelings about themselves. Participants returned for a focus group discussion and completed the questionnaire again three months after the program ended. The focus group was designed to elicit information about perceived changes, if any, attributed to the program. Forty-two people completed the questionnaires pre-program and 20 post-program; 17 participated in the three-month follow-up session. While results from quantitative data analysis indicate that program objectives were not met, qualitative data provide evidence that the program did achieve its stated objectives. Our results suggest such programs as the Basic Shelf Experience can assist people living on limited incomes to achieve food security.

  19. Method for detecting damage in carbon-fibre reinforced plastic-steel structures based on eddy current pulsed thermography

    NASA Astrophysics Data System (ADS)

    Li, Xuan; Liu, Zhiping; Jiang, Xiaoli; Lodewijks, Gabrol

    2018-01-01

    Eddy current pulsed thermography (ECPT) is well established for non-destructive testing of electrically conductive materials, featuring the advantages of contactless operation, intuitive detection and efficient heating. The concept of divergence characterization of the damage rate of carbon fibre-reinforced plastic (CFRP)-steel structures can be extended to ECPT thermal pattern characterization. It was found in this study that the use of ECPT on CFRP-steel structures generated a sizeable amount of valuable information for comprehensive material diagnostics. The relationship between divergence and transient thermal patterns can be identified and analysed by deploying mathematical models to analyse information about fibre texture-like orientations, gaps and undulations in these multi-layered materials. The developed algorithm enabled the removal of information about fibre texture and the extraction of damage features. The model of CFRP-glue-steel structures with damage was established using COMSOL Multiphysics® software, and quantitative non-destructive damage evaluation from the ECPT image areas was derived. The results of this proposed method illustrate that damaged areas are highly affected by available information about fibre texture. The proposed method can be applied to the detection of impact-induced damage and the quantitative evaluation of CFRP structures.

  20. Quantitative multi-modal NDT data analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heideklang, René; Shokouhi, Parisa

    2014-02-18

    A single NDT technique is often not adequate to provide assessments about the integrity of test objects with the required coverage or accuracy. In such situations, one often resorts to multi-modal testing, where complementary and overlapping information from different NDT techniques is combined for a more comprehensive evaluation. Multi-modal material and defect characterization is an interesting task which involves several diverse fields of research, including signal and image processing, statistics and data mining. The fusion of different modalities may improve quantitative nondestructive evaluation by effectively exploiting the augmented set of multi-sensor information about the material. It is the redundant information in particular whose quantification is expected to lead to increased reliability and robustness of the inspection results. There are different systematic approaches to data fusion, each with its specific advantages and drawbacks. In our contribution, these will be discussed in the context of nondestructive materials testing. A practical study adopting a high-level scheme for the fusion of Eddy Current, GMR and Thermography measurements on a reference metallic specimen with built-in grooves will be presented. Results show that fusion is able to outperform the best single sensor regarding detection specificity, while retaining the same level of sensitivity.

  1. A model of comprehensive unification

    NASA Astrophysics Data System (ADS)

    Reig, Mario; Valle, José W. F.; Vaquera-Araujo, C. A.; Wilczek, Frank

    2017-11-01

    Comprehensive - that is, gauge and family - unification using spinors has many attractive features, but it has been challenged to explain chirality. Here, by combining an orbifold construction with more traditional ideas, we address that difficulty. Our candidate model features three chiral families and leads to an acceptable result for quantitative unification of couplings. A potential target for accelerator and astronomical searches emerges.

  2. Exploring K-3 Teachers' Implementation of Comprehension Strategy Instruction (CSI) Using Expectancy-Value Theory

    ERIC Educational Resources Information Center

    Foley, Laura S.

    2011-01-01

    This research investigated factors that influence the implementation levels of evidence-based comprehension strategy instruction (CSI) among K-3 teachers. An explanatory design was chosen to gather and probe the data. Quantitative data were gathered via a mailed survey distributed through a representative sample of the 40 school districts (through…

  3. A Case Study of the Impact of Guided Reading Groups in Second Grade on Comprehension Improvement

    ERIC Educational Resources Information Center

    Lorent Deegan, Chanin E.

    2010-01-01

    This study combined both qualitative and quantitative research to determine the impact of instructional practices on comprehension improvement in second grade Guided Reading groups. Four second grade teachers and their 73 students ages seven through eight years old participated in this study. Additionally, the study examined the effects of Guided…

  4. Does mClass Reading 3D Predict Student Reading Proficiency on High-Stakes Assessments?

    ERIC Educational Resources Information Center

    Bowles, Amy S.

    2015-01-01

    This quantitative, correlational study investigated the relationship between the North Carolina End of Grade Assessment of Reading Comprehension (NCEOG) and mClass Reading 3D assessment in a North Carolina elementary school. It especially examined the degree to which mClass Reading 3D measures predict scores on the reading comprehension portion of…

  5. Selection of representative embankments based on rough set - fuzzy clustering method

    NASA Astrophysics Data System (ADS)

    Bin, Ou; Lin, Zhi-xiang; Fu, Shu-yan; Gao, Sheng-song

    2018-02-01

    A prerequisite for the comprehensive evaluation of embankment safety is the selection of representative unit embankments; on the basis of dividing the levee into units, the influencing factors and the classification of the unit embankments are drafted. Based on rough set-fuzzy clustering, the influencing factors of each unit embankment are measured by quantitative and qualitative indexes. A fuzzy similarity matrix of the standard embankment is constructed, and its fuzzy equivalence matrix is calculated by the square method. By setting a threshold on the fuzzy equivalence matrix, the unit embankments are clustered, and the representative unit embankment is selected from each class of the embankment.
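
    A minimal sketch of the clustering step described above is given below: a fuzzy similarity matrix is turned into a fuzzy equivalence matrix by repeated max-min self-composition (the square method), and a lambda-cut on that matrix groups the unit embankments. The similarity values and the threshold are invented for illustration; how the paper actually scores the quantitative and qualitative indexes is not reproduced here.

```python
import numpy as np

def max_min_compose(a, b):
    """Max-min composition of two fuzzy relation matrices."""
    n = a.shape[0]
    out = np.zeros_like(a)
    for i in range(n):
        for j in range(n):
            out[i, j] = np.max(np.minimum(a[i, :], b[:, j]))
    return out

def transitive_closure(r):
    """Square method: compose R with itself until it stabilises (fuzzy equivalence matrix)."""
    cur = r
    while True:
        nxt = max_min_compose(cur, cur)
        if np.allclose(nxt, cur):
            return cur
        cur = nxt

def lambda_cut_clusters(eq, lam):
    """Group indices whose pairwise equivalence values reach the threshold lambda."""
    n = eq.shape[0]
    labels = [-1] * n
    next_label = 0
    for i in range(n):
        if labels[i] == -1:
            for j in range(n):
                if eq[i, j] >= lam and labels[j] == -1:
                    labels[j] = next_label
            next_label += 1
    return labels

# Illustrative fuzzy similarity matrix for four unit embankments (values invented).
R = np.array([[1.0, 0.8, 0.3, 0.2],
              [0.8, 1.0, 0.4, 0.3],
              [0.3, 0.4, 1.0, 0.7],
              [0.2, 0.3, 0.7, 1.0]])
eq = transitive_closure(R)
print(lambda_cut_clusters(eq, lam=0.7))   # e.g. [0, 0, 1, 1]
```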

  6. Impact of implementing an Internal Bed Regulation Committee on administrative and care indicators at a teaching hospital.

    PubMed

    Rodrigues, Luciane Cristine Ribeiro; Juliani, Carmen Maria Casquel Monti

    2015-01-01

    To compare hospital indicators before and after implementing an Internal Bed Regulation Committee at a reference hospital. This is a quantitative, evaluative, exploratory, descriptive and cross-sectional study. The data were gathered from the hospital administrative reports for the period 2008-2013, provided by the Information Technology Center of the Complexo FAMEMA. The indicators improved after implementation of the Internal Bed Regulation Committee, and the individuals involved in the process acknowledged the improvement. It is necessary to continue the regulatory actions, especially in a comprehensive and complex healthcare system such as the Brazilian Sistema Único de Saúde.

  7. Comprehensive, Quantitative Risk Assessment of CO₂ Geologic Sequestration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lepinski, James

    2013-09-30

    A Quantitative Failure Modes and Effects Analysis (QFMEA) was developed to conduct comprehensive, quantitative risk assessments on CO₂ capture, transportation, and sequestration or use in deep saline aquifers, enhanced oil recovery operations, or enhanced coal bed methane operations. The model identifies and characterizes potential risks; identifies the likely failure modes, causes, effects and methods of detection; lists possible risk prevention and risk mitigation steps; estimates potential damage recovery costs, mitigation costs and cost savings resulting from mitigation; and ranks (prioritizes) risks according to the probability of failure, the severity of failure, the difficulty of early failure detection and the potential for fatalities. The QFMEA model generates the information needed for effective project risk management. Diverse project information can be integrated into a concise, common format that allows comprehensive, quantitative analysis by a cross-functional team of experts to determine: What can possibly go wrong? How much will damage recovery cost? How can it be prevented or mitigated? What is the cost savings or benefit of prevention or mitigation? Which risks should be given highest priority for resolution? The QFMEA model can be tailored to specific projects and is applicable to new projects as well as mature projects. The model can be revised and updated as new information becomes available. It accepts input from multiple sources, such as literature searches, site characterization, field data, computer simulations, analogues, process influence diagrams, probability density functions, financial analysis models, cost factors, and heuristic best practices manuals, and converts the information into a standardized format in an Excel spreadsheet. Process influence diagrams, geologic models, financial models, cost factors and an insurance schedule were developed to support the QFMEA model. Comprehensive, quantitative risk assessments were conducted on three (3) sites using the QFMEA model: (1) SACROC Northern Platform CO₂-EOR Site in the Permian Basin, Scurry County, TX, (2) Pump Canyon CO₂-ECBM Site in the San Juan Basin, San Juan County, NM, and (3) Farnsworth Unit CO₂-EOR Site in the Anadarko Basin, Ochiltree County, TX. The sites were sufficiently different from each other to test the robustness of the QFMEA model.
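
    The ranking step can be illustrated with a classic FMEA-style risk priority number. The sketch below multiplies illustrative 1-10 ratings for probability, severity, detection difficulty and fatality potential; the actual QFMEA scales, weights and cost factors are not published in this abstract, so both the scoring scheme and the failure modes listed are assumptions for demonstration only.

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    name: str
    probability: int   # 1 (rare) .. 10 (frequent) -- illustrative scale
    severity: int      # 1 (negligible) .. 10 (catastrophic)
    detection: int     # 1 (easily detected early) .. 10 (hard to detect)
    fatality: int      # 1 (none expected) .. 10 (multiple fatalities possible)

    def priority(self) -> int:
        """FMEA-style risk priority number; the published QFMEA weighting may differ."""
        return self.probability * self.severity * self.detection * self.fatality

# Hypothetical failure modes for a CO2 storage project (ratings invented).
modes = [
    FailureMode("CO2 leakage through an abandoned wellbore", 3, 8, 6, 2),
    FailureMode("Pipeline rupture during transport", 2, 7, 3, 4),
    FailureMode("Brine displacement into a freshwater aquifer", 4, 5, 7, 1),
]
for m in sorted(modes, key=FailureMode.priority, reverse=True):
    print(f"{m.name}: priority = {m.priority()}")
```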

  8. Monitoring and Assessment of Youshui River Water Quality in Youyang

    NASA Astrophysics Data System (ADS)

    Wang, Xue-qin; Wen, Juan; Chen, Ping-hua; Liu, Na-na

    2018-02-01

    By monitoring the water quality of Youshui River from January 2016 to December 2016, and according to the indicator grading and assessment standards for water quality, formulas for three types of water quality index were established. These three indexes (the single-indicator index Ai, the single-moment index Ak and the comprehensive water quality index A) were used to quantitatively evaluate the quality of each indicator, the overall water quality, and the change of water quality over time. The results show that the total phosphorus and fecal coliform indicators exceeded the standard, while the other 16 measured indicators met the standard. The water quality index of Youshui River is 0.93 and the grade of the comprehensive water quality assessment is level 2, which indicates that the water quality of Youshui River is good, with room for further improvement. To this end, several protection measures for Youshui River environmental management and pollution treatment are proposed.
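
    Since the abstract does not reproduce the three formulas, the sketch below shows one common way such indexes are built, assuming each index is an average of measured-concentration-to-standard ratios: Ai averages one indicator over all sampling times, Ak averages all indicators at one sampling time, and A averages over both. The standards and monthly data in the example are invented for illustration and do not correspond to the Youshui River measurements.

```python
import numpy as np

def single_indicator_index(conc_over_time, standard):
    """A_i: average ratio of one indicator's concentration to its standard over all
    sampling times (assumed functional form)."""
    return float(np.mean(np.asarray(conc_over_time) / standard))

def single_moment_index(conc_at_t, standards):
    """A_k: average indicator/standard ratio across all indicators at one sampling time."""
    return float(np.mean(np.asarray(conc_at_t) / np.asarray(standards)))

def comprehensive_index(conc_matrix, standards):
    """A: average of all indicator/standard ratios over both indicators and months."""
    ratios = np.asarray(conc_matrix) / np.asarray(standards)[None, :]
    return float(np.mean(ratios))

# Invented monthly data: rows = months, columns = indicators (TP, fecal coliform, COD).
standards = [0.2, 2000.0, 20.0]          # illustrative limits, one per indicator
data = [[0.25, 2400.0, 12.0],
        [0.18, 1800.0, 15.0],
        [0.22, 2100.0, 10.0]]
print(round(single_indicator_index([row[0] for row in data], standards[0]), 2))
print(round(single_moment_index(data[0], standards), 2))
print(round(comprehensive_index(data, standards), 2))
```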

  9. Use of the conditional marketing authorization pathway for oncology medicines in Europe.

    PubMed

    Hoekman, J; Boon, W P C; Bouvy, J C; Ebbers, H C; de Jong, J P; De Bruin, M L

    2015-11-01

    Conditional marketing authorization (CMA) in the European Union (EU) is an early access pathway for medicines that show promising therapeutic effects, but for which comprehensive data are not available. Using a mixed quantitative-qualitative research design, we evaluated how CMA was used in the marketing authorization of oncology medicines in the period 2006 to 2013. We show that, compared to full marketing authorization, CMA is granted based on less comprehensive data. However, this is accompanied by significantly longer assessment times and less consensus among regulators about marketing authorization. Moreover, development time from first-in-human testing to marketing authorization did not differ between full marketing authorization and CMA, but was significantly longer for CMA compared to products granted accelerated approval in the United States (US). The results indicate that CMA is not used by companies as a prospectively planned pathway to obtain early access, but as a "rescue option" when submitted data are not strong enough to justify full marketing authorization. © 2015, The American Society for Clinical Pharmacology and Therapeutics.

  10. Improving text comprehension strategies in upper primary school children: a design experiment.

    PubMed

    De Corte, E; Verschaffel, L; Van De Ven, A

    2001-12-01

    With respect to the acquisition of competence in reading, new standards for primary education stress more than before the importance of learning and teaching cognitive and metacognitive strategies that facilitate text comprehension. Therefore, there is a need to design a research-based instructional approach to strategic reading comprehension. The design experiment aimed at developing, implementing and evaluating a research-based, but also practically applicable learning environment for enhancing skilled strategy use in upper primary school children when reading a text. Four text comprehension strategies (activating prior knowledge, clarifying difficult words, making a schematic representation of the text, and formulating the main idea) and a metacognitive strategy (regulating one's own reading process) were trained through a variety of highly interactive instructional techniques, namely modelling, whole class discussion, and small group work in the format of reciprocal teaching. Participants in the study were four experimental 5th grade classes (79 children) and eight comparable control classes (149 pupils). The effects of the learning environment were measured using a pretest-post-test-retention design. Multilevel hierarchical linear regression models were used to analyse the quantitative data of a Reading Strategy Test, a standardised Reading Comprehension Test, a Reading Attitude Scale, a Transfer Test and an interview about strategy use during reading. The data of the Reading Strategy Test, the Transfer Test and the interviews about strategy use showed that the experimental group out-performed the control group in terms of the strategy adoption and application during text reading. Whilst the experimental group also scored higher on the Reading Comprehension Test than the control group, the difference was not significant. This design experiment shows that it is possible to foster pupils' use and transfer of strategic reading comprehension skills in regular classrooms by immersing them in a powerful learning environment. But this intervention does not automatically result in improvement of performance on a standardised reading comprehension test.

  11. Experimental Null Method to Guide the Development of Technical Procedures and to Control False-Positive Discovery in Quantitative Proteomics.

    PubMed

    Shen, Xiaomeng; Hu, Qiang; Li, Jun; Wang, Jianmin; Qu, Jun

    2015-10-02

    Comprehensive and accurate evaluation of data quality and false-positive biomarker discovery is critical to direct the method development/optimization for quantitative proteomics, which nonetheless remains challenging largely due to the high complexity and unique features of proteomic data. Here we describe an experimental null (EN) method to address this need. Because the method experimentally measures the null distribution (either technical or biological replicates) using the same proteomic samples, the same procedures and the same batch as the case-vs-control experiment, it correctly reflects the collective effects of technical variability (e.g., variation/bias in sample preparation, LC-MS analysis, and data processing) and project-specific features (e.g., characteristics of the proteome and biological variation) on the performances of quantitative analysis. As a proof of concept, we employed the EN method to assess the quantitative accuracy and precision and the ability to quantify subtle ratio changes between groups using different experimental and data-processing approaches and in various cellular and tissue proteomes. It was found that choices of quantitative features, sample size, experimental design, data-processing strategies, and quality of chromatographic separation can profoundly affect quantitative precision and accuracy of label-free quantification. The EN method was also demonstrated as a practical tool to determine the optimal experimental parameters and rational ratio cutoff for reliable protein quantification in specific proteomic experiments, for example, to identify the necessary number of technical/biological replicates per group that affords sufficient power for discovery. Furthermore, we assessed the ability of the EN method to estimate levels of false-positives in the discovery of altered proteins, using two concocted sample sets mimicking proteomic profiling using technical and biological replicates, respectively, where the true-positives/negatives are known and span a wide concentration range. It was observed that the EN method correctly reflects the null distribution in a proteomic system and accurately measures the false altered-protein discovery rate (FADR). In summary, the EN method provides a straightforward, practical, and accurate alternative to statistics-based approaches for the development and evaluation of proteomic experiments and can be universally adapted to various types of quantitative techniques.

  12. [Application of three risk assessment models in occupational health risk assessment of dimethylformamide].

    PubMed

    Wu, Z J; Xu, B; Jiang, H; Zheng, M; Zhang, M; Zhao, W J; Cheng, J

    2016-08-20

    Objective: To investigate the application of the United States Environmental Protection Agency (EPA) inhalation risk assessment model, the Singapore semi-quantitative risk assessment model, and the occupational hazards risk assessment index method to occupational health risk in enterprises using dimethylformamide (DMF) in an area of Jiangsu, China, and to put forward related risk control measures. Methods: The industries involving DMF exposure in Jiangsu Province were chosen as the evaluation objects in 2013 and the three risk assessment models were applied. EPA inhalation risk assessment model: HQ = EC/RfC; Singapore semi-quantitative risk assessment model: Risk = (HR × ER)^(1/2); occupational hazards risk assessment index = 2^(health effect level) × 2^(exposure ratio) × operation condition level. Results: The hazard quotients (HQ > 1) from the EPA inhalation risk assessment model suggested that all the workshops (dry method, wet method and printing) and work positions (pasting, burdening, unreeling, rolling, assisting) were high risk. The Singapore semi-quantitative risk assessment model indicated that the risk levels of the dry method, wet method and printing workshops were 3.5 (high), 3.5 (high) and 2.8 (general), and the position risk levels of pasting, burdening, unreeling, rolling and assisting were 4 (high), 4 (high), 2.8 (general), 2.8 (general) and 2.8 (general). The occupational hazards risk assessment index method gave position risk indexes for pasting, burdening, unreeling, rolling and assisting of 42 (high), 33 (high), 23 (middle), 21 (middle) and 22 (middle). The results of the Singapore semi-quantitative risk assessment model and the occupational hazards risk assessment index method were similar, while the EPA inhalation risk assessment model indicated that all workshops and positions were high risk. Conclusion: The occupational hazards risk assessment index method fully considers health effects, exposure, and operating conditions, and can comprehensively and accurately evaluate the occupational health risk caused by DMF.
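
    The three formulas quoted above are simple enough to evaluate directly. The sketch below computes them for illustrative input values (the exposure concentrations and ratings are invented, not taken from the study; the Singapore HR/ER ratings are assumed to be on the usual 1-5 scales).

```python
import math

def epa_hazard_quotient(ec, rfc):
    """US EPA inhalation model: HQ = EC / RfC; HQ > 1 flags unacceptable risk."""
    return ec / rfc

def singapore_risk(hazard_rating, exposure_rating):
    """Singapore semi-quantitative model: Risk = sqrt(HR x ER), ratings assumed on 1-5 scales."""
    return math.sqrt(hazard_rating * exposure_rating)

def occupational_hazard_index(health_effect_level, exposure_ratio_level, operation_condition_level):
    """Index = 2^(health effect level) x 2^(exposure ratio level) x operation condition level."""
    return (2 ** health_effect_level) * (2 ** exposure_ratio_level) * operation_condition_level

# Illustrative inputs only (not values from the study).
print(epa_hazard_quotient(ec=18.0, rfc=15.0))        # 1.2 -> HQ > 1, high risk
print(round(singapore_risk(4, 3), 1))                # 3.5 -> high
print(occupational_hazard_index(3, 2, 1.3))          # 41.6
```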

  13. [Grades evaluation of Scutellariae Radix slices based on quality constant].

    PubMed

    Deng, Zhe; Zhang, Jun; Jiao, Meng-Jiao; Zhong, Wen; Cui, Wen-Jin; Cheng, Jin-Tang; Chen, Sha; Wang, Yue-Sheng; Liu, An

    2017-05-01

    By measuring the morphological indexes and the marker-component contents of 22 batches of Scutellariae Radix slices and calculating the quality constant, this research aimed to establish a new method for evaluating the specifications and grades of Scutellariae Radix slices. The quality constants of these samples were in the range of 0.04-0.49, which can be divided into several grades according to actual requirements. If they are divided into three grades, the quality constant is ≥0.39 for the first grade, <0.39 but ≥0.24 for the second grade, and <0.24 for the third grade. This work indicated that the quality constant, characterizing both apparent parameters and intrinsic quality, can be used as a comprehensive evaluation index to classify the grades of traditional Chinese medicine quantitatively, clearly and objectively. The research results provide new ideas and references for evaluating the specifications and grades of traditional Chinese medicines. Copyright© by the Chinese Pharmaceutical Association.
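
    The three-grade cut-offs reported above translate directly into a classification rule, sketched below; note that the abstract does not spell out how the quality constant itself is computed from the morphological indexes and marker contents, so only the thresholding step is shown.

```python
def grade_by_quality_constant(q: float) -> str:
    """Three-grade classification using the cut-offs reported for Scutellariae Radix slices."""
    if q >= 0.39:
        return "first grade"
    if q >= 0.24:
        return "second grade"
    return "third grade"

for q in (0.45, 0.30, 0.10):
    print(f"quality constant {q:.2f} -> {grade_by_quality_constant(q)}")
```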

  14. Application of life cycle assessment for an evaluation of wastewater treatment and reuse project--case study of Xi'an, China.

    PubMed

    Zhang, Q H; Wang, X C; Xiong, J Q; Chen, R; Cao, B

    2010-03-01

    To illustrate the benefit of a wastewater treatment and reuse project, a life cycle assessment (LCA) model was proposed that combines process-based LCA and input-output-based LCA in one framework and uses energy consumption as the sole parameter for quantitative evaluation of the project. The life cycle consumption was evaluated mainly by life cycle inventory (LCI) analysis, taking into account the construction, operation and demolition phases of the project. For evaluating the life cycle benefit of treated water reuse, attention was paid to the decrease in secondary effluent discharge and to water saving. A comprehensive LCA analysis of a case project in Xi'an, China, showed that the life cycle benefit gained from treated wastewater reuse far surpassed the life cycle energy consumption. The advantage of wastewater treatment and reuse was clearly demonstrated by LCA analysis using the proposed model. 2009 Elsevier Ltd. All rights reserved.

  15. [Data fusion and multi-components quantitative analysis for identification and quality evaluation of Gentiana rigescens from different geographical origins].

    PubMed

    Wang, Qin-Qin; Shen, Tao; Zuo, Zhi-Tian; Huang, Heng-Yu; Wang, Yuan-Zhong

    2018-03-01

    The accumulation of secondary metabolites in traditional Chinese medicine (TCM) is closely related to its geographical origin. Identification of origin and multi-component quantitative evaluation are of great significance for ensuring the quality of medicinal materials. In this study, Gentiana rigescens from different geographical origins was identified by data fusion of Fourier transform infrared (FTIR) spectroscopy and high performance liquid chromatography (HPLC) in combination with partial least squares discriminant analysis; meanwhile, quantitative analysis of index components was conducted to provide an accurate and comprehensive identification and quality evaluation strategy for selecting the best production areas of G. rigescens. The FTIR and HPLC information of 169 G. rigescens samples from Yunnan, Sichuan, Guangxi and Guizhou Provinces was collected. The raw infrared spectra were pre-treated by multiplicative scatter correction, standard normal variate (SNV) and Savitzky-Golay (SG) derivative. The performances of FTIR, HPLC, low-level data fusion and mid-level data fusion for identification were then compared, and the contents of gentiopicroside, swertiamarin, loganic acid and sweroside were determined by HPLC. The results showed that the FTIR spectra of G. rigescens from different geographical origins differed, and the best pre-treatment method was SNV + SG derivative (second derivative, window parameter 15, polynomial order 2). The prediction-set accuracy of low- and mid-level data fusion (96.43%) was higher than that of FTIR or HPLC alone (94.64%), and the training-set accuracy of low-level data fusion (100%) was higher than that of mid-level data fusion (99.12%). The contents of the iridoid glycosides from Yunnan were the highest among the provinces. The average content of gentiopicroside, the bioactive marker listed in the Chinese Pharmacopoeia, was 47.40 mg·g⁻¹, with a maximum of 79.83 mg·g⁻¹. The contents of loganic acid, sweroside and gentiopicroside in Yunnan were significantly different from those in other provinces (P < 0.05). Comparing the total content of iridoid glycosides in G. rigescens from different origins within Yunnan, the amounts were higher in Eryuan, Dali (68.59 mg·g⁻¹) and Yulong, Lijiang (66.68 mg·g⁻¹), significantly higher than in Wuding, Chuxiong (52.99 mg·g⁻¹), Chengjiang, Yuxi (52.29 mg·g⁻¹) and Xundian, Kunming (46.71 mg·g⁻¹) (P < 0.05), so these two places can serve as reference regions for screening cultivation areas and superior germplasm resources of G. rigescens. A comprehensive and accurate method for identification and quality evaluation of G. rigescens was established by fusing FTIR and HPLC data and by quantitative HPLC analysis, which could support the development and utilization of G. rigescens. Copyright© by the Chinese Pharmaceutical Association.
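
    A minimal sketch of the low-level fusion strategy mentioned above is given below: each instrumental block is autoscaled and the variables are simply concatenated before a PLS-DA model (implemented as PLS regression on one-hot class labels) is fitted. The matrix sizes, random data and number of latent variables are placeholders; the actual spectral pre-treatments (MSC, SNV, SG derivative) and the mid-level variant are not reproduced.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)

# Placeholder blocks: 169 samples x simulated FTIR variables and 4 HPLC peak areas.
X_ftir = rng.normal(size=(169, 1557))
X_hplc = rng.normal(size=(169, 4))
origins = rng.integers(0, 4, size=169)        # 4 provinces, coded 0-3

# Low-level fusion: autoscale each block, then concatenate the raw variables.
X_fused = np.hstack([StandardScaler().fit_transform(X_ftir),
                     StandardScaler().fit_transform(X_hplc)])

# PLS-DA: PLS regression against one-hot class membership; predict the largest response.
Y = np.eye(4)[origins]
pls = PLSRegression(n_components=10).fit(X_fused, Y)
predicted = pls.predict(X_fused).argmax(axis=1)
print("Training accuracy:", float(np.mean(predicted == origins)))
```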

  16. Filipino ESL Learners' Attitudes toward Cooperative Learning and Their Relationship to Reading Comprehension

    ERIC Educational Resources Information Center

    Gonzales, Wilkinson Daniel Wong; Torres, Patrisha Lliane O.

    2016-01-01

    Departing from Gonzales and Torres' (2015) quantitative quasi-experimental study on the effectiveness of cooperative learning (CL) on Grade 8 students in a private school in Manila, the present study attempts to uncover possible findings in the affective dimension through a 25-item survey questionnaire that quantitatively gauges the Grade 8…

  17. Hypermedia as an Educational Technology: A Review of the Quantitative Research Literature on Learner Comprehension, Control, and Style.

    ERIC Educational Resources Information Center

    Dillon, Andrew; Gabbard, Ralph

    1998-01-01

    Published studies of hypermedia as an educational technology that have emphasized quantitative and empirical assessment of outcomes were reviewed. The evidence to date suggests that hypermedia benefits in education are limited to learning tasks depending on repeated manipulation and searching of information. Implications for instruction and…

  18. Developing Quantitative Reasoning: Will Taking Traditional Math Courses Suffice? An Empirical Study

    ERIC Educational Resources Information Center

    Agustin, Ma Zenia; Agustin, Marcus; Brunkow, Paul; Thomas, Susan

    2012-01-01

    Southern Illinois University Edwardsville (SIUE) is a postbaccalaureate comprehensive university in the Midwest. In 2005, SIUE embarked on a challenging journey of general education reform. A review of the current general education program revealed that it is possible for a student to graduate from SIUE without taking a quantitative course. Hence,…

  19. Combinatorial modification of human histone H4 quantitated by two-dimensional liquid chromatography coupled with top down mass spectrometry.

    PubMed

    Pesavento, James J; Bullock, Courtney R; LeDuc, Richard D; Mizzen, Craig A; Kelleher, Neil L

    2008-05-30

    Quantitative proteomics has focused heavily on correlating protein abundances, ratios, and dynamics by developing methods that are protein expression-centric (e.g. isotope coded affinity tag, isobaric tag for relative and absolute quantification, etc.). These methods effectively detect changes in protein abundance but fail to provide a comprehensive perspective of the diversity of proteins such as histones, which are regulated by post-translational modifications. Here, we report the characterization of modified forms of HeLa cell histone H4 with a dynamic range >10⁴ using a strictly Top Down mass spectrometric approach coupled with two dimensions of liquid chromatography. This enhanced dynamic range enabled the precise characterization and quantitation of 42 forms uniquely modified by combinations of methylation and acetylation, including those with trimethylated Lys-20, monomethylated Arg-3, and the novel dimethylated Arg-3 (each <1% of all H4 forms). Quantitative analyses revealed distinct trends in acetylation site occupancy depending on Lys-20 methylation state. Because both modifications are dynamically regulated through the cell cycle, we simultaneously investigated acetylation and methylation kinetics through three cell cycle phases and used these data to statistically assess the robustness of our quantitative analysis. This work represents the most comprehensive analysis of histone H4 forms present in human cells reported to date.

  20. Evaluating the More Suitable ISM Frequency Band for IoT-Based Smart Grids: A Quantitative Study of 915 MHz vs. 2400 MHz.

    PubMed

    Sandoval, Ruben M; Garcia-Sanchez, Antonio-Javier; Garcia-Sanchez, Felipe; Garcia-Haro, Joan

    2016-12-31

    IoT has begun to be employed pervasively in industrial environments and critical infrastructures thanks to its positive impact on performance and efficiency. Among these environments, the Smart Grid (SG) excels as the perfect host for this technology, mainly due to its potential to become the motor of the rest of electrically-dependent infrastructures. To make this SG-oriented IoT cost-effective, most deployments employ unlicensed ISM bands, specifically the 2400 MHz one, due to its extended communication bandwidth in comparison with lower bands. This band has been extensively used for years by Wireless Sensor Networks (WSN) and Mobile Ad-hoc Networks (MANET), from which the IoT technologically inherits. However, this work questions and evaluates the suitability of such a "default" communication band in SG environments, compared with the 915 MHz ISM band. A comprehensive quantitative comparison of these bands has been accomplished in terms of: power consumption, average network delay, and packet reception rate. To allow such a study, a dual-band propagation model specifically designed for the SG has been derived, tested, and incorporated into the well-known TOSSIM simulator. Simulation results reveal that only in the absence of other 2400 MHz interfering devices (such as WiFi or Bluetooth) or in small networks, is the 2400 MHz band the best option. In any other case, SG-oriented IoT quantitatively perform better if operating in the 915 MHz band.

  1. Evaluating the More Suitable ISM Frequency Band for IoT-Based Smart Grids: A Quantitative Study of 915 MHz vs. 2400 MHz

    PubMed Central

    Sandoval, Ruben M.; Garcia-Sanchez, Antonio-Javier; Garcia-Sanchez, Felipe; Garcia-Haro, Joan

    2016-01-01

    IoT has begun to be employed pervasively in industrial environments and critical infrastructures thanks to its positive impact on performance and efficiency. Among these environments, the Smart Grid (SG) excels as the perfect host for this technology, mainly due to its potential to become the motor of the rest of electrically-dependent infrastructures. To make this SG-oriented IoT cost-effective, most deployments employ unlicensed ISM bands, specifically the 2400 MHz one, due to its extended communication bandwidth in comparison with lower bands. This band has been extensively used for years by Wireless Sensor Networks (WSN) and Mobile Ad-hoc Networks (MANET), from which the IoT technologically inherits. However, this work questions and evaluates the suitability of such a “default” communication band in SG environments, compared with the 915 MHz ISM band. A comprehensive quantitative comparison of these bands has been accomplished in terms of: power consumption, average network delay, and packet reception rate. To allow such a study, a dual-band propagation model specifically designed for the SG has been derived, tested, and incorporated into the well-known TOSSIM simulator. Simulation results reveal that only in the absence of other 2400 MHz interfering devices (such as WiFi or Bluetooth) or in small networks, is the 2400 MHz band the best option. In any other case, SG-oriented IoT quantitatively perform better if operating in the 915 MHz band. PMID:28042863

  2. Comprehensive evaluation of an image segmentation technique for measuring tumor volume from CT images

    NASA Astrophysics Data System (ADS)

    Deng, Xiang; Huang, Haibin; Zhu, Lei; Du, Guangwei; Xu, Xiaodong; Sun, Yiyong; Xu, Chenyang; Jolly, Marie-Pierre; Chen, Jiuhong; Xiao, Jie; Merges, Reto; Suehling, Michael; Rinck, Daniel; Song, Lan; Jin, Zhengyu; Jiang, Zhaoxia; Wu, Bin; Wang, Xiaohong; Zhang, Shuai; Peng, Weijun

    2008-03-01

    Comprehensive quantitative evaluation of a tumor segmentation technique on large-scale clinical data sets is crucial for routine clinical use of CT based tumor volumetry for cancer diagnosis and treatment response evaluation. In this paper, we present a systematic validation study of a semi-automatic image segmentation technique for measuring tumor volume from CT images. The segmentation algorithm was tested using clinical data of 200 tumors in 107 patients with liver, lung, lymphoma and other types of cancer. The performance was evaluated using both accuracy and reproducibility. The accuracy was assessed using 7 commonly used metrics that can provide complementary information regarding the quality of the segmentation results. The reproducibility was measured by the variation of the volume measurements from 10 independent segmentations. The effect of disease type, lesion size and slice thickness of image data on the accuracy measures was also analyzed. Our results demonstrate that the tumor segmentation algorithm showed good correlation with ground truth for all four lesion types (r = 0.97, 0.99, 0.97, 0.98, p < 0.0001 for liver, lung, lymphoma and other respectively). The segmentation algorithm can produce relatively reproducible volume measurements on all lesion types (coefficient of variation in the range of 10-20%). Our results show that the algorithm is insensitive to lesion size (coefficient of determination close to 0) and slice thickness of image data (p > 0.90). The validation framework used in this study has the potential to facilitate the development of new tumor segmentation algorithms and assist in the large-scale evaluation of segmentation techniques for other clinical applications.
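
    The seven accuracy metrics are not named in the abstract; the sketch below, using made-up inputs, shows two representative quantities of the kind used in such validations: the Dice overlap between a segmentation mask and ground truth, and the coefficient of variation of repeated volume measurements.

        import numpy as np

        def dice(seg, truth):
            """Dice overlap between two binary masks (1.0 = perfect agreement)."""
            seg, truth = seg.astype(bool), truth.astype(bool)
            return 2.0 * np.logical_and(seg, truth).sum() / (seg.sum() + truth.sum())

        def volume_cv(volumes_ml):
            """Coefficient of variation (%) of repeated volume measurements."""
            v = np.asarray(volumes_ml, dtype=float)
            return 100.0 * v.std(ddof=1) / v.mean()

        # Illustrative data: a toy 3D mask pair and ten repeated volume estimates.
        rng = np.random.default_rng(1)
        truth = rng.random((32, 32, 32)) > 0.7
        seg = truth.copy()
        seg[0:2] = ~seg[0:2]                       # perturb a few slices
        print("Dice:", round(dice(seg, truth), 3))
        print("CV (%):", round(volume_cv(rng.normal(12.0, 1.5, size=10)), 1))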

  3. Effect of Strategy Instruction on Fourth-Grade Dual Language Learners' Ability to Monitor Their Comprehension of Scientific Texts

    ERIC Educational Resources Information Center

    Martinez-Alvarez, Patricia; Bannan, Brenda; Peters-Burton, Erin E.

    2012-01-01

    This research work examined the impact of a science-reading instructional sequence ("InSciRead") on fourth-grade Spanish-English dual language learners' ability to monitor their comprehension of content-related (erosion, deposition, and transportation) texts. We used a quantitative measure of students' ability to detect incongruities in a related…

  4. The Impact of Authentic Material Use on Development of the Reading Comprehension, Writing Skills and Motivation in Language Course

    ERIC Educational Resources Information Center

    Belet Boyaci, S. Dilek; Güner, Mediha

    2018-01-01

    The objective of the present study was to determine the impact of authentic, task-based material use on reading comprehension, writing skills and writing motivation in the Turkish language course. The study was conducted with a mixed-design methodology. Quantitative data were collected with a quasi-experimental pre-test post-test design with…

  5. The Effect of Instructing Critical Thinking through Debate on Male and Female EFL Learners' Reading Comprehension

    ERIC Educational Resources Information Center

    Tous, Maryam Danaye; Tahriri, Abdorreza; Haghighi, Sara

    2015-01-01

    The purpose of the present study was to examine the effect of instruction through debate on male and female EFL learners' reading comprehension. Also, their perception of critical thinking (CT) instruction was investigated. A quantitative research method with experimental pre-and post-tests design was conducted to collect the data. Eighty-eight…

  6. The Effects of Explicit Reading Strategy Instruction and Cooperative Learning on Reading Comprehension in Fourth Grade Students

    ERIC Educational Resources Information Center

    Lencioni, Gina M.

    2013-01-01

    The purpose of this study was to investigate the effects of explicit direct instruction and cooperative learning on reading comprehension in fourth grade students. A quasi-experimental design was used. There were six cognitive and three affective measures used to collect quantitative data. Cognitive measures included California State Test scores,…

  7. Effect of an Interactive Literacy Website on Struggling Readers' Comprehension

    ERIC Educational Resources Information Center

    Caldwell, Bettie Johnson

    2013-01-01

    The issue of struggling readers is a major concern in the field of education. An overwhelming number of students are beginning school ill-equipped to read. The purpose of this quantitative study was to explore the effect of an interactive literacy website on the reading comprehension scores of 2nd grade students identified as struggling readers.…

  8. A Quantitative, Correlational Study of the Relationship between Comprehensive School Counseling Program Implementation and School Counselors' Perceived Leadership Styles in Title I Schools

    ERIC Educational Resources Information Center

    Harris-Bingham, Tranae

    2017-01-01

    The purpose of this study is to examine the relationships among the level of comprehensive school counseling program implementation, school counselors' leadership practices, and their schools' Title I status. Existing research examining the level of program implementation and school counselor leadership practices is limited and the representation…

  9. The application of quantitative methods for identifying and exploring the presence of bias in systematic reviews: PDE-5 inhibitors for erectile dysfunction.

    PubMed

    Bekkering, G E; Abou-Setta, A M; Kleijnen, J

    2008-01-01

    A systematic review of PDE-5 inhibitors for erectile dysfunction was performed to evaluate the utility of quantitative methods for identifying and exploring the influence of bias and study quality on pooled outcomes from meta-analyses. We included 123 randomized controlled trials (RCTs). Methodological quality was poorly reported. All three drugs appeared highly effective. Indirect adjusted analyses showed no differences between the three drugs. Funnel plots and statistical tests showed no evidence of small-study effects for sildenafil whereas there was evidence of such bias for tadalafil and vardenafil. Adjustment for missing studies using trim and fill techniques did not alter the pooled estimates substantially. The exclusion of previous sildenafil nonresponders was associated with larger treatment effects for tadalafil. This investigation was hampered by poor reporting of methodological quality, a low number of studies, heterogeneity and large effect sizes. Despite such limitations, a comprehensive assessment of biases should be a routine in systematic reviews.

  10. Study on Quality Standard of Processed Curcuma Longa Radix

    PubMed Central

    Zhao, Yongfeng; Quan, Liang; Zhou, Haiting; Cao, Dong; Li, Wenbing; Yang, Zhuo

    2017-01-01

    To control the quality of Curcuma Longa Radix by establishing quality standards, this paper added determinations of extract content and volatile oil content. Meanwhile, curcumin was selected as the internal marker, and the relative correction factors (RCFs) of demethoxycurcumin and bisdemethoxycurcumin were established by high performance liquid chromatography (HPLC). The contents of the multiple components were calculated based on their RCFs. The rationality and feasibility of the methods were evaluated by comparing the quantitative results of the external standard method (ESM) and quantitative analysis of multi-components by a single marker (QAMS). Ethanol extract content ranged from 9.749 to 15.644%, with a mean value of 13.473%. The volatile oil ranged from 0.45 to 0.90 mL/100 g, with a mean value of 0.66 mL/100 g. This method was accurate and feasible and could provide a reference for further comprehensive and effective control of the quality standard of Curcuma Longa Radix and its processed products. PMID:29375640
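
    A small numerical sketch of the single-marker (QAMS) calculation outlined above; the relative correction factor definition shown and all numbers are illustrative assumptions, not values from the study.

        # QAMS sketch: establish relative correction factors (RCFs) once from
        # standards, then quantify the other curcuminoids in a sample using only
        # the curcumin (internal marker) standard.

        def rcf(marker_area, marker_conc, analyte_area, analyte_conc):
            """RCF = (A_marker / C_marker) / (A_analyte / C_analyte)."""
            return (marker_area / marker_conc) / (analyte_area / analyte_conc)

        def qams_content(analyte_area, rcf_value, marker_area, marker_conc):
            """Analyte content from its own peak area, its RCF and the marker run."""
            marker_response = marker_area / marker_conc      # A_s / C_s
            return analyte_area * rcf_value / marker_response

        # Hypothetical calibration data (peak areas at known concentrations).
        f_dmc = rcf(marker_area=1500, marker_conc=10.0, analyte_area=1200, analyte_conc=10.0)

        # Sample analysis: only the curcumin standard is injected alongside the sample.
        c_dmc = qams_content(analyte_area=480, rcf_value=f_dmc,
                             marker_area=1480, marker_conc=10.0)
        print(f"demethoxycurcumin content ≈ {c_dmc:.2f} (same units as marker conc.)")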

  11. Four-hour quantitative real-time polymerase chain reaction-based comprehensive chromosome screening and accumulating evidence of accuracy, safety, predictive value, and clinical efficacy.

    PubMed

    Treff, Nathan R; Scott, Richard T

    2013-03-15

    Embryonic comprehensive chromosomal euploidy may represent a powerful biomarker to improve the success of IVF. However, there are a number of aneuploidy screening strategies to consider, including different technologic platforms with which to interrogate the embryonic DNA, and different embryonic developmental stages from which DNA can be analyzed. Although there are advantages and disadvantages associated with each strategy, a series of experiments producing evidence of accuracy, safety, clinical predictive value, and clinical efficacy indicate that trophectoderm biopsy and quantitative real-time polymerase chain reaction (qPCR)-based comprehensive chromosome screening (CCS) may represent a useful strategy to improve the success of IVF. This Biomarkers in Reproductive Medicine special issue review summarizes the accumulated experience with the development and clinical application of a 4-hour blastocyst qPCR-based CCS technology. Copyright © 2013 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  12. Generation of High-Quality SWATH® Acquisition Data for Label-free Quantitative Proteomics Studies Using TripleTOF® Mass Spectrometers

    PubMed Central

    Schilling, Birgit; Gibson, Bradford W.; Hunter, Christie L.

    2017-01-01

    Data-independent acquisition is a powerful mass spectrometry technique that enables comprehensive MS and MS/MS analysis of all detectable species, providing an information rich data file that can be mined deeply. Here, we describe how to acquire high-quality SWATH® Acquisition data to be used for large quantitative proteomic studies. We specifically focus on using variable sized Q1 windows for acquisition of MS/MS data for generating higher specificity quantitative data. PMID:28188533

  13. Improving pilot mental workload evaluation with combined measures.

    PubMed

    Wanyan, Xiaoru; Zhuang, Damin; Zhang, Huan

    2014-01-01

    Behavioral performance, subjective assessment based on NASA Task Load Index (NASA-TLX), as well as physiological measures indexed by electrocardiograph (ECG), event-related potential (ERP), and eye tracking data were used to assess the mental workload (MW) related to flight tasks. Flight simulation tasks were carried out by 12 healthy participants under different MW conditions. The MW conditions were manipulated by setting the quantity of flight indicators presented on the head-up display (HUD) in the cruise phase. In this experiment, the behavioral performance and NASA-TLX could reflect the changes of MW ideally. For physiological measures, the indices of heart rate variability (HRV), P3a, pupil diameter and eyelid opening were verified to be sensitive to MW changes. Our findings can be applied to the comprehensive evaluation of MW during flight tasks and the further quantitative classification.

  14. Enriching semantic knowledge bases for opinion mining in big data applications.

    PubMed

    Weichselbraun, A; Gindl, S; Scharl, A

    2014-10-01

    This paper presents a novel method for contextualizing and enriching large semantic knowledge bases for opinion mining with a focus on Web intelligence platforms and other high-throughput big data applications. The method is not only applicable to traditional sentiment lexicons, but also to more comprehensive, multi-dimensional affective resources such as SenticNet. It comprises the following steps: (i) identify ambiguous sentiment terms, (ii) provide context information extracted from a domain-specific training corpus, and (iii) ground this contextual information to structured background knowledge sources such as ConceptNet and WordNet. A quantitative evaluation shows a significant improvement when using an enriched version of SenticNet for polarity classification. Crowdsourced gold standard data in conjunction with a qualitative evaluation sheds light on the strengths and weaknesses of the concept grounding, and on the quality of the enrichment process.

  15. Evaluation of electron mobility in InSb quantum wells by means of percentage-impact

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mishima, T. D.; Edirisooriya, M.; Santos, M. B.

    2014-05-15

    In order to quantitatively analyze the contribution of each scattering factor toward the total carrier mobility, we use a new, convenient figure-of-merit named the percentage impact. The mobility limit due to a scattering factor, which is widely used to summarize a scattering analysis, has its own advantages; however, a mobility limit is not well suited to the above purpose. A comprehensive understanding of how much individual scattering factors contribute to the total carrier mobility can be obtained by evaluating the percentage impacts of the scattering factors, which can be calculated straightforwardly from their mobility limits and the total mobility. Our percentage impact analysis shows that threading dislocation is one of the dominant scattering factors for electron transport in InSb quantum wells at room temperature.
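
    The abstract defines the percentage impact only verbally, so the sketch below assumes the common Matthiessen-rule decomposition: each scattering factor's percentage impact is taken as its scattering rate (the inverse of its mobility limit) expressed as a share of the total scattering rate. The mobility limits used are hypothetical.

        def percentage_impacts(mobility_limits_cm2_Vs):
            """Percentage impact of each scattering factor, assuming Matthiessen's rule:
            1/mu_total = sum_i 1/mu_i, and impact_i = (1/mu_i) / (1/mu_total) * 100."""
            inverse_rates = {name: 1.0 / mu for name, mu in mobility_limits_cm2_Vs.items()}
            total_inverse = sum(inverse_rates.values())
            return {name: 100.0 * r / total_inverse for name, r in inverse_rates.items()}

        # Hypothetical mobility limits (cm^2/Vs) for an InSb quantum well at 300 K.
        limits = {
            "threading dislocations": 8.0e4,
            "polar optical phonons": 6.0e4,
            "interface roughness": 3.0e5,
            "ionized impurities": 5.0e5,
        }
        for factor, impact in percentage_impacts(limits).items():
            print(f"{factor}: {impact:.1f} %")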

  16. Assessment of primary health care: health professionals' perspective.

    PubMed

    Silva, Simone Albino da; Nogueira, Denismar Alves; Paraizo, Camila Maria da Silva; Fracolli, Lislaine Aparecida

    2014-08-01

    Objective To assess primary health care attributes of access to a first contact, comprehensiveness, coordination, continuity, family guidance and community orientation. Method An evaluative, quantitative and cross-sectional study with 35 professional teams in the Family Health Program of the Alfenas region, Minas Gerais, Brazil. Data collection was done with the Primary Care Assessment Tool - Brazil, professional version. Results The results revealed a low percentage of medical experts among the participants, who evaluated the attributes with high scores, with the exception of access to a first contact. Data analysis revealed needs for improvement: hours of service; forms of communication between clients and healthcare services and between clients and professionals; the mechanism of counter-referral. Conclusion It was concluded that there is a mismatch between the provision of services and the needs of the population, which compromises the quality of primary health care.

  17. Promote quantitative ischemia imaging via myocardial perfusion CT iterative reconstruction with tensor total generalized variation regularization

    NASA Astrophysics Data System (ADS)

    Gu, Chengwei; Zeng, Dong; Lin, Jiahui; Li, Sui; He, Ji; Zhang, Hao; Bian, Zhaoying; Niu, Shanzhou; Zhang, Zhang; Huang, Jing; Chen, Bo; Zhao, Dazhe; Chen, Wufan; Ma, Jianhua

    2018-06-01

    Myocardial perfusion computed tomography (MPCT) imaging is commonly used to detect myocardial ischemia quantitatively. A limitation in MPCT is that an additional radiation dose is required compared to unenhanced CT due to its repeated dynamic data acquisition. Meanwhile, noise and streak artifacts in low-dose cases are the main factors that degrade the accuracy of quantifying myocardial ischemia and hamper the diagnostic utility of the filtered backprojection reconstructed MPCT images. Moreover, it is noted that the MPCT images are composed of a series of 2/3D images, which can be naturally regarded as a 3/4-order tensor, and the MPCT images are globally correlated along time and are sparse across space. To obtain higher fidelity ischemia from low-dose MPCT acquisitions quantitatively, we propose a robust statistical iterative MPCT image reconstruction algorithm by incorporating tensor total generalized variation (TTGV) regularization into a penalized weighted least-squares framework. Specifically, the TTGV regularization fuses the spatial correlation of the myocardial structure and the temporal continuation of the contrast agent intake during the perfusion. Then, an efficient iterative strategy is developed for the objective function optimization. Comprehensive evaluations have been conducted on a digital XCAT phantom and a preclinical porcine dataset regarding the accuracy of the reconstructed MPCT images, the quantitative differentiation of ischemia and the algorithm’s robustness and efficiency.

  18. Quantitative determination and toxicity evaluation of 2,4-dichlorophenol using poly(eosin Y)/hydroxylated multi-walled carbon nanotubes modified electrode

    PubMed Central

    Zhu, Xiaolin; Zhang, Kexin; Wang, Chengzhi; Guan, Jiunian; Yuan, Xing; Li, Baikun

    2016-01-01

    This study aimed at developing a simple, sensitive and rapid electrochemical approach to quantitatively determine and assess the toxicity of 2,4-dichlorophenol (2,4-DCP), a priority pollutant with potential risk to public health, using a novel poly(eosin Y, EY)/hydroxylated multi-walled carbon nanotube composite modified electrode (PEY/MWNTs-OH/GCE). The distinct feature of this easily fabricated electrode was the synergistic coupling effect between EY and MWNTs-OH, which enabled high electrocatalytic activity toward 2,4-DCP. Under optimum conditions, the oxidation peak current increased linearly with concentration from 0.005 to 0.1 μM and from 0.2 to 40.0 μM, with a detection limit of 1.5 nM. Moreover, the PEY/MWNTs-OH/GCE exhibited excellent electrocatalytic activity toward intracellular electroactive species. Two sensitive electrochemical signals ascribed to guanine/xanthine and adenine/hypoxanthine in human hepatoma (HepG2) cells were detected simultaneously. The sensor was successfully applied to evaluate the toxicity of 2,4-DCP to HepG2 cells. The IC50 values based on the two electrochemical signals were 201.07 and 252.83 μM, respectively. This study established a sensitive platform for the comprehensive evaluation of 2,4-DCP and showed great potential to simplify environmental toxicity monitoring. PMID:27941912

  19. A quantitative risk assessment model to evaluate effective border control measures for rabies prevention

    PubMed Central

    Weng, Hsin-Yi; Wu, Pei-I; Yang, Ping-Cheng; Tsai, Yi-Lun; Chang, Chao-Chin

    2009-01-01

    Border control is the primary method to prevent rabies emergence. This study developed a quantitative risk model incorporating stochastic processes to evaluate whether border control measures could efficiently prevent rabies introduction through importation of cats and dogs using Taiwan as an example. Both legal importation and illegal smuggling were investigated. The impacts of reduced quarantine and/or waiting period on the risk of rabies introduction were also evaluated. The results showed that Taiwan’s current animal importation policy could effectively prevent rabies introduction through legal importation of cats and dogs. The median risk of a rabid animal to penetrate current border control measures and enter Taiwan was 5.33 × 10−8 (95th percentile: 3.20 × 10−7). However, illegal smuggling may pose Taiwan to the great risk of rabies emergence. Reduction of quarantine and/or waiting period would affect the risk differently, depending on the applied assumptions, such as increased vaccination coverage, enforced custom checking, and/or change in number of legal importations. Although the changes in the estimated risk under the assumed alternatives were not substantial except for completely abolishing quarantine, the consequences of rabies introduction may yet be considered to be significant in a rabies-free area. Therefore, a comprehensive benefit-cost analysis needs to be conducted before recommending these alternative measures. PMID:19822125
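
    A highly simplified Monte Carlo sketch of this type of stochastic import-risk model; the event structure, distributions and parameter values are illustrative assumptions only and are not the model or data from the study.

        import numpy as np

        rng = np.random.default_rng(2)

        def annual_introduction_risk(n_iter=100_000):
            """Monte Carlo over uncertain inputs; returns per-iteration probabilities
            that at least one undetected rabid animal enters in a year."""
            # Uncertain inputs (all distributions and values are illustrative):
            imports = rng.poisson(10_000, n_iter)                 # legal imports per year
            prevalence = rng.beta(1, 1_000_000, n_iter)           # rabies prevalence at origin
            p_protected = rng.beta(90, 10, n_iter)                # vaccine protection
            p_detected = rng.beta(95, 5, n_iter)                  # detection in quarantine
            # Per-animal probability of being rabid, unprotected and undetected:
            p_slip = prevalence * (1 - p_protected) * (1 - p_detected)
            # Probability that at least one of the imported animals slips through:
            return 1.0 - (1.0 - p_slip) ** imports

        risk = annual_introduction_risk()
        print(f"median risk: {np.median(risk):.2e}")
        print(f"95th percentile: {np.quantile(risk, 0.95):.2e}")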

  20. Reliability of intra-oral quantitative sensory testing (QST) in patients with atypical odontalgia and healthy controls - a multicentre study.

    PubMed

    Baad-Hansen, L; Pigg, M; Yang, G; List, T; Svensson, P; Drangsholt, M

    2015-02-01

    The reliability of comprehensive intra-oral quantitative sensory testing (QST) protocol has not been examined systematically in patients with chronic oro-facial pain. The aim of the present multicentre study was to examine test-retest and interexaminer reliability of intra-oral QST measures in terms of absolute values and z-scores as well as within-session coefficients of variation (CV) values in patients with atypical odontalgia (AO) and healthy pain-free controls. Forty-five patients with AO and 68 healthy controls were subjected to bilateral intra-oral gingival QST and unilateral extratrigeminal QST (thenar) on three occasions (twice on 1 day by two different examiners and once approximately 1 week later by one of the examiners). Intra-class correlation coefficients and kappa values for interexaminer and test-retest reliability were computed. Most of the standardised intra-oral QST measures showed fair to excellent interexaminer (9-12 of 13 measures) and test-retest (7-11 of 13 measures) reliability. Furthermore, no robust differences in reliability measures or within-session variability (CV) were detected between patients with AO and the healthy reference group. These reliability results in chronic orofacial pain patients support earlier suggestions based on data from healthy subjects that intra-oral QST is sufficiently reliable for use as a part of a comprehensive evaluation of patients with somatosensory disturbances or neuropathic pain in the trigeminal region. © 2014 John Wiley & Sons Ltd.
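
    A short sketch, under an assumed data layout, of two of the reliability statistics named above: a one-way random-effects intraclass correlation, ICC(1,1), computed from a subjects-by-sessions matrix, and the within-session coefficient of variation. All values are invented.

        import numpy as np

        def icc_1_1(scores):
            """One-way random-effects ICC(1,1) for an (n_subjects, k_sessions) array."""
            scores = np.asarray(scores, dtype=float)
            n, k = scores.shape
            grand_mean = scores.mean()
            subject_means = scores.mean(axis=1)
            ms_between = k * np.sum((subject_means - grand_mean) ** 2) / (n - 1)
            ms_within = np.sum((scores - subject_means[:, None]) ** 2) / (n * (k - 1))
            return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

        def within_session_cv(repeated_measurements):
            """Coefficient of variation (%) of repeated stimuli within one session."""
            x = np.asarray(repeated_measurements, dtype=float)
            return 100.0 * x.std(ddof=1) / x.mean()

        # Illustrative thresholds for 5 subjects measured in 3 sessions (made-up values).
        data = np.array([[120, 118, 125],
                         [ 90,  95,  92],
                         [150, 140, 148],
                         [ 70,  75,  72],
                         [110, 108, 115]])
        print("ICC(1,1):", round(icc_1_1(data), 2))
        print("CV (%):", round(within_session_cv([120, 118, 125]), 1))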

  1. Comprehensive benefit analysis of regional water resources based on multi-objective evaluation

    NASA Astrophysics Data System (ADS)

    Chi, Yixia; Xue, Lianqing; Zhang, Hui

    2018-01-01

    The purpose of comprehensive benefit analysis of water resources is to maximize the combined social, economic and ecological-environmental benefits. To address the defects of the traditional analytic hierarchy process in water resources evaluation, this study proposed a comprehensive benefit evaluation index covering social, economic and environmental benefits, viewing the comprehensive benefit of water resources from the perspective of the social, economic and environmental systems; determined the index weights by an improved fuzzy analytic hierarchy process (AHP); calculated the relative index of comprehensive water resources benefit; and analyzed the comprehensive benefit of water resources in Xiangshui County with a multi-objective evaluation model. Based on the water resources data of Xiangshui County, 20 main comprehensive benefit assessment factors for 5 districts belonging to Xiangshui County were evaluated. The results showed that the comprehensive benefit of Xiangshui County was 0.7317, and that the social economy still has room for further development under the current water resources situation.
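
    The study uses an improved fuzzy AHP; the sketch below shows only the classic AHP weighting step it builds on (the normalized principal eigenvector of a pairwise-comparison matrix) and a weighted composite benefit index, with purely illustrative numbers.

        import numpy as np

        def ahp_weights(pairwise):
            """Classic AHP weights: the normalized principal eigenvector of a
            reciprocal pairwise-comparison matrix."""
            A = np.asarray(pairwise, dtype=float)
            eigvals, eigvecs = np.linalg.eig(A)
            principal = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
            return principal / principal.sum()

        # Hypothetical 3x3 comparison of the social, economic and environmental
        # benefit sub-systems (values are illustrative only).
        A = [[1,   2,   3],
             [1/2, 1,   2],
             [1/3, 1/2, 1]]
        w = ahp_weights(A)
        # Weighted composite benefit index from normalized sub-system scores:
        scores = np.array([0.70, 0.75, 0.72])
        print("weights:", np.round(w, 3), "composite benefit:", round(float(w @ scores), 4))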

  2. Toward an Evolving Conceptualization of Instructional Leadership as Leadership for Learning: Meta-narrative Review of 109 Quantitative Studies across 25 Years

    ERIC Educational Resources Information Center

    Boyce, Jared; Bowers, Alex J.

    2018-01-01

    Purpose: Instructional leadership has been an active area of educational administration research over the past 30 years. However, there has been significant divergence in how instructional leadership has been conceptualized over time. The purpose of this paper is to present a comprehensive review of 25 years of quantitative instructional…

  3. Multiplex, quantitative cellular analysis in large tissue volumes with clearing-enhanced 3D microscopy (Ce3D)

    PubMed Central

    Li, Weizhe; Germain, Ronald N.

    2017-01-01

    Organ homeostasis, cellular differentiation, signal relay, and in situ function all depend on the spatial organization of cells in complex tissues. For this reason, comprehensive, high-resolution mapping of cell positioning, phenotypic identity, and functional state in the context of macroscale tissue structure is critical to a deeper understanding of diverse biological processes. Here we report an easy to use method, clearing-enhanced 3D (Ce3D), which generates excellent tissue transparency for most organs, preserves cellular morphology and protein fluorescence, and is robustly compatible with antibody-based immunolabeling. This enhanced signal quality and capacity for extensive probe multiplexing permits quantitative analysis of distinct, highly intermixed cell populations in intact Ce3D-treated tissues via 3D histo-cytometry. We use this technology to demonstrate large-volume, high-resolution microscopy of diverse cell types in lymphoid and nonlymphoid organs, as well as to perform quantitative analysis of the composition and tissue distribution of multiple cell populations in lymphoid tissues. Combined with histo-cytometry, Ce3D provides a comprehensive strategy for volumetric quantitative imaging and analysis that bridges the gap between conventional section imaging and disassociation-based techniques. PMID:28808033

  4. Grey Comprehensive Evaluation of Biomass Power Generation Project Based on Group Judgement

    NASA Astrophysics Data System (ADS)

    Xia, Huicong; Niu, Dongxiao

    2017-06-01

    The comprehensive evaluation of benefits is an important task that needs to be carried out at all stages of biomass power generation projects. This paper proposed an improved grey comprehensive evaluation method based on triangular whitenization weight functions. To improve the objectivity of the weights obtained by the only-reference comparison judgment method, group judgment was introduced into the weighting process. In the grey comprehensive evaluation, a number of experts were invited to estimate the benefit level of the projects, and their basic estimations were optimized based on the minimum variance principle to improve the accuracy of the evaluation result. Taking a biomass power generation project as an example, the grey comprehensive evaluation showed that the benefit level of the project was good. This example demonstrates the feasibility of the grey comprehensive evaluation method based on group judgment for benefit evaluation of biomass power generation projects.
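
    A generic sketch of the triangular whitenization weight step on which such grey evaluations rest; the grey classes, scores and indicator weight below are illustrative assumptions, since the paper's actual functions are not given in the abstract.

        import numpy as np

        def triangular_whitenization(x, left, centre, right):
            """Membership of score x in a grey class with a triangular whitenization
            weight function peaking at `centre` and vanishing at `left`/`right`."""
            if x <= left or x >= right:
                return 0.0
            if x <= centre:
                return (x - left) / (centre - left)
            return (right - x) / (right - centre)

        # Illustrative grey classes on a 0-10 scale, expert scores for one indicator,
        # and an indicator weight from the group-judgment weighting step.
        classes = {"poor": (0, 2, 4), "fair": (2, 4, 6), "good": (4, 6, 8), "excellent": (6, 8, 10)}
        expert_scores = [6.5, 7.0, 6.0, 7.5]
        weight = 0.25

        mean_score = float(np.mean(expert_scores))
        memberships = {k: weight * triangular_whitenization(mean_score, *v) for k, v in classes.items()}
        print("weighted grey memberships:", {k: round(v, 3) for k, v in memberships.items()})
        print("benefit level:", max(memberships, key=memberships.get))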

  5. Antipsychotic-induced weight gain: a comprehensive research synthesis.

    PubMed

    Allison, D B; Mentore, J L; Heo, M; Chandler, L P; Cappelleri, J C; Infante, M C; Weiden, P J

    1999-11-01

    The purpose of this study was to estimate and compare the effects of antipsychotics-both the newer ones and the conventional ones-on body weight. A comprehensive literature search identified 81 English- and non-English-language articles that included data on weight change in antipsychotic-treated patients. For each agent, a meta-analysis and random effects metaregression estimated the weight change after 10 weeks of treatment at a standard dose. A comprehensive narrative review was also conducted on all articles that did not yield quantitative information but did yield important qualitative information. Placebo was associated with a mean weight reduction of 0.74 kg. Among conventional agents, mean weight change ranged from a reduction of 0.39 kg with molindone to an increase of 3.19 kg with thioridazine. Among newer antipsychotic agents, mean increases were as follows: clozapine, 4.45 kg; olanzapine, 4.15 kg; sertindole, 2.92 kg; risperidone, 2.10 kg; and ziprasidone, 0.04 kg. Insufficient data were available to evaluate quetiapine at 10 weeks. Both conventional and newer antipsychotics are associated with weight gain. Among the newer agents, clozapine appears to have the greatest potential to induce weight gain, and ziprasidone the least. The differences among newer agents may affect compliance with medication and health risk.
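
    The abstract describes random-effects pooling of per-study weight changes; the sketch below shows a standard DerSimonian-Laird random-effects estimate with invented study data, not the meta-regression actually reported.

        import numpy as np

        def dersimonian_laird(effects, variances):
            """Random-effects pooled estimate (DerSimonian-Laird) from per-study
            mean weight changes and their variances."""
            y = np.asarray(effects, dtype=float)
            v = np.asarray(variances, dtype=float)
            w_fixed = 1.0 / v
            y_fixed = np.sum(w_fixed * y) / np.sum(w_fixed)
            q = np.sum(w_fixed * (y - y_fixed) ** 2)
            df = len(y) - 1
            c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
            tau2 = max(0.0, (q - df) / c)                      # between-study variance
            w_rand = 1.0 / (v + tau2)
            pooled = np.sum(w_rand * y) / np.sum(w_rand)
            se = np.sqrt(1.0 / np.sum(w_rand))
            return pooled, pooled - 1.96 * se, pooled + 1.96 * se

        # Illustrative per-study mean weight changes (kg) and variances for one drug.
        pooled, lo, hi = dersimonian_laird([4.2, 3.8, 5.1, 4.6], [0.25, 0.40, 0.30, 0.20])
        print(f"pooled 10-week weight change: {pooled:.2f} kg (95% CI {lo:.2f} to {hi:.2f})")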

  6. Evaluation of risk communication in a mammography patient decision aid.

    PubMed

    Klein, Krystal A; Watson, Lindsey; Ash, Joan S; Eden, Karen B

    2016-07-01

    We characterized patients' comprehension, memory, and impressions of risk communication messages in a patient decision aid (PtDA), Mammopad, and clarified perceived importance of numeric risk information in medical decision making. Participants were 75 women in their forties with average risk factors for breast cancer. We used mixed methods, comprising a risk estimation problem administered within a pretest-posttest design, and semi-structured qualitative interviews with a subsample of 21 women. Participants' positive predictive value estimates of screening mammography improved after using Mammopad. Although risk information was only briefly memorable, through content analysis, we identified themes describing why participants value quantitative risk information, and obstacles to understanding. We describe ways the most complicated graphic was incompletely comprehended. Comprehension of risk information following Mammopad use could be improved. Patients valued receiving numeric statistical information, particularly in pictograph format. Obstacles to understanding risk information, including potential for confusion between statistics, should be identified and mitigated in PtDA design. Using simple pictographs accompanied by text, PtDAs may enhance a shared decision-making discussion. PtDA designers and providers should be aware of benefits and limitations of graphical risk presentations. Incorporating comprehension checks could help identify and correct misapprehensions of graphically presented statistics. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
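
    The risk-estimation task itself is not reproduced in the abstract; the sketch below shows the standard positive-predictive-value calculation for screening mammography with illustrative figures, not the study's materials.

        def positive_predictive_value(sensitivity, specificity, prevalence):
            """PPV = P(cancer | positive mammogram) via Bayes' rule."""
            true_pos = sensitivity * prevalence
            false_pos = (1.0 - specificity) * (1.0 - prevalence)
            return true_pos / (true_pos + false_pos)

        # Illustrative figures for women in their forties (not from the study):
        # sensitivity 0.87, specificity 0.91, underlying risk 0.4 %.
        ppv = positive_predictive_value(0.87, 0.91, 0.004)
        print(f"PPV ≈ {ppv:.1%}  (most positive screens are false alarms)")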

  7. Evaluation of risk communication in a mammography patient decision aid

    PubMed Central

    Klein, Krystal A.; Watson, Lindsey; Ash, Joan S.; Eden, Karen B.

    2016-01-01

    Objectives We characterized patients’ comprehension, memory, and impressions of risk communication messages in a patient decision aid (PtDA), Mammopad, and clarified perceived importance of numeric risk information in medical decision making. Methods Participants were 75 women in their forties with average risk factors for breast cancer. We used mixed methods, comprising a risk estimation problem administered within a pretest–posttest design, and semi-structured qualitative interviews with a subsample of 21 women. Results Participants’ positive predictive value estimates of screening mammography improved after using Mammopad. Although risk information was only briefly memorable, through content analysis, we identified themes describing why participants value quantitative risk information, and obstacles to understanding. We describe ways the most complicated graphic was incompletely comprehended. Conclusions Comprehension of risk information following Mammopad use could be improved. Patients valued receiving numeric statistical information, particularly in pictograph format. Obstacles to understanding risk information, including potential for confusion between statistics, should be identified and mitigated in PtDA design. Practice implications Using simple pictographs accompanied by text, PtDAs may enhance a shared decision-making discussion. PtDA designers and providers should be aware of benefits and limitations of graphical risk presentations. Incorporating comprehension checks could help identify and correct misapprehensions of graphically presented statistics PMID:26965020

  8. [Comprehensibility of online-based patient education material in ophthalmology].

    PubMed

    Heim, N; Faron, A; Fuchs, J; Martini, M; Reich, R H; Löffler, K

    2017-05-01

    Investigations have shown that the internet is becoming increasingly important as a source of information on medical issues. Most patients consider information delivered or supported by hospitals and universities to be the most reliable; however, the comprehensibility of the available information is often found wanting. Comprehensibility scores are formulae that allow a quantitative value for the readability of a document to be calculated. The purpose of this study was to analyze the comprehensibility of medical information published on the websites of the departments of ophthalmology of German university hospitals. We investigated medical information dealing with three eye diseases that can cause severe irreversible damage. The websites of 32 departments of ophthalmology of German university hospitals were examined. Information regarding cataracts, glaucoma and retinal detachment (amotio retinae) was identified and analyzed. All information was systematically analyzed for comprehensibility using the analysis program Text-Lab ( http://www.text-lab.de ) to calculate five readability scores: the Hohenheim comprehensibility index (HVI), the Amstad index, the simple measure of gobbledygook (G-SMOG) index, the Vienna non-fictional text formula (W-STX) and the readability index (LIX). In 59 cases (61.46%) useful text information could be identified on the institutions' homepages and analyzed. On average, the comprehensibility of the information was poor (HVI 7.91 ± 3.94, Amstad index 35.45 ± 11.85, Vienna formula 11.19 ± 1.93, G-SMOG 9.77 ± 1.42 and LIX 54.53 ± 6.67). In most cases, the patient information material was written far above the literacy level of the average population, so it must be assumed that the presented information is difficult to read for the majority of patients. A critical evaluation of the accessible information material seems desirable, and the available texts should be amended.
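
    TextLab's exact implementations are not given in the abstract; the sketch below shows two of the named indices (LIX and the Amstad index) with a crude tokenizer and a vowel-group approximation for syllables, all of which are simplifying assumptions.

        import re

        def _words(text):
            return re.findall(r"[A-Za-zÄÖÜäöüß]+", text)

        def _sentences(text):
            return [s for s in re.split(r"[.!?]+", text) if s.strip()]

        def lix(text):
            """LIX = words/sentences + 100 * share of words longer than 6 letters."""
            words, sents = _words(text), _sentences(text)
            long_words = sum(1 for w in words if len(w) > 6)
            return len(words) / len(sents) + 100.0 * long_words / len(words)

        def amstad(text):
            """Amstad index (German Flesch): 180 - ASL - 58.5 * ASW, where syllables
            are approximated by counting vowel groups."""
            words, sents = _words(text), _sentences(text)
            syllables = sum(max(1, len(re.findall(r"[aeiouyäöü]+", w.lower()))) for w in words)
            return 180.0 - len(words) / len(sents) - 58.5 * syllables / len(words)

        sample = ("Die Netzhautablösung ist ein augenärztlicher Notfall. "
                  "Ohne rechtzeitige Operation droht eine dauerhafte Erblindung.")
        print("LIX:", round(lix(sample), 1), "| Amstad:", round(amstad(sample), 1))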

  9. Development and evaluation of a model-based downscatter compensation method for quantitative I-131 SPECT

    PubMed Central

    Song, Na; Du, Yong; He, Bin; Frey, Eric C.

    2011-01-01

    Purpose: The radionuclide 131I has found widespread use in targeted radionuclide therapy (TRT), partly due to the fact that it emits photons that can be imaged to perform treatment planning or posttherapy dose verification as well as beta rays that are suitable for therapy. In both the treatment planning and dose verification applications, it is necessary to estimate the activity distribution in organs or tumors at several time points. In vivo estimates of the 131I activity distribution at each time point can be obtained from quantitative single-photon emission computed tomography (QSPECT) images and organ activity estimates can be obtained either from QSPECT images or quantification of planar projection data. However, in addition to the photon used for imaging, 131I decay results in emission of a number of other higher-energy photons with significant abundances. These higher-energy photons can scatter in the body, collimator, or detector and be counted in the 364 keV photopeak energy window, resulting in reduced image contrast and degraded quantitative accuracy; these photons are referred to as downscatter. The goal of this study was to develop and evaluate a model-based downscatter compensation method specifically designed for the compensation of high-energy photons emitted by 131I and detected in the imaging energy window. Methods: In the evaluation study, we used a Monte Carlo simulation (MCS) code that had previously been validated for other radionuclides. Thus, in preparation for the evaluation study, we first validated the code for 131I imaging simulation by comparison with experimental data. Next, we assessed the accuracy of the downscatter model by comparing downscatter estimates with MCS results. Finally, we combined the downscatter model with iterative reconstruction-based compensation for attenuation (A) and scatter (S) and the full (D) collimator-detector response of the 364 keV photons to form a comprehensive compensation method. We evaluated this combined method in terms of quantitative accuracy using the realistic 3D NCAT phantom and an activity distribution obtained from patient studies. We compared the accuracy of organ activity estimates in images reconstructed with and without addition of downscatter compensation from projections with and without downscatter contamination. Results: We observed that the proposed method provided substantial improvements in accuracy compared to no downscatter compensation and had accuracies comparable to reconstructions from projections without downscatter contamination. Conclusions: The results demonstrate that the proposed model-based downscatter compensation method is effective and may have a role in quantitative 131I imaging. PMID:21815394

  10. Landscapes: A State-of-the-Art Assessment of Reading Comprehension Research 1974-1984. Final Report, Volume 1.

    ERIC Educational Resources Information Center

    Crismore, Avon, Ed.

    The 594 studies referenced in this volume, the first of a two-volume set, synthesize reading comprehension research conducted from 1974 to 1984. The 12 chapters cover the following topics: (1) the necessity for both quantitative and qualitative meta-analysis for a complete state-of-the-art understanding of any domain, (2) the background of the…

  11. A Quantitative Approach to Determine Analogous Areas Using Environmental Parameters

    DTIC Science & Technology

    2008-03-01

    degrees Celsius; COADS: Comprehensive Ocean-Atmosphere Data Set; CONUS: Continental United States; CTD: Conductivity/Temperature/Depth probe ... consolidation of a marine database. Out of this effort came the Comprehensive Ocean-Atmosphere Data Set (COADS). The original 17 data sets were ... The National Oceanic and Atmospheric Administration (NOAA) has compiled a database of total sediment thickness of the global oceans and seas. These data are ...

  12. Properties of young massive clusters obtained with different massive-star evolutionary models

    NASA Astrophysics Data System (ADS)

    Wofford, Aida; Charlot, Stéphane

    We undertake a comprehensive comparative test of seven widely used spectral synthesis models using multi-band HST photometry of a sample of eight YMCs in two galaxies. We provide a first quantitative estimate of the accuracies and uncertainties of the new models, show the good progress models have made in fitting high-quality observations, and highlight the need for further comprehensive comparative tests.

  13. ["So I have no way of measuring": a portrait of health education groups in the city of São Paulo, Brazil].

    PubMed

    Vincha, Kellem Regina Rosendo; Vieira, Viviane Laudelino; Guerra, Lúcia Dias da Silva; Botelho, Fernanda Cangussu; Pava-Cárdenas, Alexandra; Cervato-Mancuso, Ana Maria

    2017-09-28

    The study analyzed the social representations of primary health care professionals regarding the evaluative processes of groups working with food and nutrition, and described the educational strategies used in this care. This was a qualitative study conducted from 2012 to 2014 in the city of São Paulo, Brazil, in which 48 interviews were analyzed. In the analysis of the interviews, the educational strategies were classified into learning categories and contents following Bogdan & Biklen and Zabala, respectively. The evaluative processes were analyzed with the collective subject discourse technique, based on Jodelet's social representations. Three learning contents were found in the educational strategies, together with four social representations of the evaluative processes, which revealed a conflict between a practice directed by the work process toward quantitative and individual evaluation criteria and a health-promoting practice that used inclusive approaches and participatory evaluation. Within this practice, the study implicitly identified the presence of autonomy in health. The study revealed the need to acknowledge and systematize group planning as an educational tool that qualifies and strengthens comprehensive care.

  14. Undergraduate medical education programme renewal: a longitudinal context, input, process and product evaluation study.

    PubMed

    Mirzazadeh, Azim; Gandomkar, Roghayeh; Hejri, Sara Mortaz; Hassanzadeh, Gholamreza; Koochak, Hamid Emadi; Golestani, Abolfazl; Jafarian, Ali; Jalili, Mohammad; Nayeri, Fatemeh; Saleh, Narges; Shahi, Farhad; Razavi, Seyed Hasan Emami

    2016-02-01

    The purpose of this study was to utilize the Context, Input, Process and Product (CIPP) evaluation model as a comprehensive framework to guide initiating, planning, implementing and evaluating a revised undergraduate medical education programme. The eight-year longitudinal evaluation study consisted of four phases compatible with the four components of the CIPP model. In the first phase, we explored the strengths and weaknesses of the traditional programme as well as contextual needs, assets, and resources. For the second phase, we proposed a model for the programme considering contextual features. During the process phase, we provided formative information for revisions and adjustments. Finally, in the fourth phase, we evaluated the outcomes of the new undergraduate medical education programme in the basic sciences phase. Information was collected from different sources such as medical students, faculty members, administrators, and graduates, using various qualitative and quantitative methods including focus groups, questionnaires, and performance measures. The CIPP model has the potential to guide policy makers to systematically collect evaluation data and to manage stakeholders' reactions at each stage of the reform in order to make informed decisions. However, the model may result in evaluation burden and fail to address some unplanned evaluation questions.

  15. Attitude towards Continuous and Comprehensive Evaluation of High School Students

    ERIC Educational Resources Information Center

    Cyril, A. Vences; Jeyasekaran, D.

    2016-01-01

    Continuous and Comprehensive Evaluation (CCE) refers to a system of school-based evaluation introduced by CBSE in all CBSE affiliated schools across the country to evaluate both scholastic and non-scholastic aspects of students' growth and development. Continuous and comprehensive evaluation is to evaluate every aspect of the child during their…

  16. Creating the Web-based Intensive Care Unit Safety Reporting System

    PubMed Central

    Holzmueller, Christine G.; Pronovost, Peter J.; Dickman, Fern; Thompson, David A.; Wu, Albert W.; Lubomski, Lisa H.; Fahey, Maureen; Steinwachs, Donald M.; Engineer, Lilly; Jaffrey, Ali; Morlock, Laura L.; Dorman, Todd

    2005-01-01

    In an effort to improve patient safety, researchers at the Johns Hopkins University designed and implemented a comprehensive Web-based Intensive Care Unit Safety Reporting System (ICUSRS). The ICUSRS collects data about adverse events and near misses from all staff in the ICU. This report reflects data on 854 reports from 18 diverse ICUs across the United States. Reporting is voluntary, and data collected is confidential, with patient, provider, and reporter information deidentified. Preliminary data include system factors reported, degree of patient harm, reporting times, and evaluations of the system. Qualitative and quantitative data are reported back to the ICU site study teams and frontline staff through monthly reports, case discussions, and a quarterly newsletter. PMID:15561794

  17. Impact of implementing an Internal Bed Regulation Committee on administrative and care indicators at a teaching hospital

    PubMed Central

    Rodrigues, Luciane Cristine Ribeiro; Juliani, Carmen Maria Casquel Monti

    2015-01-01

    Objective To compare hospital indicators before and after implementing an Internal Bed Regulation Committee at a reference hospital. Methods It is a quantitative, evaluative, exploratory, descriptive and cross-sectional study. The data were gathered from the hospital administrative reports for the period 2008-2013, provided by the Information Technology Center of the Complexo FAMEMA. Results The indicators improved after implementation of the Internal Bed Regulation Committee. Conclusion The individuals involved in the process acknowledged the improvement. It is necessary to carry on the regulatory actions, especially in a comprehensive and complex healthcare system such as the Brazilian Sistema Único de Saúde. PMID:25993075

  18. Advanced technology development for image gathering, coding, and processing

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O.

    1990-01-01

    Three overlapping areas of research activities are presented: (1) Information theory and optimal filtering are extended to visual information acquisition and processing. The goal is to provide a comprehensive methodology for quantitatively assessing the end-to-end performance of image gathering, coding, and processing. (2) Focal-plane processing techniques and technology are developed to combine effectively image gathering with coding. The emphasis is on low-level vision processing akin to the retinal processing in human vision. (3) A breadboard adaptive image-coding system is being assembled. This system will be used to develop and evaluate a number of advanced image-coding technologies and techniques as well as research the concept of adaptive image coding.

  19. Quantitative probe of the transition metal redox in battery electrodes through soft x-ray absorption spectroscopy

    NASA Astrophysics Data System (ADS)

    Li, Qinghao; Qiao, Ruimin; Wray, L. Andrew; Chen, Jun; Zhuo, Zengqing; Chen, Yanxue; Yan, Shishen; Pan, Feng; Hussain, Zahid; Yang, Wanli

    2016-10-01

    Most battery positive electrodes operate with a 3d transition-metal (TM) reaction centre. A direct and quantitative probe of the TM states upon electrochemical cycling is valuable for understanding the detailed cycling mechanism and charge diffusion in the electrodes, which is related to many practical parameters of a battery. This review includes a comprehensive summary of our recent demonstrations of five different types of quantitative analysis of the TM states in battery electrodes based on soft x-ray absorption spectroscopy and multiplet calculations. In LiFePO4, a system of a well-known two-phase transformation type, the TM redox could be strictly determined through a simple linear combination of the two end-members. In Mn-based compounds, the Mn states could also be quantitatively evaluated, but a set of reference spectra with all three possible Mn valences needs to be deliberately selected and considered in the fitting. Although the fluorescence signals suffer from self-absorption distortion, the multiplet calculations could consider the distortion effect, which allows a quantitative determination of the overall Ni oxidation state in the bulk. With the aid of multiplet calculations, one could also achieve a quasi-quantitative analysis of the Co redox evolution in LiCoO2 based on the energy position of the spectroscopic peak. The benefit of multiplet calculations is more important for studying electrode materials with TMs of mixed spin states, as exemplified by the quantitative analysis of the mixed spin Na2-x Fe2(CN)6 system. Finally, we showcase that such quantitative analysis could provide valuable information for optimizing the electrochemical performance of Na0.44MnO2 electrodes for Na-ion batteries. The methodology summarized in this review could be extended to other energy application systems with a TM redox centre for detailed analysis, for example fuel cells and catalytic materials.
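
    A minimal sketch of the two end-member linear-combination analysis mentioned for LiFePO4, using synthetic Gaussian "spectra"; the least-squares expression and all values are illustrative, not the authors' fitting code.

        import numpy as np

        def end_member_fraction(spectrum, end_a, end_b):
            """Least-squares fraction x such that spectrum ≈ x*end_a + (1-x)*end_b,
            i.e. the share of end-member A (e.g. Fe2+ in LiFePO4 vs Fe3+ in FePO4)."""
            diff = end_a - end_b
            x = np.dot(spectrum - end_b, diff) / np.dot(diff, diff)
            return float(np.clip(x, 0.0, 1.0))

        # Synthetic end-member spectra (Gaussian peaks on a common energy grid).
        energy = np.linspace(705, 715, 500)
        fe2 = np.exp(-((energy - 708.0) ** 2) / 0.5)
        fe3 = np.exp(-((energy - 710.0) ** 2) / 0.5)

        # A half-charged electrode spectrum plus a little noise:
        measured = 0.5 * fe2 + 0.5 * fe3 + np.random.default_rng(3).normal(0, 0.01, energy.size)
        print("Fe2+ fraction ≈", round(end_member_fraction(measured, fe2, fe3), 2))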

  20. Quantitative image processing in fluid mechanics

    NASA Technical Reports Server (NTRS)

    Hesselink, Lambertus; Helman, James; Ning, Paul

    1992-01-01

    The current status of digital image processing in fluid flow research is reviewed. In particular, attention is given to a comprehensive approach to the extraction of quantitative data from multivariate databases and examples of recent developments. The discussion covers numerical simulations and experiments, data processing, generation and dissemination of knowledge, traditional image processing, hybrid processing, fluid flow vector field topology, and isosurface analysis using Marching Cubes.

  1. Methodology for the evaluation of the Stephanie Alexander Kitchen Garden program.

    PubMed

    Gibbs, L; Staiger, P K; Townsend, M; Macfarlane, S; Gold, L; Block, K; Johnson, B; Kulas, J; Waters, E

    2013-04-01

    Community and school cooking and gardening programs have recently increased internationally. However, despite promising indications, there is limited evidence of their effectiveness. This paper presents the evaluation framework and methods negotiated and developed to meet the information needs of all stakeholders for the Stephanie Alexander Kitchen Garden (SAKG) program, a combined cooking and gardening program implemented in selectively funded primary schools across Australia. The evaluation used multiple aligned theoretical frameworks and models, including a public health ecological approach, principles of effective health promotion and models of experiential learning. The evaluation is a non-randomised comparison of six schools receiving the program (intervention) and six comparison schools (all government-funded primary schools) in urban and rural areas of Victoria, Australia. A mixed-methods approach was used, relying on qualitative measures to understand changes in school cultures and the experiential impacts on children, families, teachers, parents and volunteers, and quantitative measures at baseline and 1 year follow up to provide supporting information regarding patterns of change. The evaluation study design addressed the limitations of many existing evaluation studies of cooking or garden programs. The multistrand approach to the mixed methodology maintained the rigour of the respective methods and provided an opportunity to explore complexity in the findings. Limited sensitivity of some of the quantitative measures was identified, as well as the potential for bias in the coding of the open-ended questions. The SAKG evaluation methodology will address the need for appropriate evaluation approaches for school-based kitchen garden programs. It demonstrates the feasibility of a meaningful, comprehensive evaluation of school-based programs and also demonstrates the central role qualitative methods can have in a mixed-method evaluation. So what? This paper contributes to debate about appropriate evaluation approaches to meet the information needs of all stakeholders and will support the sharing of measures and potential comparisons between program outcomes for comparable population groups and settings.

  2. Electromagnetic pulsed thermography for natural cracks inspection

    NASA Astrophysics Data System (ADS)

    Gao, Yunlai; Tian, Gui Yun; Wang, Ping; Wang, Haitao; Gao, Bin; Woo, Wai Lok; Li, Kongjing

    2017-02-01

    Emerging integrated sensing and monitoring of material degradation and cracks are increasingly required for characterizing the structural integrity and safety of infrastructure. However, most conventional nondestructive evaluation (NDE) methods are based on single-modality sensing, which is not adequate to evaluate structural integrity and natural cracks. This paper proposed electromagnetic pulsed thermography for fast and comprehensive defect characterization. It combines multiple physical phenomena, i.e. magnetic flux leakage, induced eddy currents and induction heating, linked to physics-based signal processing algorithms, to provide abundant information on material properties and defects. New features based on the first derivative, reflecting the multiphysics spatial and temporal behavior, are proposed to enhance the detection of cracks with different orientations. Promising results, robust to lift-off changes and based on invariant features for artificial and natural crack detection, demonstrate that the proposed method significantly improves defect detectability. It opens up multiphysics sensing and integrated NDE with potential impact for better understanding and quantitative evaluation of natural cracks, including stress corrosion cracking (SCC) and rolling contact fatigue (RCF).

  3. A multicriteria decision analysis of augmentative treatment of upper limbs in persons with tetraplegia.

    PubMed

    Hummel, J M Marjan; Snoek, Govert J; van Til, Janine A; van Rossum, Wouter; Ijzerman, Maarten J

    2005-01-01

    This study supported the evaluation by a rehabilitation team of the performance of two treatment options that improve the arm-hand function in subjects with sixth cervical vertebra (C6) level Motor Group 2 tetraplegia. The analytic hierarchy process, a technique for multicriteria decision analysis, was used by a rehabilitation team and potential recipients to quantitatively compare a new technology, Functional Electrical Stimulation (FES), with conventional surgery. Performance was measured by functional improvement, treatment load, risks, user-friendliness, and social outcomes. Functional improvement after FES was considered better than that after conventional surgery. However, the rehabilitation team's overall rating for conventional surgery was slightly higher than that for FES (57% vs 44%). Compared with the rehabilitation team, potential recipients gave greater weight to burden of treatment and less weight to functional improvement. This study shows that evaluation of new technology must be more comprehensive than the evaluation of functional improvement alone, and that patient preferences may differ from those of the rehabilitation team.

  4. A quantitative framework to evaluate modeling of cortical development by neural stem cells

    PubMed Central

    Stein, Jason L.; de la Torre-Ubieta, Luis; Tian, Yuan; Parikshak, Neelroop N.; Hernandez, Israel A.; Marchetto, Maria C.; Baker, Dylan K.; Lu, Daning; Hinman, Cassidy R.; Lowe, Jennifer K.; Wexler, Eric M.; Muotri, Alysson R.; Gage, Fred H.; Kosik, Kenneth S.; Geschwind, Daniel H.

    2014-01-01

    Neural stem cells have been adopted to model a wide range of neuropsychiatric conditions in vitro. However, how well such models correspond to the in vivo brain has not been evaluated in an unbiased, comprehensive manner. We used transcriptomic analyses to compare in vitro systems to developing human fetal brain and observed strong conservation of in vivo gene expression and network architecture in differentiating primary human neural progenitor cells (phNPCs). Conserved modules are enriched in genes associated with ASD, supporting the utility of phNPCs for studying neuropsychiatric disease. We also developed and validated a machine learning approach called CoNTExT that identifies the developmental maturity and regional identity of in vitro models. We observed strong differences between in vitro models, including hiPSC-derived neural progenitors from multiple laboratories. This work provides a systems biology framework for evaluating in vitro systems and supports their value in studying the molecular mechanisms of human neurodevelopmental disease. PMID:24991955

  5. [Land use and land cover change (LUCC) and landscape service: Evaluation, mapping and modeling].

    PubMed

    Song, Zhang-jian; Cao, Yu; Tan, Yong-zhong; Chen, Xiao-dong; Chen, Xian-peng

    2015-05-01

    Studies on ecosystem services from a landscape-scale perspective have received increasing attention from researchers all over the world. Compared with the ecosystem scale, the landscape scale is more suitable for exploring the influence of human activities on land use and land cover change (LUCC) and for interpreting the mechanisms and processes of sustainable landscape dynamics. Based on a comprehensive and systematic analysis of research on landscape service, this paper first discusses the basic concepts and classification of landscape service. Methods for the evaluation, mapping and modeling of landscape service are then analyzed and summarized. Finally, future trends for research on landscape service are proposed: further exploring the connotation and classification system of landscape service, improving the methods and quantitative indicators for its evaluation, mapping and modelling, carrying out long-term integrated research on landscape pattern-process-service-scale relationships, and strengthening the application of theories and methods from landscape economics and landscape ecology are identified as important fields for future research on landscape service.

  6. A Performance Comparison of Feature Detectors for Planetary Rover Mapping and Localization

    NASA Astrophysics Data System (ADS)

    Wan, W.; Peng, M.; Xing, Y.; Wang, Y.; Liu, Z.; Di, K.; Teng, B.; Mao, X.; Zhao, Q.; Xin, X.; Jia, M.

    2017-07-01

    Feature detection and matching are key techniques in computer vision and robotics, and have been successfully implemented in many fields. So far there is no performance comparison of feature detectors and matching methods for planetary mapping and rover localization using rover stereo images. In this research, we present a comprehensive evaluation and comparison of six feature detectors, including Moravec, Förstner, Harris, FAST, SIFT and SURF, aiming for optimal implementation of feature-based matching in a planetary surface environment. To facilitate quantitative analysis, a series of evaluation criteria, including distribution evenness of matched points, coverage of detected points, and feature matching accuracy, are developed in the research. In order to perform an exhaustive evaluation, stereo images, simulated under different baselines, pitch angles, and intervals between adjacent rover locations, are taken as the experimental data source. The comparison results show that SIFT offers the best overall performance; in particular, it is the least sensitive to changes between images taken at adjacent locations.
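
    A minimal sketch of the SIFT detection and ratio-test matching step evaluated above, using OpenCV; the stereo image file names are placeholders, and parameters such as the 0.75 ratio threshold are common defaults rather than values from this study.

      import cv2

      left = cv2.imread("rover_left.png", cv2.IMREAD_GRAYSCALE)    # placeholder file
      right = cv2.imread("rover_right.png", cv2.IMREAD_GRAYSCALE)  # placeholder file

      sift = cv2.SIFT_create()
      kp_l, des_l = sift.detectAndCompute(left, None)
      kp_r, des_r = sift.detectAndCompute(right, None)

      # Lowe's ratio test: keep matches whose best distance clearly beats the second best.
      matcher = cv2.BFMatcher(cv2.NORM_L2)
      good = [m for m, n in matcher.knnMatch(des_l, des_r, k=2)
              if m.distance < 0.75 * n.distance]
      print(f"{len(kp_l)}/{len(kp_r)} keypoints, {len(good)} ratio-test matches")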

  7. Tendencies of Comprehension and Acceptance of Theory of Evolution: A Study Involving Students from the Faculties of Education and Theology

    ERIC Educational Resources Information Center

    Bilen, Kadir; Ercan, Orhan

    2016-01-01

    The current study aimed to identify the views of students from the Faculty of Education and the Faculty of Theology from the University of Kahramanmaras Sutcu Imam regarding their comprehension and acceptance of the theory of evolution. A survey model was used involving a quantitative research design. The working group of the study was composed of…

  8. Community College Student Retention: Determining the Effects of a Comprehensive Support and Access Intervention Program Targeting Low-Income and Working Poor at a Large Urban Minority-Serving Institution

    ERIC Educational Resources Information Center

    Saltiel, Henry

    2011-01-01

    A quasi-experiment using quantitative methods was conducted to examine the effects on academic student outcomes when a cohort of employed low-SES community college commuter students (the treatment group, N=198) participated in a comprehensive support and access intervention program, compared with similar students (the matched comparison group,…

  9. Quantitative DNA Methylation Analysis Identifies a Single CpG Dinucleotide Important for ZAP-70 Expression and Predictive of Prognosis in Chronic Lymphocytic Leukemia

    PubMed Central

    Claus, Rainer; Lucas, David M.; Stilgenbauer, Stephan; Ruppert, Amy S.; Yu, Lianbo; Zucknick, Manuela; Mertens, Daniel; Bühler, Andreas; Oakes, Christopher C.; Larson, Richard A.; Kay, Neil E.; Jelinek, Diane F.; Kipps, Thomas J.; Rassenti, Laura Z.; Gribben, John G.; Döhner, Hartmut; Heerema, Nyla A.; Marcucci, Guido; Plass, Christoph; Byrd, John C.

    2012-01-01

    Purpose Increased ZAP-70 expression predicts poor prognosis in chronic lymphocytic leukemia (CLL). Current methods for accurately measuring ZAP-70 expression are problematic, preventing widespread application of these tests in clinical decision making. We therefore used comprehensive DNA methylation profiling of the ZAP-70 regulatory region to identify sites important for transcriptional control. Patients and Methods High-resolution quantitative DNA methylation analysis of the entire ZAP-70 gene regulatory regions was conducted on 247 samples from patients with CLL from four independent clinical studies. Results Through this comprehensive analysis, we identified a small area in the 5′ regulatory region of ZAP-70 that showed large variability in methylation in CLL samples but was universally methylated in normal B cells. High correlation with mRNA and protein expression, as well as activity in promoter reporter assays, revealed that within this differentially methylated region, a single CpG dinucleotide and neighboring nucleotides are particularly important in ZAP-70 transcriptional regulation. Furthermore, by using clustering approaches, we identified a prognostic role for this site in four independent data sets of patients with CLL using time to treatment, progression-free survival, and overall survival as clinical end points. Conclusion Comprehensive quantitative DNA methylation analysis of the ZAP-70 gene in CLL identified important regions responsible for transcriptional regulation. In addition, loss of methylation at a specific single CpG dinucleotide in the ZAP-70 5′ regulatory sequence is a highly predictive and reproducible biomarker of poor prognosis in this disease. This work demonstrates the feasibility of using quantitative specific ZAP-70 methylation analysis as a relevant clinically applicable prognostic test in CLL. PMID:22564988

  10. Quantification of liver fat: A comprehensive review.

    PubMed

    Goceri, Evgin; Shah, Zarine K; Layman, Rick; Jiang, Xia; Gurcan, Metin N

    2016-04-01

    Fat accumulation in the liver causes metabolic diseases such as obesity, hypertension, diabetes or dyslipidemia by affecting insulin resistance, and increasing the risk of cardiac complications and cardiovascular disease mortality. Fatty liver diseases are often reversible in their early stage; therefore, there is a recognized need to detect their presence and to assess its severity to recognize fat-related functional abnormalities in the liver. This is crucial in evaluating living liver donors prior to transplantation because fat content in the liver can change liver regeneration in the recipient and donor. There are several methods to diagnose fatty liver, measure the amount of fat, and to classify and stage liver diseases (e.g. hepatic steatosis, steatohepatitis, fibrosis and cirrhosis): biopsy (the gold-standard procedure), clinical (medical physics based) and image analysis (semi or fully automated approaches). Liver biopsy has many drawbacks: it is invasive, inappropriate for monitoring (i.e., repeated evaluation), and assessment of steatosis is somewhat subjective. Qualitative biomarkers are mostly insufficient for accurate detection since fat has to be quantified by a varying threshold to measure disease severity. Therefore, a quantitative biomarker is required for detection of steatosis, accurate measurement of severity of diseases, clinical decision-making, prognosis and longitudinal monitoring of therapy. This study presents a comprehensive review of both clinical and automated image analysis based approaches to quantify liver fat and evaluate fatty liver diseases from different medical imaging modalities. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Life Cycle Energy Analysis of Reclaimed Water Reuse Projects in Beijing.

    PubMed

    Fan, Yupeng; Guo, Erhui; Zhai, Yuanzheng; Chang, Andrew C; Qiao, Qi; Kang, Peng

    2018-01-01

    To illustrate the benefits of a water reuse project, process-based life cycle analysis (LCA) can be combined with input-output LCA to evaluate the project. Energy is the only evaluation parameter used in this study. Life cycle assessment of all energy inputs (LCEA) is completed mainly through the life cycle inventory (LCI), taking into account the full life cycle including the construction, operation, and demolition phases of the project. Assessment of the benefits of water reuse over the life cycle should focus on wastewater discharge reduction and water-saving benefits. The comprehensive LCEA results for a Beijing water reuse project built in 2014 show that the benefits obtained from reclaimed water reuse far exceed the life cycle energy consumption. In this paper, the authors apply the LCEA model to estimate the benefits of reclaimed water reuse projects quantitatively.
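
    A minimal sketch of the life-cycle energy balance logic described above: energy inputs are summed over the construction, operation and demolition phases and compared with the energy credited to avoided freshwater supply and wastewater discharge. All figures are placeholders, not data from the Beijing project.

      phases_gj = {"construction": 1.2e5, "operation": 8.0e5, "demolition": 0.3e5}
      benefits_gj = {"avoided freshwater supply": 6.5e5, "avoided wastewater discharge": 4.0e5}

      total_input = sum(phases_gj.values())      # life-cycle energy consumption
      total_benefit = sum(benefits_gj.values())  # energy benefit of reclaimed water reuse
      print(f"input {total_input:,.0f} GJ, benefit {total_benefit:,.0f} GJ, "
            f"net {total_benefit - total_input:,.0f} GJ")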

  12. Flight test and evaluation of Omega navigation in a general aviation aircraft. Volume 1: Technical

    NASA Technical Reports Server (NTRS)

    Howell, J. D.; Hoffman, W. C.; Hwoschinsky, P. V.; Wischmeyer, C. E.

    1975-01-01

    A low cost flight research program was conducted to evaluate the performance of differential Omega navigation in a general aviation aircraft. The flight program consisted of two distinct parts corresponding to the two major objectives of the study. The Wallops Flight Program was conducted to obtain Omega signal and phase data in the Wallops Flight Center vicinity to provide preliminary technical information and experience in preparation for a comprehensive NASA/FAA flight test program of an experimental differential Omega system. The Northeast Corridor Flight Program was conducted to examine Omega operational suitability and performance on low altitude area navigation (RNAV) routes for city-center to city-center VTOL commercial operations in the Boston-New York-Washington corridor. The development, execution and conclusions of the flight research program are described. The results of the study provide both quantitative and qualitative data on the Omega Navigation System under actual operating conditions.

  13. Zn or O? An Atomic Level Comparison on Antibacterial Activities of Zinc Oxides.

    PubMed

    Yu, Fen; Fang, Xuan; Jia, Huimin; Liu, Miaoxing; Shi, Xiaotong; Xue, Chaowen; Chen, Tingtao; Wei, Zhipeng; Fang, Fang; Zhu, Hui; Xin, Hongbo; Feng, Jing; Wang, Xiaolei

    2016-06-06

    For the first time, the influence of different types of atoms (Zn and O) on the antibacterial activities of nanosized ZnO was quantitatively evaluated with the aid of a 3D-printing-manufactured evaluation system. Two different outermost atomic layers were manufactured separately by using an ALD (atomic layer deposition) method. Interestingly, we found that each outermost atomic layer exhibited certain differences against gram-positive or gram-negative bacterial species. Zinc atoms as outermost layer (ZnO-Zn) showed a more pronounced antibacterial effect towards gram-negative E. coli (Escherichia coli), whereas oxygen atoms (ZnO-O) showed a stronger antibacterial activity against gram-positive S. aureus (Staphylococcus aureus). A possible antibacterial mechanism has been comprehensively discussed from different perspectives, including Zn(2+) concentrations, oxygen vacancies, photocatalytic activities and the DNA structural characteristics of different bacterial species. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Enriching semantic knowledge bases for opinion mining in big data applications

    PubMed Central

    Weichselbraun, A.; Gindl, S.; Scharl, A.

    2014-01-01

    This paper presents a novel method for contextualizing and enriching large semantic knowledge bases for opinion mining with a focus on Web intelligence platforms and other high-throughput big data applications. The method is not only applicable to traditional sentiment lexicons, but also to more comprehensive, multi-dimensional affective resources such as SenticNet. It comprises the following steps: (i) identify ambiguous sentiment terms, (ii) provide context information extracted from a domain-specific training corpus, and (iii) ground this contextual information to structured background knowledge sources such as ConceptNet and WordNet. A quantitative evaluation shows a significant improvement when using an enriched version of SenticNet for polarity classification. Crowdsourced gold standard data in conjunction with a qualitative evaluation sheds light on the strengths and weaknesses of the concept grounding, and on the quality of the enrichment process. PMID:25431524

  15. Identification and quantitative evaluation of the fiber structure in the pathological tissue using Mueller matrix microscope

    NASA Astrophysics Data System (ADS)

    Zhou, Jialing; He, Honghui; Wang, Ye; Ma, Hui

    2017-02-01

    Fiber structure changes in various pathological processes, such as the increase of fibrosis in liver diseases and the derangement of fibers in cervical cancer. Currently, clinical pathologic diagnosis is regarded as the gold standard, but doctors with differing knowledge and experience may reach different conclusions. Quantitative evaluation of the fiber structure in pathological tissue can therefore be of great service to quantitative diagnosis. Mueller matrix measurement is capable of probing comprehensive microstructural information of samples, and different wavelengths of light can provide additional information. In this paper, we use a Mueller matrix microscope with light sources at six different wavelengths. We use unstained, dewaxed liver tissue slices at four stages of fibrosis and pathological biopsies of the filtration channels from rabbit eyes as samples. We apply the Mueller matrix polar decomposition (MMPD) parameter δ, which corresponds to retardance, to the liver slices. The mean value in the abnormal region increases as the level of fibrosis increases, and light at shorter wavelengths is more sensitive to the fiber microstructure. On the other hand, we use the Mueller matrix transformation (MMT) parameter Φ, which is associated with the angle of the fast axis, in the analysis of the slices of the filtration channels from rabbit eyes. The kurtosis and skewness values show large differences between newly formed and normal regions and can reveal the arrangement of fibers. These results indicate that the Mueller matrix microscope has great potential for auxiliary diagnosis.
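
    A minimal sketch of the region statistics mentioned above: skewness and kurtosis of a Mueller-matrix-derived parameter image inside a region of interest. The Φ map and the ROI mask are synthetic stand-ins, not measured data.

      import numpy as np
      from scipy.stats import skew, kurtosis

      rng = np.random.default_rng(1)
      phi_image = rng.normal(loc=0.2, scale=0.05, size=(256, 256))  # hypothetical Φ map
      roi_mask = np.zeros_like(phi_image, dtype=bool)
      roi_mask[64:128, 64:128] = True                               # hypothetical ROI

      roi_values = phi_image[roi_mask]
      print("mean     =", round(float(roi_values.mean()), 4))
      print("skewness =", round(float(skew(roi_values)), 4))
      print("kurtosis =", round(float(kurtosis(roi_values)), 4))  # Fisher definition (normal -> 0)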

  16. Neurobiological bases of reading comprehension: Insights from neuroimaging studies of word level and text level processing in skilled and impaired readers

    PubMed Central

    Landi, Nicole; Frost, Stephen J.; Mencl, W. Einar; Sandak, Rebecca; Pugh, Kenneth R.

    2012-01-01

    For accurate reading comprehension, readers must first learn to map letters to their corresponding speech sounds and meaning and then they must string the meanings of many words together to form a representation of the text. Furthermore, readers must master the complexities involved in parsing the relevant syntactic and pragmatic information necessary for accurate interpretation. Failure in this process can occur at multiple levels and cognitive neuroscience has been helpful in identifying the underlying causes of success and failure in reading single words and in reading comprehension. In general, neurobiological studies of skilled reading comprehension indicate a highly overlapping language circuit for single word reading, reading comprehension and listening comprehension with largely quantitative differences in a number of reading and language related areas. This paper reviews relevant research from studies employing neuroimaging techniques to study reading with a focus on the relationship between reading skill, single word reading, and text comprehension. PMID:23662034

  17. Find Pairs: The Module for Protein Quantification of the PeakQuant Software Suite

    PubMed Central

    Eisenacher, Martin; Kohl, Michael; Wiese, Sebastian; Hebeler, Romano; Meyer, Helmut E.

    2012-01-01

    Accurate quantification of proteins is one of the major tasks in current proteomics research. To address this issue, a wide range of stable isotope labeling techniques have been developed, allowing one to quantitatively study thousands of proteins by means of mass spectrometry. In this article, the FindPairs module of the PeakQuant software suite is detailed. It facilitates the automatic determination of protein abundance ratios based on the automated analysis of stable isotope-coded mass spectrometric data. Furthermore, it implements statistical methods to determine outliers due to biological as well as technical variance of proteome data obtained in replicate experiments. This provides an important means to evaluate the significance of the obtained protein expression data. To demonstrate the high applicability of FindPairs, we focused on the quantitative analysis of proteome data acquired in 14N/15N labeling experiments. We further provide a comprehensive overview of the features of the FindPairs software, and compare these with existing quantification packages. The software presented here supports a wide range of proteomics applications, allowing one to quantitatively assess data derived from different stable isotope labeling approaches, such as 14N/15N labeling, SILAC, and iTRAQ. The software is publicly available at http://www.medizinisches-proteom-center.de/software and free for academic use. PMID:22909347
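
    The sketch below is not the FindPairs algorithm itself, but a minimal illustration of the underlying ratio computation: log2 light/heavy peptide intensity ratios summarised per protein, with a simple median/MAD rule standing in for the outlier screening the abstract describes. The intensities are invented placeholders.

      import numpy as np

      light = np.array([2.1e6, 1.9e6, 2.4e6, 9.5e6, 2.0e6])  # hypothetical 14N intensities
      heavy = np.array([1.0e6, 1.1e6, 1.2e6, 1.0e6, 0.9e6])  # hypothetical 15N intensities

      log_ratios = np.log2(light / heavy)
      median = np.median(log_ratios)
      mad = np.median(np.abs(log_ratios - median))
      outliers = np.abs(log_ratios - median) > 3 * 1.4826 * mad  # ~3-sigma robust rule

      print("protein log2 ratio (robust):", round(float(median), 3))
      print("outlier peptide indices:", np.where(outliers)[0].tolist())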

  18. Electron paramagnetic resonance oximetry as a quantitative method to measure cellular respiration: a consideration of oxygen diffusion interference.

    PubMed

    Presley, Tennille; Kuppusamy, Periannan; Zweier, Jay L; Ilangovan, Govindasamy

    2006-12-15

    Electron paramagnetic resonance (EPR) oximetry is being widely used to measure the oxygen consumption of cells, mitochondria, and submitochondrial particles. However, further improvement of this technique, in terms of data analysis, is required to use it as a quantitative tool. Here, we present a new approach for quantitative analysis of cellular respiration using EPR oximetry. The course of oxygen consumption by cells in suspension has been observed to have three distinct zones: pO(2)-independent respiration at higher pO(2) ranges, pO(2)-dependent respiration at low pO(2) ranges, and a static equilibrium with no change in pO(2) at very low pO(2) values. The approach here enables one to comprehensively analyze all three zones together, where the progression of O(2) diffusion zones around each cell, their overlap over time, and their potential impact on the measured pO(2) data are considered. The obtained results agree with previously established methods such as high-resolution respirometry measurements. Additionally, it is also demonstrated how the diffusion limitations can depend on cell density and consumption rate. In conclusion, the new approach establishes a more accurate and meaningful model to evaluate EPR oximetry data on cellular respiration and to quantify the related parameters.

  19. Flightdeck Automation Problems (FLAP) Model for Safety Technology Portfolio Assessment

    NASA Technical Reports Server (NTRS)

    Ancel, Ersin; Shih, Ann T.

    2014-01-01

    NASA's Aviation Safety Program (AvSP) develops and advances methodologies and technologies to improve air transportation safety. The Safety Analysis and Integration Team (SAIT) conducts a safety technology portfolio assessment (PA) to analyze the program content, to examine the benefits and risks of products with respect to program goals, and to support programmatic decision making. The PA process includes systematic identification of current and future safety risks as well as tracking several quantitative and qualitative metrics to ensure the program goals are addressing prominent safety risks accurately and effectively. One of the metrics within the PA process involves using quantitative aviation safety models to gauge the impact of the safety products. This paper demonstrates the role of aviation safety modeling by providing model outputs and evaluating a sample of portfolio elements using the Flightdeck Automation Problems (FLAP) model. The model enables not only ranking of the quantitative relative risk reduction impact of all portfolio elements, but also highlighting the areas with high potential impact via sensitivity and gap analyses in support of the program office. Although the model outputs are preliminary and products are notional, the process shown in this paper is essential to a comprehensive PA of NASA's safety products in the current program and future programs/projects.

  20. Evaluation of different derivatisation approaches for gas chromatographic-mass spectrometric analysis of carbohydrates in complex matrices of biological and synthetic origin.

    PubMed

    Becker, M; Zweckmair, T; Forneck, A; Rosenau, T; Potthast, A; Liebner, F

    2013-03-15

    Gas chromatographic analysis of complex carbohydrate mixtures requires highly effective and reliable derivatisation strategies for successful separation, identification, and quantitation of all constituents. Different single-step (per-trimethylsilylation, isopropylidenation) and two-step approaches (ethoximation-trimethylsilylation, ethoximation-trifluoroacetylation, benzoximation-trimethylsilylation, benzoximation-trifluoroacetylation) have been comprehensively studied with regard to chromatographic characteristics, informational value of mass spectra, ease of peak assignment, robustness toward matrix effects, and quantitation using a set of reference compounds that comprise eight monosaccharides (C(5)-C(6)), glycolaldehyde, and dihydroxyacetone. It has been shown that isopropylidenation and the two oximation-trifluoroacetylation approaches are least suitable for complex carbohydrate matrices. Whereas the former is limited to compounds that contain vicinal dihydroxy moieties in cis configuration, the latter two methods are sensitive to traces of trifluoroacetic acid which strongly supports decomposition of ketohexoses. It has been demonstrated for two "real" carbohydrate-rich matrices of biological and synthetic origin, respectively, that two-step ethoximation-trimethylsilylation is superior to other approaches due to the low number of peaks obtained per carbohydrate, good peak separation performance, structural information of mass spectra, low limits of detection and quantitation, minor relative standard deviations, and low sensitivity toward matrix effects. Copyright © 2013 Elsevier B.V. All rights reserved.

  1. Patient-focused goal planning process and outcome after spinal cord injury rehabilitation: quantitative and qualitative audit.

    PubMed

    Byrnes, Michelle; Beilby, Janet; Ray, Patricia; McLennan, Renee; Ker, John; Schug, Stephan

    2012-12-01

    Objective: To evaluate the process and outcome of a multidisciplinary inpatient goal planning rehabilitation programme on physical, social and psychological functioning for patients with spinal cord injury. Design: Clinical audit: quantitative and qualitative analyses. Setting: Specialist spinal injury unit, Perth, Australia. Subjects: Consecutive series of 100 newly injured spinal cord injury inpatients. Main measure(s): The Needs Assessment Checklist (NAC), patient-focused goal planning questionnaire and goal planning progress form. Results: The clinical audit of 100 spinal cord injured patients revealed that 547 goal planning meetings were held, with 8531 goals stipulated in total. Seventy-five per cent of the goals set at the first goal planning meeting were achieved by the second meeting, and the rate of goal achievement at subsequent goal planning meetings dropped to 56%. Based on quantitative analysis of physical, social and psychological functioning, the 100 spinal cord injury patients improved significantly from baseline to discharge. Furthermore, qualitative analysis revealed benefits of the goal planning rehabilitation programme consistently reported by spinal cord injury patients, namely improvements in their physical, social and psychological adjustment to injury. Conclusions: The findings of this clinical audit underpin the need for patient-focused goal planning rehabilitation programmes which are tailored to the individual's needs and involve a comprehensive multidisciplinary team.

  2. [Analysis of commercial specifications and grades of wild and cultivated Gentianae Macrophyllae Radix based on multi-indicative constituents].

    PubMed

    Yang, Yan-Mei; Lin, Li; Lu, You-Yuan; Ma, Xiao-Hui; Jin, Ling; Zhu, Tian-Tian

    2016-03-01

    This study aimed to analyze the commercial specifications and grades of wild and cultivated Gentianae Macrophyllae Radix based on multiple indicative constituents. Seven main chemical components contained in Gentianae Macrophyllae Radix were determined by UPLC, and the quality levels of the samples were then clustered and classified by modern statistical methods (canonical correspondence analysis, Fisher discriminant analysis and so on). Quality indices were selected and their correlations analyzed. Lastly, comprehensive quantitative grade divisions for quality were established across the different commercial specifications, and across the different grades within the same specification, of wild and cultivated material. The results provide a basis for a reasonable division of the commercial specifications and grades of Gentianae Macrophyllae Radix. Quality evaluation ranges for the main index components (gentiopicrin, loganic acid and swertiamarin) were proposed, and a Herbal Quality Index (HQI) was introduced. A rank discriminant function based on quality was established by Fisher discriminant analysis. According to the analysis, the quality of wild and cultivated Luobojiao, one commercial specification of Gentianae Macrophyllae Radix, was the best; Mahuajiao, another commercial specification, was average; and Xiaoqinjiao was inferior. Among grades, the quality of first-class cultivated Luobojiao was the worst, second-class intermediate, and third-class the best; for wild Luobojiao, first-class quality was intermediate and second-class the best; for Mahuajiao, second-class quality was intermediate and first-class the best; for Xiaoqinjiao, first-class quality was intermediate and second-class the better of the two grades, although the difference was not significant. The method provides a new idea and approach for comprehensive quantitative evaluation of the quality of Gentianae Macrophyllae Radix. Copyright© by the Chinese Pharmaceutical Association.
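
    As an illustration of the Fisher discriminant grading step, the sketch below fits a linear discriminant to a table of indicator-constituent contents with known grade labels and classifies a new batch; the numbers are synthetic placeholders, not measured contents.

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      rng = np.random.default_rng(2)
      # Columns: hypothetical contents of three index components per batch.
      X = np.vstack([rng.normal([4.5, 2.5, 0.8], 0.3, size=(20, 3)),   # "first grade"
                     rng.normal([3.0, 1.8, 0.5], 0.3, size=(20, 3))])  # "second grade"
      y = np.array(["first"] * 20 + ["second"] * 20)

      lda = LinearDiscriminantAnalysis().fit(X, y)
      new_batch = [[4.1, 2.2, 0.7]]   # hypothetical batch to grade
      print(lda.predict(new_batch), lda.predict_proba(new_batch).round(3))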

  3. NHI-PharmaCloud in Taiwan--A preliminary evaluation using the RE-AIM framework and lessons learned.

    PubMed

    Huang, San-Kuei; Wang, Pen-Jen; Tseng, Wen-Fuh; Syu, Fei-Kai; Lee, Miaw-Chwen; Shih, Ru-Liang; Sheen, Mao-Ting; Chen, Michael S

    2015-10-01

    The aim of this article is to present the preliminary impact of a medication monitoring program, PharmaCloud, in Taiwan and analyze the embedded factors that have contributed to its performance. This article also compares PharmaCloud with similar international programs in order to draw lessons learned. The five domains of the RE-AIM framework - reach, effectiveness, adoption, implementation, and maintenance - were examined using qualitative and quantitative data. A difference-in-differences model was applied to analyze the quantitative impact of PharmaCloud on drug utilization and drug expenses. The qualitative impact was evaluated by document analysis based on field reports from the participating medical institutions. Reach and adoption: although all of the major hospitals adopted PharmaCloud and some of the hospitals had high inquiry rates, more time and incentives are needed to raise the overall inquiry rate. Effectiveness: during the study period of 3 months, the number of medications per prescription declined by 0.15 more in the intervention group than in the general population, and drug expense per person declined by NT$567 (US$18.9) more in the intervention group than in the general population. The potential savings could be between 2% and 5% of the total pharmaceutical expenditure. Medication duplication was found to have decreased more in the intervention group. Implementation: a variety of innovations in care delivery are being developed in which the pharmacists play a more significant role. Maintenance: the embedded National Health Insurance would lend strong support for PharmaCloud to grow and thrive. PharmaCloud owes its effectiveness to the embedded National Health Insurance (NHI) program, which is universal and provides a comprehensive benefit package including more than 16,000 prescription drugs. An effective medication program is one that operates under the principle of universality and comprehensiveness, facilitates innovations, and has a substantial level of interoperability with the intra-hospital health information systems. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
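
    A minimal sketch of the difference-in-differences estimate reported above, assuming a long-format table with one row per person and period; the column names and numbers are placeholders, not NHI data. The coefficient on the treated-by-post interaction is the estimated program effect.

      import pandas as pd
      import statsmodels.formula.api as smf

      df = pd.DataFrame({
          "drug_expense": [620, 600, 610, 600, 630, 615, 605, 598],
          "treated":      [1,   1,   0,   0,   1,   1,   0,   0],  # PharmaCloud users
          "post":         [0,   1,   0,   1,   0,   1,   0,   1],  # after roll-out
      })

      model = smf.ols("drug_expense ~ treated * post", data=df).fit()
      print(model.params["treated:post"])  # difference-in-differences estimate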

  4. Classifying oxidative stress by F2-isoprostane levels across human diseases: A meta-analysis.

    PubMed

    van 't Erve, Thomas J; Kadiiska, Maria B; London, Stephanie J; Mason, Ronald P

    2017-08-01

    The notion that oxidative stress plays a role in virtually every human disease and environmental exposure has become ingrained in everyday knowledge. However, mounting evidence regarding the lack of specificity of biomarkers traditionally used as indicators of oxidative stress in human disease and exposures now necessitates re-evaluation. To prioritize these re-evaluations, published literature was comprehensively analyzed in a meta-analysis to quantitatively classify the levels of systemic oxidative damage across human disease and in response to environmental exposures. In this meta-analysis, the F2-isoprostane, 8-iso-PGF2α, was specifically chosen as the representative marker of oxidative damage. To combine published values across measurement methods and specimens, the standardized mean differences (Hedges' g) in 8-iso-PGF2α levels between affected and control populations were calculated. The meta-analysis resulted in a classification of oxidative damage levels as measured by 8-iso-PGF2α across 50 human health outcomes and exposures from 242 distinct publications. Relatively small increases in 8-iso-PGF2α levels (g<0.8) were found in the following conditions: hypertension (g=0.4), metabolic syndrome (g=0.5), asthma (g=0.4), and tobacco smoking (g=0.7). In contrast, large increases in 8-iso-PGF2α levels were observed in pathologies of the kidney, e.g., chronic renal insufficiency (g=1.9), obstructive sleep apnoea (g=1.1), and pre-eclampsia (g=1.1), as well as respiratory tract disorders, e.g., cystic fibrosis (g=2.3). In conclusion, we have established a quantitative classification for the level of 8-iso-PGF2α generation in different human pathologies and exposures based on a comprehensive meta-analysis of published data. This analysis provides knowledge on the true involvement of oxidative damage across human health outcomes as well as utilizes past research to prioritize those conditions requiring further scrutiny on the mechanisms of biomarker generation. Copyright © 2017. Published by Elsevier B.V.
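
    As an illustration of the effect-size measure used for the classification above, the sketch below computes Hedges' g (the bias-corrected standardized mean difference) from summary statistics; the case and control values are invented, not taken from the meta-analysis.

      import math

      def hedges_g(m1, sd1, n1, m2, sd2, n2):
          """Bias-corrected standardized mean difference between two groups."""
          pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
          d = (m1 - m2) / pooled_sd
          correction = 1 - 3 / (4 * (n1 + n2) - 9)   # small-sample bias correction
          return d * correction

      # Hypothetical 8-iso-PGF2α levels (pg/mL) in an affected group vs. controls.
      print(round(hedges_g(310, 80, 40, 250, 75, 45), 2))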

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ehlen, Mark Andrew; Vugrin, Eric D.; Warren, Drake E.

    In recent years, the nation has recognized that critical infrastructure protection should consider not only the prevention of disruptive events, but also the processes that infrastructure systems undergo to maintain functionality following disruptions. This more comprehensive approach has been termed critical infrastructure resilience (CIR). Given the occurrence of a particular disruptive event, the resilience of a system to that event is the system's ability to efficiently reduce both the magnitude and duration of the deviation from targeted system performance levels. Sandia National Laboratories (Sandia) has developed a comprehensive resilience assessment framework for evaluating the resilience of infrastructure and economic systems. The framework includes a quantitative methodology that measures resilience costs that result from a disruption to infrastructure function. The framework also includes a qualitative analysis methodology that assesses system characteristics that affect resilience in order to provide insight and direction for potential improvements to resilience. This paper describes the resilience assessment framework. This paper further demonstrates the utility of the assessment framework through application to a hypothetical scenario involving the disruption of a petrochemical supply chain by a hurricane.

  6. Performance evaluation model of a pilot food waste collection system in Suzhou City, China.

    PubMed

    Wen, Zongguo; Wang, Yuanjia; De Clercq, Djavan

    2015-05-01

    This paper analyses the food waste collection and transportation (C&T) system in a pilot project in Suzhou by using a novel performance evaluation method. The method employed to conduct this analysis involves a unified performance evaluation index containing qualitative and quantitative indicators applied to data from Suzhou City. Two major inefficiencies were identified: a) low system efficiency due to insufficient processing capacity of commercial food waste facilities; and b) low waste resource utilization due to low efficiency of manual sorting. The performance evaluation indicated that the pilot project collection system's strong points included strong economics, low environmental impact and low social impact. This study also shows that Suzhou's integrated system has developed a comprehensive body of laws and clarified regulatory responsibilities for each of the various government departments to solve the problems of commercial food waste management. Based on Suzhou's experience, perspectives and lessons can be drawn for other cities and areas where food waste management systems are in the planning stage, or are encountering operational problems. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Evaluation of data quality at the National Cancer Registry of Ukraine.

    PubMed

    Ryzhov, Anton; Bray, Freddie; Ferlay, Jacques; Fedorenko, Zoya; Goulak, Liudmyla; Gorokh, Yevgeniy; Soumkina, Olena; Znaor, Ariana

    2018-04-01

    Cancer notification has been mandatory in Ukraine since 1953, with the National Cancer Registry of Ukraine (NCRU) established in 1996. The aim of this study was to provide a comprehensive evaluation of the data quality at the NCRU. Qualitative and semi-quantitative methods were used to assess the comparability, completeness, validity and timeliness of cancer incidence data from the NCRU for the period 2002-2012. Cancer registration procedures at the NCRU are in accordance with international standards and recommendations. Semi-quantitative methods suggested the NCRU's data was reasonably complete, although decreases in age-specific incidence and mortality rates in the elderly indicated some missing cases at older ages. The proportion of microscopically-verified cases increased from 73.6% in 2002 to 82.3% in 2012, with death-certificate-only (DCO) proportions stable at around 0.1% and unknown stage recorded in 9.6% of male and 7.5% of female solid tumours. Timeliness was considered acceptable, with reporting >99% complete within a turn-around time of 15 months. While timely reporting of national data reflects the advantages of a mandatory data collection system, a low DCO% and observed age-specific declines suggest possible underreporting of incidence and mortality data, particularly at older ages. Overall, the evaluation indicates that the data are reasonably comparable and thus may be used to describe the magnitude of the cancer burden in Ukraine. Given its central role in monitoring and evaluation of cancer control activities, ensuring the sustainability of NCRU operations throughout the process of healthcare system reform is of utmost importance. Copyright © 2018 Elsevier Ltd. All rights reserved.

  8. Quantitative evaluation of waste prevention on the level of small and medium sized enterprises (SMEs).

    PubMed

    Laner, David; Rechberger, Helmut

    2009-02-01

    Waste prevention is a principal means of achieving the goals of waste management and a key element for developing sustainable economies. Small and medium sized enterprises (SMEs) contribute substantially to environmental degradation, often not even being aware of their environmental effects. Therefore, several initiatives have been launched in Austria aimed at supporting waste prevention measures on the level of SMEs. To promote the most efficient projects, they have to be evaluated with respect to their contribution to the goals of waste management. It is the aim of this paper to develop a methodology for evaluating waste prevention measures in SMEs based on their goal orientation. At first, conceptual problems of defining and delineating waste prevention activities are briefly discussed. Then an approach to evaluate waste prevention activities with respect to their environmental performance is presented and benchmarks which allow for an efficient use of the available funds are developed. Finally the evaluation method is applied to a number of former projects and the calculated results are analysed with respect to shortcomings and limitations of the model. It is found that the developed methodology can provide a tool for a more objective and comprehensible evaluation of waste prevention measures.

  9. Using learning analytics to evaluate a video-based lecture series.

    PubMed

    Lau, K H Vincent; Farooque, Pue; Leydon, Gary; Schwartz, Michael L; Sadler, R Mark; Moeller, Jeremy J

    2018-01-01

    The video-based lecture (VBL), an important component of the flipped classroom (FC) and massive open online course (MOOC) approaches to medical education, has primarily been evaluated through direct learner feedback. Evaluation may be enhanced through learner analytics (LA) - analysis of quantitative audience usage data generated by video-sharing platforms. We applied LA to an experimental series of ten VBLs on electroencephalography (EEG) interpretation, uploaded to YouTube in the model of a publicly accessible MOOC. Trends in view count, total percentage of video viewed, and audience retention (AR) (percentage of viewers watching at a time point compared to the initial total) were examined. The pattern of average AR decline was characterized using regression analysis, revealing a uniform linear decline in viewership for each video, with no evidence of an optimal VBL length. Segments with transient increases in AR corresponded to those focused on core concepts, indicative of content requiring more detailed evaluation. We propose a model for applying LA at four levels: global, series, video, and feedback. LA may be a useful tool in evaluating a VBL series. Our proposed model combines analytics data and learner self-report for comprehensive evaluation.
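
    A minimal sketch of the audience-retention trend fit: a linear regression of retention (percent of the initial audience still watching) against normalized video position. The retention values below are illustrative, not the study's YouTube analytics.

      import numpy as np
      from scipy import stats

      position = np.linspace(0, 1, 11)  # fraction of the video elapsed
      retention = np.array([100, 92, 85, 79, 72, 66, 60, 55, 49, 44, 40])  # percent

      fit = stats.linregress(position, retention)
      print(f"slope = {fit.slope:.1f} percentage points per video, r^2 = {fit.rvalue**2:.3f}")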

  10. Comprehensive analysis of ß-lactam antibiotics including penicillins, cephalosporins, and carbapenems in poultry muscle using liquid chromatography coupled to tandem mass spectrometry.

    PubMed

    Berendsen, Bjorn J A; Gerritsen, Henk W; Wegh, Robin S; Lameris, Steven; van Sebille, Ralph; Stolker, Alida A M; Nielen, Michel W F

    2013-09-01

    A comprehensive method for the quantitative residue analysis of trace levels of 22 ß-lactam antibiotics, including penicillins, cephalosporins, and carbapenems, in poultry muscle by liquid chromatography in combination with tandem mass spectrometric detection is reported. The samples analyzed for ß-lactam residues are hydrolyzed using piperidine in order to improve compound stability and to include the total residue content of the cephalosporin ceftiofur. The reaction procedure was optimized using a full experimental design. Following detailed isotope labeling, tandem mass spectrometry studies, and exact mass measurements using high-resolution mass spectrometry, reaction schemes could be proposed for all ß-lactams studied. The main reaction occurring is the hydrolysis of the ß-lactam ring under formation of the piperidine substituted amide. For some ß-lactams, multiple isobaric hydrolysis reaction products are obtained, in accordance with expectations, but this did not hamper quantitative analysis. The final method was fully validated as a quantitative confirmatory residue analysis method according to Commission Decision 2002/657/EC and showed satisfactory quantitative performance for all compounds with trueness between 80 and 110% and within-laboratory reproducibility below 22% at target level, except for biapenem. For biapenem, the method proved to be suitable for qualitative analysis only.

  11. Multi-Response Extraction Optimization Based on Anti-Oxidative Activity and Quality Evaluation by Main Indicator Ingredients Coupled with Chemometric Analysis on Thymus quinquecostatus Celak.

    PubMed

    Chang, Yan-Li; Shen, Meng; Ren, Xue-Yang; He, Ting; Wang, Le; Fan, Shu-Sheng; Wang, Xiu-Huan; Li, Xiao; Wang, Xiao-Ping; Chen, Xiao-Yi; Sui, Hong; She, Gai-Mei

    2018-04-19

    Thymus quinquecostatus Celak is a species of thyme in China that has long been used as a condiment and herbal medicine. To establish the quality evaluation of T. quinquecostatus, response surface methodology (RSM) based on its 2,2-Diphenyl-1-picrylhydrazyl (DPPH) radical scavenging activity was introduced to optimize the extraction conditions, and the main indicator components were identified by a UPLC-LTQ-Orbitrap MSn method. The optimum ethanol concentration, solid-liquid ratio, and extraction time were 42.32%, 1:17.51, and 1.8 h, respectively. A total of 35 components, comprising 12 phenolic acids and 23 flavonoids, were unambiguously or tentatively identified in both positive and negative ion modes for comprehensive analysis of the optimal antioxidative extract. A simple, reliable, and sensitive HPLC method was used for multi-component quantitative analysis of T. quinquecostatus, with six characteristic and principal phenolic acids and flavonoids as reference compounds. Furthermore, chemometric methods (principal component analysis (PCA) and hierarchical clustering analysis (HCA)) showed that the growing area and harvest time of this herb are closely related to its quality. This study provides full-scale qualitative and quantitative information for the quality evaluation of T. quinquecostatus, serves as a valuable reference for the further study and development of this herb, and lays the foundation for further study of its pharmacological efficacy.

  12. Interventions to improve patient comprehension in informed consent for medical and surgical procedures: a systematic review.

    PubMed

    Schenker, Yael; Fernandez, Alicia; Sudore, Rebecca; Schillinger, Dean

    2011-01-01

    Patient understanding in clinical informed consent is often poor. Little is known about the effectiveness of interventions to improve comprehension or the extent to which such interventions address different elements of understanding in informed consent. Purpose. To systematically review communication interventions to improve patient comprehension in informed consent for medical and surgical procedures. Data Sources. A systematic literature search of English-language articles in MEDLINE (1949-2008) and EMBASE (1974-2008) was performed. In addition, a published bibliography of empirical research on informed consent and the reference lists of all eligible studies were reviewed. Study Selection. Randomized controlled trials and controlled trials with nonrandom allocation were included if they compared comprehension in informed consent for a medical or surgical procedure. Only studies that used a quantitative, objective measure of understanding were included. All studies addressed informed consent for a needed or recommended procedure in actual patients. Data Extraction. Reviewers independently extracted data using a standardized form. All results were compared, and disagreements were resolved by consensus. Data Synthesis. Forty-four studies were eligible. Intervention categories included written information, audiovisual/multimedia, extended discussions, and test/feedback techniques. The majority of studies assessed patient understanding of procedural risks; other elements included benefits, alternatives, and general knowledge about the procedure. Only 6 of 44 studies assessed all 4 elements of understanding. Interventions were generally effective in improving patient comprehension, especially regarding risks and general knowledge. Limitations. Many studies failed to include adequate description of the study population, and outcome measures varied widely. Conclusions. A wide range of communication interventions improve comprehension in clinical informed consent. Decisions to enhance informed consent should consider the importance of different elements of understanding, beyond procedural risks, as well as feasibility and acceptability of the intervention to clinicians and patients. Conceptual clarity regarding the key elements of informed consent knowledge will help to focus improvements and standardize evaluations.

  13. Interventions to Improve Patient Comprehension in Informed Consent for Medical and Surgical Procedures: A Systematic Review

    PubMed Central

    Schenker, Yael; Fernandez, Alicia; Sudore, Rebecca; Schillinger, Dean

    2017-01-01

    Background Patient understanding in clinical informed consent is often poor. Little is known about the effectiveness of interventions to improve comprehension or the extent to which such interventions address different elements of understanding in informed consent. Purpose To systematically review communication interventions to improve patient comprehension in informed consent for medical and surgical procedures. Data Sources A systematic literature search of English-language articles in MEDLINE (1949–2008) and EMBASE (1974–2008) was performed. In addition, a published bibliography of empirical research on informed consent and the reference lists of all eligible studies were reviewed. Study Selection Randomized controlled trials and controlled trials with non-random allocation were included if they compared comprehension in informed consent for a medical or surgical procedure. Only studies that used a quantitative, objective measure of understanding were included. All studies addressed informed consent for a needed or recommended procedure in actual patients. Data Extraction Reviewers independently extracted data using a standardized form. All results were compared, and disagreements were resolved by consensus. Data Synthesis Forty-four studies were eligible. Intervention categories included written information, audiovisual/multimedia, extended discussions, and test/feedback techniques. The majority of studies assessed patient understanding of procedural risks; other elements included benefits, alternatives, and general knowledge about the procedure. Only 6 of 44 studies assessed all 4 elements of understanding. Interventions were generally effective in improving patient comprehension, especially regarding risks and general knowledge. Limitations Many studies failed to include adequate description of the study population, and outcome measures varied widely. Conclusions A wide range of communication interventions improve comprehension in clinical informed consent. Decisions to enhance informed consent should consider the importance of different elements of understanding, beyond procedural risks, as well as feasibility and acceptability of the intervention to clinicians and patients. Conceptual clarity regarding the key elements of informed consent knowledge will help to focus improvements and standardize evaluations. PMID:20357225

  14. The psychophysiological assessment method for pilot's professional reliability.

    PubMed

    Zhang, L M; Yu, L S; Wang, K N; Jing, B S; Fang, C

    1997-05-01

    Previous research has shown that a pilot's professional reliability depends on two related factors: the pilot's functional state and the demands of task workload. The Psychophysiological Reserve Capacity (PRC) is defined as a pilot's ability to accomplish additive tasks without reducing the performance of the primary task (flight task). We hypothesized that the PRC was a mirror of the pilot's functional state. The purpose of this study was to probe the psychophysiological method for evaluating a pilot's professional reliability on a simulator. The PRC Comprehensive Evaluating System (PRCCES), which was used in the experiment, included four subsystems: a) quantitative evaluation system for pilot's performance on simulator; b) secondary task display and quantitative estimating system; c) multiphysiological data monitoring and statistical system; and d) comprehensive evaluation system for pilot PRC. Two studies were performed. In study one, 63 healthy and 13 hospitalized pilots participated. Each pilot performed a double 180-degree circuit flight program with and without a secondary task (a three-digit operation). The operator performance, score of secondary task and cost of physiological effort were measured and compared by PRCCES in the two conditions. Then, each pilot's flight skill in training was subjectively scored by instructor pilot ratings. In study two, 7 healthy pilots volunteered to take part in the experiment on the effects of sleep deprivation on pilot's PRC. Each participant had PRC tested pre- and post-8 h sleep deprivation. The results show that the PRC values of healthy pilots were positively correlated with abilities of flexibility, operating and correcting deviation, attention distribution, and accuracy of instrument flight in the air (r = 0.27-0.40, p < 0.05), and negatively correlated with emotional anxiety in flight (r = -0.40, p < 0.05). The values of PRC in healthy pilots (0.61 +/- 0.17) were significantly higher than those of hospitalized pilots (0.43 +/- 0.15) (p < 0.05). The PRC value after 8 h sleep loss (0.50 +/- 0.17) was significantly lower than that before sleep loss (0.70 +/- 0.15) (p < 0.05). We conclude that a pilot's PRC, which was closely related to flight ability and functional state, could partly represent the pilot's professional reliability. It is worthwhile to further investigate the use of a pilot's PRC as a predictor of mental workload in aircraft design.

  15. Comprehensive chlorophyll composition in the main edible seaweeds.

    PubMed

    Chen, Kewei; Ríos, José Julián; Pérez-Gálvez, Antonio; Roca, María

    2017-08-01

    Natural chlorophylls present in seaweeds have been studied regarding their biological activities and health benefit effects. However, detailed studies characterizing the complete chlorophyll profile, both qualitatively and quantitatively, are scarce. This work deals with the comprehensive spectrometric study of the chlorophyll derivatives present in the five main coloured edible seaweeds. The novel complete MS2 characterization of five chlorophyll derivatives: chlorophyll c 2 , chlorophyll c 1 , purpurin-18 a, pheophytin d and phytyl-purpurin-18 a has allowed fragmentation patterns associated with their different structural features to be obtained. New chlorophyll derivatives have been identified and quantified for the first time in red, green and brown seaweeds, including some oxidative structures. Quantitative data on the chlorophyll content provide significant information for food composition databases on bioactive compounds. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Quantitative Proteomics Identifies Activation of Hallmark Pathways of Cancer in Patient Melanoma.

    PubMed

    Byrum, Stephanie D; Larson, Signe K; Avaritt, Nathan L; Moreland, Linley E; Mackintosh, Samuel G; Cheung, Wang L; Tackett, Alan J

    2013-03-01

    Molecular pathways regulating melanoma initiation and progression are potential targets of therapeutic development for this aggressive cancer. Identification and molecular analysis of these pathways in patients has been primarily restricted to targeted studies on individual proteins. Here, we report the most comprehensive analysis of formalin-fixed paraffin-embedded human melanoma tissues using quantitative proteomics. From 61 patient samples, we identified 171 proteins varying in abundance among benign nevi, primary melanoma, and metastatic melanoma. Seventy-three percent of these proteins were validated by immunohistochemistry staining of malignant melanoma tissues from the Human Protein Atlas database. Our results reveal that molecular pathways involved with tumor cell proliferation, motility, and apoptosis are mis-regulated in melanoma. These data provide the most comprehensive proteome resource on patient melanoma and reveal insight into the molecular mechanisms driving melanoma progression.

  17. Comprehensive comparison of self-administered questionnaires for measuring quantitative autistic traits in adults.

    PubMed

    Nishiyama, Takeshi; Suzuki, Masako; Adachi, Katsunori; Sumi, Satoshi; Okada, Kensuke; Kishino, Hirohisa; Sakai, Saeko; Kamio, Yoko; Kojima, Masayo; Suzuki, Sadao; Kanne, Stephen M

    2014-05-01

    We comprehensively compared all available questionnaires for measuring quantitative autistic traits (QATs) in terms of reliability and construct validity in 3,147 non-clinical and 60 clinical subjects with normal intelligence. We examined four full-length forms, the Subthreshold Autism Trait Questionnaire (SATQ), the Broader Autism Phenotype Questionnaire, the Social Responsiveness Scale-2 Adult Self-report (SRS2-AS), and the Autism-Spectrum Quotient (AQ). The SRS2-AS and the AQ each had several short forms that we also examined, bringing the total to 11 forms. Though all QAT questionnaires showed acceptable levels of test-retest reliability, the AQ and SRS2-AS, including their short forms, exhibited poor internal consistency and discriminant validity, respectively. The SATQ excelled in terms of classical test theory and due to its short length.

  18. Effectiveness of a systematic approach to promote intersectoral collaboration in comprehensive school health promotion-a multiple-case study using quantitative and qualitative data.

    PubMed

    Pucher, Katharina K; Candel, Math J J M; Krumeich, Anja; Boot, Nicole M W M; De Vries, Nanne K

    2015-07-05

    We report on the longitudinal quantitative and qualitative data resulting from a two-year trajectory (2008-2011) based on the DIagnosis of Sustainable Collaboration (DISC) model. This trajectory aimed to support regional coordinators of comprehensive school health promotion (CSHP) in systematically developing change management and project management to establish intersectoral collaboration. Multilevel analyses of quantitative data on the determinants of collaborations according to the DISC model were done, with 90 respondents (response 57 %) at pretest and 69 respondents (52 %) at posttest. Nvivo analyses of the qualitative data collected during the trajectory included minutes of monthly/bimonthly personal/telephone interviews (N = 65) with regional coordinators, and documents they produced about their activities. Quantitative data showed major improvements in change management and project management. There were also improvements in consensus development, commitment formation, formalization of the CSHP, and alignment of policies, although organizational problems within the collaboration increased. Content analyses of qualitative data identified five main management styles, including (1) facilitating active involvement of relevant parties; (2) informing collaborating parties; (3) controlling and (4) supporting their task accomplishment; and (5) coordinating the collaborative processes. We have contributed to the fundamental understanding of the development of intersectoral collaboration by combining qualitative and quantitative data. Our results support a systematic approach to intersectoral collaboration using the DISC model. They also suggest five main management styles to improve intersectoral collaboration in the initial stage. The outcomes are useful for health professionals involved in similar ventures.

  19. Data quality at the Singapore Cancer Registry: An overview of comparability, completeness, validity and timeliness.

    PubMed

    Fung, Janice Wing Mei; Lim, Sandra Bee Lay; Zheng, Huili; Ho, William Ying Tat; Lee, Bee Guat; Chow, Khuan Yew; Lee, Hin Peng

    2016-08-01

    To provide a comprehensive evaluation of the quality of the data at the Singapore Cancer Registry (SCR). Quantitative and semi-quantitative methods were used to assess the comparability, completeness, accuracy and timeliness of data for the period 1968-2013, with a focus on the period 2008-2012. The SCR coding and classification systems follow international standards. The overall completeness was estimated at 98.1% using the flow method and 97.5% using the capture-recapture method for the period 2008-2012. For the same period, 91.9% of the cases were morphologically verified (site-specific range: 40.4-100%), with 1.1% DCO cases. The under-reporting in 2011 and 2012 due to timely publication was estimated at 0.03% and 0.51%, respectively. This review shows that the processes in place at the SCR yield data that are internationally comparable, relatively complete, valid, and timely, allowing for greater confidence in the use of quality data in the areas of cancer prevention, treatment and control. Copyright © 2016 Elsevier Ltd. All rights reserved.
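
    The capture-recapture completeness figure quoted above can, in principle, be reproduced with the classical two-source (Chandra Sekar-Deming) estimator. The sketch below uses made-up counts purely for illustration; it is not the SCR's actual procedure.

```python
def two_source_completeness(n1: int, n2: int, m: int, registered: int) -> float:
    """
    Two-source capture-recapture estimate of registry completeness.
    n1, n2     -- cases found by notification source 1 and source 2
    m          -- cases found by both sources
    registered -- total distinct cases actually registered
    """
    n_hat = n1 * n2 / m                 # estimated true number of cases
    return 100.0 * registered / n_hat   # completeness in percent

# Hypothetical counts for illustration only
print(f"completeness = {two_source_completeness(9500, 8800, 8500, 9800):.1f}%")
```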

  20. Reporting Practices and Use of Quantitative Methods in Canadian Journal Articles in Psychology.

    PubMed

    Counsell, Alyssa; Harlow, Lisa L

    2017-05-01

    With recent focus on the state of research in psychology, it is essential to assess the nature of the statistical methods and analyses used and reported by psychological researchers. To that end, we investigated the prevalence of different statistical procedures and the nature of statistical reporting practices in recent articles from the four major Canadian psychology journals. The majority of authors evaluated their research hypotheses through the use of analysis of variance (ANOVA), t-tests, and multiple regression. Multivariate approaches were less common. Null hypothesis significance testing remains a popular strategy, but the majority of authors reported a standardized or unstandardized effect size measure alongside their significance test results. Confidence intervals on effect sizes were infrequently employed. Many authors provided minimal details about their statistical analyses, and fewer than a third of the articles reported on data complications such as missing data and violations of statistical assumptions. Strengths of, and areas needing improvement in, the reporting of quantitative results are highlighted. The paper concludes with recommendations for how researchers and reviewers can improve comprehension and transparency in statistical reporting.

  1. Chemical profiling and quantitation of bioactive compounds in Platycladi Cacumen by UPLC-Q-TOF-MS/MS and UPLC-DAD.

    PubMed

    Zhuang, Bo; Bi, Zhi-Ming; Wang, Zi-Yuan; Duan, Li; Lai, Chang-Jiang-Sheng; Liu, E-Hu

    2018-05-30

    Platycladi Cacumen (PC) is a traditional Chinese medicine used for the treatment of hemorrhages, cough, asthma and hair loss. To gain a better understanding of the chemical constituents of PC, ultra-high performance liquid chromatography coupled with electrospray ionization quadrupole time-of-flight tandem mass spectrometry (UPLC-Q-TOF-MS/MS) and a diagnostic ion filtering strategy were employed for the first time for chemical profiling of PC. A total of 43 compounds, including organic acids and derivatives, flavonoids and phenylpropanoids, were unambiguously or reasonably identified. Coumarin and lignan compounds were reported for the first time in PC. Chemical variation among 39 batches of PC from different geographical origins and 10 batches of processed PC was subsequently investigated by quantitation of nine major flavonoids. The results, determined by UPLC coupled with diode array detection (UPLC-DAD) and hierarchical cluster analysis (HCA), indicated that the flavonoid contents of PC samples differed greatly. This work provides an efficient approach to comprehensively evaluate the quality of PC. Copyright © 2018 Elsevier B.V. All rights reserved.
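
    Hierarchical cluster analysis of the nine flavonoid contents, as described above, can be sketched with SciPy. The matrix below is random placeholder data; the real PC quantitation values are not reproduced here.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
# Placeholder: 49 batches (39 raw + 10 processed) x 9 flavonoid contents
contents = rng.random((49, 9))

# Standardize each flavonoid, then cluster batches with Ward linkage
z = (contents - contents.mean(axis=0)) / contents.std(axis=0)
tree = linkage(z, method="ward")
labels = fcluster(tree, t=3, criterion="maxclust")  # cut the dendrogram into 3 clusters
print(labels)
```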

  2. Metabolomics relative quantitation with mass spectrometry using chemical derivatization and isotope labeling

    DOE PAGES

    O'Maille, Grace; Go, Eden P.; Hoang, Linh; ...

    2008-01-01

    Comprehensive detection and quantitation of metabolites from a biological source constitute the major challenges of current metabolomics research. Two chemical derivatization methodologies, butylation and amination, were applied to human serum for ionization enhancement of a broad spectrum of metabolite classes, including steroids and amino acids. LC-ESI-MS analysis of the derivatized serum samples showed a significant signal elevation across the total ion chromatogram, with increases in ionization efficiency of over 100-fold. It was also demonstrated that derivatization combined with isotopically labeled reagents facilitated the relative quantitation of derivatized metabolites from individual as well as pooled samples.

  3. [Application of target restoration space quantity and quantitative relation in precise esthetic prosthodontics].

    PubMed

    Haiyang, Yu; Tian, Luo

    2016-06-01

    Target restoration space (TRS) is the precise space required for designing an optimal prosthesis. TRS consists of internal or external tooth space that secures the esthetics and function of the final restoration. Assisted by quantitative analysis and transfer, TRS quantitative analysis is therefore a significant improvement toward minimal tooth preparation. This article presents TRS quantity-related measurement, analysis and transfer, and the internal relevance of the three TRS classifications. The results reveal the close link between precision and minimally invasive treatment. This study can be used to improve the comprehension and execution of precise esthetic prosthodontics.

  4. Qualitative and quantitative mass spectrometry imaging of drugs and metabolites in tissue at therapeutic levels.

    PubMed

    Sun, Na; Walch, Axel

    2013-08-01

    Mass spectrometry imaging (MSI) is a rapidly evolving technology that yields qualitative and quantitative distribution maps of small pharmaceutical-active molecules and their metabolites in tissue sections in situ. The simplicity, high sensitivity and ability to provide comprehensive spatial distribution maps of different classes of biomolecules make MSI a valuable tool to complement histopathology for diagnostics and biomarker discovery. In this review, qualitative and quantitative MSI of drugs and metabolites in tissue at therapeutic levels are discussed and the impact of this technique in drug discovery and clinical research is highlighted.

  5. CAPTIONALS: A computer aided testing environment for the verification and validation of communication protocols

    NASA Technical Reports Server (NTRS)

    Feng, C.; Sun, X.; Shen, Y. N.; Lombardi, Fabrizio

    1992-01-01

    This paper covers the verification and validation of protocols for distributed computer and communication systems using a computer aided testing approach. Validation and verification make up the so-called process of conformance testing. Protocol applications which pass conformance testing are then checked to see whether they can operate together; this is referred to as interoperability testing. A new comprehensive approach to protocol testing is presented which addresses: (1) modeling for inter-layer representation for compatibility between conformance and interoperability testing; (2) computational improvement to current testing methods by using the proposed model, inclusive of the formulation of new qualitative and quantitative measures and time-dependent behavior; (3) analysis and evaluation of protocol behavior for interactive testing without extensive simulation.

  6. Attitudes about AIDS education and condom availability among parents of high school students in New York City: a focus group approach.

    PubMed

    Rafferty, Y; Radosh, A

    1997-02-01

    This paper describes parents' views of the New York City Public High School's AIDS Education and Condom Availability Program. It presents findings from 12 focus groups with 81 parents of students at six representative high schools. Focus groups were conducted as part of an independent, comprehensive 3-year evaluation of the program, consisting of both qualitative and quantitative components. Participants were mostly supportive of the program, citing intense concern about AIDS among adolescents, fear that teenagers do not adequately perceive themselves as being vulnerable, and personal experiences with infected relatives and friends. Implications of these findings for program development are discussed and recommendations for social policy changes are presented.

  7. The use of children's drawings in the evaluation and treatment of child sexual, emotional, and physical abuse.

    PubMed

    Peterson, L W; Hardin, M; Nitsch, M J

    1995-05-01

    Primary care physicians can be instrumental in the initial identification of potential sexual, emotional, and physical abuse of children. We reviewed the use of children's artwork as a method of communicating individual and family functioning. A quantitative method of analyzing children's artwork provides more reliability and validity than some methods used previously. A new scoring system was developed that uses individual human figure drawings and kinetic family drawings. This scoring system was based on research with 842 children (341 positively identified as sexually molested, 252 positively not sexually molested but having emotional or behavioral problems, and 249 "normal" public school children). This system is more comprehensive than previous systems of assessment of potential abuse.

  8. A comprehensive study on the damage tolerance of ultrafine-grained copper

    PubMed Central

    Hohenwarter, A.; Pippan, R.

    2012-01-01

    In this study the fracture behavior of ultrafine-grained copper was assessed by means of elasto-plastic fracture mechanics. For the synthesis of the material high pressure torsion was used. The fracture toughness was quantitatively measured by JIC as a global measure by recording the crack growth resistance curve. Additionally, the initiation toughness in terms of the crack opening displacement (CODi) was evaluated as a local fracture parameter. The results presented here exhibit a low fracture initiation toughness but simultaneously a remarkably high fracture toughness in terms of JIC. The origin of the large difference between these two parameters, peculiarities of the fracture surface and the fracture mechanical performance compared to coarse grained copper will be discussed. PMID:23471016

  9. Combining qualitative and quantitative methods to analyze serious games outcomes: A pilot study for a new cognitive screening tool.

    PubMed

    Vallejo, Vanessa; Mitache, Andrei V; Tarnanas, Ioannis; Muri, Rene; Mosimann, Urs P; Nef, Tobias

    2015-08-01

    Computer games designed for a serious purpose, so-called serious games, can provide additional information for the screening and diagnosis of cognitive impairment. Moreover, they have the advantage of being an ecological tool by involving daily living tasks. However, there is a need for better comprehensive designs regarding the acceptance of this technology, as the target population is older adults who are not used to interacting with novel technologies. Moreover, given the complexity of the diagnosis and the need for precise assessment, an evaluation of the best approach to analyzing the performance data is required. The present study examines the usability of a new screening tool and proposes several new outlines for data analysis.

  10. Discovery of new antimalarial chemotypes through chemical methodology and library development.

    PubMed

    Brown, Lauren E; Chih-Chien Cheng, Ken; Wei, Wan-Guo; Yuan, Pingwei; Dai, Peng; Trilles, Richard; Ni, Feng; Yuan, Jing; MacArthur, Ryan; Guha, Rajarshi; Johnson, Ronald L; Su, Xin-zhuan; Dominguez, Melissa M; Snyder, John K; Beeler, Aaron B; Schaus, Scott E; Inglese, James; Porco, John A

    2011-04-26

    In an effort to expand the stereochemical and structural complexity of chemical libraries used in drug discovery, the Center for Chemical Methodology and Library Development at Boston University has established an infrastructure to translate methodologies accessing diverse chemotypes into arrayed libraries for biological evaluation. In a collaborative effort, the NIH Chemical Genomics Center determined IC(50) values for Plasmodium falciparum viability for each of 2,070 members of the CMLD-BU compound collection using quantitative high-throughput screening across five parasite lines of distinct geographic origin. Three compound classes displaying either differential or comprehensive antimalarial activity across the lines were identified, and the nascent structure-activity relationships (SAR) from this experiment were used to initiate optimization of these chemotypes for further development.
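
    The concentration-response fits underlying an IC50 determination are commonly done with a four-parameter Hill (logistic) model. The sketch below, fitting synthetic data with scipy.optimize.curve_fit, is illustrative only and is not the qHTS pipeline used by the NIH Chemical Genomics Center.

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, bottom, top, ic50, slope):
    """Four-parameter logistic (Hill) concentration-response curve."""
    return bottom + (top - bottom) / (1.0 + (conc / ic50) ** slope)

# Synthetic dose-response data (concentrations in uM, parasite viability in %)
conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
viability = np.array([99, 97, 92, 78, 52, 25, 10, 5], dtype=float)

params, _ = curve_fit(hill, conc, viability, p0=[0.0, 100.0, 1.0, 1.0])
print(f"estimated IC50 = {params[2]:.2f} uM")
```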

  11. Radiological assessment. A textbook on environmental dose analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Till, J.E.; Meyer, H.R.

    1983-09-01

    Radiological assessment is the quantitative process of estimating the consequences to humans resulting from the release of radionuclides to the biosphere. It is a multidisciplinary subject requiring the expertise of a number of individuals in order to predict source terms, describe environmental transport, calculate internal and external dose, and extrapolate dose to health effects. Until now, no comprehensive book has been available that describes, on a uniform and comprehensive level, the techniques and models used in radiological assessment. Radiological Assessment is based on material presented at the 1980 Health Physics Society Summer School held in Seattle, Washington. The material has been expanded and edited to make it comprehensive in scope and useful as a text. Topics covered include (1) source terms for nuclear facilities and medical and industrial sites; (2) transport of radionuclides in the atmosphere; (3) transport of radionuclides in surface waters; (4) transport of radionuclides in groundwater; (5) terrestrial and aquatic food chain pathways; (6) reference man: a system for internal dose calculations; (7) internal dosimetry; (8) external dosimetry; (9) models for special-case radionuclides; (10) calculation of health effects in irradiated populations; (11) evaluation of uncertainties in environmental radiological assessment models; (12) regulatory standards for environmental releases of radionuclides; (13) development of computer codes for radiological assessment; and (14) assessment of accidental releases of radionuclides.

  12. A Comprehensive Evaluation of H2SO4 formation from OH and sCI pathways in high BVOC environments

    NASA Astrophysics Data System (ADS)

    Kim, S.; Seco, R.; Park, J. H.; Guenther, A. B.; Smith, J. N.; Kuang, C.; Bustillos, J. O. V.; Tota, J.; Souza, R. A. F. D.

    2014-12-01

    The recently highlighted importance of stabilized Criegee intermediates (sCI) as an oxidant for atmospheric SO2 has triggered a number of studies to assess the atmospheric implications of H2SO4 formation from the sCI reaction pathway. In addition, it has not been clear why new particle formation events are not observed in the Amazon rain forest; the most widely speculated reason has been a very low H2SO4 level. We will present quantitative assessments of SO2 oxidation by sCI leading to H2SO4 production, using a comprehensive observational dataset from a tropical rainforest study during the GOAmazon field campaign at the T3 site in Manacapuru, Amazonas, Brazil. To the best of our knowledge, this is the first observation of H2SO4 and OH in the Amazon, and it is unique among tropical sites due to the accompanying comprehensive gas and aerosol observations, such as CO, NOx, SO2, VOCs, and the physical and chemical characteristics of aerosols. We will discuss the H2SO4 levels observed during the GOAmazon field campaign to demonstrate 1) the H2SO4 formation potential of the OH and sCI oxidation pathways, by contrasting extremely clean and relatively polluted air masses, and 2) the implications of the observed H2SO4 levels for new particle formation and particle growth events.
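
    The relative contributions of the OH and sCI oxidation pathways to H2SO4 production can be compared with a simple rate calculation of the form P(H2SO4) ~ k_OH[OH][SO2] + k_sCI[sCI][SO2]. The rate constants and concentrations below are placeholder orders of magnitude chosen for illustration, not the GOAmazon measurements.

```python
# Illustrative pathway comparison; every numeric value is an assumed placeholder.
k_oh  = 1.3e-12   # cm^3 molecule^-1 s^-1, effective OH + SO2 rate constant (approximate)
k_sci = 3.0e-11   # cm^3 molecule^-1 s^-1, assumed sCI + SO2 rate constant

oh  = 1.0e6       # molecules cm^-3, assumed daytime OH
sci = 5.0e4       # molecules cm^-3, assumed stabilized Criegee intermediates
so2 = 2.5e9       # molecules cm^-3, roughly 0.1 ppb SO2 (assumed)

p_oh  = k_oh  * oh  * so2   # H2SO4 production via the OH pathway
p_sci = k_sci * sci * so2   # H2SO4 production via the sCI pathway

print(f"P_OH  = {p_oh:.2e} molecules cm^-3 s^-1")
print(f"P_sCI = {p_sci:.2e} molecules cm^-3 s^-1")
print(f"sCI fraction of total = {p_sci / (p_oh + p_sci):.2%}")
```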

  13. TranslatomeDB: a comprehensive database and cloud-based analysis platform for translatome sequencing data

    PubMed Central

    Liu, Wanting; Xiang, Lunping; Zheng, Tingkai; Jin, Jingjie

    2018-01-01

    Translation is a key regulatory step linking transcriptome and proteome. Two major methods of translatome investigation are RNC-seq (sequencing of translating mRNA) and Ribo-seq (ribosome profiling). To facilitate the investigation of translation, we built a comprehensive database, TranslatomeDB (http://www.translatomedb.net/), which provides collection and integrated analysis of published and user-generated translatome sequencing data. The current version includes 2453 Ribo-seq, 10 RNC-seq and their 1394 corresponding mRNA-seq datasets in 13 species. The database emphasizes analysis functions in addition to the dataset collections. Differential gene expression (DGE) analysis can be performed between any two datasets of the same species and type, on both the transcriptome and translatome levels. Translation indices, including the translation ratio, elongation velocity index and translational efficiency, can be calculated to quantitatively evaluate translational initiation efficiency and elongation velocity. All datasets were analyzed using a unified, robust, accurate and experimentally verifiable pipeline based on the FANSe3 mapping algorithm and edgeR for DGE analyses. TranslatomeDB also allows users to upload their own datasets and utilize the identical unified pipeline to analyze their data. We believe that TranslatomeDB is a comprehensive platform and knowledgebase for translatome and proteome research, freeing biologists from the complex searching, analysis and comparison of large sequencing datasets without the need for local computational power. PMID:29106630
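
    The translational efficiency index mentioned above is, in most pipelines, a per-gene ratio of ribosome-associated (or ribosome-nascent-chain) mRNA abundance to total mRNA abundance. A minimal, hedged sketch of that ratio (not TranslatomeDB's actual FANSe3/edgeR pipeline) is:

```python
import numpy as np

def translational_efficiency(translatome_rpkm: np.ndarray,
                             mrna_rpkm: np.ndarray,
                             pseudocount: float = 0.1) -> np.ndarray:
    """Per-gene TE = (Ribo-seq or RNC-seq abundance) / (mRNA-seq abundance)."""
    return (translatome_rpkm + pseudocount) / (mrna_rpkm + pseudocount)

# Hypothetical RPKM vectors for three genes
ribo = np.array([120.0, 5.0, 40.0])
mrna = np.array([60.0, 50.0, 40.0])
print(translational_efficiency(ribo, mrna))  # roughly [2.0, 0.1, 1.0]
```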

  14. Comprehensive health evaluation of workers in the ceramics industry.

    PubMed Central

    Huang, J; Shibata, E; Takeuchi, Y; Okutani, H

    1993-01-01

    A cross sectional study on ceramics workers in the Seto area of Japan was conducted to determine whether there are differences in the morbidity pattern and overall health among working populations employed by different sized companies in the ceramics industry. The study population consisted of 3324 male ceramics workers (age range 40-69) who participated in the Seto occupational health screening programme in 1990. The prevalences of pulmonary diseases (silicosis and tuberculosis) and findings of some nonpulmonary diseases were compared in terms of company size by categories of 1-19, 20-49, 50-99, and 100 or more employees. The values of various screening tests were subjected to principal component analysis to extract factors representing key indices of health state. The average factor scores stratified by company size were used to quantitatively evaluate the comprehensive health level. Both the prevalence and multivariate analyses showed that employees working for smaller companies had overall worse health. The smaller the company, the higher the prevalence and the lower the health scores for silicosis and pulmonary tuberculosis tended to be; significantly increased rates and lower health scores for hypertension, anaemia, and glucosuria among small companies were also found when compared with larger companies. The high morbidity of silicosis and pulmonary tuberculosis in smaller companies contributed most to the decline in the overall health level. PMID:8435343
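
    The factor scores used above to rank company-size strata come from a principal component analysis of the screening variables. A generic, hedged sketch with scikit-learn (not the authors' original workflow; the data below are random placeholders) is:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# Placeholder: 300 workers x 8 screening-test values
screening = rng.normal(size=(300, 8))
company_size = rng.choice(["1-19", "20-49", "50-99", "100+"], size=300)

# Standardize the screening variables, then extract the leading components
scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(screening))

# Average first-component score per company-size stratum as a crude health index
for group in ["1-19", "20-49", "50-99", "100+"]:
    mean_score = scores[company_size == group, 0].mean()
    print(f"{group:>6}: mean factor score = {mean_score:+.3f}")
```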

  15. A Comprehensive, Multi-modal Evaluation of the Assessment System of an Undergraduate Research Methodology Course: Translating Theory into Practice.

    PubMed

    Mohammad Abdulghani, Hamza; G Ponnamperuma, Gominda; Ahmad, Farah; Amin, Zubair

    2014-03-01

    To evaluate the assessment system of the 'Research Methodology Course' using utility criteria (i.e. validity, reliability, acceptability, educational impact, and cost-effectiveness). This study demonstrates a comprehensive evaluation of an assessment system and suggests a framework for similar courses. Qualitative and quantitative methods were used for evaluation of the course assessment components (50 MCQs, 3 Short Answer Questions (SAQs) and a research project) using the utility criteria. Results of multiple evaluation methods for all the assessment components were collected and interpreted together to arrive at holistic judgments, rather than judgments based on individual methods or individual assessments. Face validity, evaluated using a self-administered questionnaire (response rate 88.7%), disclosed that the students perceived an imbalance in the contents covered by the assessment. This was confirmed by the assessment blueprint. Construct validity was affected by the low correlation between MCQ and SAQ scores (r=0.326). There was a higher correlation between the project and MCQ (r=0.466)/SAQ (r=0.463) scores. Construct validity was also affected by the presence of recall-type MCQs (70%; 35/50), item construction flaws and non-functioning distractors. High discrimination indices (>0.35) were found in MCQs with moderate difficulty indices (0.3-0.7). Reliability of the MCQs was 0.75, which could be improved up to 0.8 by increasing the number of MCQs to at least 70. A positive educational impact was found in the form of the research project assessment driving students to present/publish their work in conferences/peer-reviewed journals. Cost per student to complete the course was US$164.50. The multi-modal evaluation of an assessment system is feasible and provides thorough and diagnostic information. Utility of the assessment system could be further improved by modifying the psychometrically inappropriate assessment items.
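
    The projection that lengthening the MCQ paper from 50 to about 70 items would raise reliability from 0.75 to roughly 0.8 follows the Spearman-Brown prophecy formula, rho_new = n*rho / (1 + (n-1)*rho), where n is the lengthening factor. A quick check, offered as an illustration of the calculation rather than the authors' code:

```python
def spearman_brown(reliability: float, length_factor: float) -> float:
    """Projected reliability after changing test length by `length_factor`."""
    return (length_factor * reliability) / (1 + (length_factor - 1) * reliability)

# Lengthening a 50-item test (reliability 0.75) to 70 items
print(round(spearman_brown(0.75, 70 / 50), 2))  # ~0.81
```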

  16. A Comprehensive, Multi-modal Evaluation of the Assessment System of an Undergraduate Research Methodology Course: Translating Theory into Practice

    PubMed Central

    Mohammad Abdulghani, Hamza; G. Ponnamperuma, Gominda; Ahmad, Farah; Amin, Zubair

    2014-01-01

    Objective: To evaluate the assessment system of the 'Research Methodology Course' using utility criteria (i.e. validity, reliability, acceptability, educational impact, and cost-effectiveness). This study demonstrates a comprehensive evaluation of an assessment system and suggests a framework for similar courses. Methods: Qualitative and quantitative methods were used for evaluation of the course assessment components (50 MCQs, 3 Short Answer Questions (SAQs) and a research project) using the utility criteria. Results of multiple evaluation methods for all the assessment components were collected and interpreted together to arrive at holistic judgments, rather than judgments based on individual methods or individual assessments. Results: Face validity, evaluated using a self-administered questionnaire (response rate 88.7%), disclosed that the students perceived an imbalance in the contents covered by the assessment. This was confirmed by the assessment blueprint. Construct validity was affected by the low correlation between MCQ and SAQ scores (r=0.326). There was a higher correlation between the project and MCQ (r=0.466)/SAQ (r=0.463) scores. Construct validity was also affected by the presence of recall-type MCQs (70%; 35/50), item construction flaws and non-functioning distractors. High discrimination indices (>0.35) were found in MCQs with moderate difficulty indices (0.3-0.7). Reliability of the MCQs was 0.75, which could be improved up to 0.8 by increasing the number of MCQs to at least 70. A positive educational impact was found in the form of the research project assessment driving students to present/publish their work in conferences/peer-reviewed journals. Cost per student to complete the course was US$164.50. Conclusions: The multi-modal evaluation of an assessment system is feasible and provides thorough and diagnostic information. Utility of the assessment system could be further improved by modifying the psychometrically inappropriate assessment items. PMID:24772117

  17. Multi-indicator Evaluation System for Broadsword, Rod, Sword and Spear Athletes Based on Analytic Hierarchy Process

    NASA Astrophysics Data System (ADS)

    Luo, Lin

    2017-08-01

    In the practical selection of Wushu athletes, the objective evaluation of athlete level lacks sufficient technical indicators and often relies on the coach's subjective judgment. Without a fully quantified indicator system it is difficult to reflect the overall quality of the athletes accurately and objectively, which hampers improvement in the level of Wushu competition. The analytic hierarchy process (AHP) is a systems analysis method combining quantitative and qualitative analysis. This paper implements a structured, hierarchical and quantified decision-making process for evaluating broadsword, rod, sword and spear athletes using the AHP. Combining the characteristics of the athletes, the analysis is carried out from three aspects, i.e., the athlete's body shape, physical function and sports quality, and 18 specific evaluation indicators are established. Expert advice and practical experience are then combined to determine the pairwise comparison matrix, from which the indicator weights and a comprehensive evaluation coefficient are obtained to establish the evaluation model for the athletes, thus providing a scientific theoretical basis for the selection of Wushu athletes. The evaluation model proposed in this paper implements an evaluation system for broadsword, rod, sword and spear athletes and has effectively improved the scientific rigor of Wushu athlete selection in practical application.
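
    The AHP weighting step described above reduces to finding the principal eigenvector of the pairwise comparison matrix and checking its consistency ratio. A generic sketch follows; the 3x3 matrix is hypothetical and the actual comparison matrix for the athlete indicators is not reproduced here.

```python
import numpy as np

def ahp_weights(pairwise: np.ndarray):
    """Principal-eigenvector weights and consistency ratio for an AHP matrix."""
    eigvals, eigvecs = np.linalg.eig(pairwise)
    k = np.argmax(eigvals.real)
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()

    n = pairwise.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)           # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)   # Saaty's random index
    return weights, ci / ri                        # weights, consistency ratio

# Hypothetical 3x3 comparison of body shape, physical function, sports quality
A = np.array([[1.0, 1/2, 1/3],
              [2.0, 1.0, 1/2],
              [3.0, 2.0, 1.0]])
w, cr = ahp_weights(A)
print("weights:", np.round(w, 3), "consistency ratio:", round(cr, 3))
```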

  18. A Comprehensive Guide for Performing Sample Preparation and Top-Down Protein Analysis

    PubMed Central

    Padula, Matthew P.; Berry, Iain J.; O′Rourke, Matthew B.; Raymond, Benjamin B.A.; Santos, Jerran; Djordjevic, Steven P.

    2017-01-01

    Methodologies for the global analysis of proteins in a sample, or proteome analysis, have been available since 1975 when Patrick O'Farrell published the first paper describing two-dimensional gel electrophoresis (2D-PAGE). This technique allowed the resolution of single protein isoforms, or proteoforms, into single 'spots' in a polyacrylamide gel, allowing the quantitation of changes in a proteoform's abundance to ascertain changes in an organism's phenotype when conditions change. In pursuit of the comprehensive profiling of the proteome, significant advances in technology have made the identification and quantitation of intact proteoforms from complex mixtures of proteins more routine, allowing analysis of the proteome from the 'Top-Down'. However, the number of proteoforms detected by Top-Down methodologies such as 2D-PAGE or mass spectrometry has not significantly increased since O'Farrell's paper when compared to Bottom-Up, peptide-centric techniques. This article explores and explains the numerous methodologies and technologies available to analyse the proteome from the Top-Down with a strong emphasis on the necessity to analyse intact proteoforms as a better indicator of changes in biology and phenotype. We arrive at the conclusion that the complete and comprehensive profiling of an organism's proteome is still, at present, beyond our reach but the continuing evolution of protein fractionation techniques and mass spectrometry brings comprehensive Top-Down proteome profiling closer. PMID:28387712

  19. A Comprehensive Guide for Performing Sample Preparation and Top-Down Protein Analysis.

    PubMed

    Padula, Matthew P; Berry, Iain J; O Rourke, Matthew B; Raymond, Benjamin B A; Santos, Jerran; Djordjevic, Steven P

    2017-04-07

    Methodologies for the global analysis of proteins in a sample, or proteome analysis, have been available since 1975 when Patrick O'Farrell published the first paper describing two-dimensional gel electrophoresis (2D-PAGE). This technique allowed the resolution of single protein isoforms, or proteoforms, into single 'spots' in a polyacrylamide gel, allowing the quantitation of changes in a proteoform's abundance to ascertain changes in an organism's phenotype when conditions change. In pursuit of the comprehensive profiling of the proteome, significant advances in technology have made the identification and quantitation of intact proteoforms from complex mixtures of proteins more routine, allowing analysis of the proteome from the 'Top-Down'. However, the number of proteoforms detected by Top-Down methodologies such as 2D-PAGE or mass spectrometry has not significantly increased since O'Farrell's paper when compared to Bottom-Up, peptide-centric techniques. This article explores and explains the numerous methodologies and technologies available to analyse the proteome from the Top-Down with a strong emphasis on the necessity to analyse intact proteoforms as a better indicator of changes in biology and phenotype. We arrive at the conclusion that the complete and comprehensive profiling of an organism's proteome is still, at present, beyond our reach but the continuing evolution of protein fractionation techniques and mass spectrometry brings comprehensive Top-Down proteome profiling closer.

  20. A Comparative Study of Hawaii Middle School Science Student Academic Achievement

    NASA Astrophysics Data System (ADS)

    Askew Cain, Peggy

    The problem addressed was that middle-grade students with specific learning disabilities (SWDs) in reading comprehension perform less well than their peers on standardized assessments. The purpose of this quantitative comparative study was to examine the effect of electronic concept maps on the reading comprehension of Grade 8 SWDs in a Hawaii middle school science class on the island of Oahu. The target population consisted of Grade 8 science students for school year 2015-2016. The sampling method was purposeful sampling, with a final sample size of 338 Grade 8 science students. De-identified archival records of Grade 8 Hawaii standardized science test scores were analyzed using a one-way analysis of variance (ANOVA) in SPSS. The finding for hypothesis 1 indicated a significant difference in student achievement between SWDs and students without disabilities (SWODs), as measured by Hawaii State Assessment (HSA) science scores (p < 0.05), and, for hypothesis 2, a significant difference by instructional modality between SWDs who used concept maps and those who did not, as measured by the Hawaii State Assessment in science (p < 0.05). The implications of the findings were that (a) SWDs performed less well in science achievement than their peers, and (b) SWODs appeared to retain more science knowledge and answered more questions correctly than SWDs as a result of reading comprehension. Recommendations for practice, directed at educational leadership, were to (a) have teachers use concept maps with SWDs as a specific reading strategy to support reading comprehension in science classes, (b) include a strong focus on vocabulary building and concept building during concept map construction, because constructing concept maps sometimes requires frontloading of vocabulary, and (c) model for teachers how concept maps are created and explain their educational purpose as a tool for learning. Recommendations for future research were to conduct (a) a quantitative comparative study of the subtest mean scores of SWDs and SWODs in physical science, earth science, and space science, and (b) a quantitative correlational study to examine relationships and predictive values between concept map integration and the academic achievement of SWDs on standardized science assessments.
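
    The one-way ANOVA on archival HSA science scores described above can be reproduced in outline with SciPy; the score arrays below are random placeholders, not the actual Hawaii data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Placeholder standardized science scores for the two groups
swd_scores  = rng.normal(loc=250, scale=30, size=60)   # students with disabilities
swod_scores = rng.normal(loc=280, scale=30, size=278)  # students without disabilities

f_stat, p_value = stats.f_oneway(swd_scores, swod_scores)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")  # p < 0.05 indicates a group difference
```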

  1. A comparative analysis of human plasma and serum proteins by combining native PAGE, whole-gel slicing and quantitative LC-MS/MS: Utilizing native MS-electropherograms in proteomic analysis for discovering structure and interaction-correlated differences.

    PubMed

    Wen, Meiling; Jin, Ya; Manabe, Takashi; Chen, Shumin; Tan, Wen

    2017-12-01

    MS identification has long been used for PAGE-separated protein bands, but global and systematic quantitation utilizing MS after PAGE has remained rare and not been reported for native PAGE. Here we reported on a new method combining native PAGE, whole-gel slicing and quantitative LC-MS/MS, aiming at comparative analysis on not only abundance, but also structures and interactions of proteins. A pair of human plasma and serum samples were used as test samples and separated on a native PAGE gel. Six lanes of each sample were cut, each lane was further sliced into thirty-five 1.1 mm × 1.1 mm squares and all the squares were subjected to standardized procedures of in-gel digestion and quantitative LC-MS/MS. The results comprised 958 data rows that each contained abundance values of a protein detected in one square in eleven gel lanes (one plasma lane excluded). The data were evaluated to have satisfactory reproducibility of assignment and quantitation. Totally 315 proteins were assigned, with each protein assigned in 1-28 squares. The abundance distributions in the plasma and serum gel lanes were reconstructed for each protein, named as "native MS-electropherograms". Comparison of the electropherograms revealed significant plasma-versus-serum differences on 33 proteins in 87 squares (fold difference > 2 or < 0.5, p < 0.05). Many of the differences matched with accumulated knowledge on protein interactions and proteolysis involved in blood coagulation, complement and wound healing processes. We expect this method would be useful to provide more comprehensive information in comparative proteomic analysis, on both quantities and structures/interactions. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
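
    The plasma-versus-serum screen described above (fold difference > 2 or < 0.5 with p < 0.05 per protein-square) can be expressed as a simple per-row filter. The sketch below assumes log-normal placeholder abundances with the lane counts mentioned in the abstract and is not the authors' exact workflow:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Placeholder abundances: rows = protein-square combinations, columns = gel lanes
plasma = rng.lognormal(mean=10, sigma=0.3, size=(958, 5))  # one plasma lane excluded
serum  = rng.lognormal(mean=10, sigma=0.3, size=(958, 6))

fold = plasma.mean(axis=1) / serum.mean(axis=1)
_, p = stats.ttest_ind(plasma, serum, axis=1)

significant = ((fold > 2) | (fold < 0.5)) & (p < 0.05)
print(f"{significant.sum()} protein-squares pass the fold-change and p-value filter")
```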

  2. Comprehensive two-dimensional gas chromatography and food sensory properties: potential and challenges.

    PubMed

    Cordero, Chiara; Kiefl, Johannes; Schieberle, Peter; Reichenbach, Stephen E; Bicchi, Carlo

    2015-01-01

    Modern omics disciplines dealing with food flavor focus the analytical efforts on the elucidation of sensory-active compounds, including all possible stimuli of multimodal perception (aroma, taste, texture, etc.) by means of a comprehensive, integrated treatment of sample constituents, such as physicochemical properties, concentration in the matrix, and sensory properties (odor/taste quality, perception threshold). Such analyses require detailed profiling of known bioactive components as well as advanced fingerprinting techniques to catalog sample constituents comprehensively, quantitatively, and comparably across samples. Multidimensional analytical platforms support comprehensive investigations required for flavor analysis by combining information on analytes' identities, physicochemical behaviors (volatility, polarity, partition coefficient, and solubility), concentration, and odor quality. Unlike other omics, flavor metabolomics and sensomics include the final output of the biological phenomenon (i.e., sensory perceptions) as an additional analytical dimension, which is specifically and exclusively triggered by the chemicals analyzed. However, advanced omics platforms, which are multidimensional by definition, pose challenging issues not only in terms of coupling with detection systems and sample preparation, but also in terms of data elaboration and processing. The large number of variables collected during each analytical run provides a high level of information, but requires appropriate strategies to exploit fully this potential. This review focuses on advances in comprehensive two-dimensional gas chromatography and analytical platforms combining two-dimensional gas chromatography with olfactometry, chemometrics, and quantitative assays for food sensory analysis to assess the quality of a given product. We review instrumental advances and couplings, automation in sample preparation, data elaboration, and a selection of applications.

  3. Research on Comprehensive Evaluation Method for Heating Project Based on Analytic Hierarchy Processing

    NASA Astrophysics Data System (ADS)

    Han, Shenchao; Yang, Yanchun; Liu, Yude; Zhang, Peng; Li, Siwei

    2018-01-01

    It is effective to reduce haze in winter by changing the distributed heat supply system. Thus, the studies on comprehensive index system and scientific evaluation method of distributed heat supply project are essential. Firstly, research the influence factors of heating modes, and an index system with multiple dimension including economic, environmental, risk and flexibility was built and all indexes were quantified. Secondly, a comprehensive evaluation method based on AHP was put forward to analyze the proposed multiple and comprehensive index system. Lastly, the case study suggested that supplying heat with electricity has great advantage and promotional value. The comprehensive index system of distributed heating supply project and evaluation method in this paper can evaluate distributed heat supply project effectively and provide scientific support for choosing the distributed heating project.

  4. Blind Source Parameters for Performance Evaluation of Despeckling Filters.

    PubMed

    Biradar, Nagashettappa; Dewal, M L; Rohit, ManojKumar; Gowre, Sanjaykumar; Gundge, Yogesh

    2016-01-01

    The speckle noise is inherent to transthoracic echocardiographic images. A standard noise-free reference echocardiographic image does not exist. The evaluation of filters based on the traditional parameters such as peak signal-to-noise ratio, mean square error, and structural similarity index may not reflect the true filter performance on echocardiographic images. Therefore, the performance of despeckling can be evaluated using blind assessment metrics like the speckle suppression index, speckle suppression and mean preservation index (SMPI), and beta metric. The need for noise-free reference image is overcome using these three parameters. This paper presents a comprehensive analysis and evaluation of eleven types of despeckling filters for echocardiographic images in terms of blind and traditional performance parameters along with clinical validation. The noise is effectively suppressed using the logarithmic neighborhood shrinkage (NeighShrink) embedded with Stein's unbiased risk estimation (SURE). The SMPI is three times more effective compared to the wavelet based generalized likelihood estimation approach. The quantitative evaluation and clinical validation reveal that the filters such as the nonlocal mean, posterior sampling based Bayesian estimation, hybrid median, and probabilistic patch based filters are acceptable whereas median, anisotropic diffusion, fuzzy, and Ripplet nonlinear approximation filters have limited applications for echocardiographic images.
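
    The blind metrics mentioned above are computed directly from the noisy and filtered images, with no reference image. A commonly used form of the speckle suppression index (SSI) is the ratio of the filtered image's coefficient of variation to that of the noisy input; the sketch below uses that form with synthetic data and a crude mean filter, and is not the authors' exact metric definitions.

```python
import numpy as np

def speckle_suppression_index(noisy: np.ndarray, filtered: np.ndarray) -> float:
    """SSI: coefficient of variation of the filtered image relative to the noisy one.
    Values below 1 indicate speckle suppression."""
    cv_filtered = filtered.std() / filtered.mean()
    cv_noisy = noisy.std() / noisy.mean()
    return float(cv_filtered / cv_noisy)

# Synthetic demo: multiplicative speckle, 3x3 mean filter standing in for a despeckler
rng = np.random.default_rng(5)
clean = np.full((64, 64), 100.0)
noisy = clean * rng.gamma(shape=4.0, scale=0.25, size=clean.shape)

filtered = noisy.copy()
filtered[1:-1, 1:-1] = sum(noisy[1 + di:63 + di, 1 + dj:63 + dj]
                           for di in (-1, 0, 1) for dj in (-1, 0, 1)) / 9.0

print(f"SSI = {speckle_suppression_index(noisy, filtered):.3f}")
```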

  5. Emotions predictably modify response times in the initiation of human motor actions: A meta-analytic review.

    PubMed

    Beatty, Garrett F; Cranley, Nicole M; Carnaby, Giselle; Janelle, Christopher M

    2016-03-01

    Emotions motivate individuals to attain appetitive goals and avoid aversive consequences. Empirical investigations have detailed how broad approach and avoidance orientations are reflected in fundamental movement attributes such as the speed, accuracy, and variability of motor actions. Several theoretical perspectives propose explanations for how emotional states influence the speed with which goal-directed movements are initiated. These perspectives include biological predisposition, muscle activation, distance regulation, cognitive evaluation, and evaluative response coding accounts. A comprehensive review of the literature and a meta-analysis were undertaken to quantify empirical support for these theoretical perspectives. The systematic review yielded 34 studies that contained 53 independent experiments producing 128 effect sizes used to evaluate the predictions of existing theories. The central tenets of the biological predisposition (Hedges' g = -0.356), distance regulation (g = -0.293; g = 0.243), and cognitive evaluation (g = -0.249; g = -0.405; g = -0.174) accounts were supported. Partial support was also identified for the evaluative response coding (g = -0.255) framework. Our findings provide quantitative evidence that substantiates existing theoretical perspectives and provides potential direction for conceptual integration of these independent perspectives. Recommendations for future empirical work in this area are discussed. (c) 2016 APA, all rights reserved.
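
    The pooled effect sizes above are Hedges' g values, i.e., standardized mean differences corrected for small-sample bias. A minimal computation for a single two-group comparison, using made-up reaction times purely for illustration, is:

```python
import numpy as np

def hedges_g(group1: np.ndarray, group2: np.ndarray) -> float:
    """Standardized mean difference with the small-sample (J) correction."""
    n1, n2 = len(group1), len(group2)
    pooled_sd = np.sqrt(((n1 - 1) * group1.var(ddof=1) +
                         (n2 - 1) * group2.var(ddof=1)) / (n1 + n2 - 2))
    d = (group1.mean() - group2.mean()) / pooled_sd
    j = 1 - 3 / (4 * (n1 + n2) - 9)   # small-sample correction factor
    return d * j

# Hypothetical movement initiation times (ms): pleasant vs. unpleasant conditions
pleasant   = np.array([312.0, 305, 298, 320, 310, 301])
unpleasant = np.array([330.0, 341, 322, 335, 328, 339])
print(round(hedges_g(pleasant, unpleasant), 3))  # negative: faster initiation when pleasant
```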

  6. Novel online monitoring and alert system for anaerobic digestion reactors.

    PubMed

    Dong, Fang; Zhao, Quan-Bao; Li, Wen-Wei; Sheng, Guo-Ping; Zhao, Jin-Bao; Tang, Yong; Yu, Han-Qing; Kubota, Kengo; Li, Yu-You; Harada, Hideki

    2011-10-15

    Effective monitoring and diagnosis of anaerobic digestion processes is a great challenge for anaerobic digestion reactors, which limits their stable operation. In this work, an online monitoring and alert system for upflow anaerobic sludge blanket (UASB) reactors is developed on the basis of a set of novel evaluating indexes. The two indexes, i.e., stability index S and auxiliary index a, which incorporate both gas- and liquid-phase parameters for UASB, enable a quantitative and comprehensive evaluation of reactor status. A series of shock tests is conducted to evaluate the response of the monitoring and alert system to organic overloading, hydraulic, temperature, and toxicant shocks. The results show that this system enables an accurate and rapid monitoring and diagnosis of the reactor status, and offers reliable early warnings on the potential risks. As the core of this system, the evaluating indexes are demonstrated to be of high accuracy and sensitivity in process evaluation and good adaptability to the artificial intelligence and automated control apparatus. This online monitoring and alert system presents a valuable effort to promote the automated monitoring and control of anaerobic digestion process, and holds a high promise for application.

  7. Blind Source Parameters for Performance Evaluation of Despeckling Filters

    PubMed Central

    Biradar, Nagashettappa; Dewal, M. L.; Rohit, ManojKumar; Gowre, Sanjaykumar; Gundge, Yogesh

    2016-01-01

    The speckle noise is inherent to transthoracic echocardiographic images. A standard noise-free reference echocardiographic image does not exist. The evaluation of filters based on the traditional parameters such as peak signal-to-noise ratio, mean square error, and structural similarity index may not reflect the true filter performance on echocardiographic images. Therefore, the performance of despeckling can be evaluated using blind assessment metrics like the speckle suppression index, speckle suppression and mean preservation index (SMPI), and beta metric. The need for noise-free reference image is overcome using these three parameters. This paper presents a comprehensive analysis and evaluation of eleven types of despeckling filters for echocardiographic images in terms of blind and traditional performance parameters along with clinical validation. The noise is effectively suppressed using the logarithmic neighborhood shrinkage (NeighShrink) embedded with Stein's unbiased risk estimation (SURE). The SMPI is three times more effective compared to the wavelet based generalized likelihood estimation approach. The quantitative evaluation and clinical validation reveal that the filters such as the nonlocal mean, posterior sampling based Bayesian estimation, hybrid median, and probabilistic patch based filters are acceptable whereas median, anisotropic diffusion, fuzzy, and Ripplet nonlinear approximation filters have limited applications for echocardiographic images. PMID:27298618

  8. Quantitative analysis of ecological effects for land use planning based on ecological footprint method: a case research in Nanyang City

    NASA Astrophysics Data System (ADS)

    Zhang, Jing; Liu, Yaolin; Chen, Xinming

    2008-10-01

    Research on the coordinated development of land use and ecological construction is a new problem arising with the development of the national economy; its intention is to improve economic development and protect the eco-environment in order to realize regional sustainable development. Evaluating human effects on the ecosystem by a comprehensive, scientific and quantitative method is a critical issue in the process of general land use planning. At present, ecological footprint methodology, as an excellent educational tool applicable to global issues, is essential for quantifying humanity's consumption of natural capital, for overall assessments of human impact on earth, as well as for general land use planning. However, quantitative studies on the development trends of ecological footprint (EF) time series and biological capacity (BC) time series in a given region are still rare. Taking Nanyang City as a case study, this paper presents two quantitative estimation indices over a time scale, the change rate and the scissors difference, to quantitatively analyze the trends of EF and BC over the planning period of the general land use plan from 1997 to 2004 and to evaluate the ecological effects of the general land use plan from 1997 to 2010. The results showed that: (1) in Nanyang City, the per capita EF and BC showed opposite trends, and the ecological deficit increased from 1997 to 2010; (2) the difference between the two development trends of per capita EF and BC increased rapidly, and the conflict between EF and BC was aggravated from 1997 to 2010; (3) the general land use plan (1997-2010) of Nanyang City produced some positive effects on the local ecosystem, but the expected biological capacity in 2010 can hardly be realized following this trend. Therefore, this paper introduces a "trinity" land use model guided by an environment-friendly land use pattern and based on the actual situation of Nanyang City, with the systematic integration of land use in the city, villages and suburbs as the principal part, and land development and reorganization together with ecological environment construction as the key points.

  9. Visualization of Data Regarding Infections Using Eye Tracking Techniques

    PubMed Central

    Yoon, Sunmoo; Cohen, Bevin; Cato, Kenrick D.; Liu, Jianfang; Larson, Elaine L.

    2016-01-01

    Objective To evaluate ease of use and usefulness for nurses of visualizations of infectious disease transmission in a hospital. Design An observational study was used to evaluate perceptions of several visualizations of data extracted from electronic health records designed using a participatory approach. Twelve nurses in the master's program in an urban research-intensive nursing school participated in May 2015. Methods A convergent parallel mixed method was used to evaluate nurses' perceptions of the ease of use and usefulness of five visualizations conveying trends in hospital infection transmission, applying think-aloud, interview, and eye-tracking techniques. Findings Subjective data from the interview and think-aloud techniques indicated that participants preferred the traditional line graphs in simple data representation due to their familiarity, clarity, and ease of reading. An objective quantitative eye movement analysis (444,421 gaze events) identified a high degree of participant attention to the infographics in all three scenarios. All participants responded with the correct answer within 1 min on the comprehension tests. Conclusions A user-centric approach was effective in developing and evaluating visualizations for hospital infection transmission. For the visualizations designed by the users, the participants were easily able to comprehend the infection visualizations on both line graphs and infographics for simple visualization. The findings from the objective comprehension test, eye movements, and subjective attitudes support the feasibility of integrating user-centric visualization designs into electronic health records, which may inspire clinicians to be mindful of hospital infection transmission. Future studies are needed to investigate visualizations and motivation, and the effectiveness of visualization on infection rates. Clinical Relevance This study designed visualization images using clinical data from electronic health records applying a user-centric approach. The design insights can be applied to visualizing patient data in electronic health records. PMID:27061619

  10. Visualization of Data Regarding Infections Using Eye Tracking Techniques.

    PubMed

    Yoon, Sunmoo; Cohen, Bevin; Cato, Kenrick D; Liu, Jianfang; Larson, Elaine L

    2016-05-01

    To evaluate ease of use and usefulness for nurses of visualizations of infectious disease transmission in a hospital. An observational study was used to evaluate perceptions of several visualizations of data extracted from electronic health records designed using a participatory approach. Twelve nurses in the master's program in an urban research-intensive nursing school participated in May 2015. A convergent parallel mixed method was used to evaluate nurses' perceptions of the ease of use and usefulness of five visualizations conveying trends in hospital infection transmission, applying think-aloud, interview, and eye-tracking techniques. Subjective data from the interview and think-aloud techniques indicated that participants preferred the traditional line graphs in simple data representation due to their familiarity, clarity, and ease of reading. An objective quantitative eye movement analysis (444,421 gaze events) identified a high degree of participant attention to the infographics in all three scenarios. All participants responded with the correct answer within 1 min on the comprehension tests. A user-centric approach was effective in developing and evaluating visualizations for hospital infection transmission. For the visualizations designed by the users, the participants were easily able to comprehend the infection visualizations on both line graphs and infographics for simple visualization. The findings from the objective comprehension test, eye movements, and subjective attitudes support the feasibility of integrating user-centric visualization designs into electronic health records, which may inspire clinicians to be mindful of hospital infection transmission. Future studies are needed to investigate visualizations and motivation, and the effectiveness of visualization on infection rates. This study designed visualization images using clinical data from electronic health records applying a user-centric approach. The design insights can be applied to visualizing patient data in electronic health records. © 2016 Sigma Theta Tau International.

  11. Disaster metrics: quantification of acute medical disasters in trauma-related multiple casualty events through modeling of the Acute Medical Severity Index.

    PubMed

    Bayram, Jamil D; Zuabi, Shawki

    2012-04-01

    The interaction between the acute medical consequences of a Multiple Casualty Event (MCE) and the total medical capacity of the community affected determines if the event amounts to an acute medical disaster. There is a need for a comprehensive quantitative model in MCE that would account for both prehospital and hospital-based acute medical systems, leading to the quantification of acute medical disasters. Such a proposed model needs to be flexible enough in its application to accommodate a priori estimation as part of the decision-making process and a posteriori evaluation for total quality management purposes. The concept proposed by de Boer et al in 1989, along with the disaster metrics quantitative models proposed by Bayram et al on hospital surge capacity and prehospital medical response, were used as theoretical frameworks for a new comprehensive model, taking into account both prehospital and hospital systems, in order to quantify acute medical disasters. A quantitative model called the Acute Medical Severity Index (AMSI) was developed. AMSI is the proportion of the Acute Medical Burden (AMB) resulting from the event, compared to the Total Medical Capacity (TMC) of the community affected; AMSI = AMB/TMC. In this model, AMB is defined as the sum of critical (T1) and moderate (T2) casualties caused by the event, while TMC is a function of the Total Hospital Capacity (THC) and the medical rescue factor (R) accounting for the hospital-based and prehospital medical systems, respectively. Qualitatively, the authors define acute medical disaster as "a state after any type of Multiple Casualty Event where the Acute Medical Burden (AMB) exceeds the Total Medical Capacity (TMC) of the community affected." Quantitatively, an acute medical disaster has an AMSI value of more than one (AMB / TMC > 1). An acute medical incident has an AMSI value of less than one, without the need for medical surge. An acute medical emergency has an AMSI value of less than one with utilization of surge capacity (prehospital or hospital-based). An acute medical crisis has an AMSI value between 0.9 and 1, approaching the threshold for an actual medical disaster. A novel quantitative taxonomy in MCE has been proposed by modeling the Acute Medical Severity Index (AMSI). This model accounts for both hospital and prehospital systems, and quantifies acute medical disasters. Prospective applications of various components of this model are encouraged to further verify its applicability and validity.
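
    The AMSI taxonomy above maps directly onto a small classification function. The sketch below follows the definitions quoted (AMB = T1 + T2 casualties; TMC a function of THC and R); combining THC and R as a simple product is an assumption made here for illustration, not the authors' published functional form.

```python
def amsi(t1: int, t2: int, total_hospital_capacity: float, rescue_factor: float) -> float:
    """Acute Medical Severity Index = Acute Medical Burden / Total Medical Capacity."""
    amb = t1 + t2                                    # critical + moderate casualties
    tmc = total_hospital_capacity * rescue_factor    # assumed combination of THC and R
    return amb / tmc

def classify(index: float, surge_used: bool = False) -> str:
    if index > 1.0:
        return "acute medical disaster"
    if index >= 0.9:
        return "acute medical crisis"
    return "acute medical emergency" if surge_used else "acute medical incident"

# Hypothetical event: 40 critical and 120 moderate casualties
score = amsi(t1=40, t2=120, total_hospital_capacity=200, rescue_factor=0.9)
print(round(score, 2), classify(score))
```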

  12. 42 CFR 431.424 - Evaluation requirements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... evaluations. Demonstration evaluations will include the following: (1) Quantitative research methods. (i... of appropriate evaluation strategies (including experimental and other quantitative and qualitative... demonstration. (ii) CMS will consider alternative evaluation designs when quantitative designs are technically...

  13. 42 CFR 431.424 - Evaluation requirements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... evaluations. Demonstration evaluations will include the following: (1) Quantitative research methods. (i... of appropriate evaluation strategies (including experimental and other quantitative and qualitative... demonstration. (ii) CMS will consider alternative evaluation designs when quantitative designs are technically...

  14. 42 CFR 431.424 - Evaluation requirements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... evaluations. Demonstration evaluations will include the following: (1) Quantitative research methods. (i... of appropriate evaluation strategies (including experimental and other quantitative and qualitative... demonstration. (ii) CMS will consider alternative evaluation designs when quantitative designs are technically...

  15. A comprehensive evaluation of various sensitivity analysis methods: A case study with a hydrological model

    DOE PAGES

    Gan, Yanjun; Duan, Qingyun; Gong, Wei; ...

    2014-01-01

    Sensitivity analysis (SA) is a commonly used approach for identifying important parameters that dominate model behaviors. We use a newly developed software package, a Problem Solving environment for Uncertainty Analysis and Design Exploration (PSUADE), to evaluate the effectiveness and efficiency of ten widely used SA methods, including seven qualitative and three quantitative ones. All SA methods are tested using a variety of sampling techniques to screen out the most sensitive (i.e., important) parameters from the insensitive ones. The Sacramento Soil Moisture Accounting (SAC-SMA) model, which has thirteen tunable parameters, is used for illustration. The South Branch Potomac River basin near Springfield, West Virginia in the U.S. is chosen as the study area. The key findings from this study are: (1) For qualitative SA methods, Correlation Analysis (CA), Regression Analysis (RA), and Gaussian Process (GP) screening methods are shown to be ineffective in this example. Morris One-At-a-Time (MOAT) screening is the most efficient, needing only 280 samples to identify the most important parameters, but it is the least robust method. Multivariate Adaptive Regression Splines (MARS), Delta Test (DT) and Sum-Of-Trees (SOT) screening methods need about 400-600 samples for the same purpose. Monte Carlo (MC), Orthogonal Array (OA) and Orthogonal Array based Latin Hypercube (OALH) are appropriate sampling techniques for them. (2) For quantitative SA methods, at least 2777 samples are needed for the Fourier Amplitude Sensitivity Test (FAST) to identify parameter main effects. The McKay method needs about 360 samples to evaluate the main effect and more than 1000 samples to assess the two-way interaction effect. OALH and LPτ (LPTAU) sampling techniques are more appropriate for the McKay method. For the Sobol' method, a minimum of 1050 samples is needed to compute the first-order and total sensitivity indices correctly. These comparisons show that qualitative SA methods are more efficient, but less accurate and robust, than quantitative ones.
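
    For a sense of why the quantitative methods above need on the order of a thousand model runs, the first-order Sobol index for each parameter can be estimated with a pick-and-freeze (Saltelli-style) Monte Carlo scheme. The sketch below uses a toy three-parameter function standing in for SAC-SMA; it is an illustration, not the PSUADE implementation used in the study.

```python
import numpy as np

def toy_model(x):
    """Toy 3-parameter model standing in for the hydrological model."""
    return x[:, 0] + 2.0 * x[:, 1] ** 2 + 0.1 * x[:, 2]

rng = np.random.default_rng(42)
n, d = 2048, 3
A = rng.random((n, d))   # base sample
B = rng.random((n, d))   # independent resample

yA = toy_model(A)
var_y = yA.var()

# First-order index S_i: freeze column i from A, take the remaining columns from B
for i in range(d):
    ABi = B.copy()
    ABi[:, i] = A[:, i]
    s_i = np.mean(yA * (toy_model(ABi) - toy_model(B))) / var_y  # Saltelli-type estimator
    print(f"S_{i} ~ {s_i:.2f}")
```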

  16. A quality assurance phantom for the performance evaluation of volumetric micro-CT systems

    NASA Astrophysics Data System (ADS)

    Du, Louise Y.; Umoh, Joseph; Nikolov, Hristo N.; Pollmann, Steven I.; Lee, Ting-Yim; Holdsworth, David W.

    2007-12-01

    Small-animal imaging has recently become an area of increased interest because more human diseases can be modeled in transgenic and knockout rodents. As a result, micro-computed tomography (micro-CT) systems are becoming more common in research laboratories, due to their ability to achieve spatial resolution as high as 10 µm, giving highly detailed anatomical information. Most recently, a volumetric cone-beam micro-CT system using a flat-panel detector (eXplore Ultra, GE Healthcare, London, ON) has been developed that combines the high resolution of micro-CT and the fast scanning speed of clinical CT, so that dynamic perfusion imaging can be performed in mice and rats, providing functional physiological information in addition to anatomical information. This and other commercially available micro-CT systems all promise to deliver precise and accurate high-resolution measurements in small animals. However, no comprehensive quality assurance phantom has been developed to evaluate the performance of these micro-CT systems on a routine basis. We have designed and fabricated a single comprehensive device for the purpose of performance evaluation of micro-CT systems. This quality assurance phantom was applied to assess multiple image-quality parameters of a current flat-panel cone-beam micro-CT system accurately and quantitatively, in terms of spatial resolution, geometric accuracy, CT number accuracy, linearity, noise and image uniformity. Our investigations show that 3D images can be obtained with a limiting spatial resolution of 2.5 mm⁻¹ and noise of ±35 HU, using an acquisition interval of 8 s at an entrance dose of 6.4 cGy.
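
    As an illustration only (not the phantom's actual analysis software), the snippet below shows how two of the image-quality parameters listed above, noise and CT-number linearity, might be computed from region-of-interest values; the HU arrays are hypothetical placeholders.

        # Noise as the HU standard deviation in a uniform ROI, and CT-number linearity
        # as a linear fit of measured vs. nominal HU across phantom inserts.
        import numpy as np

        uniform_roi = np.random.normal(0.0, 35.0, size=(64, 64))      # placeholder HU values
        noise_hu = uniform_roi.std()

        nominal_hu = np.array([-1000.0, -200.0, 0.0, 300.0, 800.0])   # known insert values
        measured_hu = np.array([-985.0, -192.0, 6.0, 309.0, 782.0])   # mean HU per insert ROI
        slope, intercept = np.polyfit(nominal_hu, measured_hu, 1)
        r = np.corrcoef(nominal_hu, measured_hu)[0, 1]

        print(f"noise = +/-{noise_hu:.1f} HU; linearity: slope={slope:.3f}, "
              f"intercept={intercept:.1f} HU, r={r:.4f}")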

  17. Validation of a Quantitative Single-Subject Based Evaluation for Rehabilitation-Induced Improvement Assessment.

    PubMed

    Gandolla, Marta; Molteni, Franco; Ward, Nick S; Guanziroli, Eleonora; Ferrigno, Giancarlo; Pedrocchi, Alessandra

    2015-11-01

    The foreseen outcome of a rehabilitation treatment is a stable improvement in functional outcomes, which can be longitudinally assessed through multiple measures to help clinicians in functional evaluation. In this study, we propose an automatic, comprehensive method of combining multiple measures in order to assess a functional improvement. As a test bed, a functional electrical stimulation-based treatment for foot drop correction, performed with chronic post-stroke participants, is presented. Patients were assessed on five relevant outcome measures before and after intervention, and at a follow-up time point. A novel algorithm based on each variable's minimum detectable change is proposed and implemented in custom-made software, combining the outcome measures to obtain a single parameter: the capacity score. The difference between capacity scores at the different time points is thresholded to obtain the improvement evaluation. Ten clinicians evaluated patients on the Improvement Clinical Global Impression scale. Eleven patients underwent the treatment, and five were found to achieve a stable functional improvement, as assessed by the proposed algorithm. A statistically significant agreement between intra-clinician and algorithm-clinician evaluations was demonstrated. The proposed method evaluates functional improvement on a single-subject yes/no basis by merging different measures (e.g., kinematic, muscular), and it is validated against clinical evaluation.
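
    The abstract does not spell out the capacity-score algorithm, so the sketch below shows only one commonly used form of the minimum-detectable-change step it builds on (MDC95 = 1.96 * sqrt(2) * SEM, with SEM = SD * sqrt(1 - ICC)); the reliability figures and outcome values are hypothetical.

        # Flag a stable improvement on one outcome measure using MDC95.
        import numpy as np

        def mdc95(test_retest_sd, icc):
            # Standard error of measurement, then the 95% minimum detectable change.
            sem = test_retest_sd * np.sqrt(1.0 - icc)
            return 1.96 * np.sqrt(2.0) * sem

        baseline, post_treatment = 0.62, 0.74                 # e.g., gait speed in m/s
        threshold = mdc95(test_retest_sd=0.05, icc=0.90)
        improved = (post_treatment - baseline) > threshold    # single-subject yes/no decision
        print(f"MDC95 = {threshold:.3f}; improved: {improved}")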

  18. Reading Argumentative Texts: Comprehension and Evaluation Goals and Outcomes

    ERIC Educational Resources Information Center

    Diakidoy, Irene-Anna N.; Ioannou, Melina C.; Christodoulou, Stelios A.

    2017-01-01

    The study is situated at the interface between reading comprehension and critical thinking research. Its purpose was to examine the influence of reading goals and argument quality on the comprehension and critical evaluation of argumentative texts. Young adult readers read to comprehend or evaluate texts on two different controversial issues.…

  19. The Comprehensive Evaluation of Electronic Learning Tools and Educational Software (CEELTES)

    ERIC Educational Resources Information Center

    Karolcík, Štefan; Cipková, Elena; Hrušecký, Roman; Veselský, Milan

    2015-01-01

    Despite the fact that digital technologies are more and more used in the learning and education process, there is still lack of professional evaluation tools capable of assessing the quality of used digital teaching aids in a comprehensive and objective manner. Construction of the Comprehensive Evaluation of Electronic Learning Tools and…

  20. Longitudinal Assessment of Comprehensive School Reform Program Implementation and Outcomes: First-Year Report

    ERIC Educational Resources Information Center

    Tushnet, Naida C., Flaherty, John, Jr., Smith, And

    2004-01-01

    The Longitudinal Assessment of Comprehensive School Reform Implementation and Outcomes (LACIO) responds to the No Child Left Behind Act's requirement for an evaluation of the federal Comprehensive School Reform (CSR) program. The legislation stipulates two broad goals for the evaluation: (1) to evaluate the implementation and outcomes achieved by…

  1. Multiscale multimodal fusion of histological and MRI volumes for characterization of lung inflammation

    NASA Astrophysics Data System (ADS)

    Rusu, Mirabela; Wang, Haibo; Golden, Thea; Gow, Andrew; Madabhushi, Anant

    2013-03-01

    Mouse lung models facilitate the investigation of conditions such as chronic inflammation which are associated with common lung diseases. The multi-scale manifestation of lung inflammation prompted us to use multi-scale imaging - both in vivo and ex vivo MRI along with ex vivo histology - to study it in a new, quantitative way. Some imaging modalities, such as MRI, are non-invasive and capture macroscopic features of the pathology, while others, e.g. ex vivo histology, depict detailed structures. Registering such multi-modal data to the same spatial coordinates will allow the construction of a comprehensive 3D model to enable the multi-scale study of diseases. Moreover, it may facilitate the identification and definition of quantitative in vivo imaging signatures for diseases and pathologic processes. We introduce a quantitative, image-analytic framework to integrate in vivo MR images of the entire mouse with ex vivo histology of the lung alone, using ex vivo lung MRI as a conduit to facilitate their co-registration. In our framework, we first align the MR images by registering the in vivo and ex vivo MRI of the lung using an interactive rigid registration approach. Then we reconstruct the 3D volume of the ex vivo histological specimen by efficient groupwise registration of the 2D slices. The resulting 3D histologic volume is subsequently registered to the MRI volumes by interactive rigid registration, directly to the ex vivo MRI, and implicitly to the in vivo MRI. Qualitative evaluation of the registration framework was performed by comparing airway tree structures in ex vivo MRI and ex vivo histology, where airways are visible and may be annotated. We present a use case for evaluation of our co-registration framework in the context of studying chronic inflammation in a diseased mouse.
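
    A minimal sketch of a rigid, mutual-information-driven co-registration step of the kind described above, assuming the SimpleITK library; the file names, metric, and optimizer settings are illustrative and not the authors' pipeline.

        import SimpleITK as sitk

        fixed = sitk.ReadImage("invivo_lung_mri.nii", sitk.sitkFloat32)    # hypothetical files
        moving = sitk.ReadImage("exvivo_lung_mri.nii", sitk.sitkFloat32)

        # Initialize a rigid (Euler) transform at the geometric centers of the two volumes.
        initial = sitk.CenteredTransformInitializer(
            fixed, moving, sitk.Euler3DTransform(),
            sitk.CenteredTransformInitializerFilter.GEOMETRY)

        reg = sitk.ImageRegistrationMethod()
        reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
        reg.SetOptimizerAsRegularStepGradientDescent(learningRate=1.0, minStep=1e-4,
                                                     numberOfIterations=200)
        reg.SetInterpolator(sitk.sitkLinear)
        reg.SetInitialTransform(initial, inPlace=False)

        transform = reg.Execute(fixed, moving)
        aligned = sitk.Resample(moving, fixed, transform, sitk.sitkLinear, 0.0,
                                moving.GetPixelID())
        sitk.WriteImage(aligned, "exvivo_aligned_to_invivo.nii")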

  2. Combining U.S.-based prioritization tools to improve screening level accountability for environmental impact: the case of the chemical manufacturing industry.

    PubMed

    Zhou, Xiaoying; Schoenung, Julie M

    2009-12-15

    There are two quantitative indicators that are most widely used to assess the extent of compliance of industrial facilities with environmental regulations: the quantity of hazardous waste generated and the amount of toxics released. These indicators, albeit useful in terms of some environmental monitoring, fail to account for direct or indirect effects on human and environmental health, especially when aggregating total quantity of releases for a facility or industry sector. Thus, there is a need for a more comprehensive approach that can prioritize a particular chemical (or industry sector) on the basis of its relevant environmental performance and impact on human health. Accordingly, the objective of the present study is to formulate an aggregation of tools that can simultaneously capture multiple effects and several environmental impact categories. This approach allows us to compare and combine results generated with the aid of select U.S.-based quantitative impact assessment tools, thereby supplementing compliance-based metrics such as data from the U.S. Toxic Release Inventory. A case study, which presents findings for the U.S. chemical manufacturing industry, is presented to illustrate the aggregation of these tools. Environmental impacts due to both upstream and manufacturing activities are also evaluated for each industry sector. The proposed combinatorial analysis allows for a more robust evaluation for rating and prioritizing the environmental impacts of industrial waste.

  3. Developing a performance measurement approach to benefit/cost freight project prioritization.

    DOT National Transportation Integrated Search

    2014-10-01

    Future reauthorizations of the federal transportation bill will require a comprehensive and quantitative analysis of the freight benefits : of proposed freight system projects. To prioritize public investments in freight systems and to insure conside...

  4. Comprehensive two-dimensional gas chromatography with flame ionization and time-of-flight mass spectrometry detection: qualitative and quantitative analysis of West Australian sandalwood oil.

    PubMed

    Shellie, Robert; Marriott, Philip; Morrison, Paul

    2004-09-01

    The use of gas chromatography (GC)-mass spectrometry (MS), GC-time-of-flight MS (TOFMS), comprehensive two-dimensional GC (GCxGC)-flame ionization detection (FID), and GCxGC-TOFMS is discussed for the characterization of the eight important representative components, including Z-alpha-santalol, epi-alpha-bisabolol, Z-alpha-trans-bergamotol, epi-beta-santalol, Z-beta-santalol, E,E-farnesol, Z-nuciferol, and Z-lanceol, in the oil of west Australian sandalwood (Santalum spicatum). Single-column GC-MS lacks the resolving power to separate all of the listed components as pure peaks and allow precise analytical measurement of individual component abundances. With enhanced peak resolution capabilities in GCxGC, these components are sufficiently well resolved to be quantitated using flame ionization detection, following initial characterization of components by using GCxGC-TOFMS.

  5. Comprehensive and Quantitative Proteomic Analysis of Metamorphosis-Related Proteins in the Veined Rapa Whelk, Rapana venosa.

    PubMed

    Song, Hao; Wang, Hai-Yan; Zhang, Tao

    2016-06-15

    Larval metamorphosis of the veined rapa whelk (Rapana venosa) is a pelagic to benthic transition that involves considerable structural and physiological changes. Because metamorphosis plays a pivotal role in R. venosa commercial breeding and natural populations, the endogenous proteins that drive this transition attract considerable interest. This study is the first to perform a comprehensive and quantitative proteomic analysis related to metamorphosis in a marine gastropod. We analyzed the proteomes of competent R. venosa larvae and post-larvae, resulting in the identification of 5312 proteins, including 470 that were downregulated and 668 that were upregulated after metamorphosis. The differentially expressed proteins reflected multiple processes involved in metamorphosis, including cytoskeleton and cell adhesion, ingestion and digestion, stress response and immunity, as well as specific tissue development. Our data improve understanding of the physiological traits controlling R. venosa metamorphosis and provide a solid basis for further study.

  6. MULTI-DIMENSIONAL MASS SPECTROMETRY-BASED SHOTGUN LIPIDOMICS AND NOVEL STRATEGIES FOR LIPIDOMIC ANALYSES

    PubMed Central

    Han, Xianlin; Yang, Kui; Gross, Richard W.

    2011-01-01

    Since our last comprehensive review on multi-dimensional mass spectrometry-based shotgun lipidomics (Mass Spectrom. Rev. 24 (2005), 367), many new developments in the field of lipidomics have occurred. These developments include new strategies and refinements for shotgun lipidomic approaches that use direct infusion, including novel fragmentation strategies, identification of multiple new informative dimensions for mass spectrometric interrogation, and the development of new bioinformatic approaches for enhanced identification and quantitation of the individual molecular constituents that comprise each cell’s lipidome. Concurrently, advances in liquid chromatography-based platforms and novel strategies for quantitative matrix-assisted laser desorption/ionization mass spectrometry for lipidomic analyses have been developed. Through the synergistic use of this repertoire of new mass spectrometric approaches, the power and scope of lipidomics has been greatly expanded to accelerate progress toward the comprehensive understanding of the pleiotropic roles of lipids in biological systems. PMID:21755525

  7. A New Method to Comprehensively Diagnose Shock Waves in the Solar Atmosphere Based on Simultaneous Spectroscopic and Imaging Observations

    NASA Astrophysics Data System (ADS)

    Ruan, Wenzhi; Yan, Limei; He, Jiansen; Zhang, Lei; Wang, Linghua; Wei, Yong

    2018-06-01

    Shock waves are believed to play an important role in plasma heating. Shock-like temporal jumps in radiation intensity and Doppler shift have been identified in the solar atmosphere. However, a quantitative diagnosis of the shocks in the solar atmosphere is still lacking, seriously hindering the understanding of shock dissipative heating of the solar atmosphere. Here, we propose a new method to realize the goal of quantitative shock diagnosis, based on the Rankine–Hugoniot equations and taking advantage of simultaneous imaging and spectroscopic observations from, e.g., IRIS (Interface Region Imaging Spectrograph). With this method, the key parameters of shock candidates can be derived, such as the bulk velocity and temperature of the plasma upstream and downstream, and the propagation speed and direction. The method is applied to the shock candidates observed by IRIS, and the overall characteristics of the shocks are revealed quantitatively for the first time. The method is also tested with the help of forward modeling, i.e., virtual observations of simulated shocks. The parameters obtained from the method are consistent with the parameters of the shock formed in the model and are independent of the viewing direction. Therefore, the method proposed here is applicable to the quantitative and comprehensive diagnosis of the observed shocks in the solar atmosphere.
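
    As a hedged, textbook-level illustration of the jump conditions the method builds on (not the IRIS-based procedure itself), the snippet below evaluates the ideal-gas Rankine–Hugoniot relations for a normal shock at an assumed upstream Mach number; gamma = 5/3 is a monatomic-gas assumption.

        # Ideal-gas normal-shock jump relations (density, pressure, temperature, downstream Mach).
        import numpy as np

        def normal_shock_jumps(M1, gamma=5.0 / 3.0):
            rho_ratio = (gamma + 1.0) * M1**2 / ((gamma - 1.0) * M1**2 + 2.0)
            p_ratio = (2.0 * gamma * M1**2 - (gamma - 1.0)) / (gamma + 1.0)
            T_ratio = p_ratio / rho_ratio
            M2 = np.sqrt(((gamma - 1.0) * M1**2 + 2.0) / (2.0 * gamma * M1**2 - (gamma - 1.0)))
            return rho_ratio, p_ratio, T_ratio, M2

        rho21, p21, T21, M2 = normal_shock_jumps(M1=2.0)
        print(f"rho2/rho1={rho21:.2f}, p2/p1={p21:.2f}, T2/T1={T21:.2f}, M2={M2:.2f}")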

  8. Nuclear medicine and imaging research. Quantitative studies in radiopharmaceutical science. Comprehensive progress report, January 1, 1983-June 30, 1985

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beck, R.N.; Cooper, M.D.

    1985-09-01

    This comprehensive report outlines the progress made during the past three years in the areas described below. In all instances, initial studies have been carried out and the technical feasibility of carrying through each study has been demonstrated. The studies described include development of cesium-130 and bromine-75 radioisotope generators, the feasibility of using rubidium-82 as a myocardial imaging agent, and radiochemical preparation of C-11 deoxyglucose. 28 refs. (DT)

  9. A CS1 pedagogical approach to parallel thinking

    NASA Astrophysics Data System (ADS)

    Rague, Brian William

    Almost all collegiate programs in Computer Science offer an introductory course in programming primarily devoted to communicating the foundational principles of software design and development. The ACM designates this introduction to computer programming course for first-year students as CS1, during which methodologies for solving problems within a discrete computational context are presented. Logical thinking is highlighted, guided primarily by a sequential approach to algorithm development and made manifest by typically using the latest, commercially successful programming language. In response to the most recent developments in accessible multicore computers, instructors of these introductory classes may wish to include training on how to design workable parallel code. Novel issues arise when programming concurrent applications which can make teaching these concepts to beginning programmers a seemingly formidable task. Student comprehension of design strategies related to parallel systems should be monitored to ensure an effective classroom experience. This research investigated the feasibility of integrating parallel computing concepts into the first-year CS classroom. To quantitatively assess student comprehension of parallel computing, an experimental educational study using a two-factor mixed group design was conducted to evaluate two instructional interventions in addition to a control group: (1) topic lecture only, and (2) topic lecture with laboratory work using a software visualization Parallel Analysis Tool (PAT) specifically designed for this project. A new evaluation instrument developed for this study, the Perceptions of Parallelism Survey (PoPS), was used to measure student learning regarding parallel systems. The results from this educational study show a statistically significant main effect among the repeated measures, implying that student comprehension levels of parallel concepts as measured by the PoPS improve immediately after the delivery of any initial three-week CS1 level module when compared with student comprehension levels just prior to starting the course. Survey results measured during the ninth week of the course reveal that performance levels remained high compared to pre-course performance scores. A second result produced by this study reveals no statistically significant interaction effect between the intervention method and student performance as measured by the evaluation instrument over three separate testing periods. However, visual inspection of survey score trends and the low p-value generated by the interaction analysis (0.062) indicate that further studies may verify improved concept retention levels for the lecture w/PAT group.

  10. Electron Paramagnetic Resonance Oximetry as a Quantitative Method to Measure Cellular Respiration: A Consideration of Oxygen Diffusion Interference

    PubMed Central

    Presley, Tennille; Kuppusamy, Periannan; Zweier, Jay L.; Ilangovan, Govindasamy

    2006-01-01

    Electron paramagnetic resonance (EPR) oximetry is being widely used to measure the oxygen consumption of cells, mitochondria, and submitochondrial particles. However, further improvement of this technique, in terms of data analysis, is required to use it as a quantitative tool. Here, we present a new approach for quantitative analysis of cellular respiration using EPR oximetry. The course of oxygen consumption by cells in suspension has been observed to have three distinct zones: pO2-independent respiration at higher pO2 ranges, pO2-dependent respiration at low pO2 ranges, and a static equilibrium with no change in pO2 at very low pO2 values. The approach presented here enables comprehensive analysis of all three zones together, considering the progression of O2 diffusion zones around each cell, their overlap over time, and their potential impact on the measured pO2 data. The obtained results agree with previously established methods such as high-resolution respirometry measurements. Additionally, it is demonstrated how the diffusion limitations can depend on cell density and consumption rate. In conclusion, the new approach establishes a more accurate and meaningful model for evaluating EPR oximetry data on cellular respiration and quantifying the related parameters. PMID:17012319
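
    As a hedged illustration of the pO2-dependent zone described above (not the paper's diffusion-aware model), the snippet below fits a Michaelis-Menten-type dependence of consumption rate on pO2 with SciPy; the rate data are synthetic placeholders.

        # Fit rate(pO2) = Vmax * pO2 / (Km + pO2) to hypothetical consumption-rate data.
        import numpy as np
        from scipy.optimize import curve_fit

        def mm_rate(po2, vmax, km):
            return vmax * po2 / (km + po2)

        po2 = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0, 100.0])   # mmHg
        rate = np.array([0.9, 1.6, 2.9, 3.9, 4.6, 5.0, 5.2])       # mmHg/min

        (vmax, km), _ = curve_fit(mm_rate, po2, rate, p0=(5.0, 5.0))
        print(f"Vmax ~= {vmax:.2f} mmHg/min, Km ~= {km:.2f} mmHg")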

  11. Quantitative proteomics-based analysis supports a significant role of GTG proteins in regulation of ABA response in Arabidopsis roots.

    PubMed

    Alvarez, Sophie; Roy Choudhury, Swarup; Hicks, Leslie M; Pandey, Sona

    2013-03-01

    Abscisic acid (ABA) is proposed to be perceived by multiple receptors in plants. We have previously reported on the role of two GPCR-type G-proteins (GTG proteins) as plasma membrane-localized ABA receptors in Arabidopsis thaliana. However, due to the presence of multiple transmembrane domains, detailed structural and biochemical characterization of GTG proteins remains limited. Since ABA induces substantial changes in the proteome of plants, a labeling LC-based quantitative proteomics approach was applied to elucidate the global effects and possible downstream targets of GTG1/GTG2 proteins. Quantitative differences in protein abundance between wild-type and gtg1gtg2 were analyzed for evaluation of the effect of ABA on the root proteome and its dependence on the presence of functional GTG1/GTG2 proteins. The results presented in this study reveal the most comprehensive ABA-responsive root proteome reported to date in Arabidopsis. Notably, the majority of ABA-responsive proteins required the presence of GTG proteins, supporting their key role in ABA signaling. These observations were further confirmed by additional experiments. Overall, comparison of the ABA-dependent protein abundance changes in wild-type versus gtg1gtg2 provides clues to their possible links with some of the well-established effectors of the ABA signaling pathways and their role in mediating phytohormone cross-talk.

  12. Predicting perceived visual complexity of abstract patterns using computational measures: The influence of mirror symmetry on complexity perception

    PubMed Central

    Leder, Helmut

    2017-01-01

    Visual complexity is relevant for many areas ranging from improving usability of technical displays or websites up to understanding aesthetic experiences. Therefore, many attempts have been made to relate objective properties of images to perceived complexity in artworks and other images. It has been argued that visual complexity is a multidimensional construct mainly consisting of two dimensions: A quantitative dimension that increases complexity through number of elements, and a structural dimension representing order negatively related to complexity. The objective of this work is to study human perception of visual complexity utilizing two large independent sets of abstract patterns. A wide range of computational measures of complexity was calculated, further combined using linear models as well as machine learning (random forests), and compared with data from human evaluations. Our results confirm the adequacy of existing two-factor models of perceived visual complexity consisting of a quantitative and a structural factor (in our case mirror symmetry) for both of our stimulus sets. In addition, a non-linear transformation of mirror symmetry giving more influence to small deviations from symmetry greatly increased explained variance. Thus, we again demonstrate the multidimensional nature of human complexity perception and present comprehensive quantitative models of the visual complexity of abstract patterns, which might be useful for future experiments and applications. PMID:29099832
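
    A sketch under stated assumptions: rated complexity is predicted from one quantitative feature (element count) and one structural feature (mirror symmetry) with a random forest, mirroring the two-factor account above. The feature values and ratings are synthetic; the actual study used large pattern sets and many computational measures.

        # Random-forest regression of complexity ratings on two image-level features.
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)
        n = 200
        num_elements = rng.uniform(5, 100, n)         # quantitative dimension
        mirror_symmetry = rng.uniform(0.0, 1.0, n)    # structural (order) dimension
        # Synthetic ratings: more elements -> more complex, more symmetry -> less complex.
        rating = 0.04 * num_elements - 2.5 * mirror_symmetry + rng.normal(0, 0.3, n)

        X = np.column_stack([num_elements, mirror_symmetry])
        model = RandomForestRegressor(n_estimators=300, random_state=0)
        r2 = cross_val_score(model, X, rating, cv=5, scoring="r2").mean()
        model.fit(X, rating)
        print(f"cross-validated R^2 ~= {r2:.2f}; importances = {model.feature_importances_}")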

  13. Ramifications of increased training in quantitative methodology.

    PubMed

    Zimiles, Herbert

    2009-01-01

    Comments on the article "Doctoral training in statistics, measurement, and methodology in psychology: Replication and extension of Aiken, West, Sechrest, and Reno's (1990) survey of PhD programs in North America" by Aiken, West, and Millsap. The current author asks three questions that are provoked by the comprehensive identification of gaps and deficiencies in the training of quantitative methodology that led Aiken, West, and Millsap to call for expanded graduate instruction resources and programs. This comment calls for greater attention to how advances and expansion in the training of quantitative analysis are influencing who chooses to study psychology and how and what will be studied. PsycINFO Database Record 2009 APA.

  14. System integration of wind and solar power in integrated assessment models: A cross-model evaluation of new approaches

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pietzcker, Robert C.; Ueckerdt, Falko; Carrara, Samuel

    Mitigation-Process Integrated Assessment Models (MP-IAMs) are used to analyze long-term transformation pathways of the energy system required to achieve stringent climate change mitigation targets. Due to their substantial temporal and spatial aggregation, IAMs cannot explicitly represent all detailed challenges of integrating the variable renewable energies (VRE) wind and solar in power systems, but rather rely on parameterized modeling approaches. In the ADVANCE project, six international modeling teams have developed new approaches to improve the representation of power sector dynamics and VRE integration in IAMs. In this study, we qualitatively and quantitatively evaluate the last years' modeling progress and study the impact of VRE integration modeling on VRE deployment in IAM scenarios. For a comprehensive and transparent qualitative evaluation, we first develop a framework of 18 features of power sector dynamics and VRE integration. We then apply this framework to the newly-developed modeling approaches to derive a detailed map of strengths and limitations of the different approaches. For the quantitative evaluation, we compare the IAMs to the detailed hourly-resolution power sector model REMIX. We find that the new modeling approaches manage to represent a large number of features of the power sector, and the numerical results are in reasonable agreement with those derived from the detailed power sector model. Updating the power sector representation and the cost and resources of wind and solar substantially increased wind and solar shares across models: Under a carbon price of 30$/tCO2 in 2020 (increasing by 5% per year), the model-average cost-minimizing VRE share over the period 2050-2100 is 62% of electricity generation, 24%-points higher than with the old model version.

  15. ICan: An Optimized Ion-Current-Based Quantification Procedure with Enhanced Quantitative Accuracy and Sensitivity in Biomarker Discovery

    PubMed Central

    2015-01-01

    The rapidly expanding availability of high-resolution mass spectrometry has substantially enhanced ion-current-based relative quantification techniques. Despite the increasing interest in ion-current-based methods, quantitative sensitivity, accuracy, and false discovery rate remain the major concerns; consequently, comprehensive evaluation and development in these regards are urgently needed. Here we describe an integrated, new procedure for data normalization and protein ratio estimation, termed ICan, for improved ion-current-based analysis of data generated by high-resolution mass spectrometry (MS). ICan achieved significantly better accuracy and precision, and a lower false-positive rate for discovering altered proteins, than current popular pipelines. A spiked-in experiment was used to evaluate the performance of ICan in detecting small changes. In this study, E. coli extracts were spiked with moderate-abundance proteins from human plasma (MAP, enriched by the IgY14-SuperMix procedure) at two different levels to set a small change of 1.5-fold. Forty-five (92%, with an average ratio of 1.71 ± 0.13) of 49 identified MAP proteins (i.e., the true positives) and none of the reference proteins (1.0-fold) were determined to be significantly altered, with cutoff thresholds of ≥1.3-fold change and p ≤ 0.05. This is the first study to evaluate and prove competitive performance of the ion-current-based approach for assigning significance to proteins with small changes. By comparison, other methods showed remarkably inferior performance. ICan can be broadly applied to reliable and sensitive proteomic surveys of multiple biological samples with the use of high-resolution MS. Moreover, many key features evaluated and optimized here, such as normalization, protein ratio determination, and statistical analyses, are also valuable for data analysis by isotope-labeling methods. PMID:25285707
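
    The final thresholding step quoted above (>=1.3-fold change and p <= 0.05) can be illustrated as follows; this is not the ICan normalization or ratio-estimation procedure itself, and the replicate intensities are synthetic.

        # Flag proteins as significantly altered by fold change and t-test p-value.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        n_proteins, n_reps = 1000, 3
        group_a = rng.lognormal(mean=10.0, sigma=0.2, size=(n_proteins, n_reps))
        truth = rng.choice([1.0, 1.5], size=(n_proteins, 1))   # some proteins spiked 1.5-fold
        group_b = group_a * truth * rng.lognormal(0.0, 0.05, size=(n_proteins, n_reps))

        fold_change = group_b.mean(axis=1) / group_a.mean(axis=1)
        _, p_values = stats.ttest_ind(np.log2(group_b), np.log2(group_a), axis=1)

        altered = (np.abs(np.log2(fold_change)) >= np.log2(1.3)) & (p_values <= 0.05)
        print(f"{altered.sum()} of {n_proteins} proteins flagged as altered")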

  16. Improving contraceptive choice: fidelity of implementation and the gap between effectiveness and efficacy.

    PubMed

    Garbers, Samantha; Flandrick, Kathleen; Bermudez, Dayana; Meserve, Allison; Chiasson, Mary Ann

    2014-11-01

    Interventions to reduce unintended pregnancy through improved contraceptive use are a public health priority. A comprehensive process evaluation of a contraceptive assessment module intervention with demonstrated efficacy was undertaken. The 12-month process evaluation goal was to describe the extent to which the intervention was implemented as intended over time, and to identify programmatic adjustments to improve implementation fidelity. Quantitative and qualitative methods included staff surveys, electronic health record data, usage monitoring, and observations. Fidelity of implementation was low overall (<10% of eligible patients completed the entire module [dose received]). Although a midcourse correction making the module available in clinical areas led to increased dose delivered (23% vs. 30%, chi-square test p = .006), dose received did not increase significantly after this adjustment. Contextual factors including competing organizational and staff priorities and staff buy-in limited the level of implementation and precluded adoption of some strategies such as adjusting patient flow. Using a process evaluation framework enabled the research team to identify and address complexities inherent in effectiveness studies and facilitated the alignment of program and context. © 2014 Society for Public Health Education.

  17. Electromagnetic pulsed thermography for natural cracks inspection

    PubMed Central

    Gao, Yunlai; Tian, Gui Yun; Wang, Ping; Wang, Haitao; Gao, Bin; Woo, Wai Lok; Li, Kongjing

    2017-01-01

    Emerging integrated sensing and monitoring of material degradation and cracks are increasingly required for characterizing the structural integrity and safety of infrastructure. However, most conventional nondestructive evaluation (NDE) methods are based on single-modality sensing, which is not adequate to evaluate structural integrity and natural cracks. This paper proposes electromagnetic pulsed thermography for fast and comprehensive defect characterization. It hybridizes multiple physical phenomena - magnetic flux leakage, induced eddy currents and induction heating - and links the underlying physics with signal processing algorithms to provide abundant information on material properties and defects. New features based on the first temporal derivative, which reflects multiphysics spatial and temporal behavior, are proposed to enhance the detection of cracks with different orientations. Promising results, robust to lift-off changes and based on features that remain invariant for artificial and natural crack detection, demonstrate that the proposed method significantly improves defect detectability. It opens up multiphysics sensing and integrated NDE with potential impact for better understanding and quantitative evaluation of natural cracks, including stress corrosion cracking (SCC) and rolling contact fatigue (RCF). PMID:28169361
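
    The first-derivative feature mentioned above can be sketched, in hedged form, for a generic thermal image sequence of shape (frames, rows, cols); the synthetic sequence below simply stands in for recorded induction-heating data.

        # Temporal first derivative per pixel; keep the peak heating rate as a feature map.
        import numpy as np

        frames, rows, cols = 100, 64, 64
        t = np.linspace(0.0, 1.0, frames)
        sequence = 25.0 + 10.0 * (1.0 - np.exp(-5.0 * t))[:, None, None] * np.ones((1, rows, cols))
        sequence += np.random.default_rng(3).normal(0.0, 0.05, (frames, rows, cols))

        d_dt = np.gradient(sequence, t, axis=0)   # first derivative along the time axis
        feature_map = d_dt.max(axis=0)            # peak temporal response per pixel
        print(feature_map.shape, float(feature_map.mean()))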

  18. NMR-based Metabolomics for Cancer Research

    EPA Science Inventory

    Metabolomics is considered a complementary tool to other omics platforms, providing a snapshot of the cellular biochemistry and physiology taking place at any instant. Metabolomics approaches have been widely used to provide comprehensive and quantitative analyses of the metabo...

  19. Indicators of the environmental impacts of transportation : highway, rail, aviation, and maritime transport

    DOT National Transportation Integrated Search

    1996-10-01

    This document presents quantitative national estimates of the magnitude of transportation's impacts on the natural environment. It is the most comprehensive compilation of environmental and transportation data to date. This document addresses all p...

  20. Environmental Sustainability - Including Land and Water Use

    EPA Science Inventory

    Assessments of environmental sustainability can be conducted in many ways with one of the most quantitative methods including Life Cycle Impact Assessment (LCIA). While historically LCIA has included a comprehensive list of impact categories including: ozone depletion, global c...

  1. The Impact of an Interactive Statistics Module on Novices' Development of Scientific Process Skills and Attitudes in a First-Semester Research Foundations Course.

    PubMed

    Marsan, Lynnsay A; D'Arcy, Christina E; Olimpo, Jeffrey T

    2016-12-01

    Evidence suggests that incorporating quantitative reasoning exercises into existent curricular frameworks within the science, technology, engineering, and mathematics (STEM) disciplines is essential for novices' development of conceptual understanding and process skills in these domains. Despite this being the case, such studies acknowledge that students often experience difficulty in applying mathematics in the context of scientific problems. To address this concern, the present study sought to explore the impact of active demonstrations and critical reading exercises on novices' comprehension of basic statistical concepts, including hypothesis testing, experimental design, and interpretation of research findings. Students first engaged in a highly interactive height activity that served to intuitively illustrate normal distribution, mean, standard deviation, and sample selection criteria. To enforce practical applications of standard deviation and p-value, student teams were subsequently assigned a figure from a peer-reviewed primary research article and instructed to evaluate the trustworthiness of the data. At the conclusion of this exercise, students presented their evaluations to the class for open discussion and commentary. Quantitative assessment of pre- and post-module survey data indicated a statistically significant increase both in students' scientific reasoning and process skills and in their self-reported confidence in understanding the statistical concepts presented in the module. Furthermore, data indicated that the majority of students (>85%) found the module both interesting and helpful in nature. Future studies will seek to develop additional, novel exercises within this area and to evaluate the impact of such modules across a variety of STEM and non-STEM contexts.

  2. A systematic study of mitochondrial toxicity of environmental chemicals using quantitative high throughput screening

    PubMed Central

    Attene-Ramos, Matias S.; Huang, Ruili; Sakamuru, Srilatha; Witt, Kristine L.; Beeson, Gyda C.; Shou, Louie; Schnellmann, Rick G.; Beeson, Craig C.; Tice, Raymond R.; Austin, Christopher P.; Xia, Menghang

    2014-01-01

    A goal of the Tox21 program is to transition toxicity testing from traditional in vivo models to in vitro assays that assess how chemicals affect cellular responses and toxicity pathways. A critical contribution of the NIH Chemical Genomics Center (NCGC) to the Tox21 program is the implementation of a quantitative high throughput screening (qHTS) approach, using cell- and biochemical-based assays to generate toxicological profiles for thousands of environmental compounds. Here, we evaluated the effect of chemical compounds on mitochondrial membrane potential in HepG2 cells by screening a library of 1,408 compounds provided by the National Toxicology Program (NTP) in a qHTS platform. Compounds were screened over 14 concentrations, and results showed that 91 and 88 compounds disrupted mitochondrial membrane potential after treatment for 1 or 5 h, respectively. Seventy-six compounds active at both time points were clustered by structural similarity, producing 11 clusters and 23 singletons. Thirty-eight compounds covering most of the active chemical space were more extensively evaluated. Thirty-six of the 38 compounds were confirmed to disrupt mitochondrial membrane potential using a fluorescence plate reader and 35 were confirmed using a high content imaging approach. Among the 38 compounds, 4 and 6 induced LDH release, a measure of cytotoxicity, at 1 or 5 h, respectively. Compounds were further assessed for mechanism of action (MOA) by measuring changes in oxygen consumption rate, which enabled identification of 20 compounds as uncouplers. This comprehensive approach allows for evaluation of thousands of environmental chemicals for mitochondrial toxicity and identification of possible MOAs. PMID:23895456
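
    As a hedged sketch of one downstream step in such a qHTS workflow (not the NCGC analysis pipeline), a Hill-type concentration-response curve is fit to one compound's normalized readings; the concentrations and responses below are invented.

        # Fit a four-parameter Hill (log-logistic) curve and report the AC50.
        import numpy as np
        from scipy.optimize import curve_fit

        def hill(conc_um, bottom, top, ac50_um, slope):
            # Response decreases with concentration; AC50 is the midpoint concentration.
            return bottom + (top - bottom) / (1.0 + (conc_um / ac50_um) ** slope)

        conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0, 30.0])      # micromolar
        resp = np.array([98.0, 97.0, 95.0, 88.0, 70.0, 45.0, 28.0, 22.0])  # % of control signal

        p0 = (20.0, 100.0, 1.0, 1.0)
        (bottom, top, ac50, slope), _ = curve_fit(hill, conc, resp, p0=p0, maxfev=10000)
        print(f"AC50 ~= {ac50:.2f} uM, Hill slope ~= {slope:.2f}")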

  3. The Impact of an Interactive Statistics Module on Novices’ Development of Scientific Process Skills and Attitudes in a First-Semester Research Foundations Course†

    PubMed Central

    Marsan, Lynnsay A.; D’Arcy, Christina E.; Olimpo, Jeffrey T.

    2016-01-01

    Evidence suggests that incorporating quantitative reasoning exercises into existent curricular frameworks within the science, technology, engineering, and mathematics (STEM) disciplines is essential for novices’ development of conceptual understanding and process skills in these domains. Despite this being the case, such studies acknowledge that students often experience difficulty in applying mathematics in the context of scientific problems. To address this concern, the present study sought to explore the impact of active demonstrations and critical reading exercises on novices’ comprehension of basic statistical concepts, including hypothesis testing, experimental design, and interpretation of research findings. Students first engaged in a highly interactive height activity that served to intuitively illustrate normal distribution, mean, standard deviation, and sample selection criteria. To enforce practical applications of standard deviation and p-value, student teams were subsequently assigned a figure from a peer-reviewed primary research article and instructed to evaluate the trustworthiness of the data. At the conclusion of this exercise, students presented their evaluations to the class for open discussion and commentary. Quantitative assessment of pre- and post-module survey data indicated a statistically significant increase both in students’ scientific reasoning and process skills and in their self-reported confidence in understanding the statistical concepts presented in the module. Furthermore, data indicated that the majority of students (>85%) found the module both interesting and helpful in nature. Future studies will seek to develop additional, novel exercises within this area and to evaluate the impact of such modules across a variety of STEM and non-STEM contexts. PMID:28101271

  4. Integrated Research on the Development of Global Climate Risk Management Strategies - Framework and Initial Results of the Research Project ICA-RUS

    NASA Astrophysics Data System (ADS)

    Emori, Seita; Takahashi, Kiyoshi; Yamagata, Yoshiki; Oki, Taikan; Mori, Shunsuke; Fujigaki, Yuko

    2013-04-01

    With the aim of proposing strategies for global climate risk management, we have launched a five-year research project called ICA-RUS (Integrated Climate Assessment - Risks, Uncertainties and Society). In this project with the phrase "risk management" in its title, we aim for a comprehensive assessment of climate change risks, explicit consideration of uncertainties, utilization of the best available information, and consideration of all possible conditions and options. We also regard the problem as one of decision-making at the human level, which involves social value judgments and adapts to future changes in circumstances. The ICA-RUS project consists of the following five themes: 1) Synthesis of global climate risk management strategies, 2) Optimization of land, water and ecosystem uses for climate risk management, 3) Identification and analysis of critical climate risks, 4) Evaluation of climate risk management options under technological, social and economic uncertainties and 5) Interactions between scientific and social rationalities in climate risk management (see also: http://www.nies.go.jp/ica-rus/en/). For the integration of quantitative knowledge of climate change risks and responses, we apply a tool named AIM/Impact [Policy], which consists of an energy-economic model, a simplified climate model and impact projection modules. At the same time, in order to make use of qualitative knowledge as well, we hold monthly project meetings for the discussion of risk management strategies and publish annual reports based on the quantitative and qualitative information. To enhance the comprehensiveness of the analyses, we maintain an inventory of risks and risk management options. The inventory is revised iteratively through interactive meetings with stakeholders such as policymakers, government officials and industrial representatives.

  5. [Medical terminology and lay users. A quali-quantitative survey of a group of young motivated graduates].

    PubMed

    Conti, A A

    2013-01-01

    Medical terms occupy a growing space in dictionaries, and the media daily propose a great number of medical words. Nevertheless, scientific data regarding the actual degree of comprehension of medical terminology on the part of lay users are scanty. The aim of this study was to evaluate, in a group of young motivated graduates, the degree of understanding of a set of medical terms normally adopted by physicians in specialist language and also used when speaking with patients. Nine medical terms used by physicians in daily practice were selected (“aphasia”, “edema”, “erythema”, “fibrillation”, “fibroma”, “jaundice”, “paraplegia”, “polypus”, “sclerosis”) and administered in paper form to eighteen young graduates, none of them health professionals, who were asked to furnish one definition for each term. A subsequent structured oral discussion supplemented the recorded written findings. Erythema and fibrillation were the best known and understood terms. Among the selected medical terms, the most difficult to understand were sclerosis and jaundice. Interesting features emerged from the anatomical sites attributed to some of the investigated terms: in particular, edema was mainly perceived as pulmonary, fibroma was most often interpreted as a benign tumor localized in the uterus, and polypus was most frequently associated with the nose. The participants involved in this quali-quantitative survey demonstrated generally good knowledge and comprehension of the medical terms proposed. Some limits in knowledge documented in this group, however, indicate that the use of medical terminology needs more clarification within the doctor-patient context. Such clarification appears even more necessary in subjects with lower levels of formal education.

  6. A Decade in the MIST: Learnings from Investigations of Drug Metabolites in Drug Development under the "Metabolites in Safety Testing" Regulatory Guidance.

    PubMed

    Schadt, Simone; Bister, Bojan; Chowdhury, Swapan K; Funk, Christoph; Hop, Cornelis E C A; Humphreys, W Griffith; Igarashi, Fumihiko; James, Alexander D; Kagan, Mark; Khojasteh, S Cyrus; Nedderman, Angus N R; Prakash, Chandra; Runge, Frank; Scheible, Holger; Spracklin, Douglas K; Swart, Piet; Tse, Susanna; Yuan, Josh; Obach, R Scott

    2018-06-01

    Since the introduction of metabolites in safety testing (MIST) guidance by the Food and Drug Administration in 2008, major changes have occurred in the experimental methods for the identification and quantification of metabolites, ways to evaluate coverage of metabolites, and the timing of critical clinical and nonclinical studies to generate this information. In this cross-industry review, we discuss how the increased focus on human drug metabolites and their potential contribution to safety and drug-drug interactions has influenced the approaches taken by industry for the identification and quantitation of human drug metabolites. Before the MIST guidance was issued, the method of choice for generating comprehensive metabolite profile was radio chromatography. The MIST guidance increased the focus on human drug metabolites and their potential contribution to safety and drug-drug interactions and led to changes in the practices of drug metabolism scientists. In addition, the guidance suggested that human metabolism studies should also be accelerated, which has led to more frequent determination of human metabolite profiles from multiple ascending-dose clinical studies. Generating a comprehensive and quantitative profile of human metabolites has become a more urgent task. Together with technological advances, these events have led to a general shift of focus toward earlier human metabolism studies using high-resolution mass spectrometry and to a reduction in animal radiolabel absorption/distribution/metabolism/excretion studies. The changes induced by the MIST guidance are highlighted by six case studies included herein, reflecting different stages of implementation of the MIST guidance within the pharmaceutical industry. Copyright © 2018 by The American Society for Pharmacology and Experimental Therapeutics.

  7. Long-term prognostic value of a comprehensive assessment of cardiac magnetic resonance indexes after an ST-segment elevation myocardial infarction.

    PubMed

    Merlos, Pilar; López-Lereu, Maria P; Monmeneu, Jose V; Sanchis, Juan; Núñez, Julio; Bonanad, Clara; Valero, Ernesto; Miñana, Gema; Chaustre, Fabián; Gómez, Cristina; Oltra, Ricardo; Palacios, Lorena; Bosch, Maria J; Navarro, Vicente; Llácer, Angel; Chorro, Francisco J; Bodí, Vicente

    2013-08-01

    A variety of cardiac magnetic resonance indexes predict mid-term prognosis in ST-segment elevation myocardial infarction patients. The extent of transmural necrosis permits simple and accurate prediction of systolic recovery. However, its long-term prognostic value beyond a comprehensive clinical and cardiac magnetic resonance evaluation is unknown. We hypothesized that a simple semiquantitative assessment of the extent of transmural necrosis is the best resonance index to predict long-term outcome soon after a first ST-segment elevation myocardial infarction. One week after a first ST-segment elevation myocardial infarction we carried out a comprehensive quantification of several resonance parameters in 206 consecutive patients. A semiquantitative assessment (altered number of segments in the 17-segment model) of edema, baseline and post-dobutamine wall motion abnormalities, first pass perfusion, microvascular obstruction, and the extent of transmural necrosis was also performed. During follow-up (median 51 months), 29 patients suffered a major adverse cardiac event (8 cardiac deaths, 11 nonfatal myocardial infarctions, and 10 readmissions for heart failure). Major cardiac events were associated with more severely altered quantitative and semiquantitative resonance indexes. After a comprehensive multivariate adjustment, the extent of transmural necrosis was the only resonance index independently related to the major cardiac event rate (hazard ratio=1.34 [1.19-1.51] per each additional segment displaying>50% transmural necrosis, P<.001). A simple and non-time consuming semiquantitative analysis of the extent of transmural necrosis is the most powerful cardiac magnetic resonance index to predict long-term outcome soon after a first ST-segment elevation myocardial infarction. Copyright © 2013 Sociedad Española de Cardiología. Published by Elsevier Espana. All rights reserved.

  8. Assessing the impact of a web-based comprehensive somatic and mental health screening tool in pediatric primary care.

    PubMed

    Fothergill, Kate E; Gadomski, Anne; Solomon, Barry S; Olson, Ardis L; Gaffney, Cecelia A; Dosreis, Susan; Wissow, Lawrence S

    2013-01-01

    To evaluate how parents and physicians perceive the utility of a comprehensive, electronic previsit screener, and to assess its impact on the visit. A mixed methods design was used. English-speaking parents were recruited from 3 primary care systems (urban MD and rural NY and VT) when they presented for a well-child visit with a child 4 to 10 years of age. Parents completed an electronic previsit screen, which included somatic concerns, health risks, and 4 mental health tools (SCARED5, PHQ-2, SDQ Impact, and PSC-17). Parents completed an exit survey, and a subset were interviewed. All primary care providers (PCPs) were interviewed. A total of 120 parents and 16 PCPs participated. The exit surveys showed that nearly 90% of parents agreed or strongly agreed that the screener was easy to use and maintained confidentiality. During interviews, parents noted that the screener helped with recall, validated concerns, reframed issues they thought might not be appropriate for primary care, and raised new questions. PCPs thought that the screener enabled them to normalize sensitive issues, and it permitted them to simultaneously focus and be comprehensive during the visit. Parents and PCPs agreed that the screener helped guide discussion, promoted in-depth exchange, and increased efficiency. Findings were consistent across quantitative and qualitative methods and between parents and PCPs. A comprehensive electronic previsit screening tool is an acceptable and practical strategy to facilitate well-child visits. It may help with problem identification as well as with setting agendas, engaging the family, and balancing attention between somatic and psychosocial concerns. Copyright © 2013 Academic Pediatric Association. Published by Elsevier Inc. All rights reserved.

  9. Democratization and integration of genomic profiling tools.

    PubMed

    Sussman, Michael R; Huttlin, Edward L; Wohlbach, Dana J

    2009-01-01

    Systems biology is a comprehensive means of creating a complete understanding of how all components of an organism work together to maintain and procreate life. By quantitatively profiling, one at a time, the effects of thousands to millions of genetic and environmental perturbations on the cell, systems biologists are attempting to recreate and measure the many different states that have been explored during the 3 billion years in which life has evolved. A key aspect of this work is the development of innovative new approaches to quantify changes in the transcriptome, proteome, and metabolome. In this chapter we provide a review and evaluation of several genomic profiling techniques used in plant systems biology and make recommendations for future progress in their use and integration.

  10. Theory of sampling: four critical success factors before analysis.

    PubMed

    Wagner, Claas; Esbensen, Kim H

    2015-01-01

    Food and feed materials characterization, risk assessment, and safety evaluations can only be ensured if QC measures are based on valid analytical data, stemming from representative samples. The Theory of Sampling (TOS) is the only comprehensive theoretical framework that fully defines all requirements to ensure sampling correctness and representativity, and to provide the guiding principles for sampling in practice. TOS also defines the concept of material heterogeneity and its impact on the sampling process, including the effects from all potential sampling errors. TOS's primary task is to eliminate bias-generating errors and to minimize sampling variability. Quantitative measures are provided to characterize material heterogeneity, on which an optimal sampling strategy should be based. Four critical success factors preceding analysis to ensure a representative sampling process are presented here.

  11. Thermal control extravehicular life support system

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The results of a comprehensive study which defined an Extravehicular Life Support System Thermal Control System (TCS) are presented. The design of the prototype hardware and a detailed summary of the prototype TCS fabrication and test effort are given. Several heat rejection subsystems, water management subsystems, humidity control subsystems, pressure control schemes and temperature control schemes were evaluated. Alternative integrated TCS systems were studied, and an optimum system was selected based on quantitative weighting of weight, volume, cost, complexity and other factors. The selected subsystem contains a sublimator for heat rejection, a bubble expansion tank for water management, a slurper and rotary separator for humidity control, and a pump, a temperature control valve, a gas separator and a vehicle umbilical connector for water transport. The prototype hardware complied with program objectives.

  12. From the ORFeome concept to highly comprehensive, full-genome screening libraries.

    PubMed

    Rid, Raphaela; Abdel-Hadi, Omar; Maier, Richard; Wagner, Martin; Hundsberger, Harald; Hintner, Helmut; Bauer, Johann; Onder, Kamil

    2013-02-01

    Recombination-based cloning techniques have in recent times facilitated the establishment of genome-scale single-gene ORFeome repositories. Their further handling and downstream application in a systematic fashion are, however, practically impeded by logistical and economic challenges. At this juncture, simultaneously transferring entire gene collections in a compiled pool format could represent an advanced compromise between systematic ORFeome (an organism's entire set of protein-encoding open reading frames) projects and traditional random library approaches, but this has not yet been considered in great detail. In our endeavor to merge the comprehensiveness of ORFeomes with a basically simple, streamlined, and easily executable single-tube design, we have here produced five different pooled screening-ready libraries for both Staphylococcus aureus and Homo sapiens. By evaluating the parallel transfer efficiencies of differentially sized genes from initial polymerase chain reaction (PCR) product amplification to entry and final destination library construction via quantitative real-time PCR, we found that the complexity of the gene population is fairly stably maintained once an entry resource has been successfully established, and that no apparent size-selection bias (loss of large inserts) takes place. Recombinational transfer processes are hence robust enough to straightforwardly yield such pooled screening libraries.

  13. Visualizing the qualitative: making sense of written comments from an evaluative satisfaction survey.

    PubMed

    Bletzer, Keith V

    2015-01-01

    Satisfaction surveys are common in the field of health education as a means of helping organizations improve the appropriateness of training materials and the effectiveness of facilitation and presentation. These data can be qualitative, and their analysis often becomes specialized. This technical article aims to show whether qualitative survey results can be visualized by presenting them as a Word Cloud. Qualitative materials in the form of written comments on an agency-specific satisfaction survey were coded and quantified. The resulting quantitative data were used to convert comments into "input terms" to generate Word Clouds, increasing comprehension and accessibility through visualization of the written responses. A three-tier display incorporated a Word Cloud at the top, followed by the corresponding frequency table and a textual summary of the qualitative data represented by the Word Cloud imagery. This mixed format acknowledges that people vary in which format is most effective for assimilating new information. The combination of visual representation through Word Clouds, complemented by quantified qualitative materials, is one means of increasing comprehensibility for a range of stakeholders who might not be familiar with numerical tables or statistical analyses.
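
    A minimal sketch of the three-tier idea, assuming the third-party Python wordcloud package; the coded comment terms and frequencies below are hypothetical.

        # Quantify coded comments, print the frequency table, and render a Word Cloud.
        from collections import Counter
        from wordcloud import WordCloud

        coded_comments = ["facilitator", "handouts", "pace", "facilitator", "examples",
                          "pace", "facilitator", "relevance", "examples", "handouts"]
        freq = Counter(coded_comments)              # frequency-table tier
        print(freq)

        cloud = WordCloud(width=800, height=400, background_color="white")
        cloud.generate_from_frequencies(freq)       # visual tier
        cloud.to_file("satisfaction_wordcloud.png")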

  14. HIV Rapid Testing in a VA Emergency Department Setting: Cost Analysis at 5 Years.

    PubMed

    Knapp, Herschel; Chan, Kee

    2015-07-01

    To conduct a comprehensive cost-minimization analysis characterizing the financial attributes of the first 5 years of an implementation in which emergency department (ED) registered nurses administered HIV oral rapid tests to patients. A health science research implementation team coordinated with ED stakeholders and staff to provide training, implementation guidelines, and support to launch ED registered nurse-administered HIV oral rapid testing. Deidentified quantitative data detailing quarterly HIV rapid test rates in the ED setting were gathered from the electronic medical records spanning the first 5 years. Comprehensive cost analyses were conducted to evaluate the financial impact of this implementation. At 5 years, a total of 2,620 tests had been conducted, with a quarterly mean of 131 ± 81. Despite quarterly variability in testing rates, regression analysis revealed an average increase of 3.58 tests per quarter. Over the course of this implementation, Veterans Health Administration policy transitioned from written to verbal consent for HIV testing, serving to reduce the time and costs associated with the testing process. Our data indicated salient health outcome benefits for patients with respect to the potential for earlier detection, along with associated long-run cost savings.
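
    The per-quarter trend quoted above (about 3.58 additional tests per quarter) comes from a regression over quarterly counts; a hedged sketch of that calculation on placeholder data is shown below.

        # Ordinary least-squares slope of quarterly test counts over 5 years (20 quarters).
        import numpy as np

        quarters = np.arange(20)
        tests = 60 + 3.6 * quarters + np.random.default_rng(4).normal(0, 15, 20)   # placeholder counts

        slope, intercept = np.polyfit(quarters, tests, 1)
        print(f"estimated increase ~= {slope:.2f} tests per quarter")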

  15. Computer language for identifying chemicals with comprehensive two-dimensional gas chromatography and mass spectrometry.

    PubMed

    Reichenbach, Stephen E; Kottapalli, Visweswara; Ni, Mingtian; Visvanathan, Arvind

    2005-04-15

    This paper describes a language for expressing criteria for chemical identification with comprehensive two-dimensional gas chromatography paired with mass spectrometry (GC x GC-MS) and presents computer-based tools implementing the language. The Computer Language for Identifying Chemicals (CLIC) allows expressions that describe rules (or constraints) for selecting chemical peaks or data points based on multi-dimensional chromatographic properties and mass spectral characteristics. CLIC offers chromatographic functions of retention times, functions of mass spectra, numbers for quantitative and relational evaluation, and logical and arithmetic operators. The language is demonstrated with the compound-class selection rules described by Welthagen et al. [W. Welthagen, J. Schnelle-Kreis, R. Zimmermann, J. Chromatogr. A 1019 (2003) 233-249]. A software implementation of CLIC provides a calculator-like graphical user interface (GUI) for building and applying selection expressions. From the selection calculator, expressions can be used to select chromatographic peaks that meet the criteria or to create selection chromatograms that mask data points inconsistent with the criteria. Selection expressions can be combined with graphical, geometric constraints in the retention-time plane as a powerful component for chemical identification with template matching, or used to speed and improve mass spectrum library searches.
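
    CLIC itself is not reproduced in the record, but the idea of a selection expression combining chromatographic and mass-spectral criteria can be sketched as follows. The peak fields, retention-time window, and diagnostic-ion rule are hypothetical illustrations, not the actual CLIC syntax or the Welthagen et al. rules.

        # Minimal sketch of a CLIC-style selection rule: keep GCxGC-MS peaks whose
        # retention times fall inside a window and whose spectrum contains a
        # sufficiently intense diagnostic ion (m/z 91 here, chosen arbitrarily).
        peaks = [
            {"rt1": 22.4, "rt2": 1.8, "spectrum": {91: 100, 92: 35, 134: 20}},
            {"rt1": 35.0, "rt2": 3.2, "spectrum": {57: 100, 71: 80, 85: 60}},
        ]

        def select(peak):
            in_window = 20.0 <= peak["rt1"] <= 30.0 and 1.0 <= peak["rt2"] <= 2.5
            diagnostic_ion = peak["spectrum"].get(91, 0) >= 50   # relative intensity
            return in_window and diagnostic_ion

        selected = [p for p in peaks if select(p)]
        print(f"{len(selected)} of {len(peaks)} peaks meet the selection criteria")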

  16. A Comprehensive Analysis in Terms of Molecule-Intrinsic, Quasi-Atomic Orbitals. III. The Covalent Bonding Structure of Urea.

    PubMed

    West, Aaron C; Schmidt, Michael W; Gordon, Mark S; Ruedenberg, Klaus

    2015-10-15

    The analysis of molecular electron density matrices in terms of quasi-atomic orbitals, which was developed in previous investigations, is quantitatively exemplified by a detailed application to the urea molecule. The analysis is found to identify strong and weak covalent bonding interactions as well as intramolecular charge transfers. It yields a qualitative as well as quantitative ab initio description of the bonding structure of this molecule, which raises questions regarding some traditional rationalizations.

  17. Health and Environmental Effects Profile for benzotrichloride

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1986-07-01

    The Health and Environmental Effects Profile for benzotrichloride was prepared to support listings of hazardous constituents of a wide range of waste streams under Section 3001 of the Resource Conservation and Recovery Act (RCRA) and to provide health-related limits for emergency actions under Section 101 of the Comprehensive Environmental Response, Compensation and Liability Act (CERCLA). Both published literature and information obtained from Agency program office files were evaluated as they pertained to potential human health, aquatic life and environmental effects of hazardous waste constituents. Quantitative estimates are presented where sufficient data are available. Benzotrichloride has been evaluated as a carcinogen. The human carcinogen potency factor for benzotrichloride is 12.63 (mg/kg/day) for oral exposure. The Reportable Quantity (RQ) value of 1, 10, 100, 1000 or 5000 pounds is used to determine the quantity of a hazardous substance for which notification is required in the event of a release as specified by CERCLA based on chronic toxicity. The RQ value for benzotrichloride is 10.

  18. An integrated framework for evaluating the effects of deforestation on ecosystem services

    NASA Astrophysics Data System (ADS)

    Song, X. P.; Huang, C.; Townshend, J. R.

    2014-03-01

    Deforestation often results in massive carbon emissions and loss of ecosystem services. The objective of this paper is to develop an integrated approach to quantitatively derive changes in forest carbon stock and changes in the economic value of forest carbon due to deforestation. Combining the best available remote sensing and socioeconomic datasets, this approach establishes a comprehensive baseline of deforestation in terms of area, carbon and monetary value change. We applied this end-to-end evaluation method in the Brazilian state of Rondonia to assess the ecological and economic effects of its recent deforestation from 2000 to 2005. Our results suggest that deforestation occurred at an average rate of 2834 km2/yr during the study period, leading to 31 TgC/yr of "committed carbon emissions" from deforestation. Coupled with a social cost of carbon of $23/tC and a market discount rate of 7%, this translates to a loss of 622 million U.S. dollars/yr in the economic value of forest carbon.
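
    The record couples committed carbon emissions with a social cost of carbon and a discount rate. A minimal sketch of that kind of valuation is shown below; the emission-release schedule is a hypothetical simplification (the paper's actual committed-emission accounting is not reproduced here), so the output is not expected to match the reported 622 million dollars/yr.

        # Minimal sketch: present value of one year's committed carbon emissions,
        # assuming (hypothetically) an even release over a decade.
        annual_emissions_tC = 31e6      # 31 TgC/yr committed emissions (from the record)
        scc_usd_per_tC = 23.0           # social cost of carbon (from the record)
        discount_rate = 0.07            # market discount rate (from the record)
        release_years = 10              # hypothetical release horizon

        per_year = annual_emissions_tC / release_years
        present_value = sum(per_year * scc_usd_per_tC / (1 + discount_rate) ** t
                            for t in range(1, release_years + 1))
        print(f"present value: {present_value / 1e6:.0f} million USD per year of deforestation")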

  19. Environmental Health Practice: Statistically Based Performance Measurement

    PubMed Central

    Enander, Richard T.; Gagnon, Ronald N.; Hanumara, R. Choudary; Park, Eugene; Armstrong, Thomas; Gute, David M.

    2007-01-01

    Objectives. State environmental and health protection agencies have traditionally relied on a facility-by-facility inspection-enforcement paradigm to achieve compliance with government regulations. We evaluated the effectiveness of a new approach that uses a self-certification random sampling design. Methods. Comprehensive environmental and occupational health data from a 3-year statewide industry self-certification initiative were collected from representative automotive refinishing facilities located in Rhode Island. Statistical comparisons between baseline and postintervention data facilitated a quantitative evaluation of statewide performance. Results. The analysis of field data collected from 82 randomly selected automotive refinishing facilities showed statistically significant improvements (P<.05, Fisher exact test) in 4 major performance categories: occupational health and safety, air pollution control, hazardous waste management, and wastewater discharge. Statistical significance was also shown when a modified Bonferroni adjustment for multiple comparisons was performed. Conclusions. Our findings suggest that the new self-certification approach to environmental and worker protection is effective and can be used as an adjunct to further enhance state and federal enforcement programs. PMID:17267709
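
    The statistical comparison described in the record (Fisher exact tests across four performance categories, followed by a Bonferroni-type adjustment) can be sketched as follows. The baseline and post-intervention compliance counts are hypothetical placeholders, not the Rhode Island data.

        # Minimal sketch: Fisher exact test per performance category with a simple
        # Bonferroni correction for multiple comparisons. Counts are illustrative only.
        from scipy.stats import fisher_exact

        categories = {
            # category: ((baseline compliant, baseline non-compliant),
            #            (post compliant, post non-compliant))
            "occupational health and safety": ((45, 37), (68, 14)),
            "air pollution control":          ((50, 32), (70, 12)),
            "hazardous waste management":     ((40, 42), (65, 17)),
            "wastewater discharge":           ((55, 27), (72, 10)),
        }

        alpha = 0.05
        adjusted_alpha = alpha / len(categories)   # Bonferroni-adjusted threshold
        for name, (baseline, post) in categories.items():
            _, p = fisher_exact([list(baseline), list(post)])
            print(f"{name:32s} p={p:.4f}  significant={p < adjusted_alpha}")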

  20. Evaluation of normalization methods in mammalian microRNA-Seq data

    PubMed Central

    Garmire, Lana Xia; Subramaniam, Shankar

    2012-01-01

    Simple total tag count normalization is inadequate for microRNA sequencing data generated by next-generation sequencing technology. However, a systematic evaluation of normalization methods for microRNA sequencing data has so far been lacking. We comprehensively evaluate seven commonly used normalization methods including global normalization, Lowess normalization, Trimmed Mean Method (TMM), quantile normalization, scaling normalization, variance stabilization, and the invariant method. We assess these methods on two individual experimental data sets with the empirical statistical metrics of mean square error (MSE) and the Kolmogorov-Smirnov (K-S) statistic. Additionally, we evaluate the methods with results from quantitative PCR validation. Our results consistently show that Lowess normalization and quantile normalization perform the best, whereas TMM, a method applied to RNA-Seq normalization, performs the worst. The poor performance of TMM normalization is further evidenced by abnormal results from the test of differential expression (DE) of microRNA-Seq data. Compared with the choice of model used for DE, the choice of normalization method is the primary factor affecting DE results. In summary, Lowess normalization and quantile normalization are recommended for normalizing microRNA-Seq data, whereas the TMM method should be used with caution. PMID:22532701
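
    Quantile normalization, one of the two methods the record recommends, forces every sample to share the same count distribution. A minimal numpy sketch is shown below on a hypothetical count matrix (ties ignored); it is not the study's pipeline.

        # Minimal sketch of quantile normalization: rows = microRNAs, columns = samples.
        # The counts are illustrative placeholders; tie handling is omitted for brevity.
        import numpy as np

        counts = np.array([[120.,  80., 200.],
                           [ 15.,  30.,  10.],
                           [500., 420., 610.],
                           [ 60.,  55.,  90.]])

        ranks = counts.argsort(axis=0).argsort(axis=0)     # rank of each value per sample
        rank_means = np.sort(counts, axis=0).mean(axis=1)  # mean value at each rank
        normalized = rank_means[ranks]                     # substitute rank means back

        print(normalized)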

  1. Radiologic Assessment of Patellofemoral Pain in the Athlete

    PubMed Central

    Endo, Yoshimi; Stein, Beth E. Shubin; Potter, Hollis G.

    2011-01-01

    Context: Although disorders of the patellofemoral joint are common in the athlete, their management can be challenging and require a thorough physical examination and radiologic evaluation, including advanced magnetic resonance imaging techniques. Evidence Acquisition: Relevant articles were searched under OVID and MEDLINE (1968 to 2010) using the keywords patellofemoral joint, patellofemoral pain or patella and radiography, imaging, or magnetic resonance imaging, and the referenced sources were reviewed for additional articles. The quality and validity of the studies were assessed on the basis of careful analysis of the materials and methods before their inclusion in this article. Results: Physical examination and imaging evaluation including standard radiographs are crucial in identifying evidence of malalignment or instability. Magnetic resonance imaging provides valuable information about concomitant soft tissue injuries to the medial stabilizers as well as injuries to the articular cartilage, including chondral shears and osteochondral fractures. Quantitative magnetic resonance imaging assessing the ultrastructure of cartilage has shown high correlation with histology and may be useful for timing surgery. Conclusions: Evaluation of patellofemoral disorders is complex and requires a comprehensive assessment. Recent advancements in imaging have made possible a more precise evaluation of the individual anatomy of the patient, addressing issues of malalignment, instability, and underlying cartilage damage. PMID:23016009

  2. Review of Comprehensive Evaluation Methods for Power Quality and Its Trend in New Generation Energy System

    NASA Astrophysics Data System (ADS)

    Liu, Ruihua; Wang, Rong; Liu, Qunying; Yang, Li; Xi, Chuan; Wang, Wei; Li, Lingzhou; Zhao, Zhoufang; Zhou, Ying

    2018-02-01

    With China's new energy generation grid-connected capacity at the forefront of the world, and given the uncertainty of new energy sources such as wind energy and solar energy, it is of great significance to study the scientific and comprehensive assessment of power quality. Building on a systematic and objective analysis of the current power quality index system, power quality analysis methods for new energy grids, and comprehensive evaluation methods, this paper tentatively explores the trend of comprehensive power quality evaluation in the new generation of energy systems.

  3. 77 FR 8298 - Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-14

    ... a quantitative, comprehensive or even representative survey or study of the burdens associated with... existing registered money market fund: Documentation of credit risk analyses, and determinations regarding... for documentation of credit analyses and other determinations) = 2340 responses. (30 funds x 12...

  4. Comprehensive Quantitative Model of Inner-Magnetosphere Dynamics

    NASA Technical Reports Server (NTRS)

    Wolf, Richard A.

    2002-01-01

    This report includes descriptions of papers, a thesis, and works still in progress which cover observations of space weather in the Earth's magnetosphere. The topics discussed include: 1) modelling of magnetosphere activity; 2) magnetic storms; 3) high energy electrons; and 4) plasmas.

  5. Red Meat and Colorectal Cancer: A Quantitative Update on the State of the Epidemiologic Science

    PubMed Central

    Alexander, Dominik D.; Weed, Douglas L.; Miller, Paula E.; Mohamed, Muhima A.

    2015-01-01

    The potential relationship between red meat consumption and colorectal cancer (CRC) has been the subject of scientific debate. Given the high degree of resulting uncertainty, our objective was to update the state of the science by conducting a systematic quantitative assessment of the epidemiologic literature. Specifically, we updated and expanded our previous meta-analysis by integrating data from new prospective cohort studies and conducting a broader evaluation of the relative risk estimates by specific intake categories. Data from 27 independent prospective cohort studies were meta-analyzed using random-effects models, and sources of potential heterogeneity were examined through subgroup and sensitivity analyses. In addition, a comprehensive evaluation of potential dose-response patterns was conducted. In the meta-analysis of all cohorts, a weakly elevated summary relative risk was observed (1.11, 95% CI: 1.03–1.19); however, statistically significant heterogeneity was present. In general, summary associations were attenuated (closer to the null and less heterogeneous) in models that isolated fresh red meat (from processed meat), adjusted for more relevant factors, analyzed women only, and were conducted in countries outside of the United States. Furthermore, no clear patterns of dose-response were apparent. In conclusion, the state of the epidemiologic science on red meat consumption and CRC is best described in terms of weak associations, heterogeneity, an inability to disentangle effects from other dietary and lifestyle factors, lack of a clear dose-response effect, and weakening evidence over time.
    Key Teaching Points:
    • The role of red meat consumption in colorectal cancer risk has been widely contested among the scientific community.
    • In the current meta-analysis of red meat intake and colorectal cancer, we comprehensively examined associations by creating numerous sub-group stratifications, conducting extensive sensitivity analyses, and evaluating dose-response using several different methods.
    • Overall, all summary associations were weak in magnitude with no clear dose-response patterns.
    • Interpretation of findings from epidemiologic studies investigating diet and health outcomes involves numerous methodological considerations, such as accurately measuring food intake, dietary pattern differences across populations, food definitions, outcome classifications, bias and confounding, multicollinearity, biological mechanisms, genetic variation in metabolizing enzymes, and differences in analytical metrics and statistical testing parameters.
    PMID:25941850
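
    The pooled relative risk quoted in the record (1.11, 95% CI: 1.03-1.19) comes from a random-effects meta-analysis. A minimal DerSimonian-Laird sketch on the log-relative-risk scale is shown below; the study-level estimates are hypothetical placeholders, not the 27 cohorts analyzed in the paper.

        # Minimal DerSimonian-Laird random-effects meta-analysis on the log-RR scale.
        # Study relative risks and upper confidence limits are illustrative only.
        import numpy as np

        rr       = np.array([1.05, 1.20, 0.95, 1.15, 1.30])   # hypothetical study RRs
        ci_upper = np.array([1.25, 1.50, 1.10, 1.40, 1.80])   # hypothetical upper 95% limits

        y = np.log(rr)
        se = (np.log(ci_upper) - y) / 1.96            # standard error of log-RR from the CI
        w = 1.0 / se**2                               # fixed-effect (inverse-variance) weights

        y_fixed = np.sum(w * y) / np.sum(w)
        q = np.sum(w * (y - y_fixed) ** 2)            # Cochran's Q
        c = np.sum(w) - np.sum(w**2) / np.sum(w)
        tau2 = max(0.0, (q - (len(y) - 1)) / c)       # between-study variance (DL estimator)

        w_re = 1.0 / (se**2 + tau2)                   # random-effects weights
        y_re = np.sum(w_re * y) / np.sum(w_re)
        se_re = np.sqrt(1.0 / np.sum(w_re))
        lo, hi = np.exp(y_re - 1.96 * se_re), np.exp(y_re + 1.96 * se_re)
        print(f"summary RR = {np.exp(y_re):.2f} (95% CI {lo:.2f}-{hi:.2f})")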

  6. TranslatomeDB: a comprehensive database and cloud-based analysis platform for translatome sequencing data.

    PubMed

    Liu, Wanting; Xiang, Lunping; Zheng, Tingkai; Jin, Jingjie; Zhang, Gong

    2018-01-04

    Translation is a key regulatory step linking the transcriptome and the proteome. Two major methods of translatome investigation are RNC-seq (sequencing of translating mRNA) and Ribo-seq (ribosome profiling). To facilitate the investigation of translation, we built a comprehensive database, TranslatomeDB (http://www.translatomedb.net/), which provides collection and integrated analysis of published and user-generated translatome sequencing data. The current version includes 2453 Ribo-seq, 10 RNC-seq and their 1394 corresponding mRNA-seq datasets in 13 species. The database emphasizes analysis functions in addition to dataset collection. Differential gene expression (DGE) analysis can be performed between any two datasets of the same species and type, on both the transcriptome and translatome levels. Translation indices (the translation ratio, elongation velocity index, and translational efficiency) can be calculated to quantitatively evaluate translation initiation efficiency and elongation velocity. All datasets were analyzed using a unified, robust, accurate and experimentally verifiable pipeline based on the FANSe3 mapping algorithm and edgeR for DGE analyses. TranslatomeDB also allows users to upload their own datasets and analyze them with the identical unified pipeline. We believe that TranslatomeDB is a comprehensive platform and knowledgebase for translatome and proteome research, freeing biologists from the complex searching, analysis, and comparison of large sequencing datasets without requiring local computational power. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
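
    Of the translation indices named in the record, translational efficiency is commonly expressed as a per-gene ratio of translating-mRNA abundance to total mRNA abundance. A minimal sketch is given below; the expression values are hypothetical placeholders, and this is not TranslatomeDB's FANSe3/edgeR pipeline.

        # Minimal sketch: translational efficiency (TE) as RNC-seq (or Ribo-seq) RPKM
        # divided by mRNA-seq RPKM for the same gene. Values are illustrative only.
        rnc_rpkm  = {"GENE_A": 120.0, "GENE_B": 15.0, "GENE_C": 300.0}
        mrna_rpkm = {"GENE_A":  60.0, "GENE_B": 45.0, "GENE_C": 150.0}

        pseudocount = 0.1    # avoids division by zero for weakly expressed genes
        for gene in rnc_rpkm:
            te = (rnc_rpkm[gene] + pseudocount) / (mrna_rpkm[gene] + pseudocount)
            print(f"{gene}: TE = {te:.2f}")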

  7. A Research Agenda for Radiation Oncology: Results of the Radiation Oncology Institute's Comprehensive Research Needs Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jagsi, Reshma, E-mail: rjagsi@med.umich.edu; Bekelman, Justin E.; Brawley, Otis W.

    Purpose: To promote the rational use of scarce research funding, scholars have developed methods for the systematic identification and prioritization of health research needs. The Radiation Oncology Institute commissioned an independent, comprehensive assessment of research needs for the advancement of radiation oncology care. Methods and Materials: The research needs assessment used a mixed-method, qualitative and quantitative social scientific approach, including structured interviews with diverse stakeholders, focus groups, surveys of American Society for Radiation Oncology (ASTRO) members, and a prioritization exercise using a modified Delphi technique. Results: Six co-equal priorities were identified: (1) Identify and develop communication strategies to help patients and others better understand radiation therapy; (2) Establish a set of quality indicators for major radiation oncology procedures and evaluate their use in radiation oncology delivery; (3) Identify best practices for the management of radiation toxicity and issues in cancer survivorship; (4) Conduct comparative effectiveness studies related to radiation therapy that consider clinical benefit, toxicity (including quality of life), and other outcomes; (5) Assess the value of radiation therapy; and (6) Develop a radiation oncology registry. Conclusions: To our knowledge, this prioritization exercise is the only comprehensive and methodologically rigorous assessment of research needs in the field of radiation oncology. Broad dissemination of these findings is critical to maximally leverage the impact of this work, particularly because grant funding decisions are often made by committees on which highly specialized disciplines such as radiation oncology are not well represented.

  8. Shared Decision-Making for Nursing Practice: An Integrative Review.

    PubMed

    Truglio-Londrigan, Marie; Slyer, Jason T

    2018-01-01

    Shared decision-making has received national and international interest from providers, educators, researchers, and policy makers. The literature on shared decision-making is extensive, dealing with the individual components of shared decision-making rather than a comprehensive process. This view of shared decision-making leaves healthcare providers to wonder how to integrate shared decision-making into practice. To understand shared decision-making as a comprehensive process from the perspective of the patient and provider in all healthcare settings. An integrative review was conducted applying a systematic approach involving a literature search, data evaluation, and data analysis. The search included articles from PubMed, CINAHL, the Cochrane Central Register of Controlled Trials, and PsycINFO from 1970 through 2016. Articles included quantitative experimental and non-experimental designs, qualitative studies, and theoretical articles about shared decision-making between all healthcare providers and patients in all healthcare settings. Fifty-two papers were included in this integrative review. Three categories emerged from the synthesis: (a) communication/relationship building; (b) working towards a shared decision; and (c) action for shared decision-making. Each major theme contained sub-themes represented in the proposed visual representation for shared decision-making. A comprehensive understanding of shared decision-making between the nurse and the patient was identified. A visual representation offers a guide that depicts shared decision-making as a process taking place during a healthcare encounter with implications for the continuation of shared decisions over time, offering patients an opportunity to return to the nurse for reconsiderations of past shared decisions.

  9. Comprehensive evaluation of candidate reference genes for gene expression studies in Lysiphlebia japonica (Hymenoptera: Aphidiidae) using RT-qPCR.

    PubMed

    Gao, Xue-Ke; Zhang, Shuai; Luo, Jun-Yu; Wang, Chun-Yi; Lü, Li-Min; Zhang, Li-Juan; Zhu, Xiang-Zhen; Wang, Li; Lu, Hui; Cui, Jin-Jie

    2017-12-30

    Lysiphlebia japonica (Ashmead) is a predominant parasitoid of cotton-melon aphids in the fields of northern China with a proven ability to effectively control cotton aphid populations in early summer. For accurate normalization of gene expression in L. japonica using quantitative reverse transcriptase-polymerase chain reaction (RT-qPCR), reference genes with stable gene expression patterns are essential. However, no appropriate reference genes in L. japonica have been investigated to date. In the present study, 12 selected housekeeping genes from L. japonica were cloned. We evaluated the stability of these genes under various experimental treatments by RT-qPCR using four independent (geNorm, NormFinder, BestKeeper and Delta Ct) and one comparative (RefFinder) algorithm. We identified the genes showing the most stable levels of expression: DIMT, 18S rRNA, and RPL13 across developmental stages; AK, RPL13, and TBP between sexes; EF1A, PPI, and RPL27 in different tissues; and EF1A, RPL13, and PPI in adults fed on different diets. Moreover, the expression profile of a target gene (odorant receptor 1, OR1) studied during the developmental stages confirms the reliability of the selected reference genes. This study provides for the first time a comprehensive list of suitable reference genes for gene expression studies in L. japonica and will benefit subsequent genomics and functional genomics research on this natural enemy. Copyright © 2017. Published by Elsevier B.V.
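
    One of the algorithms named in the record, the comparative delta-Ct method, ranks candidate reference genes by the variability of their pairwise Ct differences across samples. A minimal sketch with hypothetical Ct values (not L. japonica measurements) follows.

        # Minimal sketch of the comparative delta-Ct method: a candidate gene is scored by
        # the mean standard deviation of its pairwise delta-Ct values across samples;
        # lower scores indicate more stable expression. Ct values are illustrative only.
        import numpy as np

        ct = {   # gene -> Ct values measured in the same set of samples
            "EF1A":     np.array([20.1, 20.3, 20.0, 20.4]),
            "RPL13":    np.array([22.5, 22.8, 22.4, 22.9]),
            "18S_rRNA": np.array([10.2, 11.5, 10.0, 12.1]),
        }

        stability = {}
        for gene in ct:
            sds = [np.std(ct[gene] - ct[other], ddof=1) for other in ct if other != gene]
            stability[gene] = float(np.mean(sds))

        for gene, score in sorted(stability.items(), key=lambda kv: kv[1]):
            print(f"{gene:10s} mean pairwise delta-Ct SD = {score:.2f}")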

  10. Comprehensive automatic assessment of retinal vascular abnormalities for computer-assisted retinopathy grading.

    PubMed

    Joshi, Vinayak; Agurto, Carla; VanNess, Richard; Nemeth, Sheila; Soliz, Peter; Barriga, Simon

    2014-01-01

    One of the most important signs of systemic disease that presents on the retina is vascular abnormality, such as in hypertensive retinopathy. Manual analysis of fundus images by human readers is qualitative and lacks accuracy, consistency, and repeatability. Present semi-automatic methods for vascular evaluation are reported to increase accuracy and reduce reader variability, but they require extensive reader interaction, thus limiting software-aided efficiency. Automation therefore holds a twofold promise: first, to decrease variability while increasing accuracy, and second, to increase efficiency. In this paper we propose fully automated software as a second-reader system for comprehensive assessment of the retinal vasculature, which aids readers in the quantitative characterization of vessel abnormalities in fundus images. This system provides the reader with objective measures of vascular morphology such as tortuosity and branching angles, as well as highlights of areas with abnormalities such as artery-venous nicking, copper and silver wiring, and retinal emboli, in order for the reader to make a final screening decision. To test the efficacy of our system, we evaluated the change in performance of a newly certified retinal reader when grading a set of 40 color fundus images with and without the assistance of the software. The results demonstrated an improvement in the reader's performance with the software assistance, in terms of accuracy of detection of vessel abnormalities, determination of retinopathy, and reading time. This system enables the reader to make a computer-assisted vasculature assessment with high accuracy and consistency, at a reduced reading time.
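
    One of the objective vessel measures listed in the record is tortuosity. A common simple definition, the arc length of a traced centerline divided by its chord length, is sketched below with hypothetical centerline points; it is not necessarily the exact metric implemented in the described software.

        # Minimal sketch: distance-metric tortuosity of a vessel centerline,
        # arc length / straight-line (chord) length. Coordinates are placeholders.
        import numpy as np

        centerline = np.array([[0, 0], [1, 2], [2, 3], [4, 3], [5, 5]], dtype=float)

        segment_lengths = np.linalg.norm(np.diff(centerline, axis=0), axis=1)
        arc_length = segment_lengths.sum()
        chord_length = np.linalg.norm(centerline[-1] - centerline[0])

        tortuosity = arc_length / chord_length
        print(f"tortuosity = {tortuosity:.3f}  (1.0 = perfectly straight vessel)")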

  11. An Optimized Informatics Pipeline for Mass Spectrometry-Based Peptidomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Chaochao; Monroe, Matthew E.; Xu, Zhe

    2015-12-26

    Comprehensive MS analysis of the peptidome, the intracellular and intercellular products of protein degradation, has the potential to provide novel insights into endogenous proteolytic processing and its utility in disease diagnosis and prognosis. Along with the advances in MS instrumentation, a plethora of proteomics data analysis tools have been applied for direct use in peptidomics; however, an evaluation of the currently available informatics pipelines for peptidomics data analysis has yet to be reported. In this study, we began by evaluating the results of several popular MS/MS database search engines including MS-GF+, SEQUEST and MS-Align+ for peptidomics data analysis, followed by identification and label-free quantification using the well-established accurate mass and time (AMT) tag and newly developed informed quantification (IQ) approaches, both based on direct LC-MS analysis. Our results demonstrated that MS-GF+ outperformed both SEQUEST and MS-Align+ in identifying peptidome peptides. Using a database established from the MS-GF+ peptide identifications, both the AMT tag and IQ approaches provided significantly deeper peptidome coverage and fewer missing values for each individual data set than the MS/MS methods, while achieving robust label-free quantification. Besides having an excellent correlation with the AMT tag quantification results, IQ also provided slightly higher peptidome coverage than AMT. Taken together, we propose an optimal informatics pipeline combining MS-GF+ for initial database searching with IQ (or AMT) for identification and label-free quantification for high-throughput, comprehensive and quantitative peptidomics analysis.

  12. The Study of Literacy Reinforcement of Science Teachers in Implementing 2013 Curriculum

    NASA Astrophysics Data System (ADS)

    Dewi, W. S.; Festiyed, F.; Hamdi, H.; Sari, S. Y.

    2018-04-01

    This research aims to study and comprehensively collect new and current data about science literacy in order to improve the ability of educators to implement the 2013 Curriculum at junior high schools in Padang Pariaman District. The specific benefit of this research is to describe and identify problems of science literacy in the interaction among teachers, the curriculum, facilities and infrastructure, evaluation, learning technology, and students. This study uses an explorative, in-depth approach, comprehensively studying and collecting data on the interaction of education process components (curriculum, educators, learners, facilities and infrastructure, learning media technology, and evaluation) that influence science literacy. This research was conducted in Padang Pariaman District, which consists of 17 subdistricts and 84 junior high schools, both government-run and private. The sample consists of science teachers in Padang Pariaman District, selected by stratified random sampling. The instrument used in this study is a questionnaire administered to the respondents. Questionnaire data were processed using percentage (quantitative) techniques. The results of this study show that science teachers' understanding of the implementation of the 2013 Curriculum in Padang Pariaman District is still lacking. The science teachers of Padang Pariaman District have not yet understood the scientific approach or the effectiveness of the 2013 Curriculum in shaping students' character. To improve understanding of the implementation of the 2013 Curriculum, science literacy reinforcement for junior high school science teachers in Padang Pariaman District is necessary.

  13. The role of quantitative safety evaluation in regulatory decision making of drugs.

    PubMed

    Chakravarty, Aloka G; Izem, Rima; Keeton, Stephine; Kim, Clara Y; Levenson, Mark S; Soukup, Mat

    2016-01-01

    Evaluation of safety is a critical component of drug review at the US Food and Drug Administration (FDA). Statisticians are playing an increasingly visible role in quantitative safety evaluation and regulatory decision-making. This article reviews the history and the recent events relating to quantitative drug safety evaluation at the FDA. The article then focuses on five active areas of quantitative drug safety evaluation and the role Division of Biometrics VII (DBVII) plays in these areas, namely meta-analysis for safety evaluation, large safety outcome trials, post-marketing requirements (PMRs), the Sentinel Initiative, and the evaluation of risk from extended/long-acting opioids. This article will focus chiefly on developments related to quantitative drug safety evaluation and not on the many additional developments in drug safety in general.

  14. The Evaluation of Land Use Status in Mountainous Counties of Southwest China Based on Comprehensive Evaluation Models: A Case Study of Baoxing County, Sichuan Province

    NASA Astrophysics Data System (ADS)

    Cao, Mengtian; Shen, Jundi; Chen, Zhehua

    2018-06-01

    In mountainous areas of Southwest China, land resources are scarce and the ecological environment is fragile, so it is particularly important to evaluate land use status for the sustainability of land development. Taking Baoxing County, Sichuan Province, a typical mountainous county in Southwest China, as an example, this study draws on existing research frameworks to establish an evaluation system for land use status. Comprehensive evaluation models are then used to evaluate land use status. As indicated by the results, in Baoxing County the comprehensive evaluation score of the overall status of land use, the evaluation score of the development degree of land, the evaluation score of the intensive management degree of land, and the evaluation score of the comprehensive benefits of land were 83.5, 108.24, 72.25 and 80.77, respectively. Land use is generally at the relatively rational use stage, and the main problems are the lack of land investment and the low mechanization level of agricultural production. It is suggested that financial investment in land be increased and that the intensive use and comprehensive benefits of land be enhanced in the future.
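
    Comprehensive evaluation scores of the kind quoted in the record are typically computed as a weighted sum of normalized indicator scores. A minimal sketch with hypothetical indicators and weights (the study's actual index system and weighting are not reproduced) is shown below.

        # Minimal sketch of a weighted comprehensive evaluation score:
        # score = 100 * sum(weight_i * normalized_indicator_i).
        # Indicator names, values, and weights are illustrative placeholders.
        indicators = {   # indicator -> (normalized value in [0, 1], weight)
            "land development degree":        (0.90, 0.35),
            "intensive management degree":    (0.72, 0.35),
            "comprehensive land-use benefit": (0.81, 0.30),
        }

        total_weight = sum(weight for _, weight in indicators.values())
        assert abs(total_weight - 1.0) < 1e-9   # weights should sum to one

        score = 100 * sum(value * weight for value, weight in indicators.values())
        print(f"comprehensive evaluation score = {score:.1f}")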

  15. Preparing pharmacists to deliver a targeted service in hypertension management: evaluation of an interprofessional training program.

    PubMed

    Bajorek, Beata V; Lemay, Kate S; Magin, Parker J; Roberts, Christopher; Krass, Ines; Armour, Carol L

    2015-09-28

    Non-adherence to medicines by patients and suboptimal prescribing by clinicians underpin poor blood pressure (BP) control in hypertension. In this study, a training program was designed to enable community pharmacists to deliver a service in hypertension management targeting therapeutic adjustments and medication adherence. A comprehensive evaluation of the training program was undertaken. Tailored training comprising a self-directed pre-work manual, practical workshop (using real patients), and practice scenarios, was developed and delivered by an inter-professional team (pharmacists, GPs). Supported by practical and written assessment, the training focused on the principles of BP management, BP measurement skills, and adherence strategies. Pharmacists' experience of the training (expectations, content, format, relevance) was evaluated quantitatively and qualitatively. Immediate feedback was obtained via a questionnaire comprising Likert scales (1 = "very well" to 7 = "poor") and open-ended questions. Further in-depth qualitative evaluation was undertaken via semi-structured interviews several months post-training (and post service implementation). Seventeen pharmacists were recruited, trained and assessed as competent. All were highly satisfied with the training; other than the 'amount of information provided' (median score = 5, "just right"), all aspects of training attained the most positive score of '1'. Pharmacists most valued the integrated team-based approach, GP involvement, and inclusion of real patients, as well as the pre-reading manual, BP measurement workshop, and case studies (simulation). Post-implementation the interviews highlighted that comprehensive training increased pharmacists' confidence in providing the service, however, training of other pharmacy staff and patient recruitment strategies were highlighted as a need in future. Structured, multi-modal training involving simulated and inter-professional learning is effective in preparing selected community pharmacists for the implementation of new services in the context of hypertension management. This training could be further enhanced to prepare pharmacists for the challenges encountered in implementing and evaluating services in practice.

  16. Quantitative phosphoproteomic analysis of caprine muscle with high and low meat quality.

    PubMed

    Liu, Manshun; Wei, Yanchao; Li, Xin; Quek, Siew Young; Zhao, Jing; Zhong, Huazhen; Zhang, Dequan; Liu, Yongfeng

    2018-07-01

    During the conversion of muscle to meat, protein phosphorylation can regulate various biological processes that have important effects on meat quality. To investigate protein phosphorylation patterns at rigor mortis, goat longissimus thoracis and external intercostal muscles were classified into two groups (high quality and low quality), and meat quality was evaluated according to meat quality attributes (Warner-Bratzler shear force, color, pH and drip loss). A quantitative mass spectrometry-based phosphoproteomic study was conducted to analyze the caprine muscle at 12 h postmortem, applying the TiO2-SIMAC-HILIC (TiSH) phosphopeptide enrichment strategy. A total of 2125 phosphopeptides were identified from 750 phosphoproteins. Among them, 96 proteins differed in phosphorylation level. The majority of these proteins are involved in glucose metabolism and muscle contraction. The differential phosphorylation levels of proteins (PFK, MYL2 and HSP27) between the two groups may be crucial factors regulating muscle rigor mortis. This study provides a comprehensive view of the phosphorylation status of caprine muscle at rigor mortis and gives a better understanding of how protein phosphorylation regulates the various biological processes that affect final meat quality attributes. Copyright © 2018. Published by Elsevier Ltd.

  17. Evaluation of body-wise and organ-wise registrations for abdominal organs

    NASA Astrophysics Data System (ADS)

    Xu, Zhoubing; Panjwani, Sahil A.; Lee, Christopher P.; Burke, Ryan P.; Baucom, Rebeccah B.; Poulose, Benjamin K.; Abramson, Richard G.; Landman, Bennett A.

    2016-03-01

    Identifying cross-sectional and longitudinal correspondence in the abdomen on computed tomography (CT) scans is necessary for quantitatively tracking change and understanding population characteristics, yet abdominal image registration is a challenging problem. The key difficulty in solving this problem is the huge variation in organ dimensions and shapes across subjects. The current standard registration method uses the global, or body-wise, registration technique, which is based on the global topology for alignment. This method, although producing decent results, is substantially influenced by outliers, thus leaving room for significant improvement. Here, we study a new image registration approach using local (organ-wise) registration, first creating organ-specific bounding boxes and then using these regions of interest (ROIs) to align references to the target. Based on the Dice Similarity Coefficient (DSC), Mean Surface Distance (MSD) and Hausdorff Distance (HD), the organ-wise approach is demonstrated to have significantly better results by minimizing the distorting effects of organ variations. This paper compares exclusively the two registration methods, providing novel quantitative and qualitative comparison data, and addresses a subset of the more comprehensive problem of improving multi-atlas segmentation through organ normalization.
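
    The comparison in the record relies on overlap and surface metrics such as the Dice Similarity Coefficient (DSC). A minimal sketch of DSC on hypothetical binary organ masks is given below.

        # Minimal sketch: Dice Similarity Coefficient between two binary organ masks,
        # DSC = 2 * |A intersect B| / (|A| + |B|). The masks are illustrative placeholders.
        import numpy as np

        reference = np.zeros((8, 8), dtype=bool)
        reference[2:6, 2:6] = True       # "ground truth" organ mask
        registered = np.zeros((8, 8), dtype=bool)
        registered[3:7, 2:6] = True      # mask after a (hypothetical) registration

        intersection = np.logical_and(reference, registered).sum()
        dsc = 2.0 * intersection / (reference.sum() + registered.sum())
        print(f"DSC = {dsc:.3f}  (1.0 = perfect overlap)")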

  18. A Constitutive Relationship for Gravelly Soil Considering Fine Particle Suffusion

    PubMed Central

    Zhang, Yuning; Chen, Yulong

    2017-01-01

    Suffusion erosion may occur in sandy gravel dam foundations that use suspended cutoff walls. This erosion causes a loss of fine particles, degrades the soil strength and deformation moduli, and adversely impacts the cutoff walls of the dam foundation, as well as the overlying dam body. A comprehensive evaluation of these effects requires models that quantitatively describe the effects of fine particle losses on the stress-strain relationships of sandy gravels. In this work, we propose an experimental scheme for studying these types of models, and then perform triaxial and confined compression tests to determine the effects of particle losses on the stress-strain relationships. Considering the Duncan-Chang E-B model, quantitative expressions describing the relationship between the parameters of the model and the particle losses were derived. The results show that particle losses did not alter the qualitative stress-strain characteristics of the soils; however, the soil strength and deformation moduli were degraded. By establishing the relationship between the parameters of the model and the losses, the same model can then be used to describe the relationship between sandy gravels and erosion levels that vary in both time and space. PMID:29065532

  19. Quantitative assessment of heteroplasmy of mitochondrial genome: perspectives in diagnostics and methodological pitfalls.

    PubMed

    Sobenin, Igor A; Mitrofanov, Konstantin Y; Zhelankin, Andrey V; Sazonova, Margarita A; Postnov, Anton Y; Revin, Victor V; Bobryshev, Yuri V; Orekhov, Alexander N

    2014-01-01

    The role of alterations of mitochondrial DNA (mtDNA) in the development of human pathologies is not well understood. Most mitochondrial mutations are characterized by the phenomenon of heteroplasmy, which is defined as the presence of a mixture of more than one type of organellar genome within a cell or tissue. The level of heteroplasmy varies over a wide range, and the expression of disease depends on the percentage of alleles bearing mutations, supporting the assumption that an upper threshold level may exist beyond which mitochondrial function collapses. Recent findings have demonstrated that some mtDNA heteroplasmic mutations are associated with widely spread chronic diseases, including atherosclerosis and cancer. In fact, each etiological mtDNA mutation has its own heteroplasmy threshold that needs to be measured. Therefore, quantitative evaluation of a mutant allele of the mitochondrial genome is an obvious methodological challenge, since it may be a keystone for diagnosing individual genetic predisposition to the disease. This review provides a comprehensive comparison of methods applicable to the measurement of the heteroplasmy level of mitochondrial mutations associated with the development of pathology, in particular in atherosclerosis and its clinical manifestations.

  20. Tracking antibiotic resistome during wastewater treatment using high throughput quantitative PCR.

    PubMed

    An, Xin-Li; Su, Jian-Qiang; Li, Bing; Ouyang, Wei-Ying; Zhao, Yi; Chen, Qing-Lin; Cui, Li; Chen, Hong; Gillings, Michael R; Zhang, Tong; Zhu, Yong-Guan

    2018-05-08

    Wastewater treatment plants (WWTPs) contain diverse antibiotic resistance genes (ARGs) and thus are considered a major pathway for the dissemination of these genes into the environment. However, comprehensive evaluations of ARG dynamics across a broad spectrum of ARGs during the wastewater treatment process are lacking. Here, we investigated the dynamics of ARGs and bacterial community structures in 114 samples from eleven Chinese WWTPs using high-throughput quantitative PCR and 16S rRNA-based Illumina sequencing analysis. A significant shift in ARG profiles was observed, and the wastewater treatment process significantly reduced the abundance and diversity of ARGs, lowering ARG concentrations by 1-2 orders of magnitude. However, a considerable number of ARGs were detected and enriched in effluents compared with influents. In particular, seven ARGs, mainly conferring resistance to beta-lactams and aminoglycosides, and three mobile genetic elements persisted in all WWTP samples after wastewater treatment. ARG profiles varied with wastewater treatment processes, seasons and regions. This study tracked the footprint of ARGs during the wastewater treatment process, supporting assessment of the spread of ARGs from WWTPs and providing data for identifying management options to improve ARG mitigation in WWTPs. Copyright © 2018 Elsevier Ltd. All rights reserved.

  1. Exploring Valid Reference Genes for Quantitative Real-time PCR Analysis in Plutella xylostella (Lepidoptera: Plutellidae)

    PubMed Central

    Fu, Wei; Xie, Wen; Zhang, Zhuo; Wang, Shaoli; Wu, Qingjun; Liu, Yong; Zhou, Xiaomao; Zhou, Xuguo; Zhang, Youjun

    2013-01-01

    Abstract: Quantitative real-time PCR (qRT-PCR), a primary tool in gene expression analysis, requires an appropriate normalization strategy to control for variation among samples. The best option is to compare the mRNA level of a target gene with that of reference gene(s) whose expression level is stable across various experimental conditions. In this study, expression profiles of eight candidate reference genes from the diamondback moth, Plutella xylostella, were evaluated under diverse experimental conditions. RefFinder, a web-based analysis tool, integrates four major computational programs including geNorm, Normfinder, BestKeeper, and the comparative ΔCt method to comprehensively rank the tested candidate genes. Elongation factor 1 (EF1) was the most suited reference gene for the biotic factors (development stage, tissue, and strain). In contrast, although appropriate reference gene(s) do exist for several abiotic factors (temperature, photoperiod, insecticide, and mechanical injury), we were not able to identify a single universal reference gene. Nevertheless, a suite of candidate reference genes were specifically recommended for selected experimental conditions. Our finding is the first step toward establishing a standardized qRT-PCR analysis of this agriculturally important insect pest. PMID:23983612

  2. Mass spectrometry as a quantitative tool in plant metabolomics

    PubMed Central

    Jorge, Tiago F.; Mata, Ana T.

    2016-01-01

    Metabolomics is a research field used to acquire comprehensive information on the composition of a metabolite pool to provide a functional screen of the cellular state. Studies of the plant metabolome include the analysis of a wide range of chemical species with very diverse physico-chemical properties, and therefore powerful analytical tools are required for the separation, characterization and quantification of this vast compound diversity present in plant matrices. In this review, challenges in the use of mass spectrometry (MS) as a quantitative tool in plant metabolomics experiments are discussed, and important criteria for the development and validation of MS-based analytical methods provided. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644967

  3. End-to-end deep neural network for optical inversion in quantitative photoacoustic imaging.

    PubMed

    Cai, Chuangjian; Deng, Kexin; Ma, Cheng; Luo, Jianwen

    2018-06-15

    An end-to-end deep neural network, ResU-net, is developed for quantitative photoacoustic imaging. A residual learning framework is used to facilitate optimization and to gain better accuracy from considerably increased network depth. The contracting and expanding paths enable ResU-net to extract comprehensive context information from multispectral initial pressure images and, subsequently, to infer a quantitative image of chromophore concentration or oxygen saturation (sO2). According to our numerical experiments, the estimations of sO2 and indocyanine green concentration are accurate and robust against variations in both optical property and object geometry. An extremely short reconstruction time of 22 ms is achieved.

  4. Assessment of sustainable urban transport development based on entropy and unascertained measure.

    PubMed

    Li, Yancang; Yang, Jing; Shi, Huawang; Li, Yijie

    2017-01-01

    To find a more effective method for the assessment of sustainable urban transport development, a comprehensive assessment model of sustainable urban transport development was established based on the unascertained measure. On the basis of the factors influencing urban transport development, the comprehensive assessment indexes were selected, including urban economic development, transport demand, environmental quality and energy consumption, and the assessment system of sustainable urban transport development was proposed. In view of the different influencing factors of urban transport development, the index weights were calculated through the entropy weight coefficient method. Qualitative and quantitative analyses were conducted according to actual conditions. Then, the grade was obtained by using the credible degree recognition criterion, from which the urban transport development level can be determined. Finally, a comprehensive assessment method for urban transport development was introduced. The application practice showed that the method can be used reasonably and effectively for the comprehensive assessment of urban transport development.
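
    The index weights in the record are derived with the entropy weight coefficient method, in which indicators that vary more across the evaluated alternatives carry more information and therefore receive larger weights. A minimal sketch on a hypothetical, already-normalized decision matrix follows.

        # Minimal sketch of the entropy weight method. Rows are alternatives (e.g., years
        # or districts), columns are indicators; values are illustrative placeholders
        # already normalized to (0, 1] as benefit-type indicators.
        import numpy as np

        x = np.array([[0.70, 0.55, 0.80, 0.40],
                      [0.60, 0.75, 0.50, 0.65],
                      [0.90, 0.60, 0.70, 0.55]])

        p = x / x.sum(axis=0)                           # share of each alternative per indicator
        k = 1.0 / np.log(x.shape[0])
        entropy = -k * np.sum(p * np.log(p), axis=0)    # entropy of each indicator
        divergence = 1.0 - entropy                      # degree of diversification
        weights = divergence / divergence.sum()

        print("indicator weights:", np.round(weights, 3))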

  5. A Comprehensive Evaluation System for Military Hospitals' Response Capability to Bio-terrorism.

    PubMed

    Wang, Hui; Jiang, Nan; Shao, Sicong; Zheng, Tao; Sun, Jianzhong

    2015-05-01

    The objective of this study is to establish a comprehensive evaluation system for military hospitals' response capacity to bio-terrorism. Literature research and the Delphi method were utilized to establish the comprehensive evaluation system for military hospitals' response capacity to bio-terrorism. Questionnaires were designed and used to survey the status quo of 134 military hospitals' response capability to bio-terrorism. The survey indicated that the factor analysis method was suitable for analyzing the comprehensive evaluation system for military hospitals' response capacity to bio-terrorism. The constructed evaluation system consisted of five first-class and 16 second-class indexes. Among them, the medical response factor was considered the most important, with a weight coefficient of 0.660, followed in turn by the emergency management factor (weight coefficient 0.109), the emergency management consciousness factor (0.093), the hardware support factor (0.078), and the improvement factor (0.059). The constructed comprehensive assessment model and system are scientific and practical.

  6. STRUCTURAL DETERMINATION AND QUANTITATIVE ANALYSIS OF BACTERIAL PHOSPHOLIPIDS USING LIQUID CHROMATOGRAPHY/ELECTROSPRAY IONIZATION/MASS SPECTROMETRY

    EPA Science Inventory

    This report presents a comprehensive spectral analysis of common bacterial phospholipids using electrospray/mass spectrometry (ESI/MS) under both negative and positive ionization conditions. Phospholipids under positive ionization yield sodium-adduct molecular ions which are mos...

  7. 77 FR 23515 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-19

    ... from a quantitative, comprehensive or even representative survey or study of the burdens associated... existing registered money market fund: Documentation of credit risk analyses, and determinations regarding... for documentation of credit analyses and other determinations) = 2340 responses. (30 funds x 12...

  8. Effects of Embedded Processing Tasks on Learning Outcomes.

    ERIC Educational Resources Information Center

    Hobbs, D. J.

    1987-01-01

    Describes a British study with undergraduate accountancy students which compared the quantitative and qualitative effects of three types of embedded tasks or questions--relational-semantic, transpose-semantic, and non-semantic--on learning outcomes. Variables investigated included mathematical background, recall, and comprehension. Relevance of…

  9. Evaluation of Attention Training and Metacognitive Facilitation to Improve Reading Comprehension in Aphasia

    ERIC Educational Resources Information Center

    Lee, Jaime B.; Sohlberg, McKay Moore

    2013-01-01

    Purpose: This pilot study investigated the impact of direct attention training combined with metacognitive facilitation on reading comprehension in individuals with aphasia. Method: A single-subject, multiple baseline design was employed across 4 participants to evaluate potential changes in reading comprehension resulting from an 8-week…

  10. Evaluative methodology for prioritizing transportation energy conservation strategies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pang, L.M.G.

    An analytical methodology was developed for the purpose of prioritizing a set of transportation energy conservation (TEC) strategies within an urban environment. Steps involved in applying the methodology consist of 1) defining the goals, objectives and constraints of the given urban community, 2) identifying potential TEC strategies, 3) assessing the impact of the strategies, 4) applying the TEC evaluation model, and 5) utilizing a selection process to determine the optimal set of strategies for implementation. This research provides an overview of 21 TEC strategies, a quick-response technique for estimating energy savings, a multiattribute utility theory approach for assessing subjective impacts, and a computer program for making the strategy evaluations, all of which assist in expediting the execution of the entire methodology procedure. The critical element of the methodology is the strategy evaluation model, which incorporates a number of desirable concepts including 1) a comprehensive accounting of all relevant impacts, 2) the application of multiobjective decision-making techniques, 3) an approach to assure compatibility among quantitative and qualitative impact measures, 4) the inclusion of the decision maker's preferences in the evaluation procedure, and 5) the cost-effectiveness concept. Application of the methodology to Salt Lake City, Utah demonstrated its utility, ease of use, and favorable reception by decision makers.

  11. Evaluating the effectiveness of clinical medical librarian programs: a systematic review of the literature*

    PubMed Central

    Wagner, Kay Cimpl; Byrd, Gary D.

    2004-01-01

    Objective: This study was undertaken to determine if a systematic review of the evidence from thirty years of literature evaluating clinical medical librarian (CML) programs could help clarify the effectiveness of this outreach service model. Methods: A descriptive review of the CML literature describes the general characteristics of these services as they have been implemented, primarily in teaching-hospital settings. Comprehensive searches for CML studies using quantitative or qualitative evaluation methods were conducted in the medical, allied health, librarianship, and social sciences literature. Findings: Thirty-five studies published between 1974 and 2001 met the review criteria. Most (30) evaluated single, active programs and used descriptive research methods (e.g., use statistics or surveys/questionnaires). A weighted average of 89% of users in twelve studies found CML services useful and of high quality, and 65% of users in another overlapping, but not identical, twelve studies said these services contributed to improved patient care. Conclusions: The total amount of research evidence for CML program effectiveness is not great and most of it is descriptive rather than comparative or analytically qualitative. Standards are needed to consistently evaluate CML or informationist programs in the future. A carefully structured multiprogram study including three to five of the best current programs is needed to define the true value of these services. PMID:14762460

  12. Metrics for Offline Evaluation of Prognostic Performance

    NASA Technical Reports Server (NTRS)

    Saxena, Abhinav; Celaya, Jose; Saha, Bhaskar; Saha, Sankalita; Goebel, Kai

    2010-01-01

    Prognostic performance evaluation has gained significant attention in the past few years. Currently, prognostics concepts lack standard definitions and suffer from ambiguous and inconsistent interpretations. This lack of standards is in part due to varied end-user requirements for different applications, time scales, available information, domain dynamics, and so on. The research community has used a variety of metrics largely based on convenience and their respective requirements. Very little attention has been focused on establishing a standardized approach to compare different efforts. This paper presents several new evaluation metrics tailored for prognostics that were recently introduced and were shown to effectively evaluate various algorithms as compared to other conventional metrics. Specifically, this paper presents a detailed discussion on how these metrics should be interpreted and used. These metrics have the capability of incorporating probabilistic uncertainty estimates from prognostic algorithms. In addition to quantitative assessment, they also offer a comprehensive visual perspective that can be used in designing the prognostic system. Several methods are suggested to customize these metrics for different applications. Guidelines are provided to help choose one method over another based on distribution characteristics. Various issues faced by prognostics and its performance evaluation are discussed, followed by a formal notational framework to help standardize subsequent developments.

  13. Analysis and evaluation of the applicability of green energy technology

    NASA Astrophysics Data System (ADS)

    Xu, Z. J.; Song, Y. K.

    2017-11-01

    With the growing seriousness of environmental issues and the shortage of resources, the applicability of green energy technology has received increasing attention from scholars in different fields. However, current research often takes a single perspective and uses simple methods. Based on the Theory of Applicable Technology, this paper analyzes and defines green energy technology and its applicability from the combined perspectives of economy, society, environment and science & technology, and correspondingly constructs an evaluation index system. The paper further applies the Fuzzy Comprehensive Evaluation to the evaluation of its applicability, discusses the evaluation models and methods in depth, and explains them in detail with an example. The author holds that the applicability of green energy technology involves many aspects of economy, society, environment and science & technology and can be evaluated comprehensively by an index system composed of a number of independent indexes. The evaluation is multi-objective, multi-factor, multi-level and comprehensively fuzzy, and the Fuzzy Comprehensive Evaluation is a correct, effective and feasible approach to it. Comprehensively understanding and evaluating the applicability of green energy technology is of vital theoretical and practical significance for the rational development and utilization of green energy technology and for better promoting the sustainable development of humanity and nature.
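
    The Fuzzy Comprehensive Evaluation applied in the record combines an index weight vector with a membership matrix over the evaluation grades. A minimal sketch with hypothetical weights, membership degrees, and grade labels is shown below.

        # Minimal sketch of fuzzy comprehensive evaluation: B = W . R, where W is the
        # index weight vector and R the membership matrix (indexes x evaluation grades).
        # All numbers and labels are illustrative placeholders.
        import numpy as np

        grades = ["high applicability", "medium applicability", "low applicability"]
        w = np.array([0.30, 0.25, 0.25, 0.20])   # economy, society, environment, technology
        r = np.array([[0.6, 0.3, 0.1],           # membership of each index in each grade
                      [0.5, 0.4, 0.1],
                      [0.3, 0.4, 0.3],
                      [0.7, 0.2, 0.1]])

        b = w @ r                                # weighted-average composition operator
        b = b / b.sum()                          # normalize the result vector
        print(dict(zip(grades, np.round(b, 3))))
        print("overall grade:", grades[int(np.argmax(b))])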

  14. Evaluation Framework for Telemedicine Using the Logical Framework Approach and a Fishbone Diagram

    PubMed Central

    2015-01-01

    Objectives Technological advances using telemedicine and telehealth are growing in healthcare fields, but the evaluation framework for them is inconsistent and limited. This paper suggests a comprehensive evaluation framework for telemedicine system implementation and will support related stakeholders' decision-making by promoting general understanding and resolving arguments and controversies. Methods This study focused on developing a comprehensive evaluation framework by summarizing themes across the range of evaluation techniques and organized foundational evaluation frameworks generally applicable through studies and cases of diverse telemedicine. Evaluation factors related to aspects of information technology, the satisfaction of service providers and consumers, cost, quality, and information security are organized using a fishbone diagram. Results It was not easy to develop a monitoring and evaluation framework for telemedicine since evaluation frameworks for telemedicine are very complex with many potential inputs, activities, outputs, outcomes, and stakeholders. A conceptual framework was developed that incorporates the key dimensions that need to be considered in the evaluation of telehealth implementation for a formal structured approach to the evaluation of a service. The suggested framework consists of six major dimensions and the subsequent branches for each dimension. Conclusions To implement telemedicine and telehealth services, stakeholders should make decisions based on sufficient evidence in quality and safety measured by the comprehensive evaluation framework. Further work would be valuable in applying more comprehensive evaluations to verify and improve the comprehensive framework across a variety of contexts with more factors and participant group dimensions. PMID:26618028

  15. Evaluation Framework for Telemedicine Using the Logical Framework Approach and a Fishbone Diagram.

    PubMed

    Chang, Hyejung

    2015-10-01

    Technological advances using telemedicine and telehealth are growing in healthcare fields, but the evaluation framework for them is inconsistent and limited. This paper suggests a comprehensive evaluation framework for telemedicine system implementation that will support related stakeholders' decision-making by promoting general understanding and resolving arguments and controversies. This study focused on developing a comprehensive evaluation framework by summarizing themes across the range of evaluation techniques and organizing foundational evaluation frameworks generally applicable across studies and cases of diverse telemedicine. Evaluation factors related to aspects of information technology, the satisfaction of service providers and consumers, cost, quality, and information security are organized using the fishbone diagram. It was not easy to develop a monitoring and evaluation framework for telemedicine, since evaluation frameworks for telemedicine are very complex, with many potential inputs, activities, outputs, outcomes, and stakeholders. A conceptual framework was developed that incorporates the key dimensions that need to be considered in the evaluation of telehealth implementation, providing a formal structured approach to the evaluation of a service. The suggested framework consists of six major dimensions and the subsequent branches for each dimension. To implement telemedicine and telehealth services, stakeholders should make decisions based on sufficient evidence of quality and safety, measured by the comprehensive evaluation framework. Further work would be valuable in applying more comprehensive evaluations to verify and improve the framework across a variety of contexts with more factors and participant group dimensions.

  16. MASH Suite Pro: A Comprehensive Software Tool for Top-Down Proteomics

    PubMed Central

    Cai, Wenxuan; Guner, Huseyin; Gregorich, Zachery R.; Chen, Albert J.; Ayaz-Guner, Serife; Peng, Ying; Valeja, Santosh G.; Liu, Xiaowen; Ge, Ying

    2016-01-01

    Top-down mass spectrometry (MS)-based proteomics is arguably a disruptive technology for the comprehensive analysis of all proteoforms arising from genetic variation, alternative splicing, and posttranslational modifications (PTMs). However, the complexity of top-down high-resolution mass spectra presents a significant challenge for data analysis. In contrast to the well-developed software packages available for data analysis in bottom-up proteomics, the data analysis tools in top-down proteomics remain underdeveloped. Moreover, despite recent efforts to develop algorithms and tools for the deconvolution of top-down high-resolution mass spectra and the identification of proteins from complex mixtures, a multifunctional software platform, which allows for the identification, quantitation, and characterization of proteoforms with visual validation, is still lacking. Herein, we have developed MASH Suite Pro, a comprehensive software tool for top-down proteomics with multifaceted functionality. MASH Suite Pro is capable of processing high-resolution MS and tandem MS (MS/MS) data using two deconvolution algorithms to optimize protein identification results. In addition, MASH Suite Pro allows for the characterization of PTMs and sequence variations, as well as the relative quantitation of multiple proteoforms in different experimental conditions. The program also provides visualization components for validation and correction of the computational outputs. Furthermore, MASH Suite Pro facilitates data reporting and presentation via direct output of the graphics. Thus, MASH Suite Pro significantly simplifies and speeds up the interpretation of high-resolution top-down proteomics data by integrating tools for protein identification, quantitation, characterization, and visual validation into a customizable and user-friendly interface. We envision that MASH Suite Pro will play an integral role in advancing the burgeoning field of top-down proteomics. PMID:26598644

  17. A multi-method approach toward de novo glycan characterization: a Man-5 case study.

    PubMed

    Prien, Justin M; Prater, Bradley D; Cockrill, Steven L

    2010-05-01

    Regulatory agencies' expectations for biotherapeutic approval are becoming more stringent with regard to product characterization, where minor species as low as 0.1% of a given profile are typically identified. The mission of this manuscript is to demonstrate a multi-method approach toward de novo glycan characterization and quantitation, including minor species at or approaching the 0.1% benchmark. Recently, unexpected isomers of Man(5)GlcNAc(2) (M(5)) were reported (Prien JM, Ashline DJ, Lapadula AJ, Zhang H, Reinhold VN. 2009. The high mannose glycans from bovine ribonuclease B isomer characterization by ion trap mass spectrometry (MS). J Am Soc Mass Spectrom. 20:539-556). In the current study, quantitative analysis of these isomers found in commercial M(5) standard demonstrated that they are in low abundance (<1% of the total) and therefore an exemplary "litmus test" for minor species characterization. A simple workflow devised around three core well-established analytical procedures: (1) fluorescence derivatization; (2) online rapid resolution reversed-phase separation coupled with negative-mode sequential mass spectrometry (RRRP-(-)-MS(n)); and (3) permethylation derivatization with nanospray sequential mass spectrometry (NSI-MS(n)) provides comprehensive glycan structural determination. All methods have limitations; however, a multi-method workflow is an at-line stopgap that mitigates each method's individual shortcomings, providing greater opportunity for more comprehensive characterization. This manuscript is the first to demonstrate quantitative chromatographic separation of the M(5) isomers and the use of a commercially available stable isotope variant of 2-aminobenzoic acid to detect and chromatographically resolve multiple M(5) isomers in bovine ribonuclease B. With this multi-method approach, we have the capabilities to comprehensively characterize a biotherapeutic's glycan array in a de novo manner, including structural isomers at ≥0.1% of the total chromatographic peak area.

  18. Introduction of blended learning in a master program: Developing an integrative mixed method evaluation framework.

    PubMed

    Chmiel, Aviva S; Shaha, Maya; Schneider, Daniel K

    2017-01-01

    The aim of this research is to develop a comprehensive evaluation framework involving all actors in a higher education blended learning (BL) program. BL evaluation usually either focuses on students, faculty, technological or institutional aspects. Currently, no validated comprehensive monitoring tool exists that can support introduction and further implementation of BL in a higher education context. Starting from established evaluation principles and standards, concepts that were to be evaluated were firstly identified and grouped. In a second step, related BL evaluation tools referring to students, faculty and institutional level were selected. This allowed setting up and implementing an evaluation framework to monitor the introduction of BL during two succeeding recurrences of the program. The results of the evaluation allowed documenting strengths and weaknesses of the BL format in a comprehensive way, involving all actors. It has led to improvements at program, faculty and course level. The evaluation process and the reporting of the results proved to be demanding in time and personal resources. The evaluation framework allows measuring the most significant dimensions influencing the success of a BL implementation at program level. However, this comprehensive evaluation is resource intensive. Further steps will be to refine the framework towards a sustainable and transferable BL monitoring tool that finds a balance between comprehensiveness and efficiency. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Rethinking health numeracy: a multidisciplinary literature review.

    PubMed

    Ancker, Jessica S; Kaufman, David

    2007-01-01

    The purpose of this review is to organize various published conceptions of health numeracy and to discuss how health numeracy contributes to the productive use of quantitative information for health. We define health numeracy as the individual-level skills needed to understand and use quantitative health information, including basic computation skills, ability to use information in documents and non-text formats such as graphs, and ability to communicate orally. We also identify two other factors affecting whether a consumer can use quantitative health information: design of documents and other information artifacts, and health-care providers' communication skills. We draw upon the distributed cognition perspective to argue that essential ingredients for the productive use of quantitative health information include not only health numeracy but also good provider communication skills, as well as documents and devices that are designed to enhance comprehension and cognition.

  20. Novel Threat-risk Index Using Probabilistic Risk Assessment and Human Reliability Analysis - Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    George A. Beitel

    2004-02-01

    In support of a national need to improve the current state-of-the-art in alerting decision makers to the risk of terrorist attack, a quantitative approach employing scientific and engineering concepts to develop a threat-risk index was undertaken at the Idaho National Engineering and Environmental Laboratory (INEEL). As a result of this effort, a set of models has been successfully integrated into a single comprehensive model known as Quantitative Threat-Risk Index Model (QTRIM), with the capability of computing a quantitative threat-risk index on a system level, as well as for the major components of the system. Such a threat-risk index could provide a quantitative variant or basis for either prioritizing security upgrades or updating the current qualitative national color-coded terrorist threat alert.

  1. A Novel Method for Relative Quantitation of N-Glycans by Isotopic Labeling Using 18O-Water

    PubMed Central

    Tao, Shujuan; Orlando, Ron

    2014-01-01

    Quantitation is an essential aspect of comprehensive glycomics study. Here, a novel isotopic-labeling method is described for N-glycan quantitation using 18O-water. The incorporation of the 18O-labeling into the reducing end of N-glycans is simply and efficiently achieved during peptide-N4-(N-acetyl-β-glucosaminyl) asparagine amidase F release. This process provides a 2-Da mass difference compared with the N-glycans released in 16O-water. A mathematical calculation method was also developed to determine the 18O/16O ratios from isotopic peaks. Application of this method to several standard glycoprotein mixtures and human serum demonstrated that this method can facilitate the relative quantitation of N-glycans over a linear dynamic range of two orders, with high accuracy and reproducibility. PMID:25365792
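
    The paper's exact calculation is not reproduced here, but a common way to recover an 18O/16O ratio from overlapping isotopic envelopes is to subtract the natural M+2 contribution of the light (16O) glycan from the observed peak 2 Da above its monoisotopic mass. The sketch below assumes that formulation, with made-up peak intensities.

```python
def o18_o16_ratio(i_light_mono, m2_fraction, i_plus2_observed):
    """Estimate the 18O/16O abundance ratio from two isotopic peak intensities.

    i_light_mono     -- observed intensity of the 16O monoisotopic peak
    m2_fraction      -- theoretical M+2/M intensity ratio of the unlabeled glycan
                        (from its elemental composition)
    i_plus2_observed -- observed intensity 2 Da higher, containing both the 18O
                        monoisotopic peak and the natural M+2 isotope of the 16O form
    """
    i_heavy_mono = i_plus2_observed - i_light_mono * m2_fraction
    if i_heavy_mono <= 0:
        raise ValueError("corrected heavy intensity is non-positive; check inputs")
    return i_heavy_mono / i_light_mono

# Toy numbers (hypothetical): light monoisotopic peak = 1.0e6 counts,
# theoretical M+2 fraction = 0.12, observed +2 Da peak = 1.10e6 counts.
print(round(o18_o16_ratio(1.0e6, 0.12, 1.10e6), 3))   # -> 0.98
```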

  2. Health Information Technology Evaluation Framework (HITREF) Comprehensiveness as Assessed in Electronic Point-of-Care Documentation Systems Evaluations.

    PubMed

    Sockolow, Paulina S; Bowles, Kathryn H; Rogers, Michelle

    2015-01-01

    We assessed the Health Information Technology (HIT) Reference-based Evaluation Framework (HITREF) comprehensiveness in two HIT evaluations in settings different from that in which the HITREF was developed. Clinician satisfaction themes that emerged from clinician interviews in the home care and the hospital studies were compared to the framework components. Across both studies, respondents commented on 12 of the 20 HITREF components within 5 of the 6 HITREF concepts. No new components emerged that were missing from the HITREF providing evidence that the HITREF is a comprehensive framework. HITREF use in a range of HIT evaluations by researchers new to the HITREF demonstrates that it can be used as intended. Therefore, we continue to recommend the HITREF as a comprehensive, research-based HIT evaluation framework to increase the capacity of informatics evaluators' use of best practice and evidence-based practice to support the credibility of their findings for fulfilling the purpose of program evaluation.

  3. Plasticity of the myelination genomic fabric.

    PubMed

    Iacobas, Sanda; Thomas, Neil M; Iacobas, Dumitru A

    2012-03-01

    This study aimed to quantify the influence of the astrocyte proximity on myelination genomic fabric (MYE) of oligodendrocytes, defined as the most interconnected and stably expressed gene web responsible for myelination. Such quantitation is important to evaluate whether astrocyte signaling may contribute to demyelination when impaired and remyelination when properly restored. For this, we compared changes in the gene expression profiles of immortalized precursor oligodendrocytes (Oli-neu), stimulated to differentiate by the proximity of nontouching astrocytes or treatment with db-cAMP. In a previous paper, we reported that the astrocyte proximity upregulated or turned-on a large number of myelination genes and substantially enriched the Ca(2+)-signaling and cytokine receptor regulatory networks of MYE in Oli-neu cells. Here, we introduce the "transcriptomic distance" to evaluate fabric remodeling and "pair-wise relevance" to identify the most influential gene pairs. Together with the prominence gene analysis used to select and rank the fabric genes, these novel analytical tools provide a comprehensively quantitative view of the physio/pathological transformations of the transcriptomic programs of myelinating cells. Applied to our data, the analyses revealed not only that the astrocyte neighborhood is a substantially more powerful regulator of myelination than the differentiating treatment but also the molecular mechanisms of the two differentiating paradigms are different. By inducing a profound remodeling of MYE and regulatory transcriptomic networks, the astrocyte-oligodendrocyte intercommunication may be considered as a major player in both pathophysiology and therapy of neurodegenerative diseases related to myelination.

  4. An Integrated Strategy for Global Qualitative and Quantitative Profiling of Traditional Chinese Medicine Formulas: Baoyuan Decoction as a Case

    NASA Astrophysics Data System (ADS)

    Ma, Xiaoli; Guo, Xiaoyu; Song, Yuelin; Qiao, Lirui; Wang, Wenguang; Zhao, Mingbo; Tu, Pengfei; Jiang, Yong

    2016-12-01

    Clarification of the chemical composition of traditional Chinese medicine formulas (TCMFs) is a challenge due to the variety of structures and the complexity of plant matrices. Herein, an integrated strategy was developed by hyphenating ultra-performance liquid chromatography (UPLC), quadrupole time-of-flight (Q-TOF), hybrid triple quadrupole-linear ion trap mass spectrometry (Qtrap-MS), and the novel post-acquisition data processing software UNIFI to achieve automatic, rapid, accurate, and comprehensive qualitative and quantitative analysis of the chemical components in TCMFs. As a proof-of-concept, the chemical profiling of Baoyuan decoction (BYD), which is an ancient TCMF that is clinically used for the treatment of coronary heart disease that consists of Ginseng Radix et Rhizoma, Astragali Radix, Glycyrrhizae Radix et Rhizoma Praeparata Cum Melle, and Cinnamomi Cortex, was performed. As many as 236 compounds were plausibly or unambiguously identified, and 175 compounds were quantified or relatively quantified by the scheduled multiple reaction monitoring (sMRM) method. The findings demonstrate that the strategy integrating the rapidity of UNIFI software, the efficiency of UPLC, the accuracy of Q-TOF-MS, and the sensitivity and quantitation ability of Qtrap-MS provides a method for the efficient and comprehensive chemome characterization and quality control of complex TCMFs.

  5. [Analysis of aromatic hydrocarbons in cracking products of jet fuel by comprehensive two-dimensional gas chromatography-mass spectrometry].

    PubMed

    Li, Haijing; Zhang, Xiangwen

    2017-08-08

    As coking precursors, aromatic hydrocarbons have an effect on the cracking stability of fuels. A method for identifying and quantitating aromatics in the supercritical cracking products of jet fuel was established by comprehensive two-dimensional gas chromatography coupled with mass spectrometry (GC×GC-MS). The effects of main chromatographic conditions such as initial oven temperature and modulation period on the separation of supercritical cracking products were studied. The method has good separation ability for polycyclic aromatic hydrocarbon (PAH) isomers. A total of 27 aromatics, including monocyclic, bicyclic, tricyclic, and tetracyclic aromatic hydrocarbons, etc., were identified based on standard mass spectra, the retention times of standards, and literature reports. Moreover, the corresponding quantitative determination was achieved by the external standard method with GC×GC-FID. The results showed that the contents of aromatics increased with increasing gas yield. When the gas yield reached 22%, bicyclic aromatic hydrocarbons began to form, and their contents increased exponentially with further increases in gas yield. Compared with traditional GC-MS, the method has better separation and qualitative ability, and can be applied to the separation of complex samples and the qualitative and quantitative analysis of cracking products.

  6. Evaluation of use of reading comprehension strategies to improve reading comprehension of adult college students with acquired brain injury.

    PubMed

    Griffiths, Gina G; Sohlberg, McKay Moore; Kirk, Cecilia; Fickas, Stephen; Biancarosa, Gina

    2016-01-01

    Adults with mild to moderate acquired brain injury (ABI) often pursue post-secondary or professional education after their injuries in order to enter or re-enter the job market. An increasing number of these adults report problems with reading-to-learn. The problem is particularly concerning given the growing population of adult survivors of ABI. Despite the rising need, empirical evaluation of reading comprehension interventions for adults with ABI is scarce. This study used a within-subject design to evaluate whether adult college students with ABI with no more than moderate cognitive impairments benefited from using reading comprehension strategies to improve comprehension of expository text. Integrating empirical support from the cognitive rehabilitation and special education literature, the researchers designed a multi-component reading comprehension strategy package. Participants read chapters from an introductory-level college anthropology textbook in two different conditions: strategy and no-strategy. The results indicated that reading comprehension strategy use was associated with recall of more correct information units in immediate and delayed free recall tasks; more efficient recall in the delayed free recall task; and increased accuracy recognising statements from a sentence verification task designed to reflect the local and global coherence of the text. The findings support further research into using reading comprehension strategies as an intervention approach for the adult ABI population. Future research needs include identifying how to match particular reading comprehension strategies to individuals, examining whether reading comprehension performance improves further through the incorporation of systematic training, and evaluating texts from a range of disciplines and genres.

  7. SoyBase, The USDA-ARS Soybean Genetics and Genomics Database

    USDA-ARS?s Scientific Manuscript database

    SoyBase, the USDA-ARS soybean genetic database, is a comprehensive repository for professionally curated genetics, genomics and related data resources for soybean. SoyBase contains the most current genetic, physical and genomic sequence maps integrated with qualitative and quantitative traits. The...

  8. APPLICATION OF THE MASTER ANALYTICAL SCHEME TO POLAR ORGANICS IN DRINKING WATER

    EPA Science Inventory

    EPA's Master Analytical Scheme (MAS) for Organic Compounds in Water provides for comprehensive qualitative-quantitative analysis of gas chromatographable organics in many types of water. The paper emphasizes the analysis of polar and ionic organics, the more water soluble compoun...

  9. A comprehensive and quantitative exploration of thousands of viral genomes

    PubMed Central

    Mahmoudabadi, Gita

    2018-01-01

    The complete assembly of viral genomes from metagenomic datasets (short genomic sequences gathered from environmental samples) has proven to be challenging, so there are significant blind spots when we view viral genomes through the lens of metagenomics. One approach to overcoming this problem is to leverage the thousands of complete viral genomes that are publicly available. Here we describe our efforts to assemble a comprehensive resource that provides a quantitative snapshot of viral genomic trends – such as gene density, noncoding percentage, and abundances of functional gene categories – across thousands of viral genomes. We have also developed a coarse-grained method for visualizing viral genome organization for hundreds of genomes at once, and have explored the extent of the overlap between bacterial and bacteriophage gene pools. Existing viral classification systems were developed prior to the sequencing era, so we present our analysis in a way that allows us to assess the utility of the different classification systems for capturing genomic trends. PMID:29624169

  10. Quantitative and qualitative differences in the lexical knowledge of monolingual and bilingual children on the LITMUS-CLT task.

    PubMed

    Altman, Carmit; Goldstein, Tamara; Armon-Lotem, Sharon

    2017-01-01

    While bilingual children follow the same milestones of language acquisition as monolingual children do in learning the syntactic patterns of their second language (L2), their vocabulary size in L2 often lags behind compared to monolinguals. The present study explores the comprehension and production of nouns and verbs in Hebrew, by two groups of 5- to 6-year olds with typical language development: monolingual Hebrew speakers (N = 26), and Russian-Hebrew bilinguals (N = 27). Analyses not only show quantitative gaps between comprehension and production and between nouns and verbs, with a bilingual effect in both, but also a qualitative difference between monolinguals and bilinguals in their production errors: monolinguals' errors reveal knowledge of the language rules despite temporary access difficulties, while bilinguals' errors reflect gaps in their knowledge of Hebrew (L2). The nature of Hebrew as a Semitic language allows one to explore this qualitative difference in the semantic and morphological level.

  11. A comprehensive model of ion diffusion and charge exchange in the cold Io torus

    NASA Technical Reports Server (NTRS)

    Barbosa, D. D.; Moreno, M. A.

    1988-01-01

    A comprehensive analytic model of radial diffusion in the cold Io torus is developed. The model involves a generalized molecular cloud theory of SO2 and its dissociation fragments SO, O2, S, and O, which are formed at a relatively large rate by solar UV photodissociation of SO2. The key component of the new theory is SO, which can react with S(+) through a near-resonant charge exchange process that is exothermic. This provides a mechanism for the rapid depletion of singly ionized sulfur in the cold torus and can account for the large decrease in the total flux tube content inward of Io's orbit. The model is used to demonstrate quantitatively the effects of radial diffusion in a charge exchange environment that acts as a combined source and sink for ions in various charge states. A detailed quantitative explanation for the O(2+) component of the cold torus is given, and insight is derived into the workings of the so-called plasma 'ribbon'.

  12. Comprehensive computational model for combining fluid hydrodynamics, light transport and biomass growth in a Taylor vortex algal photobioreactor: Lagrangian approach.

    PubMed

    Gao, Xi; Kong, Bo; Vigil, R Dennis

    2017-01-01

    A comprehensive quantitative model incorporating the effects of fluid flow patterns, light distribution, and algal growth kinetics on biomass growth rate is developed in order to predict the performance of a Taylor vortex algal photobioreactor for culturing Chlorella vulgaris. A commonly used Lagrangian strategy for coupling the various factors influencing algal growth was employed whereby results from computational fluid dynamics and radiation transport simulations were used to compute numerous microorganism light exposure histories, and this information in turn was used to estimate the global biomass specific growth rate. The simulations provide good quantitative agreement with experimental data and correctly predict the trend in reactor performance as a key reactor operating parameter is varied (inner cylinder rotation speed). However, biomass growth curves are consistently over-predicted and potential causes for these over-predictions and drawbacks of the Lagrangian approach are addressed. Copyright © 2016 Elsevier Ltd. All rights reserved.
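
    The Lagrangian coupling described above can be caricatured as follows: given simulated light-intensity histories along many particle trajectories, a light-dependent specific growth rate is evaluated along each history and then averaged over particles. The kinetic form and every parameter value below are hypothetical placeholders, not the model used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical light-exposure histories I(t), in umol photons/(m^2 s),
# for 200 tracer particles sampled at 1 s intervals over 10 minutes.
n_particles, n_steps = 200, 600
histories = rng.uniform(0.0, 800.0, size=(n_particles, n_steps))

mu_max = 1.2 / 86400.0   # 1/s, hypothetical maximum specific growth rate
K_I = 120.0              # half-saturation light intensity, hypothetical

def specific_growth_rate(light):
    """Simple saturating (Monod-type) light response."""
    return mu_max * light / (K_I + light)

# Time-average the growth rate along each trajectory, then average over
# particles to estimate the global biomass specific growth rate.
mu_per_particle = specific_growth_rate(histories).mean(axis=1)
mu_global = mu_per_particle.mean()
print(f"estimated global specific growth rate: {mu_global * 86400:.3f} 1/day")
```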

  13. A comprehensive and quantitative exploration of thousands of viral genomes.

    PubMed

    Mahmoudabadi, Gita; Phillips, Rob

    2018-04-19

    The complete assembly of viral genomes from metagenomic datasets (short genomic sequences gathered from environmental samples) has proven to be challenging, so there are significant blind spots when we view viral genomes through the lens of metagenomics. One approach to overcoming this problem is to leverage the thousands of complete viral genomes that are publicly available. Here we describe our efforts to assemble a comprehensive resource that provides a quantitative snapshot of viral genomic trends - such as gene density, noncoding percentage, and abundances of functional gene categories - across thousands of viral genomes. We have also developed a coarse-grained method for visualizing viral genome organization for hundreds of genomes at once, and have explored the extent of the overlap between bacterial and bacteriophage gene pools. Existing viral classification systems were developed prior to the sequencing era, so we present our analysis in a way that allows us to assess the utility of the different classification systems for capturing genomic trends. © 2018, Mahmoudabadi et al.

  14. A Novel Quantitative Hemolytic Assay Coupled with Restriction Fragment Length Polymorphisms Analysis Enabled Early Diagnosis of Atypical Hemolytic Uremic Syndrome and Identified Unique Predisposing Mutations in Japan

    PubMed Central

    Yoshida, Yoko; Miyata, Toshiyuki; Matsumoto, Masanori; Shirotani-Ikejima, Hiroko; Uchida, Yumiko; Ohyama, Yoshifumi; Kokubo, Tetsuro; Fujimura, Yoshihiro

    2015-01-01

    For thrombotic microangiopathies (TMAs), the diagnosis of atypical hemolytic uremic syndrome (aHUS) is made by ruling out Shiga toxin-producing Escherichia coli (STEC)-associated HUS and ADAMTS13 activity-deficient thrombotic thrombocytopenic purpura (TTP), often using the exclusion criteria for secondary TMAs. Nowadays, assays for ADAMTS13 activity and evaluation for STEC infection can be performed within a few hours. However, a confident diagnosis of aHUS often requires comprehensive gene analysis of the alternative complement activation pathway, which usually takes at least several weeks. However, predisposing genetic abnormalities are only identified in approximately 70% of aHUS. To facilitate the diagnosis of complement-mediated aHUS, we describe a quantitative hemolytic assay using sheep red blood cells (RBCs) and human citrated plasma, spiked with or without a novel inhibitory anti-complement factor H (CFH) monoclonal antibody. Among 45 aHUS patients in Japan, 24% (11/45) had moderate-to-severe (≥50%) hemolysis, whereas the remaining 76% (34/45) patients had mild or no hemolysis (<50%). The former group is largely attributed to CFH-related abnormalities, and the latter group has C3-p.I1157T mutations (16/34), which were identified by restriction fragment length polymorphism (RFLP) analysis. Thus, a quantitative hemolytic assay coupled with RFLP analysis enabled the early diagnosis of complement-mediated aHUS in 60% (27/45) of patients in Japan within a week of presentation. We hypothesize that this novel quantitative hemolytic assay would be more useful in a Caucasian population, who may have a higher proportion of CFH mutations than Japanese patients. PMID:25951460

  15. The health impact of trade and investment agreements: a quantitative systematic review and network co-citation analysis.

    PubMed

    Barlow, Pepita; McKee, Martin; Basu, Sanjay; Stuckler, David

    2017-03-08

    Regional trade agreements are major international policy instruments that shape macro-economic and political systems. There is widespread debate as to whether and how these agreements pose risks to public health. Here we perform a comprehensive systematic review of quantitative studies of the health impact of trade and investment agreements. We identified studies from searches in PubMed, Web of Science, EMBASE, and Global Health Online. Research articles were eligible for inclusion if they were quantitative studies of the health impacts of trade and investment agreements or policy. We systematically reviewed study findings, evaluated quality using the Quality Assessment Tool from the Effective Public Health Practice Project, and performed network citation analysis to study disciplinary siloes. Seventeen quantitative studies met our inclusion criteria. There was consistent evidence that implementing trade agreements was associated with increased consumption of processed foods and sugar-sweetened beverages. Granting import licenses for patented drugs was associated with increased access to pharmaceuticals. Implementing trade agreements and associated policies was also correlated with higher cardiovascular disease incidence and higher Body Mass Index (BMI), whilst correlations with tobacco consumption, under-five mortality, maternal mortality, and life expectancy were inconclusive. Overall, the quality of studies is weak or moderately weak, and co-citation analysis revealed a relative isolation of public health from economics. We identified limitations in existing studies which preclude definitive conclusions of the health impacts of regional trade and investment agreements. Few address unobserved confounding, and many possible consequences and mechanisms linking trade and investment agreements to health remain poorly understood. Results from our co-citation analysis suggest scope for greater interdisciplinary collaboration. Notwithstanding these limitations, our results find evidence that trade agreements pose some significant health risks. Health protections in trade and investment treaties may mitigate these impacts.

  16. Evaluation of Eleventh Grade Turkish Pupils' Comprehension of General Chemistry Concepts

    ERIC Educational Resources Information Center

    Belge Can, Hatice; Boz, Yezdan

    2011-01-01

    The main purpose of this study is to evaluate eleventh grade Turkish pupils' comprehension of various general chemistry concepts which in turn enables to investigate chemistry concepts which are easier and harder for students to comprehend. Examining the effect of gender and last semester chemistry course grades on pupils' comprehension of general…

  17. A Comprehensive Two-Dimensional Retention Time Alignment Algorithm To Enhance Chemometric Analysis of Comprehensive Two-Dimensional Separation Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pierce, Karisa M.; Wood, Lianna F.; Wright, Bob W.

    2005-12-01

    A comprehensive two-dimensional (2D) retention time alignment algorithm was developed using a novel indexing scheme. The algorithm is termed comprehensive because it functions to correct the entire chromatogram in both dimensions and it preserves the separation information in both dimensions. Although the algorithm is demonstrated by correcting comprehensive two-dimensional gas chromatography (GC x GC) data, the algorithm is designed to correct shifting in all forms of 2D separations, such as LC x LC, LC x CE, CE x CE, and LC x GC. This 2D alignment algorithm was applied to three different data sets composed of replicate GC x GC separations of (1) three 22-component control mixtures, (2) three gasoline samples, and (3) three diesel samples. The three data sets were collected using slightly different temperature or pressure programs to engender significant retention time shifting in the raw data and then demonstrate subsequent corrections of that shifting upon comprehensive 2D alignment of the data sets. Thirty 12-min GC x GC separations from three 22-component control mixtures were used to evaluate the 2D alignment performance (10 runs/mixture). The average standard deviation of the first column retention time improved 5-fold from 0.020 min (before alignment) to 0.004 min (after alignment). Concurrently, the average standard deviation of second column retention time improved 4-fold from 3.5 ms (before alignment) to 0.8 ms (after alignment). Alignment of the 30 control mixture chromatograms took 20 min. The quantitative integrity of the GC x GC data following 2D alignment was also investigated. The mean integrated signal was determined for all components in the three 22-component mixtures for all 30 replicates. The average percent difference in the integrated signal for each component before and after alignment was 2.6%. Singular value decomposition (SVD) was applied to the 22-component control mixture data before and after alignment to show the restoration of trilinearity to the data, since trilinearity benefits chemometric analysis. By applying comprehensive 2D retention time alignment to all three data sets (control mixtures, gasoline samples, and diesel samples), classification by principal component analysis (PCA) substantially improved, resulting in 100% accurate scores clustering.
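
    The comprehensive alignment algorithm itself relies on an indexing scheme not reproduced here; as a much cruder point of comparison, the sketch below estimates and removes a single rigid retention-time shift in each dimension by cross-correlating summed one-dimensional profiles. The synthetic chromatograms are arbitrary.

```python
import numpy as np

def rigid_shift_align(target, reference):
    """Estimate integer retention-time shifts in both dimensions by
    cross-correlating the summed 1-D profiles, then roll the target grid."""
    shifts = []
    for axis in (0, 1):
        ref_prof = reference.sum(axis=1 - axis)
        tgt_prof = target.sum(axis=1 - axis)
        corr = np.correlate(ref_prof - ref_prof.mean(),
                            tgt_prof - tgt_prof.mean(), mode="full")
        shifts.append(int(corr.argmax() - (len(tgt_prof) - 1)))
    return np.roll(target, shift=shifts, axis=(0, 1)), tuple(shifts)

# Tiny synthetic GC x GC-like grids: a single peak shifted by (+2, -1) bins.
ref = np.zeros((50, 40)); ref[20, 15] = 1.0
tgt = np.zeros((50, 40)); tgt[22, 14] = 1.0
aligned, est = rigid_shift_align(tgt, ref)
print("estimated shift:", est)                  # (-2, 1) undoes the (+2, -1) offset
print("peak restored:", aligned[20, 15] == 1.0)
```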

  18. DAnTE: a statistical tool for quantitative analysis of –omics data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Polpitiya, Ashoka D.; Qian, Weijun; Jaitly, Navdeep

    2008-05-03

    DAnTE (Data Analysis Tool Extension) is a statistical tool designed to address challenges unique to quantitative bottom-up, shotgun proteomics data. This tool has also been demonstrated for microarray data and can easily be extended to other high-throughput data types. DAnTE features selected normalization methods, missing value imputation algorithms, peptide to protein rollup methods, an extensive array of plotting functions, and a comprehensive ANOVA scheme that can handle unbalanced data and random effects. The Graphical User Interface (GUI) is designed to be very intuitive and user friendly.
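
    DAnTE's peptide-to-protein rollup is the kind of step a minimal example can make concrete. The sketch below performs a simplified median-based rollup on a made-up log2-abundance table; it is not DAnTE's actual RRollup procedure, and the column names are hypothetical.

```python
import numpy as np
import pandas as pd

# Hypothetical log2 abundances: one row per peptide, one column per sample.
df = pd.DataFrame({
    "protein": ["P1", "P1", "P1", "P2", "P2"],
    "peptide": ["a", "b", "c", "d", "e"],
    "s1": [20.1, 18.3, 19.0, 22.5, 21.9],
    "s2": [20.6, 18.9, 19.4, 22.1, 21.4],
    "s3": [19.8, 18.0, 18.7, 23.0, 22.3],
})
samples = ["s1", "s2", "s3"]

def rollup(group):
    # Scale each peptide so its own median matches the protein-wide median,
    # then report the per-sample median across the scaled peptides.
    vals = group.to_numpy(dtype=float)
    pep_medians = np.nanmedian(vals, axis=1, keepdims=True)
    scaled = vals - pep_medians + np.nanmedian(vals)
    return pd.Series(np.nanmedian(scaled, axis=0), index=samples)

protein_table = df.groupby("protein")[samples].apply(rollup)
print(protein_table)
```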

  19. Qualitative and quantitative characterization of secondary metabolites and carbohydrates in Bai-Hu-Tang using ultraperformance liquid chromatography coupled with quadrupole time-of-flight mass spectrometry and ultraperformance liquid chromatography coupled with photodiode array detector.

    PubMed

    Zhong, Wei-Fang; Tong, Wing-Sum; Zhou, Shan-Shan; Yip, Ka-Man; Li, Song-Lin; Zhao, Zhong-Zhen; Xu, Jun; Chen, Hu-Biao

    2017-10-01

    Bai-Hu-Tang (BHT), a classic traditional Chinese medicine (TCM) formula used for clearing heat and promoting body fluid, consists of four traditional Chinese medicines, i.e., Gypsum Fibrosum (Shigao), Anemarrhenae Rhizoma (Zhimu), Glycyrrhizae Radix et Rhizoma Praeparata cum Melle (Zhigancao), and nonglutinous rice (Jingmi). The chemical composition of BHT remains largely elusive. To qualitatively and quantitatively characterize secondary metabolites and carbohydrates in BHT, here a combination of analytical approaches using ultraperformance liquid chromatography coupled with quadrupole time-of-flight mass spectrometry and ultraperformance liquid chromatography coupled with photodiode array detector was developed and validated. A total of 42 secondary metabolites in BHT were tentatively or definitely identified, of which 10 major chemicals were quantified by the extracted-ion mode of quadrupole time-of-flight mass spectrometry. Meanwhile, polysaccharides, oligosaccharides, and monosaccharides in BHT were also characterized via sample pretreatment followed by sugar composition analysis. The quantitative results indicated that the determined chemicals accounted for 35.76% of the total extract of BHT, which demonstrated that the study could be instrumental in chemical dissection and quality control of BHT. The research deliverables not only laid the foundation for further chemical and biological evaluation of BHT, but also provided a comprehensive analytical strategy for chemical characterization of secondary metabolites and carbohydrates in traditional Chinese medicine formulas. Copyright © 2017. Published by Elsevier B.V.

  20. Improving couples' quality of life through a Web-based prostate cancer education intervention.

    PubMed

    Song, Lixin; Rini, Christine; Deal, Allison M; Nielsen, Matthew E; Chang, Hao; Kinneer, Patty; Teal, Randall; Johnson, David C; Dunn, Mary W; Mark, Barbara; Palmer, Mary H

    2015-03-01

    To evaluate the feasibility and acceptability of a newly developed web-based, couple-oriented intervention called Prostate Cancer Education and Resources for Couples (PERC). Quantitative, qualitative, mixed-methods approach. Oncology outpatient clinics at the University of North Carolina (UNC) Lineberger Comprehensive Cancer Center at UNC–Chapel Hill. 26 patients with localized prostate cancer (PCa) and their partners. Pre- and postpilot quantitative assessments and a postpilot qualitative interview were conducted. General and PCa-specific symptoms, quality of life, psychosocial factors, PERC’s ease of use, and web activities. Improvement was shown in some PCa-specific and general symptoms (small effect sizes for patients and small-to-medium effect sizes for partners), overall quality of life, and physical and social domains of quality of life for patients (small effect sizes). Web activity data indicated high PERC use. Qualitative and quantitative analyses indicated that participants found PERC easy to use and understand,as well as engaging, of high quality, and relevant. Overall, participants were satisfied with PERC and reported that PERC improved their knowledge about symptom management and communication as a couple. PERC was a feasible, acceptable method of reducing the side effects of PCa treatment–related symptoms and improving quality of life. PERC has the potential to reduce the negative impacts of symptoms and enhance quality of life for patients with localized PCa and their partners, particularly for those who live in rural areas and have limited access to post-treatment supportive care.

  1. FAA center for aviation systems reliability: an overview

    NASA Astrophysics Data System (ADS)

    Brasche, Lisa J. H.

    1996-11-01

    The FAA Center for Aviation Systems Reliability has as its objectives: to develop quantitative nondestructive evaluation (NDE) methods for aircraft structures and materials, including prototype instrumentation, software, techniques and procedures; and to develop and maintain comprehensive education and training programs specific to the inspection of aviation structures. The program, which includes contributions from Iowa State University, Northwestern University, Wayne State University, Tuskegee University, AlliedSignal Propulsion Engines, General Electric Aircraft Engines and Pratt and Whitney, has been in existence since 1990. Efforts under way include: development of inspection techniques for adhesively bonded structures; detection of corrosion; development of advanced NDE concepts that form the basis for an inspection simulator; improvements of titanium inspection as part of the Engine Titanium Consortium; and development of education and training programs. An overview of the efforts under way will be provided, with a focus on those technologies closest to technology transfer.

  2. Numerical study of read scheme in one-selector one-resistor crossbar array

    NASA Astrophysics Data System (ADS)

    Kim, Sungho; Kim, Hee-Dong; Choi, Sung-Jin

    2015-12-01

    A comprehensive numerical circuit analysis of read schemes for a one-selector one-resistor (1S1R) crossbar memory array is carried out. Three schemes, the ground, V/2, and V/3 schemes, are compared with each other in terms of sensing margin and power consumption. Without the aid of a complex analytical approach or SPICE-based simulation, a simple numerical iteration method is developed to simulate entire current flows and node voltages within a crossbar array. Understanding such phenomena is essential in successfully evaluating the electrical specifications of selectors for suppressing intrinsic drawbacks of crossbar arrays, such as sneak current paths and series line resistance problems. This method provides a quantitative tool for the accurate analysis of crossbar arrays and provides guidelines for developing an optimal read scheme, array configuration, and selector device specifications.
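
    A quick way to see why the choice of read scheme matters is to count how many unselected or half-selected cells sit at each bias level under the ground, V/2, and V/3 schemes and to sum a nonlinear cell current over them. The selector model and all numbers below are hypothetical, ideal drivers are assumed, and line resistance is ignored, so this is only a back-of-envelope companion to the full numerical analysis.

```python
import numpy as np

def cell_current(v, i0=1e-9, v0=0.15):
    """Hypothetical nonlinear selector+cell characteristic (sinh-type)."""
    return i0 * np.sinh(abs(v) / v0)

def sneak_power(scheme, n_rows, n_cols, v_read):
    """Power dissipated in unselected/half-selected cells for one selected cell,
    assuming ideal drivers and no line resistance."""
    if scheme == "ground":
        # Unselected rows and columns grounded: only the other cells on the
        # selected row are biased, at the full read voltage.
        biases = [(v_read, n_cols - 1)]
    elif scheme == "V/2":
        # Half-selected cells on the selected row and column see V/2;
        # fully unselected cells see 0 V.
        biases = [(v_read / 2, (n_rows - 1) + (n_cols - 1))]
    elif scheme == "V/3":
        # Every cell except the selected one sees +/- V/3.
        biases = [(v_read / 3, n_rows * n_cols - 1)]
    else:
        raise ValueError(scheme)
    return sum(abs(v) * cell_current(v) * count for v, count in biases)

for scheme in ("ground", "V/2", "V/3"):
    p = sneak_power(scheme, n_rows=256, n_cols=256, v_read=1.5)
    print(f"{scheme:>6}: {p * 1e6:.2f} uW dissipated in non-selected cells")
```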

  3. A multimethod approach to examining usability of Web privacy polices and user agents for specifying privacy preferences.

    PubMed

    Proctor, Robert W; Vu, Kim-Phuong L

    2007-05-01

    Because all research methods have strengths and weaknesses, a multimethod approach often provides the best way to understand human behavior in applied settings. We describe how a multimethod approach was employed in a series of studies designed to examine usability issues associated with two aspects of online privacy: comprehension of privacy policies and configuration of privacy preferences for an online user agent. Archival research, user surveys, data mining, quantitative observations, and controlled experiments each yielded unique findings that, together, contributed to increased understanding of online-privacy issues for users. These findings were used to evaluate the accessibility of Web privacy policies to computer-literate users, determine whether people can configure user agents to achieve specific privacy goals, and discover ways in which the usability of those agents can be improved.

  4. ThunderSTORM: a comprehensive ImageJ plug-in for PALM and STORM data analysis and super-resolution imaging

    PubMed Central

    Ovesný, Martin; Křížek, Pavel; Borkovec, Josef; Švindrych, Zdeněk; Hagen, Guy M.

    2014-01-01

    Summary: ThunderSTORM is an open-source, interactive and modular plug-in for ImageJ designed for automated processing, analysis and visualization of data acquired by single-molecule localization microscopy methods such as photo-activated localization microscopy and stochastic optical reconstruction microscopy. ThunderSTORM offers an extensive collection of processing and post-processing methods so that users can easily adapt the process of analysis to their data. ThunderSTORM also offers a set of tools for creation of simulated data and quantitative performance evaluation of localization algorithms using Monte Carlo simulations. Availability and implementation: ThunderSTORM and the online documentation are both freely accessible at https://code.google.com/p/thunder-storm/ Contact: guy.hagen@lf1.cuni.cz Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24771516

  5. Chemistry of the surface and lower atmosphere of Venus

    NASA Technical Reports Server (NTRS)

    Fegley, B., Jr.; Treiman, A.

    1992-01-01

    A comprehensive overview of the chemical interactions between the atmosphere and surface of Venus is presented. Earth-based, earth-orbital, and spacecraft data on the composition of the atmosphere and surface of Venus are presented and applied to quantitative evaluations of the chemical interactions between carbon, hydrogen, sulfur, chlorine, fluorine, and nitrogen-containing gases and possible minerals on the Venus surface. The calculation results are used to predict stable minerals and mineral assemblages on the Venus surface to determine which, if any, atmospheric gases are buffered by mineral assemblages on the surface, and to critically review and assess prior work on atmosphere-surface chemistry on Venus. It is concluded that the CO2 pressure on Venus is comparable to the CO2 equilibrium partial pressure developed by the calcite + wollastonite + quartz assemblage at the mean Venus surface temperature of 740 K.

  6. RxnSim: a tool to compare biochemical reactions.

    PubMed

    Giri, Varun; Sivakumar, Tadi Venkata; Cho, Kwang Myung; Kim, Tae Yong; Bhaduri, Anirban

    2015-11-15

    Quantitative assessment of chemical reaction similarity aids database searches, classification of reactions and identification of candidate enzymes. Most methods evaluate reaction similarity based on chemical transformation patterns. We describe a tool, RxnSim, which computes reaction similarity based on the molecular signatures of participating molecules. The tool is able to compare reactions based on similarities of substrates and products in addition to their transformation. It allows masking of user-defined chemical moieties for weighted similarity computations. RxnSim is implemented in R and is freely available from the Comprehensive R Archive Network, CRAN (http://cran.r-project.org/web/packages/RxnSim/). Contact: anirban.b@samsung.com or ty76.kim@samsung.com. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
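
    RxnSim itself is an R package; purely to illustrate the general idea of scoring reaction similarity from the molecular signatures of participating molecules, the Python/RDKit sketch below computes a Tanimoto similarity on pooled substrate-side and product-side fingerprints. It is not RxnSim's algorithm, and the example SMILES are arbitrary.

```python
from functools import reduce
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

def pooled_fp(smiles_list, radius=2, n_bits=2048):
    """OR-combine Morgan fingerprints of all molecules on one side of a reaction."""
    fps = [AllChem.GetMorganFingerprintAsBitVect(Chem.MolFromSmiles(s), radius, nBits=n_bits)
           for s in smiles_list]
    return reduce(lambda a, b: a | b, fps)

def reaction_similarity(rxn_a, rxn_b):
    """rxn_* = (substrate_smiles, product_smiles); average of the substrate-side
    and product-side Tanimoto similarities."""
    sub = DataStructs.TanimotoSimilarity(pooled_fp(rxn_a[0]), pooled_fp(rxn_b[0]))
    prod = DataStructs.TanimotoSimilarity(pooled_fp(rxn_a[1]), pooled_fp(rxn_b[1]))
    return 0.5 * (sub + prod)

# Arbitrary example: ethanol -> acetaldehyde vs. 1-propanol -> propanal.
rxn1 = (["CCO"], ["CC=O"])
rxn2 = (["CCCO"], ["CCC=O"])
print(round(reaction_similarity(rxn1, rxn2), 3))
```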

  7. Interim Outcomes Assessment of the Comprehensive Clinical Performance Grid for Student Evaluation.

    ERIC Educational Resources Information Center

    Tolls, Dorothy Bazzinotti; Carlson, Nancy; Wilson, Roger; Richman, Jack

    2001-01-01

    Assessed the viability of the Comprehensive Clinical Performance Grid for Student Evaluation, introduced at The New England College of Optometry in 1996 in clinical student assessment. Analyzed faculty and student feedback and consistency with previous evaluations, between evaluators, and between clinical sites and tracts. Found satisfaction with…

  8. 76 FR 9849 - Comprehensive Environmental Evaluations for Antarctic Activities

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-22

    ... DEPARTMENT OF STATE [Public Notice 7340] Comprehensive Environmental Evaluations for Antarctic... Environmental Evaluations (CEEs) for activities proposed to be undertaken in Antarctica. Interested members of... on Environmental Protection to the Antarctic Treaty requires the preparation of a CEE for any...

  9. Assessing the detail needed to capture rainfall-runoff dynamics with physics-based hydrologic response simulation

    USGS Publications Warehouse

    Mirus, B.B.; Ebel, B.A.; Heppner, C.S.; Loague, K.

    2011-01-01

    Concept development simulation with distributed, physics-based models provides a quantitative approach for investigating runoff generation processes across environmental conditions. Disparities within data sets employed to design and parameterize boundary value problems used in heuristic simulation inevitably introduce various levels of bias. The objective was to evaluate the impact of boundary value problem complexity on process representation for different runoff generation mechanisms. The comprehensive physics-based hydrologic response model InHM has been employed to generate base case simulations for four well-characterized catchments. The C3 and CB catchments are located within steep, forested environments dominated by subsurface stormflow; the TW and R5 catchments are located in gently sloping rangeland environments dominated by Dunne and Horton overland flows. Observational details are well captured within all four of the base case simulations, but the characterization of soil depth, permeability, rainfall intensity, and evapotranspiration differs for each. These differences are investigated through the conversion of each base case into a reduced case scenario, all sharing the same level of complexity. Evaluation of how individual boundary value problem characteristics impact simulated runoff generation processes is facilitated by quantitative analysis of integrated and distributed responses at high spatial and temporal resolution. Generally, the base case reduction causes moderate changes in discharge and runoff patterns, with the dominant process remaining unchanged. Moderate differences between the base and reduced cases highlight the importance of detailed field observations for parameterizing and evaluating physics-based models. Overall, similarities between the base and reduced cases indicate that the simpler boundary value problems may be useful for concept development simulation to investigate fundamental controls on the spectrum of runoff generation mechanisms. Copyright 2011 by the American Geophysical Union.

  10. Robustness analysis of superpixel algorithms to image blur, additive Gaussian noise, and impulse noise

    NASA Astrophysics Data System (ADS)

    Brekhna, Brekhna; Mahmood, Arif; Zhou, Yuanfeng; Zhang, Caiming

    2017-11-01

    Superpixels have gradually become popular in computer vision and image processing applications. However, no comprehensive study has been performed to evaluate the robustness of superpixel algorithms in regard to common forms of noise in natural images. We evaluated the robustness of 11 recently proposed algorithms to different types of noise. The images were corrupted with various degrees of Gaussian blur, additive white Gaussian noise, and impulse noise that either made the object boundaries weak or added extra information to it. We performed a robustness analysis of simple linear iterative clustering (SLIC), Voronoi Cells (VCells), flooding-based superpixel generation (FCCS), bilateral geodesic distance (Bilateral-G), superpixel via geodesic distance (SSS-G), manifold SLIC (M-SLIC), Turbopixels, superpixels extracted via energy-driven sampling (SEEDS), lazy random walk (LRW), real-time superpixel segmentation by DBSCAN clustering, and video supervoxels using partially absorbing random walks (PARW) algorithms. The evaluation process was carried out both qualitatively and quantitatively. For quantitative performance comparison, we used achievable segmentation accuracy (ASA), compactness, under-segmentation error (USE), and boundary recall (BR) on the Berkeley image database. The results demonstrated that all algorithms suffered performance degradation due to noise. For Gaussian blur, Bilateral-G exhibited optimal results for ASA and USE measures, SLIC yielded optimal compactness, whereas FCCS and DBSCAN remained optimal for BR. For the case of additive Gaussian and impulse noises, FCCS exhibited optimal results for ASA, USE, and BR, whereas Bilateral-G remained a close competitor in ASA and USE for Gaussian noise only. Additionally, Turbopixel demonstrated optimal performance for compactness for both types of noise. Thus, no single algorithm was able to yield optimal results for all three types of noise across all performance measures. Conclusively, to solve real-world problems effectively, more robust superpixel algorithms must be developed.
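
    Two of the quantitative measures used above have compact formulations that a short sketch can make explicit. The code below implements one common definition of achievable segmentation accuracy (ASA) and boundary recall (BR) on tiny hypothetical label maps; the exact definitions used in the study may differ in detail.

```python
import numpy as np
from scipy.ndimage import binary_dilation

def achievable_segmentation_accuracy(sp_labels, gt_labels):
    """ASA: each superpixel is assigned to its best-overlapping ground-truth segment."""
    total = 0
    for sp in np.unique(sp_labels):
        gt_under_sp = gt_labels[sp_labels == sp]
        total += np.bincount(gt_under_sp).max()
    return total / gt_labels.size

def boundary_map(labels):
    """Pixels whose right or lower neighbour carries a different label."""
    b = np.zeros(labels.shape, dtype=bool)
    b[:, :-1] |= labels[:, :-1] != labels[:, 1:]
    b[:-1, :] |= labels[:-1, :] != labels[1:, :]
    return b

def boundary_recall(sp_labels, gt_labels, tol=2):
    """Fraction of ground-truth boundary pixels lying within `tol` pixels
    of a superpixel boundary pixel."""
    gt_b = boundary_map(gt_labels)
    sp_b = binary_dilation(boundary_map(sp_labels), iterations=tol)
    return (gt_b & sp_b).sum() / max(gt_b.sum(), 1)

# Tiny hypothetical example: ground truth split into halves,
# superpixels split into four quadrants.
gt = np.zeros((8, 8), dtype=int); gt[:, 4:] = 1
sp = np.zeros((8, 8), dtype=int); sp[:4, 4:] = 1; sp[4:, :4] = 2; sp[4:, 4:] = 3
print(achievable_segmentation_accuracy(sp, gt), boundary_recall(sp, gt))
```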

  11. Evaluation of MuSyQ land surface albedo based on LAnd surface Parameters VAlidation System (LAPVAS)

    NASA Astrophysics Data System (ADS)

    Dou, B.; Wen, J.; Xinwen, L.; Zhiming, F.; Wu, S.; Zhang, Y.

    2016-12-01

    Satellite-derived land surface albedo is an essential climate variable that controls the Earth's energy budget and is used in applications such as climate change studies, hydrology, and numerical weather prediction. However, the accuracy and uncertainty of surface albedo products should be evaluated against reliable reference data before such applications. A new comprehensive and systematic project in China, the Remote Sensing Application Network (CRSAN), has been launched in recent years. Two components of this project are the Multi-source data Synergized Quantitative Remote Sensing Production System (MuSyQ) and a web-based validation system named the LAnd surface remote sensing Product VAlidation System (LAPVAS), which aim, respectively, to generate quantitative remote sensing products for ecosystem and environmental monitoring and to validate them against reference data with a standard validation system. Land surface BRDF/albedo is one of the MuSyQ product datasets; it has a pentad (5-day) period with 1 km spatial resolution and is derived by the Multi-sensor Combined BRDF Inversion (MCBI) model. In this MuSyQ albedo evaluation, a multi-validation strategy is implemented through LAPVAS, including direct and multi-scale validation with field-measured albedo and cross validation with the MODIS albedo product over different land covers. The results reveal that the 5-day MuSyQ albedo data show higher sensitivity and accuracy during periods of land cover change, e.g., snowfall. When snow and land cover change are excluded, MuSyQ albedo is generally of similar accuracy to the MODIS albedo product and meets the climate modeling requirement of an absolute accuracy of 0.05.
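
    Checking a retrieval against the stated absolute-accuracy requirement of 0.05 reduces to simple accuracy statistics over matched satellite-field pairs; the sketch below shows that arithmetic with invented numbers.

```python
import numpy as np

# Hypothetical matched pairs: field-measured vs. satellite-retrieved albedo.
field = np.array([0.18, 0.22, 0.35, 0.60, 0.15, 0.27])
retrieved = np.array([0.21, 0.20, 0.31, 0.66, 0.17, 0.24])

diff = retrieved - field
bias = diff.mean()
rmse = np.sqrt((diff ** 2).mean())
within_spec = np.mean(np.abs(diff) <= 0.05)   # fraction meeting the 0.05 requirement

print(f"bias={bias:+.3f}  RMSE={rmse:.3f}  within +/-0.05: {within_spec:.0%}")
```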

  12. A School-Based Evaluation Model for Accelerating the Education of Students At-Risk.

    ERIC Educational Resources Information Center

    Fetterman, David M.; Haertel, Edward H.

    This paper presents ideas for the development and utilization of a comprehensive evaluation plan for an accelerated school. It contains information about the purposes of a comprehensive evaluation, the evaluation design, and the kinds of data that might be gathered and used. The first section, "An Approach to Evaluation: Multiple Purposes and…

  13. Comparative Evaluation of Quantitative Test Methods for Gases on a Hard Surface

    DTIC Science & Technology

    2017-02-01

    ECBC-TR-1426; Rastogi, Vipin. Each quantitative method was performed three times on three consecutive days.

  14. The neural basis of hand gesture comprehension: A meta-analysis of functional magnetic resonance imaging studies.

    PubMed

    Yang, Jie; Andric, Michael; Mathew, Mili M

    2015-10-01

    Gestures play an important role in face-to-face communication and have been increasingly studied via functional magnetic resonance imaging. Although a large amount of data has been provided to describe the neural substrates of gesture comprehension, these findings have never been quantitatively summarized and the conclusion is still unclear. This activation likelihood estimation meta-analysis investigated the brain networks underpinning gesture comprehension while considering the impact of gesture type (co-speech gestures vs. speech-independent gestures) and task demand (implicit vs. explicit) on the brain activation of gesture comprehension. The meta-analysis of 31 papers showed that as hand actions, gestures involve a perceptual-motor network important for action recognition. As meaningful symbols, gestures involve a semantic network for conceptual processing. Finally, during face-to-face interactions, gestures involve a network for social emotive processes. Our finding also indicated that gesture type and task demand influence the involvement of the brain networks during gesture comprehension. The results highlight the complexity of gesture comprehension, and suggest that future research is necessary to clarify the dynamic interactions among these networks. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. Quantitation of Specific Barley, Rye, and Oat Marker Peptides by Targeted Liquid Chromatography-Mass Spectrometry To Determine Gluten Concentrations.

    PubMed

    Schalk, Kathrin; Koehler, Peter; Scherf, Katharina Anne

    2018-04-04

    Celiac disease is triggered by the ingestion of gluten from wheat, barley, rye, and possibly oats. Gluten is quantitated by DNA-based methods or enzyme-linked immunosorbent assays (ELISAs). ELISAs mostly detect the prolamin fraction and potentially over- or underestimate gluten contents. Therefore, a new independent method is required to comprehensively detect gluten. A targeted liquid chromatography-tandem mass spectrometry method was developed to quantitate seven barley, seven rye, and three oat marker peptides derived from each gluten protein fraction (prolamin and glutelin) and type (barley, B-, C-, D-, and γ-hordeins; rye, γ-75k-, γ-40k-, ω-, and HMW-secalins). The quantitation of each marker peptide in the chymotryptic digest of a defined amount of the respective reference gluten protein type resulted in peptide-specific yields, which enabled the conversion of peptide into protein concentrations. This method was applied to quantitate gluten in samples from the brewing process, in raw materials for sourdough fermentation, and in dried sourdoughs.
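
    The conversion from measured marker-peptide concentrations to gluten protein-type concentrations via peptide-specific yields is a simple scaling, sketched below with invented peptide names, yields, and concentrations.

```python
# Hypothetical peptide-specific yields: nmol of marker peptide released per mg of the
# corresponding reference gluten protein type during chymotryptic digestion.
yield_nmol_per_mg = {"hordein_B_pep1": 12.4, "secalin_g75k_pep1": 8.9}

# Hypothetical measured peptide concentrations in a sample digest (nmol per g sample).
measured_nmol_per_g = {"hordein_B_pep1": 3.1, "secalin_g75k_pep1": 1.7}

# Protein-type concentration (mg protein per g sample) = peptide concentration / yield.
for pep, conc in measured_nmol_per_g.items():
    protein_mg_per_g = conc / yield_nmol_per_mg[pep]
    print(f"{pep}: {protein_mg_per_g:.3f} mg/g of the corresponding protein type")
```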

  16. A quantification model for the structure of clay materials.

    PubMed

    Tang, Liansheng; Sang, Haitao; Chen, Haokun; Sun, Yinlei; Zhang, Longjian

    2016-07-04

    In this paper, the quantification of clay structure is explicitly explained, and the approach to and goals of quantification are discussed. The authors consider the purpose of quantifying clay structure to be the determination of parameters that quantitatively characterize the impact of clay structure on macro-mechanical behaviour. Based on system theory and the law of energy conservation, a quantification model for the structural characteristics of clay materials is established, three quantitative parameters (i.e., deformation structure potential, strength structure potential, and comprehensive structure potential) are proposed, and the corresponding tests are conducted. The experimental results show that these parameters accurately reflect the influence of clay structure on deformation behaviour, on strength behaviour, and on the relative magnitude of the structural influence on these two behaviours, respectively. The quantitative parameters have explicit mechanical meanings and can be used to characterize the structural influence of clay on its mechanical behaviour.

  17. Recommendations for the standardization of bone marrow disease assessment and reporting in children with neuroblastoma on behalf of the International Neuroblastoma Response Criteria Bone Marrow Working Group.

    PubMed

    Burchill, Susan A; Beiske, Klaus; Shimada, Hiroyuki; Ambros, Peter F; Seeger, Robert; Tytgat, Godelieve A M; Brock, Penelope R; Haber, Michelle; Park, Julie R; Berthold, Frank

    2017-04-01

    The current study was conducted to expedite international standardized reporting of bone marrow disease in children with neuroblastoma and to improve equivalence of care. A multidisciplinary International Neuroblastoma Response Criteria Bone Marrow Working Group was convened by the US National Cancer Institute in January 2012 with representation from Europe, North America, and Australia. Practical transferable recommendations to standardize the reporting of bone marrow disease were developed. To the authors' knowledge, the current study is the first to comprehensively present consensus criteria for the collection, analysis, and reporting of the percentage area of bone marrow parenchyma occupied by tumor cells in trephine-biopsies. The quantitative analysis of neuroblastoma content in bone marrow aspirates by immunocytology and reverse transcriptase-quantitative polymerase chain reaction is revised. The inclusion of paired-like homeobox 2b (PHOX2B) for immunohistochemistry and reverse transcriptase-quantitative polymerase chain reaction is recommended. Recommendations for recording bone marrow response are provided. The authors endorse the quantitative assessment of neuroblastoma cell content in bilateral core needle biopsies-trephines and aspirates in all children with neuroblastoma, with the exception of infants, in whom the evaluation of aspirates alone is advised. It is interesting to note that 5% disease is accepted as an internationally achievable level for disease assessment. The quantitative assessment of neuroblastoma cells is recommended to provide data from which evidence-based numerical criteria for the reporting of bone marrow response can be realized. This is particularly important in the minimal disease setting and when neuroblastoma detection in bone marrow is intermittent, where clinical impact has yet to be validated. The wide adoption of these harmonized criteria will enhance the ability to compare outcomes from different trials and facilitate collaborative trial design. Cancer 2017;123:1095-1105. © 2016 American Cancer Society.

  18. Navigating spatial and temporal complexity in developing a long-term land use database for an agricultural watershed

    USDA-ARS?s Scientific Manuscript database

    No comprehensive protocols exist for the collection, standardization, and storage of agronomic management information into a database that preserves privacy, maintains data uncertainty, and translates everyday decisions into quantitative values. This manuscript describes the development of a databas...

  19. Genetic basis for evolved tolerance to dioxin-like pollutants in wild Atlantic killifish: more than the aryl hydrocarbon receptor

    EPA Science Inventory

    Atlantic killifish (Fundulus heteroclitus) resident to some US urban and industrialized estuaries demonstrate recently evolved and extreme tolerance to toxic dioxin-like compounds (DLCs). Here we provide an unusually comprehensive accounting (69%) through Quantitative Trait Locu...

  20. Effectiveness of Facebook in English Language Learning: A Case Study

    ERIC Educational Resources Information Center

    Faryadi, Qais

    2017-01-01

    The prime objective of this research was to investigate whether Facebook helped undergraduate students of Universiti Sains Islam Malaysia (USIM) improve their English language proficiency, critical thinking, comprehension skills, and motivation. A triangulation method (quantitative, qualitative, and descriptive) was employed in the investigation.…

  1. Understanding and Teaching Complex Texts

    ERIC Educational Resources Information Center

    Fisher, Douglas; Frey, Nancy

    2014-01-01

    Teachers in today's classrooms struggle every day to design instructional interventions that would build students' reading skills and strategies in order to ensure their comprehension of complex texts. Text complexity can be determined in both qualitative and quantitative ways. In this article, the authors describe various innovative…

  2. Selecting Cases for Intensive Analysis: A Diversity of Goals and Methods

    ERIC Educational Resources Information Center

    Gerring, John; Cojocaru, Lee

    2016-01-01

    This study revisits the task of case selection in case study research, proposing a new typology of strategies that is explicit, disaggregated, and relatively comprehensive. A secondary goal is to explore the prospects for case selection by "algorithm," aka "ex ante," "automatic," "quantitative,"…

  3. 34 CFR 668.142 - Special definitions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    .... General learned abilities: Cognitive operations, such as deductive reasoning, reading comprehension, or translation from graphic to numerical representation, that may be learned in both school and non-school...,” “curricula,” or “basic verbal and quantitative skills,” the basic knowledge or skills generally learned in...

  4. 34 CFR 668.142 - Special definitions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    .... General learned abilities: Cognitive operations, such as deductive reasoning, reading comprehension, or translation from graphic to numerical representation, that may be learned in both school and non-school...,” “curricula,” or “basic verbal and quantitative skills,” the basic knowledge or skills generally learned in...

  5. 34 CFR 668.142 - Special definitions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    .... General learned abilities: Cognitive operations, such as deductive reasoning, reading comprehension, or translation from graphic to numerical representation, that may be learned in both school and non-school...,” “curricula,” or “basic verbal and quantitative skills,” the basic knowledge or skills generally learned in...

  6. 34 CFR 668.142 - Special definitions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    .... General learned abilities: Cognitive operations, such as deductive reasoning, reading comprehension, or translation from graphic to numerical representation, that may be learned in both school and non-school...,” “curricula,” or “basic verbal and quantitative skills,” the basic knowledge or skills generally learned in...

  7. Symptom validity test performance and consistency of self-reported memory functioning of Operation Enduring Freedom/Operation Iraqi freedom veterans with positive Veteran Health Administration Comprehensive Traumatic Brain Injury evaluations.

    PubMed

    Russo, Arthur C

    2012-12-01

    Operation Enduring Freedom and Operation Iraqi Freedom combat veterans given definite diagnoses of mild Traumatic Brain Injury (TBI) during the Veteran Health Administration (VHA) Comprehensive TBI evaluation and reporting no post-deployment head injury were examined to assess (a) consistency of self-reported memory impairment and (b) symptom validity test (SVT) performance via a two-part study. Study 1 found that while 49 of 50 veterans reported moderate to very severe memory impairment during the VHA Comprehensive TBI evaluation, only 7 had reported any memory problem at the time of their Department of Defense (DOD) post-deployment health assessment. Study 2 found that of 38 veterans referred for neuropsychological evaluations following a positive VHA Comprehensive TBI evaluation, 68.4% failed the Word Memory Test, a forced choice memory recognition symptom validity task. Together, these studies raise questions concerning the use of veteran symptom self-report for TBI assessments and argue for the inclusion of SVTs and the expanded use of contemporaneous DOD records to improve the diagnostic accuracy of the VHA Comprehensive TBI evaluation.

  8. An evaluation of the nursing success program: reading comprehension, graduation rates, and diversity.

    PubMed

    Symes, Lene; Tart, Kathryn; Travis, Lucille

    2005-01-01

    The Nursing Success Program was developed to enhance retention of baccalaureate nursing students. Reading comprehension scores are used to identify students who are at risk for failure and direct them into the retention program that addresses their skill deficits. To evaluate the program, the authors assessed reading comprehension, graduation rates, and ethnic diversity.

  9. Using a Systematic Conceptual Model for a Process Evaluation of a Middle School Obesity Risk-Reduction Nutrition Curriculum Intervention: Choice, Control & Change

    PubMed Central

    Lee, Heewon; Contento, Isobel R.; Koch, Pamela

    2012-01-01

    Objective: To use and review a conceptual model of process evaluation and to examine the implementation of a nutrition education curriculum, Choice, Control & Change, designed to promote dietary and physical activity behaviors that reduce obesity risk. Design: A process evaluation study based on a systematic conceptual model. Setting: Five middle schools in New York City. Participants: 562 students in 20 classes and their science teachers (n=8). Main Outcome Measures: Based on the model, teacher professional development, teacher implementation, and student reception were evaluated. Also measured were teacher characteristics, teachers' curriculum evaluation, and satisfaction with teaching the curriculum. Analysis: Descriptive statistics and Spearman's rho correlation for quantitative analysis and content analysis for qualitative data were used. Results: Mean score of the teacher professional development evaluation was 4.75 on a 5-point scale. Average teacher implementation rate was 73%, and student reception rate was 69%. Ongoing teacher support was highly valued by teachers. Teachers' satisfaction with teaching the curriculum was highly correlated with students' satisfaction (p < .05). Teachers' perception of amount of student work was negatively correlated with implementation and with student satisfaction (p < .05). Conclusions and Implications: Use of a systematic conceptual model and comprehensive process measures improves understanding of the implementation process and helps educators to better implement interventions as designed. PMID:23321021
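
    For readers unfamiliar with the statistic named above, the following minimal sketch computes a Spearman rank correlation of the kind used in this process evaluation (e.g., relating teacher satisfaction to student satisfaction across classes). It assumes SciPy is available; the data values are invented for illustration only.

```python
# Minimal sketch of a Spearman rank correlation, as used in the process
# evaluation above. The satisfaction scores below are hypothetical.
from scipy.stats import spearmanr

teacher_satisfaction = [4.5, 3.8, 4.9, 4.1, 3.5, 4.7, 4.0, 4.4]  # hypothetical, per teacher
student_satisfaction = [4.2, 3.6, 4.8, 4.0, 3.4, 4.5, 3.9, 4.3]  # hypothetical class means

rho, p_value = spearmanr(teacher_satisfaction, student_satisfaction)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```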

  10. Using a systematic conceptual model for a process evaluation of a middle school obesity risk-reduction nutrition curriculum intervention: choice, control & change.

    PubMed

    Lee, Heewon; Contento, Isobel R; Koch, Pamela

    2013-03-01

    To use and review a conceptual model of process evaluation and to examine the implementation of a nutrition education curriculum, Choice, Control & Change, designed to promote dietary and physical activity behaviors that reduce obesity risk. A process evaluation study based on a systematic conceptual model. Five middle schools in New York City. Five hundred sixty-two students in 20 classes and their science teachers (n = 8). Based on the model, teacher professional development, teacher implementation, and student reception were evaluated. Also measured were teacher characteristics, teachers' curriculum evaluation, and satisfaction with teaching the curriculum. Descriptive statistics and Spearman ρ correlation for quantitative analysis and content analysis for qualitative data were used. Mean score of the teacher professional development evaluation was 4.75 on a 5-point scale. Average teacher implementation rate was 73%, and the student reception rate was 69%. Ongoing teacher support was highly valued by teachers. Teacher satisfaction with teaching the curriculum was highly correlated with student satisfaction (P < .05). Teacher perception of amount of student work was negatively correlated with implementation and with student satisfaction (P < .05). Use of a systematic conceptual model and comprehensive process measures improves understanding of the implementation process and helps educators to better implement interventions as designed. Copyright © 2013 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.

  11. Development of an interprofessional lean facilitator assessment scale.

    PubMed

    Bravo-Sanchez, Cindy; Dorazio, Vincent; Denmark, Robert; Heuer, Albert J; Parrott, J Scott

    2018-05-01

    High reliability is important for optimising quality and safety in healthcare organisations. Reliability efforts include interprofessional collaborative practice (IPCP) and Lean quality/process improvement strategies, which require skilful facilitation. Currently, no validated Lean facilitator assessment tool for interprofessional collaboration exists. This article describes the development and pilot evaluation of such a tool, the Interprofessional Lean Facilitator Assessment Scale (ILFAS), which measures both technical and 'soft' skills not measured in other instruments. The ILFAS was developed using methodologies and principles from Lean/Shingo, IPCP, metacognition research and Bloom's Taxonomy of Learning Domains. A panel of experts confirmed the initial face validity of the instrument. Researchers independently assessed five facilitators during six Lean sessions. Analysis included quantitative evaluation of rater agreement. Overall inter-rater agreement of the assessment of facilitator performance was high (92%), and discrepancies in the agreement statistics were analysed. Face and content validity were further established, and usability was evaluated, through primary stakeholder post-pilot feedback, uncovering minor concerns and leading to tool revision. The ILFAS appears comprehensive in its assessment of facilitator knowledge, skills, and abilities, and may be useful in discriminating between facilitators of different skill levels. Further study is needed to explore instrument performance and validity.
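
    As a minimal sketch of a simple percent-agreement calculation of the kind summarized by the 92% figure above, the snippet below compares two raters' item-level scores. The item ratings are invented, and the study's actual scoring rubric and agreement statistic are not reproduced here.

```python
# Minimal sketch: percent agreement between two raters on item-level scores.
# The ratings are hypothetical placeholders for illustration only.

rater_a = [3, 4, 4, 2, 5, 3, 4, 4, 3, 5, 4, 2]  # hypothetical item ratings
rater_b = [3, 4, 3, 2, 5, 3, 4, 4, 3, 5, 4, 3]

matches = sum(a == b for a, b in zip(rater_a, rater_b))
agreement = matches / len(rater_a)
print(f"Inter-rater agreement: {agreement:.0%}")
```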

  12. Epidemiology of Meningitis in an HIV-Infected Ugandan Cohort

    PubMed Central

    Rajasingham, Radha; Rhein, Joshua; Klammer, Kate; Musubire, Abdu; Nabeta, Henry; Akampurira, Andrew; Mossel, Eric C.; Williams, Darlisha A.; Boxrud, Dave J.; Crabtree, Mary B.; Miller, Barry R.; Rolfes, Melissa A.; Tengsupakul, Supatida; Andama, Alfred O.; Meya, David B.; Boulware, David R.

    2015-01-01

    There is limited understanding of the epidemiology of meningitis among human immunodeficiency virus (HIV)-infected populations in sub-Saharan Africa. We conducted a prospective cohort study of HIV-infected adults with suspected meningitis in Uganda, to comprehensively evaluate the etiologies of meningitis. Intensive cerebrospinal fluid (CSF) testing was performed to evaluate for bacterial, viral, fungal, and mycobacterial etiologies, including neurosyphilis; 16S ribosomal DNA (rDNA) polymerase chain reaction (PCR) for bacteria; Plex-ID broad viral assay; quantitative-PCR for HSV-1/2, cytomegalovirus (CMV), Epstein–Barr virus (EBV), and Toxoplasma gondii; reverse transcription-PCR (RT-PCR) for enteroviruses and arboviruses; and Xpert MTB/RIF assay. Cryptococcal meningitis accounted for 60% (188 of 314) of all causes of meningitis. Of 117 samples sent for viral PCR, 36% were EBV positive. Among cryptococcal antigen-negative patients, the yield of Xpert MTB/RIF assay was 22% (8 of 36). After exclusion of cryptococcosis and bacterial meningitis, 61% (43 of 71) with an abnormal CSF profile had no definitive diagnosis. Exploration of new TB diagnostics and diagnostic algorithms for evaluation of meningitis in resource-limited settings remains needed, and implementation of cryptococcal diagnostics is critical. PMID:25385864

  13. Epidemiology of meningitis in an HIV-infected Ugandan cohort.

    PubMed

    Rajasingham, Radha; Rhein, Joshua; Klammer, Kate; Musubire, Abdu; Nabeta, Henry; Akampurira, Andrew; Mossel, Eric C; Williams, Darlisha A; Boxrud, Dave J; Crabtree, Mary B; Miller, Barry R; Rolfes, Melissa A; Tengsupakul, Supatida; Andama, Alfred O; Meya, David B; Boulware, David R

    2015-02-01

    There is limited understanding of the epidemiology of meningitis among human immunodeficiency virus (HIV)-infected populations in sub-Saharan Africa. We conducted a prospective cohort study of HIV-infected adults with suspected meningitis in Uganda, to comprehensively evaluate the etiologies of meningitis. Intensive cerebrospinal fluid (CSF) testing was performed to evaluate for bacterial, viral, fungal, and mycobacterial etiologies, including neurosyphilis; 16S ribosomal DNA (rDNA) polymerase chain reaction (PCR) for bacteria; Plex-ID broad viral assay; quantitative-PCR for HSV-1/2, cytomegalovirus (CMV), Epstein-Barr virus (EBV), and Toxoplasma gondii; reverse transcription-PCR (RT-PCR) for enteroviruses and arboviruses; and Xpert MTB/RIF assay. Cryptococcal meningitis accounted for 60% (188 of 314) of all causes of meningitis. Of 117 samples sent for viral PCR, 36% were EBV positive. Among cryptococcal antigen-negative patients, the yield of Xpert MTB/RIF assay was 22% (8 of 36). After exclusion of cryptococcosis and bacterial meningitis, 61% (43 of 71) with an abnormal CSF profile had no definitive diagnosis. Exploration of new TB diagnostics and diagnostic algorithms for evaluation of meningitis in resource-limited settings remains needed, and implementation of cryptococcal diagnostics is critical. © The American Society of Tropical Medicine and Hygiene.

  14. Quantitative multimodality imaging in cancer research and therapy.

    PubMed

    Yankeelov, Thomas E; Abramson, Richard G; Quarles, C Chad

    2014-11-01

    Advances in hardware and software have enabled the realization of clinically feasible, quantitative multimodality imaging of tissue pathophysiology. Earlier efforts relating to multimodality imaging of cancer have focused on the integration of anatomical and functional characteristics, such as PET-CT and single-photon emission CT (SPECT-CT), whereas more-recent advances and applications have involved the integration of multiple quantitative, functional measurements (for example, multiple PET tracers, varied MRI contrast mechanisms, and PET-MRI), thereby providing a more-comprehensive characterization of the tumour phenotype. The enormous amount of complementary quantitative data generated by such studies is beginning to offer unique insights into opportunities to optimize care for individual patients. Although important technical optimization and improved biological interpretation of multimodality imaging findings are needed, this approach can already be applied informatively in clinical trials of cancer therapeutics using existing tools. These concepts are discussed herein.

  15. Aquifer water abundance evaluation using a fuzzy-comprehensive weighting method

    NASA Astrophysics Data System (ADS)

    Wei, Z.

    2016-08-01

    Aquifer water abundance evaluation is a highly relevant issue that has been researched for many years. Despite prior research, problems with the conventional evaluation method remain. This paper establishes an aquifer water abundance evaluation method that combines fuzzy evaluation with a comprehensive weighting method to overcome both the subjectivity and the lack of conformity involved in determining weights by pure data analysis alone. First, this paper introduces the principle of the fuzzy-comprehensive weighting method. Second, the example of well field no. 3 (of a coalfield) is used to illustrate the method's process. The evaluation results show that this method better meets the practical requirements of aquifer water abundance assessment, leading to more precise and accurate evaluations. Ultimately, this paper provides a new method for aquifer water abundance evaluation.
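
    As an illustration of the combined approach named above, the sketch below composes a weight vector with a fuzzy membership matrix and assigns the water-abundance grade with the maximum resulting membership. The evaluation indices, weights, and membership values are hypothetical placeholders, not the values from the well field no. 3 example.

```python
# Minimal sketch of a weighted fuzzy comprehensive evaluation step:
# composite vector B = W . R, then the maximum-membership grade is selected.
# All indices, weights, and memberships are hypothetical placeholders.
import numpy as np

grades = ["weak", "moderate", "strong"]

# Comprehensive weights for the evaluation indices (e.g., aquifer thickness,
# specific yield, core recovery) -- hypothetical values that sum to 1.
weights = np.array([0.4, 0.35, 0.25])

# Membership matrix R: one row per index, one column per water-abundance grade.
R = np.array([
    [0.1, 0.6, 0.3],   # hypothetical memberships for index 1
    [0.2, 0.5, 0.3],   # index 2
    [0.5, 0.4, 0.1],   # index 3
])

B = weights @ R  # fuzzy composite evaluation vector
memberships = {g: round(float(v), 3) for g, v in zip(grades, B)}
print("Memberships:", memberships)
print("Assessed grade:", grades[int(np.argmax(B))])  # maximum-membership principle
```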

  16. Task-oriented evaluation of electronic medical records systems: development and validation of a questionnaire for physicians

    PubMed Central

    2004-01-01

    Background: Evaluation is a challenging but necessary part of the development cycle of clinical information systems like the electronic medical records (EMR) system. It is believed that such evaluations should include multiple perspectives, be comparative and employ both qualitative and quantitative methods. Self-administered questionnaires are frequently used as a quantitative evaluation method in medical informatics, but very few validated questionnaires address clinical use of EMR systems. Methods: We have developed a task-oriented questionnaire for evaluating EMR systems from the clinician's perspective. The key feature of the questionnaire is a list of 24 general clinical tasks. It is applicable to physicians of most specialties and covers essential parts of their information-oriented work. The task list appears in two separate sections, about EMR use and task performance using the EMR, respectively. By combining these sections, the evaluator may estimate the potential impact of the EMR system on health care delivery. The results may also be compared across time, site or vendor. This paper describes the development, performance and validation of the questionnaire. Its performance is shown in two demonstration studies (n = 219 and 80). Its content is validated in an interview study (n = 10), and its reliability is investigated in a test-retest study (n = 37) and a scaling study (n = 31). Results: In the interviews, the physicians found the general clinical tasks in the questionnaire relevant and comprehensible. The tasks were interpreted concordant to their definitions. However, the physicians found questions about tasks not explicitly or only partially supported by the EMR systems difficult to answer. The two demonstration studies provided unambiguous results and low percentages of missing responses. In addition, criterion validity was demonstrated for a majority of task-oriented questions. Their test-retest reliability was generally high, and the non-standard scale was found symmetric and ordinal. Conclusion: This questionnaire is relevant for clinical work and EMR systems, provides reliable and interpretable results, and may be used as part of any evaluation effort involving the clinician's perspective of an EMR system. PMID:15018620

  17. Incorporating big data into treatment plan evaluation: Development of statistical DVH metrics and visualization dashboards.

    PubMed

    Mayo, Charles S; Yao, John; Eisbruch, Avraham; Balter, James M; Litzenberg, Dale W; Matuszak, Martha M; Kessler, Marc L; Weyburn, Grant; Anderson, Carlos J; Owen, Dawn; Jackson, William C; Ten Haken, Randall

    2017-01-01

    To develop statistical dose-volume histogram (DVH)-based metrics and a visualization method to quantify the comparison of treatment plans with historical experience and among different institutions. The descriptive statistical summary (ie, median, first and third quartiles, and 95% confidence intervals) of volume-normalized DVH curve sets of past experiences was visualized through the creation of statistical DVH plots. Detailed distribution parameters were calculated and stored in JavaScript Object Notation files to facilitate management, including transfer and potential multi-institutional comparisons. In the treatment plan evaluation, structure DVH curves were scored against computed statistical DVHs and weighted experience scores (WESs). Individual, clinically used, DVH-based metrics were integrated into a generalized evaluation metric (GEM) as a priority-weighted sum of normalized incomplete gamma functions. Historical treatment plans for 351 patients with head and neck cancer, 104 with prostate cancer who were treated with conventional fractionation, and 94 with liver cancer who were treated with stereotactic body radiation therapy were analyzed to demonstrate the usage of statistical DVH, WES, and GEM in a plan evaluation. A shareable dashboard plugin was created to display statistical DVHs and integrate GEM and WES scores into a clinical plan evaluation within the treatment planning system. Benchmarking with normal tissue complication probability scores was carried out to compare the behavior of GEM and WES scores. DVH curves from historical treatment plans were characterized and presented, with difficult-to-spare structures (ie, frequently compromised organs at risk) identified. Quantitative evaluations by GEM and/or WES compared favorably with the normal tissue complication probability Lyman-Kutcher-Burman model, transforming a set of discrete threshold-priority limits into a continuous model reflecting physician objectives and historical experience. Statistical DVH offers an easy-to-read, detailed, and comprehensive way to visualize the quantitative comparison with historical experiences and among institutions. WES and GEM metrics offer a flexible means of incorporating discrete threshold-prioritizations and historic context into a set of standardized scoring metrics. Together, they provide a practical approach for incorporating big data into clinical practice for treatment plan evaluations.
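
    The sketch below illustrates, under stated assumptions, the two ideas named in this abstract: percentile summaries of a set of historical volume-normalized DVH curves (a "statistical DVH") and a GEM-style score built as a priority-weighted sum of normalized incomplete gamma functions. The dose grid, stand-in curves, metric values, thresholds, priorities, and gamma shape are hypothetical and do not reproduce the paper's actual parameterization.

```python
# Minimal sketch of (1) a statistical DVH as percentile bands over historical
# curves and (2) a GEM-style priority-weighted sum of normalized incomplete
# gamma functions. All numbers are hypothetical placeholders.
import numpy as np
from scipy.special import gammainc  # regularized lower incomplete gamma

# (1) Statistical DVH: percentiles across historical curves at each dose point.
dose_grid = np.linspace(0, 70, 71)                   # Gy, hypothetical grid
historical_dvhs = np.random.default_rng(0).uniform(  # stand-in for real curves
    0.0, 1.0, size=(50, dose_grid.size))
historical_dvhs.sort(axis=1)
historical_dvhs = historical_dvhs[:, ::-1]           # make each curve monotone decreasing
dvh_median = np.percentile(historical_dvhs, 50, axis=0)
dvh_q1, dvh_q3 = np.percentile(historical_dvhs, [25, 75], axis=0)
print("Median fractional volume at 20 Gy:", round(float(dvh_median[20]), 3))

# (2) GEM-style score: each clinical metric contributes a normalized incomplete
# gamma term that grows as the metric approaches or exceeds its threshold.
def gem(metrics, thresholds, priorities, shape=4.0):
    terms = [gammainc(shape, shape * m / t) for m, t in zip(metrics, thresholds)]
    w = np.asarray(priorities, dtype=float)
    return float(np.dot(w / w.sum(), terms))

# Hypothetical plan metrics vs. threshold-priority limits (e.g., mean doses in Gy).
score = gem(metrics=[24.0, 18.0], thresholds=[26.0, 20.0], priorities=[3, 1])
print(f"GEM-style score: {score:.3f}")
```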

  18. Task-oriented evaluation of electronic medical records systems: development and validation of a questionnaire for physicians.

    PubMed

    Laerum, Hallvard; Faxvaag, Arild

    2004-02-09

    Evaluation is a challenging but necessary part of the development cycle of clinical information systems like the electronic medical records (EMR) system. It is believed that such evaluations should include multiple perspectives, be comparative and employ both qualitative and quantitative methods. Self-administered questionnaires are frequently used as a quantitative evaluation method in medical informatics, but very few validated questionnaires address clinical use of EMR systems. We have developed a task-oriented questionnaire for evaluating EMR systems from the clinician's perspective. The key feature of the questionnaire is a list of 24 general clinical tasks. It is applicable to physicians of most specialties and covers essential parts of their information-oriented work. The task list appears in two separate sections, about EMR use and task performance using the EMR, respectively. By combining these sections, the evaluator may estimate the potential impact of the EMR system on health care delivery. The results may also be compared across time, site or vendor. This paper describes the development, performance and validation of the questionnaire. Its performance is shown in two demonstration studies (n = 219 and 80). Its content is validated in an interview study (n = 10), and its reliability is investigated in a test-retest study (n = 37) and a scaling study (n = 31). In the interviews, the physicians found the general clinical tasks in the questionnaire relevant and comprehensible. The tasks were interpreted concordant to their definitions. However, the physicians found questions about tasks not explicitly or only partially supported by the EMR systems difficult to answer. The two demonstration studies provided unambiguous results and low percentages of missing responses. In addition, criterion validity was demonstrated for a majority of task-oriented questions. Their test-retest reliability was generally high, and the non-standard scale was found symmetric and ordinal. This questionnaire is relevant for clinical work and EMR systems, provides reliable and interpretable results, and may be used as part of any evaluation effort involving the clinician's perspective of an EMR system.

  19. A Trial of Nursing Cost Accounting using Nursing Practice Data on a Hospital Information System.

    PubMed

    Miyahira, Akiko; Tada, Kazuko; Ishima, Masatoshi; Nagao, Hidenori; Miyamoto, Tadashi; Nakagawa, Yoshiaki; Takemura, Tadamasa

    2015-01-01

    Hospital administration is very important, and many hospitals carry out activity-based costing under comprehensive medicine. However, nursing costs remain unclear: nursing practice is expanding both quantitatively and qualitatively, it is difficult to capture all nursing practices, and nursing costs are in many cases calculated only in aggregate. On the other hand, nursing information systems (NIS) are implemented in many hospitals in Japan, and nursing practice data are becoming available. In this paper, we propose a nursing cost accounting model and simulate costs by nursing contribution using NIS data.
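
    A minimal sketch of a cost-by-nursing-contribution simulation of the kind proposed above: nursing practice records drawn from an NIS are costed by multiplying recorded practice time by a labor rate and aggregating per patient. All practice names, times, and rates are hypothetical placeholders rather than the paper's actual model.

```python
# Minimal sketch: costing NIS practice records by time and a labor rate,
# aggregated per patient. All values are hypothetical placeholders.

labor_rate_per_minute = 55.0  # hypothetical nursing labor cost (e.g., JPY/min)

# Hypothetical NIS records: (patient_id, practice, minutes)
nis_records = [
    ("P001", "vital signs check", 10),
    ("P001", "wound dressing", 25),
    ("P002", "medication administration", 15),
    ("P002", "vital signs check", 10),
]

costs_per_patient: dict[str, float] = {}
for patient_id, practice, minutes in nis_records:
    costs_per_patient[patient_id] = (
        costs_per_patient.get(patient_id, 0.0) + minutes * labor_rate_per_minute
    )

for patient_id, cost in costs_per_patient.items():
    print(f"{patient_id}: nursing cost {cost:,.0f}")
```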

  20. Quantitative template for subtyping primary progressive aphasia.

    PubMed

    Mesulam, Marsel; Wieneke, Christina; Rogalski, Emily; Cobia, Derin; Thompson, Cynthia; Weintraub, Sandra

    2009-12-01

    The syndrome of primary progressive aphasia (PPA) is diagnosed when a gradual failure of word usage or comprehension emerges as the principal feature of a neurodegenerative disease. To provide a quantitative algorithm for classifying PPA into agrammatic (PPA-G), semantic (PPA-S), and logopenic (PPA-L) variants, each of which is known to have a different probability of association with Alzheimer disease vs frontotemporal lobar degeneration. Prospective study. University medical center. Sixteen consecutively enrolled patients with PPA who underwent neuropsychological testing and magnetic resonance imaging recruited nationally in the United States as part of a longitudinal study. A 2-dimensional template that reflects performance on tests of syntax (Northwestern Anagram Test) and lexical semantics (Peabody Picture Vocabulary Test-Fourth Edition) classified all 16 patients in concordance with a clinical diagnosis that had been made before the administration of quantitative tests. All 3 PPA subtypes had distinctly asymmetrical atrophy of the left perisylvian language network. Each subtype also had distinctive peak atrophy sites: PPA-G in the inferior frontal gyrus (Broca area), PPA-S in the anterior temporal lobe, and PPA-L in Brodmann area 37. Once an accurate root diagnosis of PPA is made, subtyping can be quantitatively guided using a 2-dimensional template based on orthogonal tasks of grammatical competence and word comprehension. Although the choice of tasks and the precise cutoff levels may need to be adjusted to fit linguistic and educational backgrounds, these 16 patients demonstrate the feasibility of using a simple algorithm for clinicoanatomical classification in PPA. Prospective studies will show whether this subtyping can improve clinical prediction of the underlying neuropathologic condition.
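
    The sketch below shows a toy two-dimensional classification template in the spirit of the algorithm described above, dichotomizing a syntax score and a lexical-semantic score. The cutoff values and the exact decision rule are hypothetical placeholders and are not the paper's published thresholds.

```python
# Minimal sketch of a 2D classification template: one axis for syntax and one
# for lexical semantics, each dichotomized at a cutoff. Cutoffs and the decision
# rule here are hypothetical; the study's actual thresholds are not reproduced.

SYNTAX_CUTOFF = 0.70     # hypothetical proportion-correct cutoff
SEMANTICS_CUTOFF = 0.70  # hypothetical proportion-correct cutoff

def classify_ppa(syntax_score: float, semantics_score: float) -> str:
    impaired_syntax = syntax_score < SYNTAX_CUTOFF
    impaired_semantics = semantics_score < SEMANTICS_CUTOFF
    if impaired_syntax and not impaired_semantics:
        return "PPA-G (agrammatic)"
    if impaired_semantics and not impaired_syntax:
        return "PPA-S (semantic)"
    return "PPA-L (logopenic)"  # neither (or both) axes below cutoff in this toy rule

print(classify_ppa(syntax_score=0.55, semantics_score=0.90))  # -> PPA-G (agrammatic)
```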
