Sample records for quantitative integrated assessment

  1. Integrated Environmental Modeling: Quantitative Microbial Risk Assessment

    EPA Science Inventory

    The presentation discusses the need for microbial assessments and presents a road map for quantitative microbial risk assessments through an integrated environmental modeling approach. A brief introduction and the strengths of the current knowledge are illustrated. W...

  2. Development and Assessment of Modules to Integrate Quantitative Skills in Introductory Biology Courses

    ERIC Educational Resources Information Center

    Hoffman, Kathleen; Leupen, Sarah; Dowell, Kathy; Kephart, Kerrie; Leips, Jeff

    2016-01-01

    Redesigning undergraduate biology courses to integrate quantitative reasoning and skill development is critical to prepare students for careers in modern medicine and scientific research. In this paper, we report on the development, implementation, and assessment of stand-alone modules that integrate quantitative reasoning into introductory…

  3. QUANTITATIVE ASSESSMENT OF INTEGRATED PHRENIC NERVE ACTIVITY

    PubMed Central

    Nichols, Nicole L.; Mitchell, Gordon S.

    2016-01-01

    Integrated electrical activity in the phrenic nerve is commonly used to assess within-animal changes in phrenic motor output. Because of concerns regarding the consistency of nerve recordings, activity is most often expressed as a percent change from baseline values. However, absolute values of nerve activity are necessary to assess the impact of neural injury or disease on phrenic motor output. To date, no systematic evaluations of the repeatability/reliability have been made among animals when phrenic recordings are performed by an experienced investigator using standardized methods. We performed a meta-analysis of studies reporting integrated phrenic nerve activity in many rat groups by the same experienced investigator; comparisons were made during baseline and maximal chemoreceptor stimulation in 14 wild-type Harlan and 14 Taconic Sprague Dawley groups, and in 3 pre-symptomatic and 11 end-stage SOD1G93A Taconic rat groups (an ALS model). Meta-analysis results indicate: 1) consistent measurements of integrated phrenic activity in each sub-strain of wild-type rats; 2) with bilateral nerve recordings, left-to-right integrated phrenic activity ratios are ~1.0; and 3) consistently reduced activity in end-stage SOD1G93A rats. Thus, with appropriate precautions, integrated phrenic nerve activity enables robust, quantitative comparisons among nerves or experimental groups, including differences caused by neuromuscular disease. PMID:26724605
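    The two normalizations described above (percent change from baseline, and the left-to-right ratio from bilateral recordings) are simple arithmetic; a minimal sketch, with function names that are our own rather than the authors':

    ```python
    def percent_of_baseline(activity, baseline):
        """Express an integrated phrenic burst amplitude as a percent of
        the same animal's baseline value (the within-animal normalization
        described in the abstract)."""
        return 100.0 * activity / baseline

    def left_right_ratio(left, right):
        """Left-to-right ratio of integrated activity from bilateral
        recordings; values near 1.0 indicate symmetric motor output."""
        return left / right
    ```

    Percent-of-baseline comparisons are robust to electrode-to-electrode amplitude differences, which is why absolute comparisons among animals require the additional precautions the meta-analysis evaluates.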

  4. Quantitative assessment of integrated phrenic nerve activity.

    PubMed

    Nichols, Nicole L; Mitchell, Gordon S

    2016-06-01

    Integrated electrical activity in the phrenic nerve is commonly used to assess within-animal changes in phrenic motor output. Because of concerns regarding the consistency of nerve recordings, activity is most often expressed as a percent change from baseline values. However, absolute values of nerve activity are necessary to assess the impact of neural injury or disease on phrenic motor output. To date, no systematic evaluations of the repeatability/reliability have been made among animals when phrenic recordings are performed by an experienced investigator using standardized methods. We performed a meta-analysis of studies reporting integrated phrenic nerve activity in many rat groups by the same experienced investigator; comparisons were made during baseline and maximal chemoreceptor stimulation in 14 wild-type Harlan and 14 Taconic Sprague Dawley groups, and in 3 pre-symptomatic and 11 end-stage SOD1(G93A) Taconic rat groups (an ALS model). Meta-analysis results indicate: (1) consistent measurements of integrated phrenic activity in each sub-strain of wild-type rats; (2) with bilateral nerve recordings, left-to-right integrated phrenic activity ratios are ∼1.0; and (3) consistently reduced activity in end-stage SOD1(G93A) rats. Thus, with appropriate precautions, integrated phrenic nerve activity enables robust, quantitative comparisons among nerves or experimental groups, including differences caused by neuromuscular disease.

  5. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, an...

  6. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    USDA-ARS's Scientific Manuscript database

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and human health effect...

  7. Using Integrated Environmental Modeling to Automate a Process-Based Quantitative Microbial Risk Assessment (presentation)

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and...

  8. Development and Assessment of Modules to Integrate Quantitative Skills in Introductory Biology Courses

    PubMed Central

    Hoffman, Kathleen; Leupen, Sarah; Dowell, Kathy; Kephart, Kerrie; Leips, Jeff

    2016-01-01

    Redesigning undergraduate biology courses to integrate quantitative reasoning and skill development is critical to prepare students for careers in modern medicine and scientific research. In this paper, we report on the development, implementation, and assessment of stand-alone modules that integrate quantitative reasoning into introductory biology courses. Modules are designed to improve skills in quantitative numeracy, interpreting data sets using visual tools, and making inferences about biological phenomena using mathematical/statistical models. We also examine demographic/background data that predict student improvement in these skills through exposure to these modules. We carried out pre/postassessment tests across four semesters and used student interviews in one semester to examine how students at different levels approached quantitative problems. We found that students improved in all skills in most semesters, although there was variation in the degree of improvement among skills from semester to semester. One demographic variable, transfer status, stood out as a major predictor of the degree to which students improved (transfer students achieved much lower gains every semester, despite the fact that pretest scores in each focus area were similar between transfer and nontransfer students). We propose that increased exposure to quantitative skill development in biology courses is effective at building competency in quantitative reasoning. PMID:27146161

  9. Development and Assessment of Modules to Integrate Quantitative Skills in Introductory Biology Courses.

    PubMed

    Hoffman, Kathleen; Leupen, Sarah; Dowell, Kathy; Kephart, Kerrie; Leips, Jeff

    2016-01-01

    Redesigning undergraduate biology courses to integrate quantitative reasoning and skill development is critical to prepare students for careers in modern medicine and scientific research. In this paper, we report on the development, implementation, and assessment of stand-alone modules that integrate quantitative reasoning into introductory biology courses. Modules are designed to improve skills in quantitative numeracy, interpreting data sets using visual tools, and making inferences about biological phenomena using mathematical/statistical models. We also examine demographic/background data that predict student improvement in these skills through exposure to these modules. We carried out pre/postassessment tests across four semesters and used student interviews in one semester to examine how students at different levels approached quantitative problems. We found that students improved in all skills in most semesters, although there was variation in the degree of improvement among skills from semester to semester. One demographic variable, transfer status, stood out as a major predictor of the degree to which students improved (transfer students achieved much lower gains every semester, despite the fact that pretest scores in each focus area were similar between transfer and nontransfer students). We propose that increased exposure to quantitative skill development in biology courses is effective at building competency in quantitative reasoning.

  10. A quantitative integrated assessment of pollution prevention achieved by integrated pollution prevention control licensing.

    PubMed

    Styles, David; O'Brien, Kieran; Jones, Michael B

    2009-11-01

    This paper presents an innovative, quantitative assessment of pollution avoidance attributable to environmental regulation enforced through integrated licensing, using Ireland's pharmaceutical-manufacturing sector as a case study. Emissions data reported by pharmaceutical installations were aggregated into a pollution trend using an Environmental Emissions Index (EEI) based on Lifecycle Assessment methodologies. Complete sectoral emissions data from 2001 to 2007 were extrapolated back to 1995, based on available data. Production volume data were used to derive a sectoral production index, and determine 'no-improvement' emission trends, whilst questionnaire responses from 20 industry representatives were used to quantify the contribution of integrated licensing to emission avoidance relative to these trends. Between 2001 and 2007, there was a 40% absolute reduction in direct pollution from 27 core installations, and 45% pollution avoidance relative to hypothetical 'no-improvement' pollution. It was estimated that environmental regulation avoided 20% of 'no-improvement' pollution, in addition to 25% avoidance under business-as-usual. For specific emissions, avoidance ranged from 14% and 30 kt a⁻¹ for CO₂ to 88% and 598 t a⁻¹ for SOₓ. Between 1995 and 2007, there was a 59% absolute reduction in direct pollution, and 76% pollution avoidance. Pollution avoidance was dominated by reductions in emissions of VOCs, SOₓ and NOₓ to air, and emissions of heavy metals to water. Pollution avoidance of 35% was attributed to integrated licensing, ranging from 8% and 2.9 t a⁻¹ for phosphorus emissions to water to 49% and 3143 t a⁻¹ for SOₓ emissions to air. Environmental regulation enforced through integrated licensing has been the major driver of substantial pollution avoidance achieved by Ireland's pharmaceutical sector - through emission limit values associated with Best Available Techniques, emissions monitoring and reporting requirements, and
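    The 'no-improvement' accounting above reduces to simple arithmetic: scale baseline emissions by the production index to obtain the hypothetical trend, then express avoidance as the fractional shortfall of actual emissions relative to that trend. A minimal sketch with illustrative numbers (not the study's data; the function names are our own):

    ```python
    def no_improvement_emissions(baseline_emissions, production_index):
        """Hypothetical emissions had emissions per unit of output stayed
        at the baseline level while production volume changed."""
        return baseline_emissions * production_index

    def pollution_avoidance(actual_emissions, no_improvement):
        """Fractional avoidance relative to the 'no-improvement' trend."""
        return 1.0 - actual_emissions / no_improvement
    ```

    For example, if production doubled while actual emissions rose only from 100 to 110 index points, avoidance relative to the 200-point no-improvement level would be 45%.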

  11. Integrated Quantitative Cancer Risk Assessment of Inorganic Arsenic

    EPA Science Inventory

    This paper attempts to make an integrated risk assessment of arsenic, using data on humans exposed to arsenic via inhalation and ingestion. The data useful for making an integrated analysis, and data gaps, are discussed. Arsenic provides a rare opportunity to compare the cancer risk ...

  12. Assessing integrity of insect RNA

    USDA-ARS's Scientific Manuscript database

    Assessing total RNA integrity is important for the success of downstream RNA applications. The 2100 Bioanalyzer system with the RNA Integrity Number (RIN) provides a quantitative measure of RNA degradation. Although RINs may not be ascertained for RNA from all organisms, namely those with unusual or...

  13. An integrated environmental modeling framework for performing Quantitative Microbial Risk Assessments

    EPA Science Inventory

    Standardized methods are often used to assess the likelihood of a human-health effect from exposure to a specified hazard, and inform opinions and decisions about risk management and communication. A Quantitative Microbial Risk Assessment (QMRA) is specifically adapted to detail ...

  14. An integrated environmental modeling framework for performing quantitative microbial risk assessments

    USDA-ARS's Scientific Manuscript database

    Standardized methods are often used to assess the likelihood of a human-health effect from exposure to a specified hazard, and inform opinions and decisions about risk management and communication. A Quantitative Microbial Risk Assessment (QMRA) is specifically adapted to detail potential human-heal...

  15. EMERGY METHODS: VALUABLE INTEGRATED ASSESSMENT TOOLS

    EPA Science Inventory

    NHEERL's Atlantic Ecology Division is investigating emergy methods as tools for integrated assessment in several projects evaluating environmental impacts, policies, and alternatives for remediation and intervention. Emergy accounting is a methodology that provides a quantitative...

  16. Integrated narrative assessment exemplification: a leukaemia case history.

    PubMed

    Artioli, Giovanna; Foà, Chiara; Cosentino, Chiara; Sollami, Alfonso; Taffurelli, Chiara

    2017-07-18

    In the Integrated Narrative Nursing Assessment (INNA), the Evidence-Based Nursing Model is integrated with the Narrative-Based Nursing Model. The INNA makes use of quantitative instruments, arising from the natural sciences, as well as qualitative ones, arising from the human sciences, achieving results of standardization and reproducibility as well as of customization and uniqueness. Accordingly, the purpose of this work is to exemplify the thinking process of and the method adopted by a nurse performing an integrated narrative assessment in the evaluation of a patient. The patient suffered from acute myeloid leukaemia, treated with chemotherapy. Her nurse worked in a haematology ward in a hospital in northern Italy. The nurse had previous experience in conducting the assessment according to INNA. Based on the patient's characteristics, the nurse chose, among the various assessment instruments provided by the INNA, to use narration (to explore needs from the patient's subjective perception) and scales (to measure them objectively). The resultant integrated outcomes helped the nurse to form a comprehensive overview of the person's health-care needs and their connections. These outcomes derive from the integration of narrative information with that obtained from the scales, which in this paper have shown consistent results. It would be very difficult to reach this complexity by considering qualitative and quantitative assessment strategies as mutually foreclosing, given that both emerged as being very useful in identifying, understanding and measuring the needs of the assisted person. Both could then be used to design a customized intervention, encouraging new connections between disease, illness, sickness and everyday life.

  17. Integrating quantitative thinking into an introductory biology course improves students' mathematical reasoning in biological contexts.

    PubMed

    Hester, Susan; Buxner, Sanlyn; Elfring, Lisa; Nagy, Lisa

    2014-01-01

    Recent calls for improving undergraduate biology education have emphasized the importance of students learning to apply quantitative skills to biological problems. Motivated by students' apparent inability to transfer their existing quantitative skills to biological contexts, we designed and taught an introductory molecular and cell biology course in which we integrated application of prerequisite mathematical skills with biology content and reasoning throughout all aspects of the course. In this paper, we describe the principles of our course design and present illustrative examples of course materials integrating mathematics and biology. We also designed an outcome assessment made up of items testing students' understanding of biology concepts and their ability to apply mathematical skills in biological contexts and administered it as a pre/postcourse test to students in the experimental section and other sections of the same course. Precourse results confirmed students' inability to spontaneously transfer their prerequisite mathematics skills to biological problems. Pre/postcourse outcome assessment comparisons showed that, compared with students in other sections, students in the experimental section made greater gains on integrated math/biology items. They also made comparable gains on biology items, indicating that integrating quantitative skills into an introductory biology course does not have a deleterious effect on students' biology learning.

  18. Integrating Quantitative Thinking into an Introductory Biology Course Improves Students’ Mathematical Reasoning in Biological Contexts

    PubMed Central

    Hester, Susan; Buxner, Sanlyn; Elfring, Lisa; Nagy, Lisa

    2014-01-01

    Recent calls for improving undergraduate biology education have emphasized the importance of students learning to apply quantitative skills to biological problems. Motivated by students’ apparent inability to transfer their existing quantitative skills to biological contexts, we designed and taught an introductory molecular and cell biology course in which we integrated application of prerequisite mathematical skills with biology content and reasoning throughout all aspects of the course. In this paper, we describe the principles of our course design and present illustrative examples of course materials integrating mathematics and biology. We also designed an outcome assessment made up of items testing students’ understanding of biology concepts and their ability to apply mathematical skills in biological contexts and administered it as a pre/postcourse test to students in the experimental section and other sections of the same course. Precourse results confirmed students’ inability to spontaneously transfer their prerequisite mathematics skills to biological problems. Pre/postcourse outcome assessment comparisons showed that, compared with students in other sections, students in the experimental section made greater gains on integrated math/biology items. They also made comparable gains on biology items, indicating that integrating quantitative skills into an introductory biology course does not have a deleterious effect on students’ biology learning. PMID:24591504

  19. Quantitative Assessment of a Field-Based Course on Integrative Geology, Ecology and Cultural History

    ERIC Educational Resources Information Center

    Sheppard, Paul R.; Donaldson, Brad A.; Huckleberry, Gary

    2010-01-01

    A field-based course at the University of Arizona called Sense of Place (SOP) covers the geology, ecology and cultural history of the Tucson area. SOP was quantitatively assessed for pedagogical effectiveness. Students of the Spring 2008 course were given pre- and post-course word association surveys in order to assess awareness and comprehension…

  20. Toward Integration: From Quantitative Biology to Mathbio-Biomath?

    ERIC Educational Resources Information Center

    Marsteller, Pat; de Pillis, Lisette; Findley, Ann; Joplin, Karl; Pelesko, John; Nelson, Karen; Thompson, Katerina; Usher, David; Watkins, Joseph

    2010-01-01

    In response to the call of "BIO2010" for integrating quantitative skills into undergraduate biology education, 30 Howard Hughes Medical Institute (HHMI) Program Directors at the 2006 HHMI Program Directors Meeting established a consortium to investigate, implement, develop, and disseminate best practices resulting from the integration of math and…

  21. Teaching quantitative biology: goals, assessments, and resources

    PubMed Central

    Aikens, Melissa L.; Dolan, Erin L.

    2014-01-01

    More than a decade has passed since the publication of BIO2010, calling for an increased emphasis on quantitative skills in the undergraduate biology curriculum. In that time, relatively few papers have been published that describe educational innovations in quantitative biology or provide evidence of their effects on students. Using a “backward design” framework, we lay out quantitative skill and attitude goals, assessment strategies, and teaching resources to help biologists teach more quantitatively. Collaborations between quantitative biologists and education researchers are necessary to develop a broader and more appropriate suite of assessment tools, and to provide much-needed evidence on how particular teaching strategies affect biology students' quantitative skill development and attitudes toward quantitative work. PMID:25368425

  22. Quantitative meta-analytic approaches for the analysis of animal toxicology and epidemiologic data in human health risk assessments

    EPA Science Inventory

    Often, human health risk assessments have relied on qualitative approaches for hazard identification to integrate evidence across multiple studies to conclude whether particular hazards exist. However, quantitative approaches for evidence integration, including the application o...

  23. ATD-1 Operational Integration Assessment Final Report

    NASA Technical Reports Server (NTRS)

    Witzberger, Kevin E.; Sharma, Shivanjli; Martin, Lynn Hazel; Wynnyk, Mitch; McGarry, Katie

    2015-01-01

    The FAA and NASA conducted an Operational Integration Assessment (OIA) of a prototype Terminal Sequencing and Spacing (formerly TSS, now TSAS) system at the FAA's William J. Hughes Technical Center (WJHTC). The OIA took approximately one year to plan and execute, culminating in a formal data collection, referred to as the Run for Record, from May 12-21, 2015. This report presents quantitative and qualitative results from the Run for Record.

  24. Assessing healthcare professionals' experiences of integrated care: do surveys tell the full story?

    PubMed

    Stephenson, Matthew D; Campbell, Jared M; Lisy, Karolina; Aromataris, Edoardo C

    2017-09-01

    Integrated care is the combination of different healthcare services with the goal to provide comprehensive, seamless, effective and efficient patient care. Assessing the experiences of healthcare professionals (HCPs) is an important aspect when evaluating integrated care strategies. The aim of this rapid review was to investigate if quantitative surveys used to assess HCPs' experiences with integrated care capture all the aspects highlighted as being important in qualitative research, with a view to informing future survey development. The review considered all types of health professionals in primary care, and hospital and specialist services, with a specific focus on the provision of integrated care aimed at improving the patient journey. PubMed, CINAHL and grey literature sources were searched for relevant surveys/program evaluations and qualitative research studies. Full text articles deemed to be of relevance to the review were appraised for methodological quality using abridged critical appraisal instruments from the Joanna Briggs Institute. Data were extracted from included studies using standardized data extraction templates. Findings from included studies were grouped into domains based on similarity of meaning. Similarities and differences in the domains covered in quantitative surveys and those identified as being important in qualitative research were explored. A total of 37 studies (19 quantitative surveys, 14 qualitative studies and four mixed-method studies) were included in the review. A range of healthcare professions participated in the included studies, the majority being primary care providers. Common domains identified from quantitative surveys and qualitative studies included Communication, Agreement on Clear Roles and Responsibilities, Facilities, Information Systems, and Coordination of Care and Access. Qualitative research highlighted domains identified by HCPs as being relevant to their experiences with integrated care that have not

  25. A quantitative framework for assessing ecological resilience

    EPA Science Inventory

    Quantitative approaches to measure and assess resilience are needed to bridge gaps between science, policy, and management. In this paper, we suggest a quantitative framework for assessing ecological resilience. Ecological resilience as an emergent ecosystem phenomenon can be de...

  26. Accuracy of quantitative visual soil assessment

    NASA Astrophysics Data System (ADS)

    van Leeuwen, Maricke; Heuvelink, Gerard; Stoorvogel, Jetse; Wallinga, Jakob; de Boer, Imke; van Dam, Jos; van Essen, Everhard; Moolenaar, Simon; Verhoeven, Frank; Stoof, Cathelijne

    2016-04-01

    Visual soil assessment (VSA) is a method to assess soil quality visually, when standing in the field. VSA is increasingly used by farmers, farm organisations and companies, because it is rapid and cost-effective, and because looking at soil provides understanding about soil functioning. Often VSA is regarded as subjective, so there is a need to verify VSA. Also, many VSAs have not been fine-tuned for contrasting soil types. This could lead to wrong interpretation of soil quality and soil functioning when contrasting sites are compared to each other. We wanted to assess the accuracy of VSA, while taking into account soil type. The first objective was to test whether quantitative visual field observations, which form the basis of many VSAs, could be validated with standardized field or laboratory measurements. The second objective was to assess whether quantitative visual field observations are reproducible when used by observers with contrasting backgrounds. For the validation study, we made quantitative visual observations at 26 cattle farms. Farms were located on sand, clay and peat soils in the North Friesian Woodlands, the Netherlands. Quantitative visual observations evaluated were grass cover, number of biopores, number of roots, soil colour, soil structure, number of earthworms, number of gley mottles and soil compaction. Linear regression analysis showed that four out of eight quantitative visual observations could be well validated with standardized field or laboratory measurements. The following quantitative visual observations correlated well with standardized field or laboratory measurements: grass cover with classified images of surface cover; number of roots with root dry weight; amount of large structure elements with mean weight diameter; and soil colour with soil organic matter content. Correlation coefficients were greater than 0.3, and half of the correlations were significant. For the reproducibility study, a group of 9 soil scientists and 7
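    The validation step above (correlating visual scores against standardized field or laboratory measurements) rests on the ordinary Pearson correlation coefficient; a self-contained sketch, using made-up observations rather than the study's data:

    ```python
    import numpy as np

    def pearson_r(x, y):
        """Pearson correlation between two observation series, e.g.
        visually scored root counts versus measured root dry weight."""
        x = np.asarray(x, dtype=float)
        y = np.asarray(y, dtype=float)
        xc, yc = x - x.mean(), y - y.mean()
        return float((xc * yc).sum() / np.sqrt((xc**2).sum() * (yc**2).sum()))
    ```

    A coefficient above the study's 0.3 cut-off indicates at least a weak positive association between the visual score and the reference measurement.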

  27. Integrating expert opinion with modelling for quantitative multi-hazard risk assessment in the Eastern Italian Alps

    NASA Astrophysics Data System (ADS)

    Chen, Lixia; van Westen, Cees J.; Hussin, Haydar; Ciurean, Roxana L.; Turkington, Thea; Chavarro-Rincon, Diana; Shrestha, Dhruba P.

    2016-11-01

    Extreme rainfall events are the main triggering causes for hydro-meteorological hazards in mountainous areas, where development is often constrained by the limited space suitable for construction. In these areas, hazard and risk assessments are fundamental for risk mitigation, especially for preventive planning, risk communication and emergency preparedness. Multi-hazard risk assessment in mountainous areas at local and regional scales remain a major challenge because of lack of data related to past events and causal factors, and the interactions between different types of hazards. The lack of data leads to a high level of uncertainty in the application of quantitative methods for hazard and risk assessment. Therefore, a systematic approach is required to combine these quantitative methods with expert-based assumptions and decisions. In this study, a quantitative multi-hazard risk assessment was carried out in the Fella River valley, prone to debris flows and flood in the north-eastern Italian Alps. The main steps include data collection and development of inventory maps, definition of hazard scenarios, hazard assessment in terms of temporal and spatial probability calculation and intensity modelling, elements-at-risk mapping, estimation of asset values and the number of people, physical vulnerability assessment, the generation of risk curves and annual risk calculation. To compare the risk for each type of hazard, risk curves were generated for debris flows, river floods and flash floods. Uncertainties were expressed as minimum, average and maximum values of temporal and spatial probability, replacement costs of assets, population numbers, and physical vulnerability. These result in minimum, average and maximum risk curves. To validate this approach, a back analysis was conducted using the extreme hydro-meteorological event that occurred in August 2003 in the Fella River valley. The results show a good performance when compared to the historical damage reports.

  28. Towards quantitative assessment of calciphylaxis

    NASA Astrophysics Data System (ADS)

    Deserno, Thomas M.; Sárándi, István.; Jose, Abin; Haak, Daniel; Jonas, Stephan; Specht, Paula; Brandenburg, Vincent

    2014-03-01

    Calciphylaxis is a rare disease that has devastating conditions associated with high morbidity and mortality. Calciphylaxis is characterized by systemic medial calcification of the arteries yielding necrotic skin ulcerations. In this paper, we aim at supporting the installation of multi-center registries for calciphylaxis, which includes a photographic documentation of skin necrosis. However, photographs acquired in different centers under different conditions using different equipment and photographers cannot be compared quantitatively. For normalization, we use a simple color pad that is placed into the field of view, segmented from the image, and its color fields are analyzed. In total, 24 colors are printed on that scale. A least-squares approach is used to determine the affine color transform. Furthermore, the card allows scale normalization. We provide a case study for qualitative assessment. In addition, the method is evaluated quantitatively using 10 images of two sets of different captures of the same necrosis. The variability of quantitative measurements based on free hand photography is assessed regarding geometric and color distortions before and after our simple calibration procedure. Using automated image processing, the standard deviation of measurements is significantly reduced. The coefficients of variations yield 5-20% and 2-10% for geometry and color, respectively. Hence, quantitative assessment of calciphylaxis becomes practicable and will impact a better understanding of this rare but fatal disease.
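    The least-squares affine color transform described above can be sketched as follows; the reference values for the 24 color fields and the pad-segmentation step are assumed inputs, and all names are ours, not the authors':

    ```python
    import numpy as np

    def fit_affine_color_transform(observed, reference):
        """Fit a 4x3 affine transform mapping observed RGB values to
        reference RGB values by linear least squares.

        observed, reference: (N, 3) arrays of RGB triples (N >= 4),
        e.g. the 24 color fields segmented from the calibration pad.
        """
        obs = np.asarray(observed, dtype=float)
        ref = np.asarray(reference, dtype=float)
        # Augment with a constant column so the offset is estimated too.
        A = np.hstack([obs, np.ones((obs.shape[0], 1))])   # (N, 4)
        # Solve A @ M ~= ref for M in the least-squares sense.
        M, *_ = np.linalg.lstsq(A, ref, rcond=None)
        return M                                           # (4, 3)

    def apply_color_transform(pixels, M):
        """Apply the fitted affine transform to an (N, 3) pixel array."""
        pix = np.asarray(pixels, dtype=float)
        A = np.hstack([pix, np.ones((pix.shape[0], 1))])
        return A @ M
    ```

    After fitting on the pad's color fields, the same transform is applied to the whole photograph, so lesion colors from different cameras and lighting conditions become comparable.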

  29. Diagnostic accuracy of stress perfusion CMR in comparison with quantitative coronary angiography: fully quantitative, semiquantitative, and qualitative assessment.

    PubMed

    Mordini, Federico E; Haddad, Tariq; Hsu, Li-Yueh; Kellman, Peter; Lowrey, Tracy B; Aletras, Anthony H; Bandettini, W Patricia; Arai, Andrew E

    2014-01-01

    This study's primary objective was to determine the sensitivity, specificity, and accuracy of fully quantitative stress perfusion cardiac magnetic resonance (CMR) versus a reference standard of quantitative coronary angiography. We hypothesized that fully quantitative analysis of stress perfusion CMR would have high diagnostic accuracy for identifying significant coronary artery stenosis and exceed the accuracy of semiquantitative measures of perfusion and qualitative interpretation. Relatively few studies apply fully quantitative CMR perfusion measures to patients with coronary disease and comparisons to semiquantitative and qualitative methods are limited. Dual bolus dipyridamole stress perfusion CMR exams were performed in 67 patients with clinical indications for assessment of myocardial ischemia. Stress perfusion images alone were analyzed with a fully quantitative perfusion (QP) method and 3 semiquantitative methods including contrast enhancement ratio, upslope index, and upslope integral. Comprehensive exams (cine imaging, stress/rest perfusion, late gadolinium enhancement) were analyzed qualitatively with 2 methods including the Duke algorithm and standard clinical interpretation. A 70% or greater stenosis by quantitative coronary angiography was considered abnormal. The optimum diagnostic threshold for QP determined by receiver-operating characteristic curve occurred when endocardial flow decreased to <50% of mean epicardial flow, which yielded a sensitivity of 87% and specificity of 93%. The area under the curve for QP was 92%, which was superior to semiquantitative methods: contrast enhancement ratio: 78%; upslope index: 82%; and upslope integral: 75% (p = 0.011, p = 0.019, p = 0.004 vs. QP, respectively). Area under the curve for QP was also superior to qualitative methods: Duke algorithm: 70%; and clinical interpretation: 78% (p < 0.001 and p < 0.001 vs. QP, respectively). Fully quantitative stress perfusion CMR has high diagnostic accuracy for
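    The sensitivity and specificity figures above come from applying a single threshold to a continuous perfusion score. A generic sketch of that computation (toy data, not the study's), using the convention that lower scores indicate disease, as in the "endocardial flow < 50% of epicardial flow" rule:

    ```python
    import numpy as np

    def sensitivity_specificity(scores, labels, threshold):
        """Sensitivity and specificity of a test that calls 'abnormal'
        when score < threshold (lower perfusion implies disease).

        scores: continuous test values; labels: 1 = disease, 0 = normal.
        """
        scores = np.asarray(scores, dtype=float)
        labels = np.asarray(labels, dtype=int)
        predicted = scores < threshold
        tp = np.sum(predicted & (labels == 1))   # diseased, called abnormal
        fn = np.sum(~predicted & (labels == 1))  # diseased, missed
        tn = np.sum(~predicted & (labels == 0))  # normal, called normal
        fp = np.sum(predicted & (labels == 0))   # normal, called abnormal
        return tp / (tp + fn), tn / (tn + fp)
    ```

    Sweeping the threshold and plotting sensitivity against 1 - specificity traces the receiver-operating characteristic curve whose area the study reports.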

  10. The new AP Physics exams: Integrating qualitative and quantitative reasoning

    NASA Astrophysics Data System (ADS)

    Elby, Andrew

    2015-04-01

    When physics instructors and education researchers emphasize the importance of integrating qualitative and quantitative reasoning in problem solving, they usually mean using those types of reasoning serially and separately: first students should analyze the physical situation qualitatively/conceptually to figure out the relevant equations, then they should process those equations quantitatively to generate a solution, and finally they should use qualitative reasoning to check that answer for plausibility (Heller, Keith, & Anderson, 1992). The new AP Physics 1 and 2 exams will, of course, reward this approach to problem solving. But one kind of free response question will demand and reward a further integration of qualitative and quantitative reasoning, namely mathematical modeling and sense-making--inventing new equations to capture a physical situation and focusing on proportionalities, inverse proportionalities, and other functional relations to infer what the equation ``says'' about the physical world. In this talk, I discuss examples of these qualitative-quantitative translation questions, highlighting how they differ from both standard quantitative and standard qualitative questions. I then discuss the kinds of modeling activities that can help AP and college students develop these skills and habits of mind.
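The "what does the equation say" habit described above can be probed numerically: vary one quantity in an equation and read off the functional dependence. The pendulum formula here is a standard textbook example, not one drawn from the AP exams.

```python
import math

def period(L, g=9.81):
    """Small-angle pendulum period, T = 2*pi*sqrt(L/g)."""
    return 2 * math.pi * math.sqrt(L / g)

# Because T goes as the square root of L, doubling the length should
# multiply the period by sqrt(2), not by 2:
ratio = period(2.0) / period(1.0)
print(ratio, math.sqrt(2))
```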

  11. Quantitative Microbial Risk Assessment Tutorial - Primer

    EPA Science Inventory

    This document provides a Quantitative Microbial Risk Assessment (QMRA) primer that organizes QMRA tutorials. The tutorials describe functionality of a QMRA infrastructure, guide the user through software use and assessment options, provide step-by-step instructions for implementi...

  12. Quantitative myocardial blood flow imaging with integrated time-of-flight PET-MR.

    PubMed

    Kero, Tanja; Nordström, Jonny; Harms, Hendrik J; Sörensen, Jens; Ahlström, Håkan; Lubberink, Mark

    2017-12-01

    The use of integrated PET-MR offers new opportunities for comprehensive assessment of cardiac morphology and function. However, little is known about the quantitative accuracy of cardiac PET imaging with integrated time-of-flight PET-MR. The aim of the present work was to validate the GE Signa PET-MR scanner for quantitative cardiac PET perfusion imaging. Eleven patients (nine male; mean age 59 years; range 46-74 years) with known or suspected coronary artery disease underwent 15O-water PET scans at rest and during adenosine-induced hyperaemia on a GE Discovery ST PET-CT and a GE Signa PET-MR scanner. PET-MR images were reconstructed using settings recommended by the manufacturer, including time-of-flight (TOF). Data were analysed semi-automatically using Cardiac VUer software, resulting in both parametric myocardial blood flow (MBF) images and segment-based MBF values. Correlation and agreement between PET-CT-based and PET-MR-based MBF values for all three coronary artery territories were assessed using regression analysis and intra-class correlation coefficients (ICC). In addition to the cardiac PET-MR reconstruction protocol recommended by the manufacturer, comparisons were made using a PET-CT resolution-matched reconstruction protocol, both without and with TOF, to assess the effect of time-of-flight and reconstruction parameters on quantitative MBF values. Stress MBF data from one patient were excluded due to movement during PET-CT scanning. Mean MBF values at rest and stress were (0.92 ± 0.12) and (2.74 ± 1.37) mL/g/min for PET-CT and (0.90 ± 0.23) and (2.65 ± 1.15) mL/g/min for PET-MR (p = 0.33 and p = 0.74). The ICC between PET-CT-based and PET-MR-based regional MBF was 0.98. Image quality was improved with PET-MR as compared to PET-CT. The ICC between PET-MR-based regional MBF with and without TOF and using different filter and reconstruction settings was 1.00. PET-MR-based MBF values correlated well with PET-CT-based MBF values and
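A minimal sketch of the agreement statistic reported above, here implemented as a two-way random-effects, absolute-agreement ICC (Shrout-Fleiss ICC(2,1)) between two scanners. The MBF values are invented, not the study's data, and the abstract does not state which ICC formulation was used.

```python
def icc_2_1(x, y):
    """Shrout-Fleiss ICC(2,1) for two measurement methods (k = 2 raters)."""
    n, k = len(x), 2
    rows = list(zip(x, y))
    grand = sum(x + y) / (n * k)
    row_means = [sum(r) / k for r in rows]
    col_means = [sum(x) / n, sum(y) / n]
    msr = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)   # subjects
    msc = n * sum((m - grand) ** 2 for m in col_means) / (k - 1)   # raters
    sse = sum((rows[i][j] - row_means[i] - col_means[j] + grand) ** 2
              for i in range(n) for j in range(k))
    mse = sse / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

mbf_ct = [0.92, 1.10, 2.45, 3.00, 1.80]   # mL/g/min, hypothetical
mbf_mr = [0.95, 1.05, 2.60, 2.90, 1.85]
icc = icc_2_1(mbf_ct, mbf_mr)
print(icc)
```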

  13. Quantitative risk assessment integrated with process simulator for a new technology of methanol production plant using recycled CO₂.

    PubMed

    Di Domenico, Julia; Vaz, Carlos André; de Souza, Maurício Bezerra

    2014-06-15

    The use of process simulators can contribute to quantitative risk assessment (QRA) by minimizing expert time and the large volume of data required, and is mandatory in the case of a future plant. This work illustrates the advantages of this association by integrating UNISIM DESIGN simulation and QRA to investigate the acceptability of a new technology for a methanol production plant in a region. The simulated process was based on the hydrogenation of chemically sequestered carbon dioxide, demanding stringent operational conditions (high pressures and temperatures) and involving the production of hazardous materials. The estimation of the consequences was performed using the PHAST software, version 6.51. QRA results were expressed in terms of individual and social risks. Compared to existing tolerance levels, the risks were considered tolerable in nominal operating conditions of the plant. The use of the simulator in association with the QRA also allowed testing the risk under new operating conditions in order to delimit safe regions for the plant. Copyright © 2014 Elsevier B.V. All rights reserved.
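Societal risk of the kind reported here is commonly summarized as an F-N curve: for each fatality count N, the cumulative frequency of scenarios causing N or more fatalities. A minimal sketch with invented scenario numbers, not output from PHAST or this study:

```python
# Each scenario: (frequency per year, expected fatalities) -- hypothetical
scenarios = [
    (1e-4, 1),
    (5e-5, 3),
    (1e-6, 10),
]

def fn_curve(scenarios):
    """F-N pairs: cumulative frequency of scenarios with >= N fatalities."""
    ns = sorted({n for _, n in scenarios})
    return [(n, sum(f for f, m in scenarios if m >= n)) for n in ns]

curve = fn_curve(scenarios)
for n, freq in curve:
    print(n, freq)
```

Plotting these pairs on log-log axes against a tolerability line is the usual way such results are compared to existing tolerance levels.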

  14. Quantitative imaging biomarkers: a review of statistical methods for technical performance assessment.

    PubMed

    Raunig, David L; McShane, Lisa M; Pennello, Gene; Gatsonis, Constantine; Carson, Paul L; Voyvodic, James T; Wahl, Richard L; Kurland, Brenda F; Schwarz, Adam J; Gönen, Mithat; Zahlmann, Gudrun; Kondratovich, Marina V; O'Donnell, Kevin; Petrick, Nicholas; Cole, Patricia E; Garra, Brian; Sullivan, Daniel C

    2015-02-01

    Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers to measure changes in these features. Critical to the performance of a quantitative imaging biomarker in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in designs, analysis method, and metrics used to assess a quantitative imaging biomarker for clinical use. It is therefore difficult or not possible to integrate results from different studies or to use reported results to design studies. The Radiological Society of North America and the Quantitative Imaging Biomarker Alliance with technical, radiological, and statistical experts developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of quantitative imaging biomarker performance studies so that results from multiple studies can be compared, contrasted, or combined. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
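One concrete metric from this metrology toolbox is the repeatability coefficient (RC), the bound expected to cover 95% of differences between two test-retest measurements. The sketch below uses invented data and the common RC = 1.96·√2·wSD convention, with the within-subject SD estimated from paired differences.

```python
import math

def repeatability_coefficient(test, retest):
    """RC = 1.96 * sqrt(2) * within-subject SD, where wSD is estimated
    from test-retest differences as sqrt(mean(d^2) / 2)."""
    d2 = [(a - b) ** 2 for a, b in zip(test, retest)]
    wsd = math.sqrt(sum(d2) / (2 * len(d2)))
    return 1.96 * math.sqrt(2) * wsd

# Hypothetical test-retest biomarker values for five subjects
scan1 = [10.2, 11.5, 9.8, 12.0, 10.9]
scan2 = [10.0, 11.9, 9.5, 12.2, 11.0]
rc = repeatability_coefficient(scan1, scan2)
print(rc)
```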

  15. Improving the Linkages between Air Pollution Epidemiology and Quantitative Risk Assessment

    PubMed Central

    Bell, Michelle L.; Walker, Katy; Hubbell, Bryan

    2011-01-01

    Background: Air pollution epidemiology plays an integral role in both identifying the hazards of air pollution as well as supplying the risk coefficients that are used in quantitative risk assessments. Evidence from both epidemiology and risk assessments has historically supported critical environmental policy decisions. The extent to which risk assessors can properly specify a quantitative risk assessment and characterize key sources of uncertainty depends in part on the availability, and clarity, of data and assumptions in the epidemiological studies. Objectives: We discuss the interests shared by air pollution epidemiology and risk assessment communities in ensuring that the findings of epidemiological studies are appropriately characterized and applied correctly in risk assessments. We highlight the key input parameters for risk assessments and consider how modest changes in the characterization of these data might enable more accurate risk assessments that better represent the findings of epidemiological studies. Discussion: We argue that more complete information regarding the methodological choices and input data used in epidemiological studies would support more accurate risk assessments—to the benefit of both disciplines. In particular, we suggest including additional details regarding air quality, demographic, and health data, as well as certain types of data-rich graphics. Conclusions: Relatively modest changes to the data reported in epidemiological studies will improve the quality of risk assessments and help prevent the misinterpretation and mischaracterization of the results of epidemiological studies. Such changes may also benefit epidemiologists undertaking meta-analyses. We suggest workshops as a way to improve the dialogue between the two communities. PMID:21816702

  16. Toward Integration: From Quantitative Biology to Mathbio-Biomath?

    PubMed Central

    de Pillis, Lisette; Findley, Ann; Joplin, Karl; Pelesko, John; Nelson, Karen; Thompson, Katerina; Usher, David; Watkins, Joseph

    2010-01-01

    In response to the call of BIO2010 for integrating quantitative skills into undergraduate biology education, 30 Howard Hughes Medical Institute (HHMI) Program Directors at the 2006 HHMI Program Directors Meeting established a consortium to investigate, implement, develop, and disseminate best practices resulting from the integration of math and biology. With the assistance of an HHMI-funded mini-grant, led by Karl Joplin of East Tennessee State University, and support in institutional HHMI grants at Emory and University of Delaware, these institutions held a series of summer institutes and workshops to document progress toward and address the challenges of implementing a more quantitative approach to undergraduate biology education. This report summarizes the results of the four summer institutes (2007–2010). The group developed four draft white papers, a wiki site, and a listserv. One major outcome of these meetings is this issue of CBE—Life Sciences Education, which resulted from proposals at our 2008 meeting and a January 2009 planning session. Many of the papers in this issue emerged from or were influenced by these meetings. PMID:20810946

  17. Criteria for quantitative and qualitative data integration: mixed-methods research methodology.

    PubMed

    Lee, Seonah; Smith, Carrol A M

    2012-05-01

    Many studies have emphasized the need and importance of a mixed-methods approach for evaluation of clinical information systems. However, those studies had no criteria to guide integration of multiple data sets. Integrating different data sets serves to actualize the paradigm that a mixed-methods approach argues; thus, we require criteria that provide the right direction to integrate quantitative and qualitative data. The first author used a set of criteria organized from a literature search for integration of multiple data sets from mixed-methods research. The purpose of this article was to reorganize the identified criteria. Through critical appraisal of the reasons for designing mixed-methods research, three criteria resulted: validation, complementarity, and discrepancy. In applying the criteria to empirical data of a previous mixed methods study, integration of quantitative and qualitative data was achieved in a systematic manner. It helped us obtain a better organized understanding of the results. The criteria of this article offer the potential to produce insightful analyses of mixed-methods evaluations of health information systems.

  18. Quantitative methods in assessment of neurologic function.

    PubMed

    Potvin, A R; Tourtellotte, W W; Syndulko, K; Potvin, J

    1981-01-01

    Traditionally, neurologists have emphasized qualitative techniques for assessing results of clinical trials. However, in recent years qualitative evaluations have been increasingly augmented by quantitative tests for measuring neurologic functions pertaining to mental state, strength, steadiness, reactions, speed, coordination, sensation, fatigue, gait, station, and simulated activities of daily living. Quantitative tests have long been used by psychologists for evaluating asymptomatic function, assessing human information processing, and predicting proficiency in skilled tasks; however, their methodology has never been directly assessed for validity in a clinical environment. In this report, relevant contributions from the literature on asymptomatic human performance and that on clinical quantitative neurologic function are reviewed and assessed. While emphasis is focused on tests appropriate for evaluating clinical neurologic trials, evaluations of tests for reproducibility, reliability, validity, and examiner training procedures, and for effects of motivation, learning, handedness, age, and sex are also reported and interpreted. Examples of statistical strategies for data analysis, scoring systems, data reduction methods, and data display concepts are presented. Although investigative work still remains to be done, it appears that carefully selected and evaluated tests of sensory and motor function should be an essential factor for evaluating clinical trials in an objective manner.

  19. Toward a quantitative approach to migrants integration

    NASA Astrophysics Data System (ADS)

    Barra, A.; Contucci, P.

    2010-03-01

    Migration phenomena and all the related issues, like the integration of different social groups, are intrinsically complex problems since they strongly depend on several competitive mechanisms such as economic factors, cultural differences and many others. By identifying a few essential assumptions, and using the statistical mechanics of complex systems, we propose a novel quantitative approach that provides a minimal theory for those phenomena. We show that the competitive interactions in decision making between a population of N host citizens and P immigrants, a bi-partite spin glass, give rise to a social consciousness inside the host community in the sense of the associative memory of neural networks. The theory leads to a natural quantitative definition of migrants' "integration" inside the community. From the technical point of view, this minimal picture assumes, as control parameters, only general notions like the strength of the random interactions, the ratio between the sizes of the two parties and the cultural influence. A few steps toward more refined models, which include a digression on the kinds of experiences felt, some structure on the random-interaction topology (such as dilution, to avoid the plain mean-field approach) and correlations between the experiences felt by the two parties (biasing the distribution of the couplings), are discussed at the end, where we show the robustness of our approach.
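A toy numerical sketch, not the authors' analytical mean-field treatment, of the kind of bipartite model described: N host spins and P migrant spins with random couplings, relaxed by Metropolis dynamics, with the average coupling-weighted cross-group alignment read out as a crude "integration" measure. All parameter values are illustrative.

```python
import math, random

random.seed(1)
N, P, beta = 40, 10, 2.0           # hosts, migrants, inverse "social noise"
# Random interactions with a mild cooperative bias (mean 0.3)
J = [[random.gauss(0.3, 1.0) for _ in range(P)] for _ in range(N)]
host = [random.choice([-1, 1]) for _ in range(N)]
mig = [random.choice([-1, 1]) for _ in range(P)]

def local_field_host(i):
    return sum(J[i][j] * mig[j] for j in range(P))

def local_field_mig(j):
    return sum(J[i][j] * host[i] for i in range(N))

for _ in range(20000):             # single-spin Metropolis updates
    if random.random() < N / (N + P):
        i = random.randrange(N)
        dE = 2 * host[i] * local_field_host(i)   # energy E = -sum J_ij h_i m_j
        if dE <= 0 or random.random() < math.exp(-beta * dE):
            host[i] = -host[i]
    else:
        j = random.randrange(P)
        dE = 2 * mig[j] * local_field_mig(j)
        if dE <= 0 or random.random() < math.exp(-beta * dE):
            mig[j] = -mig[j]

# Crude "integration" readout: coupling-weighted host-migrant alignment
align = sum(J[i][j] * host[i] * mig[j] for i in range(N) for j in range(P)) / (N * P)
print(align)
```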

  20. Quantitative risk assessment system (QRAS)

    NASA Technical Reports Server (NTRS)

    Tan, Zhibin (Inventor); Mosleh, Ali (Inventor); Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Chang, Yung-Hsien (Inventor); Groen, Francisco J (Inventor); Swaminathan, Sankaran (Inventor)

    2001-01-01

    A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by building a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest level scenarios, preserved in event tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for risk quantitative computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.
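The scenario quantification a QRAS-style tool performs can be sketched as an event-tree enumeration: each scenario's frequency is the initiating-event frequency multiplied by the success or failure probability of each pivotal event, and scenarios are then ranked. The events and numbers below are hypothetical, not from the patented system.

```python
from itertools import product

initiating_freq = 1e-3                              # per mission, hypothetical
pivotal = {"detection": 0.95, "isolation": 0.90}    # success probabilities

# Enumerate every branch of the event tree (success/failure per event)
scenarios = []
for outcomes in product([True, False], repeat=len(pivotal)):
    p = initiating_freq
    for (name, p_success), ok in zip(pivotal.items(), outcomes):
        p *= p_success if ok else (1 - p_success)
    scenarios.append((outcomes, p))

# Rank scenarios by frequency, as a risk tool would for risk ranking
scenarios.sort(key=lambda s: s[1], reverse=True)
total = sum(p for _, p in scenarios)
print(total)  # branch frequencies sum back to the initiating frequency
for outcomes, p in scenarios:
    print(outcomes, p)
```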

  1. Assessment of and standardization for quantitative nondestructive test

    NASA Technical Reports Server (NTRS)

    Neuschaefer, R. W.; Beal, J. B.

    1972-01-01

    Present capabilities and limitations of nondestructive testing (NDT) as applied to aerospace structures during design, development, production, and operational phases are assessed. This assessment will help determine what useful structural quantitative and qualitative data may be provided, from raw materials to vehicle refurbishment. It considers metal alloy systems and bonded composites presently applied in active NASA programs or strong contenders for future use. Quantitative and qualitative data have been summarized from recent literature and in-house information, and presented along with a description of the structures or standards from which the information was obtained. Examples, in tabular form, of NDT technique capabilities and limitations are provided. The NDT techniques discussed and assessed were radiography, ultrasonics, penetrants, thermal, acoustic, and electromagnetic. Quantitative data are sparse; therefore, obtaining statistically reliable flaw-detection data must be strongly emphasized. The new requirements for reusable space vehicles have resulted in highly efficient design concepts operating in severe environments. This increases the need for quantitative NDT evaluation of selected structural components, the end-item structure, and refurbishment operations.

  2. Assessment of Renal Hemodynamics and Oxygenation by Simultaneous Magnetic Resonance Imaging (MRI) and Quantitative Invasive Physiological Measurements.

    PubMed

    Cantow, Kathleen; Arakelyan, Karen; Seeliger, Erdmann; Niendorf, Thoralf; Pohlmann, Andreas

    2016-01-01

    In vivo assessment of renal perfusion and oxygenation under (patho)physiological conditions by means of noninvasive diagnostic imaging is conceptually appealing. Blood oxygen level-dependent (BOLD) magnetic resonance imaging (MRI) and quantitative parametric mapping of the magnetic resonance (MR) relaxation times T2* and T2 are thought to provide surrogates of renal tissue oxygenation. The validity and efficacy of this technique for quantitative characterization of local tissue oxygenation and its changes under different functional conditions have not yet been systematically examined and remain to be established. For this purpose, the development of integrative multimodality approaches is essential. Here we describe an integrated hybrid approach (MR-PHYSIOL) that combines established quantitative physiological measurements with T2* (T2) mapping and MR-based kidney size measurements. Standardized, reversible, (patho)physiologically relevant interventions, such as brief periods of aortic occlusion, hypoxia, and hyperoxia, are used for detailing the relation between the MR-PHYSIOL parameters, in particular between renal T2* and tissue oxygenation.
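The parametric T2* mapping step referred to above amounts, per voxel, to fitting a monoexponential decay S(TE) = S0·exp(-TE/T2*) across echo times. A log-linear least-squares sketch on synthetic, noise-free data (not the MR-PHYSIOL implementation):

```python
import math

def fit_t2star(te_ms, signal):
    """Fit ln S = ln S0 - TE/T2* by least squares; returns (S0, T2* in ms)."""
    y = [math.log(s) for s in signal]
    n = len(te_ms)
    xbar = sum(te_ms) / n
    ybar = sum(y) / n
    slope = sum((x - xbar) * (yi - ybar) for x, yi in zip(te_ms, y)) / \
            sum((x - xbar) ** 2 for x in te_ms)
    return math.exp(ybar - slope * xbar), -1.0 / slope

# Synthetic decay with S0 = 100 and T2* = 30 ms
tes = [2.0, 6.0, 10.0, 14.0, 18.0]
sig = [100.0 * math.exp(-te / 30.0) for te in tes]
s0, t2s = fit_t2star(tes, sig)
print(s0, t2s)  # recovers ~100 and ~30 ms
```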

  3. Quantitative Methods in the Study of Local History

    ERIC Educational Resources Information Center

    Davey, Pene

    1974-01-01

    The author suggests how the quantitative analysis of data from census records, assessment roles, and newspapers may be integrated into the classroom. Suggestions for obtaining quantitative data are provided. (DE)

  4. Physiologic Basis for Understanding Quantitative Dehydration Assessment

    DTIC Science & Technology

    2012-01-01

    Perspective: Physiologic basis for understanding quantitative dehydration assessment. Samuel N Cheuvront, Robert W Kenefick, Nisha Charkoudian, and Michael N Sawka. ABSTRACT: Dehydration (body water deficit) is a physiologic state that can have profound implications for human health and performance. ... We review the physiologic basis for understanding quantitative dehydration assessment. We highlight how phenomenologic interpretations of dehydration

  5. Development of Improved Caprock Integrity and Risk Assessment Techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bruno, Michael

    GeoMechanics Technologies has completed a geomechanical caprock integrity analysis and risk assessment study funded through the US Department of Energy. The project included: a detailed review of historical caprock integrity problems experienced in the natural gas storage industry; a theoretical description and documentation of caprock integrity issues; advanced coupled transport flow modelling and geomechanical simulation of three large-scale potential geologic sequestration sites to estimate geomechanical effects from CO₂ injection; development of a quantitative risk and decision analysis tool to assess caprock integrity risks; and, ultimately, the development of recommendations and guidelines for caprock characterization and CO₂ injection operating practices. Historical data from gas storage operations and CO₂ sequestration projects suggest that leakage and containment incident risks are on the order of 10⁻¹ to 10⁻², which is higher risk than some previous studies have suggested for CO₂. Geomechanical analysis, as described herein, can be applied to quantify risks and to provide operating guidelines to reduce risks. The risk assessment tool developed for this project has been applied to five areas: the Wilmington Graben offshore Southern California, Kevin Dome in Montana, the Louden Field in Illinois, the Sleipner CO₂ sequestration operation in the North Sea, and the In Salah CO₂ sequestration operation in North Africa. Of these five, the Wilmington Graben area represents the highest relative risk while the Kevin Dome area represents the lowest relative risk.

  6. Quantitative analysis for peripheral vascularity assessment based on clinical photoacoustic and ultrasound images

    NASA Astrophysics Data System (ADS)

    Murakoshi, Dai; Hirota, Kazuhiro; Ishii, Hiroyasu; Hashimoto, Atsushi; Ebata, Tetsurou; Irisawa, Kaku; Wada, Takatsugu; Hayakawa, Toshiro; Itoh, Kenji; Ishihara, Miya

    2018-02-01

    Photoacoustic (PA) imaging technology is expected to be applied to clinical assessment of peripheral vascularity. We started a clinical evaluation with the prototype PA imaging system we recently developed. The prototype PA imaging system was composed of an in-house Q-switched Alexandrite laser system emitting short laser pulses at a wavelength of 750 nm, a handheld ultrasound transducer with integrated illumination optics, and signal processing for PA image reconstruction implemented in the clinical ultrasound (US) system. For the purpose of quantitative assessment of PA images, an image analyzing function has been developed and applied to clinical PA images. In this analyzing function, vascularity derived from PA signal intensity within a prescribed threshold range was defined as a numerical index of vessel fulfillment and calculated for the prescribed region of interest (ROI). The skin surface was automatically detected by utilizing the B-mode image acquired simultaneously with the PA image. The skin-surface position is utilized to place the ROI objectively while avoiding unwanted signals such as artifacts imposed by melanin pigment in the epidermal layer, which absorbs laser emission and generates strong PA signals. Multiple images were available to support the scanned image set for 3D viewing. PA images for several fingers of patients with systemic sclerosis (SSc) were quantitatively assessed. Since the artifact region is trimmed off in PA images, the visibility of vessels with rather low PA signal intensity on the 3D projection image was enhanced and the reliability of the quantitative analysis was improved.
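The vascularity index described above can be sketched as the fraction of ROI pixels whose PA intensity falls within a prescribed range, with the ROI placed below the detected skin surface so strong epidermal melanin signals are excluded. The array and thresholds are invented for illustration.

```python
def vascularity_index(image, roi_rows, roi_cols, lo, hi):
    """Fraction of ROI pixels whose intensity lies in [lo, hi]."""
    total, hits = 0, 0
    for r in roi_rows:
        for c in roi_cols:
            total += 1
            if lo <= image[r][c] <= hi:
                hits += 1
    return hits / total

# 4x4 toy PA intensity map; row 0 is the detected "skin surface" with
# strong melanin artifact signal, so the ROI starts at row 1
pa = [
    [250, 240, 255, 251],
    [10,  80,  90,  12],
    [15,  85,  88,  11],
    [9,   14,  13,  10],
]
vi = vascularity_index(pa, roi_rows=range(1, 4), roi_cols=range(4), lo=50, hi=200)
print(vi)  # 4 of 12 ROI pixels fall in range
```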

  7. QUANTITATIVE PROCEDURES FOR NEUROTOXICOLOGY RISK ASSESSMENT

    EPA Science Inventory

    In this project, previously published information on biologically based dose-response model for brain development was used to quantitatively evaluate critical neurodevelopmental processes, and to assess potential chemical impacts on early brain development. This model has been ex...

  8. Quantitative CMMI Assessment for Offshoring through the Analysis of Project Management Repositories

    NASA Astrophysics Data System (ADS)

    Sunetnanta, Thanwadee; Nobprapai, Ni-On; Gotel, Olly

    The nature of distributed teams and the existence of multiple sites in offshore software development projects pose a challenging setting for software process improvement. Often, the improvement and appraisal of software processes is achieved through a turnkey solution where best practices are imposed or transferred from a company's headquarters to its offshore units. In so doing, successful project health checks and monitoring of software process quality require strong project management skills, well-built onshore-offshore coordination, and often regular onsite visits by software process improvement consultants from the headquarters' team. This paper focuses on software process improvement as guided by the Capability Maturity Model Integration (CMMI) and proposes a model to evaluate the status of such improvement efforts in the context of distributed multi-site projects without some of this overhead. The paper discusses the application of quantitative CMMI assessment through the collection and analysis of project data gathered directly from project repositories to facilitate CMMI implementation and reduce the cost of such implementation for offshore-outsourced software development projects. We exemplify this approach to quantitative CMMI assessment through the analysis of project management data and discuss the future directions of this work in progress.

  9. Quantitative Assessment of Free Flap Viability with CEUS Using an Integrated Perfusion Software.

    PubMed

    Geis, S; Klein, S; Prantl, L; Dolderer, J; Lamby, P; Jung, E-M

    2015-12-01

    New treatment strategies in oncology and trauma surgery lead to an increasing demand for soft tissue reconstruction with free tissue transfer. In previous studies, CEUS was proven to detect early flap failure. The aim of this study was to detect and quantify vascular disturbances after free flap transplantation using a fast integrated perfusion software tool. From 2011 to 2013, 33 patients were examined by one experienced radiologist using CEUS after a bolus injection of 1-2.4 ml of SonoVue(®). Flap perfusion was analysed qualitatively regarding contrast defects or delayed wash-in. Additionally, an integrated semi-quantitative analysis using time-intensity curve (TIC) analysis was performed. TIC analysis of the transplant was conducted on a centimetre-by-centimetre basis up to a penetration depth of 4 cm. The 2 perfusion parameters "Time to PEAK" (TtoPk) and "Area under the Curve" (Area) were compared in patients without complications vs. patients with minor complications or complete flap loss to identify significant differences. TtoPk is given in seconds (s) and Area is given in relative units (rU). Results: A regular postoperative process was observed in 26 (79%) patients. In contrast, 5 (15%) patients with partial superficial flap necrosis, 1 patient (3%) with complete flap loss and 1 patient (3%) with haematoma were observed. TtoPk revealed no significant differences, whereas Area revealed significantly lower perfusion values in the corresponding areas in patients with complications. The critical threshold for sufficient flap perfusion was set below 150 rU. In conclusion, CEUS is a mobile and cost-effective opportunity to quantify tissue perfusion and can even be used almost without restriction in multi-morbid patients with renal and hepatic failure. © Georg Thieme Verlag KG Stuttgart · New York.
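The two TIC parameters can be computed straightforwardly from a sampled time-intensity curve. The curve below is synthetic, and the 150 rU threshold is only quoted from the abstract, not derived here.

```python
def time_to_peak(t, intensity):
    """Seconds from curve start to maximum contrast intensity."""
    peak_idx = max(range(len(intensity)), key=lambda i: intensity[i])
    return t[peak_idx] - t[0]

def area_under_curve(t, intensity):
    """Trapezoidal integral of the time-intensity curve (rU * s)."""
    return sum((t[i + 1] - t[i]) * (intensity[i] + intensity[i + 1]) / 2
               for i in range(len(t) - 1))

t = [0, 2, 4, 6, 8, 10]          # seconds after contrast arrival
tic = [0, 20, 55, 40, 25, 15]    # intensity in relative units (rU)

ttp = time_to_peak(t, tic)
auc_ru = area_under_curve(t, tic)
print(ttp, auc_ru)
```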

  10. Physiologic basis for understanding quantitative dehydration assessment.

    PubMed

    Cheuvront, Samuel N; Kenefick, Robert W; Charkoudian, Nisha; Sawka, Michael N

    2013-03-01

    Dehydration (body water deficit) is a physiologic state that can have profound implications for human health and performance. Unfortunately, dehydration can be difficult to assess, and there is no single, universal gold standard for decision making. In this article, we review the physiologic basis for understanding quantitative dehydration assessment. We highlight how phenomenologic interpretations of dehydration depend critically on the type (dehydration compared with volume depletion) and magnitude (moderate compared with severe) of dehydration, which in turn influence the osmotic (plasma osmolality) and blood volume-dependent compensatory thresholds for antidiuretic and thirst responses. In particular, we review new findings regarding the biological variation in osmotic responses to dehydration and discuss how this variation can help provide a quantitative and clinically relevant link between the physiology and phenomenology of dehydration. Practical measures with empirical thresholds are provided as a starting point for improving the practice of dehydration assessment.

  11. Quantitative Imaging Biomarkers: A Review of Statistical Methods for Technical Performance Assessment

    PubMed Central

    2017-01-01

    Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers (QIBs) to measure changes in these features. Critical to the performance of a QIB in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in designs, analysis methods and metrics used to assess a QIB for clinical use. It is therefore difficult or not possible to integrate results from different studies or to use reported results to design studies. The Radiological Society of North America (RSNA) and the Quantitative Imaging Biomarker Alliance (QIBA) with technical, radiological and statistical experts developed a set of technical performance analysis methods, metrics and study designs that provide terminology, metrics and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of QIB performance studies so that results from multiple studies can be compared, contrasted or combined. PMID:24919831

  12. An integrated approach coupling physically based models and probabilistic method to assess quantitatively landslide susceptibility at different scale: application to different geomorphological environments

    NASA Astrophysics Data System (ADS)

    Vandromme, Rosalie; Thiéry, Yannick; Sedan, Olivier; Bernardie, Séverine

    2016-04-01

    Landslide hazard assessment is the estimation of a target area where landslides of a particular type, volume, runout and intensity may occur within a given period. The first step in analyzing landslide hazard consists in assessing the spatial and temporal failure probability (when the information is available, i.e. susceptibility assessment). Two types of approach are generally recommended to achieve this goal: (i) qualitative approaches (i.e. inventory-based methods and knowledge-driven methods) and (ii) quantitative approaches (i.e. data-driven methods or deterministic physically based methods). Among quantitative approaches, deterministic physically based methods (PBM) are generally used at local and/or site-specific scales (1:5,000-1:25,000 and >1:5,000, respectively). The main advantage of these methods is the calculation of the probability of failure (safety factor) under specific environmental conditions. For some models it is possible to integrate land use and climatic change. Conversely, the major drawbacks are the large amount of reliable and detailed data required (especially material types, their thickness and the heterogeneity of geotechnical parameters over a large area) and the fact that only shallow landslides are taken into account. This is why they are often used at site-specific scales (>1:5,000). Thus, to take into account (i) material heterogeneity, (ii) spatial variation of physical parameters, and (iii) different landslide types, the French Geological Survey (i.e. BRGM) has developed a physically based model (PBM) implemented in a GIS environment. This PBM couples a global hydrological model (GARDENIA®), including a transient unsaturated/saturated hydrological component, with a physically based model computing the stability of slopes (ALICE®, Assessment of Landslides Induced by Climatic Events) based on the Morgenstern-Price method for any slip surface. The variability of mechanical parameters is handled by a Monte Carlo approach. The
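The failure-probability idea can be illustrated with a toy Monte Carlo (not BRGM's ALICE® implementation, which uses the Morgenstern-Price method for arbitrary slip surfaces): an infinite-slope factor of safety with cohesion and friction angle drawn at random, and the failure probability estimated as the fraction of draws with FS < 1. All parameter values are illustrative.

```python
import math, random

random.seed(0)
beta = math.radians(30)                      # slope angle
z, gamma, gamma_w, m = 3.0, 19.0, 9.81, 0.5  # depth (m), unit weights (kN/m3), saturation ratio

def factor_of_safety(c, phi_deg):
    """Infinite-slope FS with a partially saturated slip surface."""
    phi = math.radians(phi_deg)
    resisting = c + (gamma - m * gamma_w) * z * math.cos(beta) ** 2 * math.tan(phi)
    driving = gamma * z * math.sin(beta) * math.cos(beta)
    return resisting / driving

draws = 10000
fails = sum(
    factor_of_safety(max(0.0, random.gauss(5.0, 2.0)),   # cohesion (kPa)
                     random.uniform(25.0, 35.0)) < 1.0   # friction angle (deg)
    for _ in range(draws)
)
pf = fails / draws
print(pf)  # estimated probability of failure
```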

  13. Questionnaire-based person trip visualization and its integration to quantitative measurements in Myanmar

    NASA Astrophysics Data System (ADS)

    Kimijiama, S.; Nagai, M.

    2016-06-01

    With telecommunication development in Myanmar, person trip surveys are expected to shift from conversational questionnaires to GPS surveys. Integrating historical questionnaire data with GPS surveys and visualizing them is very important for evaluating chronological trip changes against socio-economic and environmental events. The objectives of this paper are to: (a) visualize questionnaire-based person trip data, (b) compare the errors between questionnaire and GPS data sets with respect to sex and age and (c) assess trip behaviour in time series. In total, 345 individual respondents were selected through random stratification, and person trips were assessed with both a questionnaire and a GPS survey for each. Trip information from the questionnaires, such as destinations, was converted using GIS. The results show that errors between the two data sets in the number of trips, total trip distance and total trip duration are 25.5%, 33.2% and 37.2%, respectively. The smaller errors are found among working-age females, mainly those employed in project-related activities generated by foreign investment. Trip distance increased year by year. The study concluded that visualizing questionnaire-based person trip data and integrating them with current quantitative measurements are very useful for exploring historical trip changes and understanding the impacts of socio-economic events.
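
The 25.5-37.2% discrepancies quoted are relative errors of questionnaire-reported totals against GPS measurements. A minimal sketch of that comparison, grouped by a demographic attribute (record field names here are hypothetical):

```python
def percent_error(reported, measured):
    """Relative error of a questionnaire-reported total against the GPS value, in %."""
    return abs(reported - measured) / measured * 100.0

def mean_error_by_group(records, key):
    """Average percent error per demographic group (e.g. key='sex' or 'age_band')."""
    groups = {}
    for r in records:
        groups.setdefault(r[key], []).append(
            percent_error(r["questionnaire_trips"], r["gps_trips"]))
    return {g: sum(errs) / len(errs) for g, errs in groups.items()}
```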

  14. ImatraNMR: Novel software for batch integration and analysis of quantitative NMR spectra

    NASA Astrophysics Data System (ADS)

    Mäkelä, A. V.; Heikkilä, O.; Kilpeläinen, I.; Heikkinen, S.

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to the traditional quantitative 1D 1H and 13C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing one or a few spectra at a time, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present novel software, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to its capability of analyzing large numbers of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs such as Matlab. The software is written in Java, and thus should run on any platform providing Java Runtime Environment version 1.6 or newer; however, it has currently been tested only on Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request.
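
ImatraNMR's source is available only on request, but the core batch operation it describes, applying pre-saved integration regions across many spectra and emitting CSV, can be roughly sketched (this is an illustrative reimplementation, not the actual Java code):

```python
import csv
import io

def integrate_region(ppm, intensity, lo, hi):
    """Trapezoidal integral of intensity over the ppm window [lo, hi]."""
    area = 0.0
    for (x0, y0), (x1, y1) in zip(zip(ppm, intensity), zip(ppm[1:], intensity[1:])):
        if lo <= min(x0, x1) and max(x0, x1) <= hi:
            area += 0.5 * (y0 + y1) * abs(x1 - x0)
    return area

def batch_integrate(spectra, regions):
    """Integrate every pre-saved region in every spectrum; return CSV text.
    spectra: {name: (ppm_axis, intensity)}; regions: [(label, lo_ppm, hi_ppm)]."""
    buf = io.StringIO()
    w = csv.writer(buf)
    w.writerow(["spectrum"] + [name for name, _, _ in regions])
    for name, (ppm, intensity) in spectra.items():
        w.writerow([name] + [integrate_region(ppm, intensity, lo, hi)
                             for _, lo, hi in regions])
    return buf.getvalue()
```

The CSV output can then be loaded directly into a spreadsheet or Matlab, as the abstract describes.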

  15. Assessing the reporting of categorised quantitative variables in observational epidemiological studies.

    PubMed

    Mabikwa, Onkabetse V; Greenwood, Darren C; Baxter, Paul D; Fleming, Sarah J

    2017-03-14

    One aspect to consider when reporting results of observational studies in epidemiology is how quantitative risk factors are analysed. The STROBE (Strengthening the Reporting of Observational Studies in Epidemiology) guidelines recommend that researchers describe how they handle quantitative variables when analysing data. For categorised quantitative variables, authors are required to provide the reasons and justifications informing their practice. We investigated and assessed the practices and reporting of categorised quantitative variables in epidemiology. The assessment was based on five medical journals that publish epidemiological research. Observational studies published between April and June 2015 that investigated relationships between quantitative exposures (or risk factors) and outcomes were considered for assessment. A standard form was used to collect the data, and the reporting patterns amongst eligible studies were quantified and described. Out of 61 articles assessed for eligibility, 23 observational studies were included in the assessment. Categorisation of quantitative exposures occurred in 61% of these studies, and the reasons informing the practice were rarely provided. Only one article explained the choice of categorisation in the analysis. Transformation of quantitative exposures into four or five groups was common and dominant amongst studies using equally spaced categories. Dichotomisation was not popular; the practice featured in only one article. Overall, the majority (86%) of the studies preferred ordered or arbitrary group categories. Other criteria used to decide category boundaries were based on established guidelines such as consensus statements and WHO standards. Categorisation of continuous variables remains a dominant practice in epidemiological studies. The reasons informing the practice of categorisation within published work are limited and remain unknown in most articles. 
The existing STROBE guidelines could provide stronger
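
As an illustration of the practice the study examines, cutting a continuous exposure into equally populated ordered groups (quartiles here) takes only the standard library; the quartile scheme is just one of the categorisation choices the paper discusses:

```python
import statistics

def categorise(values, n_groups=4):
    """Assign each value to one of n equally populated ordered groups (0..n-1)."""
    cuts = statistics.quantiles(values, n=n_groups)  # n-1 interior cut points
    return [sum(v > c for c in cuts) for v in values]
```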

  16. Quantitative Evaluation of Performance in Interventional Neuroradiology: An Integrated Curriculum Featuring Theoretical and Practical Challenges.

    PubMed

    Ernst, Marielle; Kriston, Levente; Romero, Javier M; Frölich, Andreas M; Jansen, Olav; Fiehler, Jens; Buhk, Jan-Hendrik

    2016-01-01

    We sought to develop a standardized curriculum capable of assessing key competencies in Interventional Neuroradiology by the use of models and simulators in an objective, quantitative, and efficient way. In this evaluation we analyzed the associations between the practical experience, theoretical knowledge, and skills lab performance of interventionalists. We evaluated the endovascular skills of 26 participants of the Advanced Course in Endovascular Interventional Neuroradiology of the European Society of Neuroradiology with a set of three tasks (aneurysm coiling and thrombectomy in a virtual simulator and placement of an intra-aneurysmal flow disruptor in a flow model). Practical experience was assessed by a survey. Participants completed a written and oral examination to evaluate theoretical knowledge. Bivariate and multivariate analyses were performed. In multivariate analysis, knowledge of materials and techniques in Interventional Neuroradiology was moderately associated with skills in aneurysm coiling and thrombectomy. Experience in mechanical thrombectomy was moderately associated with thrombectomy skills, while age was negatively associated with thrombectomy skills. We found no significant association between age, sex, or work experience and skills in aneurysm coiling. Our study gives an example of what an integrated curriculum for reasonable and cost-effective assessment of the key competences of an interventional neuroradiologist could look like. In addition to traditional assessment of theoretical knowledge, practical skills are measured by the use of endovascular simulators, yielding objective, quantitative, and constructive data for the evaluation of the current performance status of participants as well as the evolution of their technical competency over time.

  17. Quantitative maps of genetic interactions in yeast - comparative evaluation and integrative analysis.

    PubMed

    Lindén, Rolf O; Eronen, Ville-Pekka; Aittokallio, Tero

    2011-03-24

    High-throughput genetic screening approaches have enabled systematic means to study how interactions among gene mutations contribute to quantitative fitness phenotypes, with the aim of providing insights into the functional wiring diagrams of genetic interaction networks on a global scale. However, it is poorly known how well these quantitative interaction measurements agree across the screening approaches, which hinders their integrated use toward improving the coverage and quality of genetic interaction maps in yeast and other organisms. Using large-scale data matrices from epistatic miniarray profiling (E-MAP), genetic interaction mapping (GIM), and synthetic genetic array (SGA) approaches, we carried out a systematic comparative evaluation among these quantitative maps of genetic interactions in yeast. The relatively low association between the original interaction measurements or their customized scores could be improved using a matrix-based modelling framework, which enables the use of single- and double-mutant fitness estimates and measurements, respectively, when scoring genetic interactions. Toward an integrative analysis, we show how the detections from the different screening approaches can be combined to suggest novel positive and negative interactions that are complementary to those obtained using any single screening approach alone. The matrix approximation procedure has been made available to support the design and analysis of future screening studies. We have shown here that even if the correlation between the currently available quantitative genetic interaction maps in yeast is relatively low, their comparability can be improved by means of our computational matrix approximation procedure, which will enable integrative analysis and detection of a wider spectrum of genetic interactions using data from complementary screening approaches.
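
The scores these screens compare are commonly defined by multiplicative epistasis on single- and double-mutant fitness. A minimal sketch of that scoring (the cutoff value is arbitrary, and the paper's own matrix-based framework is more involved):

```python
def interaction_score(w_xy, w_x, w_y):
    """Multiplicative genetic interaction: eps = W_xy - W_x * W_y.
    Negative eps suggests a synthetic sick/lethal pair; positive, an alleviating one."""
    return w_xy - w_x * w_y

def classify(eps, cutoff=0.08):
    """Call an interaction only beyond a noise cutoff (threshold chosen arbitrarily)."""
    if eps <= -cutoff:
        return "negative"
    if eps >= cutoff:
        return "positive"
    return "neutral"
```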

  18. Assessment of post-implantation integration of engineered tissues using fluorescence lifetime spectroscopy

    NASA Astrophysics Data System (ADS)

    Elahi, Sakib F.; Lee, Seung Y.; Lloyd, William R.; Chen, Leng-Chun; Kuo, Shiuhyang; Zhou, Ying; Kim, Hyungjin M.; Kennedy, Robert; Marcelo, Cynthia; Feinberg, Stephen E.; Mycek, Mary-Ann

    2018-02-01

    Clinical translation of engineered tissue constructs requires noninvasive methods to assess construct health and viability after implantation in patients. However, current practices to monitor post-implantation construct integration are either qualitative (visual assessment) or destructive (tissue histology). Because label-free fluorescence lifetime sensing can noninvasively characterize pre-implantation construct viability, we employed a handheld fluorescence lifetime spectroscopy probe to quantitatively and noninvasively assess tissue constructs implanted in a murine model. We designed the system to be suitable for intravital measurements: portable, precisely maneuverable for localization, and capable of rapid data acquisition. Our model tissue constructs were manufactured from primary human cells to simulate patient variability and were stressed to create a range of health states. Secreted amounts of three cytokines related to cellular viability were measured in vitro to assess pre-implantation construct health. In vivo optical sensing assessed tissue integration of constructs at one week and three weeks post-implantation. At one week post-implantation, optical parameters correlated with in vitro pre-implantation secretion levels of all three cytokines (p < 0.05). This relationship was no longer seen at three weeks post-implantation, suggesting comparable tissue integration independent of pre-implantation health. Histology confirmed re-epithelialization of these constructs independent of pre-implantation health state, supporting the lack of a correlation. These results suggest that clinical optical diagnostic tools based on label-free fluorescence lifetime sensing of endogenous tissue fluorophores could noninvasively monitor post-implantation integration of engineered tissues.

  19. Multiplex picoliter-droplet digital PCR for quantitative assessment of DNA integrity in clinical samples.

    PubMed

    Didelot, Audrey; Kotsopoulos, Steve K; Lupo, Audrey; Pekin, Deniz; Li, Xinyu; Atochin, Ivan; Srinivasan, Preethi; Zhong, Qun; Olson, Jeff; Link, Darren R; Laurent-Puig, Pierre; Blons, Hélène; Hutchison, J Brian; Taly, Valerie

    2013-05-01

    Assessment of DNA integrity and quantity remains a bottleneck for high-throughput molecular genotyping technologies, including next-generation sequencing. In particular, DNA extracted from paraffin-embedded tissues, a major potential source of tumor DNA, varies widely in quality, leading to unpredictable sequencing data. We describe a picoliter droplet-based digital PCR method that enables simultaneous detection of DNA integrity and the quantity of amplifiable DNA. Using a multiplex assay, we detected 4 different target lengths (78, 159, 197, and 550 bp). Assays were validated with human genomic DNA fragmented to sizes of 170 bp to 3000 bp. The technique was validated with DNA quantities as low as 1 ng. We evaluated 12 DNA samples extracted from paraffin-embedded lung adenocarcinoma tissues. One sample contained no amplifiable DNA. The fractions of amplifiable DNA for the 11 other samples were between 0.05% and 10.1% for 78-bp fragments and ≤1% for longer fragments. Four samples were chosen for enrichment and next-generation sequencing. The quality of the sequencing data was in agreement with the results of the DNA-integrity test. Specifically, DNA with low integrity yielded sequencing results with lower levels of coverage and uniformity and had higher levels of false-positive variants. The development of DNA-quality assays will enable researchers to downselect samples or process more DNA to achieve reliable genome sequencing with the highest possible efficiency of cost and effort, as well as minimize the waste of precious samples. © 2013 American Association for Clinical Chemistry.
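
The amplifiable-DNA fractions reported by droplet digital PCR follow from Poisson statistics on droplet counts: the mean number of template copies per droplet is recovered from the fraction of negative droplets. A sketch with a hypothetical droplet volume (the paper's picoliter droplets would use a different value):

```python
import math

def amplifiable_copies(total_droplets, positive_droplets, droplet_volume_nl=5e-3):
    """Poisson-corrected template concentration from digital PCR droplet counts.
    lambda = -ln(fraction of negative droplets) = mean copies per droplet."""
    neg_fraction = 1 - positive_droplets / total_droplets
    lam = -math.log(neg_fraction)
    return lam / droplet_volume_nl  # copies per nanolitre

def integrity_ratio(long_amplicon_conc, short_amplicon_conc):
    """DNA integrity as long-amplicon / short-amplicon concentration:
    degraded DNA amplifies short targets but fails on long ones."""
    return long_amplicon_conc / short_amplicon_conc
```

Comparing concentrations across the four amplicon lengths (78-550 bp) then yields the integrity profile used to qualify samples for sequencing.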

  20. ImatraNMR: novel software for batch integration and analysis of quantitative NMR spectra.

    PubMed

    Mäkelä, A V; Heikkilä, O; Kilpeläinen, I; Heikkinen, S

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to the traditional quantitative 1D (1)H and (13)C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing one or a few spectra at a time, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present novel software, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to its capability of analyzing large numbers of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs such as Matlab. The software is written in Java, and thus should run on any platform providing Java Runtime Environment version 1.6 or newer; however, it has currently been tested only on Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request. Copyright © 2011 Elsevier Inc. All rights reserved.

  1. Comprehensive, Quantitative Risk Assessment of CO{sub 2} Geologic Sequestration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lepinski, James

    2013-09-30

    A Quantitative Failure Modes and Effects Analysis (QFMEA) model was developed to conduct comprehensive, quantitative risk assessments of CO{sub 2} capture, transportation, and sequestration or use in deep saline aquifers, enhanced oil recovery operations, or enhanced coal bed methane operations. The model identifies and characterizes potential risks; identifies the likely failure modes, causes, effects, and methods of detection; lists possible risk prevention and risk mitigation steps; estimates potential damage recovery costs, mitigation costs, and cost savings resulting from mitigation; and ranks (prioritizes) risks according to the probability of failure, the severity of failure, the difficulty of early failure detection, and the potential for fatalities. The QFMEA model generates the information needed for effective project risk management. Diverse project information can be integrated into a concise, common format that allows comprehensive, quantitative analysis by a cross-functional team of experts to determine: What can possibly go wrong? How much will damage recovery cost? How can it be prevented or mitigated? What is the cost savings or benefit of prevention or mitigation? Which risks should be given highest priority for resolution? The QFMEA model can be tailored to specific projects and is applicable to new projects as well as mature projects. The model can be revised and updated as new information becomes available. It accepts input from multiple sources, such as literature searches, site characterization, field data, computer simulations, analogues, process influence diagrams, probability density functions, financial analysis models, cost factors, and heuristic best-practices manuals, and converts the information into a standardized format in an Excel spreadsheet. Process influence diagrams, geologic models, financial models, cost factors, and an insurance schedule were developed to support the QFMEA model. Comprehensive, quantitative risk
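
The ranking step described, prioritizing failure modes by probability, severity, detectability, and fatality potential, reduces in classical FMEA style to a risk priority number. The 1-10 rating scales and example failure modes below are illustrative only, not taken from the QFMEA spreadsheet:

```python
def risk_priority(probability, severity, detection_difficulty, fatality_potential=1):
    """Risk priority number: product of the four ratings; higher = resolve first."""
    return probability * severity * detection_difficulty * fatality_potential

def rank_failure_modes(modes):
    """Sort (name, probability, severity, detection, fatality) tuples by priority."""
    return sorted(modes, key=lambda m: risk_priority(*m[1:]), reverse=True)
```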

  2. Advancing the study of violence against women using mixed methods: integrating qualitative methods into a quantitative research program.

    PubMed

    Testa, Maria; Livingston, Jennifer A; VanZile-Tamsen, Carol

    2011-02-01

    A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women's sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative research are provided.

  3. Quantitative Microbial Risk Assessment and Infectious Disease Transmission Modeling of Waterborne Enteric Pathogens.

    PubMed

    Brouwer, Andrew F; Masters, Nina B; Eisenberg, Joseph N S

    2018-04-20

    Waterborne enteric pathogens remain a global health threat. Increasingly, quantitative microbial risk assessment (QMRA) and infectious disease transmission modeling (IDTM) are used to assess waterborne pathogen risks and evaluate mitigation. These modeling efforts, however, have largely been conducted independently for different purposes and in different settings. In this review, we examine the settings where each modeling strategy is employed. QMRA research has focused on food contamination and recreational water in high-income countries (HICs) and drinking water and wastewater in low- and middle-income countries (LMICs). IDTM research has focused on large outbreaks (predominately LMICs) and vaccine-preventable diseases (LMICs and HICs). Human ecology determines the niches that pathogens exploit, leading researchers to focus on different risk assessment research strategies in different settings. To enhance risk modeling, QMRA and IDTM approaches should be integrated to include dynamics of pathogens in the environment and pathogen transmission through populations.
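
A basic QMRA building block of the kind the review discusses is a dose-response model; the exponential form below is a standard choice, and the parameter r and the independent-day assumption are generic placeholders rather than values from the paper:

```python
import math

def infection_risk(dose, r=0.5):
    """Exponential dose-response: P(infection) = 1 - exp(-r * dose)."""
    return 1 - math.exp(-r * dose)

def annual_risk(daily_dose, days=365, r=0.5):
    """Risk over repeated daily exposures, assumed independent."""
    return 1 - (1 - infection_risk(daily_dose, r)) ** days
```

An IDTM approach would replace the fixed per-day dose with pathogen concentrations that evolve with transmission dynamics, which is the integration the authors call for.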

  4. Rock Slide Risk Assessment: A Semi-Quantitative Approach

    NASA Astrophysics Data System (ADS)

    Duzgun, H. S. B.

    2009-04-01

    Rock slides can be better managed by systematic risk assessments. Any risk assessment methodology for rock slides involves identification of the rock slide risk components, which are hazard, elements at risk, and vulnerability. For a quantitative/semi-quantitative risk assessment of rock slides, a mathematical value of the risk has to be computed and evaluated. Quantitative evaluation of rock slide risk enables comparison of the computed risk with the risks of other natural and/or human-made hazards, providing better decision support and easier communication for decision makers. A quantitative/semi-quantitative risk assessment procedure involves: danger identification, hazard assessment, identification of elements at risk, vulnerability assessment, risk computation, and risk evaluation. On the other hand, the steps of this procedure require adaptation of existing implementation methods, or development of new ones, depending on the type of landslide, data availability, investigation scale, and the nature of the consequences. In this study, a generic semi-quantitative risk assessment (SQRA) procedure for rock slides is proposed. The procedure has five consecutive stages: data collection and analyses, hazard assessment, analysis of elements at risk, vulnerability assessment, and risk assessment. The implementation of the procedure is illustrated for a single rock slide case, a rock slope in Norway. Rock slides from mountain Ramnefjell into lake Loen are considered one of the major geohazards in Norway. Lake Loen is located in the inner part of Nordfjord in Western Norway. Ramnefjell Mountain is heavily jointed, leading to the formation of vertical rock slices 400-450 m in height and 7-10 m in width. These slices threaten the settlements around Loen Valley and the tourists visiting the fjord during the summer season, as released slides have the potential to create a tsunami. In the past, several rock slides were recorded from Mountain Ramnefjell between 1905 and 1950. 
Among them
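
The risk computation stage of such a procedure is commonly expressed as hazard probability times vulnerability times the value of elements at risk, summed over hazard scenarios. The numbers below are purely illustrative, not from the Ramnefjell case:

```python
def scenario_risk(p_hazard, vulnerability, elements_value):
    """Risk = annual hazard probability x vulnerability (0-1) x value at risk."""
    return p_hazard * vulnerability * elements_value

def total_risk(scenarios):
    """Aggregate risk over a list of (p_hazard, vulnerability, value) scenarios."""
    return sum(scenario_risk(p, v, e) for p, v, e in scenarios)
```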

  5. AN ASSESSMENT OF INTEGRATED RISK ASSESSMENT (Journal Article)

    EPA Science Inventory

    In order to promote international understanding and acceptance of the integrated risk assessment process, the WHO/IPCS, in collaboration with the U.S. EPA and the OECD, initiated a number of activities related to integrated risk assessment. In this project, WHO/IPCS defines inte...

  6. ADVANCING THE STUDY OF VIOLENCE AGAINST WOMEN USING MIXED METHODS: INTEGRATING QUALITATIVE METHODS INTO A QUANTITATIVE RESEARCH PROGRAM

    PubMed Central

    Testa, Maria; Livingston, Jennifer A.; VanZile-Tamsen, Carol

    2011-01-01

    A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women’s sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative research are provided. PMID:21307032

  7. Microscope-integrated quantitative analysis of intraoperative indocyanine green fluorescence angiography for blood flow assessment: first experience in 30 patients.

    PubMed

    Kamp, Marcel A; Slotty, Philipp; Turowski, Bernd; Etminan, Nima; Steiger, Hans-Jakob; Hänggi, Daniel; Stummer, Walter

    2012-03-01

    Intraoperative measurements of cerebral blood flow are of interest during vascular neurosurgery. Near-infrared indocyanine green (ICG) fluorescence angiography was introduced for visualizing vessel patency intraoperatively. However, quantitative information has not been available. To report our experience with a microscope with an integrated dynamic ICG fluorescence analysis system supplying semiquantitative information on blood flow. We recorded ICG fluorescence curves of the cortex and cerebral vessels using software integrated into the surgical microscope (Flow 800 software; Zeiss Pentero) in 30 patients undergoing surgery for different pathologies. The following hemodynamic parameters were assessed: maximum intensity, rise time, time to peak, time to half-maximal fluorescence, cerebral blood flow index, and transit times from arteries to cortex. For patients without an obvious perfusion deficit, maximum fluorescence intensity was 177.7 arbitrary intensity units (AIs; 5-mg ICG bolus), mean rise time was 5.2 seconds (range, 2.9-8.2 seconds; SD, 1.3 seconds), mean time to peak was 9.4 seconds (range, 4.9-15.2 seconds; SD, 2.5 seconds), mean cerebral blood flow index was 38.6 AI/s (range, 13.5-180.6 AI/s; SD, 36.9 AI/s), and mean transit time was 1.5 seconds (range, 360 milliseconds to 3 seconds; SD, 0.73 seconds). For 3 patients with impaired cerebral perfusion, time to peak, rise time, and transit time between arteries and cortex were markedly prolonged (>20, >9, and >5 seconds, respectively). In individual patients, the degree of perfusion impairment could be quantified by the cerebral blood flow index ratios between normal and ischemic tissue. Transit times also reflected blood flow perturbations in arteriovenous fistulas. Quantification of ICG-based fluorescence angiography appears to be useful for intraoperative monitoring of arterial patency and regional cerebral blood flow.
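
The hemodynamic parameters quoted (rise time, time to peak, blood flow index) can be derived from a sampled fluorescence intensity curve. The 10-90% rise-time definition below is an assumption for illustration; the abstract does not state Flow 800's exact definitions:

```python
def curve_metrics(t, intensity):
    """Peak intensity, time to peak, 10-90% rise time, and a blood flow index
    (peak / rise time) from a sampled fluorescence curve."""
    peak = max(intensity)
    t_peak = t[intensity.index(peak)]
    t10 = next(ti for ti, y in zip(t, intensity) if y >= 0.1 * peak)
    t90 = next(ti for ti, y in zip(t, intensity) if y >= 0.9 * peak)
    rise = t90 - t10
    bfi = peak / rise if rise > 0 else float("inf")
    return {"max": peak, "time_to_peak": t_peak, "rise_time": rise, "bfi": bfi}
```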

  8. Some suggested future directions of quantitative resource assessments

    USGS Publications Warehouse

    Singer, D.A.

    2001-01-01

    Future quantitative assessments will be expected to estimate the quantities, values, and locations of undiscovered mineral resources in a form that conveys both economic viability and the uncertainty associated with the resources. Historically, declining metal prices point to the need for larger deposits over time. Sensitivity analysis demonstrates that the greatest opportunity for reducing uncertainty in assessments lies in lowering the uncertainty associated with tonnage estimates. Of all errors possible in assessments, those affecting tonnage estimates are by far the most important. Selecting the correct deposit model is the most important way of controlling errors, because the deposit model is the best-known predictor of tonnage. In many large regions, much of the surface is covered with apparently barren rocks and sediments. Because many exposed mineral deposits are believed to have been found, a prime concern is the presence of possible mineralized rock under cover. Assessments of areas with resources under cover must rely on extrapolation from surrounding areas, new geologic maps of rocks under cover, or analogy with other well-explored areas that can be considered training tracts. Cover has a profound effect on uncertainty and on the methods and procedures of assessment, because the geology is seldom known and geophysical methods typically have attenuated responses. Many earlier assessment methods were based on relationships of geochemical and geophysical variables to deposits learned from deposits exposed at the surface; these will need to be relearned based on covered deposits. Mineral-deposit models are important in quantitative resource assessments for two reasons: (1) grades and tonnages of most deposit types are significantly different, and (2) deposit types are present in different geologic settings that can be identified from geologic maps. Mineral-deposit models are the keystone in combining the diverse geoscience information on geology, mineral

  9. Toward Quantitative Small Animal Pinhole SPECT: Assessment of Quantitation Accuracy Prior to Image Compensations

    PubMed Central

    Chen, Chia-Lin; Wang, Yuchuan; Lee, Jason J. S.; Tsui, Benjamin M. W.

    2011-01-01

    Purpose We assessed the quantitation accuracy of small animal pinhole single photon emission computed tomography (SPECT) under the current preclinical settings, where image compensations are not routinely applied. Procedures The effects of several common image-degrading factors and imaging parameters on quantitation accuracy were evaluated using Monte-Carlo simulation methods. Typical preclinical imaging configurations were modeled, and quantitative analyses were performed based on image reconstructions without compensating for attenuation, scatter, and limited system resolution. Results Using mouse-sized phantom studies as examples, attenuation effects alone degraded quantitation accuracy by up to −18% (Tc-99m or In-111) or −41% (I-125). The inclusion of scatter effects changed the above numbers to −12% (Tc-99m or In-111) and −21% (I-125), respectively, indicating the significance of scatter in quantitative I-125 imaging. Region-of-interest (ROI) definitions have greater impacts on regional quantitation accuracy for small sphere sources as compared to attenuation and scatter effects. For the same ROI, SPECT acquisitions using pinhole apertures of different sizes could significantly affect the outcome, whereas the use of different radii-of-rotation yielded negligible differences in quantitation accuracy for the imaging configurations simulated. Conclusions We have systematically quantified the influence of several factors affecting the quantitation accuracy of small animal pinhole SPECT. In order to consistently achieve accurate quantitation within 5% of the truth, comprehensive image compensation methods are needed. PMID:19048346

  10. Distinguishing nanomaterial particles from background airborne particulate matter for quantitative exposure assessment

    NASA Astrophysics Data System (ADS)

    Ono-Ogasawara, Mariko; Serita, Fumio; Takaya, Mitsutoshi

    2009-10-01

    As the production of engineered nanomaterials expands, the chance that workers involved in the manufacturing process will be exposed to nanoparticles also increases. A risk management system based on the precautionary principle is needed for workplaces in the nanomaterial industry. One of the problems in such a risk management system is the difficulty of exposure assessment. In this article, examples of exposure assessment in nanomaterial industries are reviewed, with a focus on distinguishing engineered nanomaterial particles from background nanoparticles in the workplace atmosphere. An approach by JNIOSH (Japan National Institute of Occupational Safety and Health) to quantitatively measure exposure to carbonaceous nanomaterials is also introduced. In addition to real-time measurements and qualitative analysis by electron microscopy, quantitative chemical analysis is necessary for assessing exposure to nanomaterials. Chemical analysis is suitable for quantitative exposure measurement, especially at facilities with high levels of background nanoparticles.

  11. AN INTEGRATED PERSPECTIVE ON THE ASSESSMENT OF TECHNOLOGIES: INTEGRATE-HTA.

    PubMed

    Wahlster, Philip; Brereton, Louise; Burns, Jacob; Hofmann, Björn; Mozygemba, Kati; Oortwijn, Wija; Pfadenhauer, Lisa; Polus, Stephanie; Rehfuess, Eva; Schilling, Imke; van der Wilt, Gert Jan; Gerhardus, Ansgar

    2017-01-01

    Current health technology assessment (HTA) is not well equipped to assess complex technologies as insufficient attention is being paid to the diversity in patient characteristics and preferences, context, and implementation. Strategies to integrate these and several other aspects, such as ethical considerations, in a comprehensive assessment are missing. The aim of the European research project INTEGRATE-HTA was to develop a model for an integrated HTA of complex technologies. A multi-method, four-stage approach guided the development of the INTEGRATE-HTA Model: (i) definition of the different dimensions of information to be integrated, (ii) literature review of existing methods for integration, (iii) adjustment of concepts and methods for assessing distinct aspects of complex technologies in the frame of an integrated process, and (iv) application of the model in a case study and subsequent revisions. The INTEGRATE-HTA Model consists of five steps, each involving stakeholders: (i) definition of the technology and the objective of the HTA; (ii) development of a logic model to provide a structured overview of the technology and the system in which it is embedded; (iii) evidence assessment on effectiveness, economic, ethical, legal, and socio-cultural aspects, taking variability of participants, context, implementation issues, and their interactions into account; (iv) populating the logic model with the data generated in step 3; (v) structured process of decision-making. The INTEGRATE-HTA Model provides a structured process for integrated HTAs of complex technologies. Stakeholder involvement in all steps is essential as a means of ensuring relevance and meaningful interpretation of the evidence.

  12. Quantitative Evaluation of Performance in Interventional Neuroradiology: An Integrated Curriculum Featuring Theoretical and Practical Challenges

    PubMed Central

    Ernst, Marielle; Kriston, Levente; Romero, Javier M.; Frölich, Andreas M.; Jansen, Olav; Fiehler, Jens; Buhk, Jan-Hendrik

    2016-01-01

    Purpose We sought to develop a standardized curriculum capable of assessing key competencies in Interventional Neuroradiology by the use of models and simulators in an objective, quantitative, and efficient way. In this evaluation we analyzed the associations between the practical experience, theoretical knowledge, and the skills lab performance of interventionalists. Materials and Methods We evaluated the endovascular skills of 26 participants of the Advanced Course in Endovascular Interventional Neuroradiology of the European Society of Neuroradiology with a set of three tasks (aneurysm coiling and thrombectomy in a virtual simulator and placement of an intra-aneurysmal flow disruptor in a flow model). Practical experience was assessed by a survey. Participants completed a written and oral examination to evaluate theoretical knowledge. Bivariate and multivariate analyses were performed. Results In multivariate analysis, knowledge of materials and techniques in Interventional Neuroradiology was moderately associated with skills in aneurysm coiling and thrombectomy. Experience in mechanical thrombectomy was moderately associated with thrombectomy skills, while age was negatively associated with thrombectomy skills. We found no significant association between age, sex, or work experience and skills in aneurysm coiling. Conclusion Our study gives an example of what an integrated curriculum for a reasonable and cost-effective assessment of the key competences of an interventional neuroradiologist could look like. In addition to the traditional assessment of theoretical knowledge, practical skills are measured by the use of endovascular simulators, yielding objective, quantitative, and constructive data for evaluating both the current performance status of participants and the evolution of their technical competency over time. PMID:26848840

  13. Quantitative assessment of airborne exposures generated during common cleaning tasks: a pilot study

    PubMed Central

    2010-01-01

    Background A growing body of epidemiologic evidence suggests an association between exposure to cleaning products and asthma and other respiratory disorders. Thus far, these studies have conducted only limited quantitative exposure assessments. Exposures from cleaning products are difficult to measure because the products are complex mixtures of chemicals with a range of physicochemical properties, thus requiring multiple measurement techniques. We conducted a pilot exposure assessment study to identify methods for assessing short-term, task-based airborne exposures and to quantitatively evaluate airborne exposures associated with cleaning tasks simulated under controlled work environment conditions. Methods Sink, mirror, and toilet bowl cleaning tasks were simulated in a large ventilated bathroom and a small unventilated bathroom using a general purpose cleaner, a glass cleaner, and a bathroom cleaner. All tasks were performed for 10 minutes. Airborne total volatile organic compounds (TVOC) generated during the tasks were measured using a direct reading instrument (DRI) with a photoionization detector. Volatile organic ingredients of the cleaning mixtures were assessed utilizing an integrated sampling and analytic method, EPA TO-17. Ammonia air concentrations were also measured with an electrochemical sensor embedded in the DRI. Results Average TVOC concentrations calculated for the 10-minute tasks ranged from 0.02 to 6.49 ppm, and the highest peak concentrations observed ranged from 0.14 to 11 ppm. TVOC time-concentration profiles indicated that exposures above background level persisted for about 20 minutes after cessation of the tasks. Among several targeted VOC compounds from the cleaning mixtures, only 2-BE (2-butoxyethanol) was detectable with the EPA method. The 10-minute average 2-BE concentrations ranged from 0.30 to 21 ppm between tasks. The DRI underestimated 2-BE exposures compared with the results from the integrated method. The highest ammonia concentration, 2.8 ppm, occurred during mirror cleaning.
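
As a minimal illustration of the task-averaging arithmetic described above, the sketch below computes a 10-minute task average and peak from evenly spaced DRI readings. The reading values are invented for the example, not data from the study.

```python
# Hypothetical illustration of averaging direct-reading-instrument (DRI)
# TVOC readings over a 10-minute task. Readings below are invented values.

def task_average(readings_ppm):
    """Arithmetic mean of evenly spaced DRI readings (ppm)."""
    return sum(readings_ppm) / len(readings_ppm)

def peak(readings_ppm):
    """Highest instantaneous reading (ppm) during the task."""
    return max(readings_ppm)

# one invented reading per minute for a 10-minute task
readings = [0.1, 0.4, 1.2, 2.0, 1.8, 1.5, 1.1, 0.9, 0.6, 0.4]
print(round(task_average(readings), 2), peak(readings))  # → 1.0 2.0
```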

  14. Quantitative assessment of key parameters in qualitative vulnerability methods applied in karst systems based on an integrated numerical modelling approach

    NASA Astrophysics Data System (ADS)

    Doummar, Joanna; Kassem, Assaad

    2017-04-01

    In the framework of a three-year PEER (USAID/NSF) funded project, flow in a karst system in Lebanon (Assal) dominated by snow and semi-arid conditions was simulated and successfully calibrated using an integrated numerical model (MIKE-She 2016) based on high-resolution input data and detailed catchment characterization. Point source infiltration and fast flow pathways were simulated by a bypass function and a highly conductive lens, respectively. The approach consisted of identifying all the factors used in qualitative vulnerability methods (COP, EPIK, PI, DRASTIC, GOD) applied in karst systems and assessing their influence on recharge signals in the different hydrological karst compartments (atmosphere, unsaturated zone, and saturated zone) based on the integrated numerical model. These parameters are usually attributed different weights according to their estimated impact on groundwater vulnerability. The aim of this work is to quantify the importance of each of these parameters and to outline parameters that are not accounted for in standard methods but that might play a role in the vulnerability of a system. The spatial distribution of the detailed evapotranspiration, infiltration, and recharge signals from atmosphere to unsaturated zone to saturated zone was compared and contrasted among different surface settings and under varying flow conditions (e.g., varying slopes, land cover, precipitation intensity, and soil properties, as well as point source infiltration). Furthermore, a sensitivity analysis of individual or coupled major parameters makes it possible to quantify their impact on recharge and, indirectly, on vulnerability. The preliminary analysis yields a new methodology that accounts for most of the factors influencing vulnerability while refining the weights attributed to each of them, based on a quantitative approach.

  15. Quantitative Microbial Risk Assessment of Pharmaceutical Products.

    PubMed

    Eissa, Mostafa Essam

    2017-01-01

    Monitoring of microbiological quality in the pharmaceutical industry is an important criterion that is required to justify safe product release to the drug market. Good manufacturing practice and efficient control of the bioburden level of product components are critical parameters that influence the microbiological cleanliness of medicinal products. However, because microbial dispersion through the samples follows a Poisson distribution, the rate of detection of microbiologically defective samples, λ, decreases when the number of defective units per batch decreases. When integrating a dose-response model of infection (P_inf) for a specific objectionable microbe with a contamination module, the overall probability of infection from a single batch of pharmaceutical product can be estimated. Combining P_inf with the detectability chance of the test (P_det) yields a value that can be used as a quantitative measure of the possibility of passing contaminated batch units of product with a certain load of a specific pathogen and infecting the final consumer without detection in the firm. The simulation study can be used to assess the risk of contamination and infection from objectionable microorganisms for sterile and non-sterile products. LAY ABSTRACT: Microbial contamination of pharmaceutical products is a global problem that may lead to infection and possibly death. While reputable pharmaceutical companies strive to deliver microbiologically safe products, it would be helpful to apply an assessment system for the current risk associated with pharmaceutical batches delivered to the drug market. The current methodology may also be helpful in determining the degree of improvement or deterioration along the batch processing flow until reaching the final consumer. Moreover, the present system is flexible and can be applied to other industries such as food, cosmetics, or medical device manufacturing and processing fields to assess the microbiological risk of
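
The detection-versus-contamination argument above can be sketched numerically. As a simplification (not the paper's exact model), assume a tested sample contains at least one defective unit with probability 1 - exp(-λ); the chance of releasing a contaminated batch undetected and infecting a consumer then combines the escape probability with P_inf. The rates and P_inf value below are invented.

```python
import math

# Sketch under stated assumptions (not the paper's exact model): defective
# units are Poisson-distributed with rate lam per tested sample, so a sample
# contains at least one defective unit with probability 1 - exp(-lam).

def p_detect(lam, n_samples):
    """Probability that at least one of n tested samples is found defective."""
    p_sample = 1.0 - math.exp(-lam)
    return 1.0 - (1.0 - p_sample) ** n_samples

def p_escape_and_infect(lam, n_samples, p_inf):
    """Contaminated batch passes testing undetected AND infects a consumer."""
    return (1.0 - p_detect(lam, n_samples)) * p_inf

# invented numbers: lower contamination rates are harder to detect
print(round(p_detect(0.5, 10), 3))                  # → 0.993
print(round(p_escape_and_infect(0.5, 10, 0.2), 4))  # → 0.0013
```

Note how detectability drops with λ: fewer defective units per batch means a larger escape probability, which is the paper's motivation for combining P_det with P_inf.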

  16. Quantitative assessment of 12-lead ECG synthesis using CAVIAR.

    PubMed

    Scherer, J A; Rubel, P; Fayn, J; Willems, J L

    1992-01-01

    The objective of this study is to assess the performance of patient-specific segment-specific (PSSS) synthesis of QRST complexes using CAVIAR, a new method for the serial comparison of electrocardiograms and vectorcardiograms. A collection of 250 multi-lead recordings from the Common Standards for Quantitative Electrocardiography (CSE) diagnostic pilot study is employed. QRS and ST-T segments are independently synthesized using the PSSS algorithm so that the mean-squared error between the original and estimated waveforms is minimized. CAVIAR compares the recorded and synthesized QRS and ST-T segments and calculates the mean-quadratic deviation as a measure of error. The results of this study indicate that estimated QRS complexes are good representatives of their recorded counterparts, and the integrity of the spatial information is maintained by the PSSS synthesis process. Analysis of the ST-T segments suggests that the deviations between recorded and synthesized waveforms are considerably greater than those associated with the QRS complexes. The poorer performance on the ST-T segments is attributed to magnitude normalization of the spatial loops, low-voltage passages, and noise interference. Using the mean-quadratic deviation and CAVIAR as methods of performance assessment, this study indicates that the PSSS-synthesis algorithm accurately maintains the signal information within the 12-lead electrocardiogram.
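
A hedged sketch of the least-squares idea behind PSSS-style lead synthesis (not the CAVIAR or PSSS implementation): estimate a target lead as the linear combination of measured leads that minimizes mean-squared error, then score the fit with a mean-quadratic deviation. All signals below are synthetic.

```python
import numpy as np

# Illustrative sketch, not the CAVIAR/PSSS code: synthesize a target lead as
# a least-squares combination of measured leads, then score the fit with a
# mean-quadratic deviation, as in the abstract above. Signals are synthetic.

rng = np.random.default_rng(0)
measured = rng.normal(size=(100, 3))     # 3 measured leads, 100 time samples
true_coeffs = np.array([0.5, -1.0, 2.0]) # invented mixing coefficients
target = measured @ true_coeffs          # noiseless synthetic target lead

# least-squares fit minimizing the mean-squared reconstruction error
coeffs, *_ = np.linalg.lstsq(measured, target, rcond=None)
synth = measured @ coeffs
mqd = np.mean((target - synth) ** 2)     # mean-quadratic deviation
print(np.allclose(coeffs, true_coeffs), mqd < 1e-12)  # → True True
```

With a noiseless target the fit is essentially exact; real recordings would leave a residual mean-quadratic deviation, which is what CAVIAR quantifies.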

  17. A framework for organizing and selecting quantitative approaches for benefit-harm assessment.

    PubMed

    Puhan, Milo A; Singh, Sonal; Weiss, Carlos O; Varadhan, Ravi; Boyd, Cynthia M

    2012-11-19

    Several quantitative approaches for benefit-harm assessment of health care interventions exist, but it is unclear how the approaches differ. Our aim was to review existing quantitative approaches for benefit-harm assessment and to develop an organizing framework that clarifies differences and aids selection of quantitative approaches for a particular benefit-harm assessment. We performed a review of the literature to identify quantitative approaches for benefit-harm assessment. Our team, consisting of clinicians, epidemiologists, and statisticians, discussed the approaches and identified their key characteristics. We developed a framework that helps investigators select quantitative approaches for benefit-harm assessment that are appropriate for a particular decision-making context. Our framework for selecting quantitative approaches requires a concise definition of the treatment comparison and population of interest, identification of key benefit and harm outcomes, and determination of the need for a measure that puts all outcomes on a single scale (which we call a benefit and harm comparison metric). We identified 16 quantitative approaches for benefit-harm assessment. These approaches can be categorized into those that consider single or multiple key benefit and harm outcomes, and those that use a benefit-harm comparison metric or not. Most approaches use aggregate data and can be used in the context of single studies or systematic reviews. Although the majority of approaches provide a benefit and harm comparison metric, only four approaches provide measures of uncertainty around the benefit and harm comparison metric (such as a 95 percent confidence interval). None of the approaches considers the actual joint distribution of benefit and harm outcomes, but one approach considers competing risks when calculating profile-specific event rates. Nine approaches explicitly allow incorporating patient preferences. The choice of quantitative approaches depends on the

  18. A framework for organizing and selecting quantitative approaches for benefit-harm assessment

    PubMed Central

    2012-01-01

    Background Several quantitative approaches for benefit-harm assessment of health care interventions exist, but it is unclear how the approaches differ. Our aim was to review existing quantitative approaches for benefit-harm assessment and to develop an organizing framework that clarifies differences and aids selection of quantitative approaches for a particular benefit-harm assessment. Methods We performed a review of the literature to identify quantitative approaches for benefit-harm assessment. Our team, consisting of clinicians, epidemiologists, and statisticians, discussed the approaches and identified their key characteristics. We developed a framework that helps investigators select quantitative approaches for benefit-harm assessment that are appropriate for a particular decision-making context. Results Our framework for selecting quantitative approaches requires a concise definition of the treatment comparison and population of interest, identification of key benefit and harm outcomes, and determination of the need for a measure that puts all outcomes on a single scale (which we call a benefit and harm comparison metric). We identified 16 quantitative approaches for benefit-harm assessment. These approaches can be categorized into those that consider single or multiple key benefit and harm outcomes, and those that use a benefit-harm comparison metric or not. Most approaches use aggregate data and can be used in the context of single studies or systematic reviews. Although the majority of approaches provide a benefit and harm comparison metric, only four approaches provide measures of uncertainty around the benefit and harm comparison metric (such as a 95 percent confidence interval). None of the approaches considers the actual joint distribution of benefit and harm outcomes, but one approach considers competing risks when calculating profile-specific event rates. Nine approaches explicitly allow incorporating patient preferences. Conclusion The choice of

  19. Can We Integrate Qualitative and Quantitative Research in Science Education?

    NASA Astrophysics Data System (ADS)

    Niaz, Mansoor

    The main objective of this paper is to emphasize the importance of integrating qualitative and quantitative research methodologies in science education. It is argued that the Kuhnian incommensurability thesis (a major source of inspiration for qualitative researchers) represents an obstacle for this integration. A major thesis of the paper is that qualitative researchers have interpreted the increased popularity of their paradigm (research programme) as a revolutionary breakthrough in the Kuhnian sense. A review of the literature in areas relevant to science education shows that researchers are far from advocating qualitative research as the only methodology. It is concluded that competition between divergent approaches to research in science education (cf. Lakatos, 1970) would provide a better forum for a productive sharing of research experiences.

  20. Towards quantitative condition assessment of biodiversity outcomes: Insights from Australian marine protected areas.

    PubMed

    Addison, Prue F E; Flander, Louisa B; Cook, Carly N

    2017-08-01

    Protected area management effectiveness (PAME) evaluation is increasingly undertaken to evaluate governance, assess conservation outcomes and inform evidence-based management of protected areas (PAs). Within PAME, quantitative approaches to assess biodiversity outcomes are now emerging, where biological monitoring data are directly assessed against quantitative (numerically defined) condition categories (termed quantitative condition assessments). More commonly, however, PAME employs qualitative condition assessments, which use descriptive condition categories and rely largely on expert judgement that can be subject to a range of biases, such as linguistic uncertainty and overconfidence. Despite the benefits of increased transparency and repeatability of evaluations, quantitative condition assessments are rarely used in PAME. To understand why, we interviewed practitioners from all Australian marine protected area (MPA) networks, which have access to long-term biological monitoring data and are developing or conducting PAME evaluations. Our research revealed that there is a desire within management agencies to implement quantitative condition assessment of biodiversity outcomes in Australian MPAs. However, practitioners report many challenges in transitioning from qualitative to quantitative condition assessments of biodiversity outcomes, which are hampering progress. Challenges include a lack of agency capacity (staff numbers and money), knowledge gaps, and diminishing public and political support for PAs. We point to opportunities to target strategies that will assist agencies in overcoming these challenges, including new decision support tools, approaches to better finance conservation efforts, and promotion of more management-relevant science. While a single solution is unlikely to achieve fully evidence-based conservation, we suggest ways for agencies to target strategies and advance PAME evaluations toward best practice.

  1. System integration of wind and solar power in integrated assessment models: A cross-model evaluation of new approaches

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pietzcker, Robert C.; Ueckerdt, Falko; Carrara, Samuel

    Mitigation-Process Integrated Assessment Models (MP-IAMs) are used to analyze long-term transformation pathways of the energy system required to achieve stringent climate change mitigation targets. Due to their substantial temporal and spatial aggregation, IAMs cannot explicitly represent all detailed challenges of integrating the variable renewable energies (VRE) wind and solar in power systems, but rather rely on parameterized modeling approaches. In the ADVANCE project, six international modeling teams have developed new approaches to improve the representation of power sector dynamics and VRE integration in IAMs. In this study, we qualitatively and quantitatively evaluate recent years' modeling progress and study the impact of VRE integration modeling on VRE deployment in IAM scenarios. For a comprehensive and transparent qualitative evaluation, we first develop a framework of 18 features of power sector dynamics and VRE integration. We then apply this framework to the newly developed modeling approaches to derive a detailed map of strengths and limitations of the different approaches. For the quantitative evaluation, we compare the IAMs to the detailed hourly-resolution power sector model REMIX. We find that the new modeling approaches manage to represent a large number of features of the power sector, and the numerical results are in reasonable agreement with those derived from the detailed power sector model. Updating the power sector representation and the cost and resources of wind and solar substantially increased wind and solar shares across models: under a carbon price of $30/tCO2 in 2020 (increasing by 5% per year), the model-average cost-minimizing VRE share over the period 2050-2100 is 62% of electricity generation, 24 percentage points higher than with the old model version.
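
The scenario assumption quoted above (a carbon price of 30 $/tCO2 in 2020 growing by 5% per year) reduces to simple compound growth, sketched here:

```python
# Arithmetic sketch of the scenario assumption above: a carbon price of
# 30 $/tCO2 in 2020 growing at 5% per year (compound growth).

def carbon_price(year, base=30.0, base_year=2020, growth=0.05):
    """Carbon price in $/tCO2 for a given year under compound growth."""
    return base * (1.0 + growth) ** (year - base_year)

print(round(carbon_price(2020), 2))  # → 30.0
print(round(carbon_price(2050), 2))  # price after 30 years of compounding
```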

  2. Integrated Science Assessments

    EPA Pesticide Factsheets

    Integrated Science Assessments are reports that represent a concise evaluation and synthesis of the most policy-relevant science for reviewing the National Ambient Air Quality Standards for the six principal pollutants.

  3. Integrating a quantitative risk appraisal in a health impact assessment: analysis of the novel smoke-free policy in Hungary.

    PubMed

    Ádám, Balázs; Molnár, Ágnes; Gulis, Gabriel; Ádány, Róza

    2013-04-01

    Although the quantification of health outcomes in a health impact assessment (HIA) is scarce in practice, it is preferred by policymakers, as it assists various aspects of the decision-making process. This article provides an example of integrating a quantitative risk appraisal in an HIA performed for the recently adopted Hungarian anti-smoking policy which introduced a smoking ban in closed public places, workplaces and public transport vehicles, and is one of the most effective measures to decrease smoking-related ill health. A comprehensive, prospective HIA was conducted to map the full impact chain of the proposal. Causal pathways were prioritized in a transparent process with special attention given to those pathways for which measures of disease burden could be calculated for the baseline and predicted future scenarios. The proposal was found to decrease the prevalence of active and passive smoking and result in a considerably positive effect on several diseases, among which lung cancer, chronic pulmonary diseases, coronary heart diseases and stroke have the greatest importance. The health gain calculated for the quantifiable health outcomes is close to 1700 deaths postponed and 16,000 life years saved annually in Hungary. The provision of smoke-free public places has an unambiguously positive impact on the health of the public, especially in a country with a high burden of smoking-related diseases. The study described offers a practical example of applying quantification in an HIA, thereby promoting its incorporation into political decision making.

  4. A novel integrated assessment methodology of urban water reuse.

    PubMed

    Listowski, A; Ngo, H H; Guo, W S; Vigneswaran, S

    2011-01-01

    Wastewater is no longer considered a waste product, and water reuse needs to play a stronger part in securing urban water supply. Although treatment technologies for water reclamation have significantly improved, the question that deserves further analysis is how the selection of a particular wastewater treatment technology relates to performance and sustainability. The proposed assessment model integrates: (i) technology, characterised by selected quantity and quality performance parameters; (ii) productivity, efficiency and reliability criteria; (iii) quantitative performance indicators; and (iv) development of an evaluation model. The challenges related to the hierarchy and selection of performance indicators have been resolved through the case study analysis. The goal of this study is to validate a new assessment methodology in relation to the performance of microfiltration (MF) technology, a key element of the treatment process. Specific performance data and measurements were obtained at specific Control and Data Acquisition Points (CP) to satisfy the input-output inventory in relation to water resources, products, material flows, energy requirements, chemicals use, etc. The performance assessment process contains analysis and the necessary linking across important parametric functions, leading to reliable outcomes and results.

  5. Integrated farm sustainability assessment for the environmental management of rural activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stachetii Rodrigues, Geraldo, E-mail: stacheti@cnpma.embrapa.b; Aparecida Rodrigues, Izilda, E-mail: isis@cnpma.embrapa.b; Almeida Buschinelli, Claudio Cesar de, E-mail: buschi@cnpma.embrapa.b

    2010-07-15

    Farmers have been increasingly called upon to respond to an ongoing redefinition in consumers' demands, having as a converging theme the search for sustainable production practices. In order to satisfy this objective, instruments for the environmental management of agricultural activities have been sought out. Environmental impact assessment methods are appropriate tools to address the choice of technologies and management practices to minimize negative effects of agricultural development, while maximizing productive efficiency, sound usage of natural resources, conservation of ecological assets and equitable access to wealth generation means. The 'system for weighted environmental impact assessment of rural activities' (APOIA-NovoRural) presented in this paper is organized to provide integrated farm sustainability assessment according to quantitative environmental standards and defined socio-economic benchmarks. The system integrates sixty-two objective indicators in five sustainability dimensions - (i) Landscape ecology, (ii) Environmental quality (atmosphere, water and soil), (iii) Sociocultural values, (iv) Economic values, and (v) Management and administration. Impact indices are expressed in three integration levels: (i) specific indicators, that offer a diagnostic and managerial tool for farmers and rural administrators, by pointing out particular attributes of the rural activities that may be failing to comply with defined environmental performance objectives; (ii) integrated sustainability dimensions, that show decision-makers the major contributions of the rural activities toward local sustainable development, facilitating the definition of control actions and promotion measures; and (iii) aggregated sustainability index, that can be considered a yardstick for eco-certification purposes. Nine fully documented case studies carried out with the APOIA-NovoRural system, focusing on different scales, diverse rural activities/farming systems, and
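
The three integration levels described above (specific indicators, dimension indices, aggregated sustainability index) can be sketched as a simple aggregation. The five dimension names follow the abstract, but the indicator values and equal weighting are invented for illustration.

```python
# Hypothetical sketch of the three integration levels described above:
# specific indicators -> dimension indices -> aggregated index.
# Indicator values and equal weighting are invented for illustration.

def dimension_index(indicators):
    """Equal-weight mean of indicator scores within one dimension."""
    return sum(indicators) / len(indicators)

def aggregated_index(dimensions):
    """Equal-weight mean across dimension indices (the 'yardstick')."""
    return sum(dimensions.values()) / len(dimensions)

scores = {
    "landscape_ecology": [0.8, 0.6],
    "environmental_quality": [0.7, 0.9, 0.8],
    "sociocultural_values": [0.6],
    "economic_values": [0.9, 0.7],
    "management_administration": [0.5, 0.7],
}
dims = {name: dimension_index(vals) for name, vals in scores.items()}
print(round(aggregated_index(dims), 3))  # → 0.7
```

The real system weights its sixty-two indicators rather than averaging them equally; the structure of the rollup is what this sketch shows.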

  6. An Integrated Qualitative and Quantitative Biochemical Model Learning Framework Using Evolutionary Strategy and Simulated Annealing.

    PubMed

    Wu, Zujian; Pang, Wei; Coghill, George M

    2015-01-01

    Both qualitative and quantitative model learning frameworks for biochemical systems have been studied in computational systems biology. In this research, after introducing two forms of pre-defined component patterns to represent biochemical models, we propose an integrative qualitative and quantitative modelling framework for inferring biochemical systems. In the proposed framework, interactions between reactants in the candidate models for a target biochemical system are evolved and eventually identified by the application of a qualitative model learning approach with an evolution strategy. Kinetic rates of the models generated from qualitative model learning are then further optimised by employing a quantitative approach with simulated annealing. Experimental results indicate that our proposed integrative framework can feasibly learn the relationships between biochemical reactants qualitatively and make the model replicate the behaviours of the target system by optimising the kinetic rates quantitatively. Moreover, potential reactants of a target biochemical system can be discovered by hypothesising complex reactants in the synthetic models. Based on the biochemical models learned with the proposed framework, biologists can further perform experimental studies in the wet laboratory. In this way, natural biochemical systems can be better understood.
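
As a hedged sketch of the quantitative refinement step (not the authors' code), the snippet below uses simulated annealing to tune a single kinetic rate k of a toy exponential-decay model so it replicates synthetic target behaviour. The model, cooling schedule, and proposal width are all assumptions made for the example.

```python
import math
import random

# Minimal simulated-annealing sketch (assumed setup, not the paper's code):
# tune one kinetic rate k so a toy exponential-decay model matches target
# observations, mirroring the quantitative refinement step described above.

def model(k, times):
    return [math.exp(-k * t) for t in times]

def cost(k, times, observed):
    """Sum of squared deviations between model output and observations."""
    return sum((m - o) ** 2 for m, o in zip(model(k, times), observed))

def anneal(times, observed, k0=1.0, temp=1.0, cooling=0.95, steps=2000, seed=1):
    random.seed(seed)
    k = k0
    cur_c = cost(k0, times, observed)
    best_k, best_c = k0, cur_c
    for _ in range(steps):
        cand = abs(k + random.gauss(0, 0.1))      # perturb the current rate
        c = cost(cand, times, observed)
        # accept improvements always, worse moves with Metropolis probability
        if c < cur_c or random.random() < math.exp(-(c - cur_c) / temp):
            k, cur_c = cand, c
            if c < best_c:
                best_k, best_c = cand, c
        temp *= cooling                           # geometric cooling schedule
    return best_k

times = [0.0, 0.5, 1.0, 2.0]
observed = model(0.7, times)    # synthetic target generated with k = 0.7
k_fit = anneal(times, observed)
print(cost(k_fit, times, observed) <= cost(1.0, times, observed))  # → True
```

In the paper the optimised object is a full kinetic model produced by the qualitative learning stage; the annealing loop itself has this shape.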

  7. Integrating Quantitative and Qualitative Data in Mixed Methods Research--Challenges and Benefits

    ERIC Educational Resources Information Center

    Almalki, Sami

    2016-01-01

    This paper is concerned with investigating the integration of quantitative and qualitative data in mixed methods research and whether, in spite of its challenges, it can be of positive benefit to many investigative studies. The paper introduces the topic, defines the terms with which this subject deals and undertakes a literature review to outline…

  8. Integrating Quantitative and Qualitative Results in Health Science Mixed Methods Research Through Joint Displays.

    PubMed

    Guetterman, Timothy C; Fetters, Michael D; Creswell, John W

    2015-11-01

    Mixed methods research is becoming an important methodology to investigate complex health-related topics, yet the meaningful integration of qualitative and quantitative data remains elusive and needs further development. A promising innovation to facilitate integration is the use of visual joint displays that bring data together visually to draw out new insights. The purpose of this study was to identify exemplar joint displays by analyzing the various types of joint displays being used in published articles. We searched for empirical articles that included joint displays in 3 journals that publish state-of-the-art mixed methods research. We analyzed each of 19 identified joint displays to extract the type of display, mixed methods design, purpose, rationale, qualitative and quantitative data sources, integration approaches, and analytic strategies. Our analysis focused on what each display communicated and its representation of mixed methods analysis. The most prevalent types of joint displays were statistics-by-themes and side-by-side comparisons. Innovative joint displays connected findings to theoretical frameworks or recommendations. Researchers used joint displays for convergent, explanatory sequential, exploratory sequential, and intervention designs. We identified exemplars for each of these designs by analyzing the inferences gained through using the joint display. Exemplars represented mixed methods integration, presented integrated results, and yielded new insights. Joint displays appear to provide a structure to discuss the integrated analysis and assist both researchers and readers in understanding how mixed methods provides new insights. We encourage researchers to use joint displays to integrate and represent mixed methods analysis and discuss their value. © 2015 Annals of Family Medicine, Inc.

  9. Integrating Quantitative and Qualitative Results in Health Science Mixed Methods Research Through Joint Displays

    PubMed Central

    Guetterman, Timothy C.; Fetters, Michael D.; Creswell, John W.

    2015-01-01

    PURPOSE Mixed methods research is becoming an important methodology to investigate complex health-related topics, yet the meaningful integration of qualitative and quantitative data remains elusive and needs further development. A promising innovation to facilitate integration is the use of visual joint displays that bring data together visually to draw out new insights. The purpose of this study was to identify exemplar joint displays by analyzing the various types of joint displays being used in published articles. METHODS We searched for empirical articles that included joint displays in 3 journals that publish state-of-the-art mixed methods research. We analyzed each of 19 identified joint displays to extract the type of display, mixed methods design, purpose, rationale, qualitative and quantitative data sources, integration approaches, and analytic strategies. Our analysis focused on what each display communicated and its representation of mixed methods analysis. RESULTS The most prevalent types of joint displays were statistics-by-themes and side-by-side comparisons. Innovative joint displays connected findings to theoretical frameworks or recommendations. Researchers used joint displays for convergent, explanatory sequential, exploratory sequential, and intervention designs. We identified exemplars for each of these designs by analyzing the inferences gained through using the joint display. Exemplars represented mixed methods integration, presented integrated results, and yielded new insights. CONCLUSIONS Joint displays appear to provide a structure to discuss the integrated analysis and assist both researchers and readers in understanding how mixed methods provides new insights. We encourage researchers to use joint displays to integrate and represent mixed methods analysis and discuss their value. PMID:26553895

  10. Quantifying biological integrity by taxonomic completeness: its utility in regional and global assessments.

    PubMed

    Hawkins, Charles P

    2006-08-01

    Water resources managers and conservation biologists need reliable, quantitative, and directly comparable methods for assessing the biological integrity of the world's aquatic ecosystems. Large-scale assessments are constrained by the lack of consistency in the indicators used to assess biological integrity and our current inability to translate between indicators. In theory, assessments based on estimates of taxonomic completeness, i.e., the proportion of expected taxa that were observed (observed/expected, O/E) are directly comparable to one another and should therefore allow regionally and globally consistent summaries of the biological integrity of freshwater ecosystems. However, we know little about the true comparability of O/E assessments derived from different data sets or how well O/E assessments perform relative to other indicators in use. I compared the performance (precision, bias, and sensitivity to stressors) of O/E assessments based on five different data sets with the performance of the indicators previously applied to these data (three multimetric indices, a biotic index, and a hybrid method used by the state of Maine). Analyses were based on data collected from U.S. stream ecosystems in North Carolina, the Mid-Atlantic Highlands, Maine, and Ohio. O/E assessments resulted in very similar estimates of mean regional conditions compared with most other indicators once these indicators' values were standardized relative to reference-site means. However, other indicators tended to be biased estimators of O/E, a consequence of differences in their response to natural environmental gradients and sensitivity to stressors. These results imply that, in some cases, it may be possible to compare assessments derived from different indicators by standardizing their values (a statistical approach to data harmonization). In situations where it is difficult to standardize or otherwise harmonize two or more indicators, O/E values can easily be derived from existing
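    The O/E index at the heart of this record is simply observed richness divided by model-expected richness. A minimal sketch with hypothetical taxa and capture probabilities (the 0.5 probability cutoff is a common convention in O/E modeling, not something this abstract prescribes):

    ```python
    def taxonomic_completeness(expected_probs, observed_taxa, p_min=0.5):
        """O/E index: observed richness over expected richness.

        expected_probs: taxon -> modeled probability of capture at the
                        site (from a reference-site predictive model).
        observed_taxa:  set of taxa actually collected at the site.
        Only taxa with capture probability >= p_min are scored, a common
        convention that reduces noise from rarely captured taxa.
        """
        scored = {t: p for t, p in expected_probs.items() if p >= p_min}
        E = sum(scored.values())                           # expected richness
        O = sum(1 for t in scored if t in observed_taxa)   # of those, observed
        return O / E

    # Hypothetical site: the model expects four common taxa (E = 3.0);
    # three of them were collected.
    probs = {"Baetis": 0.9, "Heptagenia": 0.8, "Simulium": 0.7,
             "Chironomus": 0.6, "RareTaxon": 0.2}
    observed = {"Baetis", "Simulium", "Chironomus"}
    oe = taxonomic_completeness(probs, observed)  # values near 1.0 indicate intact assemblages
    ```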

  11. Critical Assessment Issues in Work-Integrated Learning

    ERIC Educational Resources Information Center

    Ferns, Sonia; Zegwaard, Karsten E.

    2014-01-01

    Assessment has long been a contentious issue in work-integrated learning (WIL) and cooperative education. Despite assessment being central to the integrity and accountability of a university and long-standing theories around best practice in assessment, enacting quality assessment practices has proven to be more difficult. Authors in this special…

  12. Quantitative Procedures for the Assessment of Quality in Higher Education Institutions.

    ERIC Educational Resources Information Center

    Moran, Tom; Rowse, Glenwood

    The development of procedures designed to provide quantitative assessments of quality in higher education institutions is reviewed. These procedures employ a systems framework and utilize quantitative data to compare institutions or programs of similar types with one another. Three major elements essential in the development of models focusing on…

  13. Enhancing quantitative approaches for assessing community resilience

    USGS Publications Warehouse

    Chuang, W. C.; Garmestani, A.S.; Eason, T. N.; Spanbauer, T. L.; Fried-Peterson, H. B.; Roberts, C.P.; Sundstrom, Shana M.; Burnett, J.L.; Angeler, David G.; Chaffin, Brian C.; Gunderson, L.; Twidwell, Dirac; Allen, Craig R.

    2018-01-01

    Scholars from many different intellectual disciplines have attempted to measure, estimate, or quantify resilience. However, there is growing concern that lack of clarity on the operationalization of the concept will limit its application. In this paper, we discuss the theory, research development and quantitative approaches in ecological and community resilience. Upon noting the lack of methods that quantify the complexities of the linked human and natural aspects of community resilience, we identify several promising approaches within the ecological resilience tradition that may be useful in filling these gaps. Further, we discuss the challenges for consolidating these approaches into a more integrated perspective for managing social-ecological systems.

  14. Enhancing quantitative approaches for assessing community resilience.

    PubMed

    Chuang, W C; Garmestani, A; Eason, T N; Spanbauer, T L; Fried-Petersen, H B; Roberts, C P; Sundstrom, S M; Burnett, J L; Angeler, D G; Chaffin, B C; Gunderson, L; Twidwell, D; Allen, C R

    2018-05-01

    Scholars from many different intellectual disciplines have attempted to measure, estimate, or quantify resilience. However, there is growing concern that lack of clarity on the operationalization of the concept will limit its application. In this paper, we discuss the theory, research development and quantitative approaches in ecological and community resilience. Upon noting the lack of methods that quantify the complexities of the linked human and natural aspects of community resilience, we identify several promising approaches within the ecological resilience tradition that may be useful in filling these gaps. Further, we discuss the challenges for consolidating these approaches into a more integrated perspective for managing social-ecological systems. Published by Elsevier Ltd.

  15. Noninvasive Assessment of Biochemical and Mechanical Properties of Lumbar Discs Through Quantitative Magnetic Resonance Imaging in Asymptomatic Volunteers.

    PubMed

    Foltz, Mary H; Kage, Craig C; Johnson, Casey P; Ellingson, Arin M

    2017-11-01

    Intervertebral disc degeneration is a prevalent phenomenon associated with back pain. It is of critical clinical interest to discriminate disc health and identify early stages of degeneration. Traditional clinical T2-weighted magnetic resonance imaging (MRI), assessed using the Pfirrmann classification system, is subjective and fails to adequately capture initial degenerative changes. Emerging quantitative MRI techniques offer a solution. Specifically, T2* mapping images water mobility in the macromolecular network, and our preliminary ex vivo work shows high predictability of the disc's glycosaminoglycan content (s-GAG) and residual mechanics. The present study expands upon this work to predict the biochemical and biomechanical properties in vivo and assess their relationship with both age and Pfirrmann grade. Eleven asymptomatic subjects (range: 18-62 yrs) were enrolled and imaged using a 3T MRI scanner. T2-weighted images (Pfirrmann grade) and quantitative T2* maps (predict s-GAG and residual stress) were acquired. Surface maps based on the distribution of these properties were generated and integrated to quantify the surface volume. Correlational analyses were conducted to establish the relationship between each metric of disc health derived from the quantitative T2* maps with both age and Pfirrmann grade, where an inverse trend was observed. Furthermore, the nucleus pulposus (NP) signal in conjunction with volumetric surface maps provided the ability to discern differences during initial stages of disc degeneration. This study highlights the ability of T2* mapping to noninvasively assess the s-GAG content, residual stress, and distributions throughout the entire disc, which may provide a powerful diagnostic tool for disc health assessment.

  16. [Research progress and development trend of quantitative assessment techniques for urban thermal environment].

    PubMed

    Sun, Tie Gang; Xiao, Rong Bo; Cai, Yun Nan; Wang, Yao Wu; Wu, Chang Guang

    2016-08-01

    Quantitative assessment of the urban thermal environment has been a focus of urban climate and environmental science since the concept of the urban heat island was proposed. With the continual development of spatial information and computer simulation technology, substantial progress has been made in quantitative assessment techniques and methods. These techniques have evolved from statistical analysis of the urban-scale thermal environment using historical weather-station data to dynamic simulation and forecasting of the thermal environment at various scales. This study reviewed the development of ground meteorological observation, thermal infrared remote sensing and numerical simulation, and summarized the potential advantages and disadvantages, applicability and development trends of these techniques, aiming to provide fundamental knowledge for urban thermal environment assessment and optimization.

  17. 78 FR 38318 - Integrated Science Assessment for Lead

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-26

    ... ENVIRONMENTAL PROTECTION AGENCY [FRL-9827-4] Integrated Science Assessment for Lead AGENCY... availability of a final document titled, ``Integrated Science Assessment for Lead'' (EPA/600/R-10/075F). The... ``Integrated Science Assessment for Lead'' will be made available primarily through the Internet on the NCEA...

  18. Risk Assessment and Integration Team (RAIT) Portfolio Risk Analysis Strategy

    NASA Technical Reports Server (NTRS)

    Edwards, Michelle

    2010-01-01

    Impact at management level: qualitative assessment of risk criticality, in conjunction with risk consequence, likelihood, and severity, enables development of an "investment policy" for managing a portfolio of risks. Impact at research level: quantitative risk assessments enable researchers to develop risk mitigation strategies with meaningful risk reduction results, and the quantitative assessment approach provides useful risk mitigation information.

  19. Relationship between Plaque Echo, Thickness and Neovascularization Assessed by Quantitative and Semi-quantitative Contrast-Enhanced Ultrasonography in Different Stenosis Groups.

    PubMed

    Song, Yan; Feng, Jun; Dang, Ying; Zhao, Chao; Zheng, Jie; Ruan, Litao

    2017-12-01

    The aim of this study was to determine the relationship between plaque echo, thickness and neovascularization in different stenosis groups using quantitative and semi-quantitative contrast-enhanced ultrasound (CEUS) in patients with carotid atherosclerosis plaque. A total of 224 plaques were divided into mild stenosis (<50%; 135 plaques, 60.27%), moderate stenosis (50%-69%; 39 plaques, 17.41%) and severe stenosis (70%-99%; 50 plaques, 22.32%) groups. Quantitative and semi-quantitative methods were used to assess plaque neovascularization and determine the relationship between plaque echo, thickness and neovascularization. Correlation analysis revealed no relationship of neovascularization with plaque echo in the groups using either quantitative or semi-quantitative methods. Furthermore, there was no correlation of neovascularization with plaque thickness using the semi-quantitative method. The ratio of areas under the curve (RAUC) was negatively correlated with plaque thickness (r = -0.317, p = 0.001) in the mild stenosis group. With the quartile method, plaque thickness of the mild stenosis group was divided into four groups, with significant differences between the 1.5-2.2 mm and ≥3.5 mm groups (p = 0.002), 2.3-2.8 mm and ≥3.5 mm groups (p <0.001) and 2.9-3.4 mm and ≥3.5 mm groups (p <0.001). Both semi-quantitative and quantitative CEUS methods characterizing neovascularization of plaque are equivalent with respect to assessing relationships between neovascularization, echogenicity and thickness. However, the quantitative method could fail for plaque <3.5 mm because of motion artifacts. Copyright © 2017 World Federation for Ultrasound in Medicine and Biology. Published by Elsevier Inc. All rights reserved.
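    The thickness-neovascularization relationship reported here (e.g. r = -0.317 for RAUC versus thickness in the mild stenosis group) is an ordinary correlation coefficient. A minimal Pearson implementation with hypothetical plaque measurements (these are illustrative numbers, not the study's data):

    ```python
    import math
    from statistics import mean

    def pearson_r(x, y):
        """Plain Pearson product-moment correlation coefficient."""
        mx, my = mean(x), mean(y)
        num = sum((a - mx) * (b - my) for a, b in zip(x, y))
        den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                        sum((b - my) ** 2 for b in y))
        return num / den

    # Hypothetical mild-stenosis plaques: thickness (mm) vs. RAUC
    thickness = [1.5, 2.3, 2.9, 3.5, 4.1]
    rauc = [0.9, 0.8, 0.7, 0.5, 0.4]
    r = pearson_r(thickness, rauc)  # negative r mirrors the reported trend
    ```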

  20. Databases applicable to quantitative hazard/risk assessment-Towards a predictive systems toxicology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Waters, Michael; Jackson, Marcus

    2008-11-15

    The Workshop on The Power of Aggregated Toxicity Data addressed the requirement for distributed databases to support quantitative hazard and risk assessment. The authors have conceived and constructed with federal support several databases that have been used in hazard identification and risk assessment. The first of these databases, the EPA Gene-Tox Database was developed for the EPA Office of Toxic Substances by the Oak Ridge National Laboratory, and is currently hosted by the National Library of Medicine. This public resource is based on the collaborative evaluation, by government, academia, and industry, of short-term tests for the detection of mutagens and presumptive carcinogens. The two-phased evaluation process resulted in more than 50 peer-reviewed publications on test system performance and a qualitative database on thousands of chemicals. Subsequently, the graphic and quantitative EPA/IARC Genetic Activity Profile (GAP) Database was developed in collaboration with the International Agency for Research on Cancer (IARC). A chemical database driven by consideration of the lowest effective dose, GAP has served IARC for many years in support of hazard classification of potential human carcinogens. The Toxicological Activity Profile (TAP) prototype database was patterned after GAP and utilized acute, subchronic, and chronic data from the Office of Air Quality Planning and Standards. TAP demonstrated the flexibility of the GAP format for air toxics, water pollutants and other environmental agents. The GAP format was also applied to developmental toxicants and was modified to represent quantitative results from the rodent carcinogen bioassay. More recently, the authors have constructed: 1) the NIEHS Genetic Alterations in Cancer (GAC) Database which quantifies specific mutations found in cancers induced by environmental agents, and 2) the NIEHS Chemical Effects in Biological Systems (CEBS) Knowledgebase that integrates genomic and other biological data

  1. The Use of Quantitative SPECT/CT Imaging to Assess Residual Limb Health

    DTIC Science & Technology

    2016-10-01

    AWARD NUMBER: W81XWH-15-1-0669. TITLE: The Use of Quantitative SPECT/CT Imaging to Assess Residual Limb Health. DATES COVERED: 30 Sep 2015 - 29 Sep 2016. The report evaluates the utility of non-invasive imaging for assessing the impact of next-generation socket technologies on residual limb health following amputation.

  2. Assessing framing assumptions in quantitative health impact assessments: a housing intervention example.

    PubMed

    Mesa-Frias, Marco; Chalabi, Zaid; Foss, Anna M

    2013-09-01

    Health impact assessment (HIA) is often used to determine ex ante the health impact of an environmental policy or an environmental intervention. Underpinning any HIA is the framing assumption, which defines the causal pathways mapping environmental exposures to health outcomes. The sensitivity of the HIA to the framing assumptions is often ignored. A novel method based on fuzzy cognitive maps (FCM) is developed to quantify the framing assumptions in the assessment stage of an HIA, and is then applied to a housing intervention (tightening insulation) as a case study. Framing assumptions of the case study were identified through a literature search of Ovid Medline (1948-2011). The FCM approach was used to identify the key variables that have the most influence in an HIA. Changes in air-tightness, ventilation, indoor air quality and mould/humidity were identified as having the most influence on health. The FCM approach is widely applicable and can be used to inform the formulation of the framing assumptions in any quantitative HIA of environmental interventions. We argue that it is necessary to explore and quantify framing assumptions prior to conducting a detailed quantitative HIA during the assessment stage. Copyright © 2013 Elsevier Ltd. All rights reserved.
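    A fuzzy cognitive map like the one used in this HIA is a signed, weighted digraph whose node activations are iterated through a squashing function until they settle. A toy sketch with hypothetical concepts and weights (this uses one common update rule, x(t+1) = sigmoid(W x(t)); several variants exist, e.g. with a self-memory term):

    ```python
    import math

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    def fcm_run(W, state, steps=100, tol=1e-6):
        """Iterate a fuzzy cognitive map until activations converge.
        W[i][j] is the influence of concept j on concept i; activations
        stay in (0, 1) because of the sigmoid squashing function."""
        n = len(state)
        for _ in range(steps):
            nxt = [sigmoid(sum(W[i][j] * state[j] for j in range(n)))
                   for i in range(n)]
            if max(abs(a - b) for a, b in zip(nxt, state)) < tol:
                return nxt
            state = nxt
        return state

    # Hypothetical concepts for the insulation case:
    # 0=insulation, 1=air-tightness, 2=indoor air quality, 3=health
    W = [
        [0.0,  0.0, 0.0, 0.0],   # insulation has no incoming edges here
        [0.8,  0.0, 0.0, 0.0],   # insulation raises air-tightness
        [0.0, -0.7, 0.0, 0.0],   # air-tightness lowers indoor air quality
        [0.0,  0.0, 0.9, 0.0],   # indoor air quality raises health
    ]
    final = fcm_run(W, [1.0, 0.5, 0.5, 0.5])
    ```

    Note that under this pure-matrix rule, a concept with no incoming edges drifts to sigmoid(0) = 0.5; practical FCM tools often clamp driver concepts instead.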

  3. A qualitative and quantitative assessment for a bone marrow harvest simulator.

    PubMed

    Machado, Liliane S; Moraes, Ronei M

    2009-01-01

    Several approaches have been proposed to perform assessment in training simulators based on virtual reality. There are two kinds of assessment methods: offline and online. The main requirements for online training assessment methodologies applied to virtual reality systems are low computational complexity and high accuracy. Several approaches in the literature satisfy these requirements for general cases. A drawback of those approaches, however, is that they handle specific cases poorly, such as some medical procedures in which both quantitative and qualitative information is available for the assessment. In this paper, we present an approach to online training assessment based on a Modified Naive Bayes classifier that can handle qualitative and quantitative variables simultaneously. A specific medical case was simulated in a bone marrow harvest simulator. The results obtained were satisfactory and demonstrated the applicability of the method.

  4. Integrating quantitative PCR and Bayesian statistics in quantifying human adenoviruses in small volumes of source water.

    PubMed

    Wu, Jianyong; Gronewold, Andrew D; Rodriguez, Roberto A; Stewart, Jill R; Sobsey, Mark D

    2014-02-01

    Rapid quantification of viral pathogens in drinking and recreational water can help reduce waterborne disease risks. For this purpose, samples in small volume (e.g. 1L) are favored because of the convenience of collection, transportation and processing. However, the results of viral analysis are often subject to uncertainty. To overcome this limitation, we propose an approach that integrates Bayesian statistics, efficient concentration methods, and quantitative PCR (qPCR) to quantify viral pathogens in water. Using this approach, we quantified human adenoviruses (HAdVs) in eighteen samples of source water collected from six drinking water treatment plants. HAdVs were found in seven samples. In the other eleven samples, HAdVs were not detected by qPCR, but might have existed based on Bayesian inference. Our integrated approach that quantifies uncertainty provides a better understanding than conventional assessments of potential risks to public health, particularly in cases when pathogens may present a threat but cannot be detected by traditional methods. © 2013 Elsevier B.V. All rights reserved.
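    The Bayesian step can be illustrated with a simple grid posterior: given a qPCR non-detect, a Poisson sampling model still leaves nonzero probability that virus is present. The grid, flat prior, and 1 L volume below are hypothetical illustrations, not the paper's actual model:

    ```python
    import math

    def posterior_nondetect(prior_grid, prior_weights, volume):
        """Grid posterior over virus concentration c (copies/L) given a
        qPCR non-detect in `volume` litres, assuming Poisson sampling:
        P(non-detect | c) = exp(-c * volume)."""
        likelihood = [math.exp(-c * volume) for c in prior_grid]
        weights = [l * p for l, p in zip(likelihood, prior_weights)]
        z = sum(weights)                       # normalizing constant
        return [w / z for w in weights]

    # Hypothetical candidate concentrations and a flat prior over them
    grid = [0.0, 0.5, 1.0, 2.0, 5.0]           # copies per litre
    prior = [0.2] * 5
    post = posterior_nondetect(grid, prior, volume=1.0)
    # The posterior shifts toward low concentrations but does not
    # collapse to zero: absence of detection is not absence of virus.
    ```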

  5. Quantitative and qualitative approaches in the study of poverty and adolescent development: separation or integration?

    PubMed

    Leung, Janet T Y; Shek, Daniel T L

    2011-01-01

    This paper examines the use of quantitative and qualitative approaches to study the impact of economic disadvantage on family processes and adolescent development. Quantitative research has the merits of objectivity, good predictive and explanatory power, parsimony, precision and sophistication of analysis. Qualitative research, in contrast, provides a detailed, holistic, in-depth understanding of social reality and allows illumination of new insights. With the pragmatic considerations of methodological appropriateness, design flexibility, and situational responsiveness in responding to the research inquiry, a mixed methods approach could be a possibility of integrating quantitative and qualitative approaches and offers an alternative strategy to study the impact of economic disadvantage on family processes and adolescent development.

  6. Repeatability Assessment by ISO 11843-7 in Quantitative HPLC for Herbal Medicines.

    PubMed

    Chen, Liangmian; Kotani, Akira; Hakamata, Hideki; Tsutsumi, Risa; Hayashi, Yuzuru; Wang, Zhimin; Kusu, Fumiyo

    2015-01-01

    We have proposed an assessment method to estimate the measurement relative standard deviation (RSD) of chromatographic peaks in quantitative HPLC for herbal medicines using the methodology of ISO 11843 Part 7 (ISO 11843-7:2012), which provides detection limits stochastically. In quantitative HPLC with UV detection (HPLC-UV) of Scutellaria Radix for the determination of baicalin, the measurement RSD of baicalin obtained stochastically by ISO 11843-7:2012 fell within the 95% confidence interval of the RSD obtained statistically from repeated measurements (n = 6). Thus, our findings show that the method is applicable for estimating the repeatability of HPLC-UV for determining baicalin without repeated measurements. In addition, the allowable limit of the "System repeatability" in "Liquid Chromatography" regulated in a pharmacopoeia can be obtained by the present assessment method. Moreover, the present assessment method was also successfully applied to estimate the measurement RSDs of quantitative three-channel liquid chromatography with electrochemical detection (LC-3ECD) of Chrysanthemi Flos for determining caffeoylquinic acids and flavonoids. With the present repeatability assessment method, reliable measurement RSDs were obtained stochastically and the experimental time was remarkably reduced.
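    The comparison described, a stochastic RSD estimate checked against the interval from n = 6 repeats, rests on the standard chi-square confidence interval for a standard deviation. A sketch of the repeated-measurement side only (the peak-area values are hypothetical, and the chi-square quantiles are hard-coded for df = 5; the ISO 11843-7 stochastic estimate itself is not reproduced here):

    ```python
    from statistics import mean, stdev

    def rsd_with_ci(data, chi2_lo=0.831, chi2_hi=12.833):
        """Percent RSD from repeated measurements, plus a 95% CI on the
        underlying RSD via the chi-square interval for the SD.
        Default chi-square quantiles (0.025 and 0.975) are for df = 5,
        i.e. exactly n = 6 measurements."""
        n = len(data)
        assert n == 6, "default quantiles assume df = 5 (n = 6)"
        m, s = mean(data), stdev(data)
        rsd = 100.0 * s / m
        lo = rsd * ((n - 1) / chi2_hi) ** 0.5
        hi = rsd * ((n - 1) / chi2_lo) ** 0.5
        return rsd, lo, hi

    # Hypothetical baicalin peak areas from six replicate injections
    areas = [98.2, 99.1, 100.4, 99.8, 98.9, 100.1]
    rsd, lo, hi = rsd_with_ci(areas)
    ```

    A stochastic estimate "within the 95% confidence interval" then simply means it falls between `lo` and `hi`.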

  7. Integrating Quantitative Reasoning into STEM Courses Using an Energy and Environment Context

    NASA Astrophysics Data System (ADS)

    Myers, J. D.; Lyford, M. E.; Mayes, R. L.

    2010-12-01

    Many secondary and post-secondary science classes do not integrate math into their curriculum, while math classes commonly teach concepts without meaningful context. Consequently, students lack basic quantitative skills and the ability to apply them in real-world contexts. For the past three years, a Wyoming Department of Education funded Math Science Partnership at the University of Wyoming (UW) has brought together middle and high school science and math teachers to model how math and science can be taught together in a meaningful way. The UW QR-STEM project emphasizes the importance of Quantitative Reasoning (QR) to student success in Science, Technology, Engineering and Mathematics (STEM). To provide a social context, QR-STEM has focused on energy and the environment. In particular, the project has examined how QR and STEM concepts play critical roles in many of the current global challenges of energy and environment. During four 3-day workshops each summer and over several virtual and short face-to-face meetings during the academic year, UW and community college science and math faculty work with math and science teachers from middle and high schools across the state to improve QR instruction in math and science classes. During the summer workshops, faculty from chemistry, physics, earth sciences, biology and math lead sessions to: 1) improve the basic science content knowledge of teachers; 2) improve teacher understanding of math and statistical concepts, 3) model how QR can be taught by engaging teachers in sessions that integrate math and science in an energy and environment context; and 4) focus curricula using Understanding by Design to identify enduring understandings on which to center instructional strategies and assessment. In addition to presenting content, faculty work with teachers as they develop classroom lessons and larger units to be implemented during the school year. Teachers form interdisciplinary groups which often consist of math and

  8. Filling the knowledge gap: Integrating quantitative genetics and genomics in graduate education and outreach

    USDA-ARS?s Scientific Manuscript database

    The genomics revolution provides vital tools to address global food security. Yet to be incorporated into livestock breeding, molecular techniques need to be integrated into a quantitative genetics framework. Within the U.S., with shrinking faculty numbers with the requisite skills, the capacity to ...

  9. Assessment of vulnerability in karst aquifers using a quantitative integrated numerical model: catchment characterization and high resolution monitoring - Application to semi-arid regions- Lebanon.

    NASA Astrophysics Data System (ADS)

    Doummar, Joanna; Aoun, Michel; Andari, Fouad

    2016-04-01

    Karst aquifers are highly heterogeneous and characterized by a duality of recharge (concentrated and fast versus diffuse and slow) and a duality of flow, which directly influences groundwater flow and spring responses. Given this heterogeneity in flow and infiltration, karst aquifers do not always obey standard hydraulic laws, so the assessment of their vulnerability proves challenging. Studies have shown that aquifer vulnerability is strongly governed by recharge to groundwater. On the other hand, specific parameters appear to play a major role in the spatial and temporal distribution of infiltration in a karst system, greatly influencing the discharge rates observed at a karst spring and, consequently, the vulnerability of the spring. This heterogeneity can only be depicted using an integrated numerical model to quantify recharge spatially and assess the spatial and temporal vulnerability of a catchment to contamination. In the framework of a three-year PEER NSF/USAID funded project, the vulnerability of a karst catchment in Lebanon is assessed quantitatively using a numerical approach. The project also aims to refine actual evapotranspiration rates and the spatial recharge distribution in a semi-arid environment. For this purpose, a monitoring network has been installed since July 2014 on two pilot karst catchments (drained by the Qachqouch and Assal Springs) to collect high-resolution data for an integrated catchment numerical model built with MIKE SHE (DHI), including climate, the unsaturated zone, and the saturated zone. Catchment characterization essential for the model included geological mapping and a survey of karst features (e.g., dolines), as they contribute to fast flow. Tracer experiments were performed under different flow conditions (snowmelt and low flow) to delineate the catchment area and reveal groundwater velocities and the response to snowmelt events. An assessment of spring response after precipitation events allowed the estimation of the

  10. Integrated Exposure Assessment Monitoring.

    ERIC Educational Resources Information Center

    Behar, Joseph V.; And Others

    1979-01-01

    Integrated Exposure Assessment Monitoring is the coordination of environmental (air, water, land, and crops) monitoring networks to collect systematically pollutant exposure data for a specific receptor, usually man. (Author/BB)

  11. Integrated approach for confidence-enhanced quantitative analysis of herbal medicines, Cistanche salsa as a case.

    PubMed

    Liu, Wenjing; Song, Qingqing; Yan, Yu; Liu, Yao; Li, Peng; Wang, Yitao; Tu, Pengfei; Song, Yuelin; Li, Jun

    2018-08-03

    Although far from perfect, it is practical to assess the quality of a given herbal medicine (HM) through simultaneous determination of a panel of components. However, the confidence of quantitative outcomes from LC-MS/MS platforms is threatened by several technical barriers, such as chemical degradation, wide polarity ranges, wide concentration spans, and identity misrecognition. Herein, we attempted to circumvent these obstacles by integrating several fit-for-purpose techniques, including online extraction (OLE), serially coupled reversed-phase LC-hydrophilic interaction liquid chromatography (RPLC-HILIC), tailored multiple reaction monitoring (MRM), and relative response vs. collision energy curve (RRCEC) matching. Confidence-enhanced quantitative analysis of Cistanche salsa (Csa), a well-known psammophytic species and tonic herbal medicine, was conducted as a proof of concept. The OLE module was deployed to prevent chemical degradation, in particular E/Z-configuration transformation of phenylethanoid glycosides. Satisfactory retention was achieved for each analyte regardless of polarity, owing to successive passage through the RPLC and HILIC columns. Optimum MRM parameters for the minor components, alongside deliberately detuned parameters for the abundant ingredients, ensured that all contents fell within the linear ranges. Unequivocal assignment of the captured signals was achieved by matching retention times, ion transitions and, more importantly, RRCECs between authentic compounds and suspect peaks. Diverse validation assays demonstrated the newly developed method to be reliable. In particular, the distribution of mannitol rather than galactitol was disclosed, although these isomers show identical retention times and ion transitions. The contents of 21 compounds of interest were definitively determined in Csa as well as two analogous species, and the quantitative patterns varied greatly not only among different species but also among different Csa samples. Together, the

  12. NecroQuant: quantitative assessment of radiological necrosis

    NASA Astrophysics Data System (ADS)

    Hwang, Darryl H.; Mohamed, Passant; Varghese, Bino A.; Cen, Steven Y.; Duddalwar, Vinay

    2017-11-01

    Clinicians can now objectively quantify tumor necrosis by Hounsfield units and enhancement characteristics from multiphase contrast-enhanced CT imaging. NecroQuant has been designed to work as part of a radiomics pipeline. The software is a departure from the conventional qualitative assessment of tumor necrosis, as it provides the user (radiologists and researchers) a simple interface to precisely and interactively define and measure necrosis in contrast-enhanced CT images. Although the software is tested here on renal masses, it can be re-configured to assess tumor necrosis across a variety of tumors from different body sites, providing a generalized, open, portable, and extensible quantitative analysis platform that is widely applicable across cancer types to quantify tumor necrosis.
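    A simplified version of the enhancement-thresholding idea: classify ROI voxels as necrotic when their contrast enhancement (post-contrast minus pre-contrast HU) falls below a cutoff, then report the necrotic fraction. The 20 HU threshold and voxel values below are illustrative, not NecroQuant's actual parameters:

    ```python
    def necrosis_fraction(delta_hu_roi, enhancement_threshold=20):
        """Fraction of ROI voxels treated as necrotic, defined here as
        voxels whose HU enhancement (post - pre contrast) falls below
        the threshold, i.e. non-enhancing tissue."""
        necrotic = sum(1 for d in delta_hu_roi if d < enhancement_threshold)
        return necrotic / len(delta_hu_roi)

    # Hypothetical per-voxel enhancement values (HU) inside a renal mass ROI
    roi_delta_hu = [5, 12, 45, 60, 8, 3, 55, 70, 15, 40]
    frac = necrosis_fraction(roi_delta_hu)  # 5 of 10 voxels below 20 HU
    ```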

  13. Quantitative assessment of desertification in south of Iran using MEDALUS method.

    PubMed

    Sepehr, A; Hassanli, A M; Ekhtesasi, M R; Jamali, J B

    2007-11-01

    The main aim of this study was the quantitative assessment of the desertification process in the case study area of the Fidoye-Garmosht plain (Southern Iran). Based on the MEDALUS approach and the characteristics of the study area, a regional model was developed using GIS. Six main factors or indicators of desertification, including soil, climate, erosion, plant cover, groundwater and management, were considered for evaluation. Then several sub-indicators affecting the quality of each main indicator were identified. Based on the MEDALUS approach, each sub-indicator was quantified according to its quality and given a weighting of between 1.0 and 2.0. ArcGIS 9 was used to analyze and prepare the layers of quality maps, using the geometric mean to integrate the individual sub-indicator maps. In turn, the geometric mean of all six quality maps was used to generate a single desertification status map. Results showed that 12% of the area is classified as very severe, 81% as severe and 7% as moderately affected by desertification. In addition, the plant cover and groundwater indicators were the most important factors affecting the desertification process in the study area. The model developed may be used to assess the desertification process and distinguish areas sensitive to desertification in the study region and in regions with similar characteristics.
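    The integration step is a geometric mean of sub-indicator scores, each ranging from 1.0 (best quality) to 2.0 (worst). A sketch for a single map cell; the cell scores and the severity class breaks below are illustrative, not the study's calibrated values:

    ```python
    def quality_index(values):
        """Geometric mean of indicator scores (each in [1.0, 2.0]),
        as used in the MEDALUS scheme to combine quality layers."""
        product = 1.0
        for v in values:
            product *= v
        return product ** (1.0 / len(values))

    # Six quality-layer scores for one hypothetical map cell
    cell = {"soil": 1.4, "climate": 1.7, "erosion": 1.5,
            "plant_cover": 1.8, "groundwater": 1.9, "management": 1.6}
    dsi = quality_index(list(cell.values()))

    # Illustrative class breaks only; real MEDALUS maps use calibrated ones
    severity = ("moderate" if dsi < 1.5
                else "severe" if dsi < 1.7
                else "very severe")
    ```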

  14. Quantitative Assessment the Relationship between p21 rs1059234 Polymorphism and Cancer Risk.

    PubMed

    Huang, Yong-Sheng; Fan, Qian-Qian; Li, Chuang; Nie, Meng; Quan, Hong-Yang; Wang, Lin

    2015-01-01

    p21 is a cyclin-dependent kinase inhibitor that can arrest cell proliferation and serves as a tumor suppressor. Although many studies have been published assessing the relationship between the p21 rs1059234 polymorphism and various cancer risks, no definite conclusion on this association has been reached. To derive a more precise quantitative assessment of the relationship, a large-scale meta-analysis of 5,963 cases and 8,405 controls from 16 eligible published case-control studies was performed. Our analysis suggested that rs1059234 was not associated with overall cancer risk under either the dominant model [(T/T+C/T) vs C/C, OR=1.00, 95% CI: 0.84-1.18] or the recessive model [T/T vs (C/C+C/T), OR=1.03, 95% CI: 0.93-1.15]. However, further stratified analysis showed that rs1059234 was strongly associated with the risk of squamous cell carcinoma of the head and neck (SCCHN). Thus, larger-scale primary studies are still required to further evaluate the interaction of the p21 rs1059234 polymorphism and cancer risk in specific cancer subtypes.
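    Pooled odds ratios of the kind quoted above (e.g. OR = 1.00, 95% CI 0.84-1.18) come from inverse-variance meta-analysis on the log-OR scale. A minimal fixed-effect sketch with hypothetical 2x2 tables (real meta-analyses would also test heterogeneity and possibly use a random-effects model):

    ```python
    import math

    def pooled_or(tables):
        """Fixed-effect (inverse-variance) pooled odds ratio from 2x2
        tables (a, b, c, d) = (case-exposed, case-unexposed,
        control-exposed, control-unexposed)."""
        num = den = 0.0
        for a, b, c, d in tables:
            log_or = math.log((a * d) / (b * c))
            var = 1 / a + 1 / b + 1 / c + 1 / d   # Woolf's variance of log OR
            num += log_or / var                    # weight = 1 / variance
            den += 1 / var
        mean_log = num / den
        se = (1 / den) ** 0.5
        ci = (math.exp(mean_log - 1.96 * se), math.exp(mean_log + 1.96 * se))
        return math.exp(mean_log), ci

    # Two hypothetical case-control studies of a genotype-cancer association
    studies = [(40, 60, 35, 65), (55, 45, 50, 50)]
    or_hat, (lo, hi) = pooled_or(studies)
    ```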

  15. A quantitative flood risk analysis methodology for urban areas with integration of social research data

    NASA Astrophysics Data System (ADS)

    Escuder-Bueno, I.; Castillo-Rodríguez, J. T.; Zechner, S.; Jöbstl, C.; Perales-Momparler, S.; Petaccia, G.

    2012-09-01

    Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces were developed in the period 2009-2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis to estimate current risk from a social perspective and identify tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision-making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of the developed approach. The main advantage of the methodology presented herein is that it provides a quantitative estimation of flooding risk before and after investing in non-structural risk mitigation measures, which can be of great interest to decision makers as it provides rational and solid information.

  16. An approach for integrating toxicogenomic data in risk assessment: The dibutyl phthalate case study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Euling, Susan Y., E-mail: euling.susan@epa.gov; Thompson, Chad M.; Chiu, Weihsueh A.

    An approach for evaluating and integrating genomic data in chemical risk assessment was developed based on the lessons learned from performing a case study for the chemical dibutyl phthalate. A case study prototype approach was first developed in accordance with EPA guidance and recommendations of the scientific community. Dibutyl phthalate (DBP) was selected for the case study exercise. The scoping phase of the DBP case study was conducted by considering the available DBP genomic data, taken together with the entire data set, for whether they could inform various risk assessment aspects, such as toxicodynamics, toxicokinetics, and dose–response. A description of weighing the available DBP data set for utility in risk assessment provides an example for considering genomic data in future chemical assessments. As a result of the scoping process, two questions were selected to focus the case study exercise: 1) Do the DBP toxicogenomic data inform the mechanisms or modes of action? and 2) Do they inform the interspecies differences in toxicodynamics? Principles of the general approach include considering the genomics data in conjunction with all other data to determine their ability to inform the various qualitative and/or quantitative aspects of risk assessment, and evaluating the relationship between the available genomic and toxicity outcome data with respect to study comparability and phenotypic anchoring. Based on experience from the DBP case study, recommendations and a general approach for integrating genomic data in chemical assessment were developed to advance the broader effort to utilize 21st century data in risk assessment. Highlights: • Performed DBP case study for integrating genomic data in risk assessment • Present approach for considering genomic data in chemical risk assessment • Present recommendations for use of genomic data in chemical risk assessment.

  17. Combined visual and semi-quantitative assessment of 123I-FP-CIT SPECT for the diagnosis of dopaminergic neurodegenerative diseases.

    PubMed

    Ueda, Jun; Yoshimura, Hajime; Shimizu, Keiji; Hino, Megumu; Kohara, Nobuo

    2017-07-01

    Visual and semi-quantitative assessments of 123I-FP-CIT single-photon emission computed tomography (SPECT) are useful for the diagnosis of dopaminergic neurodegenerative diseases (dNDD), including Parkinson's disease, dementia with Lewy bodies, progressive supranuclear palsy, multiple system atrophy, and corticobasal degeneration. However, the diagnostic value of combined visual and semi-quantitative assessment in dNDD remains unclear. Among 239 consecutive patients with a newly diagnosed possible parkinsonian syndrome who underwent 123I-FP-CIT SPECT in our medical center, 114 patients with a disease duration less than 7 years were diagnosed as dNDD with the established criteria or as non-dNDD according to clinical judgment. We retrospectively examined their clinical characteristics and visual and semi-quantitative assessments of 123I-FP-CIT SPECT. The striatal binding ratio (SBR) was used as a semi-quantitative measure of 123I-FP-CIT SPECT. We calculated the sensitivity and specificity of visual assessment alone, semi-quantitative assessment alone, and combined visual and semi-quantitative assessment for the diagnosis of dNDD. SBR was correlated with visual assessment. Some dNDD patients with a normal visual assessment had an abnormal SBR, and vice versa. There was no statistically significant difference between sensitivity of the diagnosis with visual assessment alone and semi-quantitative assessment alone (91.2 vs. 86.8%, respectively, p = 0.29). Combined visual and semi-quantitative assessment demonstrated superior sensitivity (96.7%) to visual assessment (p = 0.03) or semi-quantitative assessment (p = 0.003) alone with equal specificity. Visual and semi-quantitative assessments of 123I-FP-CIT SPECT are helpful for the diagnosis of dNDD, and combined visual and semi-quantitative assessment shows superior sensitivity with equal specificity.
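
    The combined rule in this study amounts to calling a scan abnormal if either the visual read or the SBR is abnormal, which is why sensitivity can rise without specificity falling when the two tests miss different patients. A sketch with hypothetical per-patient flags (the helper `sens_spec` and all data are illustrative):

```python
def sens_spec(pred, truth):
    """Sensitivity and specificity of boolean predictions vs ground truth."""
    tp = sum(p and t for p, t in zip(pred, truth))
    tn = sum((not p) and (not t) for p, t in zip(pred, truth))
    fn = sum((not p) and t for p, t in zip(pred, truth))
    fp = sum(p and (not t) for p, t in zip(pred, truth))
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical per-patient flags: abnormal visual read, abnormal SBR, true dNDD
visual = [True, True, False, False, True, False]
sbr    = [True, False, True, False, True, False]
truth  = [True, True, True, False, True, False]

# "Either test abnormal" rule
combined = [v or s for v, s in zip(visual, sbr)]
sens_c, spec_c = sens_spec(combined, truth)
```

    In this toy data each single test misses one different patient (sensitivity 0.75 each), while the OR-combination catches all four true cases without any false positives.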

  18. Objective, Quantitative, Data-Driven Assessment of Chemical Probes.

    PubMed

    Antolin, Albert A; Tym, Joseph E; Komianou, Angeliki; Collins, Ian; Workman, Paul; Al-Lazikani, Bissan

    2018-02-15

    Chemical probes are essential tools for understanding biological systems and for target validation, yet selecting probes for biomedical research is rarely based on objective assessment of all potential compounds. Here, we describe the Probe Miner: Chemical Probes Objective Assessment resource, capitalizing on the plethora of public medicinal chemistry data to empower quantitative, objective, data-driven evaluation of chemical probes. We assess >1.8 million compounds for their suitability as chemical tools against 2,220 human targets and dissect the biases and limitations encountered. Probe Miner represents a valuable resource to aid the identification of potential chemical probes, particularly when used alongside expert curation. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  19. Multiparametric Quantitative Ultrasound Imaging in Assessment of Chronic Kidney Disease.

    PubMed

    Gao, Jing; Perlman, Alan; Kalache, Safa; Berman, Nathaniel; Seshan, Surya; Salvatore, Steven; Smith, Lindsey; Wehrli, Natasha; Waldron, Levi; Kodali, Hanish; Chevalier, James

    2017-11-01

    To evaluate the value of multiparametric quantitative ultrasound imaging in assessing chronic kidney disease (CKD) using kidney biopsy pathologic findings as reference standards. We prospectively measured multiparametric quantitative ultrasound markers with grayscale, spectral Doppler, and acoustic radiation force impulse imaging in 25 patients with CKD before kidney biopsy and 10 healthy volunteers. Based on all pathologic (glomerulosclerosis, interstitial fibrosis/tubular atrophy, arteriosclerosis, and edema) scores, the patients with CKD were classified into mild (no grade 3 and <2 of grade 2) and moderate to severe (at least 2 of grade 2 or 1 of grade 3) CKD groups. Multiparametric quantitative ultrasound parameters included kidney length, cortical thickness, pixel intensity, parenchymal shear wave velocity, intrarenal artery peak systolic velocity (PSV), end-diastolic velocity (EDV), and resistive index. We tested the difference in quantitative ultrasound parameters among mild CKD, moderate to severe CKD, and healthy controls using analysis of variance, analyzed correlations of quantitative ultrasound parameters with pathologic scores and the estimated glomerular filtration rate (GFR) using Pearson correlation coefficients, and examined the diagnostic performance of quantitative ultrasound parameters in determining moderate CKD and an estimated GFR of less than 60 mL/min/1.73 m² using receiver operating characteristic curve analysis. There were significant differences in cortical thickness, pixel intensity, PSV, and EDV among the 3 groups (all P < .01). Among quantitative ultrasound parameters, the top areas under the receiver operating characteristic curves for PSV and EDV were 0.88 and 0.97, respectively, for determining pathologic moderate to severe CKD, and 0.76 and 0.86 for an estimated GFR of less than 60 mL/min/1.73 m². Moderate to good correlations were found for PSV, EDV, and pixel intensity with pathologic scores and estimated GFR.
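
    The area-under-the-ROC-curve figures reported for PSV and EDV can be reproduced conceptually with the rank-based (Mann-Whitney) estimator: the AUC is the probability that a randomly chosen diseased case scores higher than a randomly chosen non-diseased one. The velocity values below are hypothetical, not the study's measurements:

```python
def auc(scores_pos, scores_neg):
    """Empirical AUC = P(score_pos > score_neg); ties count half."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical EDV values (cm/s): moderate-to-severe CKD vs mild/controls
edv_severe = [8.0, 9.5, 7.2, 10.1]
edv_mild   = [12.0, 11.5, 9.0, 13.2]

# Lower EDV indicates worse disease here, so score by negated EDV
auc_edv = auc([-x for x in edv_severe], [-x for x in edv_mild])
```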

  20. Quantitative Measurement of Integrated Band Intensities of Isoprene and Formaldehyde

    NASA Astrophysics Data System (ADS)

    Brauer, Carolyn S.; Johnson, Timothy J.; Blake, Thomas A.; Sams, Robert L.

    2013-06-01

    The OH-initiated oxidation of isoprene, which is one of the primary volatile organic compounds produced by vegetation, is a major source of atmospheric formaldehyde and other oxygenated organics. Both molecules are also known products of biomass burning. Absorption coefficients and integrated band intensities for isoprene and formaldehyde are reported in the 600-6500 cm^{-1} region. The pressure-broadened (1 atmosphere N_2) spectra were recorded at 278, 298 and 323 K in a 19.96 cm path length cell at 0.112 cm^{-1} resolution, using a Bruker 66V FTIR. Composite spectra are composed of a minimum of seven pressures at each temperature for both molecules. These data are part of the PNNL Spectral Database, which contains quantitative spectra of over 600 molecules. These quantitative spectra facilitate atmospheric monitoring for both remote and in situ sensing, and such applications will be discussed. T. J. Johnson, L. T. M. Profeta, R. L. Sams, D. W. T. Griffith, and R. L. Yokelson, Vibrational Spectroscopy 53(1), 97-102 (2010).
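
    An integrated band intensity is simply the area under the absorption-coefficient curve over the band's wavenumber range. A sketch with a synthetic Lorentzian band (all parameters are illustrative, not PNNL data):

```python
import numpy as np

# Synthetic absorption-coefficient spectrum: a single Lorentzian band
nu = np.linspace(1600.0, 1800.0, 2001)      # wavenumber grid (cm^-1)
center, hwhm, peak = 1700.0, 10.0, 0.5      # illustrative band parameters
alpha = peak * hwhm**2 / ((nu - center) ** 2 + hwhm**2)

# Integrated band intensity: trapezoidal area under the band
band_intensity = float((0.5 * (alpha[1:] + alpha[:-1]) * np.diff(nu)).sum())
```

    For a Lorentzian, the analytic area over an infinite window is pi * peak * hwhm (about 15.7 here); truncating the integration at +/- 10 half-widths gives slightly less, which is why real measurements must choose band limits carefully.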

  1. The Quantitative Reasoning for College Science (QuaRCS) Assessment: Emerging Themes from 5 Years of Data

    NASA Astrophysics Data System (ADS)

    Follette, Katherine; Dokter, Erin; Buxner, Sanlyn

    2018-01-01

    The Quantitative Reasoning for College Science (QuaRCS) Assessment is a validated assessment instrument that was designed to measure changes in students' quantitative reasoning skills, attitudes toward mathematics, and ability to accurately assess their own quantitative abilities. It has been administered to more than 5,000 students at a variety of institutions at the start and end of a semester of general education college science instruction. I will begin by briefly summarizing our published work surrounding validation of the instrument and identification of underlying attitudinal factors (composite variables identified via factor analysis) that predict 50% of the variation in students' scores on the assessment. I will then discuss more recent unpublished work, including: (1) Development and validation of an abbreviated version of the assessment (The QuaRCS Light), which results in marked improvements in students' ability to maintain a high effort level throughout the assessment and has broad implications for quantitative reasoning assessments in general, and (2) Our efforts to revise the attitudinal portion of the assessment to better assess math anxiety level, another key factor in student performance on numerical assessments.

  2. Global and local health burden trade-off through the hybridisation of quantitative microbial risk assessment and life cycle assessment to aid water management.

    PubMed

    Kobayashi, Yumi; Peters, Greg M; Ashbolt, Nicholas J; Heimersson, Sara; Svanström, Magdalena; Khan, Stuart J

    2015-08-01

    Life cycle assessment (LCA) and quantitative risk assessment (QRA) are commonly used to evaluate potential human health impacts associated with proposed or existing infrastructure and products. Each approach has a distinct objective and, consequently, their conclusions may be inconsistent or contradictory. It is proposed that the integration of elements of QRA and LCA may provide a more holistic approach to health impact assessment. Here we examine the possibility of merging LCA assessed human health impacts with quantitative microbial risk assessment (QMRA) for waterborne pathogen impacts, expressed with the common health metric, disability adjusted life years (DALYs). The example of a recent large-scale water recycling project in Sydney, Australia was used to identify and demonstrate the potential advantages and current limitations of this approach. A comparative analysis of two scenarios - with and without the development of this project - was undertaken for this purpose. LCA and QMRA were carried out independently for the two scenarios to compare human health impacts, as measured by DALYs lost per year. LCA results suggested that construction of the project would lead to an increased number of DALYs lost per year, while estimated disease burden resulting from microbial exposures indicated that it would result in the loss of fewer DALYs per year than the alternative scenario. By merging the results of the LCA and QMRA, we demonstrate the advantages in providing a more comprehensive assessment of human disease burden for the two scenarios, in particular, the importance of considering the results of both LCA and QRA in a comparative assessment of decision alternatives to avoid problem shifting. The application of DALYs as a common measure between the two approaches was found to be useful for this purpose. Copyright © 2015 Elsevier Ltd. All rights reserved.
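
    Expressing both streams in DALYs makes the merge itself trivial, and it exposes the problem-shifting point: each method alone can favor a different scenario. The numbers below are hypothetical, not the Sydney case-study results:

```python
def total_dalys(lca, qmra):
    """Combine life-cycle and microbial-risk burdens (DALYs/year)."""
    return lca + qmra

# Hypothetical annual burdens for the two scenarios
with_project    = {"lca": 2.4, "qmra": 0.3}   # construction burden up, infections down
without_project = {"lca": 1.1, "qmra": 2.0}   # less infrastructure, more exposure

burden_with = total_dalys(**with_project)        # 2.7 DALYs/year
burden_without = total_dalys(**without_project)  # 3.1 DALYs/year
# LCA alone favors "without"; QMRA alone favors "with"; the sum decides.
```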

  3. Assessment of Integrated Nozzle Performance

    NASA Technical Reports Server (NTRS)

    Lambert, H. H.; Mizukami, M.

    1999-01-01

    This presentation highlights the activities that researchers at the NASA Lewis Research Center (LeRC) have been and will be involved in to assess integrated nozzle performance. Three different test activities are discussed. First, the results of the Propulsion Airframe Integration for High Speed Research 1 (PAIHSR1) study are presented. The PAIHSR1 experiment was conducted in the LeRC 9 ft x 15 ft wind tunnel from December 1991 to January 1992. Second, an overview of the proposed Mixer/ejector Inlet Distortion Study (MIDIS-E) is presented. The objective of MIDIS-E is to assess the effects of applying discrete disturbances to the ejector inlet flow on the acoustic and aero-performance of a mixer/ejector nozzle. Finally, an overview of the High-Lift Engine Aero-acoustic Technology (HEAT) test is presented. The HEAT test is a cooperative effort between the propulsion system and high-lift device research communities to assess wing/nozzle integration effects. The experiment is scheduled for FY94 in the NASA Ames Research Center (ARC) 40 ft x 80 ft Low Speed Wind Tunnel (LSWT).

  4. Adapting the Quebecois method for assessing implementation to the French National Alzheimer Plan 2008–2012: lessons for gerontological services integration

    PubMed Central

    Somme, Dominique; Trouvé, Hélène; Perisset, Catherine; Corvol, Aline; Ankri, Joël; Saint-Jean, Olivier; de Stampa, Matthieu

    2014-01-01

    Introduction: Many countries face ageing-related demographic and epidemiological challenges, notably neurodegenerative disorders, due to the multiple care services they require, thereby pleading for a more integrated system of care. The integrated Quebecois method issued from the Programme of Research to Integrate Services for the Maintenance of Autonomy inspired a French pilot experiment and the National Alzheimer Plan 2008–2012. Implementation of the method was rated with an evaluation grid adapted to assess its successive degrees of completion. Discussion: The approaching end of the president's term led to the method's institutionalization (2011–2012), before the implementation study ended. When the government changed, the study was interrupted. The results extracted from that 'lost' study (presented herein) have, nonetheless, 'found' some key lessons. Key lessons/conclusion: It was possible to implement a Quebecois integrated-care method in France. We describe the lessons and pitfalls encountered in adapting this evaluation tool. This process is necessarily multidisciplinary and requires a test phase. A simple tool for quantitative assessment of integration was obtained. The first assessment of the tool was unsatisfactory but requires further studies. In the meantime, we recommend using mixed methodologies to assess the services integration level. PMID:24959112

  5. Preamble to the Integrated Science Assessments (ISA)

    EPA Science Inventory

    The Preamble to the Integrated Science Assessments, or "Preamble", is an overview document outlining the basic steps and criteria used in developing the Integrated Science Assessments (ISA). Previously included as part of the ISA, it will now be referenced by each ISA as...

  6. The Integrated Performance Assessment (IPA): Connecting Assessment to Instruction and Learning

    ERIC Educational Resources Information Center

    Adair-Hauck, Bonnie; Glisan, Eileen W.; Koda, Keiko; Swender, Elvira B.; Sandrock, Paul

    2006-01-01

    This article reports on "Beyond the OPI: Integrated Performance Assessment (IPA) Design Project," a three-year (1997-2000) research initiative sponsored by the U.S. Department of Education International Research and Studies Program. The primary goal of the project was to develop an integrated skills assessment prototype that would measure…

  7. Methods for assessing geodiversity

    NASA Astrophysics Data System (ADS)

    Zwoliński, Zbigniew; Najwer, Alicja; Giardino, Marco

    2017-04-01

    The accepted systematics of geodiversity assessment methods will be presented in three categories: qualitative, quantitative and qualitative-quantitative. Qualitative methods are usually descriptive methods that are suited to nominal and ordinal data. Quantitative methods use a different set of parameters and indicators to determine the characteristics of geodiversity in the area being researched. Qualitative-quantitative methods are a good combination of the collection of quantitative data (i.e. digital) and cause-effect data (i.e. relational and explanatory). It seems that at the current stage of the development of geodiversity research methods, qualitative-quantitative methods are the most advanced and best assess the geodiversity of the study area. Their particular advantage is the integration of data from different sources and with different substantive content. Among the distinguishing features of the quantitative and qualitative-quantitative methods for assessing geodiversity are their wide use within geographic information systems, both at the stage of data collection and data integration, as well as numerical processing and their presentation. The unresolved problem for these methods, however, is the possibility of their validation. It seems that currently the best method of validation is direct comparison with field observations. Looking to the next few years, the development of qualitative-quantitative methods connected with cognitive issues should be expected, oriented towards ontology and the Semantic Web.

  8. Challenges and Opportunities for Integrating Social Science Perspectives into Climate and Global Change Assessments

    NASA Astrophysics Data System (ADS)

    Larson, E. K.; Li, J.; Zycherman, A.

    2017-12-01

    Integration of social science into climate and global change assessments is fundamental for improving understanding of the drivers, impacts and vulnerability of climate change, and the social, cultural and behavioral challenges related to climate change responses. This requires disciplinary and interdisciplinary knowledge as well as integrational and translational tools for linking this knowledge with the natural and physical sciences. The USGCRP's Social Science Coordinating Committee (SSCC) is tasked with this challenge and is working to integrate relevant social, economic and behavioral knowledge into processes like sustained assessments. This presentation will discuss outcomes from a recent SSCC workshop, "Social Science Perspectives on Climate Change" and their applications to sustained assessments. The workshop brought academic social scientists from four disciplines - anthropology, sociology, geography and archaeology - together with federal scientists and program managers to discuss three major research areas relevant to the USGCRP and climate assessments: (1) innovative tools, methods, and analyses to clarify the interactions of human and natural systems under climate change, (2) understanding of factors contributing to differences in social vulnerability between and within communities under climate change, and (3) social science perspectives on drivers of global climate change. These disciplines, collectively, emphasize the need to consider socio-cultural, political, economic, geographic, and historic factors, and their dynamic interactions, to understand climate change drivers, social vulnerability, and mitigation and adaptation responses. They also highlight the importance of mixed quantitative and qualitative methods to explain impacts, vulnerability, and responses at different time and spatial scales. This presentation will focus on major contributions of the social sciences to climate and global change research. 
We will discuss future directions for

  9. Hydrogen quantitative risk assessment workshop proceedings.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Groth, Katrina M.; Harris, Aaron P.

    2013-09-01

    The Quantitative Risk Assessment (QRA) Toolkit Introduction Workshop was held at Energetics on June 11-12. The workshop was co-hosted by Sandia National Laboratories (Sandia) and HySafe, the International Association for Hydrogen Safety. The objective of the workshop was twofold: (1) Present a hydrogen-specific methodology and toolkit (currently under development) for conducting QRA to support the development of codes and standards and safety assessments of hydrogen-fueled vehicles and fueling stations, and (2) Obtain feedback on the needs of early-stage users (hydrogen as well as potential leveraging for Compressed Natural Gas [CNG] and Liquefied Natural Gas [LNG]) and set priorities for "Version 1" of the toolkit in the context of the commercial evolution of hydrogen fuel cell electric vehicles (FCEV). The workshop consisted of an introduction and three technical sessions: Risk-Informed Development and Approach; CNG/LNG Applications; and Introduction of a Hydrogen-Specific QRA Toolkit.

  10. The integrated landscape assessment project

    Treesearch

    Miles A. Hemstrom; Janine Salwasser; Joshua Halofsky; Jimmy Kagan; Cyndi Comfort

    2012-01-01

    The Integrated Landscape Assessment Project (ILAP) is a three-year effort that produces information, models, data, and tools to help land managers, policymakers, and others examine mid- to broad-scale (e.g., watersheds to states and larger areas) prioritization of land management actions, perform landscape assessments, and estimate potential effects of management...

  11. Obstacles to the coordination of delivering integrated prenatal HIV, syphilis and hepatitis B testing services in Guangdong: using a needs assessment approach.

    PubMed

    Xia, Jianhong; Rutherford, Shannon; Ma, Yuanzhu; Wu, Li; Gao, Shuang; Chen, Tingting; Lu, Xiao; Zhang, Xiaozhuang; Chu, Cordia

    2015-03-24

    Integration of services for Prevention of Mother-To-Child Transmission of HIV (PMTCT) into routine maternal and child health care is promoted as a priority strategy by the WHO to facilitate the implementation of PMTCT. Integration of services emphasizes inter-sectoral coordination in the health systems to provide convenient services for clients. China has been integrating prenatal HIV, syphilis and hepatitis B testing services since 2009. However, as the individual health systems are complex, effective coordination among different health agencies is challenging. Few studies have examined the factors that affect the coordination of such complex systems. The aim of this study is to assess the effectiveness of and examine challenges for integrated service delivery. Findings will provide the basis for strategy development to enhance the effective delivery of integrated services. The research was conducted in Guangdong province in 2013 using a needs assessment approach that includes qualitative and quantitative methods. Quantitative data was collected through a survey and from routine monitoring for PMTCT and qualitative data was collected through stakeholder interviews. Routine monitoring data used to assess key indicators of coordination suggested numerous coordination problems. The rates of prenatal HIV (95%), syphilis (47%) and hepatitis B (47%) test were inconsistent. An average of only 20% of the HIV positive mothers was referred in the health systems. There were no regular meetings among different health agencies and the clients indicated complicated service processes. The major obstacles to the coordination of delivering these integrated services are lack of service resource integration; and lack of a mechanism for coordination of the health systems, with no uniform guidelines, clear roles or consistent evaluation. The key obstacles that have been identified in this study hinder the coordination of the delivery of integrated services. 
Our recommendations include

  12. Integrated work-flow for quantitative metabolome profiling of plants, Peucedani Radix as a case.

    PubMed

    Song, Yuelin; Song, Qingqing; Liu, Yao; Li, Jun; Wan, Jian-Bo; Wang, Yitao; Jiang, Yong; Tu, Pengfei

    2017-02-08

    Universal acquisition of reliable information regarding the qualitative and quantitative properties of complicated matrices is the premise for the success of a metabolomics study. Liquid chromatography-mass spectrometry (LC-MS) now serves as a workhorse for metabolomics; however, LC-MS-based non-targeted metabolomics suffers from some shortcomings, even though some cutting-edge techniques have been introduced. Aiming to tackle, to some extent, the drawbacks of conventional approaches, such as redundant information, detector saturation, low sensitivity, and inconstant signal numbers among different runs, a novel and flexible work-flow consisting of three progressive steps was proposed here to profile in depth the quantitative metabolome of plants. The roots of Peucedanum praeruptorum Dunn (Peucedani Radix, PR), which are rich in various coumarin isomers, were employed as a case study to verify the applicability. First, offline two-dimensional LC-MS was utilized for in-depth detection of metabolites in a pooled PR extract, namely the universal metabolome standard (UMS). Second, mass fragmentation rules, notably concerning angular-type pyranocoumarins that are the primary chemical homologues in PR, and available databases were integrated for signal assignment and structural annotation. Third, the optimum collision energy (OCE) as well as the ion transition for multiple reaction monitoring (MRM) measurement was optimized online with a reference-compound-free strategy for each annotated component, and large-scale relative quantification of all annotated components was accomplished by plotting calibration curves via serial dilution of the UMS. It is worthwhile to highlight that the potential of OCE for isomer discrimination was described and the linearity ranges of the primary ingredients were extended by suppressing their responses. The integrated work-flow is expected to be qualified as a promising pipeline to clarify the quantitative metabolome of plants because it could not only

  13. Genetic toxicology at the crossroads-from qualitative hazard evaluation to quantitative risk assessment.

    PubMed

    White, Paul A; Johnson, George E

    2016-05-01

    Applied genetic toxicology is undergoing a transition from qualitative hazard identification to quantitative dose-response analysis and risk assessment. To facilitate this change, the Health and Environmental Sciences Institute (HESI) Genetic Toxicology Technical Committee (GTTC) sponsored a workshop held in Lancaster, UK on July 10-11, 2014. The event included invited speakers from several institutions, and the content was divided into three themes: 1) Point-of-departure metrics for quantitative dose-response analysis in genetic toxicology; 2) Measurement and estimation of exposures for better extrapolation to humans; and 3) The use of quantitative approaches in genetic toxicology for human health risk assessment (HHRA). A host of pertinent issues were discussed relating to the use of in vitro and in vivo dose-response data, the development of methods for in vitro to in vivo extrapolation, and approaches to using in vivo dose-response data to determine human exposure limits for regulatory evaluations and decision-making. This Special Issue, which was inspired by the workshop, contains a series of papers that collectively address topics related to the aforementioned themes. The Issue includes contributions that collectively evaluate, describe and discuss in silico, in vitro, in vivo and statistical approaches that are facilitating the shift from qualitative hazard evaluation to quantitative risk assessment. The use and application of the benchmark dose approach was a central theme in many of the workshop presentations and discussions, and the Special Issue includes several contributions that outline novel applications for the analysis and interpretation of genetic toxicity data. Although the contents of the Special Issue constitute an important step towards the adoption of quantitative methods for regulatory assessment of genetic toxicity, formal acceptance of quantitative methods for HHRA and regulatory decision-making will require consensus regarding the
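
    The benchmark dose (BMD) idea central to the workshop can be illustrated with a toy dose-response model: choose a benchmark response (BMR, e.g. 10% extra risk over background) and solve for the dose that produces it. The model form and parameters here are hypothetical, purely for illustration:

```python
import math

P0, BETA = 0.05, 0.4   # hypothetical background risk and slope

def extra_risk(dose):
    """Extra risk over background for P(d) = 1 - (1 - P0) * exp(-BETA * d)."""
    p = 1.0 - (1.0 - P0) * math.exp(-BETA * dose)
    return (p - P0) / (1.0 - P0)

def benchmark_dose(bmr=0.10, lo=0.0, hi=100.0, tol=1e-10):
    """Dose at which extra risk equals the BMR, found by bisection."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if extra_risk(mid) < bmr:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# For this model, extra risk = 1 - exp(-BETA * d), so BMD = -ln(0.9) / BETA
```

    In practice, BMD software fits the model to observed dose-response data and reports the lower confidence bound (BMDL) as the point of departure; the bisection step above merely inverts a known curve.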

  14. Preschool Temperament Assessment: A Quantitative Assessment of the Validity of Behavioral Style Questionnaire Data

    ERIC Educational Resources Information Center

    Huelsman, Timothy J.; Gagnon, Sandra Glover; Kidder-Ashley, Pamela; Griggs, Marissa Swaim

    2014-01-01

    Research Findings: Child temperament is an important construct, but its measurement has been marked by a number of weaknesses that have diminished the frequency with which it is assessed in practice. We address this problem by presenting the results of a quantitative construct validation study. We calculated validity indices by hypothesizing the…

  15. Methodological exemplar of integrating quantitative and qualitative evidence - supportive care for men with prostate cancer: what are the most important components?

    PubMed

    Huntley, Alyson L; King, Anna J L; Moore, Theresa H M; Paterson, Charlotte; Persad, Raj; Sharp, Debbie; Evans, Maggie

    2017-01-01

    To present a methodological exemplar of integrating findings from a quantitative and qualitative review on the same topic to provide insight into components of care that contribute to supportive care that is acceptable to men with prostate cancer. Men with prostate cancer are likely to live a long time with the disease, experience side effects from treatment and therefore have ongoing supportive care needs. Quantitative and qualitative reviews have been published but the findings have yet to be integrated. Integration of quantitative and qualitative synthesized evidence. Two previously published systematic reviews. Synthesized evidence on supportive care for men with prostate cancer was integrated from two previously published systematic reviews: a narrative quantitative review and a qualitative review with thematic synthesis. These two streams of synthesized evidence were synthesized using concurrent narrative summary. Data from both reviews were used to develop a set of propositions from which a summary of components of care that likely to contribute to supportive care acceptable to men with prostate cancer were identified. Nine propositions were developed which covered men's supportive care focusing on the role of health professionals. These propositions were used to compose nine components of care likely to lead to supportive care that is acceptable to men with prostate cancer. Some of these components are no/low cost such as developing a more empathic personalized approach, but more specific approaches need further investigation in randomized controlled trials, for example, online support. This methodological exemplar demonstrates the integration of quantitative and qualitative synthesized data to determine components of care likely to lead to provision of supportive care acceptable to men with prostate cancer. © 2016 The Authors. Journal of Advanced Nursing Published by John Wiley & Sons Ltd.

  16. Failure to Integrate Quantitative Measurement Methods of Ocular Inflammation Hampers Clinical Practice and Trials on New Therapies for Posterior Uveitis.

    PubMed

    Herbort, Carl P; Tugal-Tutkun, Ilknur; Neri, Piergiorgio; Pavésio, Carlos; Onal, Sumru; LeHoang, Phuc

    2017-05-01

    Uveitis is one of the fields in ophthalmology where a tremendous evolution took place in the past 25 years. Not only did we gain access to more efficient, more targeted, and better tolerated therapies, but, in parallel, precise and quantitative measurement methods were developed, allowing the clinician to evaluate these therapies and adjust therapeutic intervention with a high degree of precision. Objective and quantitative measurement of the global level of intraocular inflammation became possible for most inflammatory diseases with direct or spill-over anterior chamber inflammation, thanks to laser flare photometry. The amount of retinal inflammation could be quantified by using fluorescein angiography to score retinal angiographic signs. Indocyanine green angiography gave imaging insight into the hitherto inaccessible choroidal compartment, rendering possible the quantification of choroiditis by scoring indocyanine green angiographic signs. Optical coherence tomography has enabled measurement and objective monitoring of retinal and choroidal thickness. This multimodal quantitative appraisal of intraocular inflammation provides an exceptional safeguard when monitoring uveitis. What is enigmatic, however, is the slow pace with which these improvements are being integrated in some areas. What is even more difficult to understand is the fact that clinical trials to assess new therapeutic agents still mostly rely on subjective parameters such as clinical evaluation of vitreous haze as a main endpoint, whereas a whole array of precise, quantitative, and objective modalities is available for the design of clinical studies. The scope of this work was to review the quantitative investigations that improved the management of uveitis in the past 2-3 decades.

  17. An information diffusion technique to assess integrated hazard risks.

    PubMed

    Huang, Chongfu; Huang, Yundong

    2018-02-01

    An integrated risk is a scene in the future associated with some adverse incident caused by multiple hazards. An integrated probability risk is the expected value of disaster loss. Because it is difficult to assess an integrated probability risk with a small sample, weighting methods and copulas are commonly employed to sidestep this obstacle. To resolve the problem directly, in this paper we develop the information diffusion technique to construct a joint probability distribution and a vulnerability surface. An integrated risk can then be assessed directly from a small sample. A case of an integrated risk caused by flood and earthquake is given to show how the suggested technique is used to assess the integrated risk of annual property loss. Copyright © 2017 Elsevier Inc. All rights reserved.
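The core step the abstract alludes to, diffusing each observation of a small sample over a set of discrete monitoring points to obtain a probability distribution, can be sketched as follows. The default diffusion coefficient is one of Huang's published choices and is treated here as an assumption; the sample and monitoring points are invented for illustration.

```python
import math

def information_diffusion(sample, points, h=None):
    """Estimate a probability distribution over discrete monitoring
    points from a small sample by normal information diffusion: each
    observation is spread over all points with a Gaussian kernel and
    contributes one unit of 'information' in total."""
    n = len(sample)
    if h is None:
        # one of Huang's published diffusion coefficients (intended
        # for larger n); treated here as an illustrative default
        h = 1.4208 * (max(sample) - min(sample)) / (n - 1)
    dist = [0.0] * len(points)
    for x in sample:
        mu = [math.exp(-((x - u) ** 2) / (2.0 * h ** 2)) for u in points]
        c = sum(mu)  # normalise so each observation contributes 1
        for j, m in enumerate(mu):
            dist[j] += m / c
    total = sum(dist)
    return [d / total for d in dist]

# hypothetical annual-loss sample and monitoring points
p = information_diffusion([1.0, 2.0, 3.5, 4.0, 5.0],
                          [0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
```

The joint (flood, earthquake) distribution in the paper extends the same idea to a two-dimensional grid of monitoring points.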

  18. A Call for an Integrated Program of Assessment

    PubMed Central

    Regehr, Glenn

    2017-01-01

    An integrated curriculum that does not incorporate equally integrated assessment strategies is likely to prove ineffective in achieving the desired educational outcomes. We suggest it is time for colleges and schools of pharmacy to re-engineer their approach to assessment. To build the case, we first discuss the challenges leading to the need for curricular developments in pharmacy education. We then turn to the literature that informs how assessment can influence learning, introduce an approach to learning assessment that is being used by several medical education programs, and provide some examples of this approach in operation. Finally, we identify some of the challenges faced in adopting such an integrated approach to assessment and suggest that this is an area ripe with research opportunities for pharmacy educators. PMID:28630518

  19. Levels of reconstruction as complementarity in mixed methods research: a social theory-based conceptual framework for integrating qualitative and quantitative research.

    PubMed

    Carroll, Linda J; Rothe, J Peter

    2010-09-01

    Like other areas of health research, there has been increasing use of qualitative methods to study public health problems such as injuries and injury prevention. Likewise, the integration of qualitative and quantitative research (mixed methods) is beginning to assume a more prominent role in public health studies. Using mixed methods has great potential for gaining a broad and comprehensive understanding of injuries and their prevention. However, qualitative and quantitative research methods are based on two inherently different paradigms, and their integration requires a conceptual framework that permits the unity of these two methods. We present a theory-driven framework for viewing qualitative and quantitative research, which enables us to integrate them in a conceptually sound and useful manner. This framework has its foundation within the philosophical concept of complementarity, as espoused in the physical and social sciences, and draws on Bergson's metaphysical work on the 'ways of knowing'. Through understanding how data are constructed and reconstructed, and the different levels of meaning that can be ascribed to qualitative and quantitative findings, we can use a mixed-methods approach to gain a conceptually sound, holistic knowledge about injury phenomena that will enhance our development of relevant and successful interventions.

  20. 76 FR 19311 - Update of the 2003 Interagency Quantitative Assessment of the Relative Risk to Public Health From...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-07

    ... the 2003 Interagency Quantitative Assessment of the Relative Risk to Public Health From Foodborne... quantitative targets established in ``Healthy People 2010.'' In 2005, FoodNet data showed 0.30 L. monocytogenes... 4). In 2003, FDA and FSIS published a quantitative assessment of the relative risk to public health...

  1. Use of epidemiologic data in Integrated Risk Information System (IRIS) assessments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Persad, Amanda S.; Cooper, Glinda S.

    2008-11-15

    In human health risk assessment, information from epidemiologic studies is typically utilized in the hazard identification step of the risk assessment paradigm. However, in the assessment of many chemicals by the Integrated Risk Information System (IRIS), epidemiologic data, both observational and experimental, have also been used in the derivation of toxicological risk estimates (i.e., reference doses [RfD], reference concentrations [RfC], oral cancer slope factors [CSF] and inhalation unit risks [IUR]). Of the 545 health assessments posted on the IRIS database as of June 2007, 44 assessments derived non-cancer or cancer risk estimates based on human data. RfD and RfC calculations were based on a spectrum of endpoints from changes in enzyme activity to specific neurological or dermal effects. There are 12 assessments with IURs based on human data, two assessments that extrapolated human inhalation data to derive CSFs and one that used human data to directly derive a CSF. Lung or respiratory cancer is the most common endpoint for cancer assessments based on human data. To date, only one chemical, benzene, has utilized human data for derivation of all three quantitative risk estimates (i.e., RfC, RfD, and dose-response modeling for cancer assessment). Through examples from the IRIS database, this paper will demonstrate how epidemiologic data have been used in IRIS assessments for both adding to the body of evidence in the hazard identification process and in the quantification of risk estimates in the dose-response component of the risk assessment paradigm.

  2. REGIONAL VULNERABILITY ASSESSMENT OF THE MID-ATLANTIC REGION: EVALUATION OF INTEGRATION METHODS AND ASSESSMENTS RESULTS

    EPA Science Inventory

    This report describes methods for quantitative regional assessment developed by the Regional Vulnerability Assessment (ReVA) program. The goal of ReVA is to develop regional-scale assessments of the magnitude, extent, distribution, and uncertainty of current and anticipated envir...

  3. Status and future of Quantitative Microbiological Risk Assessment in China

    PubMed Central

    Dong, Q.L.; Barker, G.C.; Gorris, L.G.M.; Tian, M.S.; Song, X.Y.; Malakar, P.K.

    2015-01-01

    Since the implementation of the Food Safety Law of the People's Republic of China in 2009 use of Quantitative Microbiological Risk Assessment (QMRA) has increased. QMRA is used to assess the risk posed to consumers by pathogenic bacteria which cause the majority of foodborne outbreaks in China. This review analyses the progress of QMRA research in China from 2000 to 2013 and discusses 3 possible improvements for the future. These improvements include planning and scoping to initiate QMRA, effectiveness of microbial risk assessment utility for risk management decision making, and application of QMRA to establish appropriate Food Safety Objectives. PMID:26089594
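A QMRA of the kind surveyed here propagates an estimated ingested dose through a dose-response model. A minimal sketch using the exponential model, one of the standard QMRA dose-response forms; the parameter values below are illustrative, not fitted values for any particular pathogen:

```python
import math

def p_infection_exponential(dose, r):
    """Exponential dose-response model common in QMRA: assumes each
    ingested organism independently initiates infection with
    per-organism probability r, so P(infection) = 1 - exp(-r * dose)."""
    return 1.0 - math.exp(-r * dose)

# illustrative values only; r must come from fitted dose-response studies
risk = p_infection_exponential(dose=100.0, r=0.005)
```

A full assessment would treat the dose itself as a distribution (contamination level, growth, consumption) and propagate it, e.g. by Monte Carlo simulation, rather than using a point value.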

  4. The mathematics of cancer: integrating quantitative models.

    PubMed

    Altrock, Philipp M; Liu, Lin L; Michor, Franziska

    2015-12-01

    Mathematical modelling approaches have become increasingly abundant in cancer research. The complexity of cancer is well suited to quantitative approaches as it provides challenges and opportunities for new developments. In turn, mathematical modelling contributes to cancer research by helping to elucidate mechanisms and by providing quantitative predictions that can be validated. The recent expansion of quantitative models addresses many questions regarding tumour initiation, progression and metastases as well as intra-tumour heterogeneity, treatment responses and resistance. Mathematical models can complement experimental and clinical studies, but also challenge current paradigms, redefine our understanding of mechanisms driving tumorigenesis and shape future research in cancer biology.

  5. Comparison of clinical semi-quantitative assessment of muscle fat infiltration with quantitative assessment using chemical shift-based water/fat separation in MR studies of the calf of post-menopausal women.

    PubMed

    Alizai, Hamza; Nardo, Lorenzo; Karampinos, Dimitrios C; Joseph, Gabby B; Yap, Samuel P; Baum, Thomas; Krug, Roland; Majumdar, Sharmila; Link, Thomas M

    2012-07-01

    The goal of this study was to compare the semi-quantitative Goutallier classification for fat infiltration with quantitative fat-fraction derived from a magnetic resonance imaging (MRI) chemical shift-based water/fat separation technique. Sixty-two women (age 61 ± 6 years), 27 of whom had diabetes, underwent MRI of the calf using a T1-weighted fast spin-echo sequence and a six-echo spoiled gradient-echo sequence at 3 T. Water/fat images and fat fraction maps were reconstructed using the IDEAL algorithm with T2* correction and a multi-peak model for the fat spectrum. Two radiologists scored fat infiltration on the T1-weighted images using the Goutallier classification in six muscle compartments. Spearman correlations between the Goutallier grades and the fat fraction were calculated; in addition, intra-observer and inter-observer agreement were calculated. A significant correlation between the clinical grading and the fat fraction values was found for all muscle compartments (P < 0.0001, R values ranging from 0.79 to 0.88). Goutallier grades 0-4 had a fat fraction ranging from 3.5 to 19%. Intra-observer and inter-observer agreement values of 0.83 and 0.81 were calculated for the semi-quantitative grading. Semi-quantitative grading of intramuscular fat and quantitative fat fraction were significantly correlated and both techniques had excellent reproducibility. However, the clinical grading was found to overestimate muscle fat. Key points: • Fat infiltration of muscle commonly occurs in many metabolic and neuromuscular diseases. • Image-based semi-quantitative classifications for assessing fat infiltration are not well validated. • Quantitative MRI techniques provide an accurate assessment of muscle fat.
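The reported agreement between an ordinal grading and a continuous fat fraction is a Spearman rank correlation. A stdlib sketch with tie-aware average ranks; the grade/fat-fraction pairs at the end are invented for illustration, not the study's data:

```python
def average_ranks(values):
    """Ranks 1..n with ties assigned the average of their positions."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based positions i+1..j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho = Pearson correlation of the rank vectors."""
    rx, ry = average_ranks(x), average_ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

grades = [0, 1, 1, 2, 3, 4]                       # hypothetical Goutallier grades
fat_fraction = [3.5, 5.1, 6.0, 9.2, 14.0, 19.0]   # hypothetical fat fractions (%)
rho = spearman(grades, fat_fraction)
```

With many tied grades, as in real Goutallier data, the tie-aware ranks matter; the simple 6Σd²/(n(n²-1)) shortcut formula is only exact without ties.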

  6. Quantitative risk assessment using empirical vulnerability functions from debris flow event reconstruction

    NASA Astrophysics Data System (ADS)

    Luna, Byron Quan; Blahut, Jan; Camera, Corrado; van Westen, Cees; Sterlacchini, Simone; Apuani, Tiziana; Akbas, Sami

    2010-05-01

    For a quantitative risk assessment framework it is essential to assess not only the hazardous process itself but to perform an analysis of its consequences. This quantitative assessment should include the expected monetary losses as the product of the probability of occurrence of a hazard with a given magnitude and its vulnerability. A quantifiable integrated approach to both hazard and risk is becoming a required practice in risk reduction management. Dynamic run-out models for debris flows are able to calculate physical outputs (extension, depths, velocities, impact pressures) and to determine the zones where the elements at risk could suffer an impact. These results are then applied for vulnerability and risk calculations. The risk assessment was conducted in the Valtellina Valley, a typical Italian alpine valley lying in northern Italy (Lombardy Region). On 13th July 2008, after more than two days of intense rainfall, several debris and mud flows were released in the central part of the valley between Morbegno and Berbenno. One of the largest debris flows occurred in Selvetta. The debris flow event was reconstructed after extensive field work and interviews with local inhabitants and civil protection teams. Also inside the Valtellina valley, between the 22nd and the 23rd of May 1983, two debris flows occurred in Tresenda (Teglio municipality), causing casualties and considerable economic damage. At the same location, on 26th November 2002, another debris flow occurred, again causing significant damage. For the quantification of a new scenario, the results obtained from the Selvetta event were applied in Tresenda. The Selvetta and Tresenda events were modelled with the FLO2D program. FLO2D is an Eulerian formulation with a finite differences numerical scheme that requires the specification of an input hydrograph. The internal stresses are isotropic and the basal shear stresses are calculated using a quadratic model. The significance of
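The quadratic model mentioned for the basal shear stresses is commonly expressed as a total friction slope with yield, viscous, and turbulent-dispersive terms. A sketch under that assumption; the parameter names and values are illustrative only, and the laminar resistance coefficient K is an assumption, not a value from this study:

```python
def friction_slope(v, h, tau_y, eta, n_td, gamma_m, K=2285.0):
    """Total friction slope S_f of a quadratic debris-flow rheology:
    yield + viscous + turbulent-dispersive terms.

    v       mean flow velocity (m/s)
    h       flow depth (m)
    tau_y   yield stress (Pa)
    eta     dynamic viscosity (Pa*s)
    n_td    equivalent turbulent-dispersive Manning coefficient
    gamma_m specific weight of the mixture (N/m^3)
    K       laminar flow resistance coefficient (assumed value)
    """
    yield_term = tau_y / (gamma_m * h)
    viscous_term = K * eta * v / (8.0 * gamma_m * h ** 2)
    turbulent_term = n_td ** 2 * v ** 2 / h ** (4.0 / 3.0)
    return yield_term + viscous_term + turbulent_term

# illustrative parameter values only
sf = friction_slope(v=2.0, h=1.0, tau_y=100.0, eta=0.5, n_td=0.05, gamma_m=20000.0)
```

In a run-out model this slope is evaluated per cell and time step to retard the flow; here it only illustrates how the three stress contributions combine.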

  7. Integral refractive index imaging of flowing cell nuclei using quantitative phase microscopy combined with fluorescence microscopy.

    PubMed

    Dardikman, Gili; Nygate, Yoav N; Barnea, Itay; Turko, Nir A; Singh, Gyanendra; Javidi, Barham; Shaked, Natan T

    2018-03-01

    We suggest a new multimodal imaging technique for quantitatively measuring the integral (thickness-average) refractive index of the nuclei of live biological cells in suspension. For this aim, we combined quantitative phase microscopy with simultaneous 2-D fluorescence microscopy. We used 2-D fluorescence microscopy to localize the nucleus inside the quantitative phase map of the cell, as well as for measuring the nucleus radii. As verified offline by both 3-D confocal fluorescence microscopy and 2-D fluorescence microscopy while rotating the cells during flow, the nucleus of a cell in suspension that is not dividing can be assumed to be an ellipsoid. The entire shape of a cell in suspension can be assumed to be a sphere. Then, the cell and nucleus 3-D shapes can be evaluated based on their in-plane radii available from the 2-D phase and fluorescent measurements, respectively. Finally, the nucleus integral refractive index profile is calculated. We demonstrate the new technique on cancer cells, obtaining nucleus refractive index values that are lower than those of the cytoplasm, coinciding with recent findings. We believe that the proposed technique has the potential to be used for flow cytometry, where full 3-D refractive index tomography is too slow to be implemented during flow.
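Given a phase value, the local thickness recovered from the assumed 3-D shape, and the medium's refractive index, the thickness-averaged refractive index follows from the standard quantitative-phase relation. A sketch with invented numbers, not values from the paper:

```python
import math

def integral_refractive_index(phase_rad, thickness_um, wavelength_um, n_medium):
    """Thickness-averaged refractive index from a quantitative phase
    measurement, via phase = (2*pi/lambda) * (n_avg - n_medium) * thickness."""
    return n_medium + phase_rad * wavelength_um / (2.0 * math.pi * thickness_um)

# invented numbers: 5 rad of phase across a 10 um thickness at 633 nm in medium n = 1.33
n_avg = integral_refractive_index(5.0, 10.0, 0.633, 1.33)
```

Applying this per pixel, with nucleus thickness from the fluorescence-derived ellipsoid and cell thickness from the sphere assumption, yields the integral refractive index profile the abstract describes.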

  8. Short Course Introduction to Quantitative Mineral Resource Assessments

    USGS Publications Warehouse

    Singer, Donald A.

    2007-01-01

    This is an abbreviated text supplementing the content of three sets of slides used in a short course that has been presented by the author at several workshops. The slides should be viewed in the order of (1) Introduction and models, (2) Delineation and estimation, and (3) Combining estimates and summary. References cited in the slides are listed at the end of this text. The purpose of the three-part form of mineral resource assessments discussed in the accompanying slides is to make unbiased quantitative assessments in a format needed in decision-support systems so that consequences of alternative courses of action can be examined. The three-part form of mineral resource assessments was developed to assist policy makers evaluate the consequences of alternative courses of action with respect to land use and mineral-resource development. The audience for three-part assessments is a governmental or industrial policy maker, a manager of exploration, a planner of regional development, or similar decision-maker. Some of the tools and models presented here will be useful for selection of exploration sites, but that is a side benefit, not the goal. To provide unbiased information, we recommend the three-part form of mineral resource assessments where general locations of undiscovered deposits are delineated from a deposit type's geologic setting, frequency distributions of tonnages and grades of well-explored deposits serve as models of grades and tonnages of undiscovered deposits, and number of undiscovered deposits are estimated probabilistically by type. The internally consistent descriptive, grade and tonnage, deposit density, and economic models used in the design of the three-part form of assessments reduce the chances of biased estimates of the undiscovered resources. What and why quantitative resource assessments: The kind of assessment recommended here is founded in decision analysis in order to provide a framework for making decisions concerning mineral

  9. Assessing covariate balance when using the generalized propensity score with quantitative or continuous exposures.

    PubMed

    Austin, Peter C

    2018-01-01

    Propensity score methods are increasingly being used to estimate the effects of treatments and exposures when using observational data. The propensity score was initially developed for use with binary exposures (e.g., active treatment vs. control). The generalized propensity score is an extension of the propensity score for use with quantitative exposures (e.g., dose or quantity of medication, income, years of education). A crucial component of any propensity score analysis is that of balance assessment. This entails assessing the degree to which conditioning on the propensity score (via matching, weighting, or stratification) has balanced measured baseline covariates between exposure groups. Methods for balance assessment have been well described and are frequently implemented when using the propensity score with binary exposures. However, there is a paucity of information on how to assess baseline covariate balance when using the generalized propensity score. We describe how methods based on the standardized difference can be adapted for use with quantitative exposures when using the generalized propensity score. We also describe a method based on assessing the correlation between the quantitative exposure and each covariate in the sample when weighted using generalized propensity score-based weights. We conducted a series of Monte Carlo simulations to evaluate the performance of these methods. We also compared two different methods of estimating the generalized propensity score: ordinary least squares regression and the covariate balancing propensity score method. We illustrate the application of these methods using data on patients hospitalized with a heart attack with the quantitative exposure being creatinine level.
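The correlation-based balance check described here can be sketched as follows, assuming a normal-residual OLS model for the generalized propensity score and stabilized weights; the simulated data and model choices are illustrative only, not the paper's design:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
x = rng.normal(size=n)            # a measured baseline covariate
z = 0.5 * x + rng.normal(size=n)  # quantitative exposure, depends on x

# Generalized propensity score (GPS): model the exposure given the
# covariate with OLS and assume normally distributed residuals
# (one common modelling choice, not the only one).
coef = np.polyfit(x, z, 1)
mu = np.polyval(coef, x)
sigma = np.std(z - mu, ddof=2)
gps = np.exp(-(z - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

# Stabilized weights: marginal density of the exposure over the GPS.
m, s = z.mean(), z.std(ddof=1)
marginal = np.exp(-(z - m) ** 2 / (2 * s ** 2)) / (s * np.sqrt(2 * np.pi))
w = marginal / gps

def weighted_corr(a, b, w):
    """Pearson correlation computed in the weighted pseudo-population."""
    ma, mb = np.average(a, weights=w), np.average(b, weights=w)
    cov = np.average((a - ma) * (b - mb), weights=w)
    va = np.average((a - ma) ** 2, weights=w)
    vb = np.average((b - mb) ** 2, weights=w)
    return cov / np.sqrt(va * vb)

raw = np.corrcoef(x, z)[0, 1]      # exposure-covariate correlation before weighting
balanced = weighted_corr(x, z, w)  # should shrink toward zero if balance is achieved
```

If the GPS model is correct, the weighted correlation between exposure and each covariate should be near zero; a noticeably non-zero value flags residual imbalance, exactly the diagnostic role the standardized difference plays for binary exposures.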

  10. A Framework for General Education Assessment: Assessing Information Literacy and Quantitative Literacy with ePortfolios

    ERIC Educational Resources Information Center

    Hubert, David A.; Lewis, Kati J.

    2014-01-01

    This essay presents the findings of an authentic and holistic assessment, using a random sample of one hundred student General Education ePortfolios, of two of Salt Lake Community College's (SLCC) college-wide learning outcomes: quantitative literacy (QL) and information literacy (IL). Performed by four faculty from biology, humanities, and…

  11. Approaches to advancing quantitative human health risk assessment of environmental chemicals in the post-genomic era.

    PubMed

    Chiu, Weihsueh A; Euling, Susan Y; Scott, Cheryl Siegel; Subramaniam, Ravi P

    2013-09-15

    The contribution of genomics and associated technologies to human health risk assessment for environmental chemicals has focused largely on elucidating mechanisms of toxicity, as discussed in other articles in this issue. However, there is interest in moving beyond hazard characterization to making more direct impacts on quantitative risk assessment (QRA)--i.e., the determination of toxicity values for setting exposure standards and cleanup values. We propose that the evolution of QRA of environmental chemicals in the post-genomic era will involve three, somewhat overlapping phases in which different types of approaches begin to mature. The initial focus (in Phase I) has been and continues to be on "augmentation" of weight of evidence--using genomic and related technologies qualitatively to increase the confidence in and scientific basis of the results of QRA. Efforts aimed towards "integration" of these data with traditional animal-based approaches, in particular quantitative predictors, or surrogates, for the in vivo toxicity data to which they have been anchored are just beginning to be explored now (in Phase II). In parallel, there is a recognized need for "expansion" of the use of established biomarkers of susceptibility or risk of human diseases and disorders for QRA, particularly for addressing the issues of cumulative assessment and population risk. Ultimately (in Phase III), substantial further advances could be realized by the development of novel molecular and pathway-based biomarkers and statistical and in silico models that build on anticipated progress in understanding the pathways of human diseases and disorders. Such efforts would facilitate a gradual "reorientation" of QRA towards approaches that more directly link environmental exposures to human outcomes. Published by Elsevier Inc.

  12. Biomechanical Assessment of the Canadian Integrated Load Carriage System using Objective Assessment Measures

    DTIC Science & Technology

    2001-05-01

    UNCLASSIFIED Defense Technical Information Center Compilation Part Notice (part of compilation ADP010987 thru ADPO11009). TITLE: Biomechanical Assessment of the Canadian Integrated Load Carriage System using Objective Assessment Measures. Author: Joan... (CANADA, B3J 2X4). Summary: The purpose of this study was to provide an overview of contributions by biomechanical testing to the design of the final

  13. Integrating Conceptual and Quantitative Knowledge

    ERIC Educational Resources Information Center

    Metzgar, Matthew

    2013-01-01

    There has been an emphasis in some science courses to focus more on teaching conceptual knowledge. Though certain innovations have been successful in increasing student conceptual knowledge, performance on quantitative problem-solving tasks often remains unaffected. Research also shows that students tend to maintain conceptual and quantitative…

  14. A qualitative and quantitative needs assessment of pain management for hospitalized orthopedic patients.

    PubMed

    Cordts, Grace A; Grant, Marian S; Brandt, Lynsey E; Mears, Simon C

    2011-08-08

    Despite advances in pain management, little formal teaching is given to practitioners and nurses in its use for postoperative orthopedic patients. The goal of our study was to determine the educational needs for orthopedic pain management of our residents, nurses, and physical therapists using a quantitative and qualitative assessment. The needs analysis was conducted in a 10-bed orthopedic unit at a teaching hospital and included a survey given to 20 orthopedic residents, 9 nurses, and 6 physical therapists, followed by focus groups addressing barriers to pain control and knowledge of pain management. Key challenges for nurses included not always having breakthrough pain medication orders and the gap in pain management between cessation of patient-controlled analgesia and ordering and administering oral medications. Key challenges for orthopedic residents included treating pain in patients with a history of substance abuse, assessing pain, and determining when to use long-acting vs short-acting opioids. Focus group assessments revealed a lack of training in pain management and the need for better coordination of care between nurses and practitioners and improved education about special needs groups (the elderly and those with substance abuse issues). This needs assessment showed that orthopedic residents and nurses receive little formal education on pain management, despite having to address pain on a daily basis. This information will be used to develop an educational program to improve pain management for postoperative orthopedic patients. An integrated educational program with orthopedic residents, nurses, and physical therapists would promote understanding of issues for each discipline. Copyright 2011, SLACK Incorporated.

  15. [Integral quantitative evaluation of working conditions in the construction industry].

    PubMed

    Guseĭnov, A A

    1993-01-01

    The present method of evaluating environmental quality (using MAC and MAL) does not make it possible to assess the working conditions of the construction industry completely and objectively, owing to multiple confounding elements. A solution to this complicated problem, which includes the analysis of various correlated elements of the system "human -- working conditions -- environment", may be supported by a social norm of morbidity that is independent of the industrial and natural environment. A complete integral assessment makes it possible to see the whole situation and to reveal the points at risk.

  16. Risk assessment of integrated electronic health records.

    PubMed

    Bjornsson, Bjarni Thor; Sigurdardottir, Gudlaug; Stefansson, Stefan Orri

    2010-01-01

    The paper describes the security concerns related to Electronic Health Records (EHR) both in registration of data and integration of systems. A description of the current state of EHR systems in Iceland is provided, along with the Ministry of Health's future vision and plans. New legislation provides the opportunity for increased integration of EHRs and further collaboration between institutions. Integration of systems, along with greater availability and access to EHR data, requires increased security awareness since additional risks are introduced. The paper describes the core principles of information security as it applies to EHR systems and data. The concepts of confidentiality, integrity, availability, accountability and traceability are introduced and described. The paper discusses the legal requirements and importance of performing risk assessment for EHR data. Risk assessment methodology according to the ISO/IEC 27001 information security standard is described with examples on how it is applied to EHR systems.

  17. Assessing Quantitative Resistance against Leptosphaeria maculans (Phoma Stem Canker) in Brassica napus (Oilseed Rape) in Young Plants

    PubMed Central

    Huang, Yong-Ju; Qi, Aiming; King, Graham J.; Fitt, Bruce D. L.

    2014-01-01

    Quantitative resistance against Leptosphaeria maculans in Brassica napus is difficult to assess in young plants due to the long period of symptomless growth of the pathogen from the appearance of leaf lesions to the appearance of canker symptoms on the stem. By using doubled haploid (DH) lines A30 (susceptible) and C119 (with quantitative resistance), quantitative resistance against L. maculans was assessed in young plants in controlled environments at two stages: stage 1, growth of the pathogen along leaf veins/petioles towards the stem by leaf lamina inoculation; stage 2, growth in stem tissues to produce stem canker symptoms by leaf petiole inoculation. Two types of inoculum (ascospores; conidia) and three assessment methods (extent of visible necrosis; symptomless pathogen growth visualised using the GFP reporter gene; amount of pathogen DNA quantified by PCR) were used. In stage 1 assessments, significant differences were observed between lines A30 and C119 in area of leaf lesions, distance grown along veins/petioles assessed by visible necrosis or by viewing GFP and amount of L. maculans DNA in leaf petioles. In stage 2 assessments, significant differences were observed between lines A30 and C119 in severity of stem canker and amount of L. maculans DNA in stem tissues. GFP-labelled L. maculans spread more quickly from the stem cortex to the stem pith in A30 than in C119. Stem canker symptoms were produced more rapidly by using ascospore inoculum than by using conidial inoculum. These results suggest that quantitative resistance against L. maculans in B. napus can be assessed in young plants in controlled conditions. Development of methods to phenotype quantitative resistance against plant pathogens in young plants in controlled environments will help identification of stable quantitative resistance for control of crop diseases. PMID:24454767

  18. An Integrated Literature Review of the Knowledge Needs of Parents with Children with Special Health Care Needs and of Instruments to Assess These Needs

    ERIC Educational Resources Information Center

    Adler, Kristin; Salanterä, Sanna; Leino­-Kilpi, Helena; Grädel, Barbara

    2015-01-01

    The purpose of this integrative (including both quantitative and qualitative studies) literature review was to identify knowledge needs of parents of a child with special health care needs and to evaluate instruments to assess these needs. The content analysis of 48 publications revealed a vast amount of knowledge needs that were categorized into…

  19. Perspectives for integrating human and environmental exposure assessments.

    PubMed

    Ciffroy, P; Péry, A R R; Roth, N

    2016-10-15

    Integrated Risk Assessment (IRA) has been defined by the EU FP7 HEROIC Coordination action as "the mutual exploitation of Environmental Risk Assessment for Human Health Risk Assessment and vice versa in order to coherently and more efficiently characterize an overall risk to humans and the environment for better informing the risk analysis process" (Wilks et al., 2015). Since exposure assessment and hazard characterization are the pillars of risk assessment, integrating Environmental Exposure assessment (EEA) and Human Exposure assessment (HEA) is a major component of an IRA framework. EEA and HEA typically pursue different targets, protection goals and timeframe. However, human and wildlife species also share the same environment and they similarly inhale air and ingest water and food through often similar overlapping pathways of exposure. Fate models used in EEA and HEA to predict the chemicals distribution among physical and biological media are essentially based on common properties of chemicals, and internal concentration estimations are largely based on inter-species (i.e. biota-to-human) extrapolations. Also, both EEA and HEA are challenged by increasing scientific complexity and resources constraints. Altogether, these points create the need for a better exploitation of all currently existing data, experimental approaches and modeling tools and it is assumed that a more integrated approach of both EEA and HEA may be part of the solution. Based on the outcome of an Expert Workshop on Extrapolations in Integrated Exposure Assessment organized by the HEROIC project in January 2014, this paper identifies perspectives and recommendations to better harmonize and extrapolate exposure assessment data, models and methods between Human Health and Environmental Risk Assessments to support the further development and promotion of the concept of IRA. Ultimately, these recommendations may feed into guidance showing when and how to apply IRA in the regulatory decision

  20. Health and impact assessment: Are we seeing closer integration?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morgan, Richard K., E-mail: rkm@geography.otago.ac.n

    2011-07-15

    Health has always had a place in wider impact assessment activities, from the earliest days of the National Environmental Policy Act in the United States. However, early thinking tended to focus on health protection and environmental health issues, especially in relation to the effects of pollution. The adoption of wider models of health was reflected in impact assessment circles from the early 1990s, with particular emphasis on an integrated approach to impact assessment, especially at the project level, which would see health impact assessment benefiting from working with other forms of impact assessment, such as social and ecological. Yet twenty years later, integration still seems a distant prospect in many countries. In this paper I examine the case for integrating health considerations within the wider IA process, discuss some of the problems that have historically restricted progress towards this end, and explore the degree to which impact assessment practitioners have been successful in seeking to improve the consideration of health in IA. In New Zealand, project-level impact assessment is based on an integrated model under the Resource Management Act. In addition, HIA was recognised in the early 1990s as a valuable addition to the toolkit for project assessment. Since then, policy-level HIA has grown, supported by extensive capacity building. If health is being integrated into wider impact assessment, it should be happening in New Zealand, where so many enabling conditions are met. Three major project proposals from New Zealand are examined to characterise the broad trends in HIA development in New Zealand in the last ten years and to assess the degree to which health concerns are being reflected in wider impact assessments. The findings are discussed in the context of the issues outlined in the early part of the paper.

  1. The integration of quantitative information with an intelligent decision support system for residential energy retrofits

    NASA Astrophysics Data System (ADS)

    Mo, Yunjeong

    The purpose of this research is to support the development of an intelligent Decision Support System (DSS) by integrating quantitative information with expert knowledge in order to facilitate effective retrofit decision-making. To achieve this goal, the Energy Retrofit Decision Process Framework is analyzed. Expert system shell software, a retrofit measure cost database, and energy simulation software are needed for developing the DSS; Exsys Corvid, the NREM database and BEopt were chosen for implementing an integration model. This integration model demonstrates the holistic function of a residential energy retrofit system for existing homes, by providing a prioritized list of retrofit measures with cost information, energy simulation and expert advice. The users, such as homeowners and energy auditors, can acquire all of the necessary retrofit information from this unified system without having to explore several separate systems. The integration model plays the role of a prototype for the finalized intelligent decision support system. It implements all of the necessary functions for the finalized DSS, including integration of the database, energy simulation and expert knowledge.

  2. Comparison study on qualitative and quantitative risk assessment methods for urban natural gas pipeline network.

    PubMed

    Han, Z Y; Weng, W G

    2011-05-15

    In this paper, qualitative and quantitative risk assessment methods for urban natural gas pipeline networks are proposed. The qualitative method comprises an index system that includes a causation index, an inherent risk index, a consequence index and their corresponding weights. The quantitative method consists of a probability assessment, a consequence analysis and a risk evaluation. The outcome of the qualitative method is a qualitative risk value; for the quantitative method, the outcomes are individual risk and societal risk. In comparison with previous research, the qualitative method proposed in this paper is particularly suitable for urban natural gas pipeline networks, and the quantitative method takes different accident consequences into consideration, such as toxic gas diffusion, jet flame, fireball combustion and UVCE. Two sample urban natural gas pipeline networks are used to demonstrate the two methods. Both methods can be applied in practice; the choice between them depends on the basic data actually available for the gas pipelines and the precision required of the risk assessment. Crown Copyright © 2011. Published by Elsevier B.V. All rights reserved.
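    The index-system arithmetic behind such a qualitative method can be sketched as a weighted sum of sub-indexes. The sub-index names follow the abstract, but the 0-100 scales, the weights, and the example scores below are invented for illustration and are not taken from Han and Weng.

```python
# Hypothetical sketch of a qualitative index method: a pipeline segment's
# risk value as a weighted sum of three sub-indexes. Scales and weights
# are illustrative assumptions, not the paper's calibration.

def qualitative_risk(causation, inherent, consequence,
                     weights=(0.3, 0.3, 0.4)):
    """Combine three 0-100 sub-indexes into one qualitative risk value."""
    indexes = (causation, inherent, consequence)
    if not all(0 <= x <= 100 for x in indexes):
        raise ValueError("each sub-index must lie in [0, 100]")
    return sum(w * x for w, x in zip(weights, indexes))

# Example: a segment with high consequence potential but moderate causation
segment_risk = qualitative_risk(causation=70, inherent=40, consequence=85)
```

    In practice the weights would come from expert elicitation or a method such as AHP rather than being fixed constants.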

  3. An integrated risk and vulnerability assessment framework for climate change and malaria transmission in East Africa.

    PubMed

    Onyango, Esther Achieng; Sahin, Oz; Awiti, Alex; Chu, Cordia; Mackey, Brendan

    2016-11-11

    Malaria is one of the key research concerns in climate change-health relationships. Numerous risk assessments and modelling studies provide evidence that the transmission range of malaria will expand with rising temperatures, adversely impacting on vulnerable communities in the East African highlands. While there exist multiple lines of evidence for the influence of climate change on malaria transmission, there is insufficient understanding of the complex and interdependent factors that determine the risk and vulnerability of human populations at the community level. Moreover, existing studies have had limited focus on the nature of the impacts on vulnerable communities or how well they are prepared to cope. In order to address these gaps, a systems approach was used to present an integrated risk and vulnerability assessment framework for studies of community level risk and vulnerability to malaria due to climate change. Drawing upon published literature on existing frameworks, a systems approach was applied to characterize the factors influencing the interactions between climate change and malaria transmission. This involved structural analysis to determine influential, relay, dependent and autonomous variables in order to construct a detailed causal loop conceptual model that illustrates the relationships among key variables. An integrated assessment framework that considers indicators of both biophysical and social vulnerability was proposed based on the conceptual model. A major conclusion was that this integrated assessment framework can be implemented using Bayesian Belief Networks, and applied at a community level using both quantitative and qualitative methods with stakeholder engagement. The approach enables a robust assessment of community level risk and vulnerability to malaria, along with contextually relevant and targeted adaptation strategies for dealing with malaria transmission that incorporate both scientific and community perspectives.
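    As a toy illustration of the Bayesian Belief Network implementation the framework proposes, the fragment below marginalizes a single climate driver into a transmission-risk probability. The nodes and every probability are invented purely to show the mechanics; they are not derived from the study.

```python
# Two-node toy BBN: rainfall -> transmission risk. All numbers are
# hypothetical placeholders, not estimates from the East Africa framework.

p_rain = {"low": 0.4, "high": 0.6}          # prior over the rainfall node
p_risk_given_rain = {                        # P(high transmission risk | rainfall)
    "low": 0.2,
    "high": 0.7,
}

# Marginal probability of high transmission risk: sum over rainfall states
p_high_risk = sum(p_rain[r] * p_risk_given_rain[r] for r in p_rain)
```

    A real implementation would add the social-vulnerability nodes the framework identifies and use a BBN library with conditional probability tables elicited from data and stakeholders.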

  4. Analysis of Water Conflicts across Natural and Societal Boundaries: Integration of Quantitative Modeling and Qualitative Reasoning

    NASA Astrophysics Data System (ADS)

    Gao, Y.; Balaram, P.; Islam, S.

    2009-12-01

    Water issues and problems have bewildered humankind for a long time, yet a systematic approach for understanding such issues remains elusive. This is partly because many water-related problems are framed from a contested terrain in which many actors (individuals, communities, businesses, NGOs, states, and countries) compete to protect their own, often conflicting interests. We argue that the origin of many water problems may be understood as a dynamic consequence of competition, interconnections, and feedback among variables in the Natural and Societal Systems (NSSs). Within the natural system, we recognize that the triple constraints on water, namely water quantity (Q), water quality (P), and ecosystem (E), together with their interdependencies and feedback, may lead to conflicts. Such inherent and multifaceted constraints of the natural water system are often exacerbated at the societal boundaries. Within the societal system, interdependencies and feedback among values and norms (V), economy (C), and governance (G) interact in various ways to create intractable contextual differences. The observation that natural and societal systems are linked is not novel. Our argument here, however, is that rigid disciplinary boundaries between these two domains will not produce solutions to the water problems we are facing today. The knowledge needed to address water problems must go beyond scientific assessment, in which societal variables (C, G, and V) are treated as exogenous or largely ignored, and beyond policy research that does not consider the impact of natural variables (E, P, and Q) and the coupling among them. Consequently, traditional quantitative methods alone are not appropriate to address the dynamics of water conflicts, because we cannot quantify the societal variables and the exact mathematical relationships among the variables are not fully known. On the other hand, conventional qualitative study in the societal domain has mainly been in the form of individual case studies and therefore

  5. Oufti: An integrated software package for high-accuracy, high-throughput quantitative microscopy analysis

    PubMed Central

    Paintdakhi, Ahmad; Parry, Bradley; Campos, Manuel; Irnov, Irnov; Elf, Johan; Surovtsev, Ivan; Jacobs-Wagner, Christine

    2016-01-01

    With the realization that bacteria display phenotypic variability among cells and exhibit complex subcellular organization critical for cellular function and behavior, microscopy has re-emerged as a primary tool in bacterial research during the last decade. However, the bottleneck in today's single-cell studies is quantitative image analysis of cells and fluorescent signals. Here, we address current limitations through the development of Oufti, a stand-alone, open-source software package for automated measurements of microbial cells and fluorescence signals from microscopy images. Oufti provides computational solutions for tracking touching cells in confluent samples, handles various cell morphologies, offers algorithms for quantitative analysis of both diffraction- and non-diffraction-limited fluorescence signals, and is scalable for high-throughput analysis of massive datasets, all with subpixel precision. All functionalities are integrated in a single package. The graphical user interface, which includes interactive modules for segmentation, image analysis, and post-processing analysis, makes the software broadly accessible to users irrespective of their computational skills. PMID:26538279

  6. Novel quantitative assessment of metamorphopsia in maculopathy.

    PubMed

    Wiecek, Emily; Lashkari, Kameran; Dakin, Steven C; Bex, Peter

    2014-11-18

    Patients with macular disease often report experiencing metamorphopsia (visual distortion). Although typically measured with Amsler charts, more quantitative assessments of perceived distortion are desirable to effectively monitor the presence, progression, and remediation of visual impairment. Participants with binocular (n = 33) and monocular (n = 50) maculopathy across seven disease groups, and control participants (n = 10) with no identifiable retinal disease completed a modified Amsler grid assessment (presented on a computer screen with eye tracking to ensure fixation compliance) and two novel assessments to measure metamorphopsia in the central 5° of visual field. A total of 81% (67/83) of participants completed a hyperacuity task where they aligned eight dots in the shape of a square, and 64% (32/50) of participants with monocular distortion completed a spatial alignment task using dichoptic stimuli. Ten controls completed all tasks. Horizontal and vertical distortion magnitudes were calculated for each of the three assessments. Distortion magnitudes were significantly higher in patients than controls in all assessments. There was no significant difference in magnitude of distortion across different macular diseases. There were no significant correlations between overall magnitude of distortion among any of the three measures and no significant correlations in localized measures of distortion. Three alternative quantifications of monocular spatial distortion in the central visual field generated uncorrelated estimates of visual distortion. It is therefore unlikely that metamorphopsia is caused solely by retinal displacement, but instead involves additional top-down information, knowledge about the scene, and perhaps, cortical reorganization. Copyright 2015 The Association for Research in Vision and Ophthalmology, Inc.
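    One plausible way to turn the eight-dot square-alignment task into the horizontal and vertical distortion magnitudes reported above is an RMS displacement of the placed dots from their ideal positions. The coordinates and the RMS choice below are assumptions for illustration, not the authors' published procedure.

```python
import math

# Sketch: horizontal/vertical distortion magnitudes from the eight-dot
# alignment task as RMS deviation of placed dots from ideal positions.
# Coordinates (in degrees of visual angle) are hypothetical.

def distortion_magnitudes(ideal, placed):
    """Return (horizontal, vertical) RMS displacement in degrees."""
    dx = [p[0] - i[0] for i, p in zip(ideal, placed)]
    dy = [p[1] - i[1] for i, p in zip(ideal, placed)]

    def rms(v):
        return math.sqrt(sum(e * e for e in v) / len(v))

    return rms(dx), rms(dy)

# Ideal square (corners and edge midpoints) within the central 5 deg field
ideal = [(-2, -2), (0, -2), (2, -2), (-2, 0),
         (2, 0), (-2, 2), (0, 2), (2, 2)]
placed = [(x + 0.1, y) for x, y in ideal]   # uniform 0.1 deg rightward shift
h, v = distortion_magnitudes(ideal, placed)
```

    Separating horizontal and vertical components mirrors the paper's reporting of distortion magnitudes along each axis.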

  7. Integrating experimental and numerical methods for a scenario-based quantitative assessment of subsurface energy storage options

    NASA Astrophysics Data System (ADS)

    Kabuth, Alina; Dahmke, Andreas; Hagrey, Said Attia al; Berta, Márton; Dörr, Cordula; Koproch, Nicolas; Köber, Ralf; Köhn, Daniel; Nolde, Michael; Tilmann Pfeiffer, Wolf; Popp, Steffi; Schwanebeck, Malte; Bauer, Sebastian

    2016-04-01

    Within the framework of the transition to renewable energy sources ("Energiewende"), the German government defined the target of producing 60 % of the final energy consumption from renewable energy sources by the year 2050. However, renewable energies are subject to natural fluctuations. Energy storage can help to buffer the resulting time shifts between production and demand. Subsurface geological structures provide large potential capacities for energy stored in the form of heat or gas on daily to seasonal time scales. In order to explore this potential sustainably, the possible induced effects of energy storage operations have to be quantified for both specified normal operation and events of failure. The ANGUS+ project therefore integrates experimental laboratory studies with numerical approaches to assess subsurface energy storage scenarios and monitoring methods. Subsurface storage options for gas, i.e. hydrogen, synthetic methane and compressed air in salt caverns or porous structures, as well as subsurface heat storage are investigated with respect to site prerequisites, storage dimensions, induced effects, monitoring methods and integration into spatial planning schemes. The conceptual interdisciplinary approach of the ANGUS+ project towards the integration of subsurface energy storage into a sustainable subsurface planning scheme is presented here, and this approach is then demonstrated using the examples of two selected energy storage options: Firstly, the option of seasonal heat storage in a shallow aquifer is presented. Coupled thermal and hydraulic processes induced by periodic heat injection and extraction were simulated in the open-source numerical modelling package OpenGeoSys. Situations of specified normal operation as well as cases of failure in operational storage with leaking heat transfer fluid are considered. Bench-scale experiments provided parameterisations of temperature dependent changes in shallow groundwater hydrogeochemistry. As a

  8. Local soil quality assessment of north-central Namibia: integrating farmers' and technical knowledge

    NASA Astrophysics Data System (ADS)

    Prudat, Brice; Bloemertz, Lena; Kuhn, Nikolaus J.

    2018-02-01

    Soil degradation is a major threat for farmers of semi-arid north-central Namibia. Soil conservation practices can be promoted by the development of soil quality (SQ) evaluation toolboxes that provide ways to evaluate soil degradation. However, such toolboxes must be adapted to local conditions to reach farmers. Based on qualitative (interviews and soil descriptions) and quantitative (laboratory analyses) data, we developed a set of SQ indicators relevant for our study area that integrates farmers' field experiences (FFEs) and technical knowledge. We suggest using participatory mapping to delineate soil units (Oshikwanyama soil units, KwSUs) based on FFEs, which highlight mostly soil properties that integrate long-term productivity and soil hydrological characteristics (i.e. internal SQ). The actual SQ evaluation of a location depends on the KwSU described and is thereafter assessed by field soil texture (i.e. chemical fertility potential) and by soil colour shade (i.e. SOC status). This three-level information aims to reveal SQ improvement potential by comparing, for any location, (a) estimated clay content against the median clay content (specific to the KwSU) and (b) soil organic status against calculated optimal values (dependent on clay content). Combining farmers' and technical assessments accumulates the advantages of both systems of knowledge, namely the integrated long-term knowledge of the farmers and a short- and medium-term assessment of SQ status. The toolbox is a suggestion for evaluating SQ and aims to help farmers, rural development planners and researchers from all fields of study understand SQ issues in north-central Namibia. The suggested SQ toolbox is adapted to a restricted area of north-central Namibia, but similar tools could be developed in most areas where small-scale agriculture prevails.

  9. QUANTITATIVE CANCER RISK ASSESSMENT METHODOLOGY USING SHORT-TERM GENETIC BIOASSAYS: THE COMPARATIVE POTENCY METHOD

    EPA Science Inventory

    Quantitative risk assessment is fraught with many uncertainties. The validity of the assumptions underlying the methods employed are often difficult to test or validate. Cancer risk assessment has generally employed either human epidemiological data from relatively high occupatio...

  10. Assessment and Mission Planning Capability For Quantitative Aerothermodynamic Flight Measurements Using Remote Imaging

    NASA Technical Reports Server (NTRS)

    Horvath, Thomas; Splinter, Scott; Daryabeigi, Kamran; Wood, William; Schwartz, Richard; Ross, Martin

    2008-01-01

    High resolution calibrated infrared imagery of vehicles during hypervelocity atmospheric entry or sustained hypersonic cruise has the potential to provide flight data on the distribution of surface temperature and the state of the airflow over the vehicle. In the early 1980s NASA sought to obtain high spatial resolution infrared imagery of the Shuttle during entry. Despite mission execution with a technically rigorous pre-planning capability, the single airborne optical system for this attempt was considered developmental and the scientific return was marginal. In 2005 the Space Shuttle Program again sponsored an effort to obtain imagery of the Orbiter. Imaging requirements were targeted towards Shuttle ascent; companion requirements for entry did not exist. The engineering community was allowed to define observation goals and incrementally demonstrate key elements of a quantitative, spatially resolved measurement capability over a series of flights. These imaging opportunities were extremely beneficial and clearly demonstrated the capability to capture infrared imagery with mature and operational assets of the US Navy and the Missile Defense Agency. While successful, the usefulness of the imagery was, from an engineering perspective, limited. These limitations were mainly associated with uncertainties regarding operational aspects of data acquisition, which in turn arose from limited pre-flight mission planning capability and a poor understanding of several factors, including the infrared signature of the Shuttle, optical hardware limitations, atmospheric effects and detector response characteristics. Operational details of sensor configuration, such as detector integration time and tracking system algorithms, were carried out ad hoc (best practices), which led to a low probability of target acquisition and detector saturation. Leveraging from the qualified success during Return-to-Flight, the NASA Engineering and Safety Center sponsored an

  11. Assessing the integrity of spillway foundations

    NASA Astrophysics Data System (ADS)

    Hsu, Keng-Tsang; Chiang, Chih-Hung; Cheng, Chia-Chi

    2017-02-01

    The erosion under a spillway can be a long-term issue that threatens the structural integrity of a water reservoir. The spillways under investigation were suspected to be defective; they had been commissioned in 1987 and 1939, respectively. Potholes and subsurface cavities were confirmed in the safety assessment using various NDT techniques, including ground penetrating radar (GPR) and impact echo. The GPR inspection was able to differentiate the intact region from the cavities under the concrete slabs. The impact echo results and associated analyses provided further evidence of the inferior condition of the soil under the concrete slabs. The engineering team designed and executed the repair projects based on the conclusions of the integrity assessment. Repetitive GPR scans were also carried out after the rehabilitation of the spillways. These not only allowed the quality of the repair to be evaluated but also provided a baseline record for long-term condition assessment of the spillway and the reservoir.

  12. Quantitative Assessment of Breast Cosmetic Outcome After Whole-Breast Irradiation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reddy, Jay P.; Lei, Xiudong; Huang, Sheng-Cheng

    Purpose: To measure, by quantitative analysis of digital photographs, breast cosmetic outcome within the setting of a randomized trial of conventionally fractionated (CF) and hypofractionated (HF) whole-breast irradiation (WBI), to identify how quantitative cosmesis metrics were associated with patient- and physician-reported cosmesis and whether they differed by treatment arm. Methods and Materials: From 2011 to 2014, 287 women aged ≥40 with ductal carcinoma in situ or early invasive breast cancer were randomized to HF-WBI (42.56 Gy/16 fractions [fx] + 10-12.5 Gy/4-5 fx boost) or CF-WBI (50 Gy/25 fx + 10-14 Gy/5-7 fx). At 1 year after treatment we collected digital photographs, patient-reported cosmesis using the Breast Cancer Treatment and Outcomes Scale, and physician-reported cosmesis using the Radiation Therapy Oncology Group scale. Six quantitative measures of breast symmetry, labeled M1-M6, were calculated from anteroposterior digital photographs. For each measure, values closer to 1 imply greater symmetry, and values closer to 0 imply greater asymmetry. Associations between M1-M6 and patient- and physician-reported cosmesis and treatment arm were evaluated using the Kruskal-Wallis test. Results: Among 245 evaluable patients, patient-reported cosmesis was strongly associated with M1 (vertical symmetry measure) (P<.01). Physician-reported cosmesis was similarly correlated with M1 (P<.01) and also with M2 (vertical symmetry, P=.01) and M4 (horizontal symmetry, P=.03). At 1 year after treatment, HF-WBI resulted in better values of M2 (P=.02) and M3 (P<.01) than CF-WBI; treatment arm was not significantly associated with M1, M4, M5, or M6 (P≥.12). Conclusions: Quantitative assessment of breast photographs reveals similar to improved cosmetic outcome with HF-WBI compared with CF-WBI 1 year after treatment. Assessing cosmetic outcome using these measures could be useful for future comparative effectiveness studies and outcome reporting.

  13. Quantitative Assessment of Breast Cosmetic Outcome After Whole-Breast Irradiation.

    PubMed

    Reddy, Jay P; Lei, Xiudong; Huang, Sheng-Cheng; Nicklaus, Krista M; Fingeret, Michelle C; Shaitelman, Simona F; Hunt, Kelly K; Buchholz, Thomas A; Merchant, Fatima; Markey, Mia K; Smith, Benjamin D

    2017-04-01

    To measure, by quantitative analysis of digital photographs, breast cosmetic outcome within the setting of a randomized trial of conventionally fractionated (CF) and hypofractionated (HF) whole-breast irradiation (WBI), to identify how quantitative cosmesis metrics were associated with patient- and physician-reported cosmesis and whether they differed by treatment arm. From 2011 to 2014, 287 women aged ≥40 with ductal carcinoma in situ or early invasive breast cancer were randomized to HF-WBI (42.56 Gy/16 fractions [fx] + 10-12.5 Gy/4-5 fx boost) or CF-WBI (50 Gy/25 fx + 10-14 Gy/5-7 fx). At 1 year after treatment we collected digital photographs, patient-reported cosmesis using the Breast Cancer Treatment and Outcomes Scale, and physician-reported cosmesis using the Radiation Therapy Oncology Group scale. Six quantitative measures of breast symmetry, labeled M1-M6, were calculated from anteroposterior digital photographs. For each measure, values closer to 1 imply greater symmetry, and values closer to 0 imply greater asymmetry. Associations between M1-M6 and patient- and physician-reported cosmesis and treatment arm were evaluated using the Kruskal-Wallis test. Among 245 evaluable patients, patient-reported cosmesis was strongly associated with M1 (vertical symmetry measure) (P<.01). Physician-reported cosmesis was similarly correlated with M1 (P<.01) and also with M2 (vertical symmetry, P=.01) and M4 (horizontal symmetry, P=.03). At 1 year after treatment, HF-WBI resulted in better values of M2 (P=.02) and M3 (P<.01) than CF-WBI; treatment arm was not significantly associated with M1, M4, M5, or M6 (P≥.12). Quantitative assessment of breast photographs reveals similar to improved cosmetic outcome with HF-WBI compared with CF-WBI 1 year after treatment. Assessing cosmetic outcome using these measures could be useful for future comparative effectiveness studies and outcome reporting. Copyright © 2016 Elsevier Inc. All rights reserved.
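    The abstract does not spell out how M1-M6 are computed, so the sketch below shows only the general shape of such a metric: a min/max ratio of paired left/right distances that equals 1.0 for perfect symmetry and falls toward 0 with asymmetry, matching the stated value convention. The pixel distances are invented.

```python
# Toy symmetry metric of the M1-M6 type: ratio of a paired left/right
# distance measured from a photograph (e.g. a landmark-to-landmark length
# in pixels). The specific landmarks and values are hypothetical.

def symmetry_measure(left_px: float, right_px: float) -> float:
    """min/max ratio: 1.0 = perfect symmetry, toward 0 = asymmetry."""
    if left_px <= 0 or right_px <= 0:
        raise ValueError("distances must be positive")
    return min(left_px, right_px) / max(left_px, right_px)

# Hypothetical left/right landmark distances from one anteroposterior photo
m_vertical = symmetry_measure(412.0, 430.0)
```

    A min/max ratio is order-independent, which is convenient because it does not matter which side is larger.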

  14. A Quantitative Needs Assessment Technique for Cross-Cultural Work Adjustment Training.

    ERIC Educational Resources Information Center

    Selmer, Lyn

    2000-01-01

    A study of 67 Swedish expatriate bosses and 104 local Hong Kong middle managers tested a quantitative needs assessment technique measuring work values. Two-thirds of middle managers' work values were not correctly estimated by their bosses, especially instrumental values (pay, benefits, security, working hours and conditions), indicating a need…

  15. Multimodality Data Integration in Epilepsy

    PubMed Central

    Muzik, Otto; Chugani, Diane C.; Zou, Guangyu; Hua, Jing; Lu, Yi; Lu, Shiyong; Asano, Eishi; Chugani, Harry T.

    2007-01-01

    An important goal of software development in the medical field is the design of methods which are able to integrate information obtained from various imaging and nonimaging modalities into a cohesive framework in order to understand the results of qualitatively different measurements in a larger context. Moreover, it is essential to assess the various features of the data quantitatively so that relationships in anatomical and functional domains between complementing modalities can be expressed mathematically. This paper presents a clinically feasible software environment for the quantitative assessment of the relationship among biochemical functions as assessed by PET imaging and electrophysiological parameters derived from intracranial EEG. Based on the developed software tools, quantitative results obtained from individual modalities can be merged into a data structure allowing a consistent framework for advanced data mining techniques and 3D visualization. Moreover, an effort was made to derive quantitative variables (such as the spatial proximity index, SPI) characterizing the relationship between complementing modalities on a more generic level as a prerequisite for efficient data mining strategies. We describe the implementation of this software environment in twelve children (mean age 5.2 ± 4.3 years) with medically intractable partial epilepsy who underwent both high-resolution structural MR and functional PET imaging. Our experiments demonstrate that our approach will lead to a better understanding of the mechanisms of epileptogenesis and might ultimately have an impact on treatment. Moreover, our software environment holds promise to be useful in many other neurological disorders, where integration of multimodality data is crucial for a better understanding of the underlying disease mechanisms. PMID:17710251

  16. Study on quantitative risk assessment model of the third party damage for natural gas pipelines based on fuzzy comprehensive assessment

    NASA Astrophysics Data System (ADS)

    Qiu, Zeyang; Liang, Wei; Wang, Xue; Lin, Yang; Zhang, Meng

    2017-05-01

    As an important part of the national energy supply system, natural gas transmission pipelines can cause serious environmental pollution and loss of life and property in case of accident. Third party damage is one of the most significant causes of natural gas pipeline system accidents, and an effective quantitative risk assessment model of third party damage is therefore important for reducing the number of pipeline operation accidents. Because third party damage accidents are characterized by diversity, complexity and uncertainty, this paper establishes a quantitative risk assessment model of third party damage based on the Analytic Hierarchy Process (AHP) and Fuzzy Comprehensive Evaluation (FCE). Firstly, the risk sources of third party damage are identified; the weight of each factor is then determined via an improved AHP, and finally the importance of each factor is calculated with the fuzzy comprehensive evaluation model. The results show that the quantitative risk assessment model is suitable for third party damage to natural gas pipelines, and improvement measures can be put forward to avoid accidents based on the importance of each factor.
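    The AHP-plus-FCE chain described above can be sketched in a few lines: factor weights from a pairwise comparison matrix (here via the column-normalization approximation to the principal eigenvector), then a fuzzy evaluation vector as the weighted combination of membership grades. The comparison matrix and membership values are illustrative, not the paper's data.

```python
# Sketch of AHP weighting followed by fuzzy comprehensive evaluation (FCE).
# The 3x3 comparison matrix and membership matrix are invented examples.

def ahp_weights(pairwise):
    """Approximate the AHP priority vector by averaging normalized columns."""
    n = len(pairwise)
    col_sums = [sum(row[j] for row in pairwise) for j in range(n)]
    return [sum(pairwise[i][j] / col_sums[j] for j in range(n)) / n
            for i in range(n)]

def fce(weights, membership):
    """Fuzzy evaluation vector b = w . R over the risk grades."""
    grades = len(membership[0])
    return [sum(w * row[g] for w, row in zip(weights, membership))
            for g in range(grades)]

# Three hypothetical third-party-damage factors compared pairwise (1-9 scale)
A = [[1,     3,   5],
     [1 / 3, 1,   2],
     [1 / 5, 1 / 2, 1]]
w = ahp_weights(A)          # factor weights, sum to 1

# Membership of each factor in the risk grades (low, medium, high)
R = [[0.1, 0.3, 0.6],
     [0.4, 0.4, 0.2],
     [0.6, 0.3, 0.1]]
b = fce(w, R)               # fuzzy evaluation over the three grades
```

    A full implementation would also check the consistency ratio of the comparison matrix before accepting the weights.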

  17. Food Consumption and Handling Survey for Quantitative Microbiological Consumer Phase Risk Assessments.

    PubMed

    Chardon, Jurgen; Swart, Arno

    2016-07-01

    In the consumer phase of a typical quantitative microbiological risk assessment (QMRA), the mathematical equations reveal where data are lacking. To acquire useful data, we designed a food consumption and food handling survey (2,226 respondents) for QMRA applications, aimed especially at obtaining quantitative data. For a broad spectrum of food products, the survey covered the following topics: processing status at retail, consumer storage, preparation, and consumption. Questions were designed to facilitate distribution fitting. In the statistical analysis, special attention was given to selecting the most adequate distribution to describe the data. Bootstrap procedures were used to describe uncertainty. The final result is a coherent quantitative consumer-phase food survey and parameter estimates for food handling and consumption practices in The Netherlands, including variation over individuals and uncertainty estimates.
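    The fit-then-bootstrap workflow the survey describes can be sketched on fabricated data: fit a distribution to (hypothetical) refrigerator storage times, then bootstrap a summary statistic to describe its uncertainty. The lognormal choice, the data values, and the percentile interval below are assumptions for illustration, not results from the survey.

```python
import math
import random
import statistics

# Hypothetical storage-time responses (days) standing in for survey data
random.seed(42)
storage_days = [1, 2, 2, 3, 3, 3, 4, 5, 7, 10]

# Lognormal fit by moment matching on the log scale
logs = [math.log(x) for x in storage_days]
mu, sigma = statistics.mean(logs), statistics.stdev(logs)

# Nonparametric bootstrap of the sample mean (percentile interval)
boot_means = sorted(
    statistics.mean(random.choices(storage_days, k=len(storage_days)))
    for _ in range(2000)
)
ci_low, ci_high = boot_means[49], boot_means[1949]   # ~95% interval
```

    In a real QMRA pipeline, several candidate distributions would be fitted and compared (e.g. by goodness-of-fit criteria) before one is selected, as the abstract indicates.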

  18. Integrated track stability assessment and monitoring system (ITSAMS).

    DOT National Transportation Integrated Search

    2006-10-01

    The overall objective of this project is to continue the development of remote sensing technologies that can be integrated and deployed in a mobile inspection vehicle, i.e., the Integrated Track Stability Assessment and Monitoring System (ITSAMS).

  19. The value of assessing pulmonary venous flow velocity for predicting severity of mitral regurgitation: A quantitative assessment integrating left ventricular function

    NASA Technical Reports Server (NTRS)

    Pu, M.; Griffin, B. P.; Vandervoort, P. M.; Stewart, W. J.; Fan, X.; Cosgrove, D. M.; Thomas, J. D.

    1999-01-01

    Although alteration in pulmonary venous flow has been reported to relate to mitral regurgitant severity, it is also known to vary with left ventricular (LV) systolic and diastolic dysfunction. There are few data relating pulmonary venous flow to quantitative indexes of mitral regurgitation (MR). The object of this study was to assess quantitatively the accuracy of pulmonary venous flow for predicting MR severity by using transesophageal echocardiographic measurement in patients with variable LV dysfunction. This study consisted of 73 patients undergoing heart surgery with mild to severe MR. Regurgitant orifice area (ROA), regurgitant stroke volume (RSV), and regurgitant fraction (RF) were obtained by quantitative transesophageal echocardiography and proximal isovelocity surface area. Both left and right upper pulmonary venous flow velocities were recorded and their patterns classified by the ratio of systolic to diastolic velocity: normal (>/=1), blunted (<1), and systolic reversal (<0). Twenty-three percent of patients had discordant patterns between the left and right veins. When the most abnormal patterns either in the left or right vein were used for analysis, the ratio of peak systolic to diastolic flow velocity was negatively correlated with ROA (r = -0.74, P <.001), RSV (r = -0.70, P <.001), and RF (r = -0.66, P <.001) calculated by the Doppler thermodilution method; values were r = -0.70, r = -0.67, and r = -0.57, respectively (all P <.001), for indexes calculated by the proximal isovelocity surface area method. The sensitivity, specificity, and predictive values of the reversed pulmonary venous flow pattern for detecting a large ROA (>0.3 cm(2)) were 69%, 98%, and 97%, respectively. The sensitivity, specificity, and predictive values of the normal pulmonary venous flow pattern for detecting a small ROA (<0.3 cm(2)) were 60%, 96%, and 94%, respectively. However, the blunted pattern had low sensitivity (22%), specificity (61%), and predictive values (30
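    The sensitivity, specificity, and predictive values quoted above follow from standard 2x2 confusion-table arithmetic, sketched here with invented counts rather than the study's cohort:

```python
# Generic diagnostic-test statistics from a 2x2 confusion table.
# tp/fp/fn/tn counts below are hypothetical, not the study's data.

def diagnostic_stats(tp, fp, fn, tn):
    """Return (sensitivity, specificity, positive predictive value)."""
    sensitivity = tp / (tp + fn)   # true positives among all diseased
    specificity = tn / (tn + fp)   # true negatives among all non-diseased
    ppv = tp / (tp + fp)           # true positives among all test positives
    return sensitivity, specificity, ppv

# e.g. a reversed-flow pattern flagging a large ROA in a hypothetical cohort
sens, spec, ppv = diagnostic_stats(tp=9, fp=1, fn=4, tn=40)
```

    Note that PPV, unlike sensitivity and specificity, depends on disease prevalence in the cohort, which is one reason the reported predictive values should not be transferred to populations with a different mix of regurgitation severity.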

  20. Utilization of Integrated Assessment Modeling for determining geologic CO2 storage security

    NASA Astrophysics Data System (ADS)

    Pawar, R.

    2017-12-01

    Geologic storage of carbon dioxide (CO2) has been extensively studied as a potential technology to mitigate atmospheric concentrations of CO2. Multiple international research and development efforts, large-scale demonstrations, and commercial projects are helping advance the technology. One critical area of active investigation is prediction of long-term CO2 storage security and risks. A quantitative methodology for predicting a storage site's long-term performance is critical for the key decisions necessary to deploy commercial-scale projects, which will require quantitative assessments of potential long-term liabilities. These predictions are challenging because they require simulating CO2 and in-situ fluid movement and interactions through the primary storage reservoir, potential leakage pathways (such as wellbores and faults), and shallow resources such as groundwater aquifers, while accounting for the inherent variability and uncertainty of geologic sites. This talk provides an overview of an approach based on integrated assessment modeling (IAM) to predict the long-term performance of a geologic storage site, including the storage reservoir, potential leakage pathways, and shallow groundwater aquifers. The approach uses reduced order models (ROMs) that capture the complex physical and chemical interactions resulting from CO2 movement while remaining computationally efficient. Applicability of the approach is demonstrated through examples focused on key storage security questions: What is the probability of CO2 leakage from a storage reservoir? How does storage security vary across geologic environments and operational conditions? How do site parameter variability and uncertainty affect storage security?
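    As a rough illustration of the IAM idea, a reduced-order leakage model can be sampled by Monte Carlo to turn parameter uncertainty into a leakage probability. The toy ROM, the threshold, and the input distributions below are invented placeholders, not the models used in the talk:

```python
import random


def leaks(perm, overpressure):
    """Toy reduced-order model (ROM): leakage occurs when effective wellbore
    permeability and reservoir overpressure jointly exceed a threshold
    (hypothetical criterion, dimensionless units)."""
    return perm * overpressure > 1.0


def prob_leakage(n=20_000, seed=7):
    """Monte Carlo estimate of leakage probability under site uncertainty."""
    rng = random.Random(seed)
    hits = sum(
        leaks(rng.lognormvariate(0.0, 0.5),  # uncertain site parameter
              rng.uniform(0.5, 1.5))         # uncertain operating condition
        for _ in range(n)
    )
    return hits / n


p_leak = prob_leakage()
```

    In a real IAM, the boolean toy model would be replaced by ROMs for the reservoir, each leakage pathway, and the aquifer, chained together per realization.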

  1. Quantitative cerebral perfusion assessment using microscope-integrated analysis of intraoperative indocyanine green fluorescence angiography versus positron emission tomography in superficial temporal artery to middle cerebral artery anastomosis.

    PubMed

    Kobayashi, Shinya; Ishikawa, Tatsuya; Tanabe, Jun; Moroi, Junta; Suzuki, Akifumi

    2014-01-01

    Intraoperative qualitative indocyanine green (ICG) angiography has been used in cerebrovascular surgery. Hyperperfusion may lead to neurological complications after superficial temporal artery to middle cerebral artery (STA-MCA) anastomosis. The purpose of this study is to quantitatively evaluate intraoperative cerebral perfusion using microscope-integrated dynamic ICG fluorescence analysis, and to assess whether this value predicts hyperperfusion syndrome (HPS) after STA-MCA anastomosis. Ten patients undergoing STA-MCA anastomosis due to unilateral major cerebral artery occlusive disease were included. Ten patients with normal cerebral perfusion served as controls. The ICG transit curve from six regions of interest (ROIs) on the cortex, corresponding to ROIs on positron emission tomography (PET) study, was recorded. Maximum intensity (IMAX), cerebral blood flow index (CBFi), rise time (RT), and time to peak (TTP) were evaluated. RT/TTP, but not IMAX or CBFi, could differentiate between control and study subjects. RT/TTP correlated (|r| = 0.534-0.807; P < 0.01) with mean transit time (MTT)/MTT ratio in the ipsilateral to contralateral hemisphere by PET study. Bland-Altman analysis showed a wide limit of agreement between RT and MTT and between TTP and MTT. The ratio of RT before and after bypass procedures was significantly lower in patients with postoperative HPS than in patients without postoperative HPS (0.60 ± 0.032 and 0.80 ± 0.056, respectively; P = 0.017). The ratio of TTP was also significantly lower in patients with postoperative HPS than in patients without postoperative HPS (0.64 ± 0.081 and 0.85 ± 0.095, respectively; P = 0.017). Time-dependent intraoperative parameters from the ICG transit curve provide quantitative information regarding cerebral circulation time with quality and utility comparable to information obtained by PET. These parameters may help predict the occurrence of postoperative HPS.
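    The time-dependent parameters can be extracted from a sampled ICG intensity curve roughly as follows. This is a sketch only: the study's exact definitions of RT and TTP are not given here, so RT is taken as the 10-90% rise interval, an assumption:

```python
def icg_timing(t, intensity):
    """Time to peak (TTP) and 10-90% rise time (RT) from an intensity curve,
    given matching lists of time points and intensities."""
    imax = max(intensity)
    ttp = t[intensity.index(imax)] - t[0]
    # First samples reaching 10% and 90% of the maximum intensity.
    t10 = next(ti for ti, y in zip(t, intensity) if y >= 0.1 * imax)
    t90 = next(ti for ti, y in zip(t, intensity) if y >= 0.9 * imax)
    return t90 - t10, ttp
```

    The quantity of interest above is then the ratio of RT (or TTP) computed before and after the bypass procedure.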

  2. A quantitative microbial risk assessment for center pivot irrigation of dairy wastewaters

    USDA-ARS?s Scientific Manuscript database

    In the western United States where livestock wastewaters are commonly land applied, there are concerns over individuals being exposed to airborne pathogens. In response, a quantitative microbial risk assessment (QMRA) was performed to estimate infectious risks from inhaling pathogens aerosolized dur...

  3. Climate Dynamics and Experimental Prediction (CDEP) and Regional Integrated Science Assessments (RISA) Programs at NOAA Office of Global Programs

    NASA Astrophysics Data System (ADS)

    Bamzai, A.

    2003-04-01

    This talk will highlight science and application activities of the CDEP and RISA programs at NOAA OGP. CDEP, through a set of Applied Research Centers (ARCs), supports NOAA's program of quantitative assessments and predictions of global climate variability and its regional implications on time scales of seasons to centuries. The RISA program consolidates results from ongoing disciplinary process research under an integrative framework. Examples of joint CDEP-RISA activities will be presented. Future directions and programmatic challenges will also be discussed.

  4. Quantitative 3D breast magnetic resonance imaging fibroglandular tissue analysis and correlation with qualitative assessments: a feasibility study.

    PubMed

    Ha, Richard; Mema, Eralda; Guo, Xiaotao; Mango, Victoria; Desperito, Elise; Ha, Jason; Wynn, Ralph; Zhao, Binsheng

    2016-04-01

    The amount of fibroglandular tissue (FGT) has been linked to breast cancer risk based on mammographic density studies. Currently, the qualitative assessment of FGT on mammogram (MG) and magnetic resonance imaging (MRI) is prone to intra- and inter-observer variability. The purpose of this study was to develop an objective quantitative FGT measurement tool for breast MRI that could provide significant clinical value. An IRB-approved study was performed. Sixty breast MRI cases with qualitative assessment of mammographic breast density and MRI FGT were randomly selected for quantitative analysis from routine breast MRIs performed at our institution from 1/2013 to 12/2014. Blinded to the qualitative data, whole-breast and FGT contours were delineated on T1-weighted pre-contrast sagittal images using an in-house, proprietary segmentation algorithm that combines region-based active contours and a level set approach. FGT (%) was calculated as [segmented FGT volume (mm³) / segmented whole-breast volume (mm³)] × 100. Statistical correlation analysis was performed between quantified FGT (%) on MRI and qualitative assessments of mammographic breast density and MRI FGT. There was a significant positive correlation between quantitative MRI FGT assessment and qualitative MRI FGT (r=0.809, n=60, P<0.001) and mammographic density assessment (r=0.805, n=60, P<0.001). There was a significant correlation between qualitative MRI FGT assessment and mammographic density assessment (r=0.725, n=60, P<0.001). The four qualitative assessment categories of FGT corresponded to calculated mean quantitative FGT (%) of 4.61% (95% CI, 0-12.3%), 8.74% (7.3-10.2%), 18.1% (15.1-21.1%), and 37.4% (29.5-45.3%). Quantitative measures of FGT (%) were computed with data derived from breast MRI and correlated significantly with conventional qualitative assessments. This quantitative technique may prove to be a valuable tool in clinical use by providing computer-generated standardized
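    The volumetric ratio itself is simple arithmetic; a minimal sketch, assuming segmentation has already produced voxel counts (the voxel spacing and counts below are hypothetical, not the study's pipeline):

```python
def volume_mm3(voxel_count, spacing_mm=(1.0, 1.0, 1.0)):
    """Convert a segmentation's voxel count to mm^3 given voxel spacing."""
    sx, sy, sz = spacing_mm
    return voxel_count * sx * sy * sz


def fgt_percent(fgt_volume_mm3, breast_volume_mm3):
    """FGT (%) = segmented FGT volume / segmented whole-breast volume x 100."""
    return 100.0 * fgt_volume_mm3 / breast_volume_mm3
```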

  5. The Quantitative Reasoning for College Science (QuaRCS) Assessment in non-Astro 101 Courses

    NASA Astrophysics Data System (ADS)

    Kirkman, Thomas W.; Jensen, Ellen

    2016-06-01

    The innumeracy of American students and adults is a much-lamented educational problem. The quantitative reasoning skills of college students may be particularly addressed and improved in "general education" science courses like Astro 101. Demonstrating improvement requires a standardized instrument. Among the non-proprietary instruments, the Quantitative Literacy and Reasoning Assessment (QLRA) [1] and the Quantitative Reasoning for College Science (QuaRCS) Assessment [2] stand out. Follette et al. developed the QuaRCS in the context of Astro 101 at the University of Arizona. We report on QuaRCS results in different contexts: pre-med physics and pre-nursing microbiology at a liberal arts college. We report on the mismatch between students' contemporaneous reports of a question's difficulty and the actual probability of success. We report correlations between QuaRCS and other assessments of overall student performance in the class. We report differences in attitude towards mathematics in these two different but health-related student populations. [1] QLRA, Gaze et al., 2014, DOI: http://dx.doi.org/10.5038/1936-4660.7.2.4 [2] QuaRCS, Follette et al., 2015, DOI: http://dx.doi.org/10.5038/1936-4660.8.2.2

  6. Who collaborates and why: Assessment and diagnostic of governance network integration for salmon restoration in Puget Sound, USA.

    PubMed

    Sayles, Jesse S; Baggio, Jacopo A

    2017-01-15

    Governance silos are settings in which different organizations work in isolation and avoid sharing information and strategies. Silos are a fundamental challenge for environmental planning and problem solving, which generally require collaboration. Silos can be overcome by creating governance networks. Studying the structure and function of these networks is important for understanding how to create institutional arrangements that can respond to the biophysical dynamics of a specific natural resource system (i.e., social-ecological, or institutional, fit). Using the case of salmon restoration in a sub-basin of Puget Sound, USA, we assess network integration, considering three different reasons for network collaboration (i.e., mandated, funded, and shared-interest relationships), and analyze how these collaboration types relate to productivity based on practitioners' assessments. We also illustrate how specific, targeted network interventions might enhance the network. To do so, we use a mixed-methods approach that combines quantitative social network analysis (SNA) and qualitative interview analysis. Overall, the sub-basin's governance network is fairly well integrated, but several concerning gaps exist. Funded, mandated, and shared-interest relationships lead to different network patterns. Mandated relationships are associated with lower productivity than shared-interest relationships, highlighting the benefit of genuine collaboration in collaborative watershed governance. Lastly, comparison of the quantitative and qualitative data strengthens recent calls to incorporate geographic space and the role of individual actors versus organizational culture into natural resource governance research using SNA.

  7. Understanding outbreaks of waterborne infectious disease: quantitative microbial risk assessment vs. epidemiology

    USDA-ARS?s Scientific Manuscript database

    Drinking water contaminated with microbial pathogens can cause outbreaks of infectious disease, and these outbreaks are traditionally studied using epidemiologic methods. Quantitative microbial risk assessment (QMRA) can predict – and therefore help prevent – such outbreaks, but it has never been r...

  8. QUANTITATIVE ASSESSMENT OF CORAL DISEASES IN THE FLORIDA KEYS: STRATEGY AND METHODOLOGY

    EPA Science Inventory

    Most studies of coral disease have focused on the incidence of a single disease within a single location. Our overall objective is to use quantitative assessments to characterize annual patterns in the distribution and frequency of scleractinian and gorgonian coral diseases over ...

  9. Using an integrated approach to the assessment of the psychosocial work environment: the case of a major hospital in northern Italy.

    PubMed

    Lanfranchi, Fiorella; Alaimo, Sara; Conway, P M

    2014-01-01

    In 2010, Italian regulatory guidelines were issued consisting of a stepwise procedure for the assessment and management of work-related stress. However, research that empirically examines whether this procedure is effective in accurately identifying critical psychosocial factors and informing risk management is scarce. Our aim was to examine the differential sensitivity of two approaches to risk assessment: the first based on objective instruments only, the second an integrated approach combining different methods and theoretical perspectives. We examined a sample of 306 healthcare employees in a large hospital in northern Italy, using a series of tools, both quantitative (an observational checklist and the HSE-IT and MOHQ questionnaires) and qualitative (focus groups). Using instrument-specific reference values, we then compared risk profiles between different homogeneous groups within the institution. The psychosocial work environment appeared far more positive under the first approach to risk assessment than under the second. The latter approach was also more sensitive in detecting between-group differences in risk profiles. Furthermore, the focus groups returned a more context-specific picture of the psychosocial work environment. Finally, going beyond the emphasis on negative working conditions inherent in the other quantitative instruments, the MOHQ also allowed identification of health-promoting factors in need of improvement. Although more research is needed to confirm our findings, the present study suggests that using an integrated approach to assess the psychosocial work environment may be the most effective way to accurately identify risk factors and support the management process.

  10. Advantages of integrated and sustainability based assessment for metabolism based strategic planning of urban water systems.

    PubMed

    Behzadian, Kourosh; Kapelan, Zoran

    2015-09-15

    Despite providing water-related services as the primary purpose of an urban water system (UWS), all relevant activities require capital investment and operational expenditure, consume resources (e.g., materials and chemicals), and may increase negative environmental impacts (e.g., contaminant discharge, emissions to water and air). Performance assessment of such a metabolic system requires a holistic approach that encompasses the various system elements and criteria. This paper analyses the impact of integrating UWS components on metabolism-based performance assessment for future planning using a number of intervention strategies. It also explores the importance of sustainability-based criteria in the assessment of long-term planning. The two assessment approaches analysed here are: (1) planning for only the water supply system (WSS) as a part of the UWS, and (2) planning for an integrated UWS including potable water, stormwater, wastewater, and water recycling. The WaterMet² model is used to simulate metabolic-type processes in the UWS and calculate quantitative performance indicators. The analysis is demonstrated on the problem of strategic-level planning of a real-world UWS to which optional intervention strategies are applied. The resulting performance is assessed using multiple criteria of both conventional and sustainability type, and the optional intervention strategies are then ranked using the compromise programming method. The results show that the highly ranked intervention strategies in the integrated UWS are those supporting both the water supply and stormwater/wastewater subsystems (e.g., rainwater harvesting and greywater recycling schemes), whereas these strategies rank low in the WSS, where strategies targeting improvement of water supply components only (e.g., rehabilitation of clean water pipes and addition of new water resources) are preferred instead. Results also demonstrate that both conventional and sustainability-type performance indicators

  11. Quantitative assessment of RNA-protein interactions with high-throughput sequencing-RNA affinity profiling.

    PubMed

    Ozer, Abdullah; Tome, Jacob M; Friedman, Robin C; Gheba, Dan; Schroth, Gary P; Lis, John T

    2015-08-01

    Because RNA-protein interactions have a central role in a wide array of biological processes, methods that enable a quantitative assessment of these interactions in a high-throughput manner are in great demand. Recently, we developed the high-throughput sequencing-RNA affinity profiling (HiTS-RAP) assay that couples sequencing on an Illumina GAIIx genome analyzer with the quantitative assessment of protein-RNA interactions. This assay is able to analyze interactions between one or possibly several proteins with millions of different RNAs in a single experiment. We have successfully used HiTS-RAP to analyze interactions of the EGFP and negative elongation factor subunit E (NELF-E) proteins with their corresponding canonical and mutant RNA aptamers. Here we provide a detailed protocol for HiTS-RAP that can be completed in about a month (8 d hands-on time). This includes the preparation and testing of recombinant proteins and DNA templates, clustering DNA templates on a flowcell, HiTS and protein binding with a GAIIx instrument, and finally data analysis. We also highlight aspects of HiTS-RAP that can be further improved and points of comparison between HiTS-RAP and two other recently developed methods, quantitative analysis of RNA on a massively parallel array (RNA-MaP) and RNA Bind-n-Seq (RBNS), for quantitative analysis of RNA-protein interactions.

  12. Automated Tracking of Quantitative Assessments of Tumor Burden in Clinical Trials

    PubMed Central

    Rubin, Daniel L; Willrett, Debra; O'Connor, Martin J; Hage, Cleber; Kurtz, Camille; Moreira, Dilvan A

    2014-01-01

    There are two key challenges hindering effective use of quantitative assessment of imaging in cancer response assessment: 1) Radiologists usually describe the cancer lesions in imaging studies subjectively and sometimes ambiguously, and 2) it is difficult to repurpose imaging data, because lesion measurements are not recorded in a format that permits machine interpretation and interoperability. We have developed a freely available software platform on the basis of open standards, the electronic Physician Annotation Device (ePAD), to tackle these challenges in two ways. First, ePAD facilitates the radiologist in carrying out cancer lesion measurements as part of routine clinical trial image interpretation workflow. Second, ePAD records all image measurements and annotations in a data format that permits repurposing image data for analyses of alternative imaging biomarkers of treatment response. To determine the impact of ePAD on radiologist efficiency in quantitative assessment of imaging studies, a radiologist evaluated computed tomography (CT) imaging studies from 20 subjects having one baseline and three consecutive follow-up imaging studies with and without ePAD. The radiologist made measurements of target lesions in each imaging study using Response Evaluation Criteria in Solid Tumors 1.1 criteria, initially with the aid of ePAD, and then after a 30-day washout period, the exams were reread without ePAD. The mean total time required to review the images and summarize measurements of target lesions was 15% (P < .039) shorter using ePAD than without using this tool. In addition, it was possible to rapidly reanalyze the images to explore lesion cross-sectional area as an alternative imaging biomarker to linear measure. We conclude that ePAD appears promising to potentially improve reader efficiency for quantitative assessment of CT examinations, and it may enable discovery of future novel image-based biomarkers of cancer treatment response. PMID:24772204

  13. Quantitative motor assessment of muscular weakness in myasthenia gravis: a pilot study.

    PubMed

    Hoffmann, Sarah; Siedler, Jana; Brandt, Alexander U; Piper, Sophie K; Kohler, Siegfried; Sass, Christian; Paul, Friedemann; Reilmann, Ralf; Meisel, Andreas

    2015-12-23

    Muscular weakness in myasthenia gravis (MG) is commonly assessed using Quantitative Myasthenia Gravis Score (QMG). More objective and quantitative measures may complement the use of clinical scales and might detect subclinical affection of muscles. We hypothesized that muscular weakness in patients with MG can be quantified with the non-invasive Quantitative Motor (Q-Motor) test for Grip Force Assessment (QGFA) and Involuntary Movement Assessment (QIMA) and that pathological findings correlate with disease severity as measured by QMG. This was a cross-sectional pilot study investigating patients with confirmed diagnosis of MG. Data was compared to healthy controls (HC). Subjects were asked to lift a device (250 and 500 g) equipped with electromagnetic sensors that measured grip force (GF) and three-dimensional changes in position and orientation. These were used to calculate the position index (PI) and orientation index (OI) as measures for involuntary movements due to muscular weakness. Overall, 40 MG patients and 23 HC were included. PI and OI were significantly higher in MG patients for both weights in the dominant and non-dominant hand. Subgroup analysis revealed that patients with clinically ocular myasthenia gravis (OMG) also showed significantly higher values for PI and OI in both hands and for both weights. Disease severity correlates with QIMA performance in the non-dominant hand. Q-Motor tests and particularly QIMA may be useful objective tools for measuring motor impairment in MG and seem to detect subclinical generalized motor signs in patients with OMG. Q-Motor parameters might serve as sensitive endpoints for clinical trials in MG.

  14. Testing Natureserve's ecological integrity assessment model in Michigan and Indiana

    EPA Science Inventory

    NatureServe, in partnership with member programs from the Natural Heritage Network and federal agencies, has developed an assessment of ecosystem condition structured around the concept of ecological integrity. Our multi-metric approach for our Ecological Integrity Assessment m...

  15. A comparative study of qualitative and quantitative methods for the assessment of adhesive remnant after bracket debonding.

    PubMed

    Cehreli, S Burcak; Polat-Ozsoy, Omur; Sar, Cagla; Cubukcu, H Evren; Cehreli, Zafer C

    2012-04-01

    The amount of the residual adhesive after bracket debonding is frequently assessed in a qualitative manner, utilizing the adhesive remnant index (ARI). This study aimed to investigate whether quantitative assessment of the adhesive remnant yields more precise results compared to qualitative methods utilizing the 4- and 5-point ARI scales. Twenty debonded brackets were selected. Evaluation and scoring of the adhesive remnant on bracket bases were made consecutively using: 1. qualitative assessment (visual scoring) and 2. quantitative measurement (image analysis) on digital photographs. Image analysis was made on scanning electron micrographs (SEM) and high-precision elemental maps of the adhesive remnant as determined by energy dispersed X-ray spectrometry. Evaluations were made in accordance with the original 4-point and the modified 5-point ARI scales. Intra-class correlation coefficients (ICCs) were calculated, and the data were evaluated using Friedman test followed by Wilcoxon signed ranks test with Bonferroni correction. ICC statistics indicated high levels of agreement for qualitative visual scoring among examiners. The 4-point ARI scale was compliant with the SEM assessments but indicated significantly less adhesive remnant compared to the results of quantitative elemental mapping. When the 5-point scale was used, both quantitative techniques yielded similar results with those obtained qualitatively. These results indicate that qualitative visual scoring using the ARI is capable of generating similar results with those assessed by quantitative image analysis techniques. In particular, visual scoring with the 5-point ARI scale can yield similar results with both the SEM analysis and elemental mapping.

  16. A CONCEPT MAP FOR INTEGRATED ENVIRONMENTAL ASSESSMENT AND FUTURES MODELING

    EPA Science Inventory

    Integrated assessment models are differentiated from other models by their explicit concern for results that are useful to decision makers. While the details will differ greatly for each particular integrated assessments project, there are certain concepts that will be present f...

  17. Affordable, automatic quantitative fall risk assessment based on clinical balance scales and Kinect data.

    PubMed

    Colagiorgio, P; Romano, F; Sardi, F; Moraschini, M; Sozzi, A; Bejor, M; Ricevuti, G; Buizza, A; Ramat, S

    2014-01-01

    The problem of correct fall risk assessment is becoming more and more critical with the ageing of the population. In spite of available approaches allowing quantitative analysis of the performance of the human movement control system, clinical assessment and diagnosis of fall risk still rely mostly on non-quantitative exams, such as clinical scales. This work documents our current effort to develop a novel method to assess balance control abilities through a system implementing automatic evaluation of exercises drawn from balance assessment scales. Our aim is to overcome the classical limits of these scales, i.e., limited granularity and inter-/intra-examiner reliability, to obtain objective scores and more detailed information for predicting fall risk. We used Microsoft Kinect to record subjects' movements while they performed challenging exercises drawn from clinical balance scales. We then computed a set of parameters quantifying the execution of the exercises and fed them to a supervised classifier to perform a classification based on the clinical score. We obtained good accuracy (~82%) and especially high sensitivity (~83%).

  18. Integrated Smartphone-App-Chip System for On-Site Parts-Per-Billion-Level Colorimetric Quantitation of Aflatoxins.

    PubMed

    Li, Xiaochun; Yang, Fan; Wong, Jessica X H; Yu, Hua-Zhong

    2017-09-05

    We demonstrate herein an integrated, smartphone-app-chip (SPAC) system for on-site quantitation of food toxins, as demonstrated with aflatoxin B1 (AFB1), at parts-per-billion (ppb) level in food products. The detection is based on an indirect competitive immunoassay fabricated on a transparent plastic chip with the assistance of a microfluidic channel plate. A 3D-printed optical accessory attached to a smartphone is adapted to align the assay chip and to provide uniform illumination for imaging, with which high-quality images of the assay chip are captured by the smartphone camera and directly processed using a custom-developed Android app. The performance of this smartphone-based detection system was tested using both spiked and moldy corn samples; consistent results with conventional enzyme-linked immunosorbent assay (ELISA) kits were obtained. The achieved detection limit (3 ± 1 ppb, equivalent to μg/kg) and dynamic response range (0.5-250 ppb) meet the requested testing standards set by authorities in China and North America. We envision that the integrated SPAC system promises to be a simple and accurate method of food toxin quantitation, bringing much benefit for rapid on-site screening.
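    Detection limits like the 3 ppb figure quoted above are conventionally estimated from a calibration fit as LOD = 3·σ(blank) / slope. The least-squares fit below is a generic sketch; a competitive immunoassay response is actually nonlinear (and inverse), so a straight line here is only an illustrative simplification:

```python
def calibrate(concs, signals):
    """Least-squares slope and intercept for a linear calibration curve."""
    n = len(concs)
    mx = sum(concs) / n
    my = sum(signals) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(concs, signals)) / \
            sum((x - mx) ** 2 for x in concs)
    return slope, my - slope * mx


def lod_ppb(blank_sd, slope):
    """Limit of detection as 3 x (blank standard deviation) / slope."""
    return 3.0 * blank_sd / slope
```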

  19. Quantitative risk assessment of human campylobacteriosis associated with thermophilic Campylobacter species in chickens.

    PubMed

    Rosenquist, Hanne; Nielsen, Niels L; Sommer, Helle M; Nørrung, Birgit; Christensen, Bjarke B

    2003-05-25

    A quantitative risk assessment comprising the elements hazard identification, hazard characterization, exposure assessment, and risk characterization has been prepared to assess the effect of different mitigation strategies on the number of human cases in Denmark associated with thermophilic Campylobacter spp. in chickens. To estimate the human exposure to Campylobacter from a chicken meal and the number of human cases associated with this exposure, a mathematical risk model was developed. The model details the spread and transfer of Campylobacter in chickens from slaughter to consumption and the relationship between ingested dose and the probability of developing campylobacteriosis. Human exposure was estimated in two successive mathematical modules. Module 1 addresses changes in prevalence and numbers of Campylobacter on chicken carcasses throughout the processing steps of a slaughterhouse. Module 2 covers the transfer of Campylobacter during food handling in private kitchens. The age and sex of consumers were included in this module to introduce variable hygiene levels during food preparation and variable sizes and compositions of meals. Finally, the outcome of the exposure assessment modules was integrated with a Beta-Poisson dose-response model to provide a risk estimate. Simulations designed to predict the effect of different mitigation strategies showed that the incidence of campylobacteriosis associated with consumption of chicken meals could be reduced 30 times by introducing a 2 log reduction of the number of Campylobacter on the chicken carcasses. To obtain a similar reduction of the incidence, the flock prevalence should be reduced approximately 30 times or the kitchen hygiene improved approximately 30 times. Cross-contamination from positive to negative flocks during slaughter had almost no effect on the human Campylobacter incidence, which indicates that implementation of logistic slaughter will only have a minor influence on the risk. Finally, the
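    The dose-response step can be written compactly. In the sketch below, the default α and β are the Campylobacter jejuni values commonly cited in the QMRA literature, offered as an assumption rather than the parameters fitted in this study:

```python
def p_infection(dose, alpha=0.145, beta=7.59):
    """Approximate Beta-Poisson dose-response model:
    P(infection) = 1 - (1 + dose / beta) ** -alpha,
    with dose the mean number of ingested organisms."""
    return 1.0 - (1.0 + dose / beta) ** -alpha
```

    A mitigation such as the 2-log reduction discussed above enters simply as `p_infection(dose / 100)`.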

  20. Tool for Human-Systems Integration Assessment: HSI Scorecard

    NASA Technical Reports Server (NTRS)

    Whitmore, Nihriban; Sandor, Aniko; McGuire, Kerry M.; Berdich, Debbie

    2009-01-01

    This paper describes the development and rationale for a human-systems integration (HSI) scorecard that can be used in reviews of vehicle specification and design. This tool can be used to assess whether specific HSI-related criteria have been met as part of a project milestone or critical event, such as technical reviews, crew station reviews, mockup evaluations, or even reviews of major plans or processes. Examples of HSI-related criteria include Human Performance Capabilities, Health Management, Human System Interfaces, Anthropometry and Biomechanics, and Natural and Induced Environments. The tool is not intended to evaluate requirements compliance and verification, but to review how well the human-related systems have been considered for the specific event and to identify gaps and vulnerabilities from an HSI perspective. The scorecard offers a common basis and criteria for discussions among system managers, evaluators, and design engineers. Furthermore, the scorecard items highlight the main areas of system development that need to be followed during the system lifecycle. The ratings provide a repeatable quantitative measure of what has often been seen as only subjective commentary. Thus, the scorecard is anticipated to be a useful HSI tool for communicating review results to institutional and project office management.

  1. An integrative strategy for quantitative analysis of the N-glycoproteome in complex biological samples.

    PubMed

    Wang, Ji; Zhou, Chuang; Zhang, Wei; Yao, Jun; Lu, Haojie; Dong, Qiongzhu; Zhou, Haijun; Qin, Lunxiu

    2014-01-15

    The complexity of protein glycosylation makes it difficult to characterize glycosylation patterns on a proteomic scale. In this study, we developed an integrated strategy for comparative, quantitative analysis of N-glycosylation/glycoproteins from complex biological samples in a high-throughput manner. This strategy entailed separating and enriching glycopeptides/glycoproteins using lectin affinity chromatography, tandem labeling them with 18O/16O to generate a mass shift of 6 Da between paired glycopeptides, and finally analyzing them with liquid chromatography-mass spectrometry (LC-MS) and the automatic quantitative method we developed based on Mascot Distiller. The accuracy and repeatability of this strategy were first verified using standard glycoproteins; linearity was maintained within a range of 1:10-10:1. The peptide concentration ratios obtained by the self-built quantitative method were similar to both the manually calculated and theoretical values, with a standard deviation (SD) of 0.023-0.186 for glycopeptides. The feasibility of the strategy was further confirmed with serum from hepatocellular carcinoma (HCC) patients and healthy individuals; the expression of 44 glycopeptides and 30 glycoproteins differed significantly between HCC patient and control serum. This strategy is accurate, repeatable, and efficient, and may be a useful tool for identification of disease-related N-glycosylation/glycoprotein changes.
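    The paired-peak quantitation implied by the 6 Da shift can be sketched as follows. The peak-matching-by-tolerance scheme, the function name, and the tolerance value are illustrative assumptions, not the Mascot Distiller-based method itself:

```python
def labeled_ratio(peaks, mz_light, shift=6.0, tol=0.02):
    """Intensity ratio of the 18O-labeled peak (at mz_light + shift) to its
    16O partner, given peaks as (m/z, intensity) pairs."""
    def intensity_near(mz):
        for m, i in peaks:
            if abs(m - mz) <= tol:
                return i
        raise ValueError("no peak within tolerance of %.3f" % mz)
    return intensity_near(mz_light + shift) / intensity_near(mz_light)
```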

  2. Total Risk Integrated Methodology (TRIM) - TRIM.Risk

    EPA Pesticide Factsheets

    TRIM.Risk is used to integrate the exposure information received from TRIM.FaTE or TRIM.Expo with dose-response or hazard assessment information, and to provide quantitative descriptions of risk or hazard and some of the attendant uncertainties.

  3. Methods of quantitative risk assessment: The case of the propellant supply system

    NASA Astrophysics Data System (ADS)

    Merz, H. A.; Bienz, A.

    1984-08-01

    As a consequence of the disastrous accident in Lapua (Finland) in 1976, where an explosion in a cartridge loading facility killed 40 and injured more than 70 persons, efforts were undertaken to examine and improve the safety of such installations. An ammunition factory in Switzerland considered the replacement of the manual supply of propellant hoppers by a new pneumatic supply system. This would reduce the maximum quantity of propellant in the hoppers to a level where an accidental ignition would no longer lead to a detonation, which would drastically limit the effects on persons. A quantitative risk assessment of the present and the planned supply system demonstrated that, in this particular case, the pneumatic supply system would not reduce the risk enough to justify the related costs. In addition, it could be shown that the safety of the existing system can be improved more effectively by other safety measures at considerably lower costs. Based on this practical example, the advantages of a strictly quantitative risk assessment for safety planning in explosives factories are demonstrated. The methodological background of a risk assessment and the steps involved in the analysis are summarized. In addition, problems of quantification are discussed.

  4. Use of Quantitative Microbial Risk Assessment to Improve Interpretation of a Recreational Water Epidemiological Study

    EPA Science Inventory

    We conducted a supplemental water quality monitoring study and quantitative microbial risk assessment (QMRA) to complement the United States Environmental Protection Agency’s (U.S. EPA) National Epidemiological and Environmental Assessment of Recreational Water study at Boquerón ...

  5. [A quantitative risk assessment model of salmonella on carcass in poultry slaughterhouse].

    PubMed

    Zhang, Yu; Chen, Yuzhen; Hu, Chunguang; Zhang, Huaning; Bi, Zhenwang; Bi, Zhenqiang

    2015-05-01

    The aim of this study was to construct a quantitative risk assessment model of Salmonella on carcasses in a poultry slaughterhouse and to identify effective interventions to reduce Salmonella contamination. We constructed a modular process risk model (MPRM) from evisceration to chilling in an Excel sheet, using poultry process parameter data and the Salmonella concentration surveillance data of Jinan in 2012. The MPRM was simulated with @Risk software. The concentration of Salmonella on carcasses after chilling, as calculated by the model, was 1.96 MPN/g. The sensitivity analysis indicated that the correlation coefficients for the concentration of Salmonella after defeathering and in the chilling pool were 0.84 and 0.34, respectively, making these the primary factors determining the concentration of Salmonella on carcasses after chilling. The study provides a quantitative assessment model structure for Salmonella on carcasses in poultry slaughterhouses. Risk managers could control Salmonella contamination on carcasses after chilling by reducing the concentration of Salmonella after defeathering and in the chilling pool.
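    The general shape of a modular process risk model is easy to sketch: a concentration distribution is propagated through process modules, each applying a random change, and the model is simulated by Monte Carlo. The toy below shows only that structure; every distribution and parameter is invented for illustration and has nothing to do with the study's fitted inputs or its @Risk implementation.

    ```python
    import random

    # Toy Monte Carlo MPRM: propagate log10 Salmonella concentration (MPN/g)
    # through two process modules, each applying a random log-reduction.
    # All distributions and parameters are illustrative, not from the study.

    random.seed(1)

    def simulate(n=10000):
        """Mean log10 concentration on carcasses after chilling."""
        total = 0.0
        for _ in range(n):
            c = random.gauss(2.0, 0.5)       # log10 conc. after defeathering (assumed)
            c -= random.uniform(0.2, 0.8)    # evisceration log-reduction (assumed)
            c -= random.uniform(0.5, 1.5)    # chilling log-reduction (assumed)
            total += c
        return total / n

    print(round(simulate(), 2))  # mean log10 MPN/g after chilling, ~0.5 by construction
    ```

    Sensitivity analysis, as in the study, would then correlate each module's sampled input against the output concentration across iterations.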

  6. COMPREHENSIVE ASSESSMENT OF COMPLEX TECHNOLOGIES: INTEGRATING VARIOUS ASPECTS IN HEALTH TECHNOLOGY ASSESSMENT.

    PubMed

    Lysdahl, Kristin Bakke; Mozygemba, Kati; Burns, Jacob; Brönneke, Jan Benedikt; Chilcott, James B; Ward, Sue; Hofmann, Bjørn

    2017-01-01

    Despite recent development of health technology assessment (HTA) methods, there are still methodological gaps for the assessment of complex health technologies. The INTEGRATE-HTA guidance for effectiveness, economic, ethical, socio-cultural, and legal aspects deals with challenges when assessing complex technologies, such as heterogeneous study designs, multiple stakeholder perspectives, and unpredictable outcomes. The objective of this article is to outline this guidance and describe the added value of integrating these assessment aspects. Different methods were used to develop the various parts of the guidance, but all draw on existing, published knowledge and were supported by stakeholder involvement. The guidance was modified after application in a case study and in response to feedback from internal and external reviewers. The guidance consists of five parts, addressing five core aspects of HTA, all presenting stepwise approaches based on the assessment of complexity, context, and stakeholder involvement. The guidance on effectiveness, health economics, and ethics aspects focuses on helping users choose appropriate, or further develop, existing methods. The recommendations are based on existing methods' applicability for dealing with problems arising with complex interventions. The guidance offers new frameworks to identify socio-cultural and legal issues, along with overviews of relevant methods and sources. The INTEGRATE-HTA guidance outlines a wide range of methods and facilitates appropriate choices among them. The guidance enables understanding of how complexity matters for HTA and brings together assessments from disciplines such as epidemiology, economics, ethics, law, and social theory. This indicates relevance for a broad range of technologies.

  7. Towards an Integrated Academic Assessment: Closing Employers' Expectations?

    ERIC Educational Resources Information Center

    Lim, Ngat-Chin

    2015-01-01

    Purpose: The purpose of this paper is to showcase that the integration of academic assessment with workplace performance appraisal practices can help to address the gap between graduate employability skills and employers' requirements. Employability refers to learning of transferable skills. Design/Methodology/Approach: The integrated assessment…

  8. Assessment of Scientific Literacy: Development and Validation of the Quantitative Assessment of Socio-Scientific Reasoning (QuASSR)

    ERIC Educational Resources Information Center

    Romine, William L.; Sadler, Troy D.; Kinslow, Andrew T.

    2017-01-01

    We describe the development and validation of the Quantitative Assessment of Socio-scientific Reasoning (QuASSR) in a college context. The QuASSR contains 10 polytomous, two-tiered items crossed between two scenarios, and is based on theory suggesting a four-pronged structure for SSR (complexity, perspective taking, inquiry, and skepticism). In…

  9. Integration of Mobile AR Technology in Performance Assessment

    ERIC Educational Resources Information Center

    Kuo-Hung, Chao; Kuo-En, Chang; Chung-Hsien, Lan; Kinshuk; Yao-Ting, Sung

    2016-01-01

    This study was aimed at exploring how to use augmented reality (AR) technology to enhance the effect of performance assessment (PA). A mobile AR performance assessment system (MARPAS) was developed by integrating AR technology to reduce the limitations in observation and assessment during PA. This system includes three modules: Authentication, AR…

  10. A Problem-Solving Template for Integrating Qualitative and Quantitative Physics Instruction

    ERIC Educational Resources Information Center

    Fink, Janice M.; Mankey, Gary J.

    2010-01-01

    A problem-solving template enables a methodology of instruction that integrates aspects of both sequencing and conceptual learning. It is designed to enhance critical-thinking skills when used within the framework of a learner-centered approach to teaching, where regular, thorough assessments of student learning are key components of the…

  11. Quantitative risk assessment for skin sensitization: Success or failure?

    PubMed

    Kimber, Ian; Gerberick, G Frank; Basketter, David A

    2017-02-01

    Skin sensitization is unique in the world of toxicology. There is a combination of reliable, validated predictive test methods for identification of skin sensitizing chemicals, a clearly documented and transparent approach to risk assessment, and effective feedback from dermatology clinics around the world delivering evidence of the success or failure of the hazard identification/risk assessment/management process. Recent epidemics of contact allergy, particularly to preservatives, have raised questions of whether the safety/risk assessment process is working in an optimal manner (or indeed is working at all!). This review has as its focus skin sensitization quantitative risk assessment (QRA). The core toxicological principles of QRA are reviewed, and evidence of use and misuse examined. What becomes clear is that skin sensitization QRA will only function adequately if two essential criteria are met. The first is that QRA is applied rigorously, and the second is that potential exposure to the sensitizing substance is assessed adequately. This conclusion will come as no surprise to any toxicologist who appreciates the basic premise that "risk = hazard x exposure". Accordingly, use of skin sensitization QRA is encouraged, not least because the essential feedback from dermatology clinics can be used as a tool to refine QRA in situations where this risk assessment tool has not been properly used.
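    In the standard framing of skin sensitization QRA, an acceptable exposure level (AEL) is derived by dividing a no-expected-sensitization-induction level (NESIL) by a product of sensitization assessment factors (SAFs), and risk is characterized by comparing consumer exposure to the AEL. The sketch below uses that conventional calculation with entirely hypothetical numbers; the NESIL, factor values, and exposure are illustrative, not drawn from this review.

    ```python
    # Conventional skin-sensitization QRA calculation (illustrative values):
    # AEL = NESIL / product of sensitization assessment factors (SAFs);
    # the exposure is acceptable when it does not exceed the AEL.

    def acceptable_exposure_level(nesil_ug_cm2, safs):
        product = 1.0
        for f in safs:
            product *= f
        return nesil_ug_cm2 / product

    nesil = 1500.0        # hypothetical NESIL, ug/cm2
    safs = [10, 3, 10]    # inter-individual, matrix, and use-pattern factors (assumed)
    ael = acceptable_exposure_level(nesil, safs)
    exposure = 2.0        # hypothetical consumer exposure, ug/cm2

    print(ael)            # → 5.0
    print(exposure <= ael)  # → True
    ```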

  12. Healthy Watersheds Integrated Assessments Workshop Proceedings

    EPA Science Inventory

    The Healthy Watershed Integrated Assessment Workshop was held in Estes Park, Colorado in November 2010. Attendees were selected to represent interests and expertise of EPA’s Office of Water, EPA’s Office of Research and Development, EPA Regions, States, other Federal, State, and...

  13. Fostering Curriculum Integration through Performance Assessment.

    ERIC Educational Resources Information Center

    Aseltine, James M.

    1994-01-01

    Several barriers may prevent teachers from using an integrated curriculum, including insufficient preparation and an individualistic or accountability-driven school culture. A Farmington, Connecticut, middle school encourages its teachers to develop an interdisciplinary curriculum aligned with performance assessment. Teachers receive training in a…

  14. An overview of data integration methods for regional assessment.

    PubMed

    Locantore, Nicholas W; Tran, Liem T; O'Neill, Robert V; McKinnis, Peter W; Smith, Elizabeth R; O'Connell, Michael

    2004-06-01

    The U.S. Environmental Protection Agency's (U.S. EPA) Regional Vulnerability Assessment (ReVA) program has focused much of its research over the last five years on developing and evaluating integration methods for spatial data. An initial strategic priority was to use existing data from monitoring programs, model results, and other spatial data. Because most of these data were not collected with an intention of integrating into a regional assessment of conditions and vulnerabilities, issues exist that may preclude the use of some methods or require some sort of data preparation. Additionally, to support multi-criteria decision-making, methods need to be able to address a series of assessment questions that provide insights into where environmental risks are a priority. This paper provides an overview of twelve spatial integration methods that can be applied towards regional assessment, along with preliminary results as to how sensitive each method is to data issues that will likely be encountered with the use of existing data.
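    One simple member of this family of integration methods is normalize-and-combine: rescale each indicator across the region's reporting units, then take a weighted sum as a composite score. The sketch below shows that pattern with invented indicators and weights; it is a generic illustration, not one of the twelve methods the paper evaluates.

    ```python
    # Generic normalize-and-combine spatial integration (illustrative data):
    # min-max rescale each indicator across reporting units, then form a
    # weighted composite score per unit.

    def minmax(values):
        lo, hi = min(values), max(values)
        return [(v - lo) / (hi - lo) for v in values]

    def composite(indicators, weights):
        scaled = {k: minmax(v) for k, v in indicators.items()}
        n = len(next(iter(indicators.values())))
        return [sum(weights[k] * scaled[k][i] for k in indicators) for i in range(n)]

    indicators = {                      # hypothetical per-unit indicators
        "forest_loss": [0.1, 0.5, 0.9],
        "road_density": [2.0, 4.0, 3.0],
    }
    weights = {"forest_loss": 0.6, "road_density": 0.4}
    print([round(x, 3) for x in composite(indicators, weights)])  # → [0.0, 0.7, 0.8]
    ```

    The data-quality issues the paper discusses (missing values, incompatible extents) bite exactly in the `minmax` step, where a single outlier unit rescales every other score.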

  15. Integrated Science Assessment (ISA) for Carbon Monoxide ...

    EPA Pesticide Factsheets

    EPA announced the availability of the final report, Integrated Science Assessment (ISA) for Carbon Monoxide (CO). This report is EPA’s latest evaluation of the scientific literature on the potential human health and welfare effects associated with ambient exposures to CO. The development of this document is part of the Agency's periodic review of the national ambient air quality standards (NAAQS) for CO. The recently completed CO ISA and supplementary annexes, in conjunction with additional technical and policy assessments developed by EPA’s Office of Air and Radiation, will provide the scientific basis to inform EPA decisions related to the review of the current CO NAAQS. The Integrated Plan for Review of the National Ambient Air Quality Standards for Carbon Monoxide (U.S. EPA, 2008, 193995) identifies key policy-relevant questions that provide a framework for this assessment of the scientific evidence. These questions frame the entire review of the NAAQS for CO and thus are informed by both science and policy considerations. The ISA organizes, presents, and integrates the scientific evidence which is considered along with findings from risk analyses and policy considerations to help the U.S. Environmental Protection Agency (EPA) address these questions during the NAAQS review.

  16. An Integrative Platform for Three-dimensional Quantitative Analysis of Spatially Heterogeneous Metastasis Landscapes

    NASA Astrophysics Data System (ADS)

    Guldner, Ian H.; Yang, Lin; Cowdrick, Kyle R.; Wang, Qingfei; Alvarez Barrios, Wendy V.; Zellmer, Victoria R.; Zhang, Yizhe; Host, Misha; Liu, Fang; Chen, Danny Z.; Zhang, Siyuan

    2016-04-01

    Metastatic microenvironments are spatially and compositionally heterogeneous. This seemingly stochastic heterogeneity poses great challenges for researchers seeking to elucidate the factors that determine metastatic outgrowth. Herein, we develop and implement an integrative platform that will enable researchers to obtain novel insights from intricate metastatic landscapes. Our two-segment platform begins with whole tissue clearing, staining, and imaging to globally delineate metastatic landscape heterogeneity with spatial and molecular resolution. The second segment of our platform applies our custom-developed SMART 3D (Spatial filtering-based background removal and Multi-chAnnel forest classifiers-based 3D ReconsTruction), a multi-faceted image analysis pipeline, permitting quantitative interrogation of functional implications of heterogeneous metastatic landscape constituents, from subcellular features to multicellular structures, within our large three-dimensional (3D) image datasets. Coupling whole tissue imaging of brain metastasis animal models with SMART 3D, we demonstrate the capability of our integrative pipeline to reveal and quantify volumetric and spatial aspects of brain metastasis landscapes, including diverse tumor morphology, heterogeneous proliferative indices, metastasis-associated astrogliosis, and vasculature spatial distribution. Collectively, our study demonstrates the utility of our novel integrative platform to reveal and quantify the global spatial and volumetric characteristics of the 3D metastatic landscape with unparalleled accuracy, opening new opportunities for unbiased investigation of novel biological phenomena in situ.

  17. Rapid, automated, parallel quantitative immunoassays using highly integrated microfluidics and AlphaLISA

    PubMed Central

    Tak For Yu, Zeta; Guan, Huijiao; Ki Cheung, Mei; McHugh, Walker M.; Cornell, Timothy T.; Shanley, Thomas P.; Kurabayashi, Katsuo; Fu, Jianping

    2015-01-01

    Immunoassays represent one of the most popular analytical methods for detection and quantification of biomolecules. However, conventional immunoassays such as ELISA and flow cytometry, even though providing high sensitivity and specificity and multiplexing capability, can be labor-intensive and prone to human error, making them unsuitable for standardized clinical diagnoses. Using a commercialized no-wash, homogeneous immunoassay technology (‘AlphaLISA’) in conjunction with integrated microfluidics, herein we developed a microfluidic immunoassay chip capable of rapid, automated, parallel immunoassays of microliter quantities of samples. Operation of the microfluidic immunoassay chip entailed rapid mixing and conjugation of AlphaLISA components with target analytes before quantitative imaging for analyte detections in up to eight samples simultaneously. Aspects such as fluid handling and operation, surface passivation, imaging uniformity, and detection sensitivity of the microfluidic immunoassay chip using AlphaLISA were investigated. The microfluidic immunoassay chip could detect one target analyte simultaneously for up to eight samples in 45 min with a limit of detection down to 10 pg mL−1. The microfluidic immunoassay chip was further utilized for functional immunophenotyping to examine cytokine secretion from human immune cells stimulated ex vivo. Together, the microfluidic immunoassay chip provides a promising high-throughput, high-content platform for rapid, automated, parallel quantitative immunosensing applications. PMID:26074253
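    A limit of detection such as the 10 pg/mL figure above is conventionally estimated from blank replicates and a calibration curve: the signal cutoff is the blank mean plus three standard deviations, mapped back to concentration through the fit. The sketch below shows that standard calculation with made-up readings and a made-up linear calibration; it is not this chip's actual calibration data.

    ```python
    import statistics

    # Conventional LOD estimate (illustrative data, not from the study):
    # LOD concentration = (mean_blank + 3*SD_blank - intercept) / slope,
    # assuming a linear signal-vs-concentration calibration.

    def limit_of_detection(blank_signals, slope, intercept):
        cutoff = statistics.mean(blank_signals) + 3 * statistics.stdev(blank_signals)
        return (cutoff - intercept) / slope

    blanks = [100.0, 104.0, 96.0, 100.0]   # hypothetical blank readings
    slope, intercept = 2.5, 95.0           # hypothetical calibration fit
    print(round(limit_of_detection(blanks, slope, intercept), 3))
    ```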

  18. Quantitative Approach to Collaborative Learning: Performance Prediction, Individual Assessment, and Group Composition

    ERIC Educational Resources Information Center

    Cen, Ling; Ruta, Dymitr; Powell, Leigh; Hirsch, Benjamin; Ng, Jason

    2016-01-01

    The benefits of collaborative learning, although widely reported, lack the quantitative rigor and detailed insight into the dynamics of interactions within the group, while individual contributions and their impacts on group members and their collaborative work remain hidden behind joint group assessment. To bridge this gap we intend to address…

  19. Laser gingival retraction: a quantitative assessment.

    PubMed

    Krishna Ch, Vamsi; Gupta, Nidhi; Reddy, K Mahendranadh; Sekhar, N Chandra; Aditya, Venkata; Reddy, G V K Mohan

    2013-08-01

    Proper gingival retraction improves the prognosis of crowns and bridges with subgingival finish lines. Use of lasers assists the operator in achieving proper retraction with good clinical results. The present study was intended to quantitatively assess the amount of lateral gingival retraction achieved using diode lasers. The study was carried out on 20 patients attending a dental institution who had undergone root canal treatment and were indicated for fabrication of crowns. Gingival retraction was carried out on 20 teeth and elastomeric impressions were obtained. Models retrieved from the impressions were sectioned, and the lateral distance between the finish line and the marginal gingiva was measured using a toolmaker's microscope. Retraction was measured in the mid buccal, mesio buccal, and disto buccal regions. The values obtained were used to calculate the mean lateral retraction in microns. Mean retraction values of 399.5 μm, 445.5 μm, and 422.5 μm were obtained in the mid buccal, mesio buccal, and disto buccal regions, respectively. The gingival retraction achieved was close to the thickness of the sulcular epithelium and greater than the minimum required retraction of 200 μm.
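    The overall mean lateral retraction follows directly from the three regional means reported in the abstract:

    ```python
    # Overall mean lateral retraction from the three reported regional means
    # (values in micrometers), compared against the 200 um minimum.

    regional = {"mid buccal": 399.5, "mesio buccal": 445.5, "disto buccal": 422.5}
    mean_retraction = sum(regional.values()) / len(regional)
    print(mean_retraction)              # → 422.5
    print(mean_retraction > 200.0)      # → True
    ```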

  20. Development of CD3 cell quantitation algorithms for renal allograft biopsy rejection assessment utilizing open source image analysis software.

    PubMed

    Moon, Andres; Smith, Geoffrey H; Kong, Jun; Rogers, Thomas E; Ellis, Carla L; Farris, Alton B Brad

    2018-02-01

    Renal allograft rejection diagnosis depends on assessment of parameters such as interstitial inflammation; however, studies have shown interobserver variability regarding interstitial inflammation assessment. Since automated image analysis quantitation can be reproducible, we devised customized analysis methods for CD3+ T-cell staining density as a measure of rejection severity and compared them with established commercial methods along with visual assessment. Renal biopsy CD3 immunohistochemistry slides (n = 45), including renal allografts with various degrees of acute cellular rejection (ACR), were scanned for whole slide images (WSIs). Inflammation was quantitated in the WSIs using pathologist visual assessment, commercial algorithms (Aperio nuclear algorithm for CD3+ cells/mm2 and Aperio positive pixel count algorithm), and customized open source algorithms developed in ImageJ with thresholding/positive pixel counting (custom CD3+%) and identification of pixels fulfilling "maxima" criteria for CD3 expression (custom CD3+ cells/mm2). Based on visual inspections of "markup" images, CD3 quantitation algorithms produced adequate accuracy. Additionally, CD3 quantitation algorithms correlated between each other and also with visual assessment in a statistically significant manner (r = 0.44 to 0.94, p = 0.003 to < 0.0001). Methods for assessing inflammation suggested a progression through the tubulointerstitial ACR grades, with statistically different results in borderline versus other ACR types, in all but the custom methods. Assessment of CD3-stained slides using various open source image analysis algorithms presents salient correlations with established methods of CD3 quantitation. These analysis techniques are promising and highly customizable, providing a form of on-slide "flow cytometry" that can facilitate additional diagnostic accuracy in tissue-based assessments.
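    The idea behind the custom CD3+% measure (thresholding plus positive pixel counting) can be sketched in a few lines. The example below uses a synthetic intensity image and an assumed threshold; it illustrates the technique generically and is not the authors' ImageJ implementation.

    ```python
    import numpy as np

    # Thresholding/positive-pixel-count sketch, analogous in spirit to a
    # custom CD3+% measure: the fraction of pixels whose staining intensity
    # exceeds a threshold. Image and threshold here are synthetic/assumed.

    def positive_pixel_percent(image, threshold):
        """Percent of pixels whose intensity exceeds the threshold."""
        return 100.0 * np.count_nonzero(image > threshold) / image.size

    rng = np.random.default_rng(0)
    image = rng.random((512, 512))   # synthetic normalized stain intensities
    print(round(positive_pixel_percent(image, 0.8), 1))  # ≈ 20% by construction
    ```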

  1. An Integrated Environmental Assessment of the US Mid-Atlantic Region

    Treesearch

    James D. Wickham; K.B. Jones; Kurt H. Riitters; R.V. O'Neill; R.D. Tankersley; E.R. Smith; A.C. Neale; D.J. Chaloud

    1999-01-01

    Many of today's environmental problems are regional in scope, and their effects overlap and interact. We developed a simple method to provide an integrated assessment of environmental conditions and estimate cumulative impacts across a large region, by combining data on land-cover, population, roads, streams, air pollution, and topography. The integrated assessment...

  2. Microfluidic platform integrated with worm-counting setup for assessing manganese toxicity

    PubMed Central

    Zhang, Beibei; Li, Yinbao; He, Qidi; Qin, Jun; Yu, Yanyan; Li, Xinchun; Zhang, Lin; Yao, Meicun; Liu, Junshan; Chen, Zuanguang

    2014-01-01

    We report a new microfluidic system integrated with worm responders for evaluating environmental manganese toxicity. The microdevice consists of worm loading units, worm observing chambers, and a radial concentration gradient generator (CGG). Eight T-shape worm loading units of the microdevice were used to load the exact number of worms into the corresponding eight chambers with the assistance of worm responders and doorsills. The worm responder, as a key component, was employed for performing automated worm-counting assay through electric impedance sensing. This label-free and non-invasive worm-counting technique was applied to the microsystem for the first time. In addition, the disk-shaped CGG can generate a range of stepwise concentrations of the appointed chemical automatically and simultaneously. Due to the scalable architecture of the radial CGG, it has the potential to increase the throughput of the assay. Dopaminergic (DAergic) neurotoxicity of manganese on C. elegans was quantitatively assessed via the observation of green fluorescence protein-tagged DAergic neurons of the strain BZ555 on-chip. In addition, oxidative stress triggered by manganese was evaluated by the quantitative fluorescence intensity of the strain CL2166. By scoring the survival ratio and stroke frequency of worms, we characterized the dose- and time-dependent mobility defects of the manganese-exposed worms. Furthermore, we applied the microsystem to investigate the effect of natural antioxidants to protect manganese-induced toxicity. PMID:25538805

  3. INTEGRATED RISK ASSESSMENT - RESULTS FROM AN INTERNATIONAL WORKSHOP

    EPA Science Inventory

    The WHO International Programme on Chemical Safety and international partners have developed a framework for integrated assessment of human health and ecological risks and four case studies. An international workshop was convened to consider how ecological and health risk assess...

  4. Quantitative assessment model for gastric cancer screening

    PubMed Central

    Chen, Kun; Yu, Wei-Ping; Song, Liang; Zhu, Yi-Min

    2005-01-01

    AIM: To set up a mathematical model for gastric cancer screening and to evaluate its function in mass screening for gastric cancer. METHODS: A case-control study was carried out on 66 patients and 198 normal people, and the risk and protective factors of gastric cancer were determined, including heavy manual work; foods such as small yellow-fin tuna, dried small shrimps, squills, and crabs; mothers suffering from gastric diseases; spouse alive; and use of refrigerators and hot food. According to principles and methods of probability and fuzzy mathematics, a quantitative assessment model was established as follows: first, statistically significant factors were selected and a weight coefficient was calculated for each by two different methods; second, the population space was divided into a gastric cancer fuzzy subset and a non-gastric-cancer fuzzy subset, a mathematical model for each subset was established, and a mathematical expression for attribute degree (AD) was obtained. RESULTS: Based on the data of 63 patients and 693 normal people, the AD of each subject was calculated. Considering sensitivity and specificity, the thresholds of the calculated AD values were set at 0.20 and 0.17, respectively. According to these thresholds, the sensitivity and specificity of the quantitative model were about 69% and 63%, respectively. Moreover, statistical testing showed that the identification outcomes of the two different calculation methods were identical (P > 0.05). CONCLUSION: The validity of this method is satisfactory. It is convenient, feasible, and economic, and can be used to determine individual and population risks of gastric cancer. PMID:15655813
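    The screening decision reduces to a weighted score compared against a threshold. The sketch below illustrates an attribute-degree-style score as a normalized weighted sum of factor indicators; the weights and factor coding are entirely hypothetical, and only the idea of AD cutoffs around 0.20/0.17 comes from the abstract (the paper's fuzzy-subset formulation is more involved).

    ```python
    # Illustrative attribute-degree (AD) style score: normalized weighted sum
    # of binary risk-factor indicators, compared to a screening threshold.
    # Weights and factor coding are hypothetical, not the paper's fitted model.

    def attribute_degree(factors, weights):
        total = sum(weights.values())
        return sum(weights[f] for f, present in factors.items() if present) / total

    weights = {"heavy_manual_work": 2.0, "dried_small_shrimps": 3.0,
               "mother_gastric_disease": 2.0, "hot_food": 1.0}
    subject = {"heavy_manual_work": True, "dried_small_shrimps": False,
               "mother_gastric_disease": False, "hot_food": False}

    ad = attribute_degree(subject, weights)
    print(round(ad, 2), ad >= 0.20)  # → 0.25 True (flagged at the 0.20 cutoff)
    ```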

  5. Quantitative Percussion Diagnostics For Evaluating Bond Integrity Between Composite Laminates

    NASA Astrophysics Data System (ADS)

    Poveromo, Scott Leonard

    Conventional nondestructive testing (NDT) techniques used to detect defects in composites are not able to determine intact bond integrity within a composite structure and are costly to use on large and complex shaped surfaces. To overcome current NDT limitations, a new technology was utilized based on quantitative percussion diagnostics (QPD) to better quantify bond quality in fiber reinforced composite materials. Experimental results indicate that this technology is capable of detecting 'kiss' bonds (very low adhesive shear strength), caused by the application of release agents on the bonding surfaces, between flat composite laminates bonded together with epoxy adhesive. Specifically, the local value of the loss coefficient determined from quantitative percussion testing was found to be significantly greater for a release coated panel compared to that for a well bonded sample. Also, the local value of the probe force or force returned to the probe after impact was observed to be lower for the release coated panels. The increase in loss coefficient and decrease in probe force are thought to be due to greater internal friction during the percussion event for poorly bonded specimens. NDT standards were also fabricated by varying the cure parameters of an epoxy film adhesive. Results from QPD for the variable cure NDT standards and lap shear strength measurements taken of mechanical test specimens were compared and analyzed. Finally, experimental results have been compared to a finite element analysis to understand the visco-elastic behavior of the laminates during percussion testing. This comparison shows how a lower quality bond leads to a reduction in the percussion force by biasing strain in the percussion tested side of the panel.

  6. 241-AY Double Shell Tanks (DST) Integrity Assessment Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    JENSEN, C.E.

    1999-09-21

    This report presents the results of the integrity assessment of the 241-AY double-shell tank farm facility located in the 200 East Area of the Hanford Site. The assessment included the design evaluation and integrity examinations of the tanks and concluded that the facility is adequately designed, is compatible with the waste, and is fit for use. Recommendations, including subsequent examinations, are made to ensure the continued safe operation of the tanks.

  7. An integrative strategy for quantitative analysis of the N-glycoproteome in complex biological samples

    PubMed Central

    2014-01-01

    Background The complexity of protein glycosylation makes it difficult to characterize glycosylation patterns on a proteomic scale. In this study, we developed an integrated strategy for comparatively analyzing N-glycosylation/glycoproteins quantitatively from complex biological samples in a high-throughput manner. This strategy entailed separating and enriching glycopeptides/glycoproteins using lectin affinity chromatography, and then tandem labeling them with 18O/16O to generate a mass shift of 6 Da between the paired glycopeptides, and finally analyzing them with liquid chromatography-mass spectrometry (LC-MS) and the automatic quantitative method we developed based on Mascot Distiller. Results The accuracy and repeatability of this strategy were first verified using standard glycoproteins; linearity was maintained within a range of 1:10–10:1. The peptide concentration ratios obtained by the self-built quantitative method were similar to both the manually calculated and theoretical values, with a standard deviation (SD) of 0.023–0.186 for glycopeptides. The feasibility of the strategy was further confirmed with serum from hepatocellular carcinoma (HCC) patients and healthy individuals; the expression of 44 glycopeptides and 30 glycoproteins were significantly different between HCC patient and control serum. Conclusions This strategy is accurate, repeatable, and efficient, and may be a useful tool for identification of disease-related N-glycosylation/glycoprotein changes. PMID:24428921

  8. Standardizing evaluation of pQCT image quality in the presence of subject movement: qualitative versus quantitative assessment.

    PubMed

    Blew, Robert M; Lee, Vinson R; Farr, Joshua N; Schiferl, Daniel J; Going, Scott B

    2014-02-01

    Peripheral quantitative computed tomography (pQCT) is an essential tool for assessing bone parameters of the limbs, but subject movement and its impact on image quality remains a challenge to manage. The current approach to determine image viability is by visual inspection, but pQCT lacks a quantitative evaluation. Therefore, the aims of this study were to (1) examine the reliability of a qualitative visual inspection scale and (2) establish a quantitative motion assessment methodology. Scans were performed on 506 healthy girls (9-13 years) at diaphyseal regions of the femur and tibia. Scans were rated for movement independently by three technicians using a linear, nominal scale. Quantitatively, a ratio of movement to limb size (%Move) provided a measure of movement artifact. A repeat-scan subsample (n = 46) was examined to determine %Move's impact on bone parameters. Agreement between measurers was strong (intraclass correlation coefficient = 0.732 for tibia, 0.812 for femur), but greater variability was observed in scans rated 3 or 4, the delineation between repeat and no repeat. The quantitative approach found ≥95% of subjects had %Move <25%. Comparison of initial and repeat scans by groups above and below 25% initial movement showed significant differences in the >25% grouping. A pQCT visual inspection scale can be a reliable metric of image quality, but technicians may periodically mischaracterize subject motion. The presented quantitative methodology yields more consistent movement assessment and could unify procedure across laboratories. Data suggest a delineation of 25% movement for determining whether a diaphyseal scan is viable or requires repeat.
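    The %Move metric and the 25% viability cutoff reduce to a simple ratio check. The sketch below shows that logic with hypothetical measurements; how movement extent and limb size are actually measured on a pQCT slice is the paper's methodology, not reproduced here.

    ```python
    # Sketch of the %Move check: ratio of movement-artifact extent to limb
    # size, with a scan flagged for repeat above 25%. Inputs are hypothetical.

    def percent_move(movement_mm, limb_width_mm):
        return 100.0 * movement_mm / limb_width_mm

    def needs_repeat(movement_mm, limb_width_mm, cutoff=25.0):
        return percent_move(movement_mm, limb_width_mm) > cutoff

    print(round(percent_move(6.0, 30.0), 1), needs_repeat(6.0, 30.0))  # → 20.0 False
    print(round(percent_move(9.0, 30.0), 1), needs_repeat(9.0, 30.0))  # → 30.0 True
    ```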

  9. Standardizing Evaluation of pQCT Image Quality in the Presence of Subject Movement: Qualitative vs. Quantitative Assessment

    PubMed Central

    Blew, Robert M.; Lee, Vinson R.; Farr, Joshua N.; Schiferl, Daniel J.; Going, Scott B.

    2013-01-01

    Purpose Peripheral quantitative computed tomography (pQCT) is an essential tool for assessing bone parameters of the limbs, but subject movement and its impact on image quality remain a challenge to manage. The current approach to determining image viability is visual inspection, but pQCT lacks a quantitative evaluation. Therefore, the aims of this study were to (1) examine the reliability of a qualitative visual inspection scale, and (2) establish a quantitative motion assessment methodology. Methods Scans were performed on 506 healthy girls (9–13 yr) at diaphyseal regions of the femur and tibia. Scans were rated for movement independently by three technicians using a linear, nominal scale. Quantitatively, a ratio of movement to limb size (%Move) provided a measure of movement artifact. A repeat-scan subsample (n = 46) was examined to determine %Move's impact on bone parameters. Results Agreement between measurers was strong (ICC = 0.732 for tibia, 0.812 for femur), but greater variability was observed in scans rated 3 or 4, the delineation between repeat and no repeat. The quantitative approach found ≥95% of subjects had %Move <25%. Comparison of initial and repeat scans by groups above and below 25% initial movement showed significant differences in the >25% grouping. Conclusions A pQCT visual inspection scale can be a reliable metric of image quality, but technicians may periodically mischaracterize subject motion. The presented quantitative methodology yields more consistent movement assessment and could unify procedures across laboratories. Data suggest a delineation of 25% movement for determining whether a diaphyseal scan is viable or requires repeating. PMID:24077875

  10. Quantitative Assessment of Arrhythmia Using Non-linear Approach: A Non-invasive Prognostic Tool

    NASA Astrophysics Data System (ADS)

    Chakraborty, Monisha; Ghosh, Dipak

    2017-12-01

    An accurate prognostic tool to identify the severity of arrhythmia has yet to be established, owing to the complexity of the ECG signal. In this paper, we show that quantitative assessment of arrhythmia is possible using a non-linear technique based on Hurst rescaled range analysis. Although the concept of applying non-linearity to the study of various cardiac dysfunctions is not entirely new, the novel objective of this paper is to identify the severity of the disease, to monitor different medicines and their doses, and to assess the efficiency of different medicines. The approach presented in this work is simple, which in turn will help doctors in efficient disease management. In this work, arrhythmia ECG time series are collected from the MIT-BIH database. Normal ECG time series are acquired using the POLYPARA system. Both time series are analyzed in the light of the non-linear approach following the method of rescaled range analysis. The quantitative parameter, fractal dimension (D), is obtained from both types of time series. The major finding is that arrhythmia ECG shows lower values of D than normal ECG. Further, this information can be used to assess the severity of arrhythmia quantitatively, which is a new direction for prognosis, and suitable software may be developed for use in medical practice.
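
    Rescaled range (R/S) analysis, the method this record applies to ECG series, can be sketched as follows. The implementation below is a generic textbook R/S estimator; the window sizes, the fitting details, and the relation D = 2 - H for a self-affine time-series trace are standard conventions, not parameters taken from the paper:

```python
import numpy as np

def hurst_rs(series, min_window=8):
    """Estimate the Hurst exponent H by rescaled range (R/S) analysis.

    For each window size n the series is split into segments; each
    segment's range of cumulative mean-adjusted deviations (R) is divided
    by its standard deviation (S). H is the slope of log(mean R/S) vs log(n).
    """
    x = np.asarray(series, dtype=float)
    N = len(x)
    sizes, rs_values = [], []
    n = min_window
    while n <= N // 2:
        rs_per_segment = []
        for start in range(0, N - n + 1, n):
            seg = x[start:start + n]
            dev = np.cumsum(seg - seg.mean())   # cumulative deviation profile
            r = dev.max() - dev.min()           # range of the profile
            s = seg.std()
            if s > 0:
                rs_per_segment.append(r / s)
        if rs_per_segment:
            sizes.append(n)
            rs_values.append(np.mean(rs_per_segment))
        n *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_values), 1)
    return slope

rng = np.random.default_rng(42)
noise = rng.standard_normal(4096)   # uncorrelated noise: H should be near 0.5
H = hurst_rs(noise)
D = 2.0 - H                          # fractal dimension of the signal trace
```

A lower D for arrhythmia ECG, as reported above, would correspond to a higher H, i.e. a more persistent (less rough) signal trace.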

  12. A novel iris transillumination grading scale allowing flexible assessment with quantitative image analysis and visual matching.

    PubMed

    Wang, Chen; Brancusi, Flavia; Valivullah, Zaheer M; Anderson, Michael G; Cunningham, Denise; Hedberg-Buenz, Adam; Power, Bradley; Simeonov, Dimitre; Gahl, William A; Zein, Wadih M; Adams, David R; Brooks, Brian

    2018-01-01

    To develop a sensitive scale of iris transillumination suitable for clinical and research use, with the capability of either quantitative analysis or visual matching of images. Iris transillumination photographic images were used from 70 study subjects with ocular or oculocutaneous albinism. Subjects represented a broad range of ocular pigmentation. A subset of images was subjected to image analysis and ranking by both expert and nonexpert reviewers. Quantitative ordering of images was compared with ordering by visual inspection. Images were binned to establish an 8-point scale. Ranking consistency was evaluated using the Kendall rank correlation coefficient (Kendall's tau). Visual ranking results were assessed using Kendall's coefficient of concordance (Kendall's W) analysis. There was a high degree of correlation among the image-analysis, expert, and nonexpert rankings. Pairwise comparisons of the quantitative ranking with each reviewer generated an average Kendall's tau of 0.83 ± 0.04 (SD). Inter-rater correlation was also high, with Kendall's W values of 0.96, 0.95, and 0.95 for nonexpert, expert, and all reviewers, respectively. The current standard for assessing iris transillumination is expert assessment of clinical exam findings. We adapted an image-analysis technique to generate quantitative transillumination values. Quantitative ranking was shown to be highly similar to the rankings produced by both expert and nonexpert reviewers. This finding suggests that the image characteristics used to quantify iris transillumination do not require expert interpretation. Inter-rater rankings were also highly similar, suggesting that varied methods of transillumination ranking are robust in producing reproducible results.
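
    The two agreement statistics reported above have simple closed forms. Below is a no-ties sketch of Kendall's tau (tau-a) and Kendall's W; the ranking data are illustrative, not from the study:

```python
import numpy as np

def kendall_tau(rank_a, rank_b):
    """Kendall's tau-a: (concordant - discordant) pairs over total pairs."""
    a, b = np.asarray(rank_a), np.asarray(rank_b)
    n = len(a)
    concordant = discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = np.sign(a[i] - a[j]) * np.sign(b[i] - b[j])
            if s > 0:
                concordant += 1
            elif s < 0:
                discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)

def kendall_w(rankings):
    """Kendall's coefficient of concordance for m raters ranking n items.

    W = 12 S / (m^2 (n^3 - n)), where S is the sum of squared deviations
    of the per-item rank totals from their mean (no-ties form).
    """
    r = np.asarray(rankings, dtype=float)   # shape (m raters, n items)
    m, n = r.shape
    totals = r.sum(axis=0)
    s = ((totals - totals.mean()) ** 2).sum()
    return 12.0 * s / (m ** 2 * (n ** 3 - n))

# Three raters in perfect agreement over five images give W = 1.
perfect = [[1, 2, 3, 4, 5]] * 3
```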

  13. Pharmacology-based toxicity assessment: towards quantitative risk prediction in humans.

    PubMed

    Sahota, Tarjinder; Danhof, Meindert; Della Pasqua, Oscar

    2016-05-01

    Despite ongoing efforts to better understand the mechanisms underlying safety and toxicity, ~30% of the attrition in drug discovery and development is still due to safety concerns. Changes in current practice regarding the assessment of safety and toxicity are required to reduce late-stage attrition and enable effective development of novel medicines. This review focuses on the implications of empirical evidence generation for the evaluation of safety and toxicity during drug development. A shift in paradigm is needed to (i) ensure that pharmacological concepts are incorporated into the evaluation of safety and toxicity; (ii) facilitate the integration of historical evidence and thereby the translation of findings across species as well as between in vitro and in vivo experiments and (iii) promote the use of experimental protocols tailored to address specific safety and toxicity questions. Based on historical examples, we highlight the challenges for the early characterisation of the safety profile of a new molecule and discuss how model-based methodologies can be applied for the design and analysis of experimental protocols. Issues relating to the scientific rationale are categorised and presented as a hierarchical tree describing the decision-making process. Focus is given to four different areas, namely, optimisation, translation, analytical construct and decision criteria. From a methodological perspective, the relevance of quantitative methods for estimation and extrapolation of risk from toxicology and safety pharmacology experimental protocols, such as points of departure and potency, is discussed in light of advancements in population and Bayesian modelling techniques (e.g. non-linear mixed effects modelling). Their use in the evaluation of pharmacokinetics (PK) and pharmacokinetic-pharmacodynamic relationships (PKPD) has enabled great insight into the dose rationale for medicines in humans, both in terms of efficacy and adverse events. Comparable benefits…

  14. Integrated Assessment of Prevention and Restoration Actions to Combat Desertification

    NASA Astrophysics Data System (ADS)

    Bautista, S.; Orr, B. J.; Vallejo, R.

    2009-12-01

    Recent advances in desertification and land degradation research have provided valuable conceptual and analytical frameworks, degradation indicators, assessment tools and surveillance systems with respect to desertification drivers, processes, and impacts. These findings, together with stakeholders' perceptions and local/regional knowledge, have helped to define and propose measures and strategies to combat land degradation. However, integrated and comprehensive assessment and evaluation of prevention and restoration strategies and techniques to combat desertification is still lacking, and knowledge on the feasibility and cost-effectiveness of the proposed strategies over a wide range of environmental and socio-economic conditions is very scarce. To address this challenge, we have launched a multinational project (PRACTICE - Prevention and Restoration Actions to Combat Desertification. An Integrated Assessment), funded by the European Commission, in order to link S & T advances and traditional knowledge on prevention and restoration practices to combat desertification with sound implementation, learning and adaptive management, knowledge sharing, and dissemination of best practices. The key activities for pursuing this goal are (1) to establish a platform and information system of long-term monitoring sites for assessing sustainable management and actions to combat desertification, (2) to define an integrated protocol for the assessment of these actions, and (3) to link project assessment and evaluation with training and education, adaptive management, and knowledge sharing and dissemination through a participatory approach involving scientists, managers, technicians, financial officers, and members of the public who are/were impacted by the desertification control projects. Monitoring sites are distributed across Mediterranean Europe (Greece, Italy, Spain, and Portugal), Africa (Morocco, Namibia, South Africa), the Middle East (Israel), China, and South and North…

  15. How to assess and prepare health systems in low- and middle-income countries for integration of services—a systematic review

    PubMed Central

    Joshi, Rohina; Negin, Joel

    2018-01-01

    Abstract Despite growing support for integration of frontline services, a lack of information about the pre-conditions necessary to integrate such services hampers the ability of policy makers and implementers to assess how feasible or worthwhile integration may be, especially in low- and middle-income countries (LMICs). We adopted a modified systematic review with aspects of realist review, including quantitative and qualitative studies that incorporated assessment of health system preparedness for and capacity to implement integrated services. We searched Medline via Ovid, Web of Science and the Cochrane library using terms adapted from Dudley and Garner's systematic review on integration in LMICs. From an initial list of 10 550 articles, 206 were selected for full-text review by two reviewers who independently reviewed articles and inductively extracted and synthesized themes related to health system preparedness. We identified five 'context' related categories and four health system 'capability' themes. The contextual enabling and constraining factors for frontline service integration were: (1) the organizational framework of frontline services, (2) health care worker preparedness, (3) community and client preparedness, (4) upstream logistics and (5) policy and governance issues. The intersecting health system capabilities identified were the need for: (1) sufficiently functional frontline health services, (2) sufficiently trained and motivated health care workers, (3) availability of technical tools and equipment suitable to facilitate integrated frontline services and (4) appropriately devolved authority and decision-making processes to enable frontline managers and staff to adapt integration to local circumstances. Moving beyond claims that integration is defined differently by different programs and thus unsuitable for comparison, this review demonstrates that synthesis is possible. It presents a common set of contextual factors and health system…

  16. A Quantitative Approach to Assessing System Evolvability

    NASA Technical Reports Server (NTRS)

    Christian, John A., III

    2004-01-01

    When selecting a system from multiple candidates, the customer seeks the one that best meets his or her needs. Recently the desire for evolvable systems has become more important and engineers are striving to develop systems that accommodate this need. In response to this search for evolvability, we present a historical perspective on evolvability, propose a refined definition of evolvability, and develop a quantitative method for measuring this property. We address this quantitative methodology from both a theoretical and practical perspective. This quantitative model is then applied to the problem of evolving a lunar mission to a Mars mission as a case study.

  17. Quantitative sonoelastography for the in vivo assessment of skeletal muscle viscoelasticity

    NASA Astrophysics Data System (ADS)

    Hoyt, Kenneth; Kneezel, Timothy; Castaneda, Benjamin; Parker, Kevin J.

    2008-08-01

    A novel quantitative sonoelastography technique for assessing the viscoelastic properties of skeletal muscle tissue was developed. Slowly propagating shear wave interference patterns (termed crawling waves) were generated using a two-source configuration vibrating normal to the surface. Theoretical models predict crawling wave displacement fields, which were validated through phantom studies. In experiments, a viscoelastic model was fit to dispersive shear wave speed sonoelastographic data using nonlinear least-squares techniques to determine frequency-independent shear modulus and viscosity estimates. Shear modulus estimates derived using the viscoelastic model were in agreement with those obtained by mechanical testing on phantom samples. Preliminary sonoelastographic data acquired in healthy human skeletal muscles confirm that high-quality quantitative elasticity data can be acquired in vivo. Studies on relaxed muscle indicate discernible differences in both shear modulus and viscosity estimates between different skeletal muscle groups. Investigations into the dynamic viscoelastic properties of (healthy) human skeletal muscles revealed that voluntarily contracted muscles exhibit considerable increases in both shear modulus and viscosity estimates as compared to the relaxed state. Overall, preliminary results are encouraging and quantitative sonoelastography may prove clinically feasible for in vivo characterization of the dynamic viscoelastic properties of human skeletal muscle.
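
    A sketch of the fitting step described above: synthetic dispersive shear wave speeds are generated from the Voigt-model dispersion relation, and the two frequency-independent parameters are recovered by least squares. The density, parameter values, frequency range, and the grid-search optimizer are assumptions for illustration (the paper used nonlinear least-squares techniques on measured data):

```python
import numpy as np

RHO = 1000.0  # assumed tissue density, kg/m^3

def voigt_speed(omega, mu, eta):
    """Shear wave speed for a Voigt solid:
    c(w) = sqrt(2 (mu^2 + w^2 eta^2) / (rho (mu + sqrt(mu^2 + w^2 eta^2))))."""
    root = np.sqrt(mu ** 2 + omega ** 2 * eta ** 2)
    return np.sqrt(2.0 * (mu ** 2 + omega ** 2 * eta ** 2) / (RHO * (mu + root)))

# Synthetic dispersion data from known parameters (mu in Pa, eta in Pa*s).
true_mu, true_eta = 4000.0, 2.0
freqs = np.arange(100.0, 501.0, 50.0)          # Hz
omega = 2.0 * np.pi * freqs
speeds = voigt_speed(omega, true_mu, true_eta)

# Least-squares fit by a coarse grid search (keeps the sketch free of
# external optimizers; a Levenberg-Marquardt fit would be used in practice).
best = (None, None, np.inf)
for mu in np.linspace(1000.0, 8000.0, 71):      # 100 Pa steps
    for eta in np.linspace(0.5, 5.0, 46):       # 0.1 Pa*s steps
        err = np.sum((voigt_speed(omega, mu, eta) - speeds) ** 2)
        if err < best[2]:
            best = (mu, eta, err)
fit_mu, fit_eta, _ = best
```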

  18. Quantitative assessment of emphysema from whole lung CT scans: comparison with visual grading

    NASA Astrophysics Data System (ADS)

    Keller, Brad M.; Reeves, Anthony P.; Apanosovich, Tatiyana V.; Wang, Jianwei; Yankelevitz, David F.; Henschke, Claudia I.

    2009-02-01

    Emphysema is a disease of the lungs that destroys the alveolar air sacs and induces long-term respiratory dysfunction. CT scans allow for imaging of the anatomical basis of emphysema and for visual assessment by radiologists of the extent present in the lungs. Several measures have been introduced for the quantification of the extent of disease directly from CT data in order to add to the qualitative assessments made by radiologists. In this paper we compare emphysema index, mean lung density, histogram percentiles, and the fractal dimension to visual grade in order to evaluate the predictability of radiologist visual scoring of emphysema from low-dose CT scans through quantitative scores, in order to determine which measures can be useful as surrogates for visual assessment. All measures were computed over nine divisions of the lung field (whole lung, individual lungs, and upper/middle/lower thirds of each lung) for each of 148 low-dose, whole lung scans. In addition, a visual grade of each section was also given by an expert radiologist. One-way ANOVA and multinomial logistic regression were used to determine the ability of the measures to predict visual grade from quantitative score. We found that all measures were able to distinguish between normal and severe grades (p<0.01), and between mild/moderate and all other grades (p<0.05). However, no measure was able to distinguish between mild and moderate cases. Approximately 65% prediction accuracy was achieved from using quantitative score to predict visual grade, with 73% if mild and moderate cases are considered as a single class.
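
    Three of the densitometric measures named above (emphysema index, mean lung density, histogram percentile) reduce to simple statistics over the histogram of lung-voxel HU values. A sketch using the common -950 HU cutoff and the 15th-percentile (Perc15) convention; the paper's exact thresholds and its fractal-dimension measure are not reproduced:

```python
import numpy as np

def emphysema_measures(hu_values, threshold=-950.0):
    """Densitometric emphysema measures from lung-voxel HU values.

    The -950 HU emphysema-index cutoff and Perc15 are common conventions
    in the CT densitometry literature, assumed here for illustration.
    """
    hu = np.asarray(hu_values, dtype=float)
    return {
        "emphysema_index": 100.0 * np.mean(hu < threshold),  # % voxels below cutoff
        "mean_lung_density": float(hu.mean()),               # HU
        "perc15": float(np.percentile(hu, 15)),              # 15th percentile, HU
    }

# Toy lung histogram: 10% of voxels below -950 HU, the rest at -850 HU.
toy = np.concatenate([np.full(900, -850.0), np.full(100, -975.0)])
m = emphysema_measures(toy)
```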

  19. A Preliminary Quantitative Comparison of Vibratory Amplitude Using Rigid and Flexible Stroboscopic Assessment.

    PubMed

    Hosbach-Cannon, Carly J; Lowell, Soren Y; Kelley, Richard T; Colton, Raymond H

    2016-07-01

    The purpose of this study was to establish preliminary, quantitative data on amplitude of vibration during stroboscopic assessment in healthy speakers with normal voice characteristics. Amplitude of vocal fold vibration is a core physiological parameter used in diagnosing voice disorders, yet quantitative data are lacking to guide the determination of what constitutes normal vibratory amplitude. Eleven participants were assessed during sustained vowel production using rigid and flexible endoscopy with stroboscopy. Still images were extracted from digital recordings of a sustained /i/ produced at a comfortable pitch and loudness, with F0 controlled so that levels were within ±15% of each participant's comfortable mean level as determined from connected speech. Glottal width (GW), true vocal fold (TVF) length, and TVF width were measured from still frames representing the maximum open phase of the vibratory cycle. To control for anatomic and magnification differences across participants, GW was normalized to TVF length. GW as a ratio of TVF width was also computed for comparison with prior studies. Mean values and standard deviations were computed for the normalized measures. Paired t tests showed no significant differences between rigid and flexible endoscopy methods. Interrater and intrarater reliability values for raw measurements were found to be high (0.89-0.99). These preliminary quantitative data may be helpful in determining normality or abnormality of vocal fold vibration. Results indicate that quantified amplitude of vibration is similar between endoscopic methods, a clinically relevant finding for individuals performing and interpreting stroboscopic assessments. Copyright © 2016 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  20. A quantitative model to assess Social Responsibility in Environmental Science and Technology.

    PubMed

    Valcárcel, M; Lucena, R

    2014-01-01

    The awareness of the impact of human activities in society and environment is known as "Social Responsibility" (SR). It has been a topic of growing interest in many enterprises since the fifties of the past Century, and its implementation/assessment is nowadays supported by international standards. There is a tendency to amplify its scope of application to other areas of the human activities, such as Research, Development and Innovation (R + D + I). In this paper, a model of quantitative assessment of Social Responsibility in Environmental Science and Technology (SR EST) is described in detail. This model is based on well established written standards as the EFQM Excellence model and the ISO 26000:2010 Guidance on SR. The definition of five hierarchies of indicators, the transformation of qualitative information into quantitative data and the dual procedure of self-evaluation and external evaluation are the milestones of the proposed model, which can be applied to Environmental Research Centres and institutions. In addition, a simplified model that facilitates its implementation is presented in the article. © 2013 Elsevier B.V. All rights reserved.

  1. DOSIMETRY MODELING OF INHALED FORMALDEHYDE: BINNING NASAL FLUX PREDICTIONS FOR QUANTITATIVE RISK ASSESSMENT

    EPA Science Inventory

    Dosimetry Modeling of Inhaled Formaldehyde: Binning Nasal Flux Predictions for Quantitative Risk Assessment. Kimbell, J.S., Overton, J.H., Subramaniam, R.P., Schlosser, P.M., Morgan, K.T., Conolly, R.B., and Miller, F.J. (2001). Toxicol. Sci. 000, 000:000.

    Interspecies e...

  2. A standardized kit for automated quantitative assessment of candidate protein biomarkers in human plasma.

    PubMed

    Percy, Andrew J; Mohammed, Yassene; Yang, Juncong; Borchers, Christoph H

    2015-12-01

    An increasingly popular mass spectrometry-based quantitative approach for health-related research in the biomedical field involves the use of stable isotope-labeled standards (SIS) and multiple/selected reaction monitoring (MRM/SRM). To improve inter-laboratory precision and enable more widespread use of this 'absolute' quantitative technique in disease-biomarker assessment studies, methods must be standardized. Results/methodology: Using this MRM-with-SIS-peptide approach, we developed an automated method (encompassing sample preparation, processing and analysis) for quantifying 76 candidate protein markers (spanning >4 orders of magnitude in concentration) in neat human plasma. The assembled biomarker assessment kit - the 'BAK-76' - contains the essential materials (SIS mixes), methods (for acquisition and analysis), and tools (Qualis-SIS software) for performing biomarker discovery or verification studies in a rapid and standardized manner.

  3. Quantitative Assessment of Parkinsonian Tremor Based on an Inertial Measurement Unit

    PubMed Central

    Dai, Houde; Zhang, Pengyue; Lueth, Tim C.

    2015-01-01

    Quantitative assessment of parkinsonian tremor based on inertial sensors can provide reliable feedback on the effect of medication. In this regard, the features of parkinsonian tremor and its unique properties such as motor fluctuations and dyskinesia are taken into account. Least-square-estimation models are used to assess the severities of rest, postural, and action tremors. In addition, a time-frequency signal analysis algorithm for tremor state detection was also included in the tremor assessment method. This inertial sensor-based method was verified through comparison with an electromagnetic motion tracking system. Seven Parkinson’s disease (PD) patients were tested using this tremor assessment system. The measured tremor amplitudes correlated well with the judgments of a neurologist (r = 0.98). The systematic analysis of sensor-based tremor quantification and the corresponding experiments could be of great help in monitoring the severity of parkinsonian tremor. PMID:26426020
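
    Tremor-state detection from inertial data, as described above, hinges on isolating power in the tremor band. A minimal spectral sketch: the 3-8 Hz band, the sampling rate, and the peak-picking approach are generic assumptions, and the paper's least-square-estimation models are not reproduced:

```python
import numpy as np

FS = 100.0  # assumed IMU sampling rate, Hz

def tremor_peak(signal, fs=FS, band=(3.0, 8.0)):
    """Dominant frequency and spectral amplitude inside the tremor band
    (3-8 Hz is a typical parkinsonian tremor range)."""
    x = np.asarray(signal, dtype=float)
    spectrum = np.abs(np.fft.rfft(x - x.mean())) * 2.0 / len(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    idx = np.argmax(spectrum * in_band)   # zero outside the band, then peak-pick
    return freqs[idx], spectrum[idx]

# Synthetic 5 Hz tremor of amplitude 0.3 riding on slow voluntary motion.
t = np.arange(1000) / FS                  # 10 s record
sig = 0.3 * np.sin(2 * np.pi * 5.0 * t) + 0.5 * np.sin(2 * np.pi * 0.4 * t)
freq, amp = tremor_peak(sig)
```

The slow 0.4 Hz component is excluded by the band mask, so the detected peak reflects the tremor alone.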

  4. Integrated Eye Tracking and Neural Monitoring for Enhanced Assessment of Mild TBI

    DTIC Science & Technology

    2017-06-01

    report. 10 Supporting Data None. Integrated Eye Tracking and Neural Monitoring for Enhanced Assessment of Mild TBI Psychological Health...Award Number: W81XWH-13-1-0095 TITLE: Integrated Eye Tracking and Neural Monitoring for Enhanced Assessment of Mild TBI PRINCIPAL INVESTIGATOR...COVERED 08 MAR 2016 – 07 MAR 2017 4. TITLE AND SUBTITLE Integrated Eye Tracking and Neural Monitoring for Enhanced Assessment of Mild TBI 5a

  5. Teaching Integrative Physiology Using the Quantitative Circulatory Physiology Model and Case Discussion Method: Evaluation of the Learning Experience

    ERIC Educational Resources Information Center

    Rodriguez-Barbero, A.; Lopez-Novoa, J. M.

    2008-01-01

    One of the problems that we have found when teaching human physiology in a Spanish medical school is that the degree of understanding by the students of the integration between organs and systems is rather poor. We attempted to remedy this problem by using a case discussion method together with the Quantitative Circulatory Physiology (QCP)…

  6. Integrated Technology Assessment Center (ITAC) Update

    NASA Technical Reports Server (NTRS)

    Taylor, J. L.; Neely, M. A.; Curran, F. M.; Christensen, E. R.; Escher, D.; Lovell, N.; Morris, Charles (Technical Monitor)

    2002-01-01

    The Integrated Technology Assessment Center (ITAC) has developed a flexible systems analysis framework to identify long-term technology needs, quantify payoffs for technology investments, and assess the progress of ASTP-sponsored technology programs in the hypersonics area. For this, ITAC has assembled an experienced team representing a broad sector of the aerospace community and developed a systematic assessment process complete with supporting tools. Concepts for transportation systems are selected based on relevance to the ASTP and integrated concept models (ICMs) of these concepts are developed. Key technologies of interest are identified and projections are made of their characteristics with respect to their impacts on key aspects of the specific concepts of interest. Both the models and technology projections are then fed into ITAC's probabilistic systems analysis framework in ModelCenter. This framework permits rapid sensitivity analysis, single point design assessment, and a full probabilistic assessment of each concept with respect to both embedded and enhancing technologies. Probabilistic outputs are weighed against metrics of interest to ASTP using a multivariate decision making process to provide inputs for technology prioritization within the ASTP. The ITAC program is currently finishing the assessment of a two-stage-to-orbit (TSTO), rocket-based combined cycle (RBCC) concept and a TSTO turbine-based combined cycle (TBCC) concept developed by the team with inputs from NASA. A baseline all-rocket TSTO concept is also being developed for comparison. Boeing has recently submitted a performance model for their Flexible Aerospace System Solution for Tomorrow (FASST) concept and the ISAT program will provide inputs for a single-stage-to-orbit (SSTO) TBCC based concept in the near-term. Both of these latter concepts will be analyzed within the ITAC framework over the summer. This paper provides a status update of the ITAC program.

  7. Quantitative assessment of paretic limb dexterity and interlimb coordination during bilateral arm rehabilitation training.

    PubMed

    Xu, Chang; Li, Siyi; Wang, Kui; Hou, Zengguang; Yu, Ningbo

    2017-07-01

    In neuro-rehabilitation after stroke, conventional constraint-induced movement therapy (CIMT) has been well accepted. Existing bilateral trainings mostly use mirrored, symmetrical motion. However, complementary bilateral movements are dominant in activities of daily living (ADLs), and functional bilateral therapies may bring better skill transfer from training to daily life. Neurophysiological evidence is also growing. In this work, we first introduce our bilateral arm training system, realized with a haptic interface and a motion sensor, as well as the tasks that have been designed to train both the manipulation function of the paretic arm and the coordination of the bilateral upper limbs. Then, we propose quantitative measures for functional assessment of complementary bilateral training performance, including kinematic behavior indices, smoothness, submovement and bimanual coordination. After that, we describe the experiments with healthy subjects and the results with respect to these quantitative measures. The feasibility and sensitivity of the proposed indices were evaluated through comparison of unilateral and bilateral training outcomes. The proposed bilateral training system and tasks, as well as the quantitative measures, have been demonstrated effective for training and assessment of unilateral and bilateral arm functions.
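
    Movement smoothness, one of the quantitative measures listed above, is often operationalized as a dimensionless integrated squared jerk. A sketch under that assumption; the metric choice and the test signals are illustrative, and the paper's exact smoothness, submovement, and coordination indices are not reproduced:

```python
import numpy as np

def dimensionless_jerk(position, fs):
    """Dimensionless integrated squared jerk of a 1-D trajectory.

    Lower values mean smoother movement. Jerk is the third derivative of
    position; the integral of its square is normalized by duration^5 over
    amplitude^2 to remove units.
    """
    x = np.asarray(position, dtype=float)
    dt = 1.0 / fs
    jerk = np.diff(x, n=3) / dt ** 3
    duration = len(x) * dt
    amplitude = x.max() - x.min()
    return np.sum(jerk ** 2) * dt * duration ** 5 / amplitude ** 2

FS = 100.0
t = np.arange(100) / FS                           # 1 s reach
smooth = t ** 3 * (10 - 15 * t + 6 * t ** 2)      # minimum-jerk profile, 0 -> 1
rng = np.random.default_rng(0)
shaky = smooth + 0.01 * rng.standard_normal(len(t))  # same reach plus tremor-like noise

dj_smooth = dimensionless_jerk(smooth, FS)
dj_shaky = dimensionless_jerk(shaky, FS)          # much larger than dj_smooth
```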

  8. A quantitative assessment of alkaptonuria: testing the reliability of two disease severity scoring systems.

    PubMed

    Cox, Trevor F; Ranganath, Lakshminarayan

    2011-12-01

    Alkaptonuria (AKU) is caused by excessive accumulation of homogentisic acid (HGA) in body fluids, owing to a lack of the enzyme homogentisate dioxygenase; conversion of HGA to a polymeric melanin-like pigment, a process known as ochronosis, leads in turn to varied clinical manifestations. A potential treatment, a drug called nitisinone, that decreases the formation of HGA is available. However, successful demonstration of its efficacy in modifying the natural history of AKU requires an effective quantitative assessment tool. We describe two potential tools that could be used to quantify disease burden in AKU. One tool scores clinical features, including clinical assessments, investigations and questionnaires, in 15 patients with AKU. The second tool is a scoring system that includes only items obtained from questionnaires, used in 44 people with AKU. Statistical analyses were carried out on the two patient datasets to assess the AKU tools; these included the calculation of Cronbach's alpha, multidimensional scaling and simple linear regression analysis. The conclusion was that there is good evidence that the tools could be adopted as AKU assessment tools, perhaps with further refinement before use in the practical setting of a clinical trial.
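
    Cronbach's alpha, one of the statistics used above to assess the two AKU scoring tools, can be computed directly from an items-by-subjects score matrix. A minimal sketch with illustrative data:

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance
    of the total score), for a (k items x n subjects) score matrix."""
    scores = np.asarray(item_scores, dtype=float)
    k = scores.shape[0]
    item_vars = scores.var(axis=1, ddof=1).sum()      # sum of per-item variances
    total_var = scores.sum(axis=0).var(ddof=1)        # variance of subject totals
    return k / (k - 1) * (1.0 - item_vars / total_var)

# Perfectly consistent items (identical subject ordering and spread) give alpha = 1.
identical = np.array([[1.0, 2.0, 3.0, 4.0]] * 3)
```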

  9. A quantitative literature-curated gold standard for kinase-substrate pairs

    PubMed Central

    2011-01-01

    We describe the Yeast Kinase Interaction Database (KID, http://www.moseslab.csb.utoronto.ca/KID/), which contains high- and low-throughput data relevant to phosphorylation events. KID includes 6,225 low-throughput and 21,990 high-throughput interactions, from greater than 35,000 experiments. By quantitatively integrating these data, we identified 517 high-confidence kinase-substrate pairs that we consider a gold standard. We show that this gold standard can be used to assess published high-throughput datasets, suggesting that it will enable similar rigorous assessments in the future. PMID:21492431

  10. Quantitation of valve regurgitation severity by three-dimensional vena contracta area is superior to flow convergence method of quantitation on transesophageal echocardiography.

    PubMed

    Abudiab, Muaz M; Chao, Chieh-Ju; Liu, Shuang; Naqvi, Tasneem Z

    2017-07-01

    Quantitation of regurgitation severity using the proximal isovelocity surface area (PISA) method to calculate effective regurgitant orifice (ERO) area has limitations. Measurement of three-dimensional (3D) vena contracta area (VCA) accurately grades mitral regurgitation (MR) severity on transthoracic echocardiography (TTE). We evaluated 3D VCA quantitation of regurgitant jet severity using 3D transesophageal echocardiography (TEE) in 110 native mitral, aortic, and tricuspid valves and six prosthetic valves in patients with at least mild valvular regurgitation. The ASE-recommended integrative method comprising semiquantitative and quantitative assessment of valvular regurgitation was used as a reference method, including ERO area by 2D PISA for assigning severity of regurgitation grade. Mean age was 62.2±14.4 years; 3D VCA quantitation was feasible in 91% of regurgitant valves compared to 78% by the PISA method. When both methods were feasible and in the presence of a single regurgitant jet, 3D VCA and 2D PISA were similar in differentiating assigned severity (ANOVA P<.001). In valves with multiple jets, however, 3D VCA had a better correlation to assigned severity (ANOVA P<.0001). The agreement of 2D PISA and 3D VCA with the integrative method was 47% and 58% for moderate and 65% and 88% for severe regurgitation, respectively. Measurement of 3D VCA by TEE is superior to the 2D PISA method in determination of regurgitation severity in multiple native and prosthetic valves. © 2017, Wiley Periodicals, Inc.
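
    For reference, the 2D PISA method that 3D VCA is compared against derives the ERO area from a hemispheric flow-convergence model, ERO = 2*pi*r^2 * Va / Vmax. A sketch of that standard formula; the input values below are hypothetical, not from the study:

```python
import math

def pisa_ero(radius_cm, aliasing_velocity, peak_velocity):
    """Effective regurgitant orifice area (cm^2) from the hemispheric PISA
    model: ERO = 2*pi*r^2 * Va / Vmax. The PISA radius is in cm and both
    velocities must share the same units (here cm/s)."""
    return 2.0 * math.pi * radius_cm ** 2 * aliasing_velocity / peak_velocity

# Hypothetical example: PISA radius 1.0 cm at a 40 cm/s aliasing velocity,
# peak regurgitant jet velocity 500 cm/s.
ero = pisa_ero(1.0, 40.0, 500.0)   # ~0.50 cm^2
```

An ERO of ~0.5 cm^2 would fall in the severe range for MR under ASE criteria (>=0.4 cm^2), which is where single-jet disagreements between PISA and 3D VCA matter most clinically.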

  11. The Development and Validation of the Religious/Spiritually Integrated Practice Assessment Scale

    ERIC Educational Resources Information Center

    Oxhandler, Holly K.; Parrish, Danielle E.

    2016-01-01

    Objective: This article describes the development and validation of the Religious/Spiritually Integrated Practice Assessment Scale (RSIPAS). The RSIPAS is designed to assess social work practitioners' self-efficacy, attitudes, behaviors, and perceived feasibility concerning the assessment or integration of clients' religious and spiritual beliefs…

  12. Assessment of beating parameters in human induced pluripotent stem cells enables quantitative in vitro screening for cardiotoxicity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sirenko, Oksana, E-mail: oksana.sirenko@moldev.com; Cromwell, Evan F., E-mail: evan.cromwell@moldev.com; Crittenden, Carole

    2013-12-15

    evaluation of cardiotoxicity is possible in a high-throughput format. • The assay shows benefits of automated data integration across multiple parameters. • Quantitative assessment of concentration–response is possible using iPSCs. • Multi-parametric screening allows for cardiotoxicity risk assessment.

  13. Quantitative Assessment of In-solution Digestion Efficiency Identifies Optimal Protocols for Unbiased Protein Analysis*

    PubMed Central

    León, Ileana R.; Schwämmle, Veit; Jensen, Ole N.; Sprenger, Richard R.

    2013-01-01

    The majority of mass spectrometry-based protein quantification studies use peptide-centric analytical methods and thus strongly rely on efficient and unbiased protein digestion protocols for sample preparation. We present a novel objective approach to assess protein digestion efficiency using a combination of qualitative and quantitative liquid chromatography-tandem MS methods and statistical data analysis. In contrast to previous studies, we employed both standard qualitative and data-independent quantitative workflows to systematically assess trypsin digestion efficiency and bias using mitochondrial protein fractions. We evaluated nine trypsin-based digestion protocols, based on standard in-solution or on spin filter-aided digestion, including new optimized protocols. We investigated various reagents for protein solubilization and denaturation (dodecyl sulfate, deoxycholate, urea), several trypsin digestion conditions (buffer, RapiGest, deoxycholate, urea), and two methods for removal of detergents before analysis of peptides (acid precipitation or phase separation with ethyl acetate). Our data-independent quantitative liquid chromatography-tandem MS workflow quantified over 3700 distinct peptides with 96% completeness between all protocols and replicates, with an average 40% protein sequence coverage and an average of 11 peptides identified per protein. Systematic quantitative and statistical analysis of physicochemical parameters demonstrated that deoxycholate-assisted in-solution digestion combined with phase transfer allows for efficient, unbiased generation and recovery of peptides from all protein classes, including membrane proteins. This deoxycholate-assisted protocol was also optimal for spin filter-aided digestions as compared with existing methods. PMID:23792921

  14. Assessment of metabolic bone diseases by quantitative computed tomography

    NASA Technical Reports Server (NTRS)

    Richardson, M. L.; Genant, H. K.; Cann, C. E.; Ettinger, B.; Gordan, G. S.; Kolb, F. O.; Reiser, U. J.

    1985-01-01

    Advances in the radiologic sciences have permitted the development of numerous noninvasive techniques for measuring the mineral content of bone, with varying degrees of precision, accuracy, and sensitivity. The techniques of standard radiography, radiogrammetry, photodensitometry, Compton scattering, neutron activation analysis, single and dual photon absorptiometry, and quantitative computed tomography (QCT) are described and reviewed in depth. Results from previous cross-sectional and longitudinal QCT investigations are given. The authors then describe a current investigation in which they studied 269 subjects, including 173 normal women, 34 patients with hyperparathyroidism, 24 patients with steroid-induced osteoporosis, and 38 men with idiopathic osteoporosis. Spinal quantitative computed tomography, radiogrammetry, and single photon absorptiometry were performed, and a spinal fracture index was calculated for all patients. The authors found a disproportionate loss of spinal trabecular mineral compared to appendicular mineral in the men with idiopathic osteoporosis and the patients with steroid-induced osteoporosis. They observed roughly equivalent mineral loss in both the appendicular and axial regions in the hyperparathyroid patients. The appendicular cortical measurements correlated moderately well with each other but less well with spinal trabecular QCT. The spinal fracture index correlated well with QCT and less well with the appendicular measurements. Knowledge of appendicular cortical mineral status is important in its own right but is not a valid predictor of axial trabecular mineral status, which may be disproportionately decreased in certain diseases. Quantitative CT provides a reliable means of assessing the latter region of the skeleton, correlates well with the spinal fracture index (a semiquantitative measurement of end-organ failure), and offers the clinician a sensitive means of following the effects of therapy.

  15. Bayes' theorem and quantitative risk assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaplan, S.

    1994-12-31

    This paper argues that for a quantitative risk analysis (QRA) to be useful for public and private decision making, and for rallying the support necessary to implement those decisions, it is necessary that the QRA results be "trustable." Trustable means that the results are based solidly and logically on all the relevant evidence available. This, in turn, means that the quantitative results must be derived from the evidence using Bayes' theorem. Thus, the paper argues that analysts should strive to make their QRAs more clearly and explicitly Bayesian, and in this way make them more "evidence dependent" than "personality dependent."
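
    Kaplan's point, that QRA numbers should follow from the evidence via Bayes' theorem, can be illustrated with a minimal discrete update. The candidate failure rates, prior, and observed counts below are invented for illustration:

```python
import math

# Discrete Bayes update of a component failure rate (events/year).
# Hypothetical prior over three candidate rates, updated on the evidence
# "k failures observed in t years" with a Poisson likelihood.
rates = [0.01, 0.1, 1.0]
prior = [0.5, 0.3, 0.2]
k, t = 2, 10  # observed evidence

def poisson_likelihood(lam, k, t):
    mu = lam * t
    return math.exp(-mu) * mu**k / math.factorial(k)

unnorm = [p * poisson_likelihood(lam, k, t) for lam, p in zip(rates, prior)]
total = sum(unnorm)
posterior = [u / total for u in unnorm]
print([round(p, 3) for p in posterior])
```

    The posterior concentrates on the rate best supported by the data (0.1/year) even though the prior favored 0.01/year, i.e. the result is "evidence dependent" rather than "personality dependent."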

  16. Perceived leader integrity and employee job satisfaction: A quantitative study of U.S. aerospace engineers

    NASA Astrophysics Data System (ADS)

    Harper, Kay E.

    The goal of this quantitative study was to determine whether there is a significant relationship between perceived leader integrity and employee job satisfaction. The population analyzed was U.S. aerospace engineers. Two existing valid and reliable survey instruments were used to collect data: the Perceived Leader Integrity Scale developed by Craig and Gustafson, and the Minnesota Satisfaction Questionnaire created by Weiss, Dawis, England, and Lofquist. The professional networking site LinkedIn was used to invite U.S. aerospace engineers to participate. Responses were collected through Survey Monkey, and the sample data were analyzed using SPSS software. Of the 184 responses collected, 96 were incomplete, leaving 91 usable survey responses for analysis. When the results were plotted, the fitted line had a slight negative slope, indicating a very small negative relationship between perceived leader integrity and employee job satisfaction. This relationship could be interpreted to mean that as perceived leader integrity improved, employee job satisfaction decreased only slightly. One explanation is that employees focused on their negative feelings about their current job assignment when they did not have to be concerned about the level of integrity with which their leader acted. The findings reinforce the importance of employees' perception of a critical leader quality: integrity. For future research, a longitudinal study using a sampling method other than convenience sampling may better capture the relationship between perceived leader integrity and employee job satisfaction for U.S. aerospace engineers.
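
    The "slight negative slope" is just the least-squares slope of satisfaction on integrity scores. A self-contained sketch; the scores below are fabricated stand-ins for the two instruments' totals, not the study's data:

```python
# Least-squares slope of job satisfaction (y) on perceived leader
# integrity (x). Scores are fabricated stand-ins, not the study's data.
x = [30, 45, 50, 62, 70, 85, 90]   # integrity-scale totals (hypothetical)
y = [72, 70, 74, 69, 71, 68, 67]   # satisfaction totals (hypothetical)

n = len(x)
mean_x, mean_y = sum(x) / n, sum(y) / n
cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
var_x = sum((a - mean_x) ** 2 for a in x)
slope = cov / var_x  # least-squares slope of y on x
print(f"slope = {slope:.3f}")
```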

  17. Assessment and management of ecological integrity: Chapter 12

    USGS Publications Warehouse

    Kwak, Thomas J.; Freeman, Mary C.

    2010-01-01

    Assessing and understanding the impacts of human activities on aquatic ecosystems has long been a focus of ecologists, water resources managers, and fisheries scientists. While traditional fisheries management focused on single-species approaches to enhance fish stocks, there is a growing emphasis on management approaches at community and ecosystem levels. Of course, as fisheries managers shift their attention from narrow (e.g., populations) to broad organizational scales (e.g., communities or ecosystems), ecological processes and management objectives become more complex. At the community level, fisheries managers may strive for a fish assemblage that is complex, persistent, and resilient to disturbance. Aquatic ecosystem level objectives may focus on management for habitat quality and ecological processes, such as nutrient dynamics, productivity, or trophic interactions, but a long-term goal of ecosystem management may be to maintain ecological integrity. However, human users and social, economic, and political demands of fisheries management often result in a reduction of ecological integrity in managed systems, and this conflict presents a principal challenge for the modern fisheries manager. The concepts of biotic integrity and ecological integrity are being applied in fisheries science, natural resource management, and environmental legislation, but explicit definitions of these terms are elusive. Biotic integrity of an ecosystem may be defined as the capability of supporting and maintaining an integrated, adaptive community of organisms having a species composition, diversity, and functional organization comparable to that of a natural habitat of the region (Karr and Dudley 1981). Following that, ecological integrity is the summation of chemical, physical, and biological integrity. Thus, the concept of ecological integrity extends beyond fish and represents a holistic approach for ecosystem management that is especially applicable to aquatic systems.

  18. Integrated water assessment and modelling: A bibliometric analysis of trends in the water resource sector

    NASA Astrophysics Data System (ADS)

    Zare, Fateme; Elsawah, Sondoss; Iwanaga, Takuya; Jakeman, Anthony J.; Pierce, Suzanne A.

    2017-09-01

    There are substantial challenges facing humanity in the water and related sectors and purposeful integration of the disciplines, connected sectors and interest groups is now perceived as essential to address them. This article describes and uses bibliometric analysis techniques to provide quantitative insights into the general landscape of Integrated Water Resource Assessment and Modelling (IWAM) research over the last 45 years. Keywords, terms in titles, abstracts and the full texts are used to distinguish the 13,239 IWAM articles in journals and other non-grey literature. We identify the major journals publishing IWAM research, influential authors through citation counts, as well as the distribution and strength of source countries. Fruitfully, we find that the growth in numbers of such publications has continued to accelerate, and attention to both the biophysical and socioeconomic aspects has also been growing. On the other hand, our analysis strongly indicates that the former continue to dominate, partly by embracing integration with other biophysical sectors related to water - environment, groundwater, ecology, climate change and agriculture. In the social sciences the integration is occurring predominantly through economics, with the others, including law, policy and stakeholder participation, much diminished in comparison. We find there has been increasing attention to management and decision support systems, but a much weaker focus on uncertainty, a pervasive concern whose criticalities must be identified and managed for improving decision making. It would seem that interdisciplinary science still has a long way to go before crucial integration with the non-economic social sciences and uncertainty considerations are achieved more routinely.

  19. Fuzzy logic knowledge bases in integrated landscape assessment: examples and possibilities.

    Treesearch

    Keith M. Reynolds

    2001-01-01

    The literature on ecosystem management has articulated the need for integration across disciplines and spatial scales, but convincing demonstrations of integrated analysis to support ecosystem management are lacking. This paper focuses on integrated ecological assessment because ecosystem management fundamentally is concerned with integrated management, which...

  20. Systems Toxicology: From Basic Research to Risk Assessment

    PubMed Central

    2014-01-01

    Systems Toxicology is the integration of classical toxicology with quantitative analysis of large networks of molecular and functional changes occurring across multiple levels of biological organization. Society demands increasingly close scrutiny of the potential health risks associated with exposure to chemicals present in our everyday life, leading to an increasing need for more predictive and accurate risk-assessment approaches. Developing such approaches requires a detailed mechanistic understanding of the ways in which xenobiotic substances perturb biological systems and lead to adverse outcomes. Thus, Systems Toxicology approaches offer modern strategies for gaining such mechanistic knowledge by combining advanced analytical and computational tools. Furthermore, Systems Toxicology is a means for the identification and application of biomarkers for improved safety assessments. In Systems Toxicology, quantitative systems-wide molecular changes in the context of an exposure are measured, and a causal chain of molecular events linking exposures with adverse outcomes (i.e., functional and apical end points) is deciphered. Mathematical models are then built to describe these processes in a quantitative manner. The integrated data analysis leads to the identification of how biological networks are perturbed by the exposure and enables the development of predictive mathematical models of toxicological processes. This perspective integrates current knowledge regarding bioanalytical approaches, computational analysis, and the potential for improved risk assessment. PMID:24446777

  1. Systems toxicology: from basic research to risk assessment.

    PubMed

    Sturla, Shana J; Boobis, Alan R; FitzGerald, Rex E; Hoeng, Julia; Kavlock, Robert J; Schirmer, Kristin; Whelan, Maurice; Wilks, Martin F; Peitsch, Manuel C

    2014-03-17

    Systems Toxicology is the integration of classical toxicology with quantitative analysis of large networks of molecular and functional changes occurring across multiple levels of biological organization. Society demands increasingly close scrutiny of the potential health risks associated with exposure to chemicals present in our everyday life, leading to an increasing need for more predictive and accurate risk-assessment approaches. Developing such approaches requires a detailed mechanistic understanding of the ways in which xenobiotic substances perturb biological systems and lead to adverse outcomes. Thus, Systems Toxicology approaches offer modern strategies for gaining such mechanistic knowledge by combining advanced analytical and computational tools. Furthermore, Systems Toxicology is a means for the identification and application of biomarkers for improved safety assessments. In Systems Toxicology, quantitative systems-wide molecular changes in the context of an exposure are measured, and a causal chain of molecular events linking exposures with adverse outcomes (i.e., functional and apical end points) is deciphered. Mathematical models are then built to describe these processes in a quantitative manner. The integrated data analysis leads to the identification of how biological networks are perturbed by the exposure and enables the development of predictive mathematical models of toxicological processes. This perspective integrates current knowledge regarding bioanalytical approaches, computational analysis, and the potential for improved risk assessment.

  2. Assessment of work-integrated learning: comparison of the usage of a grading rubric by supervising radiographers and teachers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kilgour, Andrew J, E-mail: akilgour@csu.edu.au; Kilgour, Peter W; Gerzina, Tania

    Introduction: Professional work-integrated learning (WIL) that integrates the academic experience with off-campus professional experience placements is an integral part of many tertiary courses. Issues with the reliability and validity of assessment grades in these placements suggest that there is a need to strengthen the level of academic rigour of placements in these programmes. This study aims to compare the attitudes towards the usage of assessment rubrics of radiographers supervising medical imaging students and teachers supervising pre-service teachers. Methods: WIL placement assessment practices in two programmes, pre-service teacher training (Avondale College of Higher Education, NSW) and medical diagnostic radiography (Faculty of Health Sciences, University of Sydney, NSW), were compared with a view to comparing assessment strategies across these two different educational domains. Educators (course coordinators) responsible for teaching professional development placements of teacher trainees and diagnostic radiography students developed a standards-based grading rubric designed to guide assessors' evaluation of students' work during WIL placements. After ∼12 months of implementation of the rubrics, assessors' reactions to the effectiveness and usefulness of the grading rubric were determined using a specially created survey form. Data were collected from March to June 2011. Quantitative and qualitative data showed that assessors in both programmes considered the grading rubric a vital tool in the assessment process, though teacher supervisors were more positive about the benefits of its use than the radiographer supervisors. Results: Benefits of the grading rubric included accuracy and consistency of grading, ability to identify specific areas of desired development and facilitation of the provision of supervisor feedback. The use of assessment grading rubrics is of benefit to assessors in WIL placements from two very different

  3. Infusion of Quantitative and Statistical Concepts into Biology Courses Does Not Improve Quantitative Literacy

    ERIC Educational Resources Information Center

    Beck, Christopher W.

    2018-01-01

    Multiple national reports have pushed for the integration of quantitative concepts into the context of disciplinary science courses. The aim of this study was to evaluate the quantitative and statistical literacy of biology students and explore learning gains when those skills were taught implicitly in the context of biology. I examined gains in…

  4. The application of quantitative risk assessment to microbial food safety risks.

    PubMed

    Jaykus, L A

    1996-01-01

    Regulatory programs and guidelines for the control of foodborne microbial agents have existed in the U.S. for nearly 100 years. However, increased awareness of the scope and magnitude of foodborne disease, as well as the emergence of previously unrecognized human pathogens transmitted via the foodborne route, have prompted regulatory officials to consider new and improved strategies to reduce the health risks associated with pathogenic microorganisms in foods. Implementation of these proposed strategies will involve definitive costs for a finite level of risk reduction. While regulatory decisions regarding the management of foodborne disease risk have traditionally been done with the aid of the scientific community, a formal conceptual framework for the evaluation of health risks from pathogenic microorganisms in foods is warranted. Quantitative risk assessment (QRA), which is formally defined as the technical assessment of the nature and magnitude of a risk caused by a hazard, provides such a framework. Reproducing microorganisms in foods present a particular challenge to QRA because both their introduction and numbers may be affected by numerous factors within the food chain, with all of these factors representing significant stages in food production, handling, and consumption, in a farm-to-table type of approach. The process of QRA entails four designated phases: (1) hazard identification, (2) exposure assessment, (3) dose-response assessment, and (4) risk characterization. Specific analytical tools are available to accomplish the analyses required for each phase of the QRA. The purpose of this paper is to provide a description of the conceptual framework for quantitative microbial risk assessment within the standard description provided by the National Academy of Sciences (NAS) paradigm. 
Each of the sequential steps in QRA is discussed in detail, providing information on current applications, tools for conducting the analyses, and methodological and/or data…
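
    The exposure-assessment, dose-response, and risk-characterization phases can be sketched end to end with the standard exponential dose-response model, P(ill) = 1 − exp(−r·dose). The infectivity parameter, dose, and serving frequency below are assumed values for illustration, not data for any real pathogen:

```python
import math

# Minimal QRA sketch: exposure -> dose-response -> risk characterization.
# Exponential dose-response model; all parameter values are assumptions.
r = 0.005                 # per-organism infectivity parameter (assumed)
dose = 1.0                # ingested organisms per serving (assumed exposure)
servings_per_year = 50    # consumption frequency (assumed)

p_per_serving = 1 - math.exp(-r * dose)                    # dose-response
p_annual = 1 - (1 - p_per_serving) ** servings_per_year    # risk characterization
print(f"per-serving risk = {p_per_serving:.4f}, annual risk = {p_annual:.3f}")
```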

  5. Integrating Exposure into Chemical Alternatives Assessment ...

    EPA Pesticide Factsheets

    Most alternatives assessments (AA) published to date are largely hazard-based rankings, and as such may not represent a fully informed consideration of the advantages and disadvantages of possible alternatives. With an assessment goal of identifying an alternative chemical that is more sustainable, other attributes beyond hazard are also important, including exposure, risk, life-cycle impacts, performance, cost, and social responsibility. Building on the 2014 recommendations by the U.S. National Academy of Sciences to improve AA decisions by including comparative exposure assessment, the HESI Sustainable Chemical Alternatives Technical Committee, which consists of scientists from academia, industry, government, and NGOs, has developed a qualitative comparative exposure approach. Conducting such a comparison can screen for alternatives that are expected to have a higher exposure potential, which could trigger a higher-tiered, more quantitative exposure assessment on the alternatives being considered. This talk will demonstrate an approach for including chemical- and product-related exposure information in a qualitative AA comparison. Starting from existing hazard AAs, a series of four chemical-product application scenarios were examined to test the concept, to understand the effort required, and to determine the value of exposure data in AA decision-making. The group has developed a classification approach for ingredient and product parameters to support comparison

  6. Qualitative and quantitative approaches in the dose-response assessment of genotoxic carcinogens.

    PubMed

    Fukushima, Shoji; Gi, Min; Kakehashi, Anna; Wanibuchi, Hideki; Matsumoto, Michiharu

    2016-05-01

    Qualitative and quantitative approaches are important issues in the field of carcinogenic risk assessment of genotoxic carcinogens. Herein, we provide quantitative data on low-dose hepatocarcinogenicity studies for three genotoxic hepatocarcinogens: 2-amino-3,8-dimethylimidazo[4,5-f]quinoxaline (MeIQx), 2-amino-3-methylimidazo[4,5-f]quinoline (IQ) and N-nitrosodiethylamine (DEN). Hepatocarcinogenicity was examined by quantitative analysis of glutathione S-transferase placental form (GST-P) positive foci, which are the preneoplastic lesions in rat hepatocarcinogenesis and the endpoint carcinogenic marker in the rat liver medium-term carcinogenicity bioassay. We also examined DNA damage and gene mutations occurring during the initiation stage of carcinogenesis. For the establishment of points of departure (PoD) from which the cancer-related risk can be estimated, we analyzed the above events by quantitative no-observed-effect level and benchmark dose approaches. MeIQx at low doses induced formation of DNA-MeIQx adducts; somewhat higher doses caused elevation of 8-hydroxy-2'-deoxyguanosine levels; at still higher doses gene mutations occurred; and the highest dose induced formation of GST-P positive foci. These data indicate that early genotoxic events in the pathway to carcinogenesis showed the expected trend of lower PoDs for earlier events in the carcinogenic process. Similarly, only the highest dose of IQ caused an increase in the number of GST-P positive foci in the liver, while IQ-DNA adduct formation was observed at low doses. Moreover, treatment with DEN at low doses had no effect on development of GST-P positive foci in the liver. These PoD data for the markers help clarify whether genotoxic carcinogens have a threshold for their carcinogenicity. The most appropriate approach to low dose-response assessment must be chosen on the basis of scientific judgment. © The Author 2015.
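
    A benchmark-dose (BMD) point of departure of the kind used here can be sketched as the dose producing a chosen benchmark response above background. A linear interpolation between observed dose groups is assumed purely for illustration; real BMD work fits richer models, and all numbers below are hypothetical:

```python
# Benchmark-dose sketch: find the dose giving a 5% extra response over
# background, by linear interpolation between hypothetical dose groups.
doses = [0.0, 10.0, 50.0, 100.0]      # mg/kg/day (hypothetical)
responses = [0.01, 0.02, 0.06, 0.11]  # fraction of animals with GST-P positive foci

background = responses[0]
bmr = 0.05                  # benchmark response: 5% extra risk
target = background + bmr

# Interpolate within the interval that brackets the target response.
for (d0, r0), (d1, r1) in zip(zip(doses, responses),
                              zip(doses[1:], responses[1:])):
    if r0 <= target <= r1:
        bmd = d0 + (target - r0) * (d1 - d0) / (r1 - r0)
        break
print(f"BMD = {bmd:.1f} mg/kg/day")
```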

  7. Assessing Vermont's stream health and biological integrity using artificial neural networks and Bayesian methods

    NASA Astrophysics Data System (ADS)

    Rizzo, D. M.; Fytilis, N.; Stevens, L.

    2012-12-01

    Environmental managers are increasingly required to monitor and forecast long-term effects and vulnerability of biophysical systems to human-generated stresses. Ideally, a study involving both physical and biological assessments conducted concurrently (in space and time) could provide a better understanding of the mechanisms and complex relationships. However, costs and resources associated with monitoring the complex linkages between the physical, geomorphic and habitat conditions and the biological integrity of stream reaches are prohibitive. Researchers have used classification techniques to place individual streams and rivers into a broader spatial context (hydrologic or health condition). Such efforts require environmental managers to gather multiple forms of information - quantitative, qualitative and subjective. We research and develop a novel classification tool that combines self-organizing maps with a Naïve Bayesian classifier to direct resources to stream reaches most in need. The Vermont Agency of Natural Resources has developed and adopted protocols for physical stream geomorphic and habitat assessments throughout the state of Vermont. Separate from these assessments, the Vermont Department of Environmental Conservation monitors the biological communities and the water quality in streams. Our initial hypothesis is that the geomorphic reach assessments and water quality data may be leveraged to reduce error and uncertainty associated with predictions of biological integrity and stream health. We test our hypothesis using over 2500 Vermont stream reaches (~1371 stream miles) assessed by the two agencies. In the development of this work, we combine a Naïve Bayesian classifier with a modified Kohonen Self-Organizing Map (SOM). The SOM is an unsupervised artificial neural network that autonomously analyzes inherent dataset properties using input data only. It is typically used to cluster data into similar categories when a priori classes do not exist. 
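
    The SOM half of the proposed tool can be sketched in a few lines of competitive learning (the Naïve Bayes step is omitted). The data are synthetic stand-ins for paired geomorphic and water-quality scores, and the grid size and learning schedule are arbitrary choices:

```python
import math, random

# Minimal self-organizing map: 2-D inputs, 1-D grid of 4 nodes.
# Synthetic two-cluster data stand in for (geomorphic, water-quality) scores.
random.seed(0)
data = ([(random.gauss(0.2, 0.05), random.gauss(0.2, 0.05)) for _ in range(30)] +
        [(random.gauss(0.8, 0.05), random.gauss(0.8, 0.05)) for _ in range(30)])

nodes = [[random.random(), random.random()] for _ in range(4)]  # SOM weights

def bmu(x):
    # Index of the best-matching unit (closest node to sample x).
    return min(range(len(nodes)),
               key=lambda i: (nodes[i][0] - x[0])**2 + (nodes[i][1] - x[1])**2)

for epoch in range(50):
    lr = 0.5 * (1 - epoch / 50)          # decaying learning rate
    for x in data:
        b = bmu(x)
        for i, w in enumerate(nodes):
            h = math.exp(-abs(i - b))    # neighborhood function on the 1-D grid
            w[0] += lr * h * (x[0] - w[0])
            w[1] += lr * h * (x[1] - w[1])

clusters = {i: bmu(data[i]) for i in range(len(data))}  # reach -> node assignment
```

    After training, reaches from the two synthetic condition classes map to different regions of the node grid; in the full tool those node clusters would feed the Bayesian classifier.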

  8. A quantitative method for risk assessment of agriculture due to climate change

    NASA Astrophysics Data System (ADS)

    Dong, Zhiqiang; Pan, Zhihua; An, Pingli; Zhang, Jingting; Zhang, Jun; Pan, Yuying; Huang, Lei; Zhao, Hui; Han, Guolin; Wu, Dong; Wang, Jialin; Fan, Dongliang; Gao, Lin; Pan, Xuebiao

    2018-01-01

    Climate change has greatly affected agriculture. Agriculture faces increasing risks because of its sensitivity and vulnerability to climate change. Scientific assessment of climate change-induced agricultural risks could help society deal actively with climate change and ensure food security. However, quantitative assessment of risk is a difficult issue. Here, based on the IPCC assessment reports, a quantitative method for risk assessment of agriculture due to climate change is proposed. Risk is described as the product of the degree of loss and its probability of occurrence. The degree of loss can be expressed by the yield change amplitude, and the probability of occurrence can be calculated with the new concept of climate change effect-accumulated frequency (CCEAF). Specific steps of the assessment method are suggested, and the method is shown to be feasible and practical using spring wheat in Wuchuan County, Inner Mongolia, as a test example. The results show that the fluctuation of spring wheat yield increased with the warming and drying climatic trend in Wuchuan County. The maximum yield decrease and its probability were 3.5 and 64.6%, respectively, for the maximum temperature increase of 88.3%, and its risk was 2.2%. The maximum yield decrease and its probability were 14.1 and 56.1%, respectively, for the maximum precipitation decrease of 35.2%, and its risk was 7.9%. For the combined impacts of temperature and precipitation, the maximum yield decrease and its probability were 17.6 and 53.4%, respectively, and the risk increased to 9.4%. Without appropriate adaptation strategies, the degree of loss from the negative impacts of multiple climatic factors and its probability of occurrence will both increase, and the risk will grow accordingly.
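
    The paper's definition, risk = degree of loss × probability of occurrence, can be checked directly against the three scenarios reported in the abstract (the input values are taken from the abstract itself):

```python
# Risk = degree of loss x probability of occurrence, reproducing the three
# spring-wheat scenarios reported in the abstract for Wuchuan County.
scenarios = {
    "temperature":   (0.035, 0.646),  # (max yield decrease, probability)
    "precipitation": (0.141, 0.561),
    "combined":      (0.176, 0.534),
}
risk = {k: loss * prob for k, (loss, prob) in scenarios.items()}
for k, v in risk.items():
    print(f"{k}: {v:.4f}")
```

    The products recover the abstract's ~2.2%, 7.9%, and 9.4% within rounding.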

  9. Promoting the safety performance of industrial radiography using a quantitative assessment system.

    PubMed

    Kardan, M R; Mianji, F A; Rastkhah, N; Babakhani, A; Azad, S Borhan

    2006-12-01

    The increasing number of industrial radiographers and their considerable occupational exposure has been one of the main concerns of the Iran Nuclear Regulatory Authority (INRA) in recent years. In 2002, a quantitative system of evaluating the safety performance of licensees and a complementary enforcement system was introduced by the National Radiation Protection Department (NRPD). Each parameter of the practice is given a weighting factor according to its importance to safety. Assessment of the licensees is done quantitatively by summing up their scores using prepared tables. Implementing this system of evaluation showed a considerable decrease in deficiencies in the various centres. Tables are updated regularly as a result of findings during the inspections. This system is used in addition to enforcement to promote safety performance and to increase the culture of safety in industrial radiography.
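
    The scoring scheme described, parameters weighted by safety importance and summed per licensee, reduces to a weighted sum. A hypothetical reconstruction; the parameter names, weights, and marks below are invented, not NRPD values:

```python
# Hypothetical NRPD-style safety score: weighted sum of per-parameter
# marks, with weights reflecting importance to safety. All values invented.
weights = {"dosimetry": 0.3, "source security": 0.4,
           "training": 0.2, "records": 0.1}
marks = {"dosimetry": 0.8, "source security": 0.5,
         "training": 1.0, "records": 0.9}   # inspection marks on a 0..1 scale

score = sum(weights[p] * marks[p] for p in weights)
print(f"licensee safety score = {score:.2f}")  # 0..1 scale
```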

  10. Priority survey between indicators and analytic hierarchy process analysis for green chemistry technology assessment.

    PubMed

    Kim, Sungjune; Hong, Seokpyo; Ahn, Kilsoo; Gong, Sungyong

    2015-01-01

    This study presents the indicators and proxy variables for the quantitative assessment of green chemistry technologies and evaluates the relative importance of each assessment element by consulting experts from the fields of ecology, chemistry, safety, and public health. The results were subjected to an analytic hierarchy process to obtain the weights of the indicators and the proxy variables. These weights make it possible to integrate indicator-level quantitative assessment results without falling back on qualitative judgment in the absence of inter-indicator weights. This study points to the limitations of current quantitative assessment techniques for green chemistry technologies and seeks to set the future direction for their quantitative assessment.
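
    The analytic hierarchy process derives such weights from a pairwise-comparison matrix; a common approximation of the principal eigenvector averages the normalized columns. A sketch with an illustrative 3×3 matrix (not the study's expert judgments):

```python
# AHP weight derivation by column normalization (an approximation of the
# principal eigenvector). The comparison matrix is illustrative only.
A = [
    [1.0, 3.0, 5.0],   # e.g. ecology vs chemistry vs safety (hypothetical)
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
]
n = len(A)
col_sums = [sum(A[i][j] for i in range(n)) for j in range(n)]
# Average each row of the column-normalized matrix to get the weights.
weights = [sum(A[i][j] / col_sums[j] for j in range(n)) / n for i in range(n)]
print([round(w, 3) for w in weights])
```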

  11. Integrated Science Assessment (ISA) for Sulfur Oxides ...

    EPA Pesticide Factsheets

    EPA announced the availability of the final report, Integrated Science Assessment (ISA) for Sulfur Oxides – Health Criteria. This report represents a concise synthesis and evaluation of the most policy-relevant science and will ultimately provide the scientific basis for EPA's decision regarding whether the current standard for oxides of sulfur (SO2) sufficiently protects public health. The Integrated Plan for Review of the Primary NAAQS for SOx (U.S. EPA, 2007) identifies key policy-relevant questions that provide a framework for this review of the scientific evidence. These questions frame the entire review of the NAAQS and thus are informed by both science and policy considerations. The ISA organizes and presents the scientific evidence so that, considered along with findings from risk analyses and policy considerations, it will help the EPA address these questions in completing the NAAQS review.

  12. A relative quantitative assessment of myocardial perfusion by first-pass technique: animal study

    NASA Astrophysics Data System (ADS)

    Chen, Jun; Zhang, Zhang; Yu, Xuefang; Zhou, Kenneth J.

    2015-03-01

    The purpose of this study was to quantitatively assess myocardial perfusion by the first-pass technique in a swine model. Numerous techniques based on the analysis of Computed Tomography (CT) Hounsfield Unit (HU) density have emerged. Although these methods have been proposed for assessing haemodynamically significant coronary artery stenosis, they have recognized limitations, and new techniques are still needed. Experiments were performed on five (5) closed-chest swine. Balloon catheters were placed into the coronary artery to simulate different degrees of luminal stenosis. Myocardial Blood Flow (MBF) was measured using the color microsphere technique. Fractional Flow Reserve (FFR) was measured using a pressure wire. CT examinations were performed twice during the first-pass phase under adenosine-stress conditions. CT HU Density (HUDCT) and CT HU Density Ratio (HUDRCT) were calculated from the acquired CT images. Our study shows that HUDRCT correlates well (y=0.07245+0.09963x, r2=0.898) with MBF and FFR. In receiver operating characteristic (ROC) curve analyses, HUDRCT provides excellent diagnostic performance for the detection of significant ischemia during adenosine stress as defined by FFR, indicated by an Area Under the Curve (AUC) value of 0.927. HUDRCT has the potential to be developed as a useful indicator for quantitative assessment of myocardial perfusion.
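    The reported correlation can be reproduced in form (not in data) with an ordinary least-squares fit. The paired values below are hypothetical, chosen only to lie near the published regression line y=0.07245+0.09963x; they are not the study's measurements.

```python
def linear_fit(xs, ys):
    """Ordinary least-squares fit y = a + b*x, returning (a, b, r_squared)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    a = my - b * mx
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return a, b, 1.0 - ss_res / ss_tot

# Hypothetical paired measurements: FFR (x) against HUDRCT (y).
ffr = [0.45, 0.60, 0.72, 0.80, 0.91]
hudrct = [0.118, 0.132, 0.144, 0.152, 0.163]
a, b, r2 = linear_fit(ffr, hudrct)
```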

  13. Quantitative optical frequency domain imaging assessment of in-stent structures in patients with ST-segment elevation myocardial infarction: impact of imaging sampling rate.

    PubMed

    Muramatsu, Takashi; García-García, Hector M; Lee, Il Soo; Bruining, Nico; Onuma, Yoshinobu; Serruys, Patrick W

    2012-01-01

    The impact of the sampling rate (SR) of optical frequency domain imaging (OFDI) on quantitative assessment of in-stent structures (ISS) such as plaque prolapse and thrombus remains unexplored. OFDI after stenting was performed in ST-segment elevation myocardial infarction (STEMI) patients using a TERUMO OFDI system (Terumo Europe, Leuven, Belgium) at 160 frames/s and a pullback speed of 20 mm/s. A total of 126 stented segments were analyzed. ISS were classified as either attached or non-attached to stent area boundaries. The volume, mean area and largest area of ISS were assessed at 4 SR frequencies, corresponding to distances between the analyzed frames of 0.125, 0.25, 0.50 and 1.0 mm. ISS volume was calculated by integrating cross-sectional ISS areas multiplied by each sampling distance using the disk summation method. The volume and mean area of ISS became significantly larger, while the largest area became significantly smaller, as the sampling distance increased (1.11 mm² for 0.125 mm vs. 1.00 mm² for 1.0 mm, P for trend=0.036). In addition, the variance of differences was positively associated with increasing sampling distance. Quantification of ISS is significantly influenced by the applied SR frequency. This should be taken into account when designing future OFDI studies in which quantitative assessment of ISS is critical for the evaluation of STEMI patients.
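    The disk summation method used above is simple to state: each cross-sectional area is treated as a disk whose thickness equals the sampling distance, and the disks are summed. A minimal sketch with hypothetical areas:

```python
def disk_summation_volume(areas_mm2, sampling_distance_mm):
    """ISS volume via the disk summation method: each cross-sectional
    area is treated as a disk of thickness equal to the sampling distance."""
    return sum(areas_mm2) * sampling_distance_mm

# Hypothetical cross-sectional ISS areas (mm^2) measured every 0.25 mm.
areas = [0.8, 1.1, 1.3, 1.0, 0.6]
volume = disk_summation_volume(areas, 0.25)  # 4.8 mm^2 * 0.25 mm = 1.2 mm^3
```

    The study's finding follows directly from this formula: coarser sampling (larger distance, fewer frames) changes which cross-sections enter the sum, biasing the volume estimate.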

  14. Using category theory to assess the relationship between consciousness and integrated information theory.

    PubMed

    Tsuchiya, Naotsugu; Taguchi, Shigeru; Saigo, Hayato

    2016-06-01

    One of the most mysterious phenomena in science is the nature of conscious experience. Because of its subjective nature, reductionist approaches have had a hard time addressing some fundamental questions about consciousness. These questions are squarely and quantitatively tackled by a recently developed theoretical framework, called integrated information theory (IIT) of consciousness. In particular, IIT proposes that a maximally irreducible conceptual structure (MICS) is identical to conscious experience. However, there has been no principled way to assess the claimed identity. Here, we propose to apply a mathematical formalism, category theory, to assess the proposed identity, and suggest that it is important to consider whether there exists a proper translation (in category-theoretic terms, a functor) between the domain of conscious experience and that of the MICS. If such a translation exists, we postulate that questions in one domain can be answered in the other; very difficult questions in the domain of consciousness could be resolved in the domain of mathematics. We claim that it is possible to empirically test whether such a functor exists, using a combination of neuroscientific and computational approaches. Our general, principled and empirical framework allows us to assess the relationship between the domain of consciousness and the domain of mathematical structures, including those suggested by IIT. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  15. A correlative imaging based methodology for accurate quantitative assessment of bone formation in additive manufactured implants.

    PubMed

    Geng, Hua; Todd, Naomi M; Devlin-Mullin, Aine; Poologasundarampillai, Gowsihan; Kim, Taek Bo; Madi, Kamel; Cartmell, Sarah; Mitchell, Christopher A; Jones, Julian R; Lee, Peter D

    2016-06-01

    A correlative imaging methodology was developed to accurately quantify bone formation in the complex lattice structure of additive manufactured implants. Micro computed tomography (μCT) and histomorphometry were combined, integrating the best features of both while highlighting the limitations of each imaging modality. This semi-automatic methodology used a coarse-graining technique to speed the registration of 2D histology sections to high-resolution 3D μCT datasets. Once registered, qualitative and quantitative histomorphometric bone descriptors were directly correlated with 3D quantitative bone descriptors such as bone ingrowth and bone contact. The correlative imaging allowed the significant volumetric shrinkage of histology sections to be quantified for the first time (~15%). The technique also demonstrated the importance of the location of the histological section, showing that an offset of up to 30% can be introduced. The results were used to quantitatively demonstrate the effectiveness of 3D-printed titanium lattice implants.

  16. Integrating ethics in health technology assessment: many ways to Rome.

    PubMed

    Hofmann, Björn; Oortwijn, Wija; Bakke Lysdahl, Kristin; Refolo, Pietro; Sacchini, Dario; van der Wilt, Gert Jan; Gerhardus, Ansgar

    2015-01-01

    The aim of this study was to identify and discuss appropriate approaches to integrating ethical inquiry in health technology assessment (HTA). The key question is how ethics can be integrated in HTA. This is addressed in two steps: by investigating what it means to integrate ethics in HTA, and by assessing how suitable the various methods in ethics are for integration in HTA according to these meanings of integration. In the first step, we found that integrating ethics can mean that ethics is (a) subsumed under or (b) combined with other parts of the HTA process; that it is (c) coordinated with other parts; or that (d) ethics actively interacts with and changes other parts of the HTA process. In the second step, we found that the various methods in ethics have different merits with respect to the four conceptions of integration in HTA. Traditional approaches in moral philosophy tend to be best suited to being subsumed or combined, while processual approaches, which are close to the HTA or implementation process, appear best suited to coordinated and interactive types of integration. The article provides a guide for choosing the ethics approach that appears most appropriate for the goals and process of a particular HTA.

  17. Quantitative assessment of risk reduction from hand washing with antibacterial soaps.

    PubMed

    Gibson, L L; Rose, J B; Haas, C N; Gerba, C P; Rusin, P A

    2002-01-01

    The Centers for Disease Control and Prevention have estimated that there are 3,713,000 cases of infectious disease associated with day care facilities each year. The objective of this study was to examine the risk reduction achieved by using different soap formulations after diaper changing, using a quantitative microbial risk assessment approach. To achieve this, a probability-of-infection model and an exposure assessment based on micro-organism transfer were used to evaluate the efficacy of different soap formulations in reducing the probability of disease following hand contact with an enteric pathogen. Based on this model, it was determined that the probability of infection ranged from 24/100 to 91/100 for those changing diapers of babies with symptomatic shigellosis who used a control product (soap without an antibacterial ingredient), 22/100 to 91/100 for those who used an antibacterial soap (chlorhexidine 4%), and 15/100 to 90/100 for those who used a triclosan (1.5%) antibacterial soap. Those changing diapers of babies with asymptomatic shigellosis who used a non-antibacterial control soap had a risk between 49/100,000 and 53/100, those who used the 4% chlorhexidine-containing soap had a risk between 43/100,000 and 51/100, and those who used a 1.5% triclosan soap had a risk between 21/100,000 and 43/100. Adequate washing of hands after diapering reduces risk, which can be lowered by a further 20% through the use of an antibacterial soap. Quantitative risk assessment is a valuable tool in the evaluation of household sanitizing agents and low-risk outcomes.
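    The abstract does not specify the dose-response model used; a common choice in quantitative microbial risk assessment is the exponential model, P(infection) = 1 - exp(-r * dose). The sketch below uses hypothetical doses and an assumed infectivity parameter r, purely to illustrate the calculation.

```python
import math

def p_infection_exponential(dose, r):
    """Exponential dose-response model commonly used in QMRA:
    P(infection) = 1 - exp(-r * dose), with pathogen-specific parameter r."""
    return 1.0 - math.exp(-r * dose)

# Hypothetical exposure: residual organisms transferred hand-to-mouth after
# washing, with an assumed per-organism infectivity r (not from the study).
dose_after_control_soap = 1000.0  # CFU, control soap
dose_after_antibacterial = 800.0  # CFU, 20% further reduction
r = 5e-4
p_control = p_infection_exponential(dose_after_control_soap, r)
p_anti = p_infection_exponential(dose_after_antibacterial, r)
```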

  18. Developing integrative primary healthcare delivery: adding a chiropractor to the team.

    PubMed

    Garner, Michael J; Birmingham, Michael; Aker, Peter; Moher, David; Balon, Jeff; Keenan, Dirk; Manga, Pran

    2008-01-01

    The use of complementary and alternative medicine has been increasing in Canada despite the lack of coverage under the universal public health insurance system. Physicians and other healthcare practitioners are now being placed in multidisciplinary teams, yet little research on integration exists. We sought to investigate the effect of integrating chiropractic on the attitudes of providers on two healthcare teams. A mixed methods design with both quantitative and qualitative components was used to assess the healthcare teams. Assessment occurred prior to integration, at midstudy, and at the end of the study (18 months). Multidisciplinary healthcare teams at two community health centers in Ottawa, Ontario, participated in the study. All physicians, nurse practitioners, and degree-trained nurses employed at two study sites were approached to take part in the study. A chiropractor was introduced into each of the two healthcare teams. A quantitative questionnaire assessed providers' opinions, experiences with collaboration, and perceptions of chiropractic care. Focus groups were used to encourage providers to communicate their experiences and perceptions of the integration and of chiropractic. Twelve providers were followed for the full 18 months of integration. The providers expressed increased willingness to trust the chiropractors in shared care (F value = 7.18; P = .004). Questions regarding the legitimacy (F value = 12.33; P < .001) and effectiveness (F value = 11.17; P < .001) of chiropractic became increasingly positive by study end. This project has demonstrated the successful integration of chiropractors into primary healthcare teams.

  19. Computer-Assisted Quantitative Assessment of Prostatic Calcifications in Patients with Chronic Prostatitis.

    PubMed

    Boltri, Matteo; Magri, Vittorio; Montanari, Emanuele; Perletti, Gianpaolo; Trinchieri, Alberto

    2018-04-26

    The aim of this study was to develop a quantitative assessment of prostatic calcifications at prostatic ultrasound examination through the use of an image analyzer. A group of 82 patients was evaluated by medical history and physical and transrectal ultrasound examination. Patients had a urethral swab, a 4-specimen study, and culture of the seminal fluid. Patients were classified according to the National Institute of Diabetes and Digestive and Kidney Diseases/National Institutes of Health classification. Subjective symptoms were scored with the Chronic Prostatitis Symptom Index (CPSI) questionnaire. Ultrasound images were analyzed with the digital processing software ImageJ to quantitatively assess the presence of calcifications. Computer-assessed calcified areas were significantly higher in chronic bacterial prostatitis (n = 18; group II; 6.76 ± 8.09%) than in chronic pelvic pain syndrome groups IIIa (n = 26; 2.07 ± 1.01%) and IIIb (n = 38; 2.31 ± 2.18%). The area of calcification of the prostate was significantly related to the CPSI score for the micturition domain (r = 0.278, p = 0.023), Prostate-Specific Antigen values (r = 0.341, p = 0.005), postvoiding residual urine (r = 0.262, p = 0.032), total prostate volume (r = 0.592, p = 0.000), and adenoma volume (r = 0.593, p = 0.000). The presence of calcifications is more frequently observed in patients with chronic bacterial prostatitis and is related to urinary symptoms. © 2018 S. Karger AG, Basel.

  20. A Compressed Sensing-Based Wearable Sensor Network for Quantitative Assessment of Stroke Patients

    PubMed Central

    Yu, Lei; Xiong, Daxi; Guo, Liquan; Wang, Jiping

    2016-01-01

    Clinical rehabilitation assessment is an important part of the therapy process because it is the premise for prescribing suitable rehabilitation interventions. However, the commonly used assessment scales have two drawbacks: (1) they are susceptible to subjective factors; (2) they offer only a few rating levels and are subject to a ceiling effect, making it impossible to detect further improvement in movement precisely. Meanwhile, energy constraints are a primary design consideration in wearable sensor network systems, since they are often battery-operated. Traditionally, for wearable sensor network systems that follow the Shannon/Nyquist sampling theorem, large amounts of data must be sampled and transmitted. This paper proposes a novel wearable sensor network system to monitor and quantitatively assess upper limb motion function, based on compressed sensing technology. With the sparse representation model, less data is transmitted to the computer than with traditional systems. The experimental results show that the accelerometer signals of Bobath handshake and shoulder-touch exercises can be compressed, with the compressed signal less than 1/3 of the raw signal length. More importantly, the reconstruction errors have no influence on the predictive accuracy of the Brunnstrom stage classification model. The results also indicate that the proposed system not only reduces the amount of data sampled and transmitted, but also yields reconstructed accelerometer signals that can be used for quantitative assessment without any loss of useful information. PMID:26861337
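    The core idea of compressed sensing, recovering a sparse signal from far fewer measurements than the Shannon/Nyquist rate requires, can be illustrated with a toy example. This is not the paper's pipeline: real systems use random sensing matrices and solvers such as orthogonal matching pursuit, whereas this sketch recovers a 1-sparse signal by exhaustive search over its support.

```python
def matvec(A, x):
    """Plain matrix-vector product for small lists-of-lists matrices."""
    return [sum(a * b for a, b in zip(row, x)) for row in A]

def recover_1sparse(Phi, y):
    """Recover a 1-sparse signal from y = Phi @ x by exhaustive search
    over the support (a toy stand-in for matching-pursuit solvers)."""
    n = len(Phi[0])
    best = None
    for k in range(n):
        col = [row[k] for row in Phi]
        amp = sum(c * v for c, v in zip(col, y)) / sum(c * c for c in col)
        resid = sum((v - amp * c) ** 2 for c, v in zip(col, y))
        if best is None or resid < best[0]:
            best = (resid, k, amp)
    x_hat = [0.0] * n
    x_hat[best[1]] = best[2]
    return x_hat

# 8-sample raw signal that is 1-sparse; only 3 compressed measurements kept.
Phi = [
    [1, 0, 1, 0, 1, 0, 1, 1],
    [1, 1, 0, 0, 1, 1, 0, 1],
    [1, 0, 0, 1, 0, 1, 1, 0],
]
x = [0.0] * 8
x[5] = 2.5
y = matvec(Phi, x)            # 3 measurements instead of 8 raw samples
x_hat = recover_1sparse(Phi, y)
```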

  1. A Compressed Sensing-Based Wearable Sensor Network for Quantitative Assessment of Stroke Patients.

    PubMed

    Yu, Lei; Xiong, Daxi; Guo, Liquan; Wang, Jiping

    2016-02-05

    Clinical rehabilitation assessment is an important part of the therapy process because it is the premise for prescribing suitable rehabilitation interventions. However, the commonly used assessment scales have two drawbacks: (1) they are susceptible to subjective factors; (2) they offer only a few rating levels and are subject to a ceiling effect, making it impossible to detect further improvement in movement precisely. Meanwhile, energy constraints are a primary design consideration in wearable sensor network systems, since they are often battery-operated. Traditionally, for wearable sensor network systems that follow the Shannon/Nyquist sampling theorem, large amounts of data must be sampled and transmitted. This paper proposes a novel wearable sensor network system to monitor and quantitatively assess upper limb motion function, based on compressed sensing technology. With the sparse representation model, less data is transmitted to the computer than with traditional systems. The experimental results show that the accelerometer signals of Bobath handshake and shoulder-touch exercises can be compressed, with the compressed signal less than 1/3 of the raw signal length. More importantly, the reconstruction errors have no influence on the predictive accuracy of the Brunnstrom stage classification model. The results also indicate that the proposed system not only reduces the amount of data sampled and transmitted, but also yields reconstructed accelerometer signals that can be used for quantitative assessment without any loss of useful information.

  2. A meta-classifier for detecting prostate cancer by quantitative integration of in vivo magnetic resonance spectroscopy and magnetic resonance imaging

    NASA Astrophysics Data System (ADS)

    Viswanath, Satish; Tiwari, Pallavi; Rosen, Mark; Madabhushi, Anant

    2008-03-01

    Recently, in vivo Magnetic Resonance Imaging (MRI) and Magnetic Resonance Spectroscopy (MRS) have emerged as promising new modalities to aid in prostate cancer (CaP) detection. MRI provides anatomic and structural information of the prostate while MRS provides functional data pertaining to biochemical concentrations of metabolites such as creatine, choline and citrate. We have previously presented a hierarchical clustering scheme for CaP detection on in vivo prostate MRS and have recently developed a computer-aided method for CaP detection on in vivo prostate MRI. In this paper we present a novel scheme to develop a meta-classifier to detect CaP in vivo via quantitative integration of multimodal prostate MRS and MRI by use of non-linear dimensionality reduction (NLDR) methods including spectral clustering and locally linear embedding (LLE). Quantitative integration of multimodal image data (MRI and PET) involves the concatenation of image intensities following image registration. However multimodal data integration is non-trivial when the individual modalities include spectral and image intensity data. We propose a data combination solution wherein we project the feature spaces (image intensities and spectral data) associated with each of the modalities into a lower dimensional embedding space via NLDR. NLDR methods preserve the relationships between the objects in the original high dimensional space when projecting them into the reduced low dimensional space. Since the original spectral and image intensity data are divorced from their original physical meaning in the reduced dimensional space, data at the same spatial location can be integrated by concatenating the respective embedding vectors. Unsupervised consensus clustering is then used to partition objects into different classes in the combined MRS and MRI embedding space. 
Quantitative results of our multimodal computer-aided diagnosis scheme on 16 sets of patient data obtained from the ACRIN trial, for which
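    The data-combination step described above, concatenating per-location embedding vectors from each modality after nonlinear dimensionality reduction, can be sketched as follows. The spatial coordinates and embedding values are hypothetical, standing in for the MRI and MRS embeddings.

```python
def fuse_embeddings(mri_embed, mrs_embed):
    """Fuse per-voxel embedding vectors from two modalities by
    concatenation, keeping only spatial locations present in both."""
    fused = {}
    for loc in mri_embed.keys() & mrs_embed.keys():
        fused[loc] = mri_embed[loc] + mrs_embed[loc]
    return fused

# Hypothetical 3-D embedding coordinates per voxel after NLDR,
# keyed by spatial location (x, y).
mri = {(0, 0): [0.12, -0.40, 0.05], (0, 1): [0.31, 0.22, -0.10]}
mrs = {(0, 0): [-0.07, 0.55, 0.90], (0, 1): [0.44, -0.61, 0.02]}
combined = fuse_embeddings(mri, mrs)
```

    The fused vectors would then be passed to an unsupervised consensus-clustering step to partition voxels into classes.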

  3. Characteristics of liver fibrosis with different etiologies using a fully quantitative fibrosis assessment tool.

    PubMed

    Wu, Q; Zhao, X; You, H

    2017-05-18

    This study aimed to test the diagnostic performance of a fully quantitative fibrosis assessment tool for liver fibrosis in patients with chronic hepatitis B (CHB), primary biliary cirrhosis (PBC) and non-alcoholic steatohepatitis (NASH). A total of 117 patients with liver fibrosis were included in this study, including 50 patients with CHB, 49 patients with PBC and 18 patients with NASH. All patients underwent liver biopsy (LB). Fibrosis stages were assessed by two experienced pathologists. Histopathological images of LB slices were processed by second harmonic generation (SHG)/two-photon excited fluorescence (TPEF) microscopy without staining, a system called qFibrosis (quantitative fibrosis) system. Altogether 101 quantitative features of the SHG/TPEF images were acquired. The parameters of aggregated collagen in portal, septal and fibrillar areas increased significantly with stages of liver fibrosis in PBC and CHB (P<0.05), but the same was not found for parameters of distributed collagen (P>0.05). There was a significant correlation between parameters of aggregated collagen in portal, septal and fibrillar areas and stages of liver fibrosis from CHB and PBC (P<0.05), but no correlation was found between the distributed collagen parameters and the stages of liver fibrosis from those patients (P>0.05). There was no significant correlation between NASH parameters and stages of fibrosis (P>0.05). For CHB and PBC patients, the highest correlation was between septal parameters and fibrosis stages, the second highest was between portal parameters and fibrosis stages and the lowest correlation was between fibrillar parameters and fibrosis stages. The correlation between the septal parameters of the PBC and stages is significantly higher than the parameters of the other two areas (P<0.05). The qFibrosis candidate parameters based on CHB were also applicable for quantitative analysis of liver fibrosis in PBC patients. Different parameters should be selected for liver

  4. Characteristics of liver fibrosis with different etiologies using a fully quantitative fibrosis assessment tool

    PubMed Central

    Wu, Q.; Zhao, X.; You, H.

    2017-01-01

    This study aimed to test the diagnostic performance of a fully quantitative fibrosis assessment tool for liver fibrosis in patients with chronic hepatitis B (CHB), primary biliary cirrhosis (PBC) and non-alcoholic steatohepatitis (NASH). A total of 117 patients with liver fibrosis were included in this study, including 50 patients with CHB, 49 patients with PBC and 18 patients with NASH. All patients underwent liver biopsy (LB). Fibrosis stages were assessed by two experienced pathologists. Histopathological images of LB slices were processed by second harmonic generation (SHG)/two-photon excited fluorescence (TPEF) microscopy without staining, a system called qFibrosis (quantitative fibrosis) system. Altogether 101 quantitative features of the SHG/TPEF images were acquired. The parameters of aggregated collagen in portal, septal and fibrillar areas increased significantly with stages of liver fibrosis in PBC and CHB (P<0.05), but the same was not found for parameters of distributed collagen (P>0.05). There was a significant correlation between parameters of aggregated collagen in portal, septal and fibrillar areas and stages of liver fibrosis from CHB and PBC (P<0.05), but no correlation was found between the distributed collagen parameters and the stages of liver fibrosis from those patients (P>0.05). There was no significant correlation between NASH parameters and stages of fibrosis (P>0.05). For CHB and PBC patients, the highest correlation was between septal parameters and fibrosis stages, the second highest was between portal parameters and fibrosis stages and the lowest correlation was between fibrillar parameters and fibrosis stages. The correlation between the septal parameters of the PBC and stages is significantly higher than the parameters of the other two areas (P<0.05). The qFibrosis candidate parameters based on CHB were also applicable for quantitative analysis of liver fibrosis in PBC patients. Different parameters should be selected for liver

  5. From Physical Process to Economic Cost - Integrated Approaches of Landslide Risk Assessment

    NASA Astrophysics Data System (ADS)

    Klose, M.; Damm, B.

    2014-12-01

    The nature of landslides is complex in many respects, with landslide hazard and impact depending on a variety of factors. This obviously requires an integrated assessment for a fundamental understanding of landslide risk. Integrated risk assessment, according to the approach presented in this contribution, implies combining prediction of future landslide occurrence with analysis of landslide impact in the past. A critical step in assessing landslide risk from an integrated perspective is to analyze which types of landslide damage affected people and property, in what way, and how people contributed and responded to these damage types. In integrated risk assessment, the focus is on systematic identification and monetization of landslide damage, and analytical tools that allow economic costs to be derived from physical landslide processes are at the heart of this approach. The broad spectrum of landslide types and process mechanisms, as well as the nonlinearity between landslide magnitude, damage intensity, and direct costs, are among the main factors explaining recent challenges in risk assessment. The two prevailing approaches for assessing the impact of landslides in economic terms are cost survey (ex-post) and risk analysis (ex-ante). The two approaches can complement each other, yet a combination of them has not been realized so far. It is common practice today to derive landslide risk without considering landslide process-based cause-effect relationships, since integrated concepts and new modeling tools extending conventional methods are still widely missing. The approach introduced in this contribution is based on a systematic framework that combines cost survey and GIS-based tools for hazard or cost modeling with methods to assess interactions between land use practices and landslides in historical perspective. Fundamental understanding of landslide risk also requires knowledge about the economic and fiscal relevance of landslide losses, wherefore analysis of their

  6. How to assess and prepare health systems in low- and middle-income countries for integration of services-a systematic review.

    PubMed

    Topp, Stephanie M; Abimbola, Seye; Joshi, Rohina; Negin, Joel

    2018-03-01

    Despite growing support for integration of frontline services, a lack of information about the pre-conditions necessary to integrate such services hampers the ability of policy makers and implementers to assess how feasible or worthwhile integration may be, especially in low- and middle-income countries (LMICs). We adopted a modified systematic review with aspects of realist review, including quantitative and qualitative studies that incorporated assessment of health system preparedness for and capacity to implement integrated services. We searched Medline via Ovid, Web of Science and the Cochrane library using terms adapted from Dudley and Garner's systematic review on integration in LMICs. From an initial list of 10,550 articles, 206 were selected for full-text review by two reviewers who independently reviewed articles and inductively extracted and synthesized themes related to health system preparedness. We identified five 'context'-related categories and four health system 'capability' themes. The contextual enabling and constraining factors for frontline service integration were: (1) the organizational framework of frontline services, (2) health care worker preparedness, (3) community and client preparedness, (4) upstream logistics and (5) policy and governance issues. The intersecting health system capabilities identified were the need for: (1) sufficiently functional frontline health services, (2) sufficiently trained and motivated health care workers, (3) availability of technical tools and equipment suitable to facilitate integrated frontline services and (4) appropriately devolved authority and decision-making processes to enable frontline managers and staff to adapt integration to local circumstances. Moving beyond claims that integration is defined differently by different programs and thus unsuitable for comparison, this review demonstrates that synthesis is possible. It presents a common set of contextual factors and health system capabilities

  7. Supply-side dimensions and dynamics of integrating HIV testing and counselling into routine antenatal care: a facility assessment from Morogoro Region, Tanzania.

    PubMed

    An, Selena J; George, Asha S; LeFevre, Amnesty E; Mpembeni, Rose; Mosha, Idda; Mohan, Diwakar; Yang, Ann; Chebet, Joy; Lipingu, Chrisostom; Baqui, Abdullah H; Killewo, Japhet; Winch, Peter J; Kilewo, Charles

    2015-10-04

    Integration of HIV into RMNCH (reproductive, maternal, newborn and child health) services is an important process addressing the disproportionate burden of HIV among mothers and children in sub-Saharan Africa. We assess the structural inputs and processes of care that support HIV testing and counselling in routine antenatal care to understand supply-side dynamics critical to scaling up further integration of HIV into RMNCH services prior to recent changes in HIV policy in Tanzania. This study, as a part of a maternal and newborn health program evaluation in Morogoro Region, Tanzania, drew from an assessment of health centers with 18 facility checklists, 65 quantitative and 57 qualitative provider interviews, and 203 antenatal care observations. Descriptive analyses were performed with quantitative data using Stata 12.0, and qualitative data were analyzed thematically with data managed by Atlas.ti. Limitations in structural inputs, such as infrastructure, supplies, and staffing, constrain the potential for integration of HIV testing and counselling into routine antenatal care services. While assessment of infrastructure, including waiting areas, appeared adequate, long queues and small rooms made private and confidential HIV testing and counselling difficult for individual women. Unreliable stocks of HIV test kits, essential medicines, and infection prevention equipment also had implications for provider-patient relationships, with reported decreases in women's care seeking at health centers. In addition, low staffing levels were reported to increase workloads and lower motivation for health workers. Despite adequate knowledge of counselling messages, antenatal counselling sessions were brief with incomplete messages conveyed to pregnant women. In addition, coping mechanisms, such as scheduling of clinical activities on different days, limited service availability. 
Antenatal care is a strategic entry point for the delivery of critical tests and counselling messages

  8. Reference charts for young stands — a quantitative methodology for assessing tree performance

    Treesearch

    Lance A. Vickers; David R. Larsen; Benjamin O. Knapp; John M. Kabrick; Daniel C. Dey

    2017-01-01

    Reference charts have long been used in the medical field for quantitative clinical assessment of juvenile development by plotting distribution quantiles for a selected attribute (e.g., height) against age for specified peer populations. We propose that early stand dynamics is an area of study that could benefit from the descriptions and analyses offered by similar...
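    The reference-chart idea, plotting distribution quantiles of an attribute against age for a peer population, can be sketched with the Python standard library. The tree heights below are hypothetical; a real chart would repeat this for each age class and plot the quantile curves against age.

```python
import statistics

def reference_quantiles(peer_values):
    """Quartile cut points (25th, 50th, 75th percentiles) for a peer
    population, the kind of curve plotted against age on a reference chart."""
    q1, q2, q3 = statistics.quantiles(peer_values, n=4)
    return q1, q2, q3

# Hypothetical tree heights (m) for 5-year-old stands of one species.
heights = [1.2, 1.5, 1.8, 2.0, 2.1, 2.4, 2.6, 3.0, 3.3]
q1, median, q3 = reference_quantiles(heights)
```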

  9. Quantitative relationships between different injury factors and development of brown rot caused by Monilinia fructigena in integrated and organic apple orchards.

    PubMed

    Holb, I J; Scherm, H

    2008-01-01

    . pomonella. At the harvest assessment, two additional significant correlations were between brown rot and bird injury and between brown rot and growth cracks. In every case, correlation coefficients were larger in organic than in integrated blocks. Although it is well established that brown rot in pome fruits is closely associated with fruit injuries, this is the first study to provide season-long progress data on different injury types and quantitative analyses of their relative importance at different times in the growing season and across two distinct management systems.

  10. Ecotoxicological assessment of flocculant modified soil for lake restoration using an integrated biotic toxicity index.

    PubMed

    Wang, Zhibin; Zhang, Honggang; Pan, Gang

    2016-06-15

    Flocculant modified soils/clays are being increasingly studied as geo-engineering materials for lake restoration and harmful algal bloom control. However, the potential impacts of adding these materials to aquatic ecological systems remain unclear. This study investigated the potential effects of chitosan, cationic starch, chitosan modified soils (MS-C) and cationic starch modified soils (MS-S) on aquatic organisms using a bioassay battery. The toxicity potential of these four flocculants was quantitatively assessed using an integrated biotic toxicity index (BTI). The test system included four aquatic species, namely Chlorella vulgaris, Daphnia magna, Cyprinus carpio and Limnodrilus hoffmeisteri, which represent four trophic levels in the freshwater ecosystem. Results showed that the median effect concentrations (EC50) of MS-C and MS-S were 31-124 times higher than those of chitosan and cationic starch, respectively. D. magna was the most sensitive species to the four flocculants. Histological examination of C. carpio showed significant pathological changes in the gills. In contrast to the unmodified flocculants, the soil-modified forms MS-C and MS-S significantly alleviated the acute toxicities of chitosan and cationic starch. The toxicity order of the four flocculants based on BTI was cationic starch > chitosan > MS-S > MS-C. The results suggest that BTI can be used as a quantitative and comparable indicator to assess the biotic toxicity of aquatic geo-engineering materials. Chitosan- or cationic starch-modified soil/clay materials can be used at their optimal dosage without causing substantial adverse effects to the bioassay battery in aquatic ecosystems. Copyright © 2015 Elsevier Ltd. All rights reserved.
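The abstract does not give the BTI formula, but the idea of collapsing a multi-species bioassay battery into a single comparable number can be sketched. The index below (geometric mean of toxic potency, 1/EC50, across species) and all EC50 values are hypothetical illustrations, not the published BTI:

```python
from statistics import geometric_mean

def biotic_toxicity_index(ec50_by_species):
    """Illustrative integrated index: a higher EC50 means lower toxicity,
    so average toxic potency (1/EC50) across the battery with a geometric
    mean. The published BTI formula differs; this only sketches the idea."""
    return geometric_mean(1.0 / ec50 for ec50 in ec50_by_species.values())

# Hypothetical EC50 values (mg/L) for the four trophic levels
raw = {"C. vulgaris": 20.0, "D. magna": 5.0,
       "C. carpio": 50.0, "L. hoffmeisteri": 40.0}
modified = {sp: ec50 * 50 for sp, ec50 in raw.items()}  # soil-modified, less toxic
```

With EC50s roughly 50 times higher, the modified material scores a correspondingly lower index, matching the ordering reported in the abstract.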

  11. Community assessment in a vertically integrated health care system.

    PubMed Central

    Plescia, M; Koontz, S; Laurent, S

    2001-01-01

    OBJECTIVES: In this report, the authors present a representative case of the implementation of community assessment and the subsequent application of findings by a large, vertically integrated health care system. METHODS: Geographic information systems technology was used to access and analyze secondary data for a geographically defined community. Primary data included a community survey and asset maps. RESULTS: In this case presentation, information has been collected on demographics, prevalent health problems, access to health care, citizens' perceptions, and community assets. The assessment has been used to plan services for a new health center and to engage community members in health promotion interventions. CONCLUSIONS: Geographically focused assessments help target specific community needs and promote community participation. This project provides a practical application for integrating aspects of medicine and public health. PMID:11344895

  12. SEEING THE LIGHT: A WATER CLARITY INDEX FOR INTEGRATED WATER QUALITY ASSESSMENTS

    EPA Science Inventory

    Smith, Lisa M. and Linda C. Harwell. In press. Seeing the Light: A Water Clarity Index for Integrated Water Quality Assessments (Abstract). To be presented at EMAP Symposium 2004: Integrated Monitoring & Assessment for Effective Water Quality Management. 1 p. (ERL,GB R970).

  13. Integrated quantitative phase and birefringence microscopy for imaging malaria-infected red blood cells.

    PubMed

    Li, Chengshuai; Chen, Shichao; Klemba, Michael; Zhu, Yizheng

    2016-09-01

    A dual-modality birefringence/phase imaging system is presented. The system features a crystal retarder that provides polarization mixing and generates two interferometric carrier waves in a single signal spectrum. The retardation and orientation of sample birefringence can then be measured simultaneously based on spectral multiplexing interferometry. Further, with the addition of a Nomarski prism, the same setup can be used for quantitative differential interference contrast (DIC) imaging. Sample phase can then be obtained with two-dimensional integration. In addition, birefringence-induced phase error can be corrected using the birefringence data. This dual-modality approach is analyzed theoretically with Jones calculus and validated experimentally with malaria-infected red blood cells. The system generates not only corrected DIC and phase images, but a birefringence map that highlights the distribution of hemozoin crystals.
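The "two-dimensional integration" step that turns a DIC gradient measurement into a phase map can be illustrated with a toy example. This sketch assumes a single shear axis and zero phase at the left edge; a real reconstruction would handle noise and both shear directions:

```python
import numpy as np

def integrate_gradient(dphi_dx, dx=1.0):
    """Recover a phase map from its x-gradient (one DIC shear direction)
    by cumulative integration, taking the left edge as zero phase."""
    return np.cumsum(dphi_dx, axis=1) * dx

# Synthetic ground truth: a phase ramp along x, and its finite-difference gradient
true_phase = np.outer(np.ones(4), np.arange(5, dtype=float))
grad_x = np.diff(true_phase, axis=1, prepend=0.0)
recovered = integrate_gradient(grad_x)
```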

  14. Integrated quantitative phase and birefringence microscopy for imaging malaria-infected red blood cells

    NASA Astrophysics Data System (ADS)

    Li, Chengshuai; Chen, Shichao; Klemba, Michael; Zhu, Yizheng

    2016-09-01

    A dual-modality birefringence/phase imaging system is presented. The system features a crystal retarder that provides polarization mixing and generates two interferometric carrier waves in a single signal spectrum. The retardation and orientation of sample birefringence can then be measured simultaneously based on spectral multiplexing interferometry. Further, with the addition of a Nomarski prism, the same setup can be used for quantitative differential interference contrast (DIC) imaging. Sample phase can then be obtained with two-dimensional integration. In addition, birefringence-induced phase error can be corrected using the birefringence data. This dual-modality approach is analyzed theoretically with Jones calculus and validated experimentally with malaria-infected red blood cells. The system generates not only corrected DIC and phase images, but a birefringence map that highlights the distribution of hemozoin crystals.

  15. Embedding of Authentic Assessment in Work-Integrated Learning Curriculum

    ERIC Educational Resources Information Center

    Bosco, Anna Maria; Ferns, Sonia

    2014-01-01

    Contemporary perspectives of higher education endorse a work integrated learning (WIL) approach to curriculum content, delivery and assessment. It is agreed that authenticity in learning relates to real-world experience, however, differentiating and strategically linking WIL provision and facilitation to assessment tasks and collation of authentic…

  16. A Framework for Quantitative Assessment of Impacts Related to Energy and Mineral Resource Development

    DOE PAGES

    Haines, Seth S.; Diffendorfer, Jay E.; Balistrieri, Laurie; ...

    2013-05-15

    Natural resource planning at all scales demands methods for assessing the impacts of resource development and use, and in particular it requires standardized methods that yield robust and unbiased results. Building from existing probabilistic methods for assessing the volumes of energy and mineral resources, we provide an algorithm for consistent, reproducible, quantitative assessment of resource development impacts. The approach combines probabilistic input data with Monte Carlo statistical methods to determine probabilistic outputs that convey the uncertainties inherent in the data. For example, one can utilize our algorithm to combine data from a natural gas resource assessment with maps of sage grouse leks and piñon-juniper woodlands in the same area to estimate possible future habitat impacts due to possible future gas development. As another example: one could combine geochemical data and maps of lynx habitat with data from a mineral deposit assessment in the same area to determine possible future mining impacts on water resources and lynx habitat. The approach can be applied to a broad range of positive and negative resource development impacts, such as water quantity or quality, economic benefits, or air quality, limited only by the availability of necessary input data and quantified relationships among geologic resources, development alternatives, and impacts. In conclusion, the framework enables quantitative evaluation of the trade-offs inherent in resource management decision-making, including cumulative impacts, to address societal concerns and policy aspects of resource development.

  17. A framework for quantitative assessment of impacts related to energy and mineral resource development

    USGS Publications Warehouse

    Haines, Seth S.; Diffendorfer, James; Balistrieri, Laurie S.; Berger, Byron R.; Cook, Troy A.; Gautier, Donald L.; Gallegos, Tanya J.; Gerritsen, Margot; Graffy, Elisabeth; Hawkins, Sarah; Johnson, Kathleen; Macknick, Jordan; McMahon, Peter; Modde, Tim; Pierce, Brenda; Schuenemeyer, John H.; Semmens, Darius; Simon, Benjamin; Taylor, Jason; Walton-Day, Katherine

    2013-01-01

    Natural resource planning at all scales demands methods for assessing the impacts of resource development and use, and in particular it requires standardized methods that yield robust and unbiased results. Building from existing probabilistic methods for assessing the volumes of energy and mineral resources, we provide an algorithm for consistent, reproducible, quantitative assessment of resource development impacts. The approach combines probabilistic input data with Monte Carlo statistical methods to determine probabilistic outputs that convey the uncertainties inherent in the data. For example, one can utilize our algorithm to combine data from a natural gas resource assessment with maps of sage grouse leks and piñon-juniper woodlands in the same area to estimate possible future habitat impacts due to possible future gas development. As another example: one could combine geochemical data and maps of lynx habitat with data from a mineral deposit assessment in the same area to determine possible future mining impacts on water resources and lynx habitat. The approach can be applied to a broad range of positive and negative resource development impacts, such as water quantity or quality, economic benefits, or air quality, limited only by the availability of necessary input data and quantified relationships among geologic resources, development alternatives, and impacts. The framework enables quantitative evaluation of the trade-offs inherent in resource management decision-making, including cumulative impacts, to address societal concerns and policy aspects of resource development.
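The core of the algorithm (probabilistic inputs combined by Monte Carlo sampling into probabilistic outputs) can be sketched in a few lines. The distributions and the habitat-per-volume coefficient below are hypothetical placeholders, not values from any actual assessment:

```python
import random

def simulate_impacts(n=10_000, seed=42):
    """Monte Carlo sketch: sample an uncertain resource volume and an
    uncertain impact coefficient, multiply, and summarize the output
    distribution by percentiles. All distributions are hypothetical."""
    rng = random.Random(seed)
    impacts = []
    for _ in range(n):
        gas_bcf = rng.lognormvariate(3.0, 0.5)   # recoverable gas volume
        ha_per_bcf = rng.uniform(0.5, 1.5)       # habitat disturbed per unit gas
        impacts.append(gas_bcf * ha_per_bcf)
    impacts.sort()
    return {p: impacts[int(p / 100 * n)] for p in (5, 50, 95)}

pct = simulate_impacts()
```

Reporting percentiles rather than a single number is what lets the output "convey the uncertainties inherent in the data."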

  18. Integrating Science and Management to Assess Forest Ecosystem Vulnerability to Climate Change

    Treesearch

    Leslie A. Brandt; Patricia R. Butler; Stephen D. Handler; Maria K. Janowiak; P. Danielle Shannon; Christopher W. Swanston

    2017-01-01

    We developed the ecosystem vulnerability assessment approach (EVAA) to help inform potential adaptation actions in response to a changing climate. EVAA combines multiple quantitative models and expert elicitation from scientists and land managers. In each of eight assessment areas, a panel of local experts determined potential vulnerability of forest ecosystems to...

  19. Non-animal assessment of skin sensitization hazard: Is an integrated testing strategy needed, and if so what should be integrated?

    PubMed

    Roberts, David W; Patlewicz, Grace

    2018-01-01

    There is an expectation that to meet regulatory requirements, and avoid or minimize animal testing, integrated approaches to testing and assessment will be needed that rely on assays representing key events (KEs) in the skin sensitization adverse outcome pathway. Three non-animal assays have been formally validated and adopted for regulatory use: the direct peptide reactivity assay (DPRA), the KeratinoSens™ assay and the human cell line activation test (h-CLAT). There have been many efforts to develop integrated approaches to testing and assessment, with the "two out of three" approach attracting much attention. Here a set of 271 chemicals with mouse, human and non-animal sensitization test data was evaluated to compare the predictive performances of the three individual non-animal assays, their binary combinations and the "two out of three" approach in predicting skin sensitization potential. The most predictive approach was to use both the DPRA and h-CLAT as follows: (1) perform DPRA - if positive, classify as sensitizing; (2) if negative, perform h-CLAT - a positive outcome denotes a sensitizer, a negative a non-sensitizer. With this approach, 85% (local lymph node assay) and 93% (human) of non-sensitizer predictions were correct, whereas the "two out of three" approach had 69% (local lymph node assay) and 79% (human) of non-sensitizer predictions correct. The findings are consistent with the argument, supported by published quantitative mechanistic models, that only the first KE needs to be modeled. All three assays model this KE to an extent. The value of using more than one assay depends on how the different assays compensate for each other's technical limitations. Copyright © 2017 John Wiley & Sons, Ltd.
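The two decision rules compared in the abstract are simple enough to state as code; `tiered_call` is the sequential DPRA-then-h-CLAT strategy the authors found most predictive, and `two_out_of_three` is the majority rule it outperformed:

```python
def tiered_call(dpra_positive, hclat_positive):
    """Sequential strategy: a positive DPRA classifies the chemical as a
    sensitizer; if the DPRA is negative, the h-CLAT result decides."""
    return True if dpra_positive else hclat_positive

def two_out_of_three(dpra_positive, keratinosens_positive, hclat_positive):
    """Majority call across the three validated non-animal assays."""
    return sum([dpra_positive, keratinosens_positive, hclat_positive]) >= 2
```

Note the tiered rule never runs the KeratinoSens™ assay at all, which is part of its practical appeal.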

  20. ECITE: A Testbed for Assessment of Technology Interoperability and Integration with Architecture Components

    NASA Astrophysics Data System (ADS)

    Graves, S. J.; Keiser, K.; Law, E.; Yang, C. P.; Djorgovski, S. G.

    2016-12-01

    ECITE (EarthCube Integration and Testing Environment) is providing both cloud-based computational testing resources and an Assessment Framework for Technology Interoperability and Integration. NSF's EarthCube program is funding the development of cyberinfrastructure building block components as technologies to address Earth science research problems. These EarthCube building blocks need to support integration and interoperability objectives to work towards a coherent cyberinfrastructure architecture for the program. ECITE is being developed to provide capabilities to test and assess the interoperability and integration across funded EarthCube technology projects. EarthCube-defined criteria for interoperability and integration are applied to use cases coordinating science problems with technology solutions. The Assessment Framework facilitates planning, execution and documentation of the technology assessments for review by the EarthCube community. This presentation will describe the components of ECITE and examine the methodology of crosswalking between science and technology use cases.

  1. GLIMPSE: An integrated assessment model-based tool for ...

    EPA Pesticide Factsheets

    Dan Loughlin will describe the GCAM-USA integrated assessment model and how that model is being improved and integrated into the GLIMPSE decision support system. He will also demonstrate the application of the model to evaluate the emissions and health implications of hypothetical state-level renewable electricity standards. Introduce the GLIMPSE project to state and regional environmental modelers and analysts. Presented as part of the State Energy and Air Quality Group Webinar Series, which is organized by NESCAUM.

  2. Quantitative Microbial Risk Assessment Tutorial: Installation of Software for Watershed Modeling in Support of QMRA

    EPA Science Inventory

    This tutorial provides instructions for accessing, retrieving, and downloading the following software to install on a host computer in support of Quantitative Microbial Risk Assessment (QMRA) modeling: • SDMProjectBuilder (which includes the Microbial Source Module as part...

  3. Priority survey between indicators and analytic hierarchy process analysis for green chemistry technology assessment

    PubMed Central

    Kim, Sungjune; Hong, Seokpyo; Ahn, Kilsoo; Gong, Sungyong

    2015-01-01

    Objectives: This study presents indicators and proxy variables for the quantitative assessment of green chemistry technologies and evaluates the relative importance of each assessment element by consulting experts from the fields of ecology, chemistry, safety, and public health. Methods: The collected results were subjected to an analytic hierarchy process to obtain the weights of the indicators and the proxy variables. Results: These weights may prove useful in avoiding recourse to qualitative judgment when integrating the results of quantitative assessment by indicator, in the absence of established weights between indicators. Conclusions: This study points to the limitations of current quantitative assessment techniques for green chemistry technologies and seeks to set the future direction for their quantitative assessment. PMID:26206364
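The analytic hierarchy process step can be sketched as follows: expert judgments form a pairwise-comparison matrix, and the priority weights are the normalized principal eigenvector (standard AHP; the matrix entries below are hypothetical, not the study's elicited values):

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights of an AHP pairwise-comparison matrix: the
    normalized principal eigenvector."""
    vals, vecs = np.linalg.eig(np.asarray(pairwise, dtype=float))
    principal = vecs[:, np.argmax(vals.real)].real
    w = np.abs(principal)
    return w / w.sum()

# Hypothetical judgments: ecology judged 3x as important as chemistry,
# 5x as important as safety; chemistry 2x as important as safety.
A = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 2.0],
     [1/5, 1/2, 1.0]]
w = ahp_weights(A)
```

A full AHP application would also check the consistency ratio of each expert's matrix before accepting the weights.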

  4. An Assessment of the Quantitative Literacy of Undergraduate Students

    ERIC Educational Resources Information Center

    Wilkins, Jesse L. M.

    2016-01-01

    Quantitative literacy (QLT) represents an underlying higher-order construct that accounts for a person's willingness to engage in quantitative situations in everyday life. The purpose of this study is to retest the construct validity of a model of quantitative literacy (Wilkins, 2010). In this model, QLT represents a second-order factor that…

  5. Overview of integrative tools and methods in assessing ecological integrity in estuarine and coastal systems worldwide.

    PubMed

    Borja, Angel; Bricker, Suzanne B; Dauer, Daniel M; Demetriades, Nicolette T; Ferreira, João G; Forbes, Anthony T; Hutchings, Pat; Jia, Xiaoping; Kenchington, Richard; Carlos Marques, João; Zhu, Changbo

    2008-09-01

    In recent years, several sets of legislation worldwide (Oceans Act in USA, Australia or Canada; Water Framework Directive or Marine Strategy in Europe, National Water Act in South Africa, etc.) have been developed in order to address ecological quality or integrity, within estuarine and coastal systems. Most such legislation seeks to define quality in an integrative way, by using several biological elements, together with physico-chemical and pollution elements. Such an approach allows assessment of ecological status at the ecosystem level ('ecosystem approach' or 'holistic approach' methodologies), rather than at species level (e.g. mussel biomonitoring or Mussel Watch) or just at chemical level (i.e. quality objectives) alone. Increasing attention has been paid to the development of tools for different physico-chemical or biological (phytoplankton, zooplankton, benthos, algae, phanerogams, fishes) elements of the ecosystems. However, few methodologies integrate all the elements into a single evaluation of a water body. The need for such integrative tools to assess ecosystem quality is very important, both from a scientific and stakeholder point of view. Politicians and managers need information from simple and pragmatic, but scientifically sound methodologies, in order to show to society the evolution of a zone (estuary, coastal area, etc.), taking into account human pressures or recovery processes. These approaches include: (i) multidisciplinarity, inherent in the teams involved in their implementation; (ii) integration of biotic and abiotic factors; (iii) accurate and validated methods in determining ecological integrity; and (iv) adequate indicators to follow the evolution of the monitored ecosystems. While some countries increasingly use the establishment of marine parks to conserve marine biodiversity and ecological integrity, there is awareness (e.g. in Australia) that conservation and management of marine ecosystems cannot be restricted to Marine Protected

  6. Integrated Analysis and Tools for Land Subsidence Surveying and Monitoring: a Semi-Quantitative Approach

    NASA Astrophysics Data System (ADS)

    Mosconi, A.; Pozzoli, A.; Meroni, A.; Gagliano, S.

    2015-10-01

    This paper presents an integrated approach for land subsidence monitoring using measurements from different sensors. Eni S.p.A., the main Italian oil and gas company, constantly surveys the land with state-of-the-art and innovative techniques, and a method able to integrate the results is an important and timely topic. Nowadays the world is a multi-sensor platform, and measurement integration is strictly necessary. Combining the different data sources should be done in a clever way, taking advantage of the best performance of each technique. An integrated analysis allows the interpretation of simultaneous temporal series of data coming from different sources and attempts to separate subsidence contributions. With this purpose, Exelis VIS, in collaboration with Eni S.p.A., customized PISAV (Permanent Interferometric Scatterometer Analysis and Visualization), an ENVI extension able to capitalize on and combine all the different data collected in the surveys. This article presents some significant examples showing the potential of this tool in oil and gas activity: a hydrocarbon storage field where the comparison between SAR and production volumes emphasises a correlation between the two measures in a few steps; and a hydrocarbon production field with the Satellite Survey Unit (S.S.U.), where SAR, CGPS, piezometers and assestimeters measure in the same area at the same time, giving the opportunity to analyse data contextually. In the integrated analysis performed with PISAV, a mathematically rigorous study is not always possible, and a semi-quantitative approach is the only method for interpreting results. As a result, in the first test case a strong correlation between injected hydrocarbon volume and vertical displacement was highlighted; in the second, the integrated analysis has different advantages in monitoring land subsidence: it permits a first qualitative "differentiation" of the natural and anthropic components of subsidence, and also gives more

  7. Integrated genomics and molecular breeding approaches for dissecting the complex quantitative traits in crop plants.

    PubMed

    Kujur, Alice; Saxena, Maneesha S; Bajaj, Deepak; Laxmi; Parida, Swarup K

    2013-12-01

    The enormous population growth, climate change and global warming are now considered major threats to agriculture and the world's food security. To improve the productivity and sustainability of agriculture, the development of high-yielding, durable abiotic and biotic stress-tolerant cultivars and/or climate-resilient crops is essential. Hence, understanding the molecular mechanisms of, and dissecting, complex quantitative yield and stress tolerance traits is a prime objective in current agricultural biotechnology research. In recent years, tremendous progress has been made in plant genomics and molecular breeding research pertaining to conventional and next-generation whole genome, transcriptome and epigenome sequencing efforts, the generation of huge genomic, transcriptomic and epigenomic resources, and the development of modern genomics-assisted breeding approaches in diverse crop genotypes with contrasting yield and abiotic stress tolerance traits. Unfortunately, the detailed molecular mechanisms and gene regulatory networks controlling such complex quantitative traits are not yet well understood in crop plants. Therefore, we propose integrated strategies, involving the enormous and diverse traditional and modern -omics (structural, functional, comparative and epigenomics) approaches/resources and genomics-assisted breeding methods, which agricultural biotechnologists can adopt to dissect and decode the molecular and gene regulatory networks involved in complex quantitative yield and stress tolerance traits in crop plants. This would provide clues and much-needed inputs for the rapid selection of novel functionally relevant molecular tags regulating such complex traits, to expedite traditional and modern marker-assisted genetic enhancement studies in target crop species for developing high-yielding stress-tolerant varieties.

  8. Assessing and predicting drug-induced anticholinergic risks: an integrated computational approach.

    PubMed

    Xu, Dong; Anderson, Heather D; Tao, Aoxiang; Hannah, Katia L; Linnebur, Sunny A; Valuck, Robert J; Culbertson, Vaughn L

    2017-11-01

    Anticholinergic (AC) adverse drug events (ADEs) are caused by inhibition of muscarinic receptors as a result of designated or off-target drug-receptor interactions. In practice, AC toxicity is assessed primarily based on clinician experience. The goal of this study was to evaluate a novel concept of integrating big pharmacological and healthcare data to assess clinical AC toxicity risks. AC toxicity scores (ATSs) were computed using drug-receptor inhibitions identified through pharmacological data screening. A longitudinal retrospective cohort study using medical claims data was performed to quantify AC clinical risks. ATS was compared with two previously reported toxicity measures. A quantitative structure-activity relationship (QSAR) model was established for rapid assessment and prediction of AC clinical risks. A total of 25 common medications, and 575,228 exposed and unexposed patients were analyzed. Our data indicated that ATS is more consistent with the trend of AC outcomes than other toxicity methods. Incorporating drug pharmacokinetic parameters into ATS yielded a QSAR model with excellent correlation to AC incident rate (R² = 0.83) and predictive performance (cross-validation Q² = 0.64). Good correlation and predictive performance (R² = 0.68 / Q² = 0.29) were also obtained for an M2 receptor-specific QSAR model and tachycardia, an M2 receptor-specific ADE. Albeit using a small medication sample size, our pilot data demonstrated the potential and feasibility of a new computational AC toxicity scoring approach driven by underlying pharmacology and big data analytics. Follow-up work is under way to further develop the ATS scoring approach and clinical toxicity predictive model using a large number of medications and clinical parameters.

  9. Quantitative assessment of the microbial risk of leafy greens from farm to consumption: preliminary framework, data, and risk estimates.

    PubMed

    Danyluk, Michelle D; Schaffner, Donald W

    2011-05-01

    This project was undertaken to relate what is known about the behavior of Escherichia coli O157:H7 under laboratory conditions and integrate this information with what is known regarding the 2006 E. coli O157:H7 spinach outbreak in the context of a quantitative microbial risk assessment. The risk model explicitly assumes that all contamination arises from exposure in the field. Extracted data, models, and user inputs were entered into an Excel spreadsheet, and the modeling software @RISK was used to perform Monte Carlo simulations. The model predicts that cut leafy greens that are temperature abused will support the growth of E. coli O157:H7, and populations of the organism may increase by as much as 1 log CFU/day under optimal temperature conditions. When the risk model used a starting level of -1 log CFU/g, with 0.1% of incoming servings contaminated, the predicted numbers of cells per serving were within the range of best available estimates of pathogen levels during the outbreak. The model predicts that levels in the field of -1 log CFU/g and 0.1% prevalence could have resulted in an outbreak approximately the size of the 2006 E. coli O157:H7 outbreak. This quantitative microbial risk assessment model represents a preliminary framework that identifies available data and provides initial risk estimates for pathogenic E. coli in leafy greens. Data gaps include retail storage times, correlations between storage time and temperature, determining the importance of E. coli O157:H7 lag time models for leafy greens, and validation of the importance of cross-contamination during the washing process.
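The structure of the spreadsheet/@RISK model can be sketched in plain Python: each serving is contaminated with some prevalence, and contaminated servings grow during storage. The storage-time and growth-rate distributions below are illustrative placeholders, not the published model inputs:

```python
import random

def simulate_servings(start_log_cfu_g=-1.0, prevalence=0.001,
                      max_growth_log_day=1.0, n=100_000, seed=7):
    """Each serving is contaminated with probability `prevalence`;
    contaminated servings grow during a random storage period at a
    temperature-dependent rate capped at 1 log CFU/day. Storage-time
    and rate distributions here are illustrative placeholders."""
    rng = random.Random(seed)
    levels = []
    for _ in range(n):
        if rng.random() < prevalence:
            days = rng.uniform(0.0, 3.0)
            rate = rng.uniform(0.0, max_growth_log_day)
            levels.append(start_log_cfu_g + rate * days)
    return levels  # log CFU/g in contaminated servings only

levels = simulate_servings()
```

The distribution of `levels` is the model output one would compare against outbreak estimates of pathogen levels per serving.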

  10. Quantitative Assessment of Combination Antimicrobial Therapy against Multidrug-Resistant Acinetobacter baumannii▿

    PubMed Central

    Lim, Tze-Peng; Ledesma, Kimberly R.; Chang, Kai-Tai; Hou, Jing-Guo; Kwa, Andrea L.; Nikolaou, Michael; Quinn, John P.; Prince, Randall A.; Tam, Vincent H.

    2008-01-01

    Treatment of multidrug-resistant bacterial infections poses a therapeutic challenge to clinicians; combination therapy is often the only viable option for multidrug-resistant infections. A quantitative method was developed to assess the combined killing abilities of antimicrobial agents. Time-kill studies (TKS) were performed using a multidrug-resistant clinical isolate of Acinetobacter baumannii with escalating concentrations of cefepime (0 to 512 mg/liter), amikacin (0 to 256 mg/liter), and levofloxacin (0 to 64 mg/liter). The bacterial burden data in single and combined (two of the three agents with clinically achievable concentrations in serum) TKS at 24 h were mathematically modeled to provide an objective basis for comparing various antimicrobial agent combinations. Synergy and antagonism were defined as interaction indices of <1 and >1, respectively. A hollow-fiber infection model (HFIM) simulating various clinical (fluctuating concentrations over time) dosing exposures was used to selectively validate our quantitative assessment of the combined killing effect. Model fits in all single-agent TKS were satisfactory (r² > 0.97). An enhanced combined overall killing effect was seen in the cefepime-amikacin combination (interaction index, 0.698; 95% confidence interval [CI], 0.675 to 0.722) and the cefepime-levofloxacin combination (interaction index, 0.929; 95% CI, 0.903 to 0.956), but no significant difference in the combined overall killing effect for the levofloxacin-amikacin combination was observed (interaction index, 0.994; 95% CI, 0.982 to 1.005). These assessments were consistent with observations in HFIM validation studies. Our method could be used to objectively rank the combined killing activities of two antimicrobial agents used together against a multidrug-resistant A. baumannii isolate. It may offer better insights into the effectiveness of various antimicrobial combinations and warrants further investigation. PMID:18505848
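The classification convention used in the abstract (interaction index below 1 indicating synergy, above 1 antagonism, judged against the 95% confidence interval) can be written down directly; the CI values are taken from the abstract itself:

```python
def classify_interaction(ci_95):
    """Interpret the 95% CI of an interaction index: entirely below 1 is
    synergy, entirely above 1 is antagonism, and a CI spanning 1 means
    no significant interaction."""
    lo, hi = ci_95
    if hi < 1.0:
        return "synergy"
    if lo > 1.0:
        return "antagonism"
    return "no significant interaction"

# 95% confidence intervals reported in the abstract
calls = {
    "cefepime+amikacin": classify_interaction((0.675, 0.722)),
    "cefepime+levofloxacin": classify_interaction((0.903, 0.956)),
    "levofloxacin+amikacin": classify_interaction((0.982, 1.005)),
}
```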

  11. Integrating human health and ecological concerns in risk assessments.

    PubMed

    Cirone, P A; Bruce Duncan, P

    2000-11-03

    The interconnections between ecosystems, human health and welfare have been increasingly recognized by the US government, academia, and the public. This paper continues this theme by addressing the use of risk assessment to integrate people into a single assessment. In a broad overview of the risk assessment process we stress the need to build a conceptual model of the whole system including multiple species (humans and other ecological entities), stressors, and cumulative effects. We also propose converging landscape ecology and evaluation of ecosystem services with risk assessment to address these cumulative responses. We first look at how this integration can occur within the problem formulation step in risk assessment where the system is defined, a conceptual model created, a subset of components and functions selected, and the analytical framework decided in a context that includes the management decisions. A variety of examples of problem formulations (salmon, wild insects, hyporheic ecosystems, ultraviolet (UV) radiation, nitrogen fertilization, toxic chemicals, and oil spills) are presented to illustrate how treating humans as components of the landscape can add value to risk assessments. We conclude that the risk assessment process should help address the urgent needs of society in proportion to importance, to provide a format to communicate knowledge and understanding, and to inform policy and management decisions.

  12. Multiple methods for multiple futures: Integrating qualitative scenario planning and quantitative simulation modeling for natural resource decision making

    USGS Publications Warehouse

    Symstad, Amy J.; Fisichelli, Nicholas A.; Miller, Brian W.; Rowland, Erika; Schuurman, Gregor W.

    2017-01-01

    Scenario planning helps managers incorporate climate change into their natural resource decision making through a structured “what-if” process of identifying key uncertainties and potential impacts and responses. Although qualitative scenarios, in which ecosystem responses to climate change are derived via expert opinion, often suffice for managers to begin addressing climate change in their planning, this approach may face limits in resolving the responses of complex systems to altered climate conditions. In addition, this approach may fall short of the scientific credibility managers often require to take actions that differ from current practice. Quantitative simulation modeling of ecosystem response to climate conditions and management actions can provide this credibility, but its utility is limited unless the modeling addresses the most impactful and management-relevant uncertainties and incorporates realistic management actions. We use a case study to compare and contrast management implications derived from qualitative scenario narratives and from scenarios supported by quantitative simulations. We then describe an analytical framework that refines the case study’s integrated approach in order to improve applicability of results to management decisions. The case study illustrates the value of an integrated approach for identifying counterintuitive system dynamics, refining understanding of complex relationships, clarifying the magnitude and timing of changes, identifying and checking the validity of assumptions about resource responses to climate, and refining management directions. Our proposed analytical framework retains qualitative scenario planning as a core element because its participatory approach builds understanding for both managers and scientists, lays the groundwork to focus quantitative simulations on key system dynamics, and clarifies the challenges that subsequent decision making must address.

  13. MO-DE-303-03: Session on quantitative imaging for assessment of tumor response to radiation therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowen, S.

    This session will focus on quantitative imaging for assessment of tumor response to radiation therapy. This is a technically challenging method to translate to practice in radiation therapy. In the new era of precision medicine, however, delivering the right treatment, to the right patient, and at the right time, can positively impact treatment choices and patient outcomes. Quantitative imaging provides the spatial sensitivity required by radiation therapy for precision medicine that is not available by other means. In this Joint ESTRO-AAPM Symposium, three leading-edge investigators will present specific motivations for quantitative imaging biomarkers in radiation therapy of esophageal, head and neck, locally advanced non-small cell lung cancer, and hepatocellular carcinoma. Experiences with the use of dynamic contrast-enhanced (DCE) MRI, diffusion-weighted (DW) MRI, PET/CT, and SPECT/CT will be presented. Issues covered will include: response prediction, dose-painting, timing between therapy and imaging, within-therapy biomarkers, confounding effects, normal tissue sparing, dose-response modeling, and association with clinical biomarkers and outcomes. Current information will be presented from investigational studies and clinical practice. Learning Objectives: 1) Learn motivations for the use of quantitative imaging biomarkers for assessment of response to radiation therapy; 2) Review the potential areas of application in cancer therapy; 3) Examine the challenges for translation, including imaging confounds and paucity of evidence to date; 4) Compare exemplary examples of the current state of the art in DCE-MRI, DW-MRI, PET/CT and SPECT/CT imaging for assessment of response to radiation therapy. Funding disclosures: Van der Heide: research grants from the Dutch Cancer Society and the European Union (FP7); Bowen: RSNA Scholar grant.

  14. Quantitative photoacoustic assessment of ex-vivo lymph nodes of colorectal cancer patients

    NASA Astrophysics Data System (ADS)

    Sampathkumar, Ashwin; Mamou, Jonathan; Saegusa-Beercroft, Emi; Chitnis, Parag V.; Machi, Junji; Feleppa, Ernest J.

    2015-03-01

    Staging of cancers and selection of appropriate treatment require histological examination of multiple dissected lymph nodes (LNs) per patient, so that a staggering number of nodes require histopathological examination, and the finite resources of pathology facilities create a severe processing bottleneck. Histologically examining the entire 3D volume of every dissected node is not feasible, and therefore, only the central region of each node is examined histologically, which results in severe sampling limitations. In this work, we assess the feasibility of using quantitative photoacoustics (QPA) to overcome the limitations imposed by current procedures and eliminate the resulting undersampling in node assessments. QPA is emerging as a new hybrid modality that assesses tissue properties and classifies tissue type based on multiple estimates derived from spectrum analysis of photoacoustic (PA) radiofrequency (RF) data and from statistical analysis of envelope-signal data derived from the RF signals. Our study seeks to use QPA to distinguish cancerous from non-cancerous regions of dissected LNs and hence serve as a reliable means of imaging and detecting small but clinically significant cancerous foci that would be missed by current methods. Dissected lymph nodes were placed in a water bath and PA signals were generated using a wavelength-tunable (680-950 nm) laser. A 26-MHz, f-2 transducer was used to sense the PA signals. We present an overview of our experimental setup; provide a statistical analysis of multi-wavelength classification parameters (mid-band fit, slope, intercept) obtained from the PA signal spectrum generated in the LNs; and compare QPA performance with our established quantitative ultrasound (QUS) techniques in distinguishing metastatic from non-cancerous tissue in dissected LNs. QPA-QUS methods offer a novel general means of tissue typing and evaluation in a broad range of disease-assessment applications, e.g., cardiac, intravascular
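
    The three classification parameters named above (mid-band fit, slope, intercept) are conventionally obtained from a straight-line fit to the calibrated power spectrum. A minimal sketch of that step, assuming a spectrum already calibrated in dB over an illustrative analysis band (the band and the toy spectrum are assumptions, not values from the paper):

```python
import numpy as np

def spectral_fit_params(freqs_mhz, spectrum_db):
    """Least-squares line through a calibrated power spectrum (dB vs. MHz).

    Returns (midband_fit_db, slope_db_per_mhz, intercept_db), the three
    classifier features named in the abstract."""
    slope, intercept = np.polyfit(freqs_mhz, spectrum_db, 1)
    midband = slope * np.mean(freqs_mhz) + intercept  # fit value at band center
    return midband, slope, intercept

# Toy check: a perfectly linear spectrum is recovered exactly.
f = np.linspace(16.0, 36.0, 11)   # illustrative band around a 26-MHz transducer
s = -0.5 * f + 10.0
mbf, slope, intercept = spectral_fit_params(f, s)
```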

  15. Integrated Assessment Model Evaluation

    NASA Astrophysics Data System (ADS)

    Smith, S. J.; Clarke, L.; Edmonds, J. A.; Weyant, J. P.

    2012-12-01

    Integrated assessment models of climate change (IAMs) are widely used to provide insights into the dynamics of the coupled human and socio-economic system, including emission mitigation analysis and the generation of future emission scenarios. Similar to the climate modeling community, the integrated assessment community has a two-decade history of model inter-comparison, which has served as one of the primary venues for model evaluation and confirmation. While analysis of historical trends in the socio-economic system has long played a key role in diagnostics of future scenarios from IAMs, formal hindcast experiments are just now being contemplated as evaluation exercises. Some initial thoughts on setting up such IAM evaluation experiments are discussed. Socio-economic systems do not follow strict physical laws, which means that evaluation needs to take place in a context, unlike that of physical system models, in which there are few fixed, unchanging relationships. Of course, strict validation of even earth system models is not possible (Oreskes et al. 2004), a fact borne out by the inability of models to constrain the climate sensitivity. Energy-system models have also been grappling with some of the same questions over the last quarter century. For example, one of "the many questions in the energy field that are waiting for answers in the next 20 years" identified by Hans Landsberg in 1985 was "Will the price of oil resume its upward movement?" Of course, we are still asking this question today. While, arguably, even fewer constraints apply to socio-economic systems, numerous historical trends and patterns have been identified, although often only in broad terms, that are used to guide the development of model components, parameter ranges, and scenario assumptions. IAM evaluation exercises are expected to provide useful information for interpreting model results and improving model behavior.
A key step is the recognition of model boundaries, that is, what is inside

  16. PIQMIe: a web server for semi-quantitative proteomics data management and analysis

    PubMed Central

    Kuzniar, Arnold; Kanaar, Roland

    2014-01-01

    We present the Proteomics Identifications and Quantitations Data Management and Integration Service, or PIQMIe, which aids in reliable and scalable data management, analysis and visualization of semi-quantitative mass spectrometry-based proteomics experiments. PIQMIe readily integrates peptide and (non-redundant) protein identifications and quantitations from multiple experiments with additional biological information on the protein entries, and makes the linked data available in the form of a light-weight relational database, which enables dedicated data analyses (e.g. in R) and user-driven queries. Using the web interface, users are presented with a concise summary of their proteomics experiments in numerical and graphical forms, as well as with a searchable protein grid and interactive visualization tools to aid in the rapid assessment of the experiments and in the identification of proteins of interest. The web server not only provides data access through a web interface but also supports programmatic access through a RESTful web service. The web server is available at http://piqmie.semiqprot-emc.cloudlet.sara.nl or http://www.bioinformatics.nl/piqmie. This website is free and open to all users and there is no login requirement. PMID:24861615

  17. Quantitative MRI assessments of white matter in children treated for acute lymphoblastic leukemia

    NASA Astrophysics Data System (ADS)

    Reddick, Wilburn E.; Glass, John O.; Helton, Kathleen J.; Li, Chin-Shang; Pui, Ching-Hon

    2005-04-01

    The purpose of this study was to use objective quantitative MR imaging methods to prospectively assess changes in the physiological structure of white matter during the temporal evolution of leukoencephalopathy (LE) in children treated for acute lymphoblastic leukemia. The longitudinal incidence, extent (proportion of white matter affected), and intensity (elevation of T1 and T2 relaxation rates) of LE were evaluated for 44 children. A combined imaging set consisting of T1, T2, PD, and FLAIR MR images and white matter, gray matter and CSF a priori maps from a spatially normalized atlas were analyzed with a neural network segmentation based on a Kohonen Self-Organizing Map (SOM). Quantitative T1 and T2 relaxation maps were generated using a nonlinear parametric optimization procedure to fit the corresponding multi-exponential models. A Cox proportional regression was performed to estimate the effect of intravenous methotrexate (IV-MTX) exposure on the development of LE, followed by a generalized linear model to predict the probability of LE in new patients. Additional T-tests of independent samples were performed to assess differences in quantitative measures of extent and intensity at four different points in therapy. Higher doses and more courses of IV-MTX placed patients at a higher risk of developing LE and were associated with more intense changes affecting more of the white matter volume; many of the changes resolved after completion of therapy. The impact of these changes on neurocognitive functioning and quality of life in survivors remains to be determined.
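
    The relaxation-map step above fits a parametric decay model per voxel. A minimal sketch of that idea, simplified to a mono-exponential T2 model on a single synthetic voxel (the echo times, starting guess, and mono-exponential simplification are assumptions, not the paper's multi-exponential protocol):

```python
import numpy as np
from scipy.optimize import curve_fit

def fit_t2(echo_times_ms, signal):
    """Nonlinear fit of S(TE) = S0 * exp(-TE / T2); returns (S0, T2_ms)."""
    model = lambda te, s0, t2: s0 * np.exp(-te / t2)
    (s0, t2), _ = curve_fit(model, echo_times_ms, signal,
                            p0=(signal[0], 80.0))  # rough starting guess
    return s0, t2

te = np.array([20.0, 40.0, 60.0, 80.0])
sig = 1000.0 * np.exp(-te / 100.0)       # synthetic voxel with T2 = 100 ms
s0, t2 = fit_t2(te, sig)
```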

  18. QACD: A method for the quantitative assessment of compositional distribution in geologic materials

    NASA Astrophysics Data System (ADS)

    Loocke, M. P.; Lissenberg, J. C. J.; MacLeod, C. J.

    2017-12-01

    In order to fully understand the petrogenetic history of a rock, it is critical to obtain a thorough characterization of the chemical and textural relationships of its mineral constituents. Element mapping combines the microanalytical techniques that allow for the analysis of major and minor elements at high spatial resolutions (e.g., electron microbeam analysis) with 2D mapping of samples in order to provide unprecedented detail regarding the growth histories and compositional distributions of minerals within a sample. We present a method for the acquisition and processing of large area X-ray element maps obtained by energy-dispersive X-ray spectrometry (EDS) to produce a quantitative assessment of compositional distribution (QACD) of mineral populations within geologic materials. By optimizing the conditions at which the EDS X-ray element maps are acquired, we are able to obtain full thin section quantitative element maps for most major elements in relatively short amounts of time. Such maps can be used to not only accurately identify all phases and calculate mineral modes for a sample (e.g., a petrographic thin section), but, critically, enable a complete quantitative assessment of their compositions. The QACD method has been incorporated into a python-based, easy-to-use graphical user interface (GUI) called Quack. The Quack software facilitates the generation of mineral modes, element and molar ratio maps and the quantification of full-sample compositional distributions. The open-source nature of the Quack software provides a versatile platform which can be easily adapted and modified to suit the needs of the user.
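
    Once each map pixel is classified to a phase, a mineral mode is simply the area fraction of pixels per phase. A minimal sketch of that counting step (the phase labels and the tiny map are hypothetical, not Quack's API):

```python
import numpy as np
from collections import Counter

def mineral_modes(phase_map):
    """Area percent of each phase in a classified 2-D element map."""
    flat = np.asarray(phase_map).ravel()
    counts = Counter(flat.tolist())
    total = flat.size
    return {phase: 100.0 * n / total for phase, n in counts.items()}

# Toy 2x3 classified map: olivine (ol), plagioclase (pl), pyroxene (px).
pm = [["ol", "ol", "px"],
      ["ol", "pl", "px"]]
modes = mineral_modes(pm)
```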

  19. Quantitative assessment of building fire risk to life safety.

    PubMed

    Guanquan, Chu; Jinhua, Sun

    2008-06-01

    This article presents a quantitative risk assessment framework for evaluating fire risk to life safety. Fire risk is divided into two parts: probability and corresponding consequence of every fire scenario. The time-dependent event tree technique is used to analyze probable fire scenarios based on the effect of fire protection systems on fire spread and smoke movement. To obtain the variation of occurrence probability with time, Markov chain is combined with a time-dependent event tree for stochastic analysis on the occurrence probability of fire scenarios. To obtain consequences of every fire scenario, some uncertainties are considered in the risk analysis process. When calculating the onset time to untenable conditions, a range of fires are designed based on different fire growth rates, after which uncertainty of onset time to untenable conditions can be characterized by probability distribution. When calculating occupant evacuation time, occupant premovement time is considered as a probability distribution. Consequences of a fire scenario can be evaluated according to probability distribution of evacuation time and onset time of untenable conditions. Then, fire risk to life safety can be evaluated based on occurrence probability and consequences of every fire scenario. To express the risk assessment method in detail, a commercial building is presented as a case study. A discussion compares the assessment result of the case study with fire statistics.
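
    The framework above combines scenario occurrence probabilities with consequences, where a consequence follows from comparing the distribution of evacuation time against the distribution of onset time to untenable conditions. A minimal sketch under stated assumptions (the normal distributions, their parameters, and the two-scenario example are illustrative, not the paper's case study):

```python
# Expected life-safety risk as the probability-weighted sum of scenario
# consequences; each consequence is P(evacuation time > onset of untenable
# conditions), estimated here by simple Monte Carlo sampling.
import random

def scenario_consequence(onset_sampler, evac_sampler, trials=20000):
    """P(occupants are still evacuating when conditions become untenable)."""
    fails = sum(evac_sampler() > onset_sampler() for _ in range(trials))
    return fails / trials

def fire_risk(scenarios):
    """scenarios: list of (occurrence_probability, consequence) pairs."""
    return sum(p * c for p, c in scenarios)

random.seed(1)
onset = lambda: random.gauss(600.0, 60.0)  # onset of untenable conditions, s
evac = lambda: random.gauss(420.0, 90.0)   # pre-movement + travel time, s
c = scenario_consequence(onset, evac)
risk = fire_risk([(0.7, c), (0.3, 0.0)])   # 2nd scenario: fire suppressed early
```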

  20. Quantitative filter forensics for indoor particle sampling.

    PubMed

    Haaland, D; Siegel, J A

    2017-03-01

    Filter forensics is a promising indoor air investigation technique involving the analysis of dust that has collected on filters in central forced-air heating, ventilation, and air conditioning (HVAC) or portable systems to determine the presence of indoor particle-bound contaminants. In this study, we summarize past filter forensics research to explore what it reveals about the sampling technique and the indoor environment. There are 60 investigations in the literature that have used this sampling technique for a variety of biotic and abiotic contaminants. Many studies identified differences between contaminant concentrations in different buildings using this technique. Based on this literature review, we identified a lack of quantification as a gap in the past literature. Accordingly, we propose an approach to quantitatively link contaminants extracted from HVAC filter dust to time-averaged integrated air concentrations. This quantitative filter forensics approach has great potential to measure indoor air concentrations of a wide variety of particle-bound contaminants. Future studies directly comparing quantitative filter forensics to alternative sampling techniques are required to fully assess this approach, but analysis of past research suggests the considerable promise of this approach. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
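
    The quantitative link proposed above amounts to treating the filter as a long-duration sampler: the time-averaged air concentration is the extracted mass divided by the air volume filtered, corrected for capture efficiency. A minimal sketch of that mass balance (all parameter names and values are illustrative assumptions, not the paper's data):

```python
def timeavg_concentration_ug_m3(mass_ug, flow_m3_h, runtime_h, capture_eff):
    """Time-averaged airborne concentration implied by dust on an HVAC filter.

    mass_ug:     contaminant mass extracted from the filter
    flow_m3_h:   airflow through the filter while the system runs
    runtime_h:   system runtime over the sampling interval
    capture_eff: fractional filter capture efficiency for the particle size
    """
    return mass_ug / (flow_m3_h * runtime_h * capture_eff)

c = timeavg_concentration_ug_m3(mass_ug=5000.0, flow_m3_h=2000.0,
                                runtime_h=250.0, capture_eff=0.5)
```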

  1. Sugar concentration in nectar: a quantitative metric of crop attractiveness for refined pollinator risk assessments.

    PubMed

    Knopper, Loren D; Dan, Tereza; Reisig, Dominic D; Johnson, Josephine D; Bowers, Lisa M

    2016-10-01

    Those involved with pollinator risk assessment know that agricultural crops vary in attractiveness to bees. Intuitively, this means that exposure to agricultural pesticides is likely greatest for attractive plants and lowest for unattractive plants. While crop attractiveness in the risk assessment process has been qualitatively remarked on by some authorities, absent is direction on how to refine the process with quantitative metrics of attractiveness. At a high level, attractiveness of crops to bees appears to depend on several key variables, including but not limited to: floral, olfactory, visual and tactile cues; seasonal availability; physical and behavioral characteristics of the bee; plant and nectar rewards. Notwithstanding the complexities and interactions among these variables, sugar content in nectar stands out as a suitable quantitative metric by which to refine pollinator risk assessments for attractiveness. Provided herein is a proposed way to use sugar nectar concentration to adjust the exposure parameter (with what is called a crop attractiveness factor) in the calculation of risk quotients in order to derive crop-specific tier I assessments. This Perspective is meant to invite discussion on incorporating such changes in the risk assessment process. © 2016 The Authors. Pest Management Science published by John Wiley & Sons Ltd on behalf of Society of Chemical Industry.
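
    The adjustment proposed above scales the exposure term of a tier-I risk quotient by a crop attractiveness factor. A minimal sketch, assuming (as a simple illustration only; the paper does not specify this functional form or these numbers) that the factor is the crop's nectar sugar concentration relative to a reference value:

```python
def tier1_risk_quotient(exposure_dose_ug_bee, ld50_ug_bee, sugar_g_per_l,
                        reference_sugar_g_per_l=500.0):
    """Tier-I RQ with exposure scaled by a crop attractiveness factor derived
    from nectar sugar concentration (linear scaling is an assumption)."""
    attractiveness = sugar_g_per_l / reference_sugar_g_per_l
    return (exposure_dose_ug_bee * attractiveness) / ld50_ug_bee

# Hypothetical crop at half the reference sugar concentration:
rq = tier1_risk_quotient(exposure_dose_ug_bee=2.0, ld50_ug_bee=0.1,
                         sugar_g_per_l=250.0)
```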

  2. Dynamic and quantitative assessment of blood coagulation using optical coherence elastography

    PubMed Central

    Xu, Xiangqun; Zhu, Jiang; Chen, Zhongping

    2016-01-01

    Reliable clot diagnostic systems are needed for directing treatment in a broad spectrum of cardiovascular diseases and coagulopathy. Here, we report on non-contact measurement of elastic modulus for dynamic and quantitative assessment of whole blood coagulation using acoustic radiation force orthogonal excitation optical coherence elastography (ARFOE-OCE). In this system, acoustic radiation force (ARF) is produced by a remote ultrasonic transducer, and a shear wave induced by ARF excitation is detected by the optical coherence tomography (OCT) system. During porcine whole blood coagulation, changes in the elastic property of the clots increase the shear modulus of the sample, altering the propagating velocity of the shear wave. Consequently, dynamic blood coagulation status can be measured quantitatively by relating the velocity of the shear wave with clinically relevant coagulation metrics, including reaction time, clot formation kinetics and maximum shear modulus. The results show that the ARFOE-OCE is sensitive to the clot formation kinetics and can differentiate the elastic properties of the recalcified porcine whole blood, blood added with kaolin as an activator, and blood spiked with fibrinogen. PMID:27090437
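
    The readout above rests on a standard elastodynamics relation: in a (nearly) elastic medium the shear modulus follows from the measured shear-wave speed as G = ρv². A minimal sketch of that conversion (the density default and the two example velocities are illustrative assumptions, not the paper's measurements):

```python
def shear_modulus_kpa(velocity_m_s, density_kg_m3=1060.0):
    """G = rho * v^2 for a shear wave; default density is a typical
    whole-blood value. ARFOE-OCE measures v; G is the stiffness readout."""
    return density_kg_m3 * velocity_m_s ** 2 / 1000.0  # Pa -> kPa

g_early = shear_modulus_kpa(0.6)  # slower wave: soft, early clot
g_late = shear_modulus_kpa(2.0)   # faster wave: stiffer, mature clot
```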

  3. Dynamic and quantitative assessment of blood coagulation using optical coherence elastography

    NASA Astrophysics Data System (ADS)

    Xu, Xiangqun; Zhu, Jiang; Chen, Zhongping

    2016-04-01

    Reliable clot diagnostic systems are needed for directing treatment in a broad spectrum of cardiovascular diseases and coagulopathy. Here, we report on non-contact measurement of elastic modulus for dynamic and quantitative assessment of whole blood coagulation using acoustic radiation force orthogonal excitation optical coherence elastography (ARFOE-OCE). In this system, acoustic radiation force (ARF) is produced by a remote ultrasonic transducer, and a shear wave induced by ARF excitation is detected by the optical coherence tomography (OCT) system. During porcine whole blood coagulation, changes in the elastic property of the clots increase the shear modulus of the sample, altering the propagating velocity of the shear wave. Consequently, dynamic blood coagulation status can be measured quantitatively by relating the velocity of the shear wave with clinically relevant coagulation metrics, including reaction time, clot formation kinetics and maximum shear modulus. The results show that the ARFOE-OCE is sensitive to the clot formation kinetics and can differentiate the elastic properties of the recalcified porcine whole blood, blood added with kaolin as an activator, and blood spiked with fibrinogen.

  4. Semi-quantitative assessment of pulmonary perfusion in children using dynamic contrast-enhanced MRI

    NASA Astrophysics Data System (ADS)

    Fetita, Catalin; Thong, William E.; Ou, Phalla

    2013-03-01

    This paper addresses the study of semi-quantitative assessment of pulmonary perfusion acquired from dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) in a study population mainly composed of children with pulmonary malformations. The automatic analysis approach proposed is based on the indicator-dilution theory introduced in 1954. First, a robust method is developed to segment the pulmonary artery and the lungs from anatomical MRI data, exploiting 2D and 3D mathematical morphology operators. Second, the time-dependent contrast signal of the lung regions is deconvolved by the arterial input function for the assessment of the local hemodynamic system parameters, i.e., mean transit time, pulmonary blood volume and pulmonary blood flow. The discrete deconvolution method implemented here is a truncated singular value decomposition (tSVD) method. Parametric images for the entire lungs are generated as additional elements for diagnosis and quantitative follow-up. The preliminary results attest to the feasibility of perfusion quantification in pulmonary DCE-MRI and point to an interesting alternative to scintigraphy for this type of evaluation, to be considered at least as a preliminary diagnostic step, given the wide availability of the technique and its non-invasive nature.
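
    The tSVD deconvolution step can be sketched as inverting the convolution matrix built from the arterial input function, with small singular values discarded to suppress noise amplification. A minimal noiseless toy (the curves and the cutoff are illustrative assumptions, not the paper's data):

```python
import numpy as np

def tsvd_deconvolve(aif, tissue, rel_cutoff=1e-6):
    """Recover the residue function R(t) from tissue(t) = (AIF * R)(t) via
    truncated SVD of the AIF convolution matrix (indicator-dilution model).
    On noisy data a much larger rel_cutoff (e.g. ~0.1) regularizes the fit."""
    n = len(aif)
    A = np.array([[aif[i - j] if i >= j else 0.0 for j in range(n)]
                  for i in range(n)])           # lower-triangular Toeplitz
    U, s, Vt = np.linalg.svd(A)
    s_inv = np.where(s > rel_cutoff * s[0], 1.0 / s, 0.0)  # truncate small s
    return Vt.T @ (s_inv * (U.T @ tissue))

t = np.arange(8, dtype=float)
aif = np.exp(-t)                       # toy arterial input function
r_true = np.exp(-t / 3.0)              # toy residue function
tissue = np.convolve(aif, r_true)[:8]  # forward model: discrete convolution
r_est = tsvd_deconvolve(aif, tissue)
```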

  5. Use of mechanistic simulations as a quantitative risk-ranking tool within the quality by design framework.

    PubMed

    Stocker, Elena; Toschkoff, Gregor; Sacher, Stephan; Khinast, Johannes G

    2014-11-20

    The purpose of this study is to evaluate the use of computer simulations for generating quantitative knowledge as a basis for risk ranking and mechanistic process understanding, as required by ICH Q9 on quality risk management systems. In this specific publication, the main focus is the demonstration of a risk assessment workflow, including a computer simulation for the generation of mechanistic understanding of active tablet coating in a pan coater. Process parameter screening studies are statistically planned under consideration of impacts on a potentially critical quality attribute, i.e., coating mass uniformity. Based on computer simulation data the process failure mode and effects analysis of the risk factors is performed. This results in a quantitative criticality assessment of process parameters and the risk priority evaluation of failure modes. The factor for a quantitative reassessment of the criticality and risk priority is the coefficient of variation, which represents the coating mass uniformity. The major conclusion drawn from this work is a successful demonstration of the integration of computer simulation in the risk management workflow leading to an objective and quantitative risk assessment. Copyright © 2014. Published by Elsevier B.V.
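
    The quantitative reassessment factor named above, the coefficient of variation of coating mass, is straightforward to compute from per-tablet coating masses. A minimal sketch (the sample values are illustrative assumptions, not simulation output from the paper):

```python
import statistics

def coating_cv_percent(coating_masses_mg):
    """Coefficient of variation (%) of per-tablet coating mass, the
    coating-mass-uniformity metric used for criticality ranking."""
    mean = statistics.fmean(coating_masses_mg)
    return 100.0 * statistics.stdev(coating_masses_mg) / mean

cv = coating_cv_percent([9.8, 10.1, 10.0, 9.9, 10.2])
```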

  6. Event-specific qualitative and quantitative PCR detection of the GMO carnation (Dianthus caryophyllus) variety Moonlite based upon the 5'-transgene integration sequence.

    PubMed

    Li, P; Jia, J W; Jiang, L X; Zhu, H; Bai, L; Wang, J B; Tang, X M; Pan, A H

    2012-04-27

    To ensure the implementation of genetically modified organism (GMO)-labeling regulations, an event-specific detection method was developed based on the junction sequence of an exogenous integrant in the transgenic carnation variety Moonlite. The 5'-transgene integration sequence was isolated by thermal asymmetric interlaced PCR. Based upon the 5'-transgene integration sequence, the event-specific primers and TaqMan probe were designed to amplify the fragments, which spanned the exogenous DNA and carnation genomic DNA. Qualitative and quantitative PCR assays were developed employing the designed primers and probe. The detection limit of the qualitative PCR assay was 0.05% for Moonlite in 100 ng total carnation genomic DNA, corresponding to about 79 copies of the carnation haploid genome; the limits of detection and quantification of the quantitative PCR assay were estimated to be 38 and 190 copies of haploid carnation genomic DNA, respectively. Carnation samples with different contents of genetically modified components were quantified and the biases between the observed and true values of three samples were lower than the acceptance criterion (<25%) of the GMO detection method. These results indicated that these event-specific methods would be useful for the identification and quantification of the GMO carnation Moonlite.
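
    The copy-number figures above follow from dividing a DNA mass by the mass of one haploid genome. A minimal sketch of that conversion; the carnation 1C-value used here (~0.63 pg) is an assumption chosen to reproduce the quoted "about 79 copies", not a value stated in the abstract:

```python
def genome_copies(dna_mass_ng, haploid_genome_pg):
    """Haploid genome copies in a DNA aliquot: total mass / mass per genome."""
    return dna_mass_ng * 1000.0 / haploid_genome_pg  # ng -> pg

# 0.05% GMO content in 100 ng total carnation DNA:
copies = genome_copies(dna_mass_ng=100.0 * 0.0005, haploid_genome_pg=0.63)
```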

  7. Ozone (O3) Standards - Integrated Science Assessments from Current Review

    EPA Pesticide Factsheets

    The integrated science assessment (ISA) is a comprehensive review, synthesis, and evaluation of the most policy-relevant science, including key science judgments that are important to inform the development of the risk and exposure assessments, and more.

  8. Application of a faith-based integration tool to assess mental and physical health interventions

    PubMed Central

    Saunders, Donna M.; Leak, Jean; Carver, Monique E.; Smith, Selina A.

    2017-01-01

    Background To build on current research involving faith-based interventions (FBIs) for addressing mental and physical health, this study a) reviewed the extent to which relevant publications integrate faith concepts with health and b) initiated analysis of the degree of FBI integration with intervention outcomes. Methods Derived from a systematic search of articles published between 2007 and 2017, 36 studies were assessed with a Faith-Based Integration Assessment Tool (FIAT) to quantify faith-health integration. Basic statistical procedures were employed to determine the association of faith-based integration with intervention outcomes. Results The assessed studies possessed (on average) moderate, inconsistent integration because of poor use of faith measures, and moderate, inconsistent use of faith practices. Analysis procedures for determining the effect of FBI integration on intervention outcomes were inadequate for formulating practical conclusions. Conclusions Regardless of integration, interventions were associated with beneficial outcomes. To determine the link between FBI integration and intervention outcomes, additional analyses are needed. PMID:29354795

  9. Application of a faith-based integration tool to assess mental and physical health interventions.

    PubMed

    Saunders, Donna M; Leak, Jean; Carver, Monique E; Smith, Selina A

    2017-01-01

    To build on current research involving faith-based interventions (FBIs) for addressing mental and physical health, this study a) reviewed the extent to which relevant publications integrate faith concepts with health and b) initiated analysis of the degree of FBI integration with intervention outcomes. Derived from a systematic search of articles published between 2007 and 2017, 36 studies were assessed with a Faith-Based Integration Assessment Tool (FIAT) to quantify faith-health integration. Basic statistical procedures were employed to determine the association of faith-based integration with intervention outcomes. The assessed studies possessed (on average) moderate, inconsistent integration because of poor use of faith measures, and moderate, inconsistent use of faith practices. Analysis procedures for determining the effect of FBI integration on intervention outcomes were inadequate for formulating practical conclusions. Regardless of integration, interventions were associated with beneficial outcomes. To determine the link between FBI integration and intervention outcomes, additional analyses are needed.

  10. Integrating Qualitative and Quantitative Research in Organizations.

    DTIC Science & Technology

    1981-07-01

    Qualitative research using the traditional case study was the most popular method during the early empirical investigations of... what is now known as qualitative methods (Van Maanen, 1979). Some researchers have recently argued that restricting case studies to exploratory work... phenomenological approaches at the subjective end of the continuum. A few researchers have suggested ways in which quantitative and

  11. Anatomic and Quantitative Temporal Bone CT for Preoperative Assessment of Branchio-Oto-Renal Syndrome.

    PubMed

    Ginat, D T; Ferro, L; Gluth, M B

    2016-12-01

    We describe the temporal bone computed tomography (CT) findings of an unusual case of branchio-oto-renal syndrome with ectopic ossicles that are partially located in the middle cranial fossa. We also describe quantitative temporal bone CT assessment pertaining to cochlear implantation in the setting of anomalous cochlear anatomy associated with this syndrome.

  12. Visualizing the Critique: Integrating Quantitative Reasoning with the Design Process

    ERIC Educational Resources Information Center

    Weinstein, Kathryn

    2017-01-01

    In the age of "Big Data," information is often quantitative in nature. The ability to analyze information through the sifting of data has been identified as a core competency for success in navigating daily life and participation in the contemporary workforce. This skill, known as Quantitative Reasoning (QR), is characterized by the…

  13. Climate change and coastal vulnerability assessment: Scenarios for integrated assessment

    USGS Publications Warehouse

    Nicholls, R.J.; Wong, P.P.; Burkett, V.; Woodroffe, C.D.; Hay, J.

    2008-01-01

    Coastal vulnerability assessments still focus mainly on sea-level rise, with less attention paid to other dimensions of climate change. The influence of non-climatic environmental change or socio-economic change is even less considered, and is often completely ignored. Given that the profound coastal changes of the twentieth century are likely to continue through the twenty-first century, this is a major omission, which may overstate the importance of climate change, and may also miss significant interactions of climate change with other non-climate drivers. To better support climate and coastal management policy development, more integrated assessments of climatic change in coastal areas are required, including the significant non-climatic changes. This paper explores the development of relevant climate and non-climate drivers, with an emphasis on the non-climate drivers. While these issues are applicable within any scenario framework, our ideas are illustrated using the widely used SRES scenarios, with both impacts and adaptation being considered. Importantly, scenario development is a process, and the assumptions that are made about future conditions concerning the coast need to be explicit, transparent and open to scientific debate concerning their realism and likelihood. These issues are generic across other sectors. © Integrated Research System for Sustainability Science and Springer 2008.

  14. Quantitative Evaluation of MODIS Fire Radiative Power Measurement for Global Smoke Emissions Assessment

    NASA Technical Reports Server (NTRS)

    Ichoku, Charles; Ellison, Luke

    2011-01-01

    Satellite remote sensing is providing us tremendous opportunities to measure the fire radiative energy (FRE) release rate or power (FRP) from open biomass burning, which affects many vegetated regions of the world on a seasonal basis. Knowledge of the biomass burning characteristics and emission source strengths of different (particulate and gaseous) smoke constituents is one of the principal ingredients upon which the assessment, modeling, and forecasting of their distribution and impacts depend. This knowledge can be gained through accurate measurement of FRP, which has been shown to have a direct relationship with the rates of biomass consumption and emissions of major smoke constituents. Over the last decade or so, FRP has been routinely measured from space by both the MODIS sensors aboard the polar orbiting Terra and Aqua satellites, and the SEVIRI sensor aboard the Meteosat Second Generation (MSG) geostationary satellite. During the last few years, FRP has steadily gained increasing recognition as an important parameter for facilitating the development of various scientific studies and applications relating to the quantitative characterization of biomass burning and its emissions. To establish the scientific integrity of the FRP as a stable quantity that can be measured consistently across a variety of sensors and platforms, with the potential of being utilized to develop a unified long-term climate data record of fire activity and impacts, it needs to be thoroughly evaluated, calibrated, and validated. Therefore, we are conducting a detailed analysis of the FRP products from MODIS to evaluate the uncertainties associated with them, such as those due to the effects of satellite variable observation geometry and other factors, in order to establish their error budget for use in diverse scientific research and applications.
In this presentation, we will show recent results of the MODIS FRP uncertainty analysis and error mitigation solutions, and demonstrate
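The direct FRP-to-emissions relationship this record relies on can be sketched numerically. The ~0.368 kg/MJ biomass-consumption coefficient is the value reported in the FRP literature (Wooster et al.), used here as an assumption, and the emission factor is a hypothetical input; this is an illustrative sketch, not the authors' retrieval algorithm.

```python
def smoke_emissions_kg(frp_mw, duration_s, emission_factor_g_per_kg,
                       consumption_rate_kg_per_mj=0.368):
    """Mass of one smoke species emitted over a fire's observed lifetime.

    FRE (MJ) = FRP (MW) x duration (s), since 1 MW = 1 MJ/s. Biomass burned
    is assumed to scale linearly with FRE (coefficient from the literature),
    and emissions are biomass times a species-specific emission factor
    (g of species per kg of fuel burned).
    """
    fre_mj = frp_mw * duration_s
    biomass_kg = consumption_rate_kg_per_mj * fre_mj
    return biomass_kg * emission_factor_g_per_kg / 1000.0

# e.g., a 100 MW fire observed for 1000 s, with a 10 g/kg emission factor
mass_kg = smoke_emissions_kg(100.0, 1000.0, 10.0)
```

The uncertainty analysis described in the abstract would then propagate FRP measurement errors (e.g., from observation geometry) through this linear chain.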

  15. Integrated Modeling for Environmental Assessment of Ecosystem Services

    EPA Science Inventory

    The U.S. Environmental Protection Agency uses environmental models to inform rulemaking and policy decisions at multiple spatial and temporal scales. In this study, several sophisticated modeling technologies are seamlessly integrated to facilitate a baseline assessment of the re...

  16. 76 FR 26284 - Draft Integrated Science Assessment for Lead (Pb)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-06

    ... ENVIRONMENTAL PROTECTION AGENCY [FRL-9302-5; Docket ID No. EPA-HQ-ORD-2011-0051] Draft Integrated... the availability of a document titled, ``First External Review Draft Integrated Science Assessment for Lead'' (EPA/600/R-10/075A). This draft document was prepared by the National Center for Environmental...

  17. Computational technique for stepwise quantitative assessment of equation correctness

    NASA Astrophysics Data System (ADS)

    Othman, Nuru'l Izzah; Bakar, Zainab Abu

    2017-04-01

    Many of the computer-aided mathematics assessment systems that are available today possess the capability to implement stepwise correctness checking of a working scheme for solving equations. The computational technique for assessing the correctness of each response in the scheme mainly involves checking mathematical equivalence and providing qualitative feedback. This paper presents a technique, known as the Stepwise Correctness Checking and Scoring (SCCS) technique, that checks the correctness of each equation in terms of structural equivalence and provides quantitative feedback. The technique, which is based on the Multiset framework, adapts techniques from textual information retrieval involving tokenization, document modelling and similarity evaluation. The performance of the SCCS technique was tested using worked solutions for solving linear algebraic equations in one variable. 350 working schemes comprising 1385 responses were collected using a marking engine prototype developed on the basis of the technique. The results show that both the automated analytical scores and the automated overall scores generated by the marking engine exhibit high percent agreement, high correlation and a high degree of agreement with manual scores, with small average absolute and mixed errors.
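The multiset pipeline named in the abstract (tokenization, document modelling, similarity evaluation) can be illustrated with a minimal sketch. The tokenizer and the multiset-Jaccard measure below are assumptions chosen for illustration, not the authors' exact SCCS formulation:

```python
import re
from collections import Counter

def tokenize(equation: str) -> Counter:
    """Model an equation as a multiset (bag) of tokens:
    letter runs, digit runs, and single operator/punctuation symbols."""
    return Counter(re.findall(r"[A-Za-z]+|\d+|[^\sA-Za-z\d]", equation))

def multiset_similarity(a: Counter, b: Counter) -> float:
    """Multiset Jaccard similarity: shared token counts over combined counts."""
    inter = sum((a & b).values())   # multiset intersection (min counts)
    union = sum((a | b).values())   # multiset union (max counts)
    return inter / union if union else 1.0

# A student's step compared against an expected step
score = multiset_similarity(tokenize("2x + 4 = 10"), tokenize("2x = 10 - 4"))
```

A per-step score like this could feed the kind of quantitative (rather than merely pass/fail) feedback the paper describes.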

  18. Evaluation of historical land cover, land use, and land-use change emissions in the GCAM integrated assessment model

    NASA Astrophysics Data System (ADS)

    Calvin, K. V.; Wise, M.; Kyle, P.; Janetos, A. C.; Zhou, Y.

    2012-12-01

    Integrated Assessment Models (IAMs) are often used as science-based decision-support tools for evaluating the consequences of climate and energy policies, and their use in this framework is likely to increase in the future. However, quantitative evaluation of these models has been somewhat limited for a variety of reasons, including data availability, data quality, and the inherent challenges in projections of societal values and decision-making. In this analysis, we identify and confront methodological challenges involved in evaluating the agriculture and land use component of the Global Change Assessment Model (GCAM). GCAM is a global integrated assessment model, linking submodules of the regionally disaggregated global economy, energy system, agriculture and land-use, terrestrial carbon cycle, oceans and climate. GCAM simulates supply, demand, and prices for energy and agricultural goods from 2005 to 2100 in 5-year increments. In each time period, the model computes the allocation of land across a variety of land cover types in 151 different regions, assuming that farmers maximize profits and that food demand is relatively inelastic. GCAM then calculates both emissions from land-use practices, and long-term changes in carbon stocks in different land uses, thus providing simulation information that can be compared to observed historical data. In this work, we compare GCAM results, both in recent historic and future time periods, to historical data sets. We focus on land use, land cover, land-use change emissions, and albedo.
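GCAM's profit-maximizing land allocation can be caricatured with a logit sharing rule, in which land shares rise smoothly with relative profit. The functional form and the sharpness parameter here are illustrative assumptions; GCAM's actual nested land-sharing and calibration are more elaborate.

```python
import math

def logit_shares(profits, beta=1.0):
    """Allocate shares of land across uses, increasing in relative profit.

    beta controls how strongly land concentrates on the most profitable use
    (an assumed parameter, not a GCAM calibration value).
    """
    m = max(profits)  # shift by max for numerical stability
    weights = [math.exp(beta * (p - m)) for p in profits]
    total = sum(weights)
    return [w / total for w in weights]

# e.g., three land uses with profit rates 1.0, 2.0, 0.5
shares = logit_shares([1.0, 2.0, 0.5])
```

Shares always sum to one, and no use is driven entirely to zero, which mirrors the smooth land-cover transitions such models exhibit.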

  19. Integration of classroom science performance assessment tasks by participants of the Wisconsin Performance Assessment Development Project (WPADP)

    NASA Astrophysics Data System (ADS)

    Tonnis, Dorothy Ann

    The goals of this interpretive study were to examine selected Wisconsin science teachers' perceptions of teaching and learning science, to describe the scope of classroom performance assessment practices, and to gain an understanding of teachers' personal and professional experiences that influenced their belief systems of teaching, learning and assessment. The study was designed to answer the research questions: (1) How does the integration of performance assessment relate to the teachers' views of teaching and learning? (2) How are the selected teachers integrating performance assessment in their teaching? (3) What past personal and professional experiences have influenced teachers' attitudes and beliefs related to their classroom performance assessment practices? Purposeful sampling was used to select seven Wisconsin elementary, middle and high school science teachers who participated in the WPADP initiative from 1993-1995. Data collection methods included a Teaching Practices Inventory (TPI), semi-structured interviews, teacher-developed portfolios, portfolio conferences, and classroom observations. Four themes and multiple categories emerged through data analysis to answer the research questions and to describe the results. Several conclusions were drawn from this research. First, science teachers who appeared to effectively integrate performance assessment demonstrated transformational thinking in their attitudes and beliefs about teaching and learning science. Second, these teachers viewed assessment and instructional practices as interdependent. Third, transformational teachers generally used well-defined criteria to judge student work and made those criteria public to the students. Transformational teachers provided students with real-world performance assessment tasks that were also learning events. Furthermore, student task responses informed the transformational teachers about effectiveness of instruction, students' complex thinking skills, quality of

  20. Quantitative assessment of upper extremities motor function in multiple sclerosis.

    PubMed

    Daunoraviciene, Kristina; Ziziene, Jurgita; Griskevicius, Julius; Pauk, Jolanta; Ovcinikova, Agne; Kizlaitiene, Rasa; Kaubrys, Gintaras

    2018-05-18

    Upper extremity (UE) motor function deficits are commonly noted in multiple sclerosis (MS) patients, and assessing them is challenging because of the lack of consensus regarding their definition. Instrumented biomechanical analysis of upper extremity movements can quantify coordination with different spatiotemporal measures and facilitate disability rating in MS patients. The aim was to identify objective quantitative parameters for more accurate evaluation of UE disability and relate them to existing clinical scores. Thirty-four MS patients and 24 healthy controls (CG) performed a finger-to-nose test as fast as possible and, in addition to clinical evaluation, kinematic parameters of the UE were measured using inertial sensors. Generally, a higher disability score was associated with an increase in several temporal parameters, such as slower task performance. The time taken to touch the nose was longer when the task was performed with eyes closed. Time to peak angular velocity changed significantly in MS patients (EDSS > 5.0). Inter-joint coordination decreased significantly in MS patients (EDSS 3.0-5.5). Spatial parameters indicated that the maximal ROM changes were in elbow flexion. Our findings reveal that spatiotemporal parameters are related to UE motor function and MS disability level. Moreover, they facilitate clinical rating by supporting clinical decisions with quantitative data.

  1. The quantitative assessment of epicardial fat distribution on human hearts: Implications for epicardial electrophysiology.

    PubMed

    Mattson, Alexander R; Soto, Mario J; Iaizzo, Paul A

    2018-07-01

    Epicardial electrophysiological procedures rely on dependable interfacing with the myocardial tissue. For example, epicardial pacing systems must generate sustainable chronic pacing capture, while epicardial ablations must effectively deliver energy to the target hyper-excitable myocytes. The human heart has a significant adipose layer which may impede epicardial procedures. The objective of this study was to quantitatively assess the relative location of epicardial adipose on the human heart, to define locations where epicardial therapies might be performed successfully. We studied perfusion-fixed human hearts (n = 105) in multiple isolated planes including: left ventricular margin, diaphragmatic surface, and anterior right ventricle. Relative adipose distribution was quantitatively assessed via planar images, using a custom-generated image analysis algorithm. In these specimens, 76.7 ± 13.8% of the left ventricular margin, 72.7 ± 11.3% of the diaphragmatic surface, and 92.1 ± 8.7% of the anterior right margin were covered with superficial epicardial adipose layers. Percent adipose coverage significantly increased with age (P < 0.001) and history of coronary artery disease (P < 0.05). No significant relationships were identified between relative percent adipose coverage and gender, body weight or height, BMI, history of hypertension, and/or history of congestive heart failure. Additionally, we describe two-dimensional probability distributions of epicardial adipose coverage for each of the three analysis planes. In this study, we detail the quantitative assessment and probabilistic mapping of the distribution of superficial epicardial adipose on the adult human heart. These findings have implications relative to performing epicardial procedures and/or designing procedures or tools to successfully perform such treatments. Clin. Anat. 31:661-666, 2018. © 2018 Wiley Periodicals, Inc.
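The planar percent-coverage figures reported above reduce, at their core, to counting flagged pixels in a segmented image. A minimal sketch, assuming the authors' custom segmentation has already produced a binary mask (1 = adipose, 0 = myocardium):

```python
def percent_coverage(mask):
    """Percent of pixels flagged as adipose in a binary mask.

    mask is a list of rows of 0/1 values, the assumed output of an
    upstream segmentation step (not reproduced here).
    """
    flat = [px for row in mask for px in row]
    return 100.0 * sum(flat) / len(flat)

# e.g., a tiny 2x2 region with three adipose pixels -> 75% coverage
coverage = percent_coverage([[1, 0], [1, 1]])
```

Repeating this over many hearts per analysis plane yields the mean ± SD coverages and the two-dimensional probability maps the study describes.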

  2. Quantitative instruments used to assess children's sense of smell: a review article.

    PubMed

    Moura, Raissa Gomes Fonseca; Cunha, Daniele Andrade; Gomes, Ana Carolina de Lima Gusmão; Silva, Hilton Justino da

    2014-01-01

    To systematically gather from the available literature the quantitative instruments used to assess the sense of smell in studies carried out with children. The present study included a survey of the Pubmed and Bireme platforms and of the MedLine, Lilacs, regional SciELO and Web of Science databases, followed by selection and critical analysis of the articles found. We selected original articles related to the topic in question, conducted only with children and published in Portuguese, English, or Spanish. We excluded studies addressing other phases of human development, exclusively or concurrently with the pediatric population; studies on animals; literature review articles; dissertations; book chapters; case study articles; and editorials. A book report protocol was created for this study, including the following information: author, department, year, location, population/sample, age, purpose of the study, methods, and main results. We found 8,451 articles by typing keywords and identifiers. Out of this total, 5,928 were excluded by the title, 2,366 by the abstract, and 123 after we read the full text. Thus, 34 articles were selected, of which 28 were repeated in the databases, totaling 6 articles analyzed in this review. We observed a lack of standardization of the quantitative instruments used to assess children's sense of smell, with great variability in the methodology of the tests, which reduces the effectiveness and reliability of the results.

  3. Semi-quantitative analysis of salivary gland scintigraphy in Sjögren's syndrome diagnosis: a first-line tool.

    PubMed

    Angusti, Tiziana; Pilati, Emanuela; Parente, Antonella; Carignola, Renato; Manfredi, Matteo; Cauda, Simona; Pizzigati, Elena; Dubreuil, Julien; Giammarile, Francesco; Podio, Valerio; Skanjeti, Andrea

    2017-09-01

    The aim of this study was to assess semi-quantitative salivary gland dynamic scintigraphy (SGdS) parameters, independently and in an integrated way, in order to predict primary Sjögren's syndrome (pSS). Forty-six consecutive patients (41 females; age 61 ± 11 years) with sicca syndrome were studied by SGdS after injection of 200 MBq of pertechnetate. In sixteen patients, pSS was diagnosed according to American-European Consensus Group criteria (AECGc). Semi-quantitative parameters (uptake (UP) and excretion fraction (EF)) were obtained for each gland. ROC curves were used to determine the best cut-off value. The area under the curve (AUC) was used to estimate the accuracy of each semi-quantitative analysis. To assess the correlation between scintigraphic results and disease severity, semi-quantitative parameters were plotted versus the Sjögren's syndrome disease activity index (ESSDAI). A nomogram was built to perform an integrated evaluation of all the scintigraphic semi-quantitative data. Both UP and EF of salivary glands were significantly lower in pSS patients compared to those in non-pSS patients (p < 0.001). ROC curves showed significantly large AUCs for both parameters (p < 0.05). Parotid UP and submandibular EF, assessed by univariate and multivariate logistic regression, showed a significant and independent correlation with pSS diagnosis (p < 0.05). No correlation was found between SGdS semi-quantitative parameters and ESSDAI. The proposed nomogram accuracy was 87%. SGdS is an accurate and reproducible tool for the diagnosis of pSS. ESSDAI was not shown to be correlated with SGdS data. SGdS should be the first-line imaging technique in patients with suspected pSS.
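The ROC/AUC machinery used in this record is standard and can be sketched compactly. The sketch below assumes higher scores mark the positive class (in the study, lower uptake marked pSS, so scores would be negated first), and the "best cut-off" is chosen by Youden's J, one common convention the abstract does not specify:

```python
def auc(scores, labels):
    """AUC via the Mann-Whitney interpretation: the probability that a
    randomly chosen positive case outranks a randomly chosen negative one."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def best_cutoff(scores, labels):
    """Threshold maximizing Youden's J = sensitivity + specificity - 1."""
    best_t, best_j = None, -1.0
    for t in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if y == 1 and s >= t)
        fn = sum(1 for s, y in zip(scores, labels) if y == 1 and s < t)
        tn = sum(1 for s, y in zip(scores, labels) if y == 0 and s < t)
        fp = sum(1 for s, y in zip(scores, labels) if y == 0 and s >= t)
        j = tp / (tp + fn) + tn / (tn + fp) - 1
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j
```

For a gland parameter with perfect separation, AUC is 1.0 and the best cutoff sits between the two groups.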

  4. Advancing effects analysis for integrated, large-scale wildfire risk assessment

    Treesearch

    Matthew P. Thompson; David E. Calkin; Julie W. Gilbertson-Day; Alan A. Ager

    2011-01-01

    In this article, we describe the design and development of a quantitative, geospatial risk assessment tool intended to facilitate monitoring trends in wildfire risk over time and to provide information useful in prioritizing fuels treatments and mitigation measures. The research effort is designed to develop, from a strategic view, a first approximation of how both...

  5. RECOVERY ACT - Methods for Decision under Technological Change Uncertainty and Risk Assessment for Integrated Assessment of Climate Change

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Webster, Mort David

    2015-03-10

    This report presents the final outcomes and products of the project as performed at the Massachusetts Institute of Technology. The research project consists of three main components: methodology development for decision-making under uncertainty, improving the resolution of the electricity sector to improve integrated assessment, and application of these methods to integrated assessment. Results in each area are described in the report.

  6. Quantitative Assessment of Transportation Network Vulnerability with Dynamic Traffic Simulation Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shekar, Venkateswaran; Fiondella, Lance; Chatterjee, Samrat

    Transportation networks are critical to the social and economic function of nations. Given the continuing increase in the populations of cities throughout the world, the criticality of transportation infrastructure is expected to increase. Thus, it is ever more important to mitigate congestion as well as to assess the impact disruptions would have on individuals who depend on transportation for their work and livelihood. Moreover, several government organizations are responsible for ensuring transportation networks are available despite the constant threat of natural disasters and terrorist activities. Most of the previous transportation network vulnerability research has been performed in the context of static traffic models, many of which are formulated as traditional optimization problems. However, transportation networks are dynamic because their usage varies over time. Thus, more appropriate methods to characterize the vulnerability of transportation networks should consider their dynamic properties. This paper presents a quantitative approach to assess the vulnerability of a transportation network to disruptions with methods from traffic simulation. Our approach can prioritize the critical links over time and is generalizable to the case where both link and node disruptions are of concern. We illustrate the approach through a series of examples. Our results demonstrate that the approach provides quantitative insight into the time varying criticality of links. Such an approach could be used as the objective function of less traditional optimization methods that use simulation and other techniques to evaluate the relative utility of a particular network defense to reduce vulnerability and increase resilience.
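The core idea of link criticality — how much total travel cost rises when one link is disrupted — can be sketched with a static shortest-path proxy. This is a simplification of the paper's dynamic traffic simulation (no time-varying demand or congestion), with a toy directed network and hypothetical origin-destination pairs:

```python
import heapq

def shortest_time(graph, src, dst):
    """Dijkstra over travel times; graph maps node -> {neighbor: minutes}
    (directed edges)."""
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            return d
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph.get(u, {}).items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return float("inf")  # destination unreachable

def link_criticality(graph, link, od_pairs):
    """Extra travel time, summed over OD pairs, when one link is removed."""
    base = sum(shortest_time(graph, s, t) for s, t in od_pairs)
    u, v = link
    degraded = {a: {b: w for b, w in nbrs.items() if (a, b) != (u, v)}
                for a, nbrs in graph.items()}
    return sum(shortest_time(degraded, s, t) for s, t in od_pairs) - base
```

Re-evaluating criticality with time-dependent edge weights would recover the time-varying ranking the paper advocates.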

  7. An integrative approach to the assessment of narcissism.

    PubMed

    Roche, Michael J; Pincus, Aaron L; Lukowitsky, Mark R; Ménard, Kim S; Conroy, David E

    2013-01-01

    Narcissism research is poorly calibrated across fields of study in part due to confusion over how to integrate normal and pathological descriptions of narcissism. We argue that pathological and normal narcissism can be integrated in a single model that organizes around self-regulation mechanisms. We present theoretical and empirical support for this interpretation, and demonstrate that modeling pathological and normal narcissism as 2 dimensions underlying the narcissistic character can help to resolve some of the inconsistencies in the field regarding how to best assess adaptive and maladaptive expressions of narcissism.

  8. Integrated Science Assessment (ISA) of Ozone and Related ...

    EPA Pesticide Factsheets

    EPA announced the availability of the final report, Integrated Science Assessment of Ozone and Related Photochemical Oxidants. This document represents a concise synthesis and evaluation of the most policy-relevant science and will ultimately provide the scientific bases for EPA’s decision regarding the adequacy of the current national ambient air quality standards for ozone to protect human health, public welfare, and the environment. Critical evaluation and integration of the evidence on health and environmental effects of ozone to provide scientific support for the review of the NAAQS for ozone.

  9. Projecting state-level air pollutant emissions using an integrated assessment model: GCAM-USA

    EPA Science Inventory

    The Global Change Assessment Model (GCAM) is an integrated assessment model that links representations of the economy, energy sector, land use, and climate within an integrated modeling environment. GCAM-USA, which is an extension of GCAM, provides U.S. state-level resolution wit...

  10. Integrating modeling and surveys for more effective assessments

    EPA Science Inventory

    A false dichotomy currently exists in monitoring that pits sample surveys based on probability designs against targeted monitoring of hand-picked sites. We maintain that judicious use of both, when designed to be integrated, produces assessments of greater value than either inde...

  11. Probabilistic Integrated Assessment of "Dangerous" Climate Change

    NASA Astrophysics Data System (ADS)

    Mastrandrea, Michael D.; Schneider, Stephen H.

    2004-04-01

    Climate policy decisions are being made despite layers of uncertainty. Such decisions directly influence the potential for "dangerous anthropogenic interference with the climate system." We mapped a metric for this concept, based on the Intergovernmental Panel on Climate Change assessment of climate impacts, onto probability distributions of future climate change produced from uncertainty in key parameters of the coupled social-natural system: climate sensitivity, climate damages, and discount rate. Analyses with a simple integrated assessment model found that, under midrange assumptions, endogenously calculated optimal climate policy controls can reduce the probability of dangerous anthropogenic interference from ~45% under minimal controls to near zero.
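The probabilistic machinery behind this record — propagating parameter uncertainty into a probability of exceeding a "dangerous" threshold — can be sketched with a toy Monte Carlo loop. Every distribution and coefficient below is an illustrative assumption, not the authors' calibrated model:

```python
import random

def probability_dangerous(n_draws, warming_threshold=2.0, seed=1):
    """Monte Carlo estimate of the chance that realized warming crosses a
    'dangerous' threshold, propagating climate-sensitivity uncertainty.

    The lognormal prior and linear forcing-to-warming scaling are
    hypothetical stand-ins for the coupled model's distributions.
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    exceed = 0
    for _ in range(n_draws):
        sensitivity = rng.lognormvariate(1.0, 0.4)  # degC per CO2 doubling (assumed)
        warming = 0.8 * sensitivity                 # toy scenario scaling (assumed)
        exceed += warming > warming_threshold
    return exceed / n_draws
```

Comparing this probability across policy scenarios (which would alter the scaling) is the kind of exercise the abstract summarizes.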

  12. A Mobile Food Record For Integrated Dietary Assessment*

    PubMed Central

    Ahmad, Ziad; Kerr, Deborah A.; Bosch, Marc; Boushey, Carol J.; Delp, Edward J.; Khanna, Nitin; Zhu, Fengqing

    2017-01-01

    This paper presents an integrated dietary assessment system based on food image analysis that uses mobile devices or smartphones. We describe two components of our integrated system: a mobile application and an image-based food nutrient database that is connected to the mobile application. An easy-to-use mobile application user interface is described that was designed based on user preferences as well as the requirements of the image analysis methods. The user interface is validated by user feedback collected from several studies. Food nutrient and image databases are also described, which facilitate image-based dietary assessment and enable dietitians and other healthcare professionals to monitor patients' dietary intake in real time. The system has been tested and validated in several user studies involving more than 500 users who took more than 60,000 food images under controlled and community-dwelling conditions. PMID:28691119

  13. Integrated Environmental Health Impact Assessment for Risk Governance Purposes; Across What Do We Integrate?

    PubMed

    Lebret, Erik

    2015-12-23

    Integrated Environmental Health Impact Assessment (IEHIA) can be considered as an element in the third phase of environmental risk management. Its focus is on providing inclusive descriptions of multiple impacts from multiple stressors in such a way that they can be evaluated against the potential societal benefits of the causes of the stressors. This paper emphasises some differences and difficulties in the integration across professional paradigms and scientific fields, across stakeholder perspectives and differences in impact indicators that emanate from these different fields and paradigms.

  14. Integrated Environmental Health Impact Assessment for Risk Governance Purposes; Across What Do We Integrate?

    PubMed Central

    Lebret, Erik

    2015-01-01

    Integrated Environmental Health Impact Assessment (IEHIA) can be considered as an element in the third phase of environmental risk management. Its focus is on providing inclusive descriptions of multiple impacts from multiple stressors in such a way that they can be evaluated against the potential societal benefits of the causes of the stressors. This paper emphasises some differences and difficulties in the integration across professional paradigms and scientific fields, across stakeholder perspectives and differences in impact indicators that emanate from these different fields and paradigms. PMID:26703709

  15. Integrative Application of Life Cycle Assessment and Risk Assessment to Environmental Impacts of Anthropogenic Pollutants at a Watershed Scale.

    PubMed

    Lin, Xiaodan; Yu, Shen; Ma, Hwongwen

    2018-01-01

    Intense human activities have led to increasing deterioration of the watershed environment via pollutant discharge, which threatens human health and ecosystem function. To meet the need for comprehensive environmental impact/risk assessment for sustainable watershed development, a biogeochemical process-based integration of life cycle assessment and risk assessment (RA) for pollutants, aided by a geographic information system, is proposed in this study. The integration frames a conceptual protocol of "watershed life cycle assessment (WLCA) for pollutants". The proposed WLCA protocol consists of (1) geographic and environmental characterization mapping; (2) life cycle inventory analysis; (3) integration of life-cycle impact assessment (LCIA) with RA via characterization factors for the pollutants of interest; and (4) result analysis and interpretation. The WLCA protocol can visualize results of LCIA and RA spatially for the pollutants of interest, which may help decision- and policy-makers mitigate the impacts of watershed development.
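Step (3) of the protocol hinges on characterization factors, which convert an emissions inventory into a common impact score. A minimal sketch of that aggregation, with hypothetical pollutant names and factor values:

```python
def lcia_impact(inventory, characterization_factors):
    """Life-cycle impact score: sum over pollutants of mass emitted (kg)
    times its characterization factor (impact units per kg).

    Pollutant keys and factor values here are illustrative, not from any
    published LCIA method.
    """
    return sum(mass * characterization_factors[p]
               for p, mass in inventory.items())

# e.g., 2 kg of "Cd" (factor 3.0) and 1 kg of "Pb" (factor 1.0)
score = lcia_impact({"Cd": 2.0, "Pb": 1.0}, {"Cd": 3.0, "Pb": 1.0})
```

Computing this score per watershed grid cell is what allows the protocol's GIS layer to map LCIA and RA results side by side.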

  16. Quantitative phase imaging for enhanced assessment of optomechanical cancer cell properties

    NASA Astrophysics Data System (ADS)

    Kastl, Lena; Kemper, Björn; Schnekenburger, Jürgen

    2018-02-01

    Optical cell stretching provides label-free investigations of cells by measuring their biomechanical properties based on deformability determination in a fiber-optical two-beam trap. However, the stretching forces in this two-beam laser trap depend on the optical properties of the investigated specimen. Therefore, we characterized four cancer cell lines with varying degrees of differentiation in parallel, utilizing quantitative phase imaging (QPI) and optical cell stretching. The QPI data allowed enhanced assessment of the mechanical cell properties measured with the optical cell stretcher and demonstrate the high potential of cell phenotyping when both techniques are combined.

  17. Developing and applying quantitative skills maps for STEM curricula, with a focus on different modes of learning

    NASA Astrophysics Data System (ADS)

    Reid, Jackie; Wilkes, Janelle

    2016-08-01

    Mapping quantitative skills across the science, technology, engineering and mathematics (STEM) curricula will help educators identify gaps and duplication in the teaching, practice and assessment of the necessary skills. This paper describes the development and implementation of quantitative skills mapping tools for courses in STEM at a regional university that offers both on-campus and distance modes of study. Key elements of the mapping project included the identification of key graduate quantitative skills, the development of curriculum mapping tools to record in which unit(s) and at what level of attainment each quantitative skill is taught, practised and assessed, and identification of differences in the way quantitative skills are developed for on-campus and distance students. Particular attention is given to the differences that are associated with intensive schools, which consist of concentrated periods of face-to-face learning over a three- to four-day period, and are available to distance education students enrolled in STEM units. The detailed quantitative skills mapping process has had an impact on the review of first-year mathematics units, resulted in crucial changes to the curriculum in a number of courses, and contributed to a more integrated approach, and a collective responsibility, to the development of students' quantitative skills for both face-to-face and online modes of learning.

  18. Assessing the performance of quantitative image features on early stage prediction of treatment effectiveness for ovary cancer patients: a preliminary investigation

    NASA Astrophysics Data System (ADS)

    Zargari, Abolfazl; Du, Yue; Thai, Theresa C.; Gunderson, Camille C.; Moore, Kathleen; Mannel, Robert S.; Liu, Hong; Zheng, Bin; Qiu, Yuchen

    2018-02-01

    The objective of this study is to investigate the performance of global and local features to better estimate the characteristics of highly heterogeneous metastatic tumours, for accurately predicting the treatment effectiveness of advanced-stage ovarian cancer patients. In order to achieve this, a quantitative image analysis scheme was developed to estimate a total of 103 features from three different groups: shape and density, Wavelet, and Gray Level Difference Method (GLDM) features. Shape and density features are global features, which are directly applied on the entire target image; wavelet and GLDM features are local features, which are applied on the divided blocks of the target image. To assess the performance, the new scheme was applied on a retrospective dataset containing 120 recurrent and high-grade ovary cancer patients. The results indicate that the three best-performing features are skewness, root-mean-square (rms) and mean of local GLDM texture, indicating the importance of integrating local features. In addition, the average predictive performance is comparable among the three different categories. This investigation concluded that the local features contain at least as much tumour heterogeneity information as the global features, which may be meaningful for improving the predictive performance of quantitative image markers for the diagnosis and prognosis of ovary cancer patients.
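Two of the best-performing features named above, skewness and rms, are simple statistics over pixel or block intensity values. A self-contained sketch (population skewness is used here; the paper does not specify its estimator, so treat the exact formula as an assumption):

```python
import math

def rms(values):
    """Root-mean-square of a list of intensity values."""
    return math.sqrt(sum(v * v for v in values) / len(values))

def skewness(values):
    """Population skewness: mean cubed standardized deviation.
    Positive for a right-skewed intensity distribution."""
    n = len(values)
    mean = sum(values) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / n)
    if sd == 0:
        return 0.0  # constant block: no asymmetry
    return sum(((v - mean) / sd) ** 3 for v in values) / n
```

Applied per image block, such statistics become the "local" feature values whose block-wise mean the study feeds into its predictive models.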

  19. Remotely Sensed Quantitative Drought Risk Assessment in Vulnerable Agroecosystems

    NASA Astrophysics Data System (ADS)

    Dalezios, N. R.; Blanta, A.; Spyropoulos, N. V.

    2012-04-01

    Hazard may be defined as a potential threat to humans and their welfare, and risk (or consequence) as the probability of a hazard occurring and creating loss. Drought is considered one of the major natural hazards, with significant impact on agriculture, environment, economy and society. This paper deals with drought risk assessment, the first step of risk analysis, which is designed to find out what the problems are and comprises three distinct steps, namely risk identification, risk estimation and risk evaluation; risk management is not covered in this paper, and there should be a fourth step to address the need for feedback and to take post-audits of all risk assessment exercises. In particular, quantitative drought risk assessment is attempted by using statistical methods. For the quantification of drought, the Reconnaissance Drought Index (RDI) is employed, which is a new index based on hydrometeorological parameters, such as precipitation and potential evapotranspiration. The remotely sensed estimation of RDI is based on NOAA-AVHRR satellite data for a period of 20 years (1981-2001). The study area is Thessaly, central Greece, which is a drought-prone agricultural region characterized by vulnerable agriculture. Specifically, the undertaken drought risk assessment processes are as follows: 1. Risk identification: this step involves drought quantification and monitoring based on remotely sensed RDI and extraction of several features, such as severity, duration, areal extent, onset and end time. Moreover, it involves a drought early warning system based on the above parameters. 2. Risk estimation: this step includes an analysis of drought severity, frequency and their relationships. 3. Risk evaluation: this step covers drought evaluation based on analysis of RDI images before and after each drought episode, which usually lasts one hydrological year (12 months). The results of this three-step drought assessment process are considered quite satisfactory in a drought-prone region such as Thessaly in central
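The RDI mentioned above is built from the two hydrometeorological inputs the abstract names. A sketch following the standard RDI definition (initial value α as the precipitation-to-PET ratio, plus a normalized form as departure from the long-term mean); the fully standardized RDI additionally fits a lognormal distribution, which is omitted here:

```python
def rdi_alpha(precip, pet):
    """Initial RDI value for one period: total precipitation (mm)
    over total potential evapotranspiration (mm)."""
    return sum(precip) / sum(pet)

def rdi_normalized(alpha, alpha_mean):
    """Normalized RDI: relative departure of alpha from its long-term
    mean; negative values indicate drier-than-normal conditions."""
    return alpha / alpha_mean - 1.0

# e.g., a hydrological year receiving half the PET demand,
# in a region whose long-term mean alpha is 1.0
alpha = rdi_alpha([10.0, 20.0], [30.0, 30.0])
rdi_n = rdi_normalized(alpha, 1.0)
```

In the remote-sensing setting of this study, precipitation and PET fields per pixel would feed such a computation to produce the RDI images analyzed in the risk evaluation step.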

  20. Better Assessment Science Integrating Point and Nonpoint Sources

    EPA Science Inventory

    Better Assessment Science Integrating Point and Nonpoint Sources (BASINS) is not a model per se, but is a multipurpose environmental decision support system for use by regional, state, and local agencies in performing watershed- and water-quality-based studies. BASI...

  1. Correlation between quantitative whole-body muscle magnetic resonance imaging and clinical muscle weakness in Pompe disease.

    PubMed

    Horvath, Jeffrey J; Austin, Stephanie L; Case, Laura E; Greene, Karla B; Jones, Harrison N; Soher, Brian J; Kishnani, Priya S; Bashir, Mustafa R

    2015-05-01

    Previous examination of whole-body muscle involvement in Pompe disease has been limited to physical examination and/or qualitative magnetic resonance imaging (MRI). In this study we assess the feasibility of quantitative proton-density fat-fraction (PDFF) whole-body MRI in late-onset Pompe disease (LOPD) and compare the results with manual muscle testing. Seven LOPD patients and 11 disease-free controls underwent whole-body PDFF MRI. Quantitative MR muscle group assessments were compared with physical testing of muscle groups. The 95% upper limits of confidence intervals for muscle groups were 4.9-12.6% in controls and 6.8-76.4% in LOPD patients. LOPD patients showed severe and consistent tongue and axial muscle group involvement, with less marked involvement of peripheral musculature. MRI was more sensitive than physical examination for detection of abnormality in multiple muscle groups. This integrated, quantitative approach to muscle assessment provides more detailed data than physical examination and may have clinical utility for monitoring disease progression and treatment response. © 2014 Wiley Periodicals, Inc.
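The proton-density fat fraction underlying this record is a simple ratio of MRI signal contributions. A minimal sketch of the definition (the signal values are hypothetical inputs; the acquisition and fat-water separation that produce them are beyond this sketch):

```python
def proton_density_fat_fraction(fat_signal, water_signal):
    """PDFF (%): proton-density-weighted fat signal relative to the
    total fat + water signal in a voxel or muscle-group ROI."""
    return 100.0 * fat_signal / (fat_signal + water_signal)

# e.g., a voxel with fat/water signal contributions of 25 and 75 units
pdff = proton_density_fat_fraction(25.0, 75.0)
```

Per-muscle-group PDFF values like this are what the study compares against its manual muscle testing results.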

  2. Academic Workload Implications of Assessing Student Learning in Work-Integrated Learning

    ERIC Educational Resources Information Center

    Bilgin, Ayse A.; Rowe, Anna D.; Clark, Lindie

    2017-01-01

    Assessment of student learning is a crucial part of quality work-integrated learning (WIL), yet presents some significant challenges for WIL practitioners. Assessment of WIL differs from assessment in classroom-based courses because of the complexities of assessing the more holistic nature of learning in WIL, as well as (in many cases)…

  3. Development of a novel visuomotor integration paradigm by integrating a virtual environment with mobile eye-tracking and motion-capture systems

    PubMed Central

    Miller, Haylie L.; Bugnariu, Nicoleta; Patterson, Rita M.; Wijayasinghe, Indika; Popa, Dan O.

    2018-01-01

    Visuomotor integration (VMI), the use of visual information to guide motor planning, execution, and modification, is necessary for a wide range of functional tasks. To comprehensively, quantitatively assess VMI, we developed a paradigm integrating virtual environments, motion-capture, and mobile eye-tracking. Virtual environments enable tasks to be repeatable, naturalistic, and varied in complexity. Mobile eye-tracking and minimally-restricted movement enable observation of natural strategies for interacting with the environment. This paradigm yields a rich dataset that may inform our understanding of VMI in typical and atypical development. PMID:29876370

  4. An Integrated Strategy for Global Qualitative and Quantitative Profiling of Traditional Chinese Medicine Formulas: Baoyuan Decoction as a Case

    NASA Astrophysics Data System (ADS)

    Ma, Xiaoli; Guo, Xiaoyu; Song, Yuelin; Qiao, Lirui; Wang, Wenguang; Zhao, Mingbo; Tu, Pengfei; Jiang, Yong

    2016-12-01

    Clarification of the chemical composition of traditional Chinese medicine formulas (TCMFs) is a challenge due to the variety of structures and the complexity of plant matrices. Herein, an integrated strategy was developed by hyphenating ultra-performance liquid chromatography (UPLC), quadrupole time-of-flight (Q-TOF), hybrid triple quadrupole-linear ion trap mass spectrometry (Qtrap-MS), and the novel post-acquisition data processing software UNIFI to achieve automatic, rapid, accurate, and comprehensive qualitative and quantitative analysis of the chemical components in TCMFs. As a proof of concept, chemical profiling was performed on Baoyuan decoction (BYD), an ancient TCMF clinically used for the treatment of coronary heart disease that consists of Ginseng Radix et Rhizoma, Astragali Radix, Glycyrrhizae Radix et Rhizoma Praeparata Cum Melle, and Cinnamomi Cortex. As many as 236 compounds were plausibly or unambiguously identified, and 175 compounds were quantified or relatively quantified by the scheduled multiple reaction monitoring (sMRM) method. The findings demonstrate that the strategy integrating the rapidity of UNIFI software, the efficiency of UPLC, the accuracy of Q-TOF-MS, and the sensitivity and quantitation ability of Qtrap-MS provides a method for the efficient and comprehensive chemome characterization and quality control of complex TCMFs.

  5. Integrated Medical Model Overview

    NASA Technical Reports Server (NTRS)

    Myers, J.; Boley, L.; Foy, M.; Goodenow, D.; Griffin, D.; Keenan, A.; Kerstman, E.; Melton, S.; McGuire, K.; Saile, L.; hide

    2015-01-01

    The Integrated Medical Model (IMM) Project represents one aspect of NASA's Human Research Program (HRP) to quantitatively assess medical risks to astronauts for existing operational missions as well as missions associated with future exploration and commercial space flight ventures. The IMM takes a probabilistic approach to assessing the likelihood and specific outcomes of one hundred medical conditions within the envelope of accepted space flight standards of care over a selectable range of mission capabilities. A specially developed Integrated Medical Evidence Database (iMED) maintains evidence-based, organizational knowledge across a variety of data sources. Since becoming operational in 2011, version 3.0 of the IMM, the supporting iMED, and the expertise of the IMM project team have contributed to a wide range of decision and informational processes for the space medical and human research community. This presentation provides an overview of the IMM conceptual architecture and range of application through examples of actual space flight community questions posed to the IMM project.

  6. Quantitative assessment of Cerenkov luminescence for radioguided brain tumor resection surgery

    NASA Astrophysics Data System (ADS)

    Klein, Justin S.; Mitchell, Gregory S.; Cherry, Simon R.

    2017-05-01

    Cerenkov luminescence imaging (CLI) is a developing imaging modality that detects radiolabeled molecules via the visible light emitted during the radioactive decay process. We used a Monte Carlo-based computer simulation to quantitatively compare CLI with direct detection of the ionizing radiation itself as an intraoperative imaging tool for assessment of brain tumor margins. Our brain tumor model consisted of a 1 mm spherical tumor remnant embedded up to 5 mm deep below the surface of normal brain tissue. Tumor-to-background contrast ratios ranging from 2:1 to 10:1 were considered. We quantified all decay signals (e±, gamma photons, Cerenkov photons) reaching the surface of the brain volume. CLI proved to be the most sensitive method for detecting the tumor volume in both imaging and non-imaging strategies, as assessed by contrast-to-noise ratio and by the receiver operating characteristic output of a channelized Hotelling observer.
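
    One of the detection metrics above, the contrast-to-noise ratio, can be illustrated with a toy calculation on simulated detector counts (the distributions below are assumptions for illustration, not the paper's simulation output):

```python
import numpy as np

def cnr(signal_roi, background_roi):
    """Contrast-to-noise ratio: difference of mean signals divided by
    the standard deviation of the background."""
    s = np.asarray(signal_roi, dtype=float)
    b = np.asarray(background_roi, dtype=float)
    return (s.mean() - b.mean()) / b.std(ddof=1)

rng = np.random.default_rng(1)
tumor = rng.normal(50, 5, 1000)    # simulated counts over the tumor remnant
normal = rng.normal(10, 5, 1000)   # simulated counts over normal tissue
print(f"CNR = {cnr(tumor, normal):.1f}")
```

A higher CNR means the tumor remnant is more easily distinguished from background, which is how the sensitivity comparison between CLI and direct charged-particle detection is scored.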

  7. Quantitative assessment of arm tremor in people with neurological disorders.

    PubMed

    Jeonghee Kim; Parnell, Claire; Wichmann, Thomas; DeWeerth, Stephen P

    2016-08-01

    Abnormal oscillatory movement (i.e., tremor) is usually evaluated qualitatively by clinicians and quantified with subjective scoring methods. These methods are often inaccurate. We utilized a quantitative, standardized task based on Fitts' law to assess the performance of arm movement with tremor by controlling a gyration mouse on a computer. The experiment included center-out tapping (COT) and rectangular track navigation (RTN) tasks. We report the results of a pilot study in which we collected performance data for healthy participants in whom tremor was simulated by imposing oscillatory movements on the arm with a vibration motor. We compared their movement speed and accuracy with and without the artificial "tremor." We found that the artificial tremor significantly reduced path efficiency for both tasks (COT: 56.8 vs. 46.2%, p < 0.05; RTN: 94.2 vs. 67.4%, p < 0.05), and we were able to distinguish the presence of tremor. From this result, we expect to be able to quantify the severity of tremor and the effectiveness of therapy for tremor patients.
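
    The path-efficiency figures quoted above (e.g. COT: 56.8 vs. 46.2%) are typically computed as the straight-line distance from start to end divided by the actual distance the cursor traveled. A small sketch under that assumption, with hypothetical trajectories:

```python
import numpy as np

def path_efficiency(xy):
    """Percent path efficiency of a cursor trajectory: straight-line
    start-to-end distance divided by the actual path length."""
    xy = np.asarray(xy, dtype=float)
    steps = np.diff(xy, axis=0)
    path_len = np.sqrt((steps ** 2).sum(axis=1)).sum()
    straight = np.linalg.norm(xy[-1] - xy[0])
    return 100.0 * straight / path_len

straight_path = [(0, 0), (5, 0), (10, 0)]                  # ideal movement
tremor_path = [(0, 0), (3, 2), (5, -2), (8, 2), (10, 0)]   # oscillatory
print(path_efficiency(straight_path))  # 100.0
print(path_efficiency(tremor_path))
```

Oscillatory deviations lengthen the traveled path without changing the start-to-end distance, so tremor lowers the percentage, consistent with the drops reported for both tasks.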

  8. Development of quantitative analysis method for stereotactic brain image: assessment of reduced accumulation in extent and severity using anatomical segmentation.

    PubMed

    Mizumura, Sunao; Kumita, Shin-ichiro; Cho, Keiichi; Ishihara, Makiko; Nakajo, Hidenobu; Toba, Masahiro; Kumazaki, Tatsuo

    2003-06-01

    With visual assessment by three-dimensional (3D) brain image analysis methods that use a stereotactic brain coordinate system, such as three-dimensional stereotactic surface projections and statistical parametric mapping, it is difficult to quantitatively assess anatomical information and the extent of an abnormal region. In this study, we devised a method to quantitatively assess local abnormal findings by segmenting a brain map according to anatomical structure. Using quantitative local abnormality assessment with this method, we studied the characteristics of the distribution of reduced blood flow in cases with dementia of the Alzheimer type (DAT). In twenty-five cases with DAT (mean age, 68.9 years), all of whom were diagnosed as probable Alzheimer's disease based on NINCDS-ADRDA criteria, we collected I-123 iodoamphetamine SPECT data. A 3D brain map produced with the 3D-SSP program was compared with data from 20 age-matched cases in the control group. To study local abnormalities on the 3D images, we divided the whole brain into 24 segments based on anatomical classification. For each segment we assessed the extent of the abnormal region (the proportion of coordinates with a Z-value exceeding the threshold value, among all coordinates within the segment) and its severity (the average Z-value of the coordinates with a Z-value exceeding the threshold value). This method clarified the orientation and expansion of reduced accumulation by classifying stereotactic brain coordinates according to anatomical structure. The method was considered useful for quantitatively grasping distribution abnormalities in the brain and changes in abnormality distribution.
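
    The two segment-level measures defined above can be sketched directly: extent is the percentage of a segment's coordinates whose Z-value exceeds the threshold, and severity is the mean Z-value of those coordinates. The segment values below are synthetic:

```python
import numpy as np

def extent_and_severity(z_values, threshold=2.0):
    """Extent: percentage of a segment's coordinates whose Z-value
    exceeds the threshold.  Severity: mean Z-value of those
    supra-threshold coordinates (0.0 if none exceed it)."""
    z = np.asarray(z_values, dtype=float)
    above = z[z > threshold]
    extent = 100.0 * above.size / z.size
    severity = above.mean() if above.size else 0.0
    return extent, severity

# toy segment of 8 coordinates from a Z-score map
segment = [0.5, 1.2, 2.5, 3.0, 1.9, 2.2, 0.8, 4.1]
ext, sev = extent_and_severity(segment, threshold=2.0)
print(f"extent = {ext:.1f}%, severity = {sev:.2f}")
```

Computing this pair for each of the 24 anatomical segments yields the per-region profile of hypoperfusion extent and severity described in the abstract.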

  9. Downscaling SSPs in Bangladesh - Integrating Science, Modelling and Stakeholders Through Qualitative and Quantitative Scenarios

    NASA Astrophysics Data System (ADS)

    Allan, A.; Barbour, E.; Salehin, M.; Hutton, C.; Lázár, A. N.; Nicholls, R. J.; Rahman, M. M.

    2015-12-01

    A downscaled scenario development process was adopted in the context of a project seeking to understand relationships between ecosystem services and human well-being in the Ganges-Brahmaputra delta. The aim was to link the concerns and priorities of relevant stakeholders with the integrated biophysical and poverty models used in the project. A 2-stage process was used to facilitate the connection between stakeholders' concerns and available modelling capacity: the first to qualitatively describe what the future might look like in 2050; the second to translate these qualitative descriptions into the quantitative form required by the numerical models. An extended, modified SSP approach was adopted, with stakeholders downscaling issues identified through interviews as being priorities for the southwest of Bangladesh. Detailed qualitative futures were produced, before modellable elements were quantified in conjunction with an expert stakeholder cadre. Stakeholder input, using the methods adopted here, allows the top-down focus of the RCPs to be aligned with the bottom-up approach needed to make the SSPs appropriate at the more local scale, and also facilitates the translation of qualitative narrative scenarios into a quantitative form that lends itself to incorporation of biophysical and socio-economic indicators. The presentation will describe the downscaling process in detail, and conclude with findings regarding the importance of stakeholder involvement (and logistical considerations), balancing model capacity with expectations and recommendations on SSP refinement at local levels.

  10. Ultrasound Assessment of Human Meniscus.

    PubMed

    Viren, Tuomas; Honkanen, Juuso T; Danso, Elvis K; Rieppo, Lassi; Korhonen, Rami K; Töyräs, Juha

    2017-09-01

    The aim of the present study was to evaluate the applicability of ultrasound imaging to quantitative assessment of human meniscus in vitro. Meniscus samples (n = 26) were harvested from 13 knee joints of non-arthritic human cadavers. Subsequently, three locations (anterior, center and posterior) from each meniscus were imaged with two ultrasound transducers (frequencies 9 and 40 MHz), and quantitative ultrasound parameters were determined. Furthermore, partial-least-squares regression analysis was applied to the ultrasound signal to determine the relations between ultrasound scattering and meniscus integrity. Significant correlations between measured and predicted meniscus compositions and mechanical properties were obtained (R² = 0.38-0.69, p < 0.05). The relationship between conventional ultrasound parameters and integrity of the meniscus was weaker. To conclude, ultrasound imaging exhibited potential for evaluation of meniscus integrity. Higher ultrasound frequency combined with multivariate analysis of ultrasound backscattering was found to be the most sensitive for evaluation of meniscus integrity. Copyright © 2017 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.
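
    The multivariate step above can be sketched with ordinary least squares used as a simpler stand-in for partial-least-squares regression: signal-derived features predict a tissue property, and the fit is scored with R² as in the reported range. All data below are synthetic:

```python
import numpy as np

# Synthetic stand-in for the PLS analysis: linear regression relating
# backscatter-derived features to a measured tissue property, scored
# by the coefficient of determination R^2.
rng = np.random.default_rng(3)
n = 26 * 3                                  # samples x locations (toy)
X = rng.normal(size=(n, 4))                 # ultrasound-signal features
true_w = np.array([1.5, -0.8, 0.3, 0.0])    # assumed ground-truth weights
y = X @ true_w + rng.normal(0, 0.5, n)      # e.g. a mechanical property

w, *_ = np.linalg.lstsq(X, y, rcond=None)   # fit the linear model
y_hat = X @ w
ss_res = ((y - y_hat) ** 2).sum()
ss_tot = ((y - y.mean()) ** 2).sum()
r2 = 1.0 - ss_res / ss_tot
print(f"R^2 = {r2:.2f}")
```

PLS differs from this sketch in that it projects the (often collinear, high-dimensional) ultrasound spectra onto a few latent components before regressing, which is why it suits raw backscatter signals better than plain least squares.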

  11. Using Rubrics to Assess Learning in Course-Integrated Library Instruction

    ERIC Educational Resources Information Center

    Gariepy, Laura W.; Stout, Jennifer A.; Hodge, Megan L.

    2016-01-01

    Librarians face numerous challenges when designing effective, sustainable methods for assessing student learning outcomes in one-shot, course-integrated library instruction sessions. We explore the use of rubrics to programmatically assess authentic learning exercises completed in one-shot library sessions for a large, required sophomore-level…

  12. Beyond Punnett Squares: Student Word Association and Explanations of Phenotypic Variation through an Integrative Quantitative Genetics Unit Investigating Anthocyanin Inheritance and Expression in Brassica rapa Fast Plants

    PubMed Central

    Smith, Amber R.; Williams, Paul H.; McGee, Seth A.; Dósa, Katalin; Pfammatter, Jesse

    2014-01-01

    Genetics instruction in introductory biology is often confined to Mendelian genetics and avoids the complexities of variation in quantitative traits. Given the driving question “What determines variation in phenotype (Pv)? (Pv=Genotypic variation Gv + environmental variation Ev),” we developed a 4-wk unit for an inquiry-based laboratory course focused on the inheritance and expression of a quantitative trait in varying environments. We utilized Brassica rapa Fast Plants as a model organism to study variation in the phenotype anthocyanin pigment intensity. As an initial curriculum assessment, we used free word association to examine students’ cognitive structures before and after the unit and explanations in students’ final research posters with particular focus on variation (Pv = Gv + Ev). Comparison of pre- and postunit word frequency revealed a shift in words and a pattern of co-occurring concepts indicative of change in cognitive structure, with particular focus on “variation” as a proposed threshold concept and primary goal for students’ explanations. Given review of 53 posters, we found ∼50% of students capable of intermediate to high-level explanations combining both Gv and Ev influence on expression of anthocyanin intensity (Pv). While far from “plug and play,” this conceptually rich, inquiry-based unit holds promise for effective integration of quantitative and Mendelian genetics. PMID:25185225
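
    The driving question's decomposition Pv = Gv + Ev can be illustrated with a quick simulation: for independent genotypic and environmental effects, the phenotypic variance is the sum of the two component variances. A sketch with assumed variances Gv = 4 and Ev = 1:

```python
import numpy as np

# Illustrative simulation of Pv = Gv + Ev: each individual's phenotype
# is a genotypic effect plus an independent environmental effect.
rng = np.random.default_rng(42)
n = 100_000
genotype_effect = rng.normal(0.0, 2.0, n)      # variance Gv = 4
environment_effect = rng.normal(0.0, 1.0, n)   # variance Ev = 1
phenotype = genotype_effect + environment_effect

print(round(float(np.var(phenotype)), 2))      # close to Gv + Ev = 5
```

Varying the environmental spread while holding genotype fixed mirrors the unit's design of growing the same Fast Plants genotypes under different conditions.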

  13. Evaluating variability and uncertainty separately in microbial quantitative risk assessment using two R packages.

    PubMed

    Pouillot, Régis; Delignette-Muller, Marie Laure

    2010-09-01

    Quantitative risk assessment has emerged as a valuable tool to enhance the scientific basis of regulatory decisions in the food safety domain. This article introduces the use of two new computing resources (R packages) specifically developed to help risk assessors in their projects. The first package, "fitdistrplus", gathers tools for choosing and fitting a parametric univariate distribution to a given dataset. The data may be continuous or discrete. Continuous data may be right-, left- or interval-censored, as is frequently obtained with analytical methods, with the possibility of various censoring thresholds within the dataset. Bootstrap procedures then allow the assessor to evaluate and model the uncertainty around the parameters and to transfer this information into a quantitative risk assessment model. The second package, "mc2d", helps to build and study two-dimensional (or second-order) Monte Carlo simulations in which the estimation of variability and uncertainty in the risk estimates is separated. This package easily allows the transfer of separated variability and uncertainty along a chain of conditional mathematical and probabilistic models. The usefulness of these packages is illustrated through a risk assessment of hemolytic uremic syndrome in children linked to the presence of Escherichia coli O157:H7 in ground beef. These R packages are freely available at the Comprehensive R Archive Network (cran.r-project.org). Copyright 2010 Elsevier B.V. All rights reserved.
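
    The two-dimensional Monte Carlo idea behind mc2d can be sketched in a few lines: an outer loop samples *uncertain* parameters, an inner loop samples *variability* given those parameters, and uncertainty intervals are taken over the outer dimension. The distributions and the exponential dose-response model below are illustrative assumptions, not the mc2d API or the article's E. coli model:

```python
import numpy as np

# Second-order (two-dimensional) Monte Carlo sketch: rows index
# uncertainty draws, columns index variability draws.
rng = np.random.default_rng(7)
n_unc, n_var = 200, 1000

risks = np.empty((n_unc, n_var))
for i in range(n_unc):
    # uncertainty: the true mean log10 dose is imperfectly known
    mu = rng.normal(-1.0, 0.3)
    # variability: dose differs from serving to serving
    log_dose = rng.normal(mu, 0.5, n_var)
    dose = 10.0 ** log_dose
    r = 0.01  # assumed dose-response parameter (exponential model)
    risks[i] = 1.0 - np.exp(-r * dose)

mean_risk = risks.mean(axis=1)   # one population-mean risk per uncertainty draw
lo, hi = np.percentile(mean_risk, [2.5, 97.5])
print(f"mean risk, 95% uncertainty interval: [{lo:.4f}, {hi:.4f}]")
```

Keeping the two dimensions separate is the point: the spread across columns reflects real heterogeneity that no amount of data removes, while the spread across rows shrinks as parameter knowledge improves.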

  14. Should different impact assessment instruments be integrated? Evidence from English spatial planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tajima, Ryo, E-mail: tajima.ryo@nies.go.jp; Fischer, Thomas B., E-mail: fischer@liverpool.ac.uk

    This paper aims to provide empirical evidence on the question of whether the integration of different instruments achieves its aim of supporting sustainable decision making, focusing on SEA-inclusive sustainability appraisal (SA) and other impact assessments (IAs) currently used in English spatial planning. Usage of IAs in addition to SA is established, and an analysis of the integration approach (in terms of process, output, and assessor) as well as its effectiveness is conducted. It is found that while integration enhances effectiveness to some extent, too much integration, especially in terms of the procedural element, appears to diminish the overall effectiveness of each IA in influencing decisions, as they become captured by the balancing function of SA. Highlights: The usage of different impact assessments in English spatial planning is clarified. The relationship between integration approach and effectiveness is analyzed. Results suggest that integration does not necessarily lead to more sustainable decisions. Careful consideration is recommended upon process integration.

  15. Introduction of an automated user-independent quantitative volumetric magnetic resonance imaging breast density measurement system using the Dixon sequence: comparison with mammographic breast density assessment.

    PubMed

    Wengert, Georg Johannes; Helbich, Thomas H; Vogl, Wolf-Dieter; Baltzer, Pascal; Langs, Georg; Weber, Michael; Bogner, Wolfgang; Gruber, Stephan; Trattnig, Siegfried; Pinker, Katja

    2015-02-01

    The purposes of this study were to introduce and assess an automated user-independent quantitative volumetric (AUQV) breast density (BD) measurement system on the basis of magnetic resonance imaging (MRI) using the Dixon technique, as well as to compare it with qualitative and quantitative mammographic (MG) BD measurements. Forty-three women with normal mammogram results (Breast Imaging Reporting and Data System 1) were included in this institutional review board-approved prospective study. All participants underwent BD assessment with MRI using a Dixon-technique sequence (repetition time/echo times, 6/2.45/2.67 milliseconds; 1-mm isotropic; acquisition time, 3 minutes 38 seconds). To test reproducibility, a second MRI after patient repositioning was performed. The AUQV magnetic resonance (MR) BD measurement system automatically calculated percentage (%) BD. The qualitative BD assessment was performed using the American College of Radiology Breast Imaging Reporting and Data System BD categories. Quantitative BD was estimated semiautomatically using the thresholding technique Cumulus4. Appropriate statistical tests were used to assess the agreement between the AUQV MR measurements and to compare them with qualitative and quantitative MG BD estimations. The AUQV MR BD measurements were successfully performed in all 43 women. There was a nearly perfect agreement of AUQV MR BD measurements between the 2 MR examinations for % BD (P < 0.001; intraclass correlation coefficient, 0.998) with no significant differences (P = 0.384). The AUQV MR BD measurements were significantly lower than quantitative and qualitative MG BD assessments (P < 0.001). The AUQV MR BD measurement system allows a fully automated, user-independent, robust, reproducible, as well as radiation- and compression-free volumetric quantitative BD assessment through different levels of BD.

  16. Assessing Student Status and Progress in Science Reasoning and Quantitative Literacy at a Very Large Undergraduate Institution

    NASA Astrophysics Data System (ADS)

    Donahue, Megan; Kaplan, J.; Ebert-May, D.; Ording, G.; Melfi, V.; Gilliland, D.; Sikorski, A.; Johnson, N.

    2009-01-01

    The typical large liberal-arts, tier-one research university requires all of its graduates to achieve some minimal standard of quantitative literacy and scientific reasoning skills. But how do we know that what we are doing, as instructors and as a university, is working the way we think it should? At Michigan State University, a cross-disciplinary team of scientists, statisticians, and teacher-education experts has begun a large-scale investigation of student mastery of quantitative and scientific skills, beginning with an assessment of 3,000 freshmen before they start their university careers. We will describe the process we used for developing and testing an instrument and for expanding faculty involvement and input on high-level goals. In this limited presentation, we will confine the discussion mainly to the scientific reasoning perspective, but we will briefly mention some intriguing observations regarding quantitative literacy as well. This project represents the beginning of long-term, longitudinal tracking of the progress of students at our institution. We will discuss preliminary results from our 2008 assessment of incoming freshmen at Michigan State, and where we plan to go from here. We acknowledge local support from the Quality Fund from the Office of the Provost at MSU. We also acknowledge the Center for Assessment at James Madison University and the NSF for their support at the very beginning of our work.

  17. Co-Teaching in Middle School Classrooms: Quantitative Comparative Study of Special Education Student Assessment Performance

    ERIC Educational Resources Information Center

    Reese, De'borah Reese

    2017-01-01

    The purpose of this quantitative comparative study was to determine the existence or nonexistence of performance pass rate differences of special education middle school students on standardized assessments between pre and post co-teaching eras disaggregated by subject area and school. Co-teaching has altered classroom environments in many ways.…

  18. Using models in Integrated Ecosystem Assessment of coastal areas

    NASA Astrophysics Data System (ADS)

    Solidoro, Cosimo; Bandelj, Vinko; Cossarini, Gianpiero; Melaku Canu, Donata; Libralato, Simone

    2014-05-01

    Numerical models can greatly contribute to integrated ecological assessment of coastal and marine systems. Indeed, models can: i) assist in the identification of efficient sampling strategies; ii) provide spatial interpolation and temporal extrapolation of experimental data, based on the knowledge of process dynamics and causal relationships coded within the model; iii) provide estimates of hard-to-measure indicators. Furthermore, models can provide indications of the potential effects of implementing alternative management policies. Finally, by providing a synthetic representation of an ideal system based on its essential dynamics, a model returns a picture of the ideal behaviour of a system in the absence of external perturbation, alteration and noise, which might help in the identification of a reference behaviour. As an important example, model-based reanalyses of biogeochemical and ecological properties are an urgent need for the estimation of environmental status and the assessment of the efficacy of conservation and environmental policies, also with reference to the enforcement of the European MSFD. However, the use of numerical models, and particularly of ecological models, in environmental management is still far from being the rule, possibly because the benefits that a full integration of modeling and monitoring systems might provide are not appreciated, possibly because of a lack of trust in modeling results, or because many problems still exist in the development, validation and implementation of models. For instance, assessing the validity of model results is a complex process that requires the definition of appropriate indicators, metrics and methodologies, and faces the scarcity of real-time in-situ biogeochemical data. Furthermore, biogeochemical models typically consider dozens of variables, which are heavily undersampled. Here we show how the integration of mathematical models and monitoring data can support integrated ecosystem assessment.

  19. PIQMIe: a web server for semi-quantitative proteomics data management and analysis.

    PubMed

    Kuzniar, Arnold; Kanaar, Roland

    2014-07-01

    We present the Proteomics Identifications and Quantitations Data Management and Integration Service, or PIQMIe, which aids in reliable and scalable data management, analysis and visualization of semi-quantitative mass spectrometry based proteomics experiments. PIQMIe readily integrates peptide and (non-redundant) protein identifications and quantitations from multiple experiments with additional biological information on the protein entries, and makes the linked data available in the form of a lightweight relational database, which enables dedicated data analyses (e.g. in R) and user-driven queries. Using the web interface, users are presented with a concise summary of their proteomics experiments in numerical and graphical forms, as well as with a searchable protein grid and interactive visualization tools to aid in the rapid assessment of the experiments and in the identification of proteins of interest. The web server not only provides data access through a web interface but also supports programmatic access through a RESTful web service. The web server is available at http://piqmie.semiqprot-emc.cloudlet.sara.nl or http://www.bioinformatics.nl/piqmie. This website is free and open to all users and there is no login requirement. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.

  20. Integrating Academic and Clinical Learning Using a Clinical Swallowing Assessment

    ERIC Educational Resources Information Center

    Phillips, Daniel E.

    2013-01-01

    This article describes an experiential learning activity designed to integrate classroom knowledge and a clinical swallowing assessment. Twenty master's-level graduate students in a dysphagia course conducted a clinical swallowing assessment with a resident of an independent retirement community. The exercise was designed to allow students an…

  1. Formative Assessment Design for PDA Integrated Ecology Observation

    ERIC Educational Resources Information Center

    Hung, Pi-Hsia; Lin, Yu-Fen; Hwang, Gwo-Jen

    2010-01-01

    Ubiquitous computing and mobile technologies provide a new perspective for designing innovative outdoor learning experiences. The purpose of this study is to propose a formative assessment design for integrating PDAs into ecology observations. Three learning activities were conducted in this study. An action research approach was applied to…

  2. Elucidating dynamic metabolic physiology through network integration of quantitative time-course metabolomics

    DOE PAGES

    Bordbar, Aarash; Yurkovich, James T.; Paglia, Giuseppe; ...

    2017-04-07

    In this study, the increasing availability of metabolomics data necessitates novel methods for deeper data analysis and interpretation. We present a flux balance analysis method that allows for the computation of dynamic intracellular metabolic changes at the cellular scale through integration of time-course absolute quantitative metabolomics. This approach, termed “unsteady-state flux balance analysis” (uFBA), is applied to four cellular systems: three dynamic and one steady-state as a negative control. uFBA and FBA predictions are contrasted, and uFBA is found to be more accurate in predicting dynamic metabolic flux states for red blood cells, platelets, and Saccharomyces cerevisiae. Notably, only uFBA predicts that stored red blood cells metabolize TCA intermediates to regenerate important cofactors, such as ATP, NADH, and NADPH. These pathway usage predictions were subsequently validated through 13C isotopic labeling and metabolic flux analysis in stored red blood cells. Utilizing time-course metabolomics data, uFBA provides an accurate method to predict metabolic physiology at the cellular scale for dynamic systems.
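
    Conventional FBA, which uFBA extends, solves a linear program maximizing a target flux subject to steady-state mass balance S·v = 0 and flux bounds. A toy three-reaction sketch of that baseline (the network is hypothetical; uFBA, per the abstract, instead allows measured metabolite accumulation/depletion rates on the right-hand side):

```python
import numpy as np
from scipy.optimize import linprog

# Toy flux balance analysis: maximize biomass flux v3 subject to
# steady-state mass balance S @ v = 0.
# Reactions: v1: uptake -> A, v2: A -> B, v3: B -> biomass.
S = np.array([
    [1, -1,  0],   # metabolite A balance
    [0,  1, -1],   # metabolite B balance
])
c = [0, 0, -1]                       # linprog minimizes, so negate v3
bounds = [(0, 10), (0, None), (0, None)]  # uptake capped at 10
res = linprog(c, A_eq=S, b_eq=[0, 0], bounds=bounds)
print(res.x)   # optimal flux distribution
```

Replacing the zeros in `b_eq` with measured time-course derivatives of metabolite concentrations is, in essence, how an unsteady-state formulation relaxes the steady-state constraint.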

  3. Elucidating dynamic metabolic physiology through network integration of quantitative time-course metabolomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bordbar, Aarash; Yurkovich, James T.; Paglia, Giuseppe

    In this study, the increasing availability of metabolomics data necessitates novel methods for deeper data analysis and interpretation. We present a flux balance analysis method that allows for the computation of dynamic intracellular metabolic changes at the cellular scale through integration of time-course absolute quantitative metabolomics. This approach, termed “unsteady-state flux balance analysis” (uFBA), is applied to four cellular systems: three dynamic and one steady-state as a negative control. uFBA and FBA predictions are contrasted, and uFBA is found to be more accurate in predicting dynamic metabolic flux states for red blood cells, platelets, and Saccharomyces cerevisiae. Notably, only uFBA predicts that stored red blood cells metabolize TCA intermediates to regenerate important cofactors, such as ATP, NADH, and NADPH. These pathway usage predictions were subsequently validated through 13C isotopic labeling and metabolic flux analysis in stored red blood cells. Utilizing time-course metabolomics data, uFBA provides an accurate method to predict metabolic physiology at the cellular scale for dynamic systems.

  4. Opportunities and challenges of integrating ecological restoration into assessment and management of contaminated ecosystems.

    PubMed

    Hull, Ruth N; Luoma, Samuel N; Bayne, Bruce A; Iliff, John; Larkin, Daniel J; Paschke, Mark W; Victor, Sasha L; Ward, Sara E

    2016-04-01

    Ecosystem restoration planning near the beginning of the site assessment and management process ("early integration") involves consideration of restoration goals from the outset in developing solutions for contaminated ecosystems. There are limitations to integration that stem from institutional barriers, few successful precedents, and limited availability of guidance. Challenges occur in integrating expertise from various disciplines and multiple, sometimes divergent interests and goals. The more complex process can result in timing, capacity, communication, and collaboration challenges. On the other hand, integrating the 2 approaches presents new and creative opportunities. For example, integration allows early planning for expanding ecosystem services on or near contaminated lands or waters that might otherwise have been unaddressed by remediation alone. Integrated plans can explicitly pursue ecosystem services that have market value, which can add to funds for long-term monitoring and management. Early integration presents opportunities for improved and productive collaboration and coordination between ecosystem restoration and contaminant assessment and management. Examples exist where early integration facilitates liability resolution and generates positive public relations. Restoration planning and implementation before the completion of the contaminated site assessment, remediation, or management process ("early restoration") can facilitate coordination with offsite restoration options and a regional approach to restoration of contaminated environments. Integration of performance monitoring, for both remedial and restoration actions, can save resources and expand the interpretive power of results. Early integration may aid experimentation, which may be more feasible on contaminated lands than in many other situations. 
The potential application of concepts and tools from adaptive management is discussed as a way of avoiding pitfalls and achieving benefits in

  5. Assessment of Intervertebral Disc Degeneration Based on Quantitative MRI Analysis: an in vivo study

    PubMed Central

    Grunert, Peter; Hudson, Katherine D.; Macielak, Michael R.; Aronowitz, Eric; Borde, Brandon H.; Alimi, Marjan; Njoku, Innocent; Ballon, Douglas; Tsiouris, Apostolos John; Bonassar, Lawrence J.; Härtl, Roger

    2015-01-01

    Study design: Animal experimental study. Objective: To evaluate a novel quantitative imaging technique for assessing disc degeneration. Summary of Background Data: T2-relaxation time (T2-RT) measurements have been used to quantitatively assess disc degeneration. T2 values correlate with the water content of intervertebral disc tissue and thereby allow for the indirect measurement of nucleus pulposus (NP) hydration. Methods: We developed an algorithm to filter out MRI voxels not representing NP tissue based on T2-RT values. Filtered NP voxels were used to measure nuclear size by their count and nuclear hydration by their mean T2-RT. This technique was applied to 24 rat-tail intervertebral discs (IVDs), which had been punctured with an 18-gauge needle according to different techniques to induce varying degrees of degeneration. NP voxel count and average T2-RT were used as parameters to assess the degeneration process at 1 and 3 months post puncture. NP voxel counts were evaluated against X-ray disc height measurements and qualitative MRI studies based on the Pfirrmann grading system. Tails were collected for histology to correlate NP voxel counts to histological disc degeneration grades and to NP cross-sectional area measurements. Results: NP voxel count measurements showed strong correlations with qualitative MRI analyses (R2=0.79, p<0.0001), histological degeneration grades (R2=0.902, p<0.0001) and histological NP cross-sectional area measurements (R2=0.887, p<0.0001). In contrast to NP voxel counts, the mean T2-RT for each punctured group remained constant between months 1 and 3. The mean T2-RTs for the punctured groups did not differ significantly from those of healthy IVDs (63.55 ms ± 5.88 ms at month 1 and 62.61 ms ± 5.02 ms at month 3) at either time point. Conclusion: The NP voxel count proved to be a valid parameter to quantitatively assess disc degeneration in a needle puncture model. The mean NP T2-RT does not change significantly in needle
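    The voxel-filtering step described in this record can be sketched as a simple threshold-and-summarize pass over a T2 map. All values, array sizes, and the NP cutoff window below are illustrative assumptions, not the study's calibrated parameters:

```python
import numpy as np

# Hypothetical T2 relaxation-time map (ms) for one disc slice.
rng = np.random.default_rng(0)
t2_map = rng.uniform(20, 90, size=(32, 32))

# Assumed NP window: voxels whose T2-RT falls in a nucleus-like range are kept.
lo, hi = 55.0, 90.0
np_mask = (t2_map >= lo) & (t2_map <= hi)

np_voxel_count = int(np_mask.sum())      # proxy for nuclear size
mean_t2 = float(t2_map[np_mask].mean())  # proxy for nuclear hydration
print(np_voxel_count, round(mean_t2, 2))
```

    The two summary numbers mirror the paper's parameters: voxel count tracks degeneration, while mean T2-RT of the retained voxels estimates hydration.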

  6. Participatory quantitative health impact assessment of urban and transport planning in cities: A review and research needs.

    PubMed

    Nieuwenhuijsen, Mark J; Khreis, Haneen; Verlinghieri, Ersilia; Mueller, Natalie; Rojas-Rueda, David

    2017-06-01

    Urban and transport planning have large impacts on public health, but these are generally not explicitly considered and/or quantified, partly because no comprehensive models, methods and tools are readily available. Air pollution, noise, temperature, green space, motor vehicle crashes and physical activity are important pathways linking urban and transport planning to public health. For policy decision-making, it is important to understand and be able to quantify the full chain from source through pathways to health effects and impacts, to substantiate and effectively target actions. In this paper, we aim to provide an overview of recent studies on the health impacts related to urban and transport planning in cities, describe the need for novel participatory quantitative health impact assessments (HIA) and provide recommendations. To devise our searches and narrative, we were guided by a recent conceptual framework linking urban and transport planning, environmental exposures, behaviour and health. We searched PubMed, Web of Science, Science Direct, and references from relevant articles in the English language from January 1, 1980, to November 1, 2016, using pre-defined search terms. The number of HIA studies is increasing rapidly, but there is a lack of participatory, integrated and full-chain HIA models, methods and tools. These should be based on a systemic multidisciplinary/multisectoral approach and state-of-the-art methods to address questions such as: what are the best, most feasible and most needed urban and transport planning policy measures to improve public health in cities? Active citizen support, new forms of communication between experts and citizens, and the involvement of all major stakeholders are crucial to find and successfully implement health-promoting policy measures. We provide an overview of the current state of the art of HIA in cities and make recommendations for further work. The process of how to get there is as important and
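    The full-chain quantification described in this record typically ends in standard comparative risk assessment arithmetic: an exposure change is converted to a relative risk, then to an attributable fraction and attributable cases. A minimal sketch with hypothetical inputs (the relative risk, exposure change, and baseline deaths below are invented for illustration, not taken from the review):

```python
import math

# Hypothetical scenario: a 10 µg/m³ drop in PM2.5 from a planning measure,
# with an assumed RR of 1.07 per 10 µg/m³ for all-cause mortality.
rr_per_10 = 1.07
delta = 10.0  # µg/m³ reduction
rr = math.exp(math.log(rr_per_10) * delta / 10.0)

# Population attributable fraction, and preventable deaths in a city with
# an assumed 5,000 baseline deaths per year among the exposed population.
paf = (rr - 1.0) / rr
preventable = paf * 5000
print(round(paf, 4), round(preventable, 1))
```

    In a participatory HIA, inputs like the exposure change and the affected population would come from stakeholder-agreed scenarios rather than fixed constants.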

  7. INCORPORATING CATASTROPHES INTO INTEGRATED ASSESSMENT: SCIENCE, IMPACTS, AND ADAPTATION

    EPA Science Inventory

    Incorporating potential catastrophic consequences into integrated assessment models of climate change has been a top priority of policymakers and modelers alike. We review the current state of scientific understanding regarding three frequently mentioned geophysical catastrophes,...

  8. A multi-disciplinary approach for the integrated assessment of water alterations under climate change

    NASA Astrophysics Data System (ADS)

    Sperotto, Anna; Torresan, Silvia; Molina, Jose Luis; Pulido Velazquez, Manuel; Critto, Andrea; Marcomini, Antonio

    2017-04-01

    Understanding the co-evolution and interrelations between natural and human pressures on water systems is required to ensure sustainable management of resources under uncertain climate change conditions. Multi-disciplinary research is therefore necessary to consider the multiplicity of stressors affecting water resources, take into account alternative perspectives (i.e. social, economic and environmental objectives and priorities) and deal with the uncertainty that characterizes climate change scenarios. However, approaches commonly adopted in water quality assessment are predominantly mono-disciplinary and single-stressor oriented, and apply concepts and models specific to particular academic disciplines (e.g. physics, hydrology, ecology, sociology, economics) which seldom shed their conceptual blinders, failing to provide truly integrated results. In this context, the paper discusses the benefits and limits of adopting a multi-disciplinary approach in which different knowledge domains collaborate and quantitative and qualitative information, coming from multiple conceptual and model-based research, is integrated in a harmonious manner. Specifically, Bayesian Networks are used as a meta-modelling tool for structuring and combining the probabilistic information available in existing hydrological models, climate change and land use projections, historical observations and expert opinion. The developed network makes it possible to perform a stochastic multi-risk assessment considering the interplay between climate (i.e. irregularities in water regime) and land use changes (i.e. agriculture, urbanization) and their cascading impacts on water quality parameters (i.e. nutrient loadings). The main objective of the model is the development of multi-risk scenarios to assess and communicate the probability of not meeting a "Good chemical water status" over future timeframes, taking into account projected climatic and non-climatic conditions.
The outcomes are finally used to identify
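    The Bayesian-network meta-model described in this record can be illustrated with a toy two-parent node evaluated by direct enumeration. All probabilities below are invented for illustration; the real network combines hydrological models, projections, observations and expert opinion:

```python
# Marginal probability of failing "Good chemical water status", summed over
# two hypothetical parent nodes: climate irregularity and land-use change.
p_climate = {"irregular": 0.4, "regular": 0.6}
p_landuse = {"intensified": 0.3, "stable": 0.7}

# Conditional probability table (illustrative values only).
p_fail = {
    ("irregular", "intensified"): 0.70,
    ("irregular", "stable"): 0.45,
    ("regular", "intensified"): 0.35,
    ("regular", "stable"): 0.10,
}

p = sum(p_climate[c] * p_landuse[l] * p_fail[(c, l)]
        for c in p_climate for l in p_landuse)
print(round(p, 3))  # prints 0.315
```

    Scenario analysis then amounts to conditioning: fixing a parent (e.g. climate = "irregular") and recomputing the sum over the remaining node.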

  9. "HIP" new software: The Hydroecological Integrity Assessment Process

    USGS Publications Warehouse

    Henriksen, Jim; Wilson, Juliette T.

    2006-01-01

    Center (FORT) have developed the Hydroecological Integrity Assessment Process (HIP) and a suite of software tools for conducting a hydrologic classification of streams, addressing instream flow needs, and assessing past and proposed hydrologic alterations on streamflow and other ecosystem components. The HIP recognizes that streamflow is strongly related to many critical physiochemical components of rivers, such as dissolved oxygen, channel geomorphology, and habitats. Streamflow is considered a “master variable” that limits the distribution, abundance, and diversity of many aquatic plant and animal species.

  10. Protocol for Standardizing High-to-Moderate Abundance Protein Biomarker Assessments Through an MRM-with-Standard-Peptides Quantitative Approach.

    PubMed

    Percy, Andrew J; Yang, Juncong; Chambers, Andrew G; Mohammed, Yassene; Miliotis, Tasso; Borchers, Christoph H

    2016-01-01

    Quantitative mass spectrometry (MS)-based approaches are emerging as a core technology for addressing health-related queries in systems biology and in the biomedical and clinical fields. In several 'omics disciplines (proteomics included), an approach centered on selected or multiple reaction monitoring (SRM or MRM)-MS with stable isotope-labeled standards (SIS), at the protein or peptide level, has emerged as the most precise technique for quantifying and screening putative analytes in biological samples. To enable the widespread use of MRM-based protein quantitation for disease biomarker assessment studies and its ultimate acceptance for clinical analysis, the technique must be standardized to facilitate precise and accurate protein quantitation. To that end, we have developed a number of kits for assessing method/platform performance, as well as for screening proposed candidate protein biomarkers in various human biofluids. Collectively, these kits utilize a bottom-up LC-MS methodology with SIS peptides as internal standards and quantify proteins using regression analysis of standard curves. This chapter details the methodology used to quantify 192 plasma proteins of high-to-moderate abundance (covering a 6-order-of-magnitude range, from 31 mg/mL for albumin to 18 ng/mL for peroxiredoxin-2), and a 21-protein subset thereof. We also describe the application of this method to patient samples for biomarker discovery and verification studies. Additionally, we introduce our recently developed Qualis-SIS software, which is used to expedite the analysis and assessment of protein quantitation data in control and patient samples.
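    The regression-of-standard-curves step in this record can be sketched as an ordinary least-squares calibration: known standard concentrations against measured analyte/SIS peak-area ratios, with unknowns back-calculated from the fitted line. The concentrations and area ratios below are made up for illustration:

```python
import numpy as np

# Hypothetical standard curve: spiked standard concentrations (fmol/µL)
# vs. measured endogenous/SIS peak-area ratios.
conc = np.array([0.1, 0.5, 1.0, 5.0, 10.0])
area_ratio = np.array([0.012, 0.051, 0.098, 0.49, 1.01])

slope, intercept = np.polyfit(conc, area_ratio, 1)

# Back-calculate an unknown sample from its measured area ratio.
measured = 0.25
estimated_conc = (measured - intercept) / slope
print(round(estimated_conc, 2))
```

    In practice each of the target peptides gets its own curve, and quality checks (linearity, accuracy at each level) gate whether the curve is accepted.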

  11. Quantitative risk assessment of durable glass fibers.

    PubMed

    Fayerweather, William E; Eastes, Walter; Cereghini, Francesco; Hadley, John G

    2002-06-01

    This article presents a quantitative risk assessment for the theoretical lifetime cancer risk from the manufacture and use of relatively durable synthetic glass fibers. More specifically, we estimate levels of exposure to respirable fibers or fiberlike structures of E-glass and C-glass that, assuming a working lifetime exposure, pose a theoretical lifetime cancer risk of not more than 1 per 100,000. For comparability with other risk assessments we define these levels as nonsignificant exposures. Nonsignificant exposure levels are estimated from (a) the Institute of Occupational Medicine (IOM) chronic rat inhalation bioassay of durable E-glass microfibers, and (b) the Research Consulting Company (RCC) chronic inhalation bioassay of durable refractory ceramic fibers (RCF). Best estimates of nonsignificant E-glass exposure exceed 0.05-0.13 fibers (or shards) per cubic centimeter (cm3) when calculated from the multistage nonthreshold model. Best estimates of nonsignificant C-glass exposure exceed 0.27-0.6 fibers/cm3. Estimates of nonsignificant exposure increase markedly for E- and C-glass when non-linear models are applied and rapidly exceed 1 fiber/cm3. Controlling durable fiber exposures to an 8-h time-weighted average of 0.05 fibers/cm3 will assure that the additional theoretical lifetime risk from working lifetime exposures to these durable fibers or shards is kept below the 1 per 100,000 level. Measured airborne exposures to respirable, durable glass fibers (or shards) in glass fiber manufacturing and fabrication operations were compared with the nonsignificant exposure estimates described. 
Sampling results for B-sized respirable E-glass fibers at facilities that manufacture or fabricate small-diameter continuous-filament products, from those that manufacture respirable E-glass shards from PERG (process to efficiently recycle glass), from milled fiber operations, and from respirable C-glass shards from Flakeglass operations indicate very low median exposures of 0
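    For the linear (one-stage) form of the multistage nonthreshold model mentioned in this record, the exposure corresponding to a target lifetime risk follows directly from risk(d) = 1 − exp(−q₁·d). The slope q₁ below is a hypothetical placeholder, not the fitted IOM or RCC bioassay value:

```python
import math

# One-stage multistage model sketch: risk(d) = 1 - exp(-q1 * d).
q1 = 2.0e-4         # assumed extra lifetime risk per fiber/cm^3 (illustrative)
target_risk = 1e-5  # the "nonsignificant" benchmark used in the article

# Invert the model to find the exposure giving the target risk.
d = -math.log(1.0 - target_risk) / q1
print(round(d, 3))  # prints 0.05
```

    At risks this small the inversion is effectively linear (d ≈ target_risk / q1), which is why the linearized form is standard in low-dose extrapolation.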

  12. Ultrasonic Structural Health Monitoring to Assess the Integrity of Spinal Growing Rods In Vitro.

    PubMed

    Oetgen, Matthew E; Goodley, Addison; Yoo, Byungseok; Pines, Darryll J; Hsieh, Adam H

    2016-01-01

    Rod fracture is a common complication of growing rods and can result in loss of correction, patient discomfort, and unplanned revision surgery. The ability to quantitate rod integrity at each lengthening would be advantageous in avoiding this complication. We investigate the feasibility of applying structural health monitoring to evaluate the integrity of growing rods in vitro. Single-rod titanium 4.5-mm growing rod constructs (n = 9), each with one screw proximally and one distally connected by in-line connectors, were assembled with pedicle screws fixed in polyethylene blocks. Proximal and distal ends were loaded and constructs subjected to cyclic axial compression (0-100 N at 1 Hz), with the maximum compressive load increased by 10 N every 9k cycles until failure. Four piezoceramic transducers (PZTs) were mounted along the length of the constructs to interrogate the integrity of the rods with an ultrasonic, guided Lamb wave approach. Every 9k cycles, an 80 V excitatory voltage was applied to a PZT to generate high-frequency vibrations, which, after propagating through the construct, were detected by the remaining PZTs. Amplitude differences between pre- and postload waveform signals were calculated until rod failure. Average construct lifetime was 88,991 ± 13,398 cycles. All constructs failed due to rod fracture within 21 mm (mean = 15 ± 4.5 mm) of a screw or connector. Amplitude differences between pre- and postload signals increased in a stepwise fashion as constructs were cycled. Compared to baseline, we found a 1.8 ± 0.6-fold increase in amplitude 18k cycles before failure, a 2.2 ± 1.0-fold increase 9k cycles before failure, and a 2.75 ± 1.5-fold increase immediately before rod fracture. We describe a potential method for assessing the structural integrity of growing rods using ultrasonic structural health monitoring. These preliminary data demonstrate the ability of periodic rod assessment to detect structural changes in cycled growing

  13. Toward a procedure for integrating moral issues in health technology assessment.

    PubMed

    Hofmann, Bjørn

    2005-01-01

    Although ethics has been on the agenda in health technology assessment (HTA) since its inception, the integration of moral issues is still not standard and is performed in a wide variety of ways. There is therefore a need for a procedure for integrating moral issues in HTA. The approach is based on a literature review of existing approaches together with application of various theories in moral philosophy and axiology. The article develops a set of questions that addresses a wide range of moral issues related to the assessment and implementation of health technology. The issues include general moral issues and moral issues related to stakeholders, methodology, characteristics of the technology, and the HTA process itself. The questions form a kind of checklist for use in HTAs. The presented approach for integrating moral issues in HTA has a broad theoretical foundation and has been shown to be useful in practice. Integrating ethical issues in HTAs can be of great importance with respect to the dissemination of HTA results and efficient health policy making.

  14. Needs Assessments: An Integrated Assignment in Civic Service

    ERIC Educational Resources Information Center

    Norris, Debra S.; Schwartz, Charles L.

    2009-01-01

    An undergraduate social work program developed a service-learning experience in partnership with a local United Way organization to complete a community needs assessment project. The experience integrated the curricula of a social work research methods course and a generalist-macro practice course with the principles and actions of experiential…

  15. Dietary intake assessment using integrated sensors and software

    NASA Astrophysics Data System (ADS)

    Shang, Junqing; Pepin, Eric; Johnson, Eric; Hazel, David; Teredesai, Ankur; Kristal, Alan; Mamishev, Alexander

    2012-02-01

    The area of dietary assessment is becoming increasingly important as obesity rates soar, but valid measurement of the food intake in free-living persons is extraordinarily challenging. Traditional paper-based dietary assessment methods have limitations due to bias, user burden and cost, and therefore improved methods are needed to address important hypotheses related to diet and health. In this paper, we will describe the progress of our mobile Diet Data Recorder System (DDRS), where an electronic device is used for objective measurement of dietary intake in real time and at moderate cost. The DDRS consists of (1) a mobile device that integrates a smartphone and an integrated laser package, (2) software on the smartphone for data collection and laser control, (3) an algorithm to process acquired data for food volume estimation, which is the largest source of error in calculating dietary intake, and (4) a database and interface for data storage and management. The estimated food volume, together with direct entries of food questionnaires and voice recordings, could provide dietitians and nutritional epidemiologists with more complete food descriptions and more accurate food portion sizes. In this paper, we will describe the system design of DDRS and initial results of dietary assessment.

  16. Pain assessment scales in newborns: integrative review

    PubMed Central

    de Melo, Gleicia Martins; Lélis, Ana Luíza Paula de Aguiar; de Moura, Alline Falconieri; Cardoso, Maria Vera Lúcia Moreira Leitão; da Silva, Viviane Martins

    2014-01-01

    OBJECTIVE: To analyze studies on methods used to assess pain in newborns. DATA SOURCES: Integrative review of articles published from 2001 to 2012, carried out in the following databases: Scopus, PubMed, CINAHL, LILACS and Cochrane. The sample consisted of 13 articles with level of evidence 5. DATA SYNTHESIS: 29 pain assessment scales for newborns, 13 one-dimensional and 16 multidimensional, assessing acute and prolonged pain in preterm and full-term infants, were available in scientific publications. CONCLUSION: Based on the characteristics of the scales, no single one can be chosen as the most appropriate, as this choice will depend on gestational age, the type of painful stimulus and the setting in which the infant is placed. The use of multidimensional or one-dimensional scales is suggested; however, they must be reliable and validated. PMID:25511005

  17. Extrapolating cetacean densities to quantitatively assess human impacts on populations in the high seas.

    PubMed

    Mannocci, Laura; Roberts, Jason J; Miller, David L; Halpin, Patrick N

    2017-06-01

    As human activities expand beyond national jurisdictions to the high seas, there is an increasing need to consider anthropogenic impacts to species inhabiting these waters. The current scarcity of scientific observations of cetaceans in the high seas impedes the assessment of population-level impacts of these activities. We developed plausible density estimates to facilitate a quantitative assessment of anthropogenic impacts on cetacean populations in these waters. Our study region extended from a well-surveyed region within the U.S. Exclusive Economic Zone into a large region of the western North Atlantic sparsely surveyed for cetaceans. We modeled densities of 15 cetacean taxa with available line transect survey data and habitat covariates and extrapolated predictions to sparsely surveyed regions. We formulated models to reduce the extent of extrapolation beyond covariate ranges, and constrained them to model simple and generalizable relationships. To evaluate confidence in the predictions, we mapped where predictions were made outside sampled covariate ranges, examined alternate models, and compared predicted densities with maps of sightings from sources that could not be integrated into our models. Confidence levels in model results depended on the taxon and geographic area and highlighted the need for additional surveying in environmentally distinct areas. With application of necessary caution, our density estimates can inform management needs in the high seas, such as the quantification of potential cetacean interactions with military training exercises, shipping, fisheries, and deep-sea mining and be used to delineate areas of special biological significance in international waters. Our approach is generally applicable to other marine taxa and geographic regions for which management will be implemented but data are sparse. © 2016 The Authors. Conservation Biology published by Wiley Periodicals, Inc. on behalf of Society for Conservation Biology.

  18. Residual Isocyanates in Medical Devices and Products: A Qualitative and Quantitative Assessment

    PubMed Central

    Franklin, Gillian; Harari, Homero; Ahsan, Samavi; Bello, Dhimiter; Sterling, David A.; Nedrelow, Jonathan; Raynaud, Scott; Biswas, Swati; Liu, Youcheng

    2016-01-01

    We conducted a pilot qualitative and quantitative assessment of residual isocyanates and their potential initial exposures in neonates, as little is known about their contact effect. After a neonatal intensive care unit (NICU) stockroom inventory, polyurethane (PU) and PU foam (PUF) devices and products were qualitatively evaluated for residual isocyanates using Surface SWYPE™. Those containing isocyanates were quantitatively tested for methylene diphenyl diisocyanate (MDI) species using a UPLC-UV-MS/MS method. Ten of 37 products and devices tested indicated both free and bound residual surface isocyanates; PU/PUF pieces contained aromatic isocyanates; one product contained aliphatic isocyanates. Overall, quantified mean MDI concentrations were low (4,4′-MDI = 0.52 to 140.1 pg/mg; 2,4′-MDI = 0.01 to 4.48 pg/mg). The 4,4′-MDI species had the highest single measured concentration (280 pg/mg). Commonly used medical devices/products contain low but measurable concentrations of residual isocyanates. Quantifying other isocyanate species and neonatal skin exposure to isocyanates from these devices and products requires further investigation. PMID:27773989

  19. Integrated approach to assess ecosystem health in harbor areas.

    PubMed

    Bebianno, M J; Pereira, C G; Rey, F; Cravo, A; Duarte, D; D'Errico, G; Regoli, F

    2015-05-01

    Harbors are critical environments of strategic economic importance but with potential environmental impact, making health assessment criteria a key issue. An ecosystem health status approach was carried out in Portimão harbor as a case study. Levels of priority and specific chemicals in sediments, along with their bioavailability in mussels, bioassays and a wide array of biomarkers, were integrated in a biomarker index (IBR index) and the overall data in a weight-of-evidence (WOE) model. Metals, PAHs, PCBs and HCB were not particularly high compared with sediment guidelines and standards for dredging. Bioavailability was evident for Cd, Cu and Zn. Biomarkers proved more sensitive, namely changes in antioxidant responses, metallothioneins and vitellogenin-like proteins. The IBR index indicated that site 4 was the most impacted area. Assessment of health status by the WOE approach highlighted the importance of integrating sediment chemistry, bioaccumulation, biomarkers and bioassays, and revealed that despite some disturbance in the harbor area, there was also an impact of urban effluents from upstream. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. The integration of quantitative genetics, paleontology, and neontology reveals genetic underpinnings of primate dental evolution.

    PubMed

    Hlusko, Leslea J; Schmitt, Christopher A; Monson, Tesla A; Brasil, Marianne F; Mahaney, Michael C

    2016-08-16

    Developmental genetics research on mice provides a relatively sound understanding of the genes necessary and sufficient to make mammalian teeth. However, mouse dentitions are highly derived compared with human dentitions, complicating the application of these insights to human biology. We used quantitative genetic analyses of data from living nonhuman primates and extensive osteological and paleontological collections to refine our assessment of dental phenotypes so that they better represent how the underlying genetic mechanisms actually influence anatomical variation. We identify ratios that better characterize the output of two dental genetic patterning mechanisms for primate dentitions. These two newly defined phenotypes are heritable with no measurable pleiotropic effects. When we consider how these two phenotypes vary across neontological and paleontological datasets, we find that the major Middle Miocene taxonomic shift in primate diversity is characterized by a shift in these two genetic outputs. Our results build on the mouse model by combining quantitative genetics and paleontology, and thereby elucidate how genetic mechanisms likely underlie major events in primate evolution.

  1. Impact assessment of integrated dynamic transit operations : final report.

    DOT National Transportation Integrated Search

    2016-03-02

    This document details the impact assessment conducted by the Volpe Center for the Integrated Dynamic Transit Operations (IDTO) prototype demonstrations in Columbus, Ohio and Central Florida. The prototype is one result of the U.S. Department of Transp...

  2. Impact Assessment of GNSS Spoofing Attacks on INS/GNSS Integrated Navigation System.

    PubMed

    Liu, Yang; Li, Sihai; Fu, Qiangwen; Liu, Zhenbo

    2018-05-04

    In the face of emerging Global Navigation Satellite System (GNSS) spoofing attacks, there is a need to give a comprehensive analysis on how the inertial navigation system (INS)/GNSS integrated navigation system responds to different kinds of spoofing attacks. A better understanding of the integrated navigation system’s behavior with spoofed GNSS measurements gives us valuable clues to develop effective spoofing defenses. This paper focuses on an impact assessment of GNSS spoofing attacks on the integrated navigation system Kalman filter’s error covariance, innovation sequence and inertial sensor bias estimation. A simple and straightforward measurement-level trajectory spoofing simulation framework is presented, serving as the basis for an impact assessment of both unsynchronized and synchronized spoofing attacks. Recommendations are given for spoofing detection and mitigation based on our findings in the impact assessment process.
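    One standard way to operationalize the innovation-based detection this record recommends is a normalized innovation squared (NIS) gate, which flags measurements statistically inconsistent with the filter's predicted covariance. The covariance, innovation values, and spoofing offset below are synthetic illustrations, not values from the paper's simulations:

```python
import numpy as np

# NIS gate: nu' S^-1 nu is chi-square distributed with dim(z) degrees of
# freedom when the GNSS measurement is consistent with the INS prediction.
S = np.diag([4.0, 4.0, 9.0])           # assumed innovation covariance (m^2)
nu_clean = np.array([1.0, -2.0, 3.0])  # nominal innovation
nu_spoofed = nu_clean + np.array([25.0, 0.0, 0.0])  # position pulled off track

def nis(nu, S):
    """Mahalanobis-type test statistic for the innovation sequence."""
    return float(nu @ np.linalg.solve(S, nu))

threshold = 11.34  # chi-square 99th percentile, 3 degrees of freedom
print(nis(nu_clean, S) < threshold, nis(nu_spoofed, S) < threshold)  # True False
```

    A slowly synchronized spoofer keeps each innovation under the gate, which is why the paper also examines error covariance and inertial bias estimates rather than relying on a single test.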

  3. Monitoring and quantitative assessment of tumor burden using in vivo bioluminescence imaging

    NASA Astrophysics Data System (ADS)

    Chen, Chia-Chi; Hwang, Jeng-Jong; Ting, Gann; Tseng, Yun-Long; Wang, Shyh-Jen; Whang-Peng, Jaqueline

    2007-02-01

    In vivo bioluminescence imaging (BLI) is a sensitive imaging modality that is rapid and accessible, and may comprise an ideal tool for evaluating tumor growth. In this study, the kinetics of tumor growth were assessed in a C26 colon carcinoma-bearing BALB/c mouse model. The ability of BLI to noninvasively quantitate the growth of subcutaneous tumors was evaluated using C26 cells genetically engineered to stably express firefly luciferase and herpes simplex virus type-1 thymidine kinase (C26/tk-luc). A good correlation (R2=0.998) of photon emission to cell number was found in vitro. Tumor burden and tumor volume were monitored in vivo over time by quantitation of photon emission using a Xenogen IVIS 50 and standard external caliper measurement, respectively. At various time intervals, tumor-bearing mice were imaged to determine the correlation of in vivo BLI to tumor volume. However, a correlation of BLI to tumor volume was observed only when tumor volume was smaller than 1000 mm3 (R2=0.907). Gamma scintigraphy combined with [131I]FIAU was another imaging modality used to verify the previous results. In conclusion, this study showed that bioluminescence imaging is a powerful and quantitative tool for directly monitoring tumor growth in vivo. The dual-reporter-gene transfected tumor-bearing animal model can be applied in evaluating the efficacy of newly developed anti-cancer drugs.

  4. Cells and Stripes: A novel quantitative photo-manipulation technique

    PubMed Central

    Mistrik, Martin; Vesela, Eva; Furst, Tomas; Hanzlikova, Hana; Frydrych, Ivo; Gursky, Jan; Majera, Dusana; Bartek, Jiri

    2016-01-01

    Laser micro-irradiation is a technology widely used in the DNA damage response, checkpoint signaling, chromatin remodeling and related research fields, to assess chromatin modifications and recruitment of diverse DNA damage sensors, mediators and repair proteins to sites of DNA lesions. While this approach has aided numerous discoveries related to cell biology, maintenance of genome integrity, aging and cancer, it has so far been limited by a tedious manual definition of laser-irradiated subcellular regions, with the ensuing restriction to only a small number of cells treated and analyzed in a single experiment. Here, we present an improved and versatile alternative to the micro-irradiation approach: Quantitative analysis of photo-manipulated samples using innovative settings of standard laser-scanning microscopes. Up to 200 cells are simultaneously exposed to a laser beam in a defined pattern of collinear rays. The induced striation pattern is then automatically evaluated by a simple algorithm, which provides a quantitative assessment of various laser-induced phenotypes in live or fixed cells. Overall, this new approach represents a more robust alternative to existing techniques, and provides a versatile tool for a wide range of applications in biomedicine. PMID:26777522
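    The automatic stripe evaluation this record describes can be sketched as masking the known collinear-ray geometry and comparing signal inside versus outside the stripes. The image, stripe geometry, and intensities below are synthetic and purely illustrative:

```python
import numpy as np

# Synthetic fluorescence image: uniform background plus laser-induced signal
# in vertical stripes with a known period, mimicking the collinear-ray pattern.
h, w, period, width = 64, 64, 16, 4
img = np.full((h, w), 10.0)              # background intensity
cols = np.arange(w)
stripe_mask = (cols % period) < width    # stripe columns every 16 px
img[:, stripe_mask] += 35.0              # recruitment signal inside stripes

# Enrichment: mean signal in stripe regions relative to the rest of the cell.
enrichment = img[:, stripe_mask].mean() / img[:, ~stripe_mask].mean()
print(round(enrichment, 2))  # prints 4.5
```

    Because the stripe geometry is fixed by the scanner settings, the mask needs no manual region definition, which is what lets hundreds of cells be scored per experiment.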

  5. Integration analysis of quantitative proteomics and transcriptomics data identifies potential targets of frizzled-8 protein-related antiproliferative factor in vivo.

    PubMed

    Yang, Wei; Kim, Yongsoo; Kim, Taek-Kyun; Keay, Susan K; Kim, Kwang Pyo; Steen, Hanno; Freeman, Michael R; Hwang, Daehee; Kim, Jayoung

    2012-12-01

    What's known on the subject? and What does the study add? Interstitial cystitis (IC) is a prevalent and debilitating pelvic disorder generally accompanied by chronic pain combined with chronic urinary problems. Over one million Americans are affected, especially middle-aged women. However, its aetiology and mechanism remain unclear, and no effective drug is yet available to patients. Several urinary biomarker candidates have been identified for IC; among the most promising is antiproliferative factor (APF), whose biological activity is detectable in urine specimens from >94% of patients with both ulcerative and non-ulcerative IC. The present study identified several important mediators of the effect of APF on bladder cell physiology, suggesting several candidate drug targets against IC. In an attempt to identify potential proteins and genes regulated by APF in vivo, and to possibly expand the APF-regulated network identified by stable isotope labelling by amino acids in cell culture (SILAC), we performed an integration analysis of our own SILAC data and the microarray data of Gamper et al. (2009) BMC Genomics 10: 199. Notably, two of the proteins (i.e. MAPKSP1 and GSPT1) that are down-regulated by APF are involved in the activation of mTORC1, suggesting that the mammalian target of rapamycin (mTOR) pathway is potentially a critical pathway regulated by APF in vivo. Several components of the mTOR pathway are currently being studied as potential therapeutic targets in other diseases. Our analysis suggests that this pathway might also be relevant in the design of diagnostic tools and medications targeting IC. • To enhance our understanding of the interstitial cystitis urine biomarker antiproliferative factor (APF), as well as interstitial cystitis biology more generally at the systems level, we reanalyzed recently published large-scale quantitative proteomics and in vivo transcriptomics data sets using an integration analysis tool that we have developed. • To

  6. Integrating Planning, Assessment, and Improvement in Higher Education

    ERIC Educational Resources Information Center

    Sherlock, Barbara J.

    2009-01-01

    Based on Penn State's popular "Innovation Insights" series, this book brings together in one handy reference nearly a decade of tried and true insights into continuous quality improvements in higher education. Their five-step model for integrating planning, assessment, and improvement moves plans off the shelf and into the weekly and daily…

  7. Impacts Assessment of Integrated Dynamic Transit Operations : Final Report

    DOT National Transportation Integrated Search

    2016-03-02

    This document details the impact assessment conducted by the Volpe Center for the Integrated Dynamic Transit Operations (IDTO) prototype demonstrations in Columbus, Ohio and Central Florida. The prototype is one result of the U.S. Department of Trans...

  8. Five Centers Model: Integrated Labs for Instructional Technology and Student Assessment.

    ERIC Educational Resources Information Center

    Burnett, Henry J.

    1989-01-01

    Describes the College of the Desert (California's) integrated facility that links the following three functions: (1) student assessment and computerized placement testing; (2) remedial instruction in study skills, writing, and math; and (3) research to assess learning gains. (DMM)

  9. Quantitative Microbial Risk Assessment Tutorial Installation of Software for Watershed Modeling in Support of QMRA - Updated 2017

    EPA Science Inventory

    This tutorial provides instructions for accessing, retrieving, and downloading the following software to install on a host computer in support of Quantitative Microbial Risk Assessment (QMRA) modeling: • QMRA Installation • SDMProjectBuilder (which includes the Microbial ...

  10. Three-Dimensional Photography for Quantitative Assessment of Penile Volume-Loss Deformities in Peyronie's Disease.

    PubMed

    Margolin, Ezra J; Mlynarczyk, Carrie M; Mulhall, John P; Stember, Doron S; Stahl, Peter J

    2017-06-01

    Non-curvature penile deformities are prevalent and bothersome manifestations of Peyronie's disease (PD), but the quantitative metrics that are currently used to describe these deformities are inadequate and non-standardized, presenting a barrier to clinical research and patient care. To introduce erect penile volume (EPV) and percentage of erect penile volume loss (percent EPVL) as novel metrics that provide detailed quantitative information about non-curvature penile deformities and to study the feasibility and reliability of three-dimensional (3D) photography for measurement of quantitative penile parameters. We constructed seven penis models simulating deformities found in PD. The 3D photographs of each model were captured in triplicate by four observers using a 3D camera. Computer software was used to generate automated measurements of EPV, percent EPVL, penile length, minimum circumference, maximum circumference, and angle of curvature. The automated measurements were statistically compared with measurements obtained using water-displacement experiments, a tape measure, and a goniometer. Accuracy of 3D photography for average measurements of all parameters compared with manual measurements; inter-test, intra-observer, and inter-observer reliabilities of EPV and percent EPVL measurements as assessed by the intraclass correlation coefficient. The 3D images were captured in a median of 52 seconds (interquartile range = 45-61). On average, 3D photography was accurate to within 0.3% for measurement of penile length. It overestimated maximum and minimum circumferences by averages of 4.2% and 1.6%, respectively; overestimated EPV by an average of 7.1%; and underestimated percent EPVL by an average of 1.9%. All inter-test, inter-observer, and intra-observer intraclass correlation coefficients for EPV and percent EPVL measurements were greater than 0.75, reflective of excellent methodologic reliability. By providing highly descriptive and reliable measurements of

  11. Survey of Quantitative Research Metrics to Assess Pilot Performance in Upset Recovery

    NASA Technical Reports Server (NTRS)

    Le Vie, Lisa R.

    2016-01-01

    Accidents attributable to in-flight loss of control are the primary cause of fatal commercial jet accidents worldwide. The National Aeronautics and Space Administration (NASA) conducted a literature review to identify quantitative standards for assessing upset recovery performance. This review covers current recovery procedures for both military and commercial aviation and includes the metrics researchers use to assess aircraft recovery performance. Metrics include time to first input, recognition time, recovery time, and whether the first input was correct or incorrect. Other metrics include the state of the autopilot and autothrottle, control wheel/sidestick movement resulting in pitch and roll, and inputs to the throttle and rudder. In addition, airplane state measures, such as roll reversals, altitude loss/gain, maximum vertical speed, maximum/minimum airspeed, maximum bank angle, and maximum g loading, are reviewed as well.

  12. Do large-scale assessments measure students' ability to integrate scientific knowledge?

    NASA Astrophysics Data System (ADS)

    Lee, Hee-Sun

    2010-03-01

    Large-scale assessments are used as means to diagnose the current status of student achievement in science and compare students across schools, states, and countries. For efficiency, multiple-choice items and dichotomously scored open-ended items are pervasively used in large-scale assessments such as the Trends in International Mathematics and Science Study (TIMSS). This study investigated how well these items measure secondary school students' ability to integrate scientific knowledge. This study collected responses from 8400 students to 116 multiple-choice and 84 open-ended items and applied an Item Response Theory analysis based on the Rasch Partial Credit Model. Results indicate that most multiple-choice items and dichotomously scored open-ended items can be used to determine whether students have normative ideas about science topics, but cannot measure whether students integrate multiple pieces of relevant science ideas. Only when the scoring rubric was redesigned to capture subtle nuances of students' open-ended responses did open-ended items become a valid and reliable tool to assess students' knowledge integration ability.
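
    The Rasch Partial Credit Model applied above has a closed form for an item's category probabilities. A minimal sketch of a generic PCM, assuming illustrative ability (`theta`) and step-difficulty (`deltas`) values rather than the study's fitted parameters:

```python
import math

def pcm_category_probs(theta, deltas):
    """Rasch Partial Credit Model: probability of a response in each
    category 0..m for an item with step difficulties deltas[0..m-1],
    given person ability theta (all on the same logit scale)."""
    # Cumulative logit for category k is sum_{j<=k} (theta - delta_j);
    # category 0 contributes an empty sum (logit 0).
    cum = [0.0]
    for d in deltas:
        cum.append(cum[-1] + (theta - d))
    exps = [math.exp(c) for c in cum]
    total = sum(exps)
    return [e / total for e in exps]

# Example: a 3-category (scores 0/1/2) item with step difficulties -0.5 and 0.8
probs = pcm_category_probs(theta=0.0, deltas=[-0.5, 0.8])
```

    Raising `theta` shifts probability mass toward the higher score categories, which is the behaviour the partial credit scoring of open-ended items exploits.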

  13. AntiJen: a quantitative immunology database integrating functional, thermodynamic, kinetic, biophysical, and cellular data

    PubMed Central

    Toseland, Christopher P; Clayton, Debra J; McSparron, Helen; Hemsley, Shelley L; Blythe, Martin J; Paine, Kelly; Doytchinova, Irini A; Guan, Pingping; Hattotuwagama, Channa K; Flower, Darren R

    2005-01-01

    AntiJen is a database system focused on the integration of kinetic, thermodynamic, functional, and cellular data within the context of immunology and vaccinology. Compared to its progenitor JenPep, the interface has been completely rewritten and redesigned and now offers a wider variety of search methods, including a nucleotide and a peptide BLAST search. In terms of data archived, AntiJen has a richer and more complete breadth, depth, and scope, and this has seen the database increase to over 31,000 entries. AntiJen provides the most complete and up-to-date dataset of its kind. While AntiJen v2.0 retains a focus on both T cell and B cell epitopes, its greatest novelty is the archiving of continuous quantitative data on a variety of immunological molecular interactions. This includes thermodynamic and kinetic measures of peptide binding to TAP and the Major Histocompatibility Complex (MHC), peptide-MHC complexes binding to T cell receptors, antibodies binding to protein antigens and general immunological protein-protein interactions. The database also contains quantitative specificity data from position-specific peptide libraries and biophysical data, in the form of diffusion co-efficients and cell surface copy numbers, on MHCs and other immunological molecules. The uses of AntiJen include the design of vaccines and diagnostics, such as tetramers, and other laboratory reagents, as well as helping parameterize the bioinformatic or mathematical in silico modeling of the immune system. The database is accessible from the URL: . PMID:16305757
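
    As an illustration of the kind of thermodynamic data AntiJen archives, an equilibrium dissociation constant for a peptide-MHC or protein-protein interaction can be converted to a standard binding free energy via ΔG = RT·ln(Kd). A minimal sketch (standard thermodynamics, not an AntiJen API):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def binding_free_energy(kd_molar, temp_k=298.15):
    """Standard binding free energy dG = R*T*ln(Kd), returned in kJ/mol.
    More negative values indicate tighter (more favourable) binding."""
    return R * temp_k * math.log(kd_molar) / 1000.0

# A 1 uM dissociation constant at 25 degrees C
dg = binding_free_energy(1e-6)  # roughly -34 kJ/mol
```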

  14. Assessment of Online Discussion in Work-Integrated Learning

    ERIC Educational Resources Information Center

    McNamara, Judith; Brown, Catherine

    2009-01-01

    Purpose: The purpose of this paper is to examine how online discussion can be used in work-integrated learning as a vehicle for students to demonstrate their learning in the workplace and to facilitate collaborative learning where face-to-face classes are not feasible. Design/methodology/approach: The paper evaluates the use of assessable online…

  15. Quantitative cervical vertebral maturation assessment in adolescents with normal occlusion: a mixed longitudinal study.

    PubMed

    Chen, Li-Li; Xu, Tian-Min; Jiang, Jiu-Hui; Zhang, Xing-Zhong; Lin, Jiu-Xiang

    2008-12-01

    The purpose of this study was to establish a quantitative cervical vertebral maturation (CVM) system for adolescents with normal occlusion. Mixed longitudinal data were used. The subjects included 87 children and adolescents from 8 to 18 years old with normal occlusion (32 boys, 55 girls) selected from 901 candidates. Sequential lateral cephalograms and hand-wrist films were taken once a year for 6 years. The lateral cephalograms of all subjects were divided into 11 maturation groups according to the Fishman skeletal maturity indicators. The morphologic characteristics of the second, third, and fourth cervical vertebrae at 11 developmental stages were measured and analyzed. Three characteristic parameters (H4/W4, AH3/PH3, @2) were selected to determine the classification of CVM. With these 3 morphologic variables, a quantitative CVM system comprising 4 maturational stages was established, together with an equation that accurately estimates the maturation of the cervical vertebrae: CVM stage = -4.13 + 3.57 × H4/W4 + 4.07 × AH3/PH3 + 0.03 × @2. The quantitative CVM method is an efficient, objective, and relatively simple approach to assess the level of skeletal maturation during adolescence.
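
    The regression in the abstract can be evaluated directly. A minimal sketch; the third predictor's symbol is unclear in this record (printed as "@2") and is treated here as an angle measured in degrees, which is an assumption:

```python
def cvm_stage(h4_w4, ah3_ph3, angle2):
    """Quantitative cervical vertebral maturation stage, transcribing the
    regression in the abstract:
        CVM stage = -4.13 + 3.57*(H4/W4) + 4.07*(AH3/PH3) + 0.03*angle2
    The third predictor's symbol is garbled in the record; it is assumed
    here to be an angle in degrees."""
    return -4.13 + 3.57 * h4_w4 + 4.07 * ah3_ph3 + 0.03 * angle2

# Illustrative measurements, not values from the study
stage = cvm_stage(h4_w4=0.8, ah3_ph3=0.5, angle2=100.0)  # -> 3.761
```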

  16. INVESTIGATING UNCERTAINTY AND SENSITIVITY IN INTEGRATED, MULTIMEDIA ENVIRONMENTAL MODELS: TOOLS FOR FRAMES-3MRA

    EPA Science Inventory

    Elucidating uncertainty and sensitivity structures in environmental models can be a difficult task, even for low-order, single-medium constructs driven by a unique set of site-specific data. Quantitative assessment of integrated, multimedia models that simulate hundreds of sites...

  17. Zebrafish Caudal Fin Angiogenesis Assay—Advanced Quantitative Assessment Including 3-Way Correlative Microscopy

    PubMed Central

    Correa Shokiche, Carlos; Schaad, Laura; Triet, Ramona; Jazwinska, Anna; Tschanz, Stefan A.; Djonov, Valentin

    2016-01-01

    Background Researchers evaluating angiomodulating compounds as part of scientific projects or pre-clinical studies are often confronted with the limitations of the applied animal models. Rough early-stage compound assessment without reliable quantification of the vascular response contributes, at least in part, to the low rate of translation to the clinic. Objective To establish an advanced, rapid and cost-effective angiogenesis assay for the precise and sensitive assessment of angiomodulating compounds using zebrafish caudal fin regeneration. It should provide information regarding the angiogenic mechanisms involved and should include qualitative and quantitative data on drug effects in a non-biased and time-efficient way. Approach & Results Basic vascular parameters (total regenerated area, vascular projection area, contour length, vessel area density) were extracted from in vivo fluorescence microscopy images using a stereological approach. Skeletonization of the vasculature by our custom-made software Skelios provided additional parameters including “graph energy” and “distance to farthest node”. The latter gave important insights into the complexity, connectivity and maturation status of the regenerating vascular network. The use of a reference point (vascular parameters prior to amputation) is unique to the model and crucial for proper assessment. Additionally, the assay provides exceptional possibilities for correlative microscopy by combining in vivo imaging and morphological investigation of the area of interest. The 3-way correlative microscopy links the dynamic changes in vivo with their structural substrate at the subcellular level. Conclusions The improved zebrafish fin regeneration model with advanced quantitative analysis and optional 3-way correlative morphology is a promising in vivo angiogenesis assay, well suited to basic research and preclinical investigations. PMID:26950851
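
    "Distance to farthest node" can be read as the eccentricity of a reference node in the skeletonized vessel graph. A minimal breadth-first-search sketch over an unweighted adjacency list (the Skelios implementation is not published in this record and may differ):

```python
from collections import deque

def distance_to_farthest_node(adj, start):
    """Eccentricity of `start` in an unweighted graph given as an
    adjacency dict {node: [neighbours]}: the number of hops (BFS
    distance) to the node farthest from `start`."""
    dist = {start: 0}
    q = deque([start])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return max(dist.values())

# Tiny skeleton: a path 0-1-2 with a side branch 1-3
adj = {0: [1], 1: [0, 2, 3], 2: [1], 3: [1]}
ecc = distance_to_farthest_node(adj, 0)  # -> 2
```

    In a maturing vascular network, shorter eccentricities from the amputation plane indicate a more connected, less tree-like plexus, which is why the metric carries information about connectivity and maturation.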

  18. Towards an integrated approach to natural hazards risk assessment using GIS: with reference to bushfires.

    PubMed

    Chen, Keping; Blong, Russell; Jacobson, Carol

    2003-04-01

    This paper develops a GIS-based integrated approach to risk assessment in natural hazards, with reference to bushfires. The challenges for undertaking this approach have three components: data integration, risk assessment tasks, and risk decision-making. First, data integration in GIS is a fundamental step for subsequent risk assessment tasks and risk decision-making. A series of spatial data integration issues within GIS such as geographical scales and data models are addressed. Particularly, the integration of both physical environmental data and socioeconomic data is examined with an example linking remotely sensed data and areal census data in GIS. Second, specific risk assessment tasks, such as hazard behavior simulation and vulnerability assessment, should be undertaken in order to understand complex hazard risks and provide support for risk decision-making. For risk assessment tasks involving heterogeneous data sources, the selection of spatial analysis units is important. Third, risk decision-making concerns spatial preferences and/or patterns, and a multicriteria evaluation (MCE)-GIS typology for risk decision-making is presented that incorporates three perspectives: spatial data types, data models, and methods development. Both conventional MCE methods and artificial intelligence-based methods with GIS are identified to facilitate spatial risk decision-making in a rational and interpretable way. Finally, the paper concludes that the integrated approach can be used to assist risk management of natural hazards, in theory and in practice.

  19. The development of an integrated assessment instrument for measuring analytical thinking and science process skills

    NASA Astrophysics Data System (ADS)

    Irwanto; Rohaeti, Eli; Widjajanti, Endang LFX; Suyanta

    2017-05-01

    This research aims to develop an instrument and determine the characteristics of an integrated assessment instrument. The research uses the 4-D model, which includes define, design, develop, and disseminate. The primary product was validated by expert judgment, tested for readability by students, and assessed for feasibility by chemistry teachers. The research involved 246 students in grade XI of four senior high schools in Yogyakarta, Indonesia. Data collection techniques included interviews, questionnaires, and tests; the instruments included an interview guideline, an item validation sheet, a users' response questionnaire, an instrument readability questionnaire, and an essay test. The results show that the integrated assessment instrument has an Aiken validity value of 0.95. Item reliability was 0.99 and person reliability was 0.69. Teachers' response to the integrated assessment instrument was very good. Therefore, the integrated assessment instrument is feasible for measuring students' analytical thinking and science process skills.
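
    The Aiken validity value reported above is conventionally computed as Aiken's V = Σs / (n·(c − 1)), where s is each rater's score minus the lowest scale point. A minimal sketch (standard formula; the ratings and the 1-5 scale are illustrative assumptions, not the study's data):

```python
def aikens_v(ratings, lo=1, hi=5):
    """Aiken's V content-validity coefficient for one item:
    V = sum(r - lo) / (n * (hi - lo)), ranging from 0 (no agreement
    on validity) to 1 (all raters gave the top rating)."""
    n = len(ratings)
    s = sum(r - lo for r in ratings)
    return s / (n * (hi - lo))

# Four hypothetical expert raters on a 1-5 relevance scale
v = aikens_v([4, 5, 5, 4])  # -> 0.875
```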

  20. A novel image-based quantitative method for the characterization of NETosis

    PubMed Central

    Zhao, Wenpu; Fogg, Darin K.; Kaplan, Mariana J.

    2015-01-01

    NETosis is a newly recognized mechanism of programmed neutrophil death. It is characterized by a stepwise progression of chromatin decondensation, membrane rupture, and release of bactericidal DNA-based structures called neutrophil extracellular traps (NETs). Conventional ‘suicidal’ NETosis has been described in pathogenic models of systemic autoimmune disorders. Recent in vivo studies suggest that a process of ‘vital’ NETosis also exists, in which chromatin is condensed and membrane integrity is preserved. Techniques to assess ‘suicidal’ or ‘vital’ NET formation in a specific, quantitative, rapid and semiautomated way have been lacking, hindering the characterization of this process. Here we have developed a new method to simultaneously assess both ‘suicidal’ and ‘vital’ NETosis, using high-speed multi-spectral imaging coupled to morphometric image analysis, to quantify spontaneous NET formation observed ex-vivo or stimulus-induced NET formation triggered in vitro. Use of imaging flow cytometry allows automated, quantitative and rapid analysis of subcellular morphology and texture, and introduces the potential for further investigation using NETosis as a biomarker in pre-clinical and clinical studies. PMID:26003624

  1. Agreement between quantitative microbial risk assessment and epidemiology at low doses during waterborne outbreaks of protozoan disease

    USDA-ARS?s Scientific Manuscript database

    Quantitative microbial risk assessment (QMRA) is a valuable complement to epidemiology for understanding the health impacts of waterborne pathogens. The approach works by extrapolating available data in two ways. First, dose-response data are typically extrapolated from feeding studies, which use ...

  2. Integrating Academic and Vocational Education: Guidelines for Assessing a Fuzzy Reform.

    ERIC Educational Resources Information Center

    Stasz, Cathy; Grubb, W. Norton

    The 1990 amendments to the Carl D. Perkins Vocational Education Act of 1984 require the National Assessment of Vocational Education (NAVE) to evaluate integration of academic and vocational education. NAVE's study has three integration goals: (1) to examine the themes and research issues; (2) to identify data and data gaps; and (3) to address…

  3. Stochastic optical reconstruction microscopy-based relative localization analysis (STORM-RLA) for quantitative nanoscale assessment of spatial protein organization.

    PubMed

    Veeraraghavan, Rengasayee; Gourdie, Robert G

    2016-11-07

    The spatial association between proteins is crucial to understanding how they function in biological systems. Colocalization analysis of fluorescence microscopy images is widely used to assess this. However, colocalization analysis performed on two-dimensional images with diffraction-limited resolution merely indicates that the proteins are within 200-300 nm of each other in the xy-plane and within 500-700 nm of each other along the z-axis. Here we demonstrate a novel three-dimensional quantitative analysis applicable to single-molecule positional data: stochastic optical reconstruction microscopy-based relative localization analysis (STORM-RLA). This method offers significant advantages: 1) STORM imaging affords 20-nm resolution in the xy-plane and <50 nm along the z-axis; 2) STORM-RLA provides a quantitative assessment of the frequency and degree of overlap between clusters of colabeled proteins; and 3) STORM-RLA also calculates the precise distances between both overlapping and nonoverlapping clusters in three dimensions. Thus STORM-RLA represents a significant advance in the high-throughput quantitative assessment of the spatial organization of proteins. © 2016 Veeraraghavan and Gourdie. This article is distributed by The American Society for Cell Biology under license from the author(s). Two months after publication it is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
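
    The inter-cluster distance computation described above reduces to Euclidean distances between 3D cluster centroids. A minimal sketch (not the published STORM-RLA code; cluster membership of the localizations is assumed to be already determined):

```python
import math

def centroid(points):
    """Mean position of a cluster of 3D localizations [(x, y, z), ...]."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(3))

def cluster_distance(cluster_a, cluster_b):
    """Euclidean distance between the centroids of two localization
    clusters, in the same units as the input coordinates (e.g. nm)."""
    ca, cb = centroid(cluster_a), centroid(cluster_b)
    return math.dist(ca, cb)

a = [(0, 0, 0), (2, 0, 0)]   # centroid (1, 0, 0)
b = [(4, 4, 0), (4, 4, 6)]   # centroid (4, 4, 3)
d = cluster_distance(a, b)   # -> sqrt(34), about 5.83
```

    With STORM's ~20 nm lateral and <50 nm axial precision, such centroid distances are meaningful well below the 200-700 nm limits of diffraction-limited colocalization.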

  4. Better Assessment Science Integrating Point and Non-point Sources (BASINS)

    EPA Pesticide Factsheets

    Better Assessment Science Integrating Point and Nonpoint Sources (BASINS) is a multipurpose environmental analysis system designed to help regional, state, and local agencies perform watershed- and water quality-based studies.

  5. Integrating human behaviour dynamics into flood disaster risk assessment

    NASA Astrophysics Data System (ADS)

    Aerts, J. C. J. H.; Botzen, W. J.; Clarke, K. C.; Cutter, S. L.; Hall, J. W.; Merz, B.; Michel-Kerjan, E.; Mysiak, J.; Surminski, S.; Kunreuther, H.

    2018-03-01

    The behaviour of individuals, businesses, and government entities before, during, and immediately after a disaster can dramatically affect the impact and recovery time. However, existing risk-assessment methods rarely include this critical factor. In this Perspective, we show why this is a concern, and demonstrate that although initial efforts have inevitably represented human behaviour in limited terms, innovations in flood-risk assessment that integrate societal behaviour and behavioural adaptation dynamics into such quantifications may lead to more accurate characterization of risks and improved assessment of the effectiveness of risk-management strategies and investments. Such multidisciplinary approaches can inform flood-risk management policy development.

  6. Recent development in low-constraint fracture toughness testing for structural integrity assessment of pipelines

    NASA Astrophysics Data System (ADS)

    Kang, Jidong; Gianetto, James A.; Tyson, William R.

    2018-03-01

    Fracture toughness measurement is an integral part of structural integrity assessment of pipelines. Traditionally, a single-edge-notched bend (SE(B)) specimen with a deep crack is recommended in many existing pipeline structural integrity assessment procedures. Such a test provides high constraint and therefore conservative fracture toughness results. However, for girth welds in service, defects are usually subjected to primarily tensile loading where the constraint is usually much lower than in the three-point bend case. Moreover, there is increasing use of strain-based design of pipelines that allows applied strains above yield. Low-constraint toughness tests represent more realistic loading conditions for girth weld defects, and the corresponding increased toughness can minimize unnecessary conservatism in assessments. In this review, we present recent developments in low-constraint fracture toughness testing, specifically using single-edgenotched tension specimens, SENT or SE(T). We focus our review on the test procedure development and automation, round-robin test results and some common concerns such as the effect of crack tip, crack size monitoring techniques, and testing at low temperatures. Examples are also given of the integration of fracture toughness data from SE(T) tests into structural integrity assessment.

  7. Integrated assessment and consultation for the preoperative patient.

    PubMed

    Silverman, David G; Rosenbaum, Stanley H

    2009-12-01

    Assessment of the presurgical patient requires interdisciplinary cooperation over the continuum of documentation and optimization of existing disorders, determination of patient resilience and reserve, and planning for subsequent interventions and care. For many patients, evident or suspected morbidities or anticipated surgical disturbance warrant specialty consultation. There may be uncertainty as to the optimal processes for a given patient, a limitation attributable to myriad factors, not the least of which is that there is often a paucity of evidence that is directly relevant to a given patient in a given setting. The present article discusses these limitations and describes a framework for documentation, optimization, risk assessment, and planning, as well as a uniform grading of existing morbidities and anticipated perioperative disturbances for patients requiring integrated assessment and consultation.

  9. Quantitative assessment of ischemia and reactive hyperemia of the dermal layers using multi - spectral imaging on the human arm

    NASA Astrophysics Data System (ADS)

    Kainerstorfer, Jana M.; Amyot, Franck; Demos, Stavros G.; Hassan, Moinuddin; Chernomordik, Victor; Hitzenberger, Christoph K.; Gandjbakhche, Amir H.; Riley, Jason D.

    2009-07-01

    Quantitative assessment of skin chromophores in a non-invasive fashion is often desirable; in particular, pixel-wise assessment of blood volume and blood oxygenation is beneficial for improved diagnostics. We used a multi-spectral imaging system to acquire diffuse reflectance images of healthy volunteers' lower forearms. Ischemia and reactive hyperemia were induced by occluding the upper arm with a pressure cuff at 180 mmHg for 5 min. Multi-spectral images were taken every 30 s before, during, and after occlusion. Image reconstruction of blood volume and blood oxygenation was performed using a two-layered skin model. Because the images were taken in a non-contact way, strong artifacts related to the shape (curvature) of the arms were observed, making reconstruction of optical/physiological parameters highly inaccurate. We developed a curvature-correction method that extracts the curvature directly from the acquired intensity images and requires no additional measurements of the imaged object. The effectiveness of the algorithm was demonstrated on reconstructions of blood volume and blood oxygenation from in vivo data during occlusion of the arm. Pixel-wise assessment of blood volume and blood oxygenation was made possible over the entire image area, and occlusion effects in veins and the surrounding skin were compared. Induced ischemia during occlusion and reactive hyperemia afterwards were observed and quantitatively assessed. Furthermore, the influence of epidermal thickness on the reconstruction results was evaluated, underlining that exact knowledge of this parameter is needed for fully quantitative assessment.
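
    At its core, recovering blood volume and oxygenation from multi-spectral reflectance amounts to inverting a small linear (Beer-Lambert-type) system relating attenuation at each wavelength to the oxy- and deoxyhemoglobin concentrations. A hedged sketch of the two-wavelength case with made-up extinction coefficients; the study's two-layer skin model and curvature correction are considerably more involved:

```python
def unmix_two_wavelengths(atten, eps):
    """Solve atten = eps @ [c_HbO2, c_Hb] for the two chromophore
    concentrations via Cramer's rule on the 2x2 system, then return
    (total hemoglobin as a blood-volume proxy, oxygen saturation)."""
    (e11, e12), (e21, e22) = eps   # rows: wavelengths; cols: HbO2, Hb
    a1, a2 = atten
    det = e11 * e22 - e12 * e21
    c_hbo2 = (a1 * e22 - e12 * a2) / det
    c_hb = (e11 * a2 - a1 * e21) / det
    total = c_hbo2 + c_hb
    return total, c_hbo2 / total

# Illustrative (not physiological) extinction coefficients
eps = [(0.10, 0.30), (0.25, 0.18)]
atten = (0.18, 0.222)  # attenuation produced by c_HbO2=0.6, c_Hb=0.4
volume, sto2 = unmix_two_wavelengths(atten, eps)  # sto2 -> 0.6
```

    During cuff occlusion the recovered saturation falls (ischemia) and then overshoots on release (reactive hyperemia), which is the time course the study quantifies pixel by pixel.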

  10. An integrated assessment of soil erosion dynamics with special emphasis on gully erosion: Case studies from South Africa and Iran

    NASA Astrophysics Data System (ADS)

    Maerker, Michael; Sommer, Christian; Zakerinejad, Reza; Cama, Elena

    2017-04-01

    Soil erosion by water is a significant problem in arid and semi-arid areas of large parts of Iran. Water erosion is one of the most destructive phenomena, decreasing soil productivity and polluting water resources. Especially in semi-arid areas like the Mazayjan watershed in the southwestern Fars province, as well as the Mkomazi catchment in KwaZulu-Natal, South Africa, gully erosion contributes significantly to sediment dynamics. Consequently, the intention of this research is to identify the different types of soil erosion processes acting in the area with a stochastic approach and to assess the process dynamics in an integrative way. We therefore applied GIS and satellite-image analysis techniques to derive input information for the numeric models. For sheet and rill erosion, the Unit Stream Power-based Erosion Deposition model (USPED) was used. The spatial distribution of gully erosion was assessed with a statistical approach that used three variables (stream power index, slope, and flow accumulation) to predict the spatial distribution of gullies in the study area. Eroded gully volumes were estimated over a multi-year period from fieldwork and high-resolution Google Earth images, as well as with a structure-from-motion algorithm. Finally, the gully retreat rates were integrated into the USPED model. The results show that integrating the SPI approach for quantifying gully erosion with the USPED model is a suitable method to qualitatively and quantitatively assess water erosion processes in data-scarce areas. The application of GIS and stochastic model approaches to spatialize the USPED model inputs yields valuable results for the prediction of soil erosion in the test areas. The results of this research help to develop appropriate management of soil and water resources in the study areas.
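
    Two of the gully-susceptibility variables above, flow accumulation and slope, are commonly combined into a stream power index. One common per-cell formulation, SPI = ln(A_s · tan β), is sketched below; the abstract does not give the study's exact form, so this is an assumption:

```python
import math

def stream_power_index(specific_catchment_area, slope_deg):
    """SPI = ln(A_s * tan(beta)): a proxy for the erosive power of
    overland flow at a raster cell, from specific catchment area
    (m^2 per unit contour width) and local slope (degrees).
    Flat cells would send the log to -infinity, so tan(beta) is
    clamped to a small positive floor."""
    tan_b = max(math.tan(math.radians(slope_deg)), 1e-6)
    return math.log(specific_catchment_area * tan_b)

# A cell draining 1000 m^2/m on a 5-degree slope
spi = stream_power_index(1000.0, 5.0)
```

    Applied per cell over a DEM-derived flow-accumulation grid, high SPI values flag the concentrated-flow positions where gully heads are statistically most likely.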

  11. Investigation of the feasibility of non-invasive optical sensors for the quantitative assessment of dehydration.

    PubMed

    Visser, Cobus; Kieser, Eduard; Dellimore, Kiran; van den Heever, Dawie; Smith, Johan

    2017-10-01

    This study explores the feasibility of prospectively assessing infant dehydration using four non-invasive, optical sensors based on the quantitative and objective measurement of various clinical markers of dehydration. The sensors were investigated to objectively and unobtrusively assess the hydration state of an infant based on the quantification of capillary refill time (CRT), skin recoil time (SRT), skin temperature profile (STP) and skin tissue hydration by means of infrared spectrometry (ISP). To evaluate the performance of the sensors a clinical study was conducted on a cohort of 10 infants (aged 6-36 months) with acute gastroenteritis. High sensitivity and specificity were exhibited by the sensors, in particular the STP and SRT sensors, when combined into a fusion regression model (sensitivity: 0.90, specificity: 0.78). The SRT and STP sensors and the fusion model all outperformed the commonly used "gold standard" clinical dehydration scales including the Gorelick scale (sensitivity: 0.56, specificity: 0.56), CDS scale (sensitivity: 1.0, specificity: 0.2) and WHO scale (sensitivity: 0.13, specificity: 0.79). These results suggest that objective and quantitative assessment of infant dehydration may be possible using the sensors investigated. However, further evaluation of the sensors on a larger sample population is needed before deploying them in a clinical setting. Copyright © 2017 IPEM. Published by Elsevier Ltd. All rights reserved.
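
    The sensitivity and specificity figures quoted above follow the standard confusion-matrix definitions. A minimal sketch (the counts are illustrative, not the study's data):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN): fraction of truly dehydrated infants
    the sensor flags. Specificity = TN/(TN+FP): fraction of adequately
    hydrated infants it correctly clears."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts chosen to mirror the reported 0.90 / ~0.78
sens, spec = sensitivity_specificity(tp=9, fn=1, tn=7, fp=2)
```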

  12. An Integrated Decision Support System with Hydrological Processes and Socio-economic Assessments

    NASA Astrophysics Data System (ADS)

    Yu, Yang; Disse, Markus; Yu, Ruide

    2017-04-01

    DSS is the main outcome of SuMaRiO. The overall goal of the DSS is to integrate all crucial research results of SuMaRiO, including stakeholder perspectives, into a model-based decision support system that allows a Sustainability Impact Assessment (SIA) within regional planning. This SIA will take into account the perspectives of all relevant actors in the problem field of land and water management in the Tarim River Basin, to understand ecosystem services (ESS) and integrate them into land and water management. Under scenario assumptions, possible actions and their impacts are estimated in a semi-quantitative way with the help of sustainability indicators, which include climate, socio-economic, management, and ESS indicators. A user-friendly graphical user interface (GUI) was developed to assist decision-makers and common users, with Chinese and English versions currently available.

  13. Assessing the Impacts of Forests on Human Welfare: Preliminary Results from the Mid-Atlantic Integrated Assessment

    Treesearch

    D. Evan Mercer; P.B. Aruna

    2000-01-01

    Abstract. This paper presents results from the first phase of the socio-economic assessment of forest ecosystems in the Mid-Atlantic Integrated Assessment (MAIA). First, we present results of the analysis of changes in the distribution of human population and forest land use in the region. Then, trends in wood products employment and income between...

  14. Faculty Engagement with Integrative Assignment Design: Connecting Teaching and Assessment

    ERIC Educational Resources Information Center

    Green, Kimberly; Hutchings, Pat

    2018-01-01

    Building on an initiative of the National Institute for Learning Outcomes Assessment, Washington State University faculty have worked to develop more effective integrative capstone assignments in ways that support ongoing improvement.

  15. Capillary nano-immunoassays: advancing quantitative proteomics analysis, biomarker assessment, and molecular diagnostics.

    PubMed

    Chen, Jin-Qiu; Wakefield, Lalage M; Goldstein, David J

    2015-06-06

    There is an emerging demand for the use of molecular profiling to facilitate biomarker identification and development, and to stratify patients for more efficient treatment decisions with reduced adverse effects. In the past decade, great strides have been made to advance genomic, transcriptomic and proteomic approaches to address these demands. While there has been much progress with these large scale approaches, profiling at the protein level still faces challenges due to limitations in clinical sample size, poor reproducibility, unreliable quantitation, and lack of assay robustness. A novel automated capillary nano-immunoassay (CNIA) technology has been developed. This technology offers precise and accurate measurement of proteins and their post-translational modifications using either charge-based or size-based separation formats. The system not only uses ultralow nanogram levels of protein but also allows multi-analyte analysis using a parallel single-analyte format for increased sensitivity and specificity. The high sensitivity and excellent reproducibility of this technology make it particularly powerful for analysis of clinical samples. Furthermore, the system can distinguish and detect specific protein post-translational modifications that conventional Western blot and other immunoassays cannot easily capture. This review will summarize and evaluate the latest progress to optimize the CNIA system for comprehensive, quantitative protein and signaling event characterization. It will also discuss how the technology has been successfully applied in both discovery research and clinical studies, for signaling pathway dissection, proteomic biomarker assessment, targeted treatment evaluation and quantitative proteomic analysis. Lastly, a comparison of this novel system with other conventional immuno-assay platforms is performed.

  16. Health Impacts of Increased Physical Activity from Changes in Transportation Infrastructure: Quantitative Estimates for Three Communities

    PubMed Central

    2015-01-01

    Recently, two quantitative tools have emerged for predicting the health impacts of projects that change population physical activity: the Health Economic Assessment Tool (HEAT) and Dynamic Modeling for Health Impact Assessment (DYNAMO-HIA). HEAT has been used to support health impact assessments of transportation infrastructure projects, but DYNAMO-HIA has not been previously employed for this purpose nor have the two tools been compared. To demonstrate the use of DYNAMO-HIA for supporting health impact assessments of transportation infrastructure projects, we employed the model in three communities (urban, suburban, and rural) in North Carolina. We also compared DYNAMO-HIA and HEAT predictions in the urban community. Using DYNAMO-HIA, we estimated benefit-cost ratios of 20.2 (95% C.I.: 8.7–30.6), 0.6 (0.3–0.9), and 4.7 (2.1–7.1) for the urban, suburban, and rural projects, respectively. For a 40-year time period, the HEAT predictions of deaths avoided by the urban infrastructure project were three times as high as DYNAMO-HIA's predictions due to HEAT's inability to account for changing population health characteristics over time. Quantitative health impact assessment coupled with economic valuation is a powerful tool for integrating health considerations into transportation decision-making. However, to avoid overestimating benefits, such quantitative HIAs should use dynamic, rather than static, approaches. PMID:26504832
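
    The benefit-cost ratios above pair a point estimate with a 95% interval. A minimal sketch of how such an interval can be attached to a benefit-cost ratio via a bootstrap of model draws is shown below; the monetized benefit distribution and project cost are invented for illustration and are unrelated to the DYNAMO-HIA or HEAT models themselves.

```python
# Hedged sketch: benefit-cost ratio with a bootstrap percentile interval.
# The benefit draws and the project cost below are hypothetical numbers.
import random

random.seed(42)

# Hypothetical monetized annual health benefits ($1000s) from model draws
benefit_draws = [random.gauss(500, 120) for _ in range(200)]
project_cost = 30.0  # hypothetical annualized project cost, $1000s

def bcr(benefits, cost):
    """Benefit-cost ratio: mean monetized benefit over cost."""
    return (sum(benefits) / len(benefits)) / cost

point_estimate = bcr(benefit_draws, project_cost)

# Resample the benefit draws to get a 95% percentile interval for the ratio
boot = []
for _ in range(1000):
    resample = [random.choice(benefit_draws) for _ in range(len(benefit_draws))]
    boot.append(bcr(resample, project_cost))
boot.sort()
ci_low, ci_high = boot[24], boot[974]  # 2.5th and 97.5th percentiles
```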

  17. Health Impacts of Increased Physical Activity from Changes in Transportation Infrastructure: Quantitative Estimates for Three Communities.

    PubMed

    Mansfield, Theodore J; MacDonald Gibson, Jacqueline

    2015-01-01

    Recently, two quantitative tools have emerged for predicting the health impacts of projects that change population physical activity: the Health Economic Assessment Tool (HEAT) and Dynamic Modeling for Health Impact Assessment (DYNAMO-HIA). HEAT has been used to support health impact assessments of transportation infrastructure projects, but DYNAMO-HIA has not been previously employed for this purpose nor have the two tools been compared. To demonstrate the use of DYNAMO-HIA for supporting health impact assessments of transportation infrastructure projects, we employed the model in three communities (urban, suburban, and rural) in North Carolina. We also compared DYNAMO-HIA and HEAT predictions in the urban community. Using DYNAMO-HIA, we estimated benefit-cost ratios of 20.2 (95% C.I.: 8.7-30.6), 0.6 (0.3-0.9), and 4.7 (2.1-7.1) for the urban, suburban, and rural projects, respectively. For a 40-year time period, the HEAT predictions of deaths avoided by the urban infrastructure project were three times as high as DYNAMO-HIA's predictions due to HEAT's inability to account for changing population health characteristics over time. Quantitative health impact assessment coupled with economic valuation is a powerful tool for integrating health considerations into transportation decision-making. However, to avoid overestimating benefits, such quantitative HIAs should use dynamic, rather than static, approaches.

  18. Integrated assessment of urban drainage system under the framework of uncertainty analysis.

    PubMed

    Dong, X; Chen, J; Zeng, S; Zhao, D

    2008-01-01

    Due to rapid urbanization and the large number of aging urban infrastructures in China, urban drainage systems face the dual pressure of construction and renovation nationwide. This creates the need for an integrated assessment when an urban drainage system is being planned or re-designed. In this paper, an integrated assessment methodology is proposed based upon the approaches of analytic hierarchy process (AHP), uncertainty analysis, mathematical simulation of the urban drainage system and fuzzy assessment. To illustrate this methodology, a case study in Shenzhen City of south China was carried out to evaluate and compare two different urban drainage system renovation plans, i.e., the distributed plan and the centralized plan. Comparing their water quality impacts, ecological impacts, technological feasibility and economic costs, the integrated performance of the distributed plan is found to be both better and more robust. The proposed methodology is also found to be both effective and practical.
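
    The AHP step mentioned above derives criterion weights from a pairwise comparison matrix. A minimal sketch using the principal eigenvector (power iteration) and Saaty's consistency check is shown below; the four criteria and the judgment values are hypothetical, loosely mirroring the paper's water-quality/ecology/technology/cost axes.

```python
# Hedged sketch of AHP weighting: principal eigenvector by power iteration,
# then Saaty's consistency ratio. The comparison matrix is invented.

# Pairwise comparison matrix (Saaty 1-9 scale); A[i][j] = importance of i over j
A = [
    [1.0, 2.0, 3.0, 2.0],
    [0.5, 1.0, 2.0, 1.0],
    [1/3, 0.5, 1.0, 0.5],
    [0.5, 1.0, 2.0, 1.0],
]
n = len(A)

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]

# Power iteration converges to the principal eigenvector for this
# positive matrix; normalize to sum 1 at each step to get the weights.
w = [1.0 / n] * n
for _ in range(100):
    w = matvec(A, w)
    s = sum(w)
    w = [x / s for x in w]

# Principal eigenvalue estimate and Saaty's consistency index/ratio
Aw = matvec(A, w)
lam = sum(Aw[i] / w[i] for i in range(n)) / n
ci = (lam - n) / (n - 1)
cr = ci / 0.90  # Saaty's random index RI = 0.90 for n = 4
```

    A consistency ratio below 0.1 is conventionally taken to mean the pairwise judgments are acceptably coherent; the fuzzy assessment stage then scores each plan against the weighted criteria.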

  19. Cross-sectional evaluation of electrical impedance myography and quantitative ultrasound for the assessment of Duchenne muscular dystrophy in a clinical trial setting.

    PubMed

    Rutkove, Seward B; Geisbush, Tom R; Mijailovic, Aleksandar; Shklyar, Irina; Pasternak, Amy; Visyak, Nicole; Wu, Jim S; Zaidman, Craig; Darras, Basil T

    2014-07-01

    Electrical impedance myography and quantitative ultrasound are two noninvasive, painless, and effort-independent approaches for assessing neuromuscular disease. Both techniques have potential to serve as useful biomarkers in clinical trials in Duchenne muscular dystrophy. However, their comparative sensitivity to disease status and how they relate to one another are unknown. We performed a cross-sectional analysis of electrical impedance myography and quantitative ultrasound in 24 healthy boys and 24 with Duchenne muscular dystrophy, aged 2 to 14 years with trained research assistants performing all measurements. Three upper and three lower extremity muscles were studied unilaterally in each child, and the data averaged for each individual. Both electrical impedance myography and quantitative ultrasound differentiated healthy boys from those with Duchenne muscular dystrophy (P < 0.001 for both). Quantitative ultrasound values correlated with age in Duchenne muscular dystrophy boys (rho = 0.45; P = 0.029), whereas electrical impedance myography did not (rho = -0.31; P = 0.14). However, electrical impedance myography phase correlated with age in healthy boys (rho = 0.51; P = 0.012), whereas quantitative ultrasound did not (rho = -0.021; P = 0.92). In Duchenne muscular dystrophy boys, electrical impedance myography phase correlated with the North Star Ambulatory Assessment (rho = 0.65; P = 0.022); quantitative ultrasound revealed a near-significant association (rho = -0.56; P = 0.060). The two technologies trended toward a moderate correlation with one another in the Duchenne muscular dystrophy cohort but not in the healthy group (rho = -0.40; P = 0.054 and rho = -0.32; P = 0.13, respectively). Electrical impedance myography and quantitative ultrasound are complementary modalities for the assessment of boys with Duchenne muscular dystrophy; further study and application of these two modalities alone or in combination in a longitudinal fashion are warranted. 
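
    The "rho" values reported throughout this abstract are Spearman rank correlations. A minimal from-scratch sketch is shown below; the age and ultrasound values are invented to illustrate the calculation, not the study's measurements.

```python
# Hedged sketch: Spearman's rank correlation computed as the Pearson
# correlation of (tie-averaged) ranks. All data values are hypothetical.

def rank(values):
    """Average ranks (1-based), handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based positions i+1..j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    rx, ry = rank(x), rank(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

age = [3, 5, 7, 9, 11, 13]             # hypothetical ages (years)
echo = [1.1, 1.3, 1.2, 1.6, 1.8, 2.0]  # hypothetical ultrasound values

rho = spearman(age, echo)
```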

  20. Review of Multi-Criteria Decision Aid for Integrated Sustainability Assessment of Urban Water Systems - MCEARD

    EPA Science Inventory

    Integrated sustainability assessment is part of a new paradigm for urban water decision making. Multi-criteria decision aid (MCDA) is an integrative framework used in urban water sustainability assessment, which has a particular focus on utilising stakeholder participation. Here ...

  1. Climate Change Education: Quantitatively Assessing the Impact of a Botanical Garden as an Informal Learning Environment

    ERIC Educational Resources Information Center

    Sellmann, Daniela; Bogner, Franz X.

    2013-01-01

    Although informal learning environments have been studied extensively, ours is one of the first studies to quantitatively assess the impact of learning in botanical gardens on students' cognitive achievement. We observed a group of 10th graders participating in a one-day educational intervention on climate change implemented in a botanical garden.…

  2. Integrative Analysis of Subcellular Quantitative Proteomics Studies Reveals Functional Cytoskeleton Membrane-Lipid Raft Interactions in Cancer.

    PubMed

    Shah, Anup D; Inder, Kerry L; Shah, Alok K; Cristino, Alexandre S; McKie, Arthur B; Gabra, Hani; Davis, Melissa J; Hill, Michelle M

    2016-10-07

    Lipid rafts are dynamic membrane microdomains that orchestrate molecular interactions and are implicated in cancer development. To understand the functions of lipid rafts in cancer, we performed an integrated analysis of quantitative lipid raft proteomics data sets modeling progression in breast cancer, melanoma, and renal cell carcinoma. This analysis revealed that cancer development is associated with increased membrane raft-cytoskeleton interactions, with ∼40% of elevated lipid raft proteins being cytoskeletal components. Previous studies suggest a potential functional role for the raft-cytoskeleton in the action of the putative tumor suppressors PTRF/Cavin-1 and Merlin. To extend the observation, we examined lipid raft proteome modulation by an unrelated tumor suppressor opioid binding protein cell-adhesion molecule (OPCML) in ovarian cancer SKOV3 cells. In agreement with the other model systems, quantitative proteomics revealed that 39% of OPCML-depleted lipid raft proteins are cytoskeletal components, with microfilaments and intermediate filaments specifically down-regulated. Furthermore, protein-protein interaction network and simulation analysis showed significantly higher interactions among cancer raft proteins compared with general human raft proteins. Collectively, these results suggest increased cytoskeleton-mediated stabilization of lipid raft domains with greater molecular interactions as a common, functional, and reversible feature of cancer cells.

  3. Event specific qualitative and quantitative polymerase chain reaction detection of genetically modified MON863 maize based on the 5'-transgene integration sequence.

    PubMed

    Yang, Litao; Xu, Songci; Pan, Aihu; Yin, Changsong; Zhang, Kewei; Wang, Zhenying; Zhou, Zhigang; Zhang, Dabing

    2005-11-30

    To enforce the genetically modified organism (GMO) labeling policies issued in many countries and regions, polymerase chain reaction (PCR) detection methods (screening, gene-specific, construct-specific, and event-specific) have been developed and have become a mainstay of GMO detection. Event-specific PCR detection is the primary trend in GMO detection because of its high specificity, which is based on the flanking sequence of the exogenous integrant. The genetically modified maize MON863 contains a Cry3Bb1 coding sequence that produces a protein with enhanced insecticidal activity against the coleopteran pest, corn rootworm. In this study, the 5'-integration junction sequence between the host plant DNA and the integrated gene construct of MON863 was revealed by means of thermal asymmetric interlaced PCR, and specific PCR primers and a TaqMan probe were designed based upon the revealed 5'-integration junction sequence; conventional qualitative PCR and quantitative TaqMan real-time PCR detection methods employing these primers and probes were successfully developed. In the conventional qualitative PCR assay, the limit of detection (LOD) was 0.1% for MON863 in 100 ng of maize genomic DNA per reaction. In the quantitative TaqMan real-time PCR assay, the LOD and the limit of quantification were eight and 80 haploid genome copies, respectively. In addition, three mixed maize samples with known MON863 contents were detected using the established real-time PCR systems, and the results indicated that the established event-specific real-time PCR detection systems were reliable, sensitive, and accurate.
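
    Quantitative real-time PCR of this kind converts an unknown's Ct value to a copy number through a standard curve of Ct against log10(copies). A minimal sketch is shown below; the Ct values and the unknown sample are invented for illustration, while the LOD/LOQ figures in the abstract come from the study itself.

```python
# Hedged sketch: qPCR standard-curve quantification. Ct is regressed on
# log10(copy number); the fitted line is inverted for an unknown's Ct.
# All Ct values below are hypothetical.
import math

# Hypothetical standard curve: (copies, observed Ct)
standards = [(1e1, 33.2), (1e2, 29.9), (1e3, 26.6), (1e4, 23.3), (1e5, 20.0)]

xs = [math.log10(c) for c, _ in standards]
ys = [ct for _, ct in standards]
n = len(xs)

# Ordinary least squares: Ct = slope * log10(copies) + intercept
mx, my = sum(xs) / n, sum(ys) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
intercept = my - slope * mx

# Amplification efficiency from the slope (100% corresponds to ~ -3.32)
efficiency = 10 ** (-1 / slope) - 1

# Invert the curve for an unknown sample's Ct
ct_unknown = 25.0
copies = 10 ** ((ct_unknown - intercept) / slope)
```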

  4. An Integrated Approach to Risk Assessment for Concurrent Design

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila; Voss, Luke; Feather, Martin; Cornford, Steve

    2005-01-01

    This paper describes an approach to risk assessment and analysis suited to the early phase, concurrent design of a space mission. The approach integrates an agile, multi-user risk collection tool, a more in-depth risk analysis tool, and repositories of risk information. A JPL developed tool, named RAP, is used for collecting expert opinions about risk from designers involved in the concurrent design of a space mission. Another in-house developed risk assessment tool, named DDP, is used for the analysis.

  5. Use of quantitative light-induced fluorescence to monitor tooth whitening

    NASA Astrophysics Data System (ADS)

    Amaechi, Bennett T.; Higham, Susan M.

    2001-04-01

    The changing of tooth shade by whitening agents occurs gradually. Apart from being subjective and affected by the conditions of the surroundings, visual observation cannot detect a very slight change in tooth color. An electronic method, which can communicate the color change quantitatively, would be more reliable. Quantitative Light-induced Fluorescence (QLF) was developed to detect and assess dental caries based on the phenomenon of change of autofluorescence of a tooth by demineralization. However, stains on the tooth surface exhibit the same phenomenon, and therefore QLF can be used to measure the percentage fluorescence change of stained enamel with respect to surrounding unstained enamel. The present study described a technique of assessing the effect of a tooth-whitening agent using QLF. This was demonstrated in two experiments in which either wholly or partially stained teeth were whitened by intermittent immersion in sodium hypochlorite. Following each immersion, the integrated fluorescence change due to the stain was quantified using QLF. In either situation, the value of ΔQ decreased linearly as the tooth regained its natural shade. It was concluded that gradual changing of the shade of discolored teeth by a whitening agent could be quantified using QLF.

  6. Quantitative Story Telling: Initial steps towards bridging perspectives and tools for a robust nexus assessment

    NASA Astrophysics Data System (ADS)

    Cabello, Violeta

    2017-04-01

    This communication presents the development of an innovative analytical framework for the analysis of the Water-Energy-Food-Climate Nexus termed Quantitative Story Telling (QST). The methodology is currently under development within the H2020 project MAGIC - Moving Towards Adaptive Governance in Complexity: Informing Nexus Security (www.magic-nexus.eu). The key innovation of QST is that it bridges qualitative and quantitative analytical tools in an iterative research process in which each step is built and validated in interaction with stakeholders. The qualitative analysis focuses on identifying the narratives behind the development of relevant WEFC-Nexus policies and innovations. The quantitative engine is the Multi-Scale Analysis of Societal and Ecosystem Metabolism (MuSIASEM), a resource accounting toolkit capable of integrating multiple analytical dimensions at different scales through relational analysis. Although QST may be labelled a story-driven rather than a data-driven approach, I will argue that improving models per se may not lead to an improved understanding of WEF-Nexus problems unless we are capable of generating more robust narratives to frame them. The communication will cover an introduction to the MAGIC project, the basic concepts of QST and a case study focused on agricultural production in a semi-arid region of southern Spain. Data requirements for this case study and the limitations in finding, accessing or estimating the required data will be presented, alongside a reflection on the relation between analytical scales and data availability.

  7. Integrated standardization concept for Angelica botanicals using quantitative NMR

    PubMed Central

    Gödecke, Tanja; Yao, Ping; Napolitano, José G.; Nikolić, Dejan; Dietz, Birgit M.; Bolton, Judy L.; van Breemen, Richard B.; Farnsworth, Norman R.; Chen, Shao-Nong; Lankin, David C.; Pauli, Guido F.

    2011-01-01

    Despite numerous in vitro/in vivo and phytochemical studies, the active constituents of Angelica sinensis (AS) have not been conclusively identified for the standardization to bioactive markers. Phytochemical analyses of AS extracts and fractions that demonstrate activity in a panel of in vitro bioassays have repeatedly pointed to ligustilide as being (associated with) the active principle(s). Due to the chemical instability of ligustilide and related issues in GC/LC analyses, new methods capable of quantifying ligustilide in mixtures that do not rely on an identical reference standard are in high demand. This study demonstrates how NMR can satisfy the requirement for simultaneous, multi-target quantification and qualitative identification. First, the AS activity was concentrated into a single fraction by RP-solid-phase extraction, as confirmed by an (anti-)estrogenicity and cytotoxicity assay. Next, a quantitative 1H NMR (qHNMR) method was established and validated using standard compounds and comparing processing methods. Subsequent 1D/2D NMR and qHNMR analysis led to the identification and quantification of ligustilide and other minor components in the active fraction, and to the development of quality criteria for authentic AS preparations. The absolute and relative quantities of ligustilide, six minor alkyl phthalides, and groups of phenylpropanoids, polyynes, and poly-unsaturated fatty acids were measured by a combination of qHNMR and 2D COSY. The qNMR approach enables multi-target quality control of the bioactive fraction, and enables the integrated biological and chemical standardization of AS botanicals. This methodology can potentially be transferred to other botanicals with active principles that act synergistically, or that contain closely related constituents that have not been conclusively identified as the active principles. PMID:21907766
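
    The quantitative core of qHNMR is that an analyte's mass follows from the ratio of its signal integral to that of an internal calibrant, scaled by proton counts and molar masses. A minimal sketch of this standard relation is shown below; the calibrant choice and all integral values are hypothetical, not the study's data.

```python
# Hedged sketch of the standard qHNMR internal-standard relation.
# Integrals and the calibrant below are invented illustration values.

def qhnmr_mass(I_analyte, N_analyte, M_analyte,
               I_standard, N_standard, M_standard, m_standard):
    """Mass of analyte in the sample from a 1H NMR spectrum.

    I: signal integral, N: number of protons giving rise to the signal,
    M: molar mass (g/mol), m_standard: weighed-in calibrant mass (mg).
    """
    return (I_analyte / I_standard) * (N_standard / N_analyte) \
        * (M_analyte / M_standard) * m_standard

# Hypothetical: ligustilide (C12H14O2, M = 190.24, a 1-proton signal)
# against a dimethyl sulfone calibrant (M = 94.13, 6 equivalent protons,
# 2.00 mg weighed into the sample)
m_lig = qhnmr_mass(I_analyte=3.6, N_analyte=1, M_analyte=190.24,
                   I_standard=6.0, N_standard=6, M_standard=94.13,
                   m_standard=2.00)
```

    Because the relation needs only proton counts and molar masses, no identical reference standard of the unstable analyte is required, which is exactly the advantage the abstract highlights over GC/LC calibration.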

  8. Integrating human and ecological risk assessment: application to the cyanobacterial harmful algal bloom problem.

    PubMed

    Orme-Zavaleta, Jennifer; Munns, Wayne R

    2008-01-01

    Environmental and public health policy continues to evolve in response to new and complex social, economic and environmental drivers. Globalization and centralization of commerce, evolving patterns of land use (e.g., urbanization, deforestation), and technological advances in such areas as manufacturing and development of genetically modified foods have created new and complex classes of stressors and risks (e.g., climate change, emergent and opportunist disease, sprawl, genomic change). In recognition of these changes, environmental risk assessment and its use are changing from stressor-endpoint specific assessments used in command and control types of decisions to an integrated approach for application in community-based decisions. As a result, the process of risk assessment and supporting risk analyses are evolving to characterize the human-environment relationship. Integrating risk paradigms combine the process of risk estimation for humans, biota, and natural resources into one assessment to improve the information used in environmental decisions (Suter et al. 2003b). Benefits of this approach include a broader, system-wide evaluation that considers the interacting effects of stressors on humans and the environment, as well as the interactions between these entities. To improve our understanding of the linkages within complex systems, risk assessors will need to rely on a suite of techniques for conducting rigorous analyses characterizing the exposure and effects relationships between stressors and biological receptors. Many of the analytical techniques routinely employed are narrowly focused and unable to address the complexities of an integrated assessment. In this paper, we describe an approach to integrated risk assessment, and discuss qualitative community modeling and Probabilistic Relational Modeling techniques that address these limitations and evaluate their potential for use in an integrated risk assessment of cyanobacteria.

  9. Assessment of acute myocarditis by cardiac magnetic resonance imaging: Comparison of qualitative and quantitative analysis methods.

    PubMed

    Imbriaco, Massimo; Nappi, Carmela; Puglia, Marta; De Giorgi, Marco; Dell'Aversana, Serena; Cuocolo, Renato; Ponsiglione, Andrea; De Giorgi, Igino; Polito, Maria Vincenza; Klain, Michele; Piscione, Federico; Pace, Leonardo; Cuocolo, Alberto

    2017-10-26

    To compare cardiac magnetic resonance (CMR) qualitative and quantitative analysis methods for the noninvasive assessment of myocardial inflammation in patients with suspected acute myocarditis (AM). A total of 61 patients with suspected AM underwent coronary angiography and CMR. Qualitative analysis was performed applying Lake-Louise Criteria (LLC), followed by quantitative analysis based on the evaluation of edema ratio (ER) and global relative enhancement (RE). Diagnostic performance was assessed for each method by measuring the area under the curves (AUC) of the receiver operating characteristic analyses. The final diagnosis of AM was based on symptoms and signs suggestive of cardiac disease, evidence of myocardial injury as defined by electrocardiogram changes, elevated troponin I, exclusion of coronary artery disease by coronary angiography, and clinical and echocardiographic follow-up at 3 months after admission to the chest pain unit. In all patients, coronary angiography did not show significant coronary artery stenosis. Troponin I levels and creatine kinase were higher in patients with AM compared to those without (both P < .001). There were no significant differences among LLC, T2-weighted short inversion time inversion recovery (STIR) sequences, early (EGE), and late (LGE) gadolinium-enhancement sequences for diagnosis of AM. The AUC for qualitative (T2-weighted STIR 0.92, EGE 0.87 and LGE 0.88) and quantitative (ER 0.89 and global RE 0.80) analyses were also similar. Qualitative and quantitative CMR analysis methods show similar diagnostic accuracy for the diagnosis of AM. These findings suggest that a simplified approach using a shortened CMR protocol including only T2-weighted STIR sequences might be useful to rule out AM in patients with acute coronary syndrome and normal coronary angiography.
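
    The AUCs reported above are areas under receiver operating characteristic curves, which equal the probability that a randomly chosen patient scores higher than a randomly chosen control (the Mann-Whitney statistic). A minimal sketch of that computation is shown below; the edema-ratio values are invented to illustrate the calculation.

```python
# Hedged sketch: ROC AUC via the Mann-Whitney pairwise-comparison statistic.
# The edema-ratio values below are hypothetical illustration data.

def auc(scores_pos, scores_neg):
    """P(random positive scores above random negative); ties count 1/2."""
    wins = 0.0
    for p in scores_pos:
        for q in scores_neg:
            if p > q:
                wins += 1.0
            elif p == q:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical edema ratios: myocarditis patients vs. controls
er_myocarditis = [2.4, 2.1, 2.6, 1.9, 2.3]
er_controls = [1.6, 1.8, 2.0, 1.5, 1.7]

area = auc(er_myocarditis, er_controls)
```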

  10. 78 FR 9701 - Draft Joint Food and Drug Administration/Health Canada Quantitative Assessment of the Risk of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-11

    ... on the sources of L. monocytogenes contamination, the effects of individual manufacturing and/or... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration [Docket No. FDA-2012-N-1182] Draft Joint Food and Drug Administration/Health Canada Quantitative Assessment of the Risk of...

  11. QUANTITATIVE MORPHOLOGY

    EPA Science Inventory

    Abstract: In toxicology, the role of quantitative assessment of brain morphology can be understood in the context of two types of treatment-related alterations. One type of alteration is specifically associated with treatment and is not observed in control animals. Measurement ...

  12. Integrated probabilistic risk assessment for nanoparticles: the case of nanosilica in food.

    PubMed

    Jacobs, Rianne; van der Voet, Hilko; Ter Braak, Cajo J F

    Insight into risks of nanotechnology and the use of nanoparticles is an essential condition for the social acceptance and safe use of nanotechnology. One of the problems with which the risk assessment of nanoparticles is faced is the lack of data, resulting in uncertainty in the risk assessment. We attempt to quantify some of this uncertainty by expanding a previous deterministic study on nanosilica (5-200 nm) in food into a fully integrated probabilistic risk assessment. We use the integrated probabilistic risk assessment method, in which statistical distributions and bootstrap methods are used to quantify uncertainty and variability in the risk assessment. Given the large amount of uncertainty present, this probabilistic method, which separates variability from uncertainty, contributed to a more understandable risk assessment. We found that quantifying the uncertainties did not increase the perceived risk relative to the outcome of the deterministic study. We pinpointed particular aspects of the hazard characterization that contributed most to the total uncertainty in the risk assessment, suggesting that further research would benefit most from obtaining more reliable data on those aspects.
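
    The separation of uncertainty from variability described above is typically implemented as a two-dimensional Monte Carlo: an outer loop samples the uncertain parameters, an inner loop samples variability across individuals. The sketch below illustrates the structure only; all distributions and numbers are invented and are not the nanosilica study's inputs.

```python
# Hedged sketch: two-dimensional Monte Carlo separating parameter
# uncertainty (outer loop) from population variability (inner loop).
# Every distribution and constant here is a hypothetical illustration.
import random

random.seed(7)

N_UNCERTAINTY = 200   # outer draws: what we don't know (parameters)
N_VARIABILITY = 500   # inner draws: how individuals differ (exposure)

fractions_at_risk = []
for _ in range(N_UNCERTAINTY):
    # Uncertain effect threshold (mg/kg bw/day), known only imprecisely
    threshold = random.gauss(1.5, 0.3)
    # Variable individual intakes, lognormal across the population
    exceed = sum(
        1 for _ in range(N_VARIABILITY)
        if random.lognormvariate(-0.5, 0.8) > threshold
    )
    fractions_at_risk.append(exceed / N_VARIABILITY)

# Each outer draw yields one "fraction of the population at risk";
# the spread across outer draws expresses uncertainty about that fraction.
fractions_at_risk.sort()
median_risk = fractions_at_risk[N_UNCERTAINTY // 2]
p95_risk = fractions_at_risk[int(N_UNCERTAINTY * 0.95)]
```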

  13. Integrated computational model of the bioenergetics of isolated lung mitochondria

    PubMed Central

    Zhang, Xiao; Jacobs, Elizabeth R.; Camara, Amadou K. S.; Clough, Anne V.

    2018-01-01

    Integrated computational modeling provides a mechanistic and quantitative framework for describing lung mitochondrial bioenergetics. Thus, the objective of this study was to develop and validate a thermodynamically-constrained integrated computational model of the bioenergetics of isolated lung mitochondria. The model incorporates the major biochemical reactions and transport processes in lung mitochondria. A general framework was developed to model those biochemical reactions and transport processes. Intrinsic model parameters such as binding constants were estimated using previously published isolated enzymes and transporters kinetic data. Extrinsic model parameters such as maximal reaction and transport velocities were estimated by fitting the integrated bioenergetics model to published and new tricarboxylic acid cycle and respirometry data measured in isolated rat lung mitochondria. The integrated model was then validated by assessing its ability to predict experimental data not used for the estimation of the extrinsic model parameters. For example, the model was able to predict reasonably well the substrate and temperature dependency of mitochondrial oxygen consumption, kinetics of NADH redox status, and the kinetics of mitochondrial accumulation of the cationic dye rhodamine 123, driven by mitochondrial membrane potential, under different respiratory states. The latter required the coupling of the integrated bioenergetics model to a pharmacokinetic model for the mitochondrial uptake of rhodamine 123 from buffer. The integrated bioenergetics model provides a mechanistic and quantitative framework for 1) integrating experimental data from isolated lung mitochondria under diverse experimental conditions, and 2) assessing the impact of a change in one or more mitochondrial processes on overall lung mitochondrial bioenergetics. In addition, the model provides important insights into the bioenergetics and respiration of lung mitochondria and how they differ from

  14. Integrated computational model of the bioenergetics of isolated lung mitochondria.

    PubMed

    Zhang, Xiao; Dash, Ranjan K; Jacobs, Elizabeth R; Camara, Amadou K S; Clough, Anne V; Audi, Said H

    2018-01-01

    Integrated computational modeling provides a mechanistic and quantitative framework for describing lung mitochondrial bioenergetics. Thus, the objective of this study was to develop and validate a thermodynamically-constrained integrated computational model of the bioenergetics of isolated lung mitochondria. The model incorporates the major biochemical reactions and transport processes in lung mitochondria. A general framework was developed to model those biochemical reactions and transport processes. Intrinsic model parameters such as binding constants were estimated using previously published isolated enzymes and transporters kinetic data. Extrinsic model parameters such as maximal reaction and transport velocities were estimated by fitting the integrated bioenergetics model to published and new tricarboxylic acid cycle and respirometry data measured in isolated rat lung mitochondria. The integrated model was then validated by assessing its ability to predict experimental data not used for the estimation of the extrinsic model parameters. For example, the model was able to predict reasonably well the substrate and temperature dependency of mitochondrial oxygen consumption, kinetics of NADH redox status, and the kinetics of mitochondrial accumulation of the cationic dye rhodamine 123, driven by mitochondrial membrane potential, under different respiratory states. The latter required the coupling of the integrated bioenergetics model to a pharmacokinetic model for the mitochondrial uptake of rhodamine 123 from buffer. The integrated bioenergetics model provides a mechanistic and quantitative framework for 1) integrating experimental data from isolated lung mitochondria under diverse experimental conditions, and 2) assessing the impact of a change in one or more mitochondrial processes on overall lung mitochondrial bioenergetics. In addition, the model provides important insights into the bioenergetics and respiration of lung mitochondria and how they differ from

  15. Qualitative and Quantitative Distinctions in Personality Disorder

    PubMed Central

    Wright, Aidan G. C.

    2011-01-01

    The “categorical-dimensional debate” has catalyzed a wealth of empirical advances in the study of personality pathology. However, this debate is merely one articulation of a broader conceptual question regarding whether to define and describe psychopathology as a quantitatively extreme expression of normal functioning or as qualitatively distinct in its process. In this paper I argue that dynamic models of personality (e.g., object-relations, cognitive-affective processing system) offer the conceptual scaffolding to reconcile these seemingly incompatible approaches to characterizing the relationship between normal and pathological personality. I propose that advances in personality assessment that sample behavior and experiences intensively provide the empirical techniques, whereas interpersonal theory offers an integrative theoretical framework, for accomplishing this goal. PMID:22804676

  16. Plasmodium knowlesi transmission: integrating quantitative approaches from epidemiology and ecology to understand malaria as a zoonosis.

    PubMed

    Brock, P M; Fornace, K M; Parmiter, M; Cox, J; Drakeley, C J; Ferguson, H M; Kao, R R

    2016-04-01

    The public health threat posed by zoonotic Plasmodium knowlesi appears to be growing: it is increasingly reported across South East Asia, and is the leading cause of malaria in Malaysian Borneo. Plasmodium knowlesi threatens progress towards malaria elimination as aspects of its transmission, such as spillover from wildlife reservoirs and reliance on outdoor-biting vectors, may limit the effectiveness of conventional methods of malaria control. The development of new quantitative approaches that address the ecological complexity of P. knowlesi, particularly through a focus on its primary reservoir hosts, will be required to control it. Here, we review what is known about P. knowlesi transmission, identify key knowledge gaps in the context of current approaches to transmission modelling, and discuss the integration of these approaches with clinical parasitology and geostatistical analysis. We highlight the need to incorporate the influences of fine-scale spatial variation, rapid changes to the landscape, and reservoir population and transmission dynamics. The proposed integrated approach would address the unique challenges posed by malaria as a zoonosis, aid the identification of transmission hotspots, provide insight into the mechanistic links between incidence and land use change and support the design of appropriate interventions.

  17. Quantitative Assessment of Commutability for Clinical Viral Load Testing Using a Digital PCR-Based Reference Standard

    PubMed Central

    Tang, L.; Sun, Y.; Buelow, D.; Gu, Z.; Caliendo, A. M.; Pounds, S.

    2016-01-01

    Given recent advances in the development of quantitative standards, particularly WHO international standards, efforts to better understand the commutability of reference materials have been made. Existing approaches in evaluating commutability include prediction intervals and correspondence analysis; however, the results obtained from existing approaches may be ambiguous. We have developed a “deviation-from-ideal” (DFI) approach to evaluate commutability of standards and applied it to the assessment of Epstein-Barr virus (EBV) load testing in four quantitative PCR assays, treating digital PCR as a reference assay. We then discuss advantages and limitations of the DFI approach as well as experimental design to best evaluate the commutability of an assay in practice. PMID:27076654

  18. Ozone (O3) Standards - Integrated Science Assessments from Review Completed in 2015

    EPA Pesticide Factsheets

    The integrated science assessment (ISA) is a comprehensive review, synthesis, and evaluation of the most policy-relevant science, including key science judgments that are important to inform the development of the risk and exposure assessments, and more.

  19. Laparoscopic training using a quantitative assessment and instructional system.

    PubMed

    Yamaguchi, T; Nakamura, R

    2018-04-28

    Laparoscopic surgery requires complex surgical skills; hence, surgeons require regular training to improve their surgical techniques. The quantitative assessment of a surgeon's skills and the provision of feedback are important processes for conducting effective training. The aim of this study was to develop an inexpensive training system that provides automatic technique evaluation and feedback. We detected the instrument using image processing of commercial web camera images and calculated the motion analysis parameters (MAPs) of the instrument to quantify performance features. Based on these results, we developed a method of evaluating the surgeon's skill level. The feedback system was developed using MAPs-based radar charts and scores for determining the skill level. These methods were evaluated using the videos of 38 surgeons performing a suturing task. There were significant differences in MAPs among surgeons; therefore, MAPs can be effectively used to quantify a surgeon's performance features. The results of skill evaluation and feedback differed greatly between skilled and unskilled surgeons, and it was possible to indicate points of improvement for the procedure performed in this study. Furthermore, the results obtained for certain novice surgeons were similar to those obtained for skilled surgeons. This system can be used to assess the skill level of surgeons, independently of years of experience, and to effectively convey the individual's current surgical skill level. We conclude that our system is useful as an inexpensive laparoscopic training system that might aid in skill improvement.
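
    Motion analysis parameters of the kind described can be computed from tracked instrument-tip positions. The sketch below uses common laparoscopy metrics (path length, speed, economy of motion); the parameter names are illustrative, not the exact MAPs defined in the study:

```python
import math

def motion_analysis_params(positions, dt=1.0 / 30):
    """Simple motion analysis parameters (MAPs) from a sequence of 2-D
    instrument-tip positions (e.g., pixels from webcam tracking),
    sampled every dt seconds."""
    steps = [math.dist(a, b) for a, b in zip(positions, positions[1:])]
    path_length = sum(steps)
    avg_speed = (path_length / len(steps)) / dt
    # Economy of motion: straight-line distance over actual path length
    # (1.0 = perfectly direct; lower values mean more wasted motion).
    economy = math.dist(positions[0], positions[-1]) / path_length
    return {"path_length": path_length, "avg_speed": avg_speed, "economy": economy}

direct = motion_analysis_params([(0, 0), (1, 0), (2, 0)])
detour = motion_analysis_params([(0, 0), (1, 1), (2, 0)])
print(direct["economy"])            # a direct move scores 1.0
print(round(detour["economy"], 3))  # a detour scores below 1.0
```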

  20. Integrating Outcomes Assessment into Optometry Education: A Strategic Guide for Enhancing Student Learning.

    ERIC Educational Resources Information Center

    Beck, Diane E.; Daum, Kent M.

    2003-01-01

    Outlines eight steps that will help optometry schools transition a faculty from "denial" of the need for assessment to "institutionalization": establish a collaborative environment, establish an infrastructure that makes assessment an integral activity, recruit a leader for full implementation of outcomes assessment, conduct a needs assessment,…

  1. A quantitative study of nanoparticle skin penetration with interactive segmentation.

    PubMed

    Lee, Onseok; Lee, See Hyun; Jeong, Sang Hoon; Kim, Jaeyoung; Ryu, Hwa Jung; Oh, Chilhwan; Son, Sang Wook

    2016-10-01

    In the last decade, the application of nanotechnology techniques has expanded within diverse areas such as pharmacology, medicine, and optical science. Despite such wide-ranging possibilities for implementation into practice, the mechanisms behind nanoparticle skin absorption remain unknown. Moreover, the main mode of investigation has been qualitative analysis. Using interactive segmentation, this study suggests a method of objectively and quantitatively analyzing the mechanisms underlying the skin absorption of nanoparticles. Silica nanoparticles (SNPs) were assessed using transmission electron microscopy and applied to the human skin equivalent model. Captured fluorescence images of this model were used to evaluate degrees of skin penetration. These images underwent interactive segmentation and image processing in addition to statistical quantitative analyses of calculated image parameters including the mean, integrated density, skewness, kurtosis, and area fraction. In images from both groups, the distribution area and intensity of fluorescent silica gradually increased in proportion to time. Since statistical significance was achieved after 2 days in the negative charge group and after 4 days in the positive charge group, the two charge groups differ in the timing of penetration. Furthermore, the quantity of silica per unit area showed a dramatic change after 6 days in the negative charge group. Although this quantitative result is consistent with results obtained by qualitative assessment, it is meaningful in that it was established by statistical analysis of parameters quantified through image processing. The present study suggests that the surface charge of SNPs could play an important role in the percutaneous absorption of NPs. These findings can help achieve a better understanding of the percutaneous transport of NPs. In addition, these results provide important guidance for the design of NPs for biomedical applications.
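
    The image parameters named in the abstract can be computed from segmented pixel intensities. The definitions below follow common image-analysis conventions (e.g., ImageJ-style integrated density as the sum of intensities); they are assumptions about the metrics, not the study's own code:

```python
import statistics

def fluorescence_params(pixels, threshold=0):
    """Compute mean, integrated density, skewness, kurtosis, and area
    fraction from a flattened list of pixel intensities."""
    n = len(pixels)
    mean = statistics.fmean(pixels)
    sd = statistics.pstdev(pixels)
    z = [(p - mean) / sd for p in pixels] if sd else [0.0] * n
    return {
        "mean": mean,
        "integrated_density": sum(pixels),          # sum of intensities
        "skewness": sum(v ** 3 for v in z) / n,
        "kurtosis": sum(v ** 4 for v in z) / n - 3,  # excess kurtosis
        "area_fraction": sum(p > threshold for p in pixels) / n,
    }

params = fluorescence_params([0, 0, 0, 10, 20, 30], threshold=5)
print(params["mean"], params["integrated_density"], params["area_fraction"])
```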

  2. Basic concepts in three-part quantitative assessments of undiscovered mineral resources

    USGS Publications Warehouse

    Singer, D.A.

    1993-01-01

    Since 1975, mineral resource assessments have been made for over 27 areas covering 5×10⁶ km² at various scales using what is now called the three-part form of quantitative assessment. In these assessments, (1) areas are delineated according to the types of deposits permitted by the geology, (2) the amount of metal and some ore characteristics are estimated using grade and tonnage models, and (3) the number of undiscovered deposits of each type is estimated. Permissive boundaries are drawn for one or more deposit types such that the probability of a deposit lying outside the boundary is negligible, that is, less than 1 in 100,000 to 1,000,000. Grade and tonnage models combined with estimates of the number of deposits are the fundamental means of translating geologists' resource assessments into a language that economists can use. Estimates of the number of deposits explicitly represent the probability (or degree of belief) that some fixed but unknown number of undiscovered deposits exist in the delineated tracts. Estimates are by deposit type and must be consistent with the grade and tonnage model. Other guidelines for these estimates include (1) frequency of deposits from well-explored areas, (2) local deposit extrapolations, (3) counting and assigning probabilities to anomalies and occurrences, (4) process constraints, (5) relative frequencies of related deposit types, and (6) area spatial limits. In most cases, estimates are made subjectively, as they are in meteorology, gambling, and geologic interpretations. In three-part assessments, the estimates are internally consistent because delineated tracts are consistent with descriptive models, grade and tonnage models are consistent with descriptive models, as well as with known deposits in the area, and estimates of number of deposits are consistent with grade and tonnage models. All available information is used in the assessment, and uncertainty is explicitly represented. © 1993 Oxford University Press.
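
    The way a three-part assessment combines a deposit-count distribution with grade and tonnage models is typically a Monte Carlo convolution. The sketch below shows the idea with a made-up probability mass function and made-up lognormal models; none of the numbers come from an actual USGS assessment:

```python
import random
import statistics

random.seed(1)

# Elicited probability distribution for the number of undiscovered deposits
# in a delineated tract (illustrative).
n_deposits_pmf = {0: 0.2, 1: 0.4, 2: 0.3, 5: 0.1}

def simulate_contained_metal():
    """One Monte Carlo trial: draw a deposit count, then a tonnage and
    grade for each deposit, and sum the contained metal."""
    n = random.choices(list(n_deposits_pmf), weights=list(n_deposits_pmf.values()))[0]
    total = 0.0
    for _ in range(n):
        tonnage = random.lognormvariate(mu=16.0, sigma=1.5)  # tonnes of ore
        grade = random.lognormvariate(mu=-5.0, sigma=0.5)    # metal fraction
        total += tonnage * grade
    return total

trials = [simulate_contained_metal() for _ in range(20_000)]
p_zero = sum(t == 0 for t in trials) / len(trials)
print(f"mean contained metal: {statistics.fmean(trials):,.0f} t")
print(f"P(no undiscovered metal): {p_zero:.2f}")  # tracks the 0-deposit probability
```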

  3. Users' manual for the Hydroecological Integrity Assessment Process software (including the New Jersey Assessment Tools)

    USGS Publications Warehouse

    Henriksen, James A.; Heasley, John; Kennen, Jonathan G.; Nieswand, Steven

    2006-01-01

    Applying the Hydroecological Integrity Assessment Process involves four steps: (1) a hydrologic classification of relatively unmodified streams in a geographic area using long-term gage records and 171 ecologically relevant indices; (2) the identification of statistically significant, nonredundant, hydroecologically relevant indices associated with the five major flow components for each stream class; (3) the development of a stream-classification tool; and (4) the development of a hydrologic assessment tool. Four computer software tools have been developed.

  4. FRAMEWORK FOR THE INTEGRATION OF HEALTH AND ECOLOGICAL RISK ASSESSMENT

    EPA Science Inventory

    The World Health Organization's International Programme on Chemical Safety (IPCS), the Organization for Economic Cooperation and Development (OECD), and the U.S. Environmental Protection Agency have developed a collaborative partnership to foster integration; of assessment approa...

  5. Environmental probabilistic quantitative assessment methodologies

    USGS Publications Warehouse

    Crovelli, R.A.

    1995-01-01

    In this paper, four petroleum resource assessment methodologies are presented as possible pollution assessment methodologies, even though petroleum as a resource is desirable, whereas pollution is undesirable. A methodology is defined in this paper to consist of a probability model and a probabilistic method, where the method is used to solve the model. The following four basic types of probability models are considered: 1) direct assessment, 2) accumulation size, 3) volumetric yield, and 4) reservoir engineering. Three of the four petroleum resource assessment methodologies were written as microcomputer systems, viz. TRIAGG for direct assessment, APRAS for accumulation size, and FASPU for reservoir engineering. A fourth microcomputer system termed PROBDIST supports the three assessment systems. The three assessment systems have different probability models but the same type of probabilistic method. The advantages of the analytic method are computational speed and flexibility, making it ideal for a microcomputer. -from Author

  6. Diagnostic performance of semi-quantitative and quantitative stress CMR perfusion analysis: a meta-analysis.

    PubMed

    van Dijk, R; van Assen, M; Vliegenthart, R; de Bock, G H; van der Harst, P; Oudkerk, M

    2017-11-27

    Stress cardiovascular magnetic resonance (CMR) perfusion imaging is a promising modality for the evaluation of coronary artery disease (CAD) due to high spatial resolution and absence of radiation. Semi-quantitative and quantitative analysis of CMR perfusion are based on signal-intensity curves produced during the first-pass of gadolinium contrast. Multiple semi-quantitative and quantitative parameters have been introduced. Diagnostic performance of these parameters varies extensively among studies and standardized protocols are lacking. This study aims to determine the diagnostic accuracy of semi-quantitative and quantitative CMR perfusion parameters, compared to multiple reference standards. Pubmed, WebOfScience, and Embase were systematically searched using predefined criteria (3272 articles). A check for duplicates was performed (1967 articles). Eligibility and relevance of the articles were determined by two reviewers using pre-defined criteria. The primary data extraction was performed independently by two researchers with the use of a predefined template. Differences in extracted data were resolved by discussion between the two researchers. The quality of the included studies was assessed using the 'Quality Assessment of Diagnostic Accuracy Studies Tool' (QUADAS-2). True positives, false positives, true negatives, and false negatives were extracted or calculated from the articles. The principal summary measures used to assess diagnostic accuracy were sensitivity, specificity, and area under the receiver operating curve (AUC). Data were pooled according to analysis territory, reference standard and perfusion parameter. Twenty-two articles were eligible based on the predefined study eligibility criteria. The pooled diagnostic accuracy for segment-, territory- and patient-based analyses showed good diagnostic performance with sensitivity of 0.88, 0.82, and 0.83, specificity of 0.72, 0.83, and 0.76 and AUC of 0.90, 0.84, and 0.87, respectively. In per territory
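
    Pooling diagnostic accuracy from extracted 2×2 counts can be sketched as below. This is naive fixed-effect pooling of raw counts for illustration; a meta-analysis like this one would typically use a bivariate random-effects model, and the counts here are invented, not the paper's data:

```python
def pooled_accuracy(studies):
    """Pool (TP, FP, TN, FN) counts across studies and return overall
    sensitivity and specificity."""
    tp = sum(s[0] for s in studies)
    fp = sum(s[1] for s in studies)
    tn = sum(s[2] for s in studies)
    fn = sum(s[3] for s in studies)
    return tp / (tp + fn), tn / (tn + fp)

# (TP, FP, TN, FN) per study -- illustrative counts only.
sens, spec = pooled_accuracy([(45, 10, 40, 5), (30, 8, 52, 10), (25, 4, 66, 5)])
print(round(sens, 2), round(spec, 2))
```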

  7. "Could I return to my life?" Integrated Narrative Nursing Model in Education (INNE).

    PubMed

    Artioli, Giovanna; Foà, Chiara; Cosentino, Chiara; Sulla, Francesco; Sollami, Alfonso; Taffurelli, Chiara

    2018-03-28

    The Integrated Narrative Nursing Model (INNM) is an approach that integrates the qualitative methodology typical of the human sciences, with the quantitative methodology more often associated with the natural sciences. This complex model, which combines a focus on narrative with quantitative measures, has recently been effectively applied to the assessment of chronic patients. In this study, the model is applied to the planning phase of education (Integrated Narrative Nursing Education, INNE), and proves to be a valid instrument for the promotion of the current educational paradigm that is centered on the engagement of both the patient and the caregiver in their own path of care. The aim of this study is therefore to describe the nurse's strategy in the planning of an educational intervention by using the INNE model. The case of a 70-year-old woman with pulmonary neoplasm is described at her first admission to Hospice. Each step conducted by the reference nurse, who uses INNE to record the nurse-patient narrative and collect subsequent questionnaires in order to create a shared educational plan, is also described. The information collected was then analyzed, using a grounded methodology, at the following four levels: I. Needs Assessment, II. Narrative Diagnosis, III. Quantitative Outcome, IV. Integrated Outcome. Step IV, which is derived from the integration of all levels of analysis, allows a nurse to define, even graphically, the conceptual map of a patient's needs, resources and perspectives, in a completely tailored manner. The INNE model offers a valid methodological support for the professional who intends to educate the patient through an inter-subjective and engaged pathway, between the professional, their patient and the socio-relational context. It is a matter of adopting a complex vision that combines processes and methods that require a steady scientific basis and advanced methodological expertise with active listening and empathy

  8. Quantitative Assessment of Motor and Sensory/Motor Acquisition in Handicapped and Nonhandicapped Infants and Young Children. Volume IV: Application of the Procedures.

    ERIC Educational Resources Information Center

    Guess, Doug; And Others

    Three studies that applied quantitative procedures to measure motor and sensory/motor acquisition among handicapped and nonhandicapped infants and children are presented. In addition, a study concerning the replication of the quantitative procedures for assessing rolling behavior is described in a fourth article. The first study, by C. Janssen,…

  9. The linearized multistage model and the future of quantitative risk assessment.

    PubMed

    Crump, K S

    1996-10-01

    The linearized multistage (LMS) model has for over 15 years been the default dose-response model used by the U.S. Environmental Protection Agency (USEPA) and other federal and state regulatory agencies in the United States for calculating quantitative estimates of low-dose carcinogenic risks from animal data. The LMS model is in essence a flexible statistical model that can describe both linear and non-linear dose-response patterns, and that produces an upper confidence bound on the linear low-dose slope of the dose-response curve. Unlike its namesake, the Armitage-Doll multistage model, the parameters of the LMS do not correspond to actual physiological phenomena. Thus the LMS is 'biological' only to the extent that the true biological dose response is linear at low dose and that low-dose slope is reflected in the experimental data. If the true dose response is non-linear the LMS upper bound may overestimate the true risk by many orders of magnitude. However, competing low-dose extrapolation models, including those derived from 'biologically-based models' that are capable of incorporating additional biological information, have not shown evidence to date of being able to produce quantitative estimates of low-dose risks that are any more accurate than those obtained from the LMS model. Further, even if these attempts were successful, the extent to which more accurate estimates of low-dose risks in a test animal species would translate into improved estimates of human risk is questionable. Thus, it does not appear possible at present to develop a quantitative approach that would be generally applicable and that would offer significant improvements upon the crude bounding estimates of the type provided by the LMS model. Draft USEPA guidelines for cancer risk assessment incorporate an approach similar to the LMS for carcinogens having a linear mode of action. However, under these guidelines quantitative estimates of low-dose risks would not be developed for

  10. [Assessment of the impacts of soil erosion on water environment based on the integration of soil erosion process and landscape pattern].

    PubMed

    Liu, Yu; Wu, Bing-Fang; Zeng, Yuan; Zhang, Lei

    2013-09-01

    Integrating the effects of landscape pattern into the assessment of the impacts of soil erosion on the water environment is of practical methodological significance: it provides an approach for identifying a water body's sediment source areas, assessing the potential risk that sediment from on-site soil erosion is exported to the target water body, and evaluating the capacity of regional landscape pattern to prevent soil loss. In this paper, the RUSLE model was applied to simulate the on-site soil erosion rate. With consideration of the soil retention potential of vegetation cover and topography, a quantitative assessment was conducted of the impacts of soil erosion in the water source region of the middle route of the South-to-North Water Transfer Project on rivers and reservoirs, by delineating landscape pattern at the point (or grid cell) scale and the sub-watershed level. At the point (or grid cell) scale, the index of soil erosion impact intensity (I) was developed as an indicator of the potential risk of sediment export to the water bodies. At the sub-watershed level, the landscape leakiness index (LI) was employed to indicate the sediment retention capacity of a given landscape pattern. The results revealed that integrating the information of landscape pattern with the indices of soil erosion process could effectively reflect the spatial intensity of the impact of in situ soil erosion on water bodies. The LI was significantly exponentially correlated with the mean sediment retention capacity of the landscape and the mean vegetation coverage of the watershed, and the sediment yield at the sub-watershed scale was significantly correlated with the LI in an exponential regression. It could be concluded that the approach of delineating landscape pattern based on soil erosion process, and the integration of the information of landscape pattern with its soil retention potential, could provide a new approach for the risk evaluation of soil erosion.
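
    RUSLE itself is a simple multiplicative model, A = R·K·LS·C·P, evaluated per grid cell. The factor values below are illustrative assumptions, not values from the study area:

```python
def rusle(R, K, LS, C, P):
    """RUSLE annual soil loss A (t ha^-1 yr^-1) as the product of rainfall
    erosivity R, soil erodibility K, slope length/steepness LS,
    cover-management C, and support-practice P factors."""
    return R * K * LS * C * P

# Illustrative factor values for a single grid cell (not from the paper).
bare = rusle(R=4000, K=0.03, LS=1.2, C=1.0, P=1.0)
forested = rusle(R=4000, K=0.03, LS=1.2, C=0.003, P=1.0)
print(bare, forested)  # dense vegetation cover cuts the predicted loss sharply
```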

  11. The development of integrated diabetes care in the Netherlands: a multiplayer self-assessment analysis.

    PubMed

    Zonneveld, Nick; Vat, Lidewij E; Vlek, Hans; Minkman, Mirella M N

    2017-03-21

    In recent years, Dutch diabetes care has increasingly focused on improving the quality of care by introducing the concept of care groups (in Dutch: 'zorggroepen'), care pathways and improving cooperation with involved care professionals and patients. This study examined how participating actors in care groups assess the development of their diabetes services and the differences and similarities between different stakeholder groups. A self-evaluation study was performed within 36 diabetes care groups in the Netherlands. A web-based self-assessment instrument, based on the Development Model for Integrated Care (DMIC), was used to collect data among stakeholders of each care group. The DMIC defines nine clusters of integrated care and four phases of development. Statistical analysis was used to analyze the data. Respondents indicated that the diabetes care groups work together in well-organized multidisciplinary teams and there is clarity about one another's expertise, roles and tasks. The care groups can still develop on elements related to the management and monitoring of performance, quality of care and patient-centeredness. The results show differences (p < 0.01) between three stakeholders groups in how they assess their integrated care services; (1) core players, (2) managers/directors/coordinators and (3) players at a distance. Managers, directors and coordinators assessed more implemented integrated care activities than the other two stakeholder groups. This stakeholder group also placed their care groups in a further phase of development. Players at a distance assessed significantly less present elements and assessed their care group as less developed. The results show a significant difference between stakeholder groups in the assessment of diabetes care practices. This reflects that the professional disciplines and the roles of stakeholders influence the way they assess the development of their integrated care setting, or that certain stakeholder groups

  12. [Integrated health information system based on Resident Assessment Instruments].

    PubMed

    Frijters, D; Achterberg, W; Hirdes, J P; Fries, B E; Morris, J N; Steel, K

    2001-02-01

    The paper explores the meaning of Resident Assessment Instruments. It gives a summary of existing RAI instruments and derived applications. It argues that all of these form the basis for an integrated health information system for "chain care" (home care, home for the elderly care, nursing home care, mental health care and acute care). The primary application of RAI systems is the assessment of client care needs, followed by an analysis of the required and administered care with the objective of making an optimal individual care plan. On the basis of RAI, however, applications have been derived for reimbursement systems, quality improvement programs, accreditation, benchmarking, best practice comparison and care eligibility systems. These applications have become possible by the development, on the basis of the Minimum Data Set of RAI, of outcome measures (item scores, scales and indices), case-mix classifications and quality indicators. To illustrate the possibilities of outcome measures of RAI, we present a table and a figure with data from six Dutch nursing homes showing how social engagement is related to ADL and cognition. We argue that RAI/MDS assessment instruments comprise an integrated health information system because they have consistent terminology, common core items, and a common conceptual basis in a clinical approach that emphasizes the identification of functional problems.

  13. Integrating Quantitative Skills in Introductory Ecology: Investigations of Wild Bird Feeding Preferences

    ERIC Educational Resources Information Center

    Small, Christine J.; Newtoff, Kiersten N.

    2013-01-01

    Undergraduate biology education is undergoing dramatic changes, emphasizing student training in the "tools and practices" of science, particularly quantitative and problem-solving skills. We redesigned a freshman ecology lab to emphasize the importance of scientific inquiry and quantitative reasoning in biology. This multi-week investigation uses…

  14. Integrating Quantitative and Ethnographic Methods to Describe the Classroom. Report No. 5083.

    ERIC Educational Resources Information Center

    Malitz, David; And Others

    The debate between proponents of ethnographic and quantitative methodology in classroom observation is reviewed, and the respective strengths and weaknesses of the two approaches are discussed. These methodologies are directly compared in a study that conducted simultaneous ethnographic and quantitative observations on nine classrooms. It is…

  15. Integrated Science Assessment (ISA) for Lead (Second ...

    EPA Pesticide Factsheets

    EPA has announced that the Second External Review Draft of the Integrated Science Assessment (ISA) for Lead (Pb) has been made available for independent peer review and public review. This draft ISA represents a concise synthesis and evaluation of the most policy-relevant science and will ultimately provide the scientific bases for EPA’s decision regarding whether the current standards for Pb sufficiently protect public health and the environment. Lead (Pb) is one of six principal (or criteria) pollutants for which EPA has established NAAQS.

  16. Integrated Science Assessment (ISA) for Carbon Monoxide ...

    EPA Pesticide Factsheets

    EPA announced that the First External Review Draft of the Integrated Science Assessment (ISA) for Carbon Monoxide (CO) and related Annexes was made available for independent peer review and public review. This draft ISA document represents a concise synthesis and evaluation of the most policy-relevant science and will ultimately provide the scientific bases for EPA's decision regarding whether the current standards for CO sufficiently protect public health and the environment. The Integrated Plan for Review of the NAAQS for CO (U.S. EPA, 2008) identifies key policy-relevant questions that provide a framework for this review of the scientific evidence. These questions frame the entire review of the NAAQS, and thus are informed by both science and policy considerations. The ISA organizes and presents the scientific evidence such that it, when considered along with findings from risk analyses and policy considerations, will help the EPA address these questions during the NAAQS review:

  17. Energy use and carbon footprints differ dramatically for diverse wastewater-derived carbonaceous substrates: An integrated exploration of biokinetics and life-cycle assessment.

    PubMed

    Li, Yanbo; Wang, Xu; Butler, David; Liu, Junxin; Qu, Jiuhui

    2017-03-21

    Energy neutrality and reduction of carbon emissions are significant challenges to the enhanced sustainability of wastewater treatment plants (WWTPs). Harvesting energy from wastewater carbonaceous substrates can offset energy demands and enable net power generation; yet, there is limited research about how carbonaceous substrates influence the energy and carbon implications of WWTPs with integrated energy recovery at the systems level. Consequently, this research uses biokinetic modelling and life-cycle assessment methodology to explore this notion, by tracing and assessing the quantitative flows of energy embodied or captured, and by exploring the carbon footprint throughout an energy-intensive activated sludge process with integrated energy recovery facilities. The results indicate that energy use and carbon footprint per cubic meter of wastewater treated vary markedly with the carbon substrate. Compared with systems driven with proteins, carbohydrates or other short-chain fatty acids, systems fed with acetic acid realized energy neutrality with maximal net gain of power from methane combustion (0.198 kWh) and incineration of residual biosolids (0.153 kWh); and also achieved a negative carbon footprint (−72.6 g CO2). The findings from this work help us to better understand and develop new technical schemes for improving the energy efficiency of WWTPs by repurposing the stream of carbon substrates across systems.

  18. Exploring Phytoplankton Population Growth to Enhance Quantitative Literacy

    ERIC Educational Resources Information Center

    Baumgartner, Erin; Biga, Lindsay; Bledsoe, Karen; Dawson, James; Grammer, Julie; Howard, Ava; Snyder, Jeffrey

    2015-01-01

    Quantitative literacy is essential to biological literacy (and is one of the core concepts in "Vision and Change in Undergraduate Biology Education: A Call to Action"; AAAS 2009). Building quantitative literacy is a challenging endeavor for biology instructors. Integrating mathematical skills into biological investigations can help build…

  19. Integrated national-scale assessment of wildfire risk to human and ecological values

    Treesearch

    Matthew P. Thompson; David E. Calkin; Mark A. Finney; Alan A. Ager; Julie W. Gilbertson-Day

    2011-01-01

    The spatial, temporal, and social dimensions of wildfire risk are challenging U.S. federal land management agencies to meet societal needs while maintaining the health of the lands they manage. In this paper we present a quantitative, geospatial wildfire risk assessment tool, developed in response to demands for improved risk-based decision frameworks. The methodology...

  20. Elementary Writing Assessment Platforms: A Quantitative Examination of Online versus Offline Writing Performance of Fifth-Grade Students

    ERIC Educational Resources Information Center

    Heath, Vickie L.

    2013-01-01

    This quantitative study explored if significant differences exist between how fifth-grade students produce a written response to a narrative prompt using online versus offline writing platforms. The cultural and social trend of instructional and assessment writing paradigms in education is shifting to online writing platforms (National Assessment…

  1. Study Quality in SLA: An Assessment of Designs, Analyses, and Reporting Practices in Quantitative L2 Research

    ERIC Educational Resources Information Center

    Plonsky, Luke

    2013-01-01

    This study assesses research and reporting practices in quantitative second language (L2) research. A sample of 606 primary studies, published from 1990 to 2010 in "Language Learning" and "Studies in Second Language Acquisition," was collected and coded for designs, statistical analyses, reporting practices, and outcomes (i.e., effect…

  2. Closing the Loop: Involving Faculty in the Assessment of Scientific and Quantitative Reasoning Skills of Biology Majors

    ERIC Educational Resources Information Center

    Hurney, Carol A.; Brown, Justin; Griscom, Heather Peckham; Kancler, Erika; Wigtil, Clifton J.; Sundre, Donna

    2011-01-01

    The development of scientific and quantitative reasoning skills in undergraduates majoring in science, technology, engineering, and mathematics (STEM) is an objective of many courses and curricula. The Biology Department at James Madison University (JMU) assesses these essential skills in graduating biology majors by using a multiple-choice exam…

  3. Modeling logistic performance in quantitative microbial risk assessment.

    PubMed

    Rijgersberg, Hajo; Tromp, Seth; Jacxsens, Liesbeth; Uyttendaele, Mieke

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage times, temperatures, gas conditions, and their distributions) are determined. However, the logistic chain with its queues (storages, shelves) and mechanisms for ordering products is usually not taken into account. As a consequence, storage times (mutually dependent in successive steps in the chain) cannot be described adequately. This may have a great impact on the tails of risk distributions. Because food safety risks are generally very small, it is crucial to model the tails of (underlying) distributions as accurately as possible. Logistic performance can be modeled by describing the underlying planning and scheduling mechanisms in discrete-event modeling. This is common practice in operations research, specifically in supply chain management. In this article, we present the application of discrete-event modeling in the context of a QMRA for Listeria monocytogenes in fresh-cut iceberg lettuce. We show the potential value of discrete-event modeling in QMRA by calculating logistic interventions (modifications in the logistic chain) and determining their significance with respect to food safety.
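    The kind of discrete-event model the abstract describes can be sketched with a plain event queue. Here a product passes through two coupled storage stages (a distribution centre, then a retail shelf); the second storage time starts only when the first ends, so the two are dependent, which is the abstract's point. All rates and the growth coefficient are illustrative assumptions, not values from the study.

```python
import heapq
import random

# Minimal discrete-event sketch of a two-stage logistic chain.
random.seed(1)

def simulate(n_products, dc_rate=1.0, retail_rate=2.0):
    events = []  # (time, event_kind, product_id)
    for pid in range(n_products):
        heapq.heappush(events, (random.uniform(0.0, 5.0), "arrive_dc", pid))
    entered = {}
    total_storage = []
    while events:
        t, kind, pid = heapq.heappop(events)
        if kind == "arrive_dc":            # product enters DC storage
            entered[pid] = t
            heapq.heappush(events, (t + random.expovariate(dc_rate), "arrive_retail", pid))
        elif kind == "arrive_retail":      # moved onto the retail shelf
            heapq.heappush(events, (t + random.expovariate(retail_rate), "sold", pid))
        else:                              # sold: total storage time realised
            total_storage.append(t - entered[pid])
    return total_storage

storage_times = simulate(1000)
# Illustrative log-linear growth: 0.1 log10 CFU per day of storage, so
# the tail of the storage-time distribution drives the tail of the risk.
growth_log10 = [0.1 * t for t in storage_times]
```

    Sampling the two stage times jointly through the event loop, rather than as independent distributions, is what lets the model capture the mutually dependent storage times and the resulting risk tails.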

  4. OMICS DATA IN THE QUALITATIVE AND QUANTITATIVE CHARACTERIZATION OF THE MODE OF ACTION IN SUPPORT OF IRIS ASSESSMENTS

    EPA Science Inventory

    Knowledge and information generated using new tools/methods collectively called "Omics" technologies could have a profound effect on qualitative and quantitative characterizations of human health risk assessments.

    The suffix "Omics" is a descriptor used for a series of e...

  5. Quantitative computed tomography assessment of transfusional iron overload.

    PubMed

    Wood, John C; Mo, Ashley; Gera, Aakansha; Koh, Montre; Coates, Thomas; Gilsanz, Vicente

    2011-06-01

    Quantitative computed tomography (QCT) has been proposed for iron quantification for more than 30 years; however, there has been little clinical validation. We compared liver attenuation by QCT with magnetic resonance imaging (MRI)-derived estimates of liver iron concentration (LIC) in 37 patients with transfusional siderosis. MRI and QCT measurements were performed as clinically indicated monitoring of LIC and vertebral bone density, respectively, over a 6-year period. The mean time difference between QCT and MRI studies was 14 d, with 25 studies performed on the same day. For liver attenuation outside the normal range, attenuation values rose linearly with LIC (r² = 0.94). However, intersubject variability in intrinsic liver attenuation prevented quantitation of LIC <8 mg/g dry weight of liver, and was the dominant source of measurement uncertainty. Calculated QCT and MRI accuracies were equivalent for LIC values approaching 22 mg/g dry weight, with QCT having superior performance at higher LICs. Although not suitable for monitoring patients with good iron control, QCT may nonetheless represent a viable technique for liver iron quantitation in patients with moderate to severe iron overload in regions where MRI resources are limited, because of its low cost, availability, and high throughput.
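    Inverting a linear attenuation-versus-LIC calibration of this kind is straightforward. The linear form and the 8 mg/g quantitation floor follow the abstract; the slope and the normal-attenuation ceiling below are hypothetical numbers for illustration only.

```python
# Invert a linear attenuation-vs-LIC calibration (sketch).
NORMAL_CEILING_HU = 65.0  # hypothetical upper bound of normal liver attenuation
SLOPE_HU_PER_MG_G = 2.0   # hypothetical slope (HU per mg/g dry weight)

def lic_from_attenuation(hu):
    """Estimate LIC (mg/g dry weight), or None below the quantitation floor."""
    lic = (hu - NORMAL_CEILING_HU) / SLOPE_HU_PER_MG_G
    # Intersubject variability in intrinsic attenuation prevents
    # quantitation below ~8 mg/g (per the abstract).
    return lic if lic >= 8.0 else None
```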

  6. A Scheme for the Integrated Assessment of Mitigation Options

    NASA Astrophysics Data System (ADS)

    Held, H.; Edenhofer, O.

    2003-04-01

    After some consensus has been achieved that the global mean temperature will have increased by 1.4 to 5.8 °C at the end of this century in case of continued "business as usual" greenhouse gas emissions, society has to decide if or which mitigation measures should be taken. A new integrated assessment project on this very issue will be started at PIK in spring 2003. The assessment will cover economic aspects as well as potential side effects of various measures. In the economic module, the effects of investment decisions on technological innovation will be explicitly taken into account. Special emphasis will be put on the issue of uncertainty. Here we distinguish the uncertainty related to the Integrated Assessment modules, including the economic module, from the fact that no over-complex system can be fully captured by a model. Therefore, a scheme for the assessment of the "residual", the non-modelled part of the system, needs to be worked out. The scheme must be truly interdisciplinary, i.e. must be applicable to at least the natural science and the economic aspects. A scheme based on meta-principles like minimum persistence, ubiquity, or irreversibility of potential measures appears to be a promising candidate. An implementation of ubiquity as at present successfully operated in environmental chemistry may serve as a guideline [1]. Here, the best-known mechanism within a complex impact chain of potentially harmful chemicals, their transport, is captured by a reaction-diffusion mechanism [2]. [1] M. Scheringer, Persistence and spatial range as endpoints of an exposure-based assessment of organic chemicals. Environ. Sci. Technol. 30: 1652-1659 (1996). [2] H. Held, Robustness of spatial ranges of environmental chemicals with respect to model dimension, accepted for publication in Stoch. Environ. Res. Risk Assessment.

  7. Examining the Role of Numeracy in College STEM Courses: Results from the Quantitative Reasoning for College Science (QuaRCS) Assessment Instrument

    NASA Astrophysics Data System (ADS)

    Follette, Katherine B.; McCarthy, Donald W.; Dokter, Erin F.; Buxner, Sanlyn; Prather, Edward E.

    2016-01-01

    Is quantitative literacy a prerequisite for science literacy? Can students become discerning voters, savvy consumers and educated citizens without it? Should college science courses for nonmajors be focused on "science appreciation", or should they engage students in the messy quantitative realities of modern science? We will present results from the recently developed and validated Quantitative Reasoning for College Science (QuaRCS) Assessment, which probes both quantitative reasoning skills and attitudes toward mathematics. Based on data from nearly two thousand students enrolled in nineteen general education science courses, we show that students in these courses did not demonstrate significant skill or attitude improvements over the course of a single semester, but find encouraging evidence for longer term trends.

  8. Critical methodological factors in diagnosing minimal residual disease in hematological malignancies using quantitative PCR.

    PubMed

    Nyvold, Charlotte Guldborg

    2015-05-01

    Hematological malignancies are a heterogeneous group of cancers with respect to both presentation and prognosis, and many subtypes are nowadays associated with aberrations that make up excellent molecular targets for the quantification of minimal residual disease. The quantitative PCR methodology is outstanding in terms of sensitivity, specificity and reproducibility and thus an excellent choice for minimal residual disease assessment. However, the methodology still has pitfalls that should be carefully considered when the technique is integrated in a clinical setting.

  9. Applicability of integrated cell culture reverse transcriptase quantitative PCR (ICC-RTqPCR) for the simultaneous detection of the four human enteric enterovirus species in disinfection studies

    EPA Science Inventory

    A newly developed integrated cell culture reverse transcriptase quantitative PCR (ICC-RTqPCR) method and its applicability in UV disinfection studies is described. This method utilizes a singular cell culture system coupled with four RTqPCR assays to detect infectious serotypes t...

  10. Quantitative assessment of locomotive syndrome by the loco-check questionnaire in older Japanese females

    PubMed Central

    Noge, Sachiko; Ohishi, Tatsuo; Yoshida, Takuya; Kumagai, Hiromichi

    2017-01-01

    [Purpose] Locomotive syndrome (LS) is a condition in which older people may require care services because of problems with locomotive organs. This study examined whether the loco-check, a 7-item questionnaire, is useful for quantitatively assessing the severity of LS. [Subjects and Methods] Seventy-one community-dwelling Japanese females aged 64–96 years (81.7 ± 8.0 years) participated in this study. The associations of the loco-check with thigh muscle mass measured by X-ray CT, physical performance, nutritional status, and quality of life (QOL) were investigated. [Results] The results showed that the number of times that “yes” was selected in the loco-check was significantly correlated with thigh muscle mass, major measures of physical performance, nutritional status, and QOL. This number was also significantly larger in the participants experiencing falling, fracture, and lumbar pain than in those without these episodes. [Conclusion] These results suggest that the loco-check might be useful for quantitatively evaluating LS. PMID:28932003
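    The score examined here is simply the count of "yes" answers across the seven loco-check items. A minimal sketch (item wording omitted; the 7-item length is from the abstract):

```python
def loco_check_score(answers):
    """Count 'yes' answers across the 7 loco-check items (True = 'yes')."""
    answers = list(answers)
    if len(answers) != 7:
        raise ValueError("loco-check has exactly 7 items")
    return sum(bool(a) for a in answers)
```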

  11. An electronic portfolio for quantitative assessment of surgical skills in undergraduate medical education.

    PubMed

    Sánchez Gómez, Serafín; Ostos, Elisa María Cabot; Solano, Juan Manuel Maza; Salado, Tomás Francisco Herrero

    2013-05-06

    We evaluated a newly designed electronic portfolio (e-Portfolio) that provided quantitative evaluation of surgical skills. Medical students at the University of Seville used the e-Portfolio on a voluntary basis for evaluation of their performance in undergraduate surgical subjects. Our new web-based e-Portfolio was designed to evaluate surgical practical knowledge and skills targets. Students recorded each activity on a form, attached evidence, and added their reflections. Students self-assessed their practical knowledge using qualitative criteria (yes/no), and graded their skills according to complexity (basic/advanced) and participation (observer/assistant/independent). A numerical value was assigned to each activity, and the values of all activities were summated to obtain the total score. The application automatically displayed quantitative feedback. We performed qualitative evaluation of the perceived usefulness of the e-Portfolio and quantitative evaluation of the targets achieved. Thirty-seven of 112 students (33%) used the e-Portfolio, of whom 87% reported that they understood the methodology of the portfolio. All students reported an improved understanding of their learning objectives resulting from the numerical visualization of progress, all students reported that the quantitative feedback encouraged their learning, and 79% of students felt that their teachers were more available because they were using the e-Portfolio. Only 51.3% of students reported that the reflective aspects of learning were useful. Individual students achieved a maximum of 65% of the total targets and 87% of the skills targets. The mean total score was 345 ± 38 points. For basic skills, 92% of students achieved the maximum score for participation as an independent operator, and all achieved the maximum scores for participation as an observer and assistant. For complex skills, 62% of students achieved the maximum score for participation as an independent operator, and 98% achieved…
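    The additive scoring scheme described here can be sketched as follows. The specific point values for complexity and participation are hypothetical, since the abstract only states that each activity received a numerical value and that the values were summed.

```python
# Hypothetical point values; the e-Portfolio's actual weights are not given.
COMPLEXITY = {"basic": 1, "advanced": 2}
PARTICIPATION = {"observer": 1, "assistant": 2, "independent": 3}

def activity_score(complexity, participation):
    """Value of one logged activity, from its complexity and participation level."""
    return COMPLEXITY[complexity] * PARTICIPATION[participation]

def total_score(activities):
    """Sum the values of all logged (complexity, participation) activities."""
    return sum(activity_score(c, p) for c, p in activities)

log = [("basic", "observer"), ("basic", "independent"), ("advanced", "assistant")]
```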

  12. Towards an integrated environmental risk assessment of emissions from ships' propulsion systems.

    PubMed

    Blasco, Julián; Durán-Grados, Vanesa; Hampel, Miriam; Moreno-Gutiérrez, Juan

    2014-05-01

    Large ships, particularly container ships, tankers, bulk carriers and cruise ships are significant individual contributors to air pollution. The European Environment Agency recognizes that air pollution in Europe is a local, regional and transborder problem caused by the emission of specific pollutants, which either directly or through chemical reactions lead to negative impacts, such as damage to human health and ecosystems. In the Marine Strategy Framework Directive 2008/56/EC of the European Parliament emissions from ships are mentioned explicitly in the list of pressures and impacts that should be reduced or minimized to maintain or obtain a good ecological status. While SOx and NOx contribute mainly to ocean and soil acidification and climate change, PM (particularly ultrafine particles in the range of nanoparticles) has the potential to act more directly on human and ecosystem health. Thus, in terms of risk assessment, one of the most dangerous atmospheric aerosols for environmental and human health is in the size range of nanoparticles. To our knowledge, no study has been carried out on the effects of the fraction that ends up in the water column and to which aquatic and sediment-dwelling organisms are exposed. Therefore, an integrated environmental risk assessment of the effects of emissions from oceangoing ships including the aquatic compartment is necessary. Research should focus on the quantitative and qualitative determination of pollutant emissions from ships and their distribution and fate. This will include the in situ measurement of emissions in ships in order to derive realistic emission factors, and the application of atmospheric and oceanographic transportation and chemistry models.

  13. Multisite formative assessment for the Pathways study to prevent obesity in American Indian schoolchildren123

    PubMed Central

    Gittelsohn, Joel; Evans, Marguerite; Story, Mary; Davis, Sally M; Metcalfe, Lauve; Helitzer, Deborah L; Clay, Theresa E

    2016-01-01

    We describe the formative assessment process, using an approach based on social learning theory, for the development of a school-based obesity-prevention intervention into which cultural perspectives are integrated. The feasibility phase of the Pathways study was conducted in multiple settings in 6 American Indian nations. The Pathways formative assessment collected both qualitative and quantitative data. The qualitative data identified key social and environmental issues and enabled local people to express their own needs and views. The quantitative, structured data permitted comparison across sites. Both types of data were integrated by using a conceptual and procedural model. The formative assessment results were used to identify and rank the behavioral risk factors that were to become the focus of the Pathways intervention and to provide guidance on developing common intervention strategies that would be culturally appropriate and acceptable to all sites. PMID:10195601

  14. Efficient Assessment of the Environment for Integral Urban Water Management

    NASA Astrophysics Data System (ADS)

    Rost, Grit; Londong, Jörg

    2015-04-01

    Introduction: Sustainable water supply and sanitation are fundamental, especially in countries that are particularly vulnerable to water-related problems. The Integrated Water Resources Management (IWRM) approach ensures that water management is organised in a transdisciplinary way, taking into account the river basin, the hydrologic system and the associated institutional context, such as culture, law and economics. The main objective of IWRM is the sustainable management of water resources quality and quantity (GWP and INBO 2009). However, there are further important targets in the sustainable use of water resources. New sanitation systems focus on adding value and keeping essential resources in circular flow. Focussing on material fluxes can contribute to water quality, food security, sustainable use of renewable energy, adaptation to water scarcity, and also to the rising demand for water and sanitation caused by rapid urban and suburban growth (Price and Vojinović 2011; Rost et al 2013; Stäudel et al 2014). Problem: There are several planning tools for IWRM as well as for urban water management, but a systematic assessment tool that completes the IWRM approach for the resource-oriented concept is missing. The assessment of crucial indicators requires a lot of data from different disciplines, at different scales of detail, and of different accuracy and ease of acquisition (Karthe et al 2014). On the one hand there will be data abundance, and on the other hand data can be unavailable or unfeasible to use, for example because of scale and specification (Rost et al 2013). Such a complex integrated concept requires a clearly worked-out structure for managing the process and setting priorities. Purpose: To bring system into the complex planning process, a toolbox model is being developed. The assessment of the environmental screening (one part of the toolbox) is presented in this paper. The first step of the assessment leans on the assertion that each of the…

  15. Optimization of Dual-Energy Xenon-CT for Quantitative Assessment of Regional Pulmonary Ventilation

    PubMed Central

    Fuld, Matthew K.; Halaweish, Ahmed; Newell, John D.; Krauss, Bernhard; Hoffman, Eric A.

    2013-01-01

    Objective Dual-energy X-ray computed tomography (DECT) offers visualization of the airways and quantitation of regional pulmonary ventilation using a single breath of inhaled xenon gas. In this study we seek to optimize scanning protocols for DECT xenon gas ventilation imaging of the airways and lung parenchyma and to characterize the quantitative nature of the developed protocols through a series of test-object and animal studies. Materials and Methods The Institutional Animal Care and Use Committee approved all animal studies reported here. A range of xenon-oxygen gas mixtures (0, 20, 25, 33, 50, 66, 100%; balance oxygen) were scanned in syringes and balloon test-objects to optimize the delivered gas mixture for assessment of regional ventilation while allowing for the development of improved three-material decomposition calibration parameters. Additionally, to alleviate gravitational effects on xenon gas distribution, we replaced a portion of the oxygen in the xenon/oxygen gas mixture with helium and compared gas distributions in a rapid-prototyped human central-airway test-object. Additional syringe tests were performed to determine if the introduction of helium had any effect on xenon quantitation. Xenon gas mixtures were delivered to anesthetized swine in order to assess airway and lung parenchymal opacification while evaluating various DECT scan acquisition settings. Results Attenuation curves for xenon were obtained from the syringe test objects and were used to develop improved three-material decomposition parameters (HU enhancement per percent xenon: Within the chest phantom: 2.25 at 80 kVp, 1.7 at 100 kVp, and 0.76 at 140 kVp with tin filtration; In open air: 2.5 at 80 kVp, 1.95 at 100 kVp, and 0.81 at 140 kVp with tin filtration). The addition of helium improved the distribution of xenon gas to the gravitationally non-dependent portion of the airway tree test-object, while not affecting quantitation of xenon in the three-material decomposition DECT. 40%Xe…
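    Given calibration slopes like those reported above, regional xenon concentration follows from the measured enhancement by a simple division. The slopes below are the chest-phantom values quoted in the abstract (the 140 kVp entry is with tin filtration); treating them as a lookup table is our illustrative simplification.

```python
# HU enhancement per percent xenon, chest-phantom calibration (from the abstract).
SLOPE_HU_PER_PCT_XE = {80: 2.25, 100: 1.70, 140: 0.76}  # 140 kVp is tin-filtered

def xenon_percent(delta_hu, kvp):
    """Estimate regional xenon concentration (%) from measured HU enhancement."""
    return delta_hu / SLOPE_HU_PER_PCT_XE[kvp]

# e.g. a 45 HU enhancement at 80 kVp corresponds to 20% xenon
```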

  16. Application of an Integrated Assessment Model to the Kevin Dome site, Montana

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nguyen, Minh; Zhang, Ye; Carey, James William

    The objectives of the Integrated Assessment Model are to enable the Fault Swarm algorithm in the National Risk Assessment Partnership (NRAP), ensure faults are working in the NRAP-IAM tool, calculate hypothetical fault leakage in NRAP-IAM, and compare leakage rates to Eclipse simulations.

  17. Quantitative assessment of scatter correction techniques incorporated in next generation dual-source computed tomography

    NASA Astrophysics Data System (ADS)

    Mobberley, Sean David

    Accurate, cross-scanner assessment of in-vivo air density used to quantitatively assess amount and distribution of emphysema in COPD subjects has remained elusive. Hounsfield units (HU) within tracheal air can be considerably more positive than -1000 HU. With the advent of new dual-source scanners which employ dedicated scatter correction techniques, it is of interest to evaluate how the quantitative measures of lung density compare between dual-source and single-source scan modes. This study has sought to characterize in-vivo and phantom-based air metrics using dual-energy computed tomography technology where the nature of the technology has required adjustments to scatter correction. Anesthetized ovine (N=6), swine (N=13: more human-like rib cage shape), a lung phantom and a thoracic phantom were studied using a dual-source MDCT scanner (Siemens Definition Flash). Multiple dual-source dual-energy (DSDE) and single-source (SS) scans taken at different energy levels and scan settings were acquired for direct quantitative comparison. Density histograms were evaluated for the lung, tracheal, water and blood segments. Image data were obtained at 80, 100, 120, and 140 kVp in the SS mode (B35f kernel) and at 80, 100, 140, and 140-Sn (tin filtered) kVp in the DSDE mode (B35f and D30f kernels), in addition to variations in dose, rotation time, and pitch. To minimize the effect of cross-scatter, the phantom scans in the DSDE mode were obtained by reducing the tube current of one of the tubes to its minimum (near zero) value. When using image data obtained in the DSDE mode, the median HU values in the tracheal regions of all animals and the phantom were consistently closer to -1000 HU regardless of reconstruction kernel (chapters 3 and 4). Similarly, HU values of water and blood were consistently closer to their nominal values of 0 HU and 55 HU respectively. When using image data obtained in the SS mode, the air CT numbers demonstrated a consistent positive shift of up to 35 HU…

  18. Independent Assessment of Instrumentation for ISS On-Orbit NDE. Volume 1

    NASA Technical Reports Server (NTRS)

    Madaras, Eric I

    2013-01-01

    The International Space Station (ISS) Structural and Mechanical Systems Manager requested that the NASA Engineering and Safety Center (NESC) provide a quantitative assessment of commercially available nondestructive evaluation (NDE) instruments for potential application to the ISS. This work supports risk mitigation as outlined in the ISS Integrated Risk Management Application (IRMA) Watch Item #4669, which addresses the requirement for structural integrity after an ISS pressure wall leak in the event of a penetration due to micrometeoroid or orbital debris (MMOD) impact. This document contains the outcome of the NESC assessment.

  19. The Moment of Learning: Quantitative Analysis of Exemplar Gameplay Supports CyGaMEs Approach to Embedded Assessment

    ERIC Educational Resources Information Center

    Reese, Debbie Denise; Tabachnick, Barbara G.

    2010-01-01

    In this paper, the authors summarize a quantitative analysis demonstrating that the CyGaMEs toolset for embedded assessment of learning within instructional games measures growth in conceptual knowledge by quantifying player behavior. CyGaMEs stands for Cyberlearning through GaME-based, Metaphor Enhanced Learning Objects. Some scientists of…

  20. Integrated Science Assessment (ISA) for Sulfur Oxides ...

    EPA Pesticide Factsheets

    EPA has announced that the Second External Review Draft of the Integrated Science Assessment (ISA) for Sulfur Oxides – Health Criteria has been made available for independent peer review and public review. This draft ISA document represents a concise synthesis and evaluation of the most policy-relevant science and will ultimately provide the scientific bases for EPA’s decision regarding whether the current standard for SO2 sufficiently protects public health. Sulfur oxides make up one of the six principal (or “criteria”) pollutants for which EPA has established national ambient air quality standards (NAAQS).