Quantitative method of medication system interface evaluation.
Pingenot, Alleene Anne; Shanteau, James; Pingenot, James D F
2007-01-01
The objective of this study was to develop a quantitative method of evaluating the user interface for medication system software. A detailed task analysis provided a description of user goals and essential activity. A structural fault analysis was used to develop a detailed description of the system interface. Nurses experienced with use of the system under evaluation provided estimates of failure rates for each point in this simplified fault tree. Means of the estimated failure rates provided quantitative data for the fault analysis. The authors note that, although failures of steps in the program were frequent, participants reported numerous methods of working around these failures, so that overall system failure was rare. However, frequent process failure can affect the time required for processing medications, making a system inefficient. This method of interface analysis, called the Software Efficiency Evaluation and Fault Identification Method, provides quantitative information with which prototypes can be compared and problems within an interface identified.
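The abstract does not specify how the step-level estimates were aggregated; the following is a minimal sketch of one plausible fault-tree roll-up, assuming independent step failures and per-step workaround success rates (all numbers hypothetical):

# Each step: estimated probability the step fails, and probability that a
# workaround rescues a failed step. A step only sinks the task when it
# fails AND no workaround succeeds.
steps = [
    {"fail": 0.10, "rescue": 0.95},  # frequent failure, reliable workaround
    {"fail": 0.02, "rescue": 0.50},
    {"fail": 0.05, "rescue": 0.90},
]

task_success = 1.0
for step in steps:
    effective_fail = step["fail"] * (1 - step["rescue"])
    task_success *= 1 - effective_fail

print(f"overall task failure = {1 - task_success:.4f}")  # ~0.02
# Frequent step failures yet rare overall failure, matching the authors'
# observation; the cost of the workarounds appears as extra processing time.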
Evaluating Rapid Models for High-Throughput Exposure Forecasting (SOT)
High-throughput exposure screening models can provide quantitative predictions for thousands of chemicals; however, these predictions must be systematically evaluated for predictive ability. Without the capability to make quantitative, albeit uncertain, forecasts of exposure, the ...
A Systematic Quantitative-Qualitative Model: How To Evaluate Professional Services
ERIC Educational Resources Information Center
Yoda, Koji
1973-01-01
The proposed evaluation model provides for the assignment of relative weights to each criterion, and establishes a weighting system for calculating a quantitative-qualitative raw score for each service activity of a faculty member being reviewed. (Author)
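A minimal sketch of the weighted raw-score calculation such a model proposes; the criteria, weights, and ratings below are hypothetical, not taken from Yoda's model:

# Weighted quantitative-qualitative raw score for one service activity.
weights = {"relevance": 0.40, "quality": 0.35, "impact": 0.25}  # sum to 1
ratings = {"relevance": 4, "quality": 5, "impact": 3}           # 1-5 scale

raw_score = sum(weights[c] * ratings[c] for c in weights)
print(f"raw score = {raw_score:.2f}")  # 4.10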
[Doppler echocardiography of tricuspid insufficiency. Methods of quantification].
Loubeyre, C; Tribouilloy, C; Adam, M C; Mirode, A; Trojette, F; Lesbre, J P
1994-01-01
Evaluation of tricuspid incompetence has benefited considerably from the development of Doppler ultrasound. In addition to direct analysis of the valves, which provides information about the mechanism involved, this method can provide an accurate evaluation, mainly through use of the Doppler mode. In addition to new criteria under evaluation (mainly the convergence zone of the regurgitant jet), several indices are recognised as good quantitative parameters: extension of the regurgitant jet into the right atrium, anterograde tricuspid flow, the laminar nature of the regurgitant flow, and analysis of the flow in the supra-hepatic veins. The evaluation nevertheless remains semi-quantitative, since calculation of the regurgitation fraction from pulsed Doppler does not seem to be reliable; an accurate semi-quantitative evaluation is made possible by careful and consistent use of all the available criteria. The authors discuss the value of the various evaluation criteria mentioned in the literature and try to define a practical approach.
Quantitative Analysis of High-Quality Officer Selection by Commandants Career-Level Education Board
2017-03-01
due to Marines being evaluated before the end of their initial service commitment. Our research utilizes quantitative variables to analyze the...not provide detailed information why. The photograph analysis in this research is strictly limited to a quantitative analysis in...
A grid for a precise analysis of daily activities.
Wojtasik, V; Olivier, C; Lekeu, F; Quittre, A; Adam, S; Salmon, E
2010-01-01
Assessment of daily living activities is essential in patients with Alzheimer's disease. Most current tools quantitatively assess overall ability but provide little qualitative information on individual difficulties. Only a few tools allow therapists to evaluate stereotyped activities and record different types of errors. We capitalised on the Kitchen Activity Assessment to design a widely applicable analysis grid that provides both qualitative and quantitative data on activity performance. A cooking activity was videotaped in 15 patients with dementia and assessed according to the different steps in the execution of the task. The evaluations obtained with our grid showed good correlations between raters, between versions of the grid and between sessions. Moreover, the degree of independence obtained with our analysis of the task correlated with the Kitchen Activity Assessment score and with a global score of cognitive functioning. We conclude that assessment of a daily living activity with this analysis grid is reproducible and relatively independent of the therapist, and thus provides quantitative and qualitative information useful for both evaluating and caring for demented patients.
An Elephant in the Room: Bias in Evaluating a Required Quantitative Methods Course
ERIC Educational Resources Information Center
Fletcher, Joseph F.; Painter-Main, Michael A.
2014-01-01
Undergraduate Political Science programs often require students to take a quantitative research methods course. Such courses are typically among the most poorly rated. This can be due, in part, to the way in which courses are evaluated. Students are generally asked to provide an overall rating, which, in turn, is widely used by students, faculty,…
Applying Quantitative Approaches to the Formative Evaluation of Antismoking Campaign Messages
Parvanta, Sarah; Gibson, Laura; Forquer, Heather; Shapiro-Luft, Dina; Dean, Lorraine; Freres, Derek; Lerman, Caryn; Mallya, Giridhar; Moldovan-Johnson, Mihaela; Tan, Andy; Cappella, Joseph; Hornik, Robert
2014-01-01
This article shares an in-depth summary of a formative evaluation that used quantitative data to inform the development and selection of promotional ads for the antismoking communication component of a social marketing campaign. A foundational survey provided cross-sectional data to identify beliefs about quitting smoking that campaign messages should target, as well as beliefs to avoid. Pretesting draft ads against quantitative indicators of message effectiveness further facilitated the selection and rejection of final campaign ads. Finally, we consider lessons learned from the process of balancing quantitative methods and judgment to make formative decisions about more and less promising persuasive messages for campaigns. PMID:24817829
Critiquing qualitative research.
Beck, Cheryl Tatano
2009-10-01
The ability to critique research is a valuable skill that is fundamental to a perioperative nurse's ability to base his or her clinical practice on evidence derived from research. Criteria differ for critiquing a quantitative versus a qualitative study (ie, statistics are evaluated in a quantitative study, but not in a qualitative study). This article provides guidelines for assessing qualitative research. Excerpts from a published qualitative research report are summarized and then critiqued. Questions are provided that help evaluate different sections of a research study (eg, sample, data collection methods, data analysis).
Non-animal approaches for toxicokinetics in risk evaluations of food chemicals.
Punt, Ans; Peijnenburg, Ad A C M; Hoogenboom, Ron L A P; Bouwmeester, Hans
2017-01-01
The objective of the present work was to review the availability and predictive value of non-animal toxicokinetic approaches and to evaluate their current use in European risk evaluations of food contaminants, additives and food contact materials, as well as pesticides and medicines. Results revealed little use of quantitative animal or human kinetic data in risk evaluations of food chemicals, compared with pesticides and medicines. Risk evaluations of medicines provided sufficient in vivo kinetic data from different species to evaluate the predictive value of animal kinetic data for humans. These data showed a relatively poor correlation between the in vivo bioavailability in rats and dogs versus that in humans. In contrast, in vitro (human) kinetic data have been demonstrated to provide adequate predictions of the fate of compounds in humans, using appropriate in vitro-in vivo scalers and by integration of in vitro kinetic data with in silico kinetic modelling. Even though in vitro kinetic data were found to be occasionally included within risk evaluations of food chemicals, particularly results from Caco-2 absorption experiments and in vitro data on gut-microbial conversions, only minor use of in vitro methods for metabolism and quantitative in vitro-in vivo extrapolation methods was identified. Yet, such quantitative predictions are essential in the development of alternatives to animal testing as well as to increase human relevance of toxicological risk evaluations. Future research should aim at further improving and validating quantitative alternative methods for kinetics, thereby increasing regulatory acceptance of non-animal kinetic data.
Michalek, Lukas; Barner, Leonie; Barner-Kowollik, Christopher
2018-03-07
Well-defined polymer strands covalently tethered onto solid substrates determine the properties of the resulting functional interface. Herein, the current approaches to determining quantitative grafting densities are assessed. Based on a brief introduction to the key theories describing polymer brush regimes, a user's guide is provided for estimating maximum chain coverage and, importantly, for examining the most frequently employed approaches for determining grafting densities, i.e., dry thickness measurements, gravimetric assessment, and swelling experiments. An estimation of the reliability of these determination methods is provided by carefully evaluating their assumptions and assessing the stability of the underpinning equations. A practical access guide for comparatively and quantitatively evaluating the reliability of a given approach is thus provided, enabling the field to critically judge experimentally determined grafting densities and to avoid the reporting of grafting densities that fall outside the physically realistic parameter space. The assessment is concluded with a perspective on the development of advanced approaches for determination of grafting density, in particular, on single-chain methodologies. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Lawrenz, Frances; McCreath, Heather
Qualitative and quantitative evaluation procedures were used to compare two physical-science teacher inservice training programs. The two programs followed the master teacher training model espoused by NSF but used different types of master teachers and types of activities. The two evaluation procedures produced different results and together they provided a much clearer picture of the strengths and weaknesses of the two programs. Using only one approach or the other would have substantially altered the conclusions.
Daigneault, Pierre-Marc
2014-08-01
Stakeholder participation and evaluation use have attracted a lot of attention from practitioners, theorists and researchers. A common hypothesis is that participation is positively associated with evaluation use. Whereas the number of empirical studies conducted on this topic is impressive, quantitative research has held a minority position within this scientific production. This study mobilizes systematic review methods to 'map' the empirical literature that has quantitatively studied participation and use. The goal is to take stock and assess the strength of evidence of this literature (but not to synthesize the findings) and, based on this assessment, to provide directions for future research. Copyright © 2014 Elsevier Ltd. All rights reserved.
Evaluation of Reference Services--A Review
ERIC Educational Resources Information Center
Kuruppu, Pali U.
2007-01-01
Understanding the inherent deficiencies in reference service as provided is critical to providing effective, high quality service. Quantitative and qualitative research methodologies, as well as a combination of both, are being used to evaluate these services. The identification of appropriate research methodology is critical to an effective…
42 CFR 84.303 - General testing conditions and requirements.
Code of Federal Regulations, 2013 CFR
2013-10-01
... SAFETY AND HEALTH RESEARCH AND RELATED ACTIVITIES APPROVAL OF RESPIRATORY PROTECTIVE DEVICES Closed... quantitative evaluations and human subjects on a treadmill to provide qualitative evaluations. Information on...
42 CFR 84.303 - General testing conditions and requirements.
Code of Federal Regulations, 2014 CFR
2014-10-01
... SAFETY AND HEALTH RESEARCH AND RELATED ACTIVITIES APPROVAL OF RESPIRATORY PROTECTIVE DEVICES Closed... quantitative evaluations and human subjects on a treadmill to provide qualitative evaluations. Information on...
42 CFR 84.303 - General testing conditions and requirements.
Code of Federal Regulations, 2012 CFR
2012-10-01
... SAFETY AND HEALTH RESEARCH AND RELATED ACTIVITIES APPROVAL OF RESPIRATORY PROTECTIVE DEVICES Closed... quantitative evaluations and human subjects on a treadmill to provide qualitative evaluations. Information on...
Cost and Efficacy Assessment of an Alternative Medication Compliance Urine Drug Testing Strategy.
Doyle, Kelly; Strathmann, Frederick G
2017-02-01
This study investigates the frequency at which quantitative results provide additional clinical benefit compared to qualitative results alone. Also included is a comparison between alternative urine drug screens and conventional screens, assessing cost-to-payer differences and accuracy in identifying prescription compliance and polypharmacy/substance abuse. In a reference laboratory evaluation of urine specimens from across the United States, 213 urine specimens with provided prescription medication information (302 prescriptions) were analyzed by two testing algorithms: 1) a conventional immunoassay screen with subsequent reflex testing of positive results by quantitative mass spectrometry; and 2) a combined immunoassay/qualitative mass-spectrometry screen that substantially reduced the need for subsequent testing. The qualitative screen was superior to immunoassay with reflex to mass spectrometry in confirming compliance per prescription (226/302 vs 205/302) and in identifying non-prescription abuse (97 vs 71). Pharmaceutical impurities and inconsistent drug metabolite patterns were detected in only 3.8% of specimens, suggesting that quantitative results have limited benefit. The percentage difference between the conventional testing algorithm and the alternative screen was projected to be 55%, and a 2-year evaluation of test utilization as a measure of test order volume shows an exponential trend for alternative screen test orders over conventional immunoassay screens that require subsequent confirmation testing. Alternative, qualitative urine drug screens provide a less expensive, faster, and more comprehensive evaluation of patient medication compliance and drug abuse. The vast majority of results were interpretable with qualitative results alone, indicating a reduced need to automatically reflex to quantitation or to provide quantitation for the majority of patients. This strategy highlights a successful approach, for both the laboratory and the physician, to aligning clinical needs while being mindful of costs.
Zhang, Yin; Wang, Lei; Diao, Tianxi
2013-12-01
The Clinical and Translational Science Awards (CTSA) program is one of the most important initiatives in translational medical funding. The quantitative evaluation of the efficiency and performance of the CTSA program has a significant referential meaning for the decision making of global translational medical funding. Using science mapping and scientometric analytic tools, this study quantitatively analyzed the scientific articles funded by the CTSA program. The results of the study showed that the quantitative productivities of the CTSA program had a stable increase since 2008. In addition, the emerging trends of the research funded by the CTSA program covered clinical and basic medical research fields. The academic benefits from the CTSA program were assisting its members to build a robust academic home for the Clinical and Translational Science and to attract other financial support. This study provided a quantitative evaluation of the CTSA program based on science mapping and scientometric analysis. Further research is required to compare and optimize other quantitative methods and to integrate various research results. © 2013 Wiley Periodicals, Inc.
Noninvasive identification of the total peripheral resistance baroreflex
NASA Technical Reports Server (NTRS)
Mukkamala, Ramakrishna; Toska, Karin; Cohen, Richard J.
2003-01-01
We propose two identification algorithms for quantitating the total peripheral resistance (TPR) baroreflex, an important contributor to short-term arterial blood pressure (ABP) regulation. Each algorithm analyzes beat-to-beat fluctuations in ABP and cardiac output, which may both be obtained noninvasively in humans. For a theoretical evaluation, we applied both algorithms to a realistic cardiovascular model. The results contrasted with only one of the algorithms proving to be reliable. This algorithm was able to track changes in the static gains of both the arterial and cardiopulmonary TPR baroreflex. We then applied both algorithms to a preliminary set of human data and obtained contrasting results much like those obtained from the cardiovascular model, thereby making the theoretical evaluation results more meaningful. This study suggests that, with experimental testing, the reliable identification algorithm may provide a powerful, noninvasive means for quantitating the TPR baroreflex. This study also provides an example of the role that models can play in the development and initial evaluation of algorithms aimed at quantitating important physiological mechanisms.
Devonshire, Alison S; O'Sullivan, Denise M; Honeyborne, Isobella; Jones, Gerwyn; Karczmarczyk, Maria; Pavšič, Jernej; Gutteridge, Alice; Milavec, Mojca; Mendoza, Pablo; Schimmel, Heinz; Van Heuverswyn, Fran; Gorton, Rebecca; Cirillo, Daniela Maria; Borroni, Emanuele; Harris, Kathryn; Barnard, Marinus; Heydenrych, Anthenette; Ndusilo, Norah; Wallis, Carole L; Pillay, Keshree; Barry, Thomas; Reddington, Kate; Richter, Elvira; Mozioğlu, Erkan; Akyürek, Sema; Yalçınkaya, Burhanettin; Akgoz, Muslum; Žel, Jana; Foy, Carole A; McHugh, Timothy D; Huggett, Jim F
2016-08-03
Real-time PCR (qPCR) based methods, such as the Xpert MTB/RIF, are increasingly being used to diagnose tuberculosis (TB). While qualitative methods are adequate for diagnosis, the therapeutic monitoring of TB patients requires quantitative methods currently performed using smear microscopy. The potential use of quantitative molecular measurements for therapeutic monitoring has been investigated but findings have been variable and inconclusive. The lack of an adequate reference method and reference materials is a barrier to understanding the source of such disagreement. Digital PCR (dPCR) offers the potential for an accurate method for quantification of specific DNA sequences in reference materials which can be used to evaluate quantitative molecular methods for TB treatment monitoring. To assess a novel approach for the development of quality assurance materials we used dPCR to quantify specific DNA sequences in a range of prototype reference materials and evaluated accuracy between different laboratories and instruments. The materials were then also used to evaluate the quantitative performance of qPCR and Xpert MTB/RIF in eight clinical testing laboratories. dPCR was found to provide results in good agreement with the other methods tested and to be highly reproducible between laboratories without calibration even when using different instruments. When the reference materials were analysed with qPCR and Xpert MTB/RIF by clinical laboratories, all laboratories were able to correctly rank the reference materials according to concentration, however there was a marked difference in the measured magnitude. TB is a disease where the quantification of the pathogen could lead to better patient management and qPCR methods offer the potential to rapidly perform such analysis. However, our findings suggest that when precisely characterised materials are used to evaluate qPCR methods, the measurement result variation is too high to determine whether molecular quantification of Mycobacterium tuberculosis would provide a clinically useful readout. The methods described in this study provide a means by which the technical performance of quantitative molecular methods can be evaluated independently of clinical variability to improve accuracy of measurement results. These will assist in ultimately increasing the likelihood that such approaches could be used to improve patient management of TB.
Soman, S; Liu, Z; Kim, G; Nemec, U; Holdsworth, S J; Main, K; Lee, B; Kolakowsky-Hayner, S; Selim, M; Furst, A J; Massaband, P; Yesavage, J; Adamson, M M; Spincemallie, P; Moseley, M; Wang, Y
2018-04-01
Identifying cerebral microhemorrhage burden can aid in the diagnosis and management of traumatic brain injury, stroke, hypertension, and cerebral amyloid angiopathy. MR imaging susceptibility-based methods are more sensitive than CT for detecting cerebral microhemorrhage, but methods other than quantitative susceptibility mapping provide results that vary with field strength and TE, require additional phase maps to distinguish blood from calcification, and depict cerebral microhemorrhages as bloom artifacts. Quantitative susceptibility mapping provides universal quantification of tissue magnetic properties without these constraints but traditionally requires a mask generated by skull-stripping, which can pose challenges at tissue interfaces. We evaluated the preconditioned quantitative susceptibility mapping MR imaging method, which does not require skull-stripping, for improved depiction of brain parenchyma and pathology. Fifty-six subjects underwent brain MR imaging with a 3D multiecho gradient recalled echo acquisition. Mask-based quantitative susceptibility mapping images were created using a commonly used mask-based method, and preconditioned quantitative susceptibility images were made using precondition-based total field inversion. All images were reviewed by a neuroradiologist and a radiology resident. Ten subjects (18%), all with traumatic brain injury, demonstrated blood products on 3D gradient recalled echo imaging. All lesions were visible on preconditioned quantitative susceptibility mapping, while 6 were not visible on mask-based quantitative susceptibility mapping. Thirty-one subjects (55%) demonstrated brain parenchyma and/or lesions that were visible on preconditioned quantitative susceptibility mapping but not on mask-based quantitative susceptibility mapping. Six subjects (11%) demonstrated pons artifacts on both methods; the artifacts were worse on preconditioned quantitative susceptibility mapping. Preconditioned quantitative susceptibility mapping MR imaging can bring the benefits of quantitative susceptibility mapping to clinical practice without the limitations of mask-based quantitative susceptibility mapping, especially for evaluating cerebral microhemorrhage-associated pathologies, such as traumatic brain injury. © 2018 by American Journal of Neuroradiology.
Guidelines for evaluating fish habitat in Wisconsin streams.
Timothy D. Simonson; John Lyons; Paul D. Kanehl
1993-01-01
Describes procedures for evaluating the quality and quantity of habitat for fish in small and medium streams of Wisconsin. Provides detailed guidelines for collecting and analyzing specific quantitative habitat information.
Measures of fish behavior as indicators of sublethal toxicosis during standard toxicity tests
Little, E.E.; DeLonay, A.J.
1996-01-01
Behavioral functions essential for growth and survival can be dramatically altered by sublethal exposure to toxicants. Measures of these behavioral responses are effective in detecting adverse effects of sublethal contaminant exposure. Behavioral responses of fishes can be qualitatively and quantitatively evaluated during routine toxicity tests. At selected intervals of exposure, qualitative evaluations are accomplished through direct observations, whereas video recordings are used for quantitative evaluations. Standardized procedures for behavioral evaluation are readily applicable to different fish species and provide rapid, sensitive, and ecologically relevant assessments of sublethal exposure. The methods are readily applied to standardized test protocols.
Prospects and challenges of quantitative phase imaging in tumor cell biology
NASA Astrophysics Data System (ADS)
Kemper, Björn; Götte, Martin; Greve, Burkhard; Ketelhut, Steffi
2016-03-01
Quantitative phase imaging (QPI) techniques provide high resolution label-free quantitative live cell imaging. Here, prospects and challenges of QPI in tumor cell biology are presented, using the example of digital holographic microscopy (DHM). It is shown that the evaluation of quantitative DHM phase images allows the retrieval of different parameter sets for quantification of cellular motion changes in migration and motility assays that are caused by genetic modifications. Furthermore, we demonstrate simultaneously label-free imaging of cell growth and morphology properties.
3D Slicer as an Image Computing Platform for the Quantitative Imaging Network
Fedorov, Andriy; Beichel, Reinhard; Kalpathy-Cramer, Jayashree; Finet, Julien; Fillion-Robin, Jean-Christophe; Pujol, Sonia; Bauer, Christian; Jennings, Dominique; Fennessy, Fiona; Sonka, Milan; Buatti, John; Aylward, Stephen; Miller, James V.; Pieper, Steve; Kikinis, Ron
2012-01-01
Quantitative analysis has tremendous but mostly unrealized potential in healthcare to support objective and accurate interpretation of the clinical imaging. In 2008, the National Cancer Institute began building the Quantitative Imaging Network (QIN) initiative with the goal of advancing quantitative imaging in the context of personalized therapy and evaluation of treatment response. Computerized analysis is an important component contributing to reproducibility and efficiency of the quantitative imaging techniques. The success of quantitative imaging is contingent on robust analysis methods and software tools to bring these methods from bench to bedside. 3D Slicer is a free open source software application for medical image computing. As a clinical research tool, 3D Slicer is similar to a radiology workstation that supports versatile visualizations but also provides advanced functionality such as automated segmentation and registration for a variety of application domains. Unlike a typical radiology workstation, 3D Slicer is free and is not tied to specific hardware. As a programming platform, 3D Slicer facilitates translation and evaluation of the new quantitative methods by allowing the biomedical researcher to focus on the implementation of the algorithm, and providing abstractions for the common tasks of data communication, visualization and user interface development. Compared to other tools that provide aspects of this functionality, 3D Slicer is fully open source and can be readily extended and redistributed. In addition, 3D Slicer is designed to facilitate the development of new functionality in the form of 3D Slicer extensions. In this paper, we present an overview of 3D Slicer as a platform for prototyping, development and evaluation of image analysis tools for clinical research applications. To illustrate the utility of the platform in the scope of QIN, we discuss several use cases of 3D Slicer by the existing QIN teams, and we elaborate on the future directions that can further facilitate development and validation of imaging biomarkers using 3D Slicer. PMID:22770690
Quantitative Evaluation of Musical Scale Tunings
ERIC Educational Resources Information Center
Hall, Donald E.
1974-01-01
The acoustical and mathematical basis of the problem of tuning the twelve-tone chromatic scale is reviewed. A quantitative measurement showing how well any tuning succeeds in providing just intonation for any specific piece of music is explained and applied to musical examples using a simple computer program. (DT)
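The article's program is not reproduced here; the following is a minimal sketch of one such measurement, scoring how far 12-tone equal temperament departs from just intonation (the interval set is chosen for illustration):

import math

JUST_RATIOS = {"major third": 5 / 4, "perfect fifth": 3 / 2, "minor seventh": 9 / 5}
ET_SEMITONES = {"major third": 4, "perfect fifth": 7, "minor seventh": 10}

def cents(ratio: float) -> float:
    """Convert a frequency ratio to cents (1200 cents per octave)."""
    return 1200 * math.log2(ratio)

for name, just in JUST_RATIOS.items():
    et = 2 ** (ET_SEMITONES[name] / 12)  # equal-tempered frequency ratio
    print(f"{name}: ET is {cents(et) - cents(just):+.1f} cents off just")
# major third: +13.7, perfect fifth: -2.0, minor seventh: -17.6

A tuning's overall score for a specific piece could then weight these deviations by how often each interval occurs in that piece.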
QUANTITATIVE GENETIC ACTIVITY GRAPHICAL PROFILES FOR USE IN CHEMICAL EVALUATION
A graphic approach termed a Genetic Activity Profile (GAP) has been developed to display a matrix of data on the genetic and related effects of selected chemical agents. The profiles provide a visual overview of the quantitative (doses) and qualitative (test results) data for each...
The other half of the story: effect size analysis in quantitative research.
Maher, Jessica Middlemis; Markey, Jonathan C; Ebert-May, Diane
2013-01-01
Statistical significance testing is the cornerstone of quantitative research, but studies that fail to report measures of effect size are potentially missing a robust part of the analysis. We provide a rationale for why effect size measures should be included in quantitative discipline-based education research. Examples from both biological and educational research demonstrate the utility of effect size for evaluating practical significance. We also provide details about some effect size indices that are paired with common statistical significance tests used in educational research and offer general suggestions for interpreting effect size measures. Finally, we discuss some inherent limitations of effect size measures and provide further recommendations about reporting confidence intervals.
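As one concrete example of an effect size index paired with a common significance test, Cohen's d accompanies the two-sample t-test; a minimal sketch with hypothetical scores:

import statistics

def cohens_d(group_a, group_b):
    """Cohen's d: standardized mean difference using the pooled SD."""
    n_a, n_b = len(group_a), len(group_b)
    var_a, var_b = statistics.variance(group_a), statistics.variance(group_b)
    pooled_sd = (((n_a - 1) * var_a + (n_b - 1) * var_b) / (n_a + n_b - 2)) ** 0.5
    return (statistics.mean(group_a) - statistics.mean(group_b)) / pooled_sd

treatment = [78, 85, 90, 74, 88, 81]  # hypothetical exam scores
control = [72, 80, 79, 70, 84, 75]
print(f"d = {cohens_d(treatment, control):.2f}")  # ~1.05; 0.8+ is conventionally "large"

A p-value alone would say only that the groups differ; d conveys by how much, in standard-deviation units.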
NASA Astrophysics Data System (ADS)
Ehn, Andreas; Jonsson, Malin; Johansson, Olof; Aldén, Marcus; Bood, Joakim
2013-01-01
Fluorescence lifetimes of toluene as a function of oxygen concentration in toluene/nitrogen/oxygen mixtures have been measured at room temperature using picosecond-laser excitation of the S1-S0 transition at 266 nm. The data satisfy the Stern-Volmer relation with high accuracy, providing an updated value of the Stern-Volmer slope. A newly developed fluorescence lifetime imaging scheme, called Dual Imaging with Modeling Evaluation (DIME), is evaluated and successfully demonstrated for quantitative oxygen concentration imaging in toluene-seeded O2/N2 gas mixtures.
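In LaTeX notation, the Stern-Volmer relation that the measured lifetimes satisfy is conventionally written

\frac{\tau_0}{\tau} = 1 + K_{SV}[\mathrm{O_2}] = 1 + k_q \tau_0 [\mathrm{O_2}],

where \tau_0 is the unquenched fluorescence lifetime, \tau the measured lifetime, k_q the quenching rate constant, and K_{SV} the Stern-Volmer slope; the oxygen concentration then follows from a measured lifetime as [\mathrm{O_2}] = (\tau_0/\tau - 1)/K_{SV}. The updated slope value itself is reported in the paper and is not reproduced here.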
Standardizing Quality Assessment of Fused Remotely Sensed Images
NASA Astrophysics Data System (ADS)
Pohl, C.; Moellmann, J.; Fries, K.
2017-09-01
The multitude of available operational remote sensing satellites has led to the development of many image fusion techniques to provide high spatial, spectral and temporal resolution images. The comparison of different techniques is necessary to obtain an image optimized for the different applications of remote sensing. There are two approaches to assessing image quality: 1. qualitatively, by visual interpretation, and 2. quantitatively, using image quality indices. However, an objective comparison is difficult because a visual assessment is always subjective and a quantitative assessment depends on the criteria and indices chosen; the result varies accordingly. Therefore it is necessary to standardize both processes (qualitative and quantitative assessment) in order to allow an objective evaluation of image fusion quality. Various studies have been conducted at the University of Osnabrueck (UOS) to establish a standardized process for objectively comparing fused image quality. First, established image fusion quality assessment protocols, i.e. Quality with No Reference (QNR) and Khan's protocol, were compared on various fusion experiments. Second, the process of visual quality assessment was structured and standardized with the aim of providing an evaluation protocol. This manuscript reports on the results of the comparison and provides recommendations for future research.
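For orientation, the QNR index named above is conventionally defined in the fusion-quality literature (symbols as used there, not taken from this abstract) as

\mathrm{QNR} = (1 - D_\lambda)^{\alpha}\,(1 - D_s)^{\beta},

where D_\lambda measures spectral distortion, D_s measures spatial distortion against the panchromatic image, \alpha and \beta weight the two terms, and 1 is the ideal value.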
Esposito, Dominick; Taylor, Erin Fries; Gold, Marsha
2009-02-01
Interest in disease management programs continues to grow as managed care plans, the federal and state governments, and other organizations consider such efforts as a means to improve health care quality and reduce costs. These efforts vary in size, scope, and target population. While large-scale programs provide the means to measure impacts, evaluation of smaller interventions remains valuable as they often represent the early planning stages of larger initiatives. This paper describes a multi-method approach for evaluating small interventions that sought to improve the quality of care for Medicaid beneficiaries with multiple chronic conditions. Our approach relied on quantitative and qualitative methods to develop a complete understanding of each intervention. Quantitative data in the form of both process measures, such as case manager contacts, and outcome measures, such as hospital use, were reported and analyzed. Qualitative information was collected through interviews and the development of logic models to document the flow of intervention activities and how they were intended to affect outcomes. The logic models helped us to understand the underlying reasons for the success or lack thereof of each intervention. The analysis provides useful information on several fronts. First, qualitative data provided valuable information about implementation. Second, process measures helped determine whether implementation occurred as anticipated. Third, outcome measures indicated the potential for favorable results later, possibly suggesting further study. Finally, the evaluation of qualitative and quantitative data in combination helped us assess the potential promise of each intervention and identify common themes and challenges across all interventions.
ERIC Educational Resources Information Center
Petersohn, Sabrina
2016-01-01
Quantitative metrics in research assessment are proliferating all over the world. The demand has led to an increase in bibliometric practitioners and service providers. Their professional roles and competencies have not yet been subject to systematic study. This paper focuses on one important service provider in evaluative bibliometrics--academic…
Barousse, Rafael; Socolovsky, Mariano; Luna, Antonio
2017-01-01
Traumatic conditions of peripheral nerves and plexus have been classically evaluated by morphological imaging techniques and electrophysiological tests. New magnetic resonance imaging (MRI) studies based on 3D fat-suppressed techniques are providing high accuracy for peripheral nerve injury evaluation from a qualitative point of view. However, these techniques do not provide quantitative information. Diffusion weighted imaging (DWI) and diffusion tensor imaging (DTI) are functional MRI techniques that are able to evaluate and quantify the movement of water molecules within different biological structures. These techniques have been successfully applied in other anatomical areas, especially in the assessment of the central nervous system, and are now being applied, with promising results, to peripheral nerve and plexus evaluation. DWI and DTI allow a qualitative and quantitative peripheral nerve analysis, providing valuable pathophysiological information about the functional integrity of these structures. In the field of trauma and peripheral nerve or plexus injury, several parameters derived from DWI and DTI studies, such as the apparent diffusion coefficient (ADC) and fractional anisotropy (FA), can be used as potential biomarkers of neural damage, providing information about fiber organization, axonal flow, and myelin integrity. Proper knowledge of the physical basis of these techniques and their limitations is important for an optimal interpretation of the imaging findings and derived data. In this paper, a comprehensive review of the potential applications of DWI and DTI neurographic studies is performed with a focus on traumatic conditions, including the main nerve entrapment syndromes of both peripheral nerves and the brachial or lumbar plexus. PMID:28932698
Breach Risk Magnitude: A Quantitative Measure of Database Security.
Yasnoff, William A
2016-01-01
A quantitative methodology is described that provides objective evaluation of the potential for health record system breaches. It assumes that breach risk increases with the number of potential records that could be exposed, while it decreases when more authentication steps are required for access. The breach risk magnitude (BRM) is the maximum value for any system user of the common logarithm of the number of accessible database records divided by the number of authentication steps needed to achieve such access. For a one million record relational database, the BRM varies from 5.52 to 6 depending on authentication protocols. For an alternative data architecture designed specifically to increase security by separately storing and encrypting each patient record, the BRM ranges from 1.3 to 2.6. While the BRM only provides a limited quantitative assessment of breach risk, it may be useful to objectively evaluate the security implications of alternative database organization approaches.
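A minimal sketch of the BRM computation as defined above; the users, record counts, and authentication-step counts are hypothetical:

import math

def breach_risk_magnitude(users):
    """BRM: max over users of log10(accessible records) / authentication steps."""
    return max(math.log10(u["records"]) / u["auth_steps"] for u in users)

users = [
    {"records": 1_000_000, "auth_steps": 1},  # admin, single login -> 6.0
    {"records": 5_000, "auth_steps": 2},      # clerk, two-factor -> ~1.85
]
print(breach_risk_magnitude(users))  # 6.0: the admin account dominates breach risk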
ERIC Educational Resources Information Center
Pivik, Jayne R.; Goelman, Hillel
2011-01-01
A process evaluation of a consortium of academic researchers and community-based service providers focused on the health and well-being of children and families provides empirical and practice-based evidence of those factors important for community-based participatory research (CBPR). This study draws on quantitative ratings of 33 factors…
ERIC Educational Resources Information Center
Schochet, Peter Z.; Puma, Mike; Deke, John
2014-01-01
This report summarizes the complex research literature on quantitative methods for assessing how impacts of educational interventions on instructional practices and student learning differ across students, educators, and schools. It also provides technical guidance about the use and interpretation of these methods. The research topics addressed…
21 CFR 601.12 - Changes to an approved application.
Code of Federal Regulations, 2010 CFR
2010-04-01
... qualitative or quantitative formulation, including inactive ingredients, or in the specifications provided in... Evaluation and Research or the Director, Center for Drug Evaluation and Research, may approve a written... promotional labeling shall be submitted to the Center for Biologics Evaluation and Research or Center for Drug...
21 CFR 601.12 - Changes to an approved application.
Code of Federal Regulations, 2013 CFR
2013-04-01
... qualitative or quantitative formulation, including inactive ingredients, or in the specifications provided in... Evaluation and Research or the Director, Center for Drug Evaluation and Research, may approve a written... promotional labeling shall be submitted to the Center for Biologics Evaluation and Research or Center for Drug...
21 CFR 601.12 - Changes to an approved application.
Code of Federal Regulations, 2012 CFR
2012-04-01
... qualitative or quantitative formulation, including inactive ingredients, or in the specifications provided in... Evaluation and Research or the Director, Center for Drug Evaluation and Research, may approve a written... promotional labeling shall be submitted to the Center for Biologics Evaluation and Research or Center for Drug...
21 CFR 601.12 - Changes to an approved application.
Code of Federal Regulations, 2011 CFR
2011-04-01
... qualitative or quantitative formulation, including inactive ingredients, or in the specifications provided in... Evaluation and Research or the Director, Center for Drug Evaluation and Research, may approve a written... promotional labeling shall be submitted to the Center for Biologics Evaluation and Research or Center for Drug...
21 CFR 601.12 - Changes to an approved application.
Code of Federal Regulations, 2014 CFR
2014-04-01
... qualitative or quantitative formulation, including inactive ingredients, or in the specifications provided in... Evaluation and Research or the Director, Center for Drug Evaluation and Research, may approve a written... promotional labeling shall be submitted to the Center for Biologics Evaluation and Research or Center for Drug...
Evaluation of Instructional Materials for Exceptional Children and Youth: A Preliminary Instrument.
ERIC Educational Resources Information Center
Eash, Maurice
An instrument for the evaluation of instructional materials is presented. Evaluative items are arranged under four constructs: objectives, organization of material (both scope and sequence), methodology, and evaluation. A section is also provided for summary quantitative judgment. A glossary of terms used in the instrument is included. A training…
Analytical robustness of quantitative NIR chemical imaging for Islamic paper characterization
NASA Astrophysics Data System (ADS)
Mahgoub, Hend; Gilchrist, John R.; Fearn, Thomas; Strlič, Matija
2017-07-01
Recently, spectral imaging techniques such as multispectral imaging (MSI) and hyperspectral imaging (HSI) have gained importance in the field of heritage conservation. This paper explores the analytical robustness of quantitative chemical imaging for Islamic paper characterization by focusing on the effect of different measurement and processing parameters, i.e. acquisition conditions and calibration, on the accuracy of the collected spectral data. This will provide a better understanding of a technique that can provide a measure of change in collections through imaging. For the quantitative model, a special calibration target was devised using 105 samples from a well-characterized reference Islamic paper collection. Two material properties were of interest: starch sizing and cellulose degree of polymerization (DP). Multivariate data analysis methods were used to develop discrimination and regression models, which served as an evaluation methodology for the metrology of quantitative NIR chemical imaging. Spectral data were collected using a pushbroom HSI scanner (Gilden Photonics Ltd) in the 1000-2500 nm range with a spectral resolution of 6.3 nm, using a mirror scanning setup and halogen illumination. Data were acquired under different measurement conditions and acquisition parameters. Preliminary results demonstrated the potential of the evaluation methodology to establish that measurement parameters such as the use of different lenses and different scanning backgrounds may not have a great influence on the quantitative results. Moreover, the evaluation methodology allowed selection of the best pre-treatment method to apply to the data.
Brenner, Stephan; Muula, Adamson S; Robyn, Paul Jacob; Bärnighausen, Till; Sarker, Malabika; Mathanga, Don P; Bossert, Thomas; De Allegri, Manuela
2014-04-22
In this article we present a study design to evaluate the causal impact of providing supply-side performance-based financing incentives in combination with a demand-side cash transfer component on equitable access to and quality of maternal and neonatal healthcare services. This intervention is introduced to selected emergency obstetric care facilities and catchment area populations in four districts in Malawi. We here describe and discuss our study protocol with regard to the research aims, the local implementation context, and our rationale for selecting a mixed methods explanatory design with a quasi-experimental quantitative component. The quantitative research component consists of a controlled pre- and post-test design with multiple post-test measurements. This allows us to quantitatively measure 'equitable access to healthcare services' at the community level and 'healthcare quality' at the health facility level. Guided by a theoretical framework of causal relationships, we determined a number of input, process, and output indicators to evaluate both intended and unintended effects of the intervention. Overall causal impact estimates will result from a difference-in-differences analysis comparing selected indicators across intervention and control facilities/catchment populations over time. To further explain heterogeneity of quantitatively observed effects and to understand the experiential dimensions of financial incentives on clients and providers, we designed a qualitative component in line with the overall explanatory mixed methods approach. This component consists of in-depth interviews and focus group discussions with providers, service users, non-users, and policy stakeholders. In this explanatory design comprehensive understanding of expected and unexpected effects of the intervention on both access and quality will emerge through careful triangulation at two levels: across multiple quantitative elements and across quantitative and qualitative elements. Combining a traditional quasi-experimental controlled pre- and post-test design with an explanatory mixed methods model permits an additional assessment of organizational and behavioral changes affecting complex processes. Through this impact evaluation approach, our design will not only create robust evidence measures for the outcome of interest, but also generate insights on how and why the investigated interventions produce certain intended and unintended effects and allows for a more in-depth evaluation approach.
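The difference-in-differences estimate referred to above takes the standard form (notation ours, not the protocol's):

\hat{\delta} = \left(\bar{Y}^{\mathrm{int}}_{\mathrm{post}} - \bar{Y}^{\mathrm{int}}_{\mathrm{pre}}\right) - \left(\bar{Y}^{\mathrm{ctrl}}_{\mathrm{post}} - \bar{Y}^{\mathrm{ctrl}}_{\mathrm{pre}}\right),

where \bar{Y} is the mean of a given indicator over intervention (int) or control (ctrl) facilities and catchment populations before and after the intervention; subtracting the control-group trend nets out secular change that would have occurred without the incentives.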
Wu, Cheng-Ching; Lin, Hung-Yu; Wang, Chao-Ping; Lu, Li-Fen; Yu, Teng-Hung; Hung, Wei-Chin; Houng, Jer-Yiing; Chung, Fu-Mei; Lee, Yau-Jiunn; Hu, Jin-Jia
2015-11-03
Prostate cancer remains the most common cancer in men. Qualitative or semi-quantitative immunochromatographic measurements of prostate specific antigen (PSA) have been shown to be simple, noninvasive and feasible. The aim of this study was to evaluate an optimized gold immunochromatographic strip device for the detection of PSA, in which the results can be analysed quantitatively using a Chromogenic Rapid Test Reader. This reader measures the reflectance of the signal line via a charge-coupled device camera. For quantitative analysis, PSA concentration was computed via a calibration equation. Capillary blood samples from 305 men were evaluated, and two independent observers interpreted the test results after 12 min. Blood samples were also collected and tested with a conventional quantitative assay. Sensitivity, specificity, positive and negative predictive values, and accuracy of the PSA rapid quantitative test system were 100, 96.6, 89.5, 100, and 97.4 %, respectively. Reproducibility of the test was 99.2 %, and interobserver variation was 8 % with a false positive rate of 3.4 %. The correlation coefficient between the ordinary quantitative assay and the rapid quantitative test was 0.960. The PSA rapid quantitative test system provided results quickly and was easy to use, so that tests using this system can be easily performed at outpatient clinics or elsewhere. This system may also be useful for initial cancer screening and for point-of-care testing, because results can be obtained within 12 min and at a cost lower than that of conventional quantitative assays.
Designing automation for human use: empirical studies and quantitative models.
Parasuraman, R
2000-07-01
An emerging knowledge base of human performance research can provide guidelines for designing automation that can be used effectively by human operators of complex systems. Which functions should be automated and to what extent in a given system? A model for types and levels of automation that provides a framework and an objective basis for making such choices is described. The human performance consequences of particular types and levels of automation constitute primary evaluative criteria for automation design when using the model. Four human performance areas are considered--mental workload, situation awareness, complacency and skill degradation. Secondary evaluative criteria include such factors as automation reliability, the risks of decision/action consequences and the ease of systems integration. In addition to this qualitative approach, quantitative models can inform design. Several computational and formal models of human interaction with automation that have been proposed by various researchers are reviewed. An important future research need is the integration of qualitative and quantitative approaches. Application of these models provides an objective basis for designing automation for effective human use.
Abildgaard, Johan S.; Saksvik, Per Ø.; Nielsen, Karina
2016-01-01
Organizational interventions aiming at improving employee health and wellbeing have proven to be challenging to evaluate. To analyze intervention processes two methodological approaches have widely been used: quantitative (often questionnaire data), or qualitative (often interviews). Both methods are established tools, but their distinct epistemological properties enable them to illuminate different aspects of organizational interventions. In this paper, we use the quantitative and qualitative process data from an organizational intervention conducted in a national postal service, where the Intervention Process Measure questionnaire (N = 285) as well as an extensive interview study (N = 50) were used. We analyze what type of knowledge about intervention processes these two methodologies provide and discuss strengths and weaknesses as well as potentials for mixed methods evaluation methodologies. PMID:27713707
Products of combustion of non-metallic materials
NASA Technical Reports Server (NTRS)
Perry, Cortes L.
1995-01-01
The objective of this project is to evaluate methodologies for the qualitative and quantitative determination of the gaseous products of combustion of non-metallic materials of interest to the aerospace community. The goal is to develop instrumentation and analysis procedures which qualitatively and quantitatively identify gaseous products evolved by thermal decomposition and provide NASA a detailed system operating procedure.
Chemical analysis and quantitation of the tapetum lucidum
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gee, N.A.; Fisher, G.L.; Nash, C.P.
1975-06-01
A study was conducted to provide a basis for the evaluation of the biochemical nature of the ²²⁶Ra alterations of the beagle tapetum. Results indicated that zinc and/or melanin determinations in the tapetum nigrum and tapetum lucidum may allow quantitation of tapetum lucidum tissue without the need for physical separation of the tapetal layers. (HLW)
Manned Mars mission radiation environment and radiobiology
NASA Technical Reports Server (NTRS)
Nachtwey, D. S.
1986-01-01
Potential radiation hazards to crew members on manned Mars missions are discussed. The paper deals briefly with radiation sources and environments likely to be encountered during various phases of such missions, providing quantitative estimates of these environments. Also provided are quantitative data and discussion of the implications of such radiation for the human body. Various protective measures are suggested. A recent re-evaluation of allowable dose limits by the National Council on Radiation Protection is discussed, and potential implications of that activity are assessed.
Candy, B; Holman, A; Leurent, B; Davis, S; Jones, L
2011-01-01
Hospice care supports patients and their families physically and emotionally through the dying phase. In many countries a substantial portion of specialised end-of-life care is provided through hospices. Such care has developed outside of general healthcare and is commonly provided in a patient's home or in dedicated facilities. Hospice provision may need to increase in the future due to an ageing population with a greater need for access to end-of-life care. In this systematic review we sought to identify the current evidence on (1) the effectiveness, including cost-effectiveness, of hospices, and hospice care in a patient's home and in nursing homes and (2) the experiences of those who use and of those who provide such services. We included quantitative and qualitative studies on hospice care that was provided in a patient's home, nursing home or hospice. We did not include studies on end-of-life care that was provided as part of general healthcare provision, such as by general practitioners in primary care, community nurses or within general hospitals. For quantitative evaluations we included only those that compared hospice care with usual generalist healthcare. The databases CINAHL, MEDLINE, EMBASE, and The Cochrane Library were searched from 2003 to 2009. Evidence was assessed for quality and data extractions double-checked. For quantitative studies we present the outcome data comparing hospice versus usual care. For qualitative evaluations we organise findings thematically. Eighteen comparative evaluations and four thematic papers were identified. Quantitative evidence, mostly of limited quality in design, showed that hospice care at home reduced general health care use and increased family and patient satisfaction with care. Main themes in the qualitative literature revealed that home hospice services support families to sustain patient care at home and hospice day care services generate for the patient a renewed sense of meaning and purpose. Although studies had methodological limitations, in this review we found much evidence to support the benefits of hospice care. There were limited evaluations found on the impact of hospice care on psychological well-being, such as symptoms of depression, and on inpatient hospice care and non-hospital related costs. Copyright © 2010 Elsevier Ltd. All rights reserved.
Jha, Abhinav K; Song, Na; Caffo, Brian; Frey, Eric C
2015-04-13
Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of the systems and algorithms being developed for quantitative SPECT imaging. An appropriate objective way to evaluate these systems is to compare their performance on the end task required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e., if we have a gold standard. However, this gold standard is very rarely known in human studies, so no-gold-standard techniques for optimizing and evaluating systems and algorithms are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output.
Robert E. Keane; Lisa Holsinger; Russell A. Parsons
2011-01-01
A measure of the degree of departure of a landscape from its range of historical conditions can provide a means for prioritizing and planning areas for restoration treatments. There are few statistics or indices that provide a quantitative context for measuring departure across landscapes. This study evaluated a set of five similarity indices commonly used in...
Forage resource evaluation system for habitat—deer: an interactive deer habitat model
Thomas A. Hanley; Donald E. Spalinger; Kenrick J. Mock; Oran L. Weaver; Grant M. Harris
2012-01-01
We describe a food-based system for quantitatively evaluating habitat quality for deer called the Forage Resource Evaluation System for Habitat and provide its rationale and suggestions for use. The system was developed as a tool for wildlife biologists and other natural resource managers and planners interested in evaluating habitat quality and, especially, comparing...
Kessler, Larry G; Barnhart, Huiman X; Buckler, Andrew J; Choudhury, Kingshuk Roy; Kondratovich, Marina V; Toledano, Alicia; Guimaraes, Alexander R; Filice, Ross; Zhang, Zheng; Sullivan, Daniel C
2015-02-01
The development and implementation of quantitative imaging biomarkers has been hampered by the inconsistent and often incorrect use of terminology related to these markers. Sponsored by the Radiological Society of North America, an interdisciplinary group of radiologists, statisticians, physicists, and other researchers worked to develop a comprehensive terminology to serve as a foundation for quantitative imaging biomarker claims. Where possible, this working group adapted existing definitions derived from national or international standards bodies rather than invent new definitions for these terms. This terminology also serves as a foundation for the design of studies that evaluate the technical performance of quantitative imaging biomarkers and for studies of algorithms that generate the quantitative imaging biomarkers from clinical scans. This paper provides examples of research studies and quantitative imaging biomarker claims that use terminology consistent with these definitions as well as examples of the rampant confusion in this emerging field. We provide recommendations for appropriate use of quantitative imaging biomarker terminological concepts. It is hoped that this document will assist researchers and regulatory reviewers who examine quantitative imaging biomarkers and will also inform regulatory guidance. More consistent and correct use of terminology could advance regulatory science, improve clinical research, and provide better care for patients who undergo imaging studies.
Sánchez Gómez, Serafín; Ostos, Elisa María Cabot; Solano, Juan Manuel Maza; Salado, Tomás Francisco Herrero
2013-05-06
We evaluated a newly designed electronic portfolio (e-Portfolio) that provided quantitative evaluation of surgical skills. Medical students at the University of Seville used the e-Portfolio on a voluntary basis for evaluation of their performance in undergraduate surgical subjects. Our new web-based e-Portfolio was designed to evaluate surgical practical knowledge and skills targets. Students recorded each activity on a form, attached evidence, and added their reflections. Students self-assessed their practical knowledge using qualitative criteria (yes/no), and graded their skills according to complexity (basic/advanced) and participation (observer/assistant/independent). A numerical value was assigned to each activity, and the values of all activities were summed to obtain the total score. The application automatically displayed quantitative feedback. We performed qualitative evaluation of the perceived usefulness of the e-Portfolio and quantitative evaluation of the targets achieved. Thirty-seven of 112 students (33%) used the e-Portfolio, of whom 87% reported that they understood the methodology of the portfolio. All students reported an improved understanding of their learning objectives resulting from the numerical visualization of progress, all students reported that the quantitative feedback encouraged their learning, and 79% of students felt that their teachers were more available because they were using the e-Portfolio. Only 51.3% of students reported that the reflective aspects of learning were useful. Individual students achieved a maximum of 65% of the total targets and 87% of the skills targets. The mean total score was 345 ± 38 points. For basic skills, 92% of students achieved the maximum score for participation as an independent operator, and all achieved the maximum scores for participation as an observer and assistant. For complex skills, 62% of students achieved the maximum score for participation as an independent operator, and 98% achieved the maximum scores for participation as an observer or assistant. Medical students reported that use of an electronic portfolio that provided quantitative feedback on their progress was useful when the number and complexity of targets were appropriate, but not when the portfolio offered only formative evaluations based on reflection. Students felt that use of the e-Portfolio guided their learning process by indicating knowledge gaps to themselves and teachers.
Code of Federal Regulations, 2012 CFR
2012-04-01
... systems. (k) Research activities to improve the basis for determining appropriate management measures to... current forestry funding and staffing levels; and standards providing quantitative criteria to evaluate...
Code of Federal Regulations, 2011 CFR
2011-04-01
... systems. (k) Research activities to improve the basis for determining appropriate management measures to... current forestry funding and staffing levels; and standards providing quantitative criteria to evaluate...
Code of Federal Regulations, 2013 CFR
2013-04-01
... systems. (k) Research activities to improve the basis for determining appropriate management measures to... current forestry funding and staffing levels; and standards providing quantitative criteria to evaluate...
Code of Federal Regulations, 2010 CFR
2010-04-01
... systems. (k) Research activities to improve the basis for determining appropriate management measures to... current forestry funding and staffing levels; and standards providing quantitative criteria to evaluate...
Code of Federal Regulations, 2014 CFR
2014-04-01
... systems. (k) Research activities to improve the basis for determining appropriate management measures to... current forestry funding and staffing levels; and standards providing quantitative criteria to evaluate...
ERIC Educational Resources Information Center
Heffernan, Bernadette M.
1998-01-01
Describes work done to provide staff of the Sandy Point Discovery Center with methods for evaluating exhibits and interpretive programming. Quantitative and qualitative evaluation measures were designed to assess the program's objective of estuary education. Pretest-posttest questionnaires and interviews are used to measure subjects' knowledge and…
Jin, Yan; Huang, Jing-feng; Peng, Dai-liang
2009-01-01
Ecological compensation is becoming one of the key multidisciplinary issues in the field of resources and environmental management. Considering the changing relation between gross domestic product (GDP) and ecological capital (EC) estimated from remote sensing, we construct a new quantitative model for estimating ecological compensation, using the county as the study unit, and determine a standard value with which to evaluate ecological compensation from 2001 to 2004 in Zhejiang Province, China. Spatial differences in ecological compensation were significant among the counties and districts. This model fills a gap in the field of quantitative evaluation of regional ecological compensation and provides a feasible way to reconcile conflicts among economic, social, and ecological interests. PMID:19353749
Doblas, Ana; Sánchez-Ortiga, Emilio; Martínez-Corral, Manuel; Saavedra, Genaro; Garcia-Sucerquia, Jorge
2014-04-01
The advantages of using a telecentric imaging system in digital holographic microscopy (DHM) to study biological specimens are highlighted. To this end, the performance of nontelecentric DHM and telecentric DHM is evaluated from the quantitative phase imaging (QPI) point of view. The evaluated stability of the microscope allows single-shot QPI in DHM by using telecentric imaging systems. Quantitative phase maps of a section of the head of the Drosophila melanogaster fly and of red blood cells are obtained via single-shot DHM with no numerical postprocessing. With these maps we show that telecentric DHM provides a larger field of view for a given magnification and permits more accurate QPI measurements with fewer computational operations.
Using confidence intervals to evaluate the focus alignment of spectrograph detector arrays.
Sawyer, Travis W; Hawkins, Kyle S; Damento, Michael
2017-06-20
High-resolution spectrographs extract detailed spectral information of a sample and are frequently used in astronomy, laser-induced breakdown spectroscopy, and Raman spectroscopy. These instruments employ dispersive elements such as prisms and diffraction gratings to spatially separate different wavelengths of light, which are then detected by a charge-coupled device (CCD) or complementary metal-oxide-semiconductor (CMOS) detector array. Precise alignment along the optical axis (focus position) of the detector array is critical to maximize the instrumental resolution; however, traditional approaches of scanning the detector through focus lack a quantitative measure of precision, limiting the repeatability and relying on one's experience. Here we propose a method to evaluate the focus alignment of spectrograph detector arrays by establishing confidence intervals to measure the alignment precision. We show that propagation of uncertainty can be used to estimate the variance in an alignment, thus providing a quantitative and repeatable means to evaluate the precision and confidence of an alignment. We test the approach by aligning the detector array of a prototype miniature echelle spectrograph. The results indicate that the procedure effectively quantifies alignment precision, enabling one to objectively determine when an alignment has reached an acceptable level. This quantitative approach also provides a foundation for further optimization, including automated alignment. Furthermore, the procedure introduced here can be extended to other alignment techniques that rely on numerically fitting data to a model, providing a general framework for evaluating the precision of alignment methods.
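The alignment procedure described above lends itself to a compact numerical illustration. The following is a minimal sketch, not the authors' implementation: a parabola is fit to a hypothetical through-focus spot-width metric, and the fit covariance is propagated to a confidence interval on the best-focus position.

```python
import numpy as np

z = np.array([-0.30, -0.20, -0.10, 0.00, 0.10, 0.20, 0.30])      # detector position (mm)
w = np.array([0.062, 0.048, 0.039, 0.036, 0.040, 0.049, 0.061])  # spot-width metric (mm)

(a, b, c), cov = np.polyfit(z, w, 2, cov=True)  # fit w ~ a*z**2 + b*z + c
z_best = -b / (2 * a)                           # vertex of the parabola = best focus

# Propagation of uncertainty: var(z_best) = g^T C g, with g the gradient
# of z_best with respect to the fit parameters (a, b, c).
g = np.array([b / (2 * a**2), -1 / (2 * a), 0.0])
ci95 = 1.96 * np.sqrt(g @ cov @ g)
print(f"best focus = {z_best:+.4f} mm, 95% CI half-width = {ci95:.4f} mm")
```

A narrow interval gives an objective stopping criterion for the alignment loop, in place of judging the through-focus scan by eye.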
Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A
2012-04-01
Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. All empirical articles (n = 1,651) published between 2003 and 2007 from four top-ranked health services journals. All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed method reports). We used chi-square tests to evaluate differences between article types for each component. Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ²(1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ²(1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Few published health services research articles use mixed methods. The frequency of key methodological components is variable. Suggestions are provided to increase the transparency of mixed methods studies and the presence of key methodological components in published reports. PMID:22092040
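For illustration, the kind of chi-square comparison reported above can be reproduced from a contingency table; the counts below are hypothetical stand-ins, not the study's raw data.

```python
from scipy.stats import chi2_contingency

# Rows: mixed-methods vs. quantitative articles.
# Columns: methodological component present vs. absent (hypothetical counts).
table = [[45, 160],
         [240, 270]]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.4f}")
```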
In Vitro Comparison of Adipokine Export Signals.
Sharafi, Parisa; Kocaefe, Y Çetin
2016-01-01
Mammalian cells are widely used for recombinant protein production in research and biotechnology. Utilization of export signals significantly facilitates production and purification processes. Thirty-five years after the discovery of the mammalian export machinery, obscurities still remain regarding the efficiency of export signals. The aim of this study was the comparative evaluation of the efficiency of selected export signals using adipocytes as a cell model. Adipocytes have a large capacity for protein secretion, including several enzymes, adipokines, and other signaling molecules, providing a valid system for quantitative evaluation. Constructs expressing N-terminal fusion export signals were generated, with Enhanced Green Fluorescence Protein (EGFP) as a reporter for quantitative and qualitative evaluation. Furthermore, fluorescence microscopy was used to trace the intracellular traffic of the reporter. The export efficiency of six selected proteins secreted from adipocytes was evaluated. Quantitative comparison of the intracellular and exported fractions of the recombinant constructs demonstrated similar efficiency among the studied sequences, with minor variations. The export signal of Retinol Binding Protein (RBP4) exhibited the highest efficiency. This study presents the first quantitative data showing variations among export signals in adipocytes, which will help optimize recombinant protein distribution.
NASA Astrophysics Data System (ADS)
Matsui, Daichi; Ishii, Katsunori; Awazu, Kunio
2015-07-01
Atherosclerosis is a primary cause of critical ischemic diseases such as heart infarction and stroke. A method that can provide detailed information about the stability of atherosclerotic plaques is required. We focused on spectroscopic techniques that can evaluate the chemical composition of lipid in plaques. A novel angioscope using multispectral imaging at wavelengths around 1200 nm for quantitative evaluation of atherosclerotic plaques was developed. The angioscope consists of a halogen lamp, an indium gallium arsenide (InGaAs) camera, 3 optical band-pass filters transmitting wavelengths of 1150, 1200, and 1300 nm, an image fiber with a 0.7 mm outer diameter, and an irradiation fiber consisting of 7 multimode fibers. Atherosclerotic plaque phantoms with 100, 60, and 20 vol.% lipid were prepared and measured by the multispectral angioscope. The acquired datasets were processed by the spectral angle mapper (SAM) method. As a result, simulated plaque areas in atherosclerotic plaque phantoms that could not be detected in an angioscopic visible image could be clearly enhanced. In addition, quantitative evaluation of atherosclerotic plaque phantoms based on lipid volume fraction was performed up to 20 vol.%. These results show the potential of a multispectral angioscope at wavelengths around 1200 nm for quantitative evaluation of the stability of atherosclerotic plaques.
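The spectral angle mapper step named above has a standard form: each pixel's band vector is compared with a reference spectrum by the angle between them. A minimal sketch follows, with hypothetical band values and a hypothetical lipid reference spectrum; it is not the authors' processing chain.

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Angle (rad) between a pixel spectrum and a reference spectrum."""
    cos = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos, -1.0, 1.0))

# 3-band cube at 1150/1200/1300 nm (H x W x 3); values are hypothetical.
cube = np.random.rand(64, 64, 3)
lipid_ref = np.array([0.80, 0.35, 0.75])  # hypothetical lipid reference spectrum

angles = np.apply_along_axis(spectral_angle, 2, cube, lipid_ref)
plaque_mask = angles < 0.10  # flag pixels spectrally close to the reference
```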
Farberg, Aaron S; Winkelmann, Richard R; Tucker, Natalie; White, Richard; Rigel, Darrell S
2017-09-01
BACKGROUND: Early diagnosis of melanoma is critical to survival. New technologies, such as a multi-spectral digital skin lesion analysis (MSDSLA) device [MelaFind, STRATA Skin Sciences, Horsham, Pennsylvania] may be useful to enhance clinician evaluation of concerning pigmented skin lesions. Previous studies evaluated the effect of only the binary output. OBJECTIVE: The objective of this study was to determine how decisions dermatologists make regarding pigmented lesion biopsies are impacted by providing both the underlying classifier score (CS) and the associated probability risk provided by multi-spectral digital skin lesion analysis. This outcome was also compared against the improvement reported with the provision of only the binary output. METHODS: Dermatologists attending an educational conference evaluated 50 pigmented lesions (25 melanomas and 25 benign lesions). Participants were asked if they would biopsy the lesion based on clinical images, and were asked this question again after being shown multi-spectral digital skin lesion analysis data that included the probability graphs and classifier score. RESULTS: Data were analyzed from a total of 160 United States board-certified dermatologists. Biopsy sensitivity for melanoma improved from 76 percent following clinical evaluation to 92 percent after quantitative multi-spectral digital skin lesion analysis information was provided (p < 0.0001). Specificity improved from 52 percent to 79 percent (p < 0.0001). The positive predictive value increased from 61 percent to 81 percent (p < 0.01) when the quantitative data were provided. Negative predictive value also increased (68% vs. 91%, p < 0.01), and overall biopsy accuracy was greater with multi-spectral digital skin lesion analysis (64% vs. 86%, p < 0.001). Interrater reliability improved (intraclass correlation 0.466 before, 0.559 after). CONCLUSION: Incorporating the classifier score and probability data into physician evaluation of pigmented lesions led to both increased sensitivity and specificity, thereby resulting in more accurate biopsy decisions.
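As a worked check of the reported figures: with the 25/25 composition of the lesion set, the predictive values and accuracy follow arithmetically from the post-MSDSLA sensitivity and specificity, reproducing the published 81%, 91%, and 86%.

```python
sens, spec = 0.92, 0.79       # reported post-MSDSLA sensitivity/specificity
melanoma, benign = 25, 25     # composition of the 50-lesion set

tp, fn = sens * melanoma, (1 - sens) * melanoma
tn, fp = spec * benign, (1 - spec) * benign

ppv = tp / (tp + fp)                        # ~0.81, reported as 81%
npv = tn / (tn + fn)                        # ~0.91, reported as 91%
accuracy = (tp + tn) / (melanoma + benign)  # ~0.86, reported as 86%
print(f"{ppv:.3f} {npv:.3f} {accuracy:.3f}")
```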
Collection Evaluation Techniques in the Academic Art Library.
ERIC Educational Resources Information Center
Kusnerz, Peggy Ann
1983-01-01
Presents an overview of library collection evaluation techniques described in the literature--list-checking, quantitative analysis, use studies, and subject specialist review--and offers suggestions to the librarian for the application of these methods in an art library. Twenty-five references are provided. (EJS)
Student evaluations of teaching: teaching quantitative courses can be hazardous to one's career.
Uttl, Bob; Smibert, Dylan
2017-01-01
Anonymous student evaluations of teaching (SETs) are used by colleges and universities to measure teaching effectiveness and to make decisions about faculty hiring, firing, re-appointment, promotion, tenure, and merit pay. Although numerous studies have found that SETs correlate with various teaching effectiveness irrelevant factors (TEIFs) such as subject, class size, and grading standards, it has been argued that such correlations are small and do not undermine the validity of SETs as measures of professors' teaching effectiveness. However, previous research has generally used inappropriate parametric statistics and effect sizes to examine and to evaluate the significance of TEIFs on personnel decisions. Accordingly, we examined the influence of quantitative vs. non-quantitative courses on SET ratings and SET-based personnel decisions using 14,872 publicly posted class evaluations, where each evaluation represents a summary of SET ratings provided by individual students responding in each class. In total, 325,538 individual student evaluations from a US mid-size university contributed to these class evaluations. The results demonstrate that class subject (math vs. English) is strongly associated with SET ratings, has a substantial impact on professors being labeled satisfactory vs. unsatisfactory and excellent vs. non-excellent, and the impact varies substantially depending on the criteria used to classify professors as satisfactory vs. unsatisfactory. Professors teaching quantitative courses are far more likely not to receive tenure, promotion, and/or merit pay when their performance is evaluated against common standards. PMID:28503380
Zhang, Sheng; Huang, Jinsheng; Yang, Baigbing; Lin, Binjie; Xu, Xinyun; Chen, Jinru; Zhao, Zhuandi; Tu, Xiaozhi; Bin, Haihua
2014-04-01
The aim was to improve occupational health management in electroplating enterprises through quantitative classification and to provide a scientific basis for the prevention and control of occupational hazards in electroplating enterprises and the protection of workers' health. A quantitative classification table was created for occupational health management in electroplating enterprises. The evaluation indicators included 6 items and 27 sub-items, with a total score of 100 points. Forty electroplating enterprises were selected and scored according to the quantitative classification table, then classified into grades A, B, and C based on the scores. Among the 40 electroplating enterprises, 11 (27.5%) scored >85 points (grade A), 23 (57.5%) scored 60-85 points (grade B), and 6 (15.0%) scored <60 points (grade C). Quantitative classification management for electroplating enterprises is a valuable approach that facilitates supervision and management by the health department and provides an effective method for the self-management of enterprises.
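The grading rule described above reduces to a simple threshold mapping. A minimal sketch, with hypothetical enterprise scores:

```python
def grade(score):
    """Map a total score (0-100) to the A/B/C grades described above."""
    if score > 85:
        return "A"
    return "B" if score >= 60 else "C"

scores = {"plant_1": 91, "plant_2": 72, "plant_3": 55}  # hypothetical scores
print({name: grade(s) for name, s in scores.items()})
```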
Assessment of and standardization for quantitative nondestructive test
NASA Technical Reports Server (NTRS)
Neuschaefer, R. W.; Beal, J. B.
1972-01-01
Present capabilities and limitations of nondestructive testing (NDT) as applied to aerospace structures during design, development, production, and operational phases are assessed. The assessment helps determine what useful structural quantitative and qualitative data may be provided, from raw materials to vehicle refurbishment. It considers metal alloy systems and bonded composites presently applied in active NASA programs or strong contenders for future use. Quantitative and qualitative data have been summarized from recent literature and in-house information, and are presented along with a description of the structures or standards from which the information was obtained. Examples, in tabular form, of NDT technique capabilities and limitations are provided. The NDT techniques discussed and assessed were radiography, ultrasonics, penetrants, thermal, acoustic, and electromagnetic. Quantitative data are sparse; therefore, obtaining statistically reliable flaw detection data must be strongly emphasized. The new requirements for reusable space vehicles have resulted in highly efficient design concepts operating in severe environments. This increases the need for quantitative NDT evaluation of selected structural components, the end-item structure, and refurbishment operations.
Engelberg, Jesse A.; Giberson, Richard T.; Young, Lawrence J.T.; Hubbard, Neil E.
2014-01-01
Microwave methods of fixation can dramatically shorten fixation times while preserving tissue structure; however, it remains unclear if adequate tissue antigenicity is preserved. To assess and validate antigenicity, robust quantitative methods and animal disease models are needed. We used two mouse mammary models of human breast cancer to evaluate microwave-assisted and standard 24-hr formalin fixation. The mouse models expressed four antigens prognostic for breast cancer outcome: estrogen receptor, progesterone receptor, Ki67, and human epidermal growth factor receptor 2. Using pathologist evaluation and novel methods of quantitative image analysis, we measured and compared the quality of antigen preservation, percentage of positive cells, and line plots of cell intensity. Visual evaluations by pathologists established that the amounts and patterns of staining were similar in tissues fixed by the different methods. The results of the quantitative image analysis provided a fine-grained evaluation, demonstrating that tissue antigenicity is preserved in tissues fixed using microwave methods. Evaluation of the results demonstrated that a 1-hr, 150-W fixation is better than a 45-min, 150-W fixation followed by a 15-min, 650-W fixation. The results demonstrated that microwave-assisted formalin fixation can standardize fixation times to 1 hr and produce immunohistochemistry that is in every way commensurate with longer conventional fixation methods. PMID:24682322
Gao, Fan; Rodriguez, Johanan; Kapp, Susan
2016-06-01
Harness fitting in body-powered prostheses remains more art than science due to a lack of consistent, quantitative evaluation. The aim of this study was to develop a mechanical, human-body-shaped apparatus to simulate body-powered upper limb prosthetic usage and to evaluate its capability for quantitative examination of harness configuration. The apparatus was built upon the torso of a wooden mannequin and integrated the major mechanical joints needed to simulate terminal device operation. Sensors registered cable tension, cable excursion, and grip force simultaneously. The apparatus allowed the scapula to move up to 127 mm laterally, and the load cell measured cable tension up to 445 N. Our preliminary evaluation highlighted the need for and importance of investigating harness configurations in a systematic and controllable manner. The apparatus allows objective, systematic, and quantitative evaluation of the effects of realistic harness configurations and will provide insightful working knowledge on harness fitting in upper limb amputees using body-powered prostheses.
Image segmentation evaluation for very-large datasets
NASA Astrophysics Data System (ADS)
Reeves, Anthony P.; Liu, Shuang; Xie, Yiting
2016-03-01
With the advent of modern machine learning methods and fully automated image analysis there is a need for very large image datasets having documented segmentations for both computer algorithm training and evaluation. Current approaches of visual inspection and manual markings do not scale well to big data. We present a new approach that depends on fully automated algorithm outcomes for segmentation documentation, requires no manual marking, and provides quantitative evaluation for computer algorithms. The documentation of new image segmentations and new algorithm outcomes are achieved by visual inspection. The burden of visual inspection on large datasets is minimized by (a) customized visualizations for rapid review and (b) reducing the number of cases to be reviewed through analysis of quantitative segmentation evaluation. This method has been applied to a dataset of 7,440 whole-lung CT images for 6 different segmentation algorithms designed to fully automatically facilitate the measurement of a number of very important quantitative image biomarkers. The results indicate that we could achieve 93% to 99% successful segmentation for these algorithms on this relatively large image database. The presented evaluation method may be scaled to much larger image databases.
Lunar mineral feedstocks from rocks and soils: X-ray digital imaging in resource evaluation
NASA Technical Reports Server (NTRS)
Chambers, John G.; Patchen, Allan; Taylor, Lawrence A.; Higgins, Stefan J.; Mckay, David S.
1994-01-01
The rocks and soils of the Moon provide raw materials essential to the successful establishment of a lunar base. Efficient exploitation of these resources requires accurate characterization of mineral abundances, sizes/shapes, and association of 'ore' and 'gangue' phases, as well as the technology to generate high-yield/high-grade feedstocks. Only recently have x-ray mapping and digital imaging techniques been applied to lunar resource evaluation. The topics covered include inherent differences between lunar basalts and soils and quantitative comparison of rock-derived and soil-derived ilmenite concentrates. It is concluded that x-ray digital-imaging characterization of lunar raw materials provides a quantitative comparison that is unattainable by traditional petrographic techniques. These data are necessary for accurately determining mineral distributions of soil and crushed rock material. Application of these techniques will provide an important link to choosing the best raw material for mineral beneficiation.
Jha, Abhinav K.; Mena, Esther; Caffo, Brian; Ashrafinia, Saeed; Rahmim, Arman; Frey, Eric; Subramaniam, Rathan M.
2017-01-01
Recently, a class of no-gold-standard (NGS) techniques has been proposed to evaluate quantitative imaging methods using patient data. These techniques provide figures of merit (FoMs) quantifying the precision of the estimated quantitative value without requiring repeated measurements and without requiring a gold standard. However, applying these techniques to patient data presents several practical difficulties, including assessing the underlying assumptions, accounting for patient-sampling-related uncertainty, and assessing the reliability of the estimated FoMs. To address these issues, we propose statistical tests that provide confidence in the underlying assumptions and in the reliability of the estimated FoMs. Furthermore, the NGS technique is integrated within a bootstrap-based methodology to account for patient-sampling-related uncertainty. The developed NGS framework was applied to evaluate four methods for segmenting lesions from ¹⁸F-fluoro-2-deoxyglucose positron emission tomography images of patients with head-and-neck cancer on the task of precisely measuring the metabolic tumor volume. The NGS technique consistently predicted the same segmentation method as the most precise method. The proposed framework provided confidence in these results, even when gold-standard data were not available. The bootstrap-based methodology indicated improved performance of the NGS technique with larger numbers of patient studies, as was expected, and yielded consistent results as long as data from more than 80 lesions were available for the analysis. PMID:28331883
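The bootstrap layer described above can be illustrated generically: patient measurements are resampled with replacement and the figure of merit is recomputed on each resample. In the sketch below, `fom_from_sample` is a hypothetical stand-in for the NGS estimation step, which is not reproduced here, and the data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

def fom_from_sample(values):
    # Placeholder for the no-gold-standard figure-of-merit computation.
    return values.std(ddof=1) / values.mean()

measured = rng.normal(20.0, 3.0, size=120)  # hypothetical per-lesion MTV values

# Resample patients with replacement; summarize spread of the FoM.
foms = np.array([
    fom_from_sample(rng.choice(measured, size=measured.size, replace=True))
    for _ in range(2000)
])
lo, hi = np.percentile(foms, [2.5, 97.5])
print(f"FoM 95% bootstrap interval: [{lo:.3f}, {hi:.3f}]")
```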
A Systematic Review of Quantitative Resilience Measures for Water Infrastructure Systems
Shin, Sangmin; Lee, Seungyub; Judi, David; ...
2018-02-07
Over the past few decades, the concept of resilience has emerged as an important consideration in the planning and management of water infrastructure systems. Accordingly, various resilience measures have been developed for the quantitative evaluation and decision-making of systems. There are, however, numerous considerations and no clear choice of which measure, if any, provides the most appropriate representation of resilience for a given application. This study provides a critical review of quantitative approaches to measure the resilience of water infrastructure systems, with a focus on water resources and distribution systems. A compilation of 11 criteria evaluating 21 selected resilience measures addressing major features of resilience is developed using the Axiomatic Design process. Existing gaps of resilience measures are identified based on the review criteria. The results show that resilience measures have generally paid less attention to cascading damage to interrelated systems, rapid identification of failure, physical damage of system components, and time variation of resilience. Concluding the paper, improvements to resilience measures are recommended. The findings contribute to our understanding of gaps and provide information to help further improve resilience measures of water infrastructure systems.
Industrial ecology: Quantitative methods for exploring a lower carbon future
NASA Astrophysics Data System (ADS)
Thomas, Valerie M.
2015-03-01
Quantitative methods for environmental and cost analyses of energy, industrial, and infrastructure systems are briefly introduced and surveyed, with the aim of encouraging broader utilization and development of quantitative methods in sustainable energy research. Material and energy flow analyses can provide an overall system overview. The methods of engineering economics and cost benefit analysis, such as net present values, are the most straightforward approach for evaluating investment options, with the levelized cost of energy being a widely used metric in electricity analyses. Environmental lifecycle assessment has been extensively developed, with both detailed process-based and comprehensive input-output approaches available. Optimization methods provide an opportunity to go beyond engineering economics to develop detailed least-cost or least-impact combinations of many different choices.
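Two of the metrics named above, net present value and the levelized cost of energy, have standard forms that a short sketch can make concrete; the cash flows and energy outputs below are hypothetical.

```python
def npv(rate, cashflows):
    """Net present value of a cash-flow series (year 0 first)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def lcoe(rate, costs, energy_kwh):
    """Levelized cost of energy: discounted cost per discounted kWh."""
    disc_cost = sum(c / (1 + rate) ** t for t, c in enumerate(costs))
    disc_energy = sum(e / (1 + rate) ** t for t, e in enumerate(energy_kwh))
    return disc_cost / disc_energy

costs = [1_000_000] + [25_000] * 20   # capital outlay, then annual O&M
energy = [0] + [1_500_000] * 20       # kWh produced per year
print(npv(0.05, [-c for c in costs]), lcoe(0.05, costs, energy))
```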
Quantiprot - a Python package for quantitative analysis of protein sequences.
Konopka, Bogumił M; Marciniak, Marta; Dyrka, Witold
2017-07-17
The field of protein sequence analysis is dominated by tools rooted in substitution matrices and alignments. A complementary approach is provided by methods of quantitative characterization. A major advantage of this approach is that quantitative properties define a multidimensional solution space in which sequences can be related to each other and differences can be meaningfully interpreted. Quantiprot is a software package in Python which provides a simple and consistent interface to multiple methods for the quantitative characterization of protein sequences. The package can be used to calculate dozens of characteristics directly from sequences or using physico-chemical properties of amino acids. Besides basic measures, Quantiprot performs quantitative analysis of recurrence and determinism in the sequence, calculates the distribution of n-grams, and computes the Zipf's law coefficient. We propose three main fields of application of the Quantiprot package. First, quantitative characteristics can be used in alignment-free similarity searches and in clustering of large and/or divergent sequence sets. Second, a feature space defined by quantitative properties can be used in comparative studies of protein families and organisms. Third, the feature space can be used for evaluating generative models, where a large number of sequences generated by a model can be compared to actually observed sequences.
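Two of the analyses mentioned above, the n-gram distribution and the Zipf's law coefficient, can be sketched generically. The code below deliberately does not use the Quantiprot API; it is a plain illustration, and the toy sequence is arbitrary (real analyses would use full protein sets).

```python
import numpy as np
from collections import Counter

def ngram_counts(seq, n=2):
    """Count overlapping n-grams in a sequence."""
    return Counter(seq[i:i + n] for i in range(len(seq) - n + 1))

seq = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQAPILSRVGDGTQDNLSGAEKAVQ"
counts = ngram_counts(seq, 2)

# Zipf coefficient: slope of the log-log rank-frequency relation.
freqs = np.sort(np.array(list(counts.values())))[::-1]
ranks = np.arange(1, freqs.size + 1)
zipf_slope = np.polyfit(np.log(ranks), np.log(freqs), 1)[0]
print(zipf_slope)
```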
Miramar College Program Evaluation: Aviation Maintenance.
ERIC Educational Resources Information Center
Moriyama, Bruce; Brumley, Leslie
Qualitative and quantitative data are presented in this evaluation of the curricular, personnel, and financial status of Miramar College's program in aviation maintenance. The report first provides the results of an interview with the program chairperson, which sought information on program objectives and goals and their determination, the extent…
Sachpekidis, Christos; Anwar, Hoda; Winkler, Julia K; Kopp-Schneider, Annette; Larribere, Lionel; Haberkorn, Uwe; Hassel, Jessica C; Dimitrakopoulou-Strauss, Antonia
2018-06-05
Immunotherapy has raised the issue of appropriate treatment response evaluation, due to the unique mechanism of action of immunotherapeutic agents. The aim of this analysis is to evaluate the potential role of quantitative analysis of 2-deoxy-2-(¹⁸F)fluoro-D-glucose (¹⁸F-FDG) positron emission tomography/computed tomography (PET/CT) data in the monitoring of patients with metastatic melanoma undergoing ipilimumab therapy. Twenty-five patients with unresectable metastatic melanoma underwent dynamic PET/CT (dPET/CT) of the thorax and upper abdomen as well as static, whole-body PET/CT with ¹⁸F-FDG before the start of ipilimumab treatment (baseline PET/CT), after two cycles of treatment (interim PET/CT), and at the end of treatment after four cycles (late PET/CT). The evaluation of dPET/CT studies was based on semi-quantitative (standardized uptake value, SUV) calculation as well as quantitative analysis, based on two-tissue compartment modeling and a fractal approach. Patients' best clinical response, assessed at a mean of 59 weeks, was used as the reference. According to their best clinical response, patients were dichotomized into those demonstrating clinical benefit (CB, n = 16 patients) and those demonstrating no clinical benefit (no-CB, n = 9 patients). No statistically significant differences were observed between CB and no-CB regarding either semi-quantitative or quantitative parameters in any scan. By contrast, application of the recently introduced PET response evaluation criteria for immunotherapy (PERCIMT) led to a correct classification rate of 84% (21/25 patients). Quantitative analysis of ¹⁸F-FDG PET data does not provide additional information in treatment response evaluation of metastatic melanoma patients receiving ipilimumab. PERCIMT criteria correlated better with clinical response.
USDA-ARS?s Scientific Manuscript database
Agricultural research increasingly is expected to provide precise, quantitative information with an explicit geographic coverage. Limited availability of continuous daily meteorological records often constrains efforts to provide such information through integrated use of simulation models, spatial ...
Quantitative Analysis of the Cervical Texture by Ultrasound and Correlation with Gestational Age.
Baños, Núria; Perez-Moreno, Alvaro; Migliorelli, Federico; Triginer, Laura; Cobo, Teresa; Bonet-Carne, Elisenda; Gratacos, Eduard; Palacio, Montse
2017-01-01
Quantitative texture analysis has been proposed to extract robust features from the ultrasound image to detect subtle changes in the textures of the images. The aim of this study was to evaluate the feasibility of quantitative cervical texture analysis to assess cervical tissue changes throughout pregnancy. This was a cross-sectional study including singleton pregnancies between 20.0 and 41.6 weeks of gestation from women who delivered at term. Cervical length was measured, and a selected region of interest in the cervix was delineated. A model to predict gestational age based on features extracted from cervical images was developed following three steps: data splitting, feature transformation, and regression model computation. Seven hundred images, 30 per gestational week, were included for analysis. There was a strong correlation between the gestational age at which the images were obtained and the gestational age estimated by quantitative analysis of the cervical texture (R = 0.88). This study provides evidence that quantitative analysis of cervical texture can extract features from cervical ultrasound images that correlate with gestational age. Further research is needed to evaluate its applicability as a biomarker of the risk of spontaneous preterm birth, as well as its role in cervical assessment in other clinical situations in which cervical evaluation might be relevant.
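The three-step pipeline described above (data splitting, feature transformation, regression) maps naturally onto standard tooling. A minimal sketch follows with synthetic data; the feature dimensionality, model choice, and data are assumptions, not the study's actual texture features or regression model.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import Ridge

rng = np.random.default_rng(1)
X = rng.normal(size=(700, 12))                   # 12 texture features per image
w = rng.normal(size=12)                          # synthetic feature weights
y = 31 + X @ w + rng.normal(scale=1.5, size=700) # gestational age target (weeks)

# Step 1: data splitting; steps 2-3: feature transformation and regression.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = make_pipeline(StandardScaler(), Ridge(alpha=1.0))
model.fit(X_tr, y_tr)
print("held-out R:", np.corrcoef(model.predict(X_te), y_te)[0, 1])
```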
systems such as management information systems. To provide a methodology yielding quantitative results which may assist a commander and his staff in...this analysis, it is proposed that management information systems be evaluated as a whole by a technique defined as the semantic differential. Each
Miramar College Program Evaluation: Fire Science.
ERIC Educational Resources Information Center
Moriyama, Bruce; Brumley, Leslie
Qualitative and quantitative data are presented in this evaluation of the curricular, personnel, and financial status of Miramar College's program in fire sciences. The report first provides the results of an interview with the program chairperson, which sought information on program objectives and goals and their determination, the extent to…
Object-Oriented Algorithm For Evaluation Of Fault Trees
NASA Technical Reports Server (NTRS)
Patterson-Hine, F. A.; Koen, B. V.
1992-01-01
Algorithm for direct evaluation of fault trees incorporates techniques of object-oriented programming. Reduces number of calls needed to solve trees with repeated events. Provides significantly improved software environment for such computations as quantitative analyses of safety and reliability of complicated systems of equipment (e.g., spacecraft or factories).
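A generic illustration of the idea, not the paper's algorithm: events and gates are objects, and a memo table keeps a repeated subtree from being re-solved on every reference. The sketch assumes independence at each gate, so a shared basic event is not handled exactly; treating that dependence correctly is precisely the complication that repeated-event techniques address.

```python
class BasicEvent:
    def __init__(self, name, p):
        self.name, self.p = name, p

    def prob(self, memo):
        return self.p

class Gate:
    def __init__(self, name, kind, children):
        self.name, self.kind, self.children = name, kind, children

    def prob(self, memo):
        if self.name not in memo:  # repeated subtrees are solved only once
            ps = [c.prob(memo) for c in self.children]
            if self.kind == "AND":
                out = 1.0
                for p in ps:
                    out *= p
            else:  # "OR"
                out = 1.0
                for p in ps:
                    out *= 1.0 - p
                out = 1.0 - out
            memo[self.name] = out
        return memo[self.name]

pump = BasicEvent("pump_fails", 0.01)
valve = BasicEvent("valve_fails", 0.02)
power = BasicEvent("power_loss", 0.005)          # shared (repeated) event
train_a = Gate("train_a", "OR", [pump, power])
train_b = Gate("train_b", "OR", [valve, power])
top = Gate("system_fails", "AND", [train_a, train_b])
print(top.prob({}))  # top-event probability under gate-level independence
```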
NASA Astrophysics Data System (ADS)
Shrivastava, Sajal; Sohn, Il-Yung; Son, Young-Min; Lee, Won-Il; Lee, Nae-Eung
2015-11-01
Although real-time label-free fluorescent aptasensors based on nanomaterials are increasingly recognized as a useful strategy for the detection of target biomolecules with high fidelity, the lack of an imaging-based quantitative measurement platform limits their implementation with biological samples. Here we introduce an ensemble strategy for a real-time label-free fluorescent graphene (Gr) aptasensor platform. This platform employs aptamer length-dependent tunability, thus enabling the reagentless quantitative detection of biomolecules through computational processing coupled with real-time fluorescence imaging data. We demonstrate that this strategy effectively delivers dose-dependent quantitative readouts of adenosine triphosphate (ATP) concentration on chemical vapor deposited (CVD) Gr and reduced graphene oxide (rGO) surfaces, thereby providing cytotoxicity assessment. Compared with conventional fluorescence spectrometry methods, our highly efficient, universally applicable, and rational approach will facilitate broader implementation of imaging-based biosensing platforms for the quantitative evaluation of a range of target molecules.
Liu, Hueiming; Lindley, Richard; Alim, Mohammed; Felix, Cynthia; Gandhi, Dorcas B C; Verma, Shweta J; Tugnawat, Deepak Kumar; Syrigapu, Anuradha; Ramamurthy, Ramaprabhu Krishnappa; Pandian, Jeyaraj D; Walker, Marion; Forster, Anne; Anderson, Craig S; Langhorne, Peter; Murthy, Gudlavalleti Venkata Satyanarayana; Shamanna, Bindiganavale Ramaswamy; Hackett, Maree L; Maulik, Pallab K; Harvey, Lisa A; Jan, Stephen
2016-01-01
Introduction: We are undertaking a randomised controlled trial (fAmily led rehabiliTaTion aftEr stroke in INDia, ATTEND) evaluating training a family carer to enable maximal rehabilitation of patients with stroke-related disability, as a potentially affordable, culturally acceptable and effective intervention for use in India. A process evaluation is needed to understand how and why this complex intervention may be effective, and to capture important barriers and facilitators to its implementation. We describe the protocol for our process evaluation to encourage the development of process evaluation methodology and transparency in reporting. Methods and analysis: The realist and RE-AIM (Reach, Effectiveness, Adoption, Implementation and Maintenance) frameworks informed the design. Mixed methods include semistructured interviews with health providers, patients and their carers, analysis of quantitative process data describing fidelity and dose of intervention, observations of trial set-up and implementation, and analysis of cost data from the perspective of patients and their families and of programme budgets. These qualitative and quantitative data will be analysed iteratively prior to knowing the quantitative outcomes of the trial, and then triangulated with the results from the primary outcome evaluation. Ethics and dissemination: The process evaluation has received ethical approval for all sites in India. In low-income and middle-income countries, the available human capital can form an approach to reducing the evidence-practice gap, compared with the high-cost alternatives available in established market economies. This process evaluation will provide insights into how such a programme can be implemented in practice and brought to scale. Through local stakeholder engagement and dissemination of findings globally, we hope to build on patient-centred, cost-effective and sustainable models of stroke rehabilitation. Trial registration number: CTRI/2013/04/003557. PMID:27633636
Quantitative aspects of inductively coupled plasma mass spectrometry
NASA Astrophysics Data System (ADS)
Bulska, Ewa; Wagner, Barbara
2016-10-01
Accurate determination of elements in various kinds of samples is essential for many areas, including environmental science, medicine, as well as industry. Inductively coupled plasma mass spectrometry (ICP-MS) is a powerful tool enabling multi-elemental analysis of numerous matrices with high sensitivity and good precision. Various calibration approaches can be used to perform accurate quantitative measurements by ICP-MS. They include the use of pure standards, matrix-matched standards, or relevant certified reference materials, assuring traceability of the reported results. This review critically evaluates the advantages and limitations of different calibration approaches, which are used in quantitative analyses by ICP-MS. Examples of such analyses are provided. This article is part of the themed issue 'Quantitative mass spectrometry'.
A Systematic Approach for Quantitative Analysis of Multidisciplinary Design Optimization Framework
NASA Astrophysics Data System (ADS)
Kim, Sangho; Park, Jungkeun; Lee, Jeong-Oog; Lee, Jae-Woo
An efficient Multidisciplinary Design and Optimization (MDO) framework for an aerospace engineering system should use and integrate distributed resources such as various analysis codes, optimization codes, Computer Aided Design (CAD) tools, and Data Base Management Systems (DBMS) in a heterogeneous environment, and needs to provide user-friendly graphical user interfaces. In this paper, we propose a systematic approach for determining a reference MDO framework and for evaluating MDO frameworks. The proposed approach incorporates two well-known methods, the Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD), in order to provide a quantitative analysis of the qualitative criteria of MDO frameworks. The framework requirements and their corresponding solutions were identified and organized into hierarchies for two reference MDO frameworks: a general one and an aircraft-oriented one. The reference frameworks were then quantitatively characterized using AHP and QFD, and an assessment of three in-house frameworks was performed. The results produced clear and useful guidelines for improving the in-house MDO frameworks and showed the feasibility of the proposed approach for evaluating an MDO framework without human interference.
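The AHP step mentioned above has a standard computation: the priority weights are the normalized principal eigenvector of a pairwise comparison matrix. A minimal sketch with hypothetical comparison values, not the paper's actual criteria:

```python
import numpy as np

# Pairwise comparisons of three framework criteria (hypothetical values);
# A[i, j] expresses how much more important criterion i is than criterion j.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)
w = np.abs(vecs[:, k].real)
w /= w.sum()                           # normalized priority weights

n = A.shape[0]
ci = (vals.real[k] - n) / (n - 1)      # consistency index (0 if perfectly consistent)
print("priority weights:", np.round(w, 3), "CI:", round(ci, 4))
```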
Four-point bending as a method for quantitatively evaluating spinal arthrodesis in a rat model.
Robinson, Samuel T; Svet, Mark T; Kanim, Linda A; Metzger, Melodie F
2015-02-01
The most common method of evaluating the success (or failure) of rat spinal fusion procedures is manual palpation testing. Whereas manual palpation provides only a subjective binary answer (fused or not fused) regarding the success of a fusion surgery, mechanical testing can provide more quantitative data by assessing variations in strength among treatment groups. Here we describe a mechanical testing method to quantitatively assess single-level spinal fusion in a rat model, to improve on the binary and subjective nature of manual palpation as an end point for fusion-related studies. We tested explanted lumbar segments from Sprague-Dawley rat spines after single-level posterolateral fusion procedures at L4-L5. Segments were classified as 'not fused,' 'restricted motion,' or 'fused' by using manual palpation testing. After thorough dissection and potting of the spine, 4-point bending in flexion was then applied to the L4-L5 motion segment, and stiffness was measured as the slope of the moment-displacement curve. Results demonstrated statistically significant differences in stiffness among all groups, consistent with the preliminary grading by manual palpation. In addition, the 4-point bending results provided quantitative information regarding the quality of the bony union formed and therefore enabled comparison of fused specimens. Our results demonstrate that 4-point bending is a simple, reliable, and effective way to describe and compare results among rat spines after fusion surgery.
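The stiffness measure described above is a linear-fit slope. A minimal sketch with hypothetical loading data:

```python
import numpy as np

displacement = np.array([0.00, 0.05, 0.10, 0.15, 0.20, 0.25])  # mm
moment = np.array([0.0, 1.9, 4.1, 6.0, 8.2, 10.1])             # N*mm

# Stiffness = slope of the moment-displacement curve over the loading ramp.
stiffness = np.polyfit(displacement, moment, 1)[0]
print(f"flexural stiffness = {stiffness:.1f} N*mm per mm")
```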
Exploring a taxonomy for aggression against women: can it aid conceptual clarity?
Cook, Sarah; Parrott, Dominic
2009-01-01
The assessment of aggression against women is demanding primarily because assessment strategies do not share a common language to describe reliably the wide range of forms of aggression women experience. The lack of a common language impairs efforts to describe these experiences, understand causes and consequences of aggression against women, and develop effective intervention and prevention efforts. This review accomplishes two goals. First, it applies a theoretically and empirically based taxonomy to behaviors assessed by existing measurement instruments. Second, it evaluates whether the taxonomy provides a common language for the field. Strengths of the taxonomy include its ability to describe and categorize all forms of aggression found in existing quantitative measures. The taxonomy also classifies numerous examples of aggression discussed in the literature but notably absent from quantitative measures. Although we use existing quantitative measures as a starting place to evaluate the taxonomy, its use is not limited to quantitative methods. Implications for theory, research, and practice are discussed.
Evaluation of Deblur Methods for Radiography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wood, William M.
2014-03-31
Radiography is used as a primary diagnostic for dynamic experiments, providing time-resolved radiographic measurements of areal mass density along a line of sight through the experiment. It is well known that the finite spot extent of the radiographic source, as well as scattering, are sources of blurring of the radiographic images. This blurring interferes with quantitative measurement of the areal mass density. In order to improve the quantitative utility of this diagnostic, it is necessary to deblur or "restore" the radiographs to recover the "true" areal mass density from a radiographic transmission measurement. Towards this end, I am evaluating three separate methods currently in use for deblurring radiographs. I begin by briefly describing the problems associated with image restoration, and outlining the three methods. Next, I illustrate how blurring affects the quantitative measurements using radiographs. I then present the results of the various deblur methods, evaluating each according to several criteria. After I have summarized the results of the evaluation, I give a detailed account of how the restoration process is actually implemented.
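The abstract does not name the three methods evaluated; as one widely used member of this family of techniques, here is a minimal Richardson-Lucy deconvolution sketch, assuming the source blur can be modeled by a known Gaussian point-spread function (all data synthetic):

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(blurred, psf, n_iter=50):
    """Iteratively estimate the unblurred image given a known PSF."""
    psf = psf / psf.sum()
    psf_mirror = psf[::-1, ::-1]
    estimate = np.full(blurred.shape, blurred.mean())
    for _ in range(n_iter):
        predicted = fftconvolve(estimate, psf, mode="same")
        ratio = blurred / np.maximum(predicted, 1e-12)
        estimate *= fftconvolve(ratio, psf_mirror, mode="same")
    return estimate

# Synthetic demo: a sharp-edged object blurred by a Gaussian spot.
x = np.arange(-7, 8)
psf = np.exp(-(x[:, None] ** 2 + x[None, :] ** 2) / (2 * 2.0 ** 2))
truth = np.zeros((64, 64))
truth[20:44, 20:44] = 1.0
blurred = fftconvolve(truth, psf / psf.sum(), mode="same")
restored = richardson_lucy(blurred, psf)
print("RMS error before/after restoration:",
      np.sqrt(np.mean((blurred - truth) ** 2)),
      np.sqrt(np.mean((restored - truth) ** 2)))
```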
CCTV Coverage Index Based on Surveillance Resolution and Its Evaluation Using 3D Spatial Analysis
Choi, Kyoungah; Lee, Impyeong
2015-01-01
We propose a novel approach to evaluating how effectively a closed circuit television (CCTV) system can monitor a targeted area. With 3D models of the target area and the camera parameters of the CCTV system, the approach produces a surveillance coverage index, newly defined in this study as a quantitative measure of surveillance performance. This index indicates the proportion of the space being monitored with sufficient resolution relative to the entire space of the target area. It is determined by computing the surveillance resolution at every position and orientation, which indicates how closely a specific object can be monitored with the CCTV system. We present a full mathematical derivation of the resolution, which depends on the location and orientation of the object as well as the geometric model of the camera. With the proposed approach, we quantitatively evaluated the surveillance coverage of a CCTV system in an underground parking area. Our evaluation process provided various quantitative analysis results, enabling us to examine the design of a CCTV system prior to its installation and to understand the surveillance capability of an existing CCTV system.
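A minimal sketch of the coverage-index idea: sample the target volume, apply a simplified pinhole resolution model in which resolution falls off with distance, and report the fraction of sample points that at least one camera sees with resolution above a threshold. The camera parameters, threshold, and resolution model are illustrative assumptions and omit the paper's treatment of object orientation:

```python
import numpy as np

rng = np.random.default_rng(0)
points = rng.uniform([0, 0, 0], [30, 20, 3], size=(5000, 3))  # target space, m

cameras = [  # position, viewing direction, half field-of-view (rad), focal length (px)
    dict(pos=np.array([0.0, 0.0, 2.5]), dirn=np.array([1.0, 0.6, -0.1]),
         half_fov=0.6, f_px=1200.0),
    dict(pos=np.array([30.0, 20.0, 2.5]), dirn=np.array([-1.0, -0.6, -0.1]),
         half_fov=0.6, f_px=1200.0),
]

MIN_RES = 40.0  # required resolution, pixels per metre at the object (assumed)

covered = np.zeros(len(points), dtype=bool)
for cam in cameras:
    d = points - cam["pos"]
    dist = np.linalg.norm(d, axis=1)
    u = cam["dirn"] / np.linalg.norm(cam["dirn"])
    angle = np.arccos(np.clip(d @ u / dist, -1, 1))     # off-axis angle
    res = cam["f_px"] / dist    # px per metre for a fronto-parallel object
    covered |= (angle < cam["half_fov"]) & (res >= MIN_RES)

print("coverage index:", covered.mean())
```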
Evaluative procedures to detect, characterize, and assess the severity of diabetic neuropathy.
Dyck, P J
1991-01-01
Minimal criteria for diabetic neuropathy need to be defined and universally applied. Standardized evaluative procedures need to be agreed and normal ranges determined from healthy volunteers. Types and stages of neuropathy should be established and assessments performed on representative populations of both Type 1 and Type 2 diabetic patients. Potential minimal criteria include absent ankle reflexes and vibratory sensation, and abnormalities of nerve conduction. However, the preferred criterion is the identification of more than two statistically defined abnormalities among symptoms and deficits, nerve conduction, quantitative sensory examination or quantitative autonomic examination. Various evaluative procedures are available. Symptoms should be assessed and scores can be assigned to neurological deficits. However, assessments of nerve conduction provide the most specific, objective, sensitive, and repeatable procedures, although these may be the least meaningful. Many techniques are available for quantitative sensory examination, but are poorly standardized and normal values are not available. For quantitative autonomic examination, tests are available for the adequacy of cardiovascular and peripheral vascular reflexes and increasingly for other autonomic functions. In any assessment of nerve function the conditions should be optimized and standardized, and stimuli defined. Specific instructions should be given and normal ranges established in healthy volunteers.
EMERGY METHODS: VALUABLE INTEGRATED ASSESSMENT TOOLS
NHEERL's Atlantic Ecology Division is investigating emergy methods as tools for integrated assessment in several projects evaluating environmental impacts, policies, and alternatives for remediation and intervention. Emergy accounting is a methodology that provides a quantitative...
NASA Technical Reports Server (NTRS)
Mulhall, B. D. L.
1980-01-01
The performance, costs, organization and other characteristics of both the manual system and AIDS 2 were used to establish a baseline case. The results of the evaluation are to be used to determine the feasibility of the AIDS 3 System, as well as provide a basis for ranking alternative systems during the second phase of the JPL study. The results of the study were tabulated by subject, scope and methods, providing a descriptive, quantitative and qualitative analysis of the current operating systems employed by the FBI Identification Division.
Our goal is to establish an in vitro model system to evaluate chemical effects using a single stem cell culture technique that would improve throughput and provide quantitative markers of differentiation and cell number. To this end, we have used an adherent cell differentiation ...
ERIC Educational Resources Information Center
Bruning, Stephen D.
2002-01-01
Determines whether student-university relationship attitudes and satisfaction evaluations distinguished those who returned to a university from those who did not. Shows that respondent relationship attitudes differentiate those who returned to the university from those who did not, which provides a quantitative illustration of the benefits of…
Rory Saunders; Michael A. Bozek; Clayton J. Edwards; Martin J. Jennings; Steven P. Newman
2002-01-01
Evaluating spawning success in relation to habitat characteristics of nest sites provides critical information necessary to assess the effects that riparian and littoral zone habitat alterations have on smallmouth bass Micropterus dolomieu survival and recruitment. The objective of this study was to quantitatively evaluate smallmouth bass nest site...
Student Evaluation of Teaching: The Use of Best-Worst Scaling
ERIC Educational Resources Information Center
Huybers, Twan
2014-01-01
An important purpose of student evaluation of teaching is to inform an educator's reflection about the strengths and weaknesses of their teaching approaches. Quantitative instruments are one way of obtaining student responses. They have traditionally taken the form of surveys in which students provide their responses to various statements using…
Initial Results of Illinois' Shifting Gears Pilot Demonstration Evaluation
ERIC Educational Resources Information Center
Bragg, Debra D.; Harmon, Timothy; Kirby, Catherine L.; Kim, Sujung
2009-01-01
This report provides initial results of Illinois' Shifting Gears Initiative that operated between July 1, 2007 and June 30, 2009. This mixed method (qualitative and quantitative) evaluation sought to accomplish three goals: (1) to assess program and student outcomes for two models (adult education and developmental education) for two target groups…
To label or not to label: applications of quantitative proteomics in neuroscience research.
Filiou, Michaela D; Martins-de-Souza, Daniel; Guest, Paul C; Bahn, Sabine; Turck, Christoph W
2012-02-01
Proteomics has provided researchers with a sophisticated toolbox of labeling-based and label-free quantitative methods. These are now being applied in neuroscience research where they have already contributed to the elucidation of fundamental mechanisms and the discovery of candidate biomarkers. In this review, we evaluate and compare labeling-based and label-free quantitative proteomic techniques for applications in neuroscience research. We discuss the considerations required for the analysis of brain and central nervous system specimens, the experimental design of quantitative proteomic workflows as well as the feasibility, advantages, and disadvantages of the available techniques for neuroscience-oriented questions. Furthermore, we assess the use of labeled standards as internal controls for comparative studies in humans and review applications of labeling-based and label-free mass spectrometry approaches in relevant model organisms and human subjects. Providing a comprehensive guide of feasible and meaningful quantitative proteomic methodologies for neuroscience research is crucial not only for overcoming current limitations but also for gaining useful insights into brain function and translating proteomics from bench to bedside. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Raunig, David L; McShane, Lisa M; Pennello, Gene; Gatsonis, Constantine; Carson, Paul L; Voyvodic, James T; Wahl, Richard L; Kurland, Brenda F; Schwarz, Adam J; Gönen, Mithat; Zahlmann, Gudrun; Kondratovich, Marina V; O'Donnell, Kevin; Petrick, Nicholas; Cole, Patricia E; Garra, Brian; Sullivan, Daniel C
2015-02-01
Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers to measure changes in these features. Critical to the performance of a quantitative imaging biomarker in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in designs, analysis methods, and metrics used to assess a quantitative imaging biomarker for clinical use. It is therefore difficult or impossible to integrate results from different studies or to use reported results to design studies. The Radiological Society of North America and the Quantitative Imaging Biomarker Alliance, with technical, radiological, and statistical experts, developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of quantitative imaging biomarker performance studies so that results from multiple studies can be compared, contrasted, or combined. © The Author(s) 2014.
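As an illustration of the kinds of technical-performance metrics such a framework standardizes, here is a minimal sketch computing bias, a linearity slope, and a test-retest repeatability coefficient (RC = 2.77 x within-subject SD, the 95% bound on the difference between two replicates). All data values are hypothetical:

```python
import numpy as np

truth = np.array([10.0, 20.0, 30.0, 40.0])         # known phantom values
measured = np.array([11.2, 20.9, 31.5, 41.8])      # one measurement each

bias = np.mean(measured - truth)
slope, intercept = np.polyfit(truth, measured, 1)  # linearity: slope should be ~1

# Repeatability from paired test-retest scans of the same subjects.
test = np.array([12.1, 25.3, 33.0, 44.2])
retest = np.array([11.5, 26.0, 31.8, 45.1])
wsd = np.sqrt(np.mean((test - retest) ** 2) / 2.0)  # within-subject SD
rc = 2.77 * wsd  # repeatability coefficient (95% bound on |test - retest|)
print(f"bias={bias:.2f}, slope={slope:.2f}, RC={rc:.2f}")
```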
Reflectance spectroscopy for evaluating hair follicle cycle
NASA Astrophysics Data System (ADS)
Liu, Caihua; Guan, Yue; Wang, Jianru; Zhu, Dan
2014-02-01
The hair follicle, a mini-organ perpetually cycling through telogen, anagen and catagen, provides a valuable experimental model for studying hair and organ regeneration. The transition of the hair follicle from telogen to anagen is a significant sign of successful regeneration. To date, discrimination of the hair follicle stage has been based mostly on canonical histological examination and empirical speculation from skin color; hardly any method has been proposed to evaluate the hair follicle stage quantitatively. In this work, a commercial optical fiber spectrometer was applied to monitor the diffuse reflectance of mouse skin during hair follicle cycling, and the change in reflectance was obtained. Histological examination was used to verify the hair follicle stage. Consistent with the histological examination, the skin diffuse reflectance was relatively high for mice with telogen hair follicles; it decreased once hair follicles transitioned to the anagen stage, and then increased again at the catagen stage. This study provides a new method to quantitatively evaluate the hair follicle stage and should be valuable for basic and therapeutic investigations of hair regeneration.
Alves, Antoine; Attik, Nina; Bayon, Yves; Royet, Elodie; Wirth, Carine; Bourges, Xavier; Piat, Alexis; Dolmazon, Gaëlle; Clermont, Gaëlle; Boutrand, Jean-Pierre; Grosgogeat, Brigitte; Gritsch, Kerstin
2018-03-14
The paradigm shift brought about by the expansion of tissue engineering and regenerative medicine away from the use of biomaterials currently questions the value of histopathologic methods in the evaluation of biological changes. To date, the available tools of evaluation are not fully consistent and satisfactory for these advanced therapies. We have developed a new, simple and inexpensive quantitative digital approach that provides key metrics for structural and compositional characterization of regenerated tissues. For example, the metrics provide the tissue ingrowth rate (TIR), which integrates two separate indicators, the cell ingrowth rate (CIR) and the total collagen content (TCC), as featured in the equation TIR% = CIR% + TCC%. Moreover, a subset of quantitative indicators describing the directional organization of the collagen (relating structure and mechanical function of tissues), the ratio of collagen I to collagen III (remodeling quality) and the optical anisotropy property of the collagen (maturity indicator) was automatically assessed as well. Using an image analyzer, all metrics were extracted from only two serial sections stained with either Feulgen & Rossenbeck (cell specific) or Picrosirius Red F3BA (collagen specific). To validate this new procedure, three-dimensional (3D) scaffolds were intraperitoneally implanted in healthy and in diabetic rats. It was hypothesized that, quantitatively, the healing tissue would be significantly delayed and of poorer quality in diabetic rats in comparison to healthy rats. In addition, a chemically modified 3D scaffold was similarly implanted in a third group of healthy rats with the assumption that modulation of the ingrown tissue would be quantitatively detectable in comparison to the 3D scaffold-healthy group. After 21 days of implantation, both hypotheses were verified using this novel computerized approach. When the two methods were run in parallel, the quantitative results revealed fine details and differences not detected by the semi-quantitative assessment, demonstrating the importance of quantitative analysis in the performance evaluation of soft tissue healing. This automated and supervised method reduced operator dependency and proved to be simple, sensitive, cost-effective and time-effective. It supports objective therapeutic comparisons and helps to elucidate regeneration and the dynamics of a functional tissue.
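A minimal sketch of the headline metrics, assuming two registered stain-specific segmentation masks (cell-positive and collagen-positive pixels) over a common region of interest; the masks below are synthetic placeholders:

```python
import numpy as np

rng = np.random.default_rng(1)
roi = np.ones((512, 512), dtype=bool)            # analysis region of interest
cell_mask = rng.random((512, 512)) < 0.18        # Feulgen & Rossenbeck positive
collagen_mask = rng.random((512, 512)) < 0.25    # Picrosirius Red positive
collagen_mask &= ~cell_mask                      # keep the compartments disjoint

roi_px = roi.sum()
cir = 100.0 * (cell_mask & roi).sum() / roi_px       # cell ingrowth rate, %
tcc = 100.0 * (collagen_mask & roi).sum() / roi_px   # total collagen content, %
tir = cir + tcc                                      # tissue ingrowth rate, %
print(f"CIR={cir:.1f}%  TCC={tcc:.1f}%  TIR={tir:.1f}%")
```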
McElvaine, M D; McDowell, R M; Fite, R W; Miller, L
1993-12-01
The United States Department of Agriculture, Animal and Plant Health Inspection Service (USDA-APHIS) has been exploring methods of quantitative risk assessment to support decision-making, provide risk management options and identify research needs. With current changes in world trade, regulatory decisions must have a scientific basis which is transparent, consistent, documentable and defensible. These quantitative risk assessment methods are described in an accompanying paper in this issue. In the present article, the authors provide an illustration by presenting an application of these methods. Prior to proposing changes in regulations, USDA officials requested an assessment of the risk of introduction of foreign animal disease to the United States of America through garbage from Alaskan cruise ships. The risk assessment team used a combination of quantitative and qualitative methods to evaluate this question. Quantitative risk assessment methods were used to estimate the amount of materials of foreign origin being sent to Alaskan landfills. This application of quantitative risk assessment illustrates the flexibility of the methods in addressing specific questions. By applying these methods, specific areas were identified where more scientific information and research were needed. Even with limited information, the risk assessment provided APHIS management with a scientific basis for a regulatory decision.
Current perspectives of CASA applications in diverse mammalian spermatozoa.
van der Horst, Gerhard; Maree, Liana; du Plessis, Stefan S
2018-03-26
Since the advent of computer-aided sperm analysis (CASA) some four decades ago, advances in computer technology and software algorithms have helped establish it as a research and diagnostic instrument for the analysis of spermatozoa. Despite mammalian spermatozoa being the most diverse cell type known, CASA is a great tool that has the capacity to provide rapid, reliable and objective quantitative assessment of sperm quality. This paper provides contemporary research findings illustrating the scientific and commercial applications of CASA and its ability to evaluate diverse mammalian spermatozoa (human, primates, rodents, domestic mammals, wildlife species) at both structural and functional levels. The potential of CASA to quantitatively measure essential aspects related to sperm subpopulations, hyperactivation, morphology and morphometry is also demonstrated. Furthermore, applications of CASA are provided for improved mammalian sperm quality assessment, evaluation of sperm functionality and the effect of different chemical substances or pathologies on sperm fertilising ability. It is clear that CASA has evolved significantly and is currently superior to many manual techniques in the research and clinical setting.
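As an illustration of the kinematic measures CASA systems report, here is a minimal sketch computing three standard motility parameters from a single tracked sperm-head path; the coordinates and frame rate are hypothetical:

```python
import numpy as np

fps = 50.0                                   # frames per second (assumed)
track = np.array([[0.0, 0.0], [2.1, 0.8], [4.0, -0.5],
                  [6.2, 0.9], [8.1, -0.3], [10.2, 0.6]])  # head positions, um

steps = np.diff(track, axis=0)
path_len = np.linalg.norm(steps, axis=1).sum()
duration = (len(track) - 1) / fps

vcl = path_len / duration                              # curvilinear velocity, um/s
vsl = np.linalg.norm(track[-1] - track[0]) / duration  # straight-line velocity, um/s
lin = vsl / vcl                                        # linearity, 0..1
print(f"VCL={vcl:.1f} um/s  VSL={vsl:.1f} um/s  LIN={lin:.2f}")
```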
Value of Information References
Morency, Christina
2014-12-12
This file contains a list of relevant references on value of information (VOI) in RIS format. VOI provides a quantitative analysis to evaluate the outcome of the combined technologies (seismology, hydrology, geodesy) used to monitor Brady's Geothermal Field.
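For readers unfamiliar with the VOI framing: the value of a monitoring technology is the expected value of the best decision made with the measurement minus that made without it. A minimal worked sketch with hypothetical probabilities and payoffs (not values from the Brady's field study):

```python
# Prior belief and decision payoffs ($M), all hypothetical.
p_hot = 0.4                      # prior: resource is viable
payoff = {("drill", True): 10.0, ("drill", False): -4.0,
          ("walk", True): 0.0, ("walk", False): 0.0}

def best_ev(p):
    """Expected value of the best action given P(viable) = p."""
    return max(p * payoff[(a, True)] + (1 - p) * payoff[(a, False)]
               for a in ("drill", "walk"))

prior_value = best_ev(p_hot)

# Imperfect survey: P(positive | viable) = 0.8, P(positive | not) = 0.3.
p_pos = 0.8 * p_hot + 0.3 * (1 - p_hot)
p_hot_given_pos = 0.8 * p_hot / p_pos                 # Bayes update
p_hot_given_neg = 0.2 * p_hot / (1 - p_pos)

posterior_value = (p_pos * best_ev(p_hot_given_pos)
                   + (1 - p_pos) * best_ev(p_hot_given_neg))
print("VOI ($M):", posterior_value - prior_value)
```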
21 CFR 314.126 - Adequate and well-controlled studies.
Code of Federal Regulations, 2010 CFR
2010-04-01
... permits a valid comparison with a control to provide a quantitative assessment of drug effect. The... data analyses performed. (c) The Director of the Center for Drug Evaluation and Research may, on the...
21 CFR 314.126 - Adequate and well-controlled studies.
Code of Federal Regulations, 2012 CFR
2012-04-01
... permits a valid comparison with a control to provide a quantitative assessment of drug effect. The... data analyses performed. (c) The Director of the Center for Drug Evaluation and Research may, on the...
21 CFR 314.126 - Adequate and well-controlled studies.
Code of Federal Regulations, 2013 CFR
2013-04-01
... permits a valid comparison with a control to provide a quantitative assessment of drug effect. The... data analyses performed. (c) The Director of the Center for Drug Evaluation and Research may, on the...
21 CFR 314.126 - Adequate and well-controlled studies.
Code of Federal Regulations, 2011 CFR
2011-04-01
... permits a valid comparison with a control to provide a quantitative assessment of drug effect. The... data analyses performed. (c) The Director of the Center for Drug Evaluation and Research may, on the...
21 CFR 314.126 - Adequate and well-controlled studies.
Code of Federal Regulations, 2014 CFR
2014-04-01
... permits a valid comparison with a control to provide a quantitative assessment of drug effect. The... data analyses performed. (c) The Director of the Center for Drug Evaluation and Research may, on the...
42 CFR 431.424 - Evaluation requirements.
Code of Federal Regulations, 2012 CFR
2012-10-01
... evaluations. Demonstration evaluations will include the following: (1) Quantitative research methods. (i... of appropriate evaluation strategies (including experimental and other quantitative and qualitative... demonstration. (ii) CMS will consider alternative evaluation designs when quantitative designs are technically...
42 CFR 431.424 - Evaluation requirements.
Code of Federal Regulations, 2013 CFR
2013-10-01
... evaluations. Demonstration evaluations will include the following: (1) Quantitative research methods. (i... of appropriate evaluation strategies (including experimental and other quantitative and qualitative... demonstration. (ii) CMS will consider alternative evaluation designs when quantitative designs are technically...
42 CFR 431.424 - Evaluation requirements.
Code of Federal Regulations, 2014 CFR
2014-10-01
... evaluations. Demonstration evaluations will include the following: (1) Quantitative research methods. (i... of appropriate evaluation strategies (including experimental and other quantitative and qualitative... demonstration. (ii) CMS will consider alternative evaluation designs when quantitative designs are technically...
NASA Technical Reports Server (NTRS)
Shortle, John F.; Allocco, Michael
2005-01-01
This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selection criteria to determine whether engineering modeling can be applied to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.
Quantitative Ultrasound for Nondestructive Characterization of Engineered Tissues and Biomaterials
Dalecki, Diane; Mercado, Karla P.; Hocking, Denise C.
2015-01-01
Non-invasive, non-destructive technologies for imaging and quantitatively monitoring the development of artificial tissues are critical for the advancement of tissue engineering. Current standard techniques for evaluating engineered tissues, including histology, biochemical assays and mechanical testing, are destructive approaches. Ultrasound is emerging as a valuable tool for imaging and quantitatively monitoring the properties of engineered tissues and biomaterials longitudinally during fabrication and post-implantation. Ultrasound techniques are rapid, non-invasive, non-destructive and can be easily integrated into sterile environments necessary for tissue engineering. Furthermore, high-frequency quantitative ultrasound techniques can enable volumetric characterization of the structural, biological, and mechanical properties of engineered tissues during fabrication and post-implantation. This review provides an overview of ultrasound imaging, quantitative ultrasound techniques, and elastography, with representative examples of applications of these ultrasound-based techniques to the field of tissue engineering.
Jiang, Shun-Yuan; Sun, Hong-Bing; Sun, Hui; Ma, Yu-Ying; Chen, Hong-Yu; Zhu, Wen-Tao; Zhou, Yi
2016-03-01
This paper aims to explore a comprehensive assessment method that combines traditional Chinese medicinal material specifications with quantitative quality indicators. Seventy-six samples of Notopterygii Rhizoma et Radix were collected on the market and at producing areas. Traditional commercial specifications were described and assigned, and 10 chemical components and volatile oils were determined for each sample. Cluster analysis, Fisher discriminant analysis and correspondence analysis were used to establish the relationship between the traditional qualitative commercial specifications and the quantitative chemical indices, for comprehensively evaluating the quality of the medicinal material and quantitatively classifying commercial grade and quality grade. A herb quality index (HQI) combining traditional commercial specifications and chemical components was established for quantitative grade classification, and the corresponding discriminant functions were derived for precise determination of the quality grade and sub-grade of Notopterygii Rhizoma et Radix. The results showed that notopterol, isoimperatorin and volatile oil were the major components for determination of chemical quality, and their dividing values were specified for every grade and sub-grade of the commercial materials of Notopterygii Rhizoma et Radix. According to these results, the essential relationship between traditional medicinal indicators, qualitative commercial specifications, and quantitative chemical composition indicators can be examined by K-means clustering, Fisher discriminant analysis and correspondence analysis, which provides a new method for comprehensive quantitative evaluation of traditional Chinese medicine quality, integrating traditional commodity specifications and modern quantitative chemical indices. Copyright© by the Chinese Pharmaceutical Association.
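A minimal sketch of the statistical workflow (clustering on chemical indices, then discriminant functions for grade assignment), using scikit-learn on a synthetic stand-in for the measured components such as notopterol, isoimperatorin and volatile oil:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Synthetic feature matrix: 75 samples x 3 chemical indices, drawn from
# three hypothetical quality levels.
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(loc, 0.3, size=(25, 3))
               for loc in ([1.0, 0.8, 2.0], [2.0, 1.5, 3.0], [3.2, 2.4, 4.1])])

grades = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# Discriminant functions for assigning new samples to a grade.
lda = LinearDiscriminantAnalysis().fit(X, grades)
new_sample = np.array([[2.1, 1.4, 3.2]])
print("predicted grade:", lda.predict(new_sample)[0])
```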
Metrics and the effective computational scientist: process, quality and communication.
Baldwin, Eric T
2012-09-01
Recent treatments of computational knowledge worker productivity have focused upon the value the discipline brings to drug discovery, using positive anecdotes. While this big-picture approach provides important validation of the contributions of these knowledge workers, the impact accounts do not provide the granular detail that can help individuals and teams perform better. I suggest balancing the impact focus with quantitative measures that can inform the development of scientists. Measuring the quality of work, analyzing and improving processes, and critically evaluating communication can provide immediate performance feedback. The introduction of quantitative measures can complement the longer-term reporting of impacts on drug discovery. These metric data can document effectiveness trends and can provide a stronger foundation for the impact dialogue. Copyright © 2012 Elsevier Ltd. All rights reserved.
Athena: Providing Insight into the History of the Universe
NASA Technical Reports Server (NTRS)
Murphy, Gloria A.
2010-01-01
The American Institute of Aeronautics and Astronautics has provided a Request for Proposal which calls for a manned mission to a Near-Earth Object. It is the goal of Team COLBERT to respond to this request by providing a reusable system that can be implemented as a solid stepping stone for future manned trips to Mars and beyond. Although Team COLBERT consists only of Aerospace Engineering students, achieving this feat requires the team to employ Systems Engineering. Systems Engineering tools and processes provide quantitative and semi-quantitative means for making design decisions and for evaluating items such as budgets and schedules. This paper provides an in-depth look at some of the Systems Engineering processes employed and steps through the design process of a Human Asteroid Exploration System.
ERIC Educational Resources Information Center
Davids, Mogamat Razeen; Chikte, Usuf M. E.; Halperin, Mitchell L.
2011-01-01
This article reports on the development and evaluation of a Web-based application that provides instruction and hands-on practice in managing electrolyte and acid-base disorders. Our teaching approach, which focuses on concepts rather than details, encourages quantitative analysis and a logical problem-solving approach. Identifying any dangers to…
ERIC Educational Resources Information Center
Grinias, James P.
2017-01-01
Online student-response systems provide instructors with an easy-to-use tool to instantly evaluate student comprehension. For comprehensive content review, turning this evaluation into a competitive game where students can compete against each other was found to be helpful and enjoyable for participating students. One specific online resource,…
ERIC Educational Resources Information Center
Bryans Bongey, Sarah
2012-01-01
This quantitative study involved 157 students in two sections of an undergraduate class in general biology, as well as one instructor who taught both sections of the course. It used resources from the Center for Applied Special Technologies (CAST) to evaluate the viability of a Learning Management System (LMS) to provide Universal Design for…
Agent-based modeling as a tool for program design and evaluation.
Lawlor, Jennifer A; McGirr, Sara
2017-12-01
Recently, systems thinking and systems science approaches have gained popularity in the field of evaluation; however, there has been relatively little exploration of how evaluators could use quantitative tools to assist in the implementation of systems approaches therein. The purpose of this paper is to explore potential uses of one such quantitative tool, agent-based modeling, in evaluation practice. To this end, we define agent-based modeling and offer potential uses for it in typical evaluation activities, including: engaging stakeholders, selecting an intervention, modeling program theory, setting performance targets, and interpreting evaluation results. We provide demonstrative examples from published agent-based modeling efforts both inside and outside the field of evaluation for each of the evaluative activities discussed. We further describe potential pitfalls of this tool and offer cautions for evaluators who may choose to implement it in their practice. Finally, the article concludes with a discussion of the future of agent-based modeling in evaluation practice and a call for more formal exploration of this tool as well as other approaches to simulation modeling in the field. Copyright © 2017 Elsevier Ltd. All rights reserved.
Assessing the Expected Impact of Global Health Treaties: Evidence From 90 Quantitative Evaluations
Røttingen, John-Arne
2015-01-01
We assessed what impact can be expected from global health treaties on the basis of 90 quantitative evaluations of existing treaties on trade, finance, human rights, conflict, and the environment. It appears treaties consistently succeed in shaping economic matters and consistently fail in achieving social progress. There are at least 3 differences between these domains that point to design characteristics that new global health treaties can incorporate to achieve positive impact: (1) incentives for those with power to act on them; (2) institutions designed to bring edicts into effect; and (3) interests advocating their negotiation, adoption, ratification, and domestic implementation. Experimental and quasi-experimental evaluations of treaties would provide more information about what can be expected from this type of global intervention.
Sweat testing to evaluate autonomic function
Illigens, Ben M.W.; Gibbons, Christopher H.
2011-01-01
Sudomotor dysfunction is one of the earliest detectable neurophysiologic abnormalities in distal small fiber neuropathy. Traditional neurophysiologic measurements of sudomotor function include thermoregulatory sweat testing (TST), quantitative sudomotor axon reflex testing (QSART), silicone impressions, the sympathetic skin response (SSR), and the recent addition of quantitative direct and indirect axon reflex testing (QDIRT). These testing techniques, when used in combination, can detect and localize pre- and postganglionic lesions, provide early diagnosis of sudomotor dysfunction, and monitor disease progression or recovery. In this article, we review the common tests available for assessment of sudomotor function, detail the testing methodology, review the limitations and provide examples of test results.
Quantitative flow and velocity measurements of pulsatile blood flow with 4D-DSA
NASA Astrophysics Data System (ADS)
Shaughnessy, Gabe; Hoffman, Carson; Schafer, Sebastian; Mistretta, Charles A.; Strother, Charles M.
2017-03-01
Time-resolved 3D angiographic data from 4D-DSA provide a unique environment in which to explore physical properties of blood flow. Utilizing the pulsatility of the contrast waveform, the Fourier components can be used to track the waveform motion through vessels. Areas of strong pulsatility are identified through the FFT power spectrum. Using this method, 4D-DSA flow measurements agree within 7.6% and 6.8% RMSE with ICA PCVIPR and phantom flow probe validation measurements, respectively. The availability of velocity and flow information with fast acquisition could provide a more quantitative approach to treatment planning and evaluation in interventional radiology.
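A minimal sketch of the Fourier-based tracking idea: find the dominant pulsatile frequency in the FFT power spectrum, then convert the phase difference of contrast time-curves at two points along a vessel into a transit delay and velocity. The signals, cardiac frequency, and path length below are synthetic assumptions, not the paper's processing chain:

```python
import numpy as np

fs = 30.0                                   # frames per second
t = np.arange(0, 4, 1 / fs)
f_card = 1.2                                # cardiac frequency, Hz (assumed)
up = np.sin(2 * np.pi * f_card * t)                # upstream contrast curve
down = np.sin(2 * np.pi * f_card * (t - 0.05))     # downstream, 50 ms later

spec_up, spec_dn = np.fft.rfft(up), np.fft.rfft(down)
freqs = np.fft.rfftfreq(len(t), 1 / fs)
k = np.argmax(np.abs(spec_up[1:])) + 1      # strongest non-DC component

dphi = np.angle(spec_dn[k] / spec_up[k])    # phase lag at that frequency
delay = -dphi / (2 * np.pi * freqs[k])      # transit delay, seconds
distance_mm = 5.0                           # vessel path length (assumed)
print(f"pulsatile f={freqs[k]:.2f} Hz, delay={delay * 1000:.1f} ms, "
      f"velocity={distance_mm / delay / 1000:.3f} m/s")
```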
Models and techniques for evaluating the effectiveness of aircraft computing systems
NASA Technical Reports Server (NTRS)
Meyer, J. F.
1978-01-01
The development of system models that can provide a basis for the formulation and evaluation of aircraft computer system effectiveness, the formulation of quantitative measures of system effectiveness, and the development of analytic and simulation techniques for evaluating the effectiveness of a proposed or existing aircraft computer are described. Specific topics covered include: system models; performability evaluation; capability and functional dependence; computation of trajectory set probabilities; and hierarchical modeling of an air transport mission.
Linking agent-based models and stochastic models of financial markets
Feng, Ling; Li, Baowen; Podobnik, Boris; Preis, Tobias; Stanley, H. Eugene
2012-01-01
It is well-known that financial asset returns exhibit fat-tailed distributions and long-term memory. These empirical features are the main objectives of modeling efforts using (i) stochastic processes to quantitatively reproduce these features and (ii) agent-based simulations to understand the underlying microscopic interactions. After reviewing selected empirical and theoretical evidence documenting the behavior of traders, we construct an agent-based model to quantitatively demonstrate that “fat” tails in return distributions arise when traders share similar technical trading strategies and decisions. Extending our behavioral model to a stochastic model, we derive and explain a set of quantitative scaling relations of long-term memory from the empirical behavior of individual market participants. Our analysis provides a behavioral interpretation of the long-term memory of absolute and squared price returns: They are directly linked to the way investors evaluate their investments by applying technical strategies at different investment horizons, and this quantitative relationship is in agreement with empirical findings. Our approach provides a possible behavioral explanation for stochastic models for financial systems in general and provides a method to parameterize such models from market data rather than from statistical fitting.
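A generic toy version of the mechanism described (not the authors' exact model): trend-following agents with heterogeneous look-back horizons generate aggregate demand, and the resulting return series can be checked for fat tails via excess kurtosis:

```python
import numpy as np

rng = np.random.default_rng(3)
n_agents, n_steps = 500, 3000
horizons = rng.integers(1, 50, size=n_agents)   # look-back windows (assumed)

prices = [100.0]
returns = []
for _ in range(n_steps):
    p = prices
    # Each agent buys (+1) or sells (-1) with the recent trend, once
    # enough history exists for its horizon.
    signals = np.array([np.sign(p[-1] - p[-1 - h]) if len(p) > h else 0.0
                        for h in horizons])
    noise = rng.choice([-1.0, 0.0, 1.0], size=n_agents, p=[0.1, 0.8, 0.1])
    r = 2e-4 * (signals.sum() + noise.sum()) + 1e-3 * rng.standard_normal()
    returns.append(r)
    prices.append(prices[-1] * np.exp(r))

r = np.asarray(returns)
excess_kurtosis = np.mean((r - r.mean()) ** 4) / np.var(r) ** 2 - 3.0
print("excess kurtosis:", excess_kurtosis)      # > 0 suggests fat tails
```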
Quantitative metrics for evaluating the phased roll-out of clinical information systems.
Wong, David; Wu, Nicolas; Watkinson, Peter
2017-09-01
We introduce a novel quantitative approach for evaluating the order of roll-out during the phased introduction of clinical information systems. Such roll-outs are associated with unavoidable risk due to patients transferring between clinical areas using both the old and new systems. We proposed a simple graphical model of patient flow through a hospital. Using a simple instance of the model, we showed how a roll-out order can be generated by minimising the flow of patients from the new system to the old system. The model was applied to admission and discharge data acquired from 37,080 patient journeys at the Churchill Hospital, Oxford between April 2013 and April 2014. The resulting orders were evaluated empirically and found to be acceptable. The development of data-driven approaches to clinical information system roll-out provides insights that may not necessarily be ascertained through clinical judgment alone. Such methods could make a significant contribution to the smooth running of an organisation during the roll-out of a potentially disruptive technology. Unlike previous approaches, which are based on clinical opinion, the approach described here quantitatively assesses the appropriateness of competing roll-out strategies. The data-driven approach was shown to produce strategies that matched clinical intuition and provides a flexible framework that may be used to plan and monitor clinical information system roll-out. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
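A minimal sketch of the minimisation idea, assuming a small hypothetical ward-to-ward flow matrix and an exhaustive search over roll-out orders; the paper's graphical model and hospital data are not reproduced here, and the phase-by-phase cost is a simplifying assumption:

```python
from itertools import permutations

# Hypothetical flows: flow[i][j] = patients moving from ward i to ward j.
wards = ["ED", "AMU", "Cardio", "Geriatrics"]
flow = [
    [0, 30, 10, 5],
    [2, 0, 12, 20],
    [1, 3, 0, 8],
    [0, 1, 2, 0],
]

def cost(order):
    """Total new-to-old transfers accumulated over the phased roll-out."""
    total = 0
    migrated = set()
    for w in order:
        migrated.add(w)
        not_yet = [j for j in range(len(wards)) if j not in migrated]
        total += sum(flow[i][j] for i in migrated for j in not_yet)
    return total

best = min(permutations(range(len(wards))), key=cost)
print("best order:", [wards[i] for i in best], "cost:", cost(best))
```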
Wojczyńska, A; Leiggener, C S; Bredell, M; Ettlin, D A; Erni, S; Gallo, L M; Colombo, V
2016-10-01
The aim of this study was to qualitatively and quantitatively describe the biomechanics of existing total alloplastic reconstructions of temporomandibular joints (TMJ). Fifteen patients with unilateral or bilateral TMJ total joint replacements and 15 healthy controls were evaluated via dynamic stereometry technology. This non-invasive method combines three-dimensional imaging of the subject's anatomy with jaw tracking. It provides an insight into the patient's jaw joint movements in real time and provides a quantitative evaluation. The patients were also evaluated clinically for jaw opening, protrusive and laterotrusive movements, pain, interference with eating, and satisfaction with the joint replacements. The qualitative assessment revealed that condyles of bilateral total joint replacements displayed similar basic motion patterns to those of unilateral prostheses. Quantitatively, mandibular movements of artificial joints during opening, protrusion, and laterotrusion were all significantly shorter than those of controls. A significantly restricted mandibular range of motion in replaced joints was also observed clinically. Fifty-three percent of patients suffered from chronic pain at rest and 67% reported reduced chewing function. Nonetheless, patients declared a high level of satisfaction with the replacement. This study shows that in order to gain a comprehensive understanding of complex therapeutic measures, a multidisciplinary approach is needed. Copyright © 2016 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walker, Bennett N., E-mail: bennett.walker@fda.hhs.gov; Office of Device Evaluation, Center for Devices and Radiological Health, U.S. Food and Drug Administration, Silver Spring, Maryland 20993; James, Robert H.
Glare, glistenings, optical defects, dysphotopsia, and poor image quality are a few of the known deficiencies of intraocular lenses (IOLs). All of these optical phenomena are related to light scatter. However, the specific direction that light scatters makes a critical difference between debilitating glare and a slightly noticeable decrease in image quality. Consequently, quantifying the magnitude and direction of scattered light is essential to appropriately evaluate the safety and efficacy of IOLs. In this study, we introduce a full-angle scanning light scattering profiler (SLSP) as a novel approach capable of quantitatively evaluating the light scattering from IOLs with a nearly 360° view. The SLSP method can simulate in situ conditions by controlling the parameters of the light source including angle of incidence. This testing strategy will provide a more effective nonclinical approach for the evaluation of IOL light scatter.
Vidueira, Pablo; Díaz-Puente, José M; Rivera, María
2014-08-01
Ex ante impact assessment has become a fundamental tool for effective program management, and thus a compulsory task when establishing a new program in the European Union (EU). This article analyzes the benefits of ex ante impact assessment, the methodologies followed, and the difficulties encountered, through a case study of the rural development programs (RDPs) in the EU. Results regarding methodologies are then contrasted with the international context in order to provide solid insights to evaluators and program managing authorities facing ex ante impact assessment. All European RDPs from the period 2007 through 2013 (a total of 88) and their corresponding available ex ante evaluations (a total of 70) were analyzed, focusing on the socioeconomic impact assessment. Only 46.6% of the regions provide quantified estimations of socioeconomic impacts, despite this being a compulsory task demanded by the European Commission (EC). Methods recommended by the EC are mostly used, but there is a lack of mixed-method approaches, since qualitative methods are used in substitution of quantitative ones. The two main difficulties reported were the complexity of program impacts and the lack of needed program information. Qualitative approaches on their own were found to be unsuitable for ex ante impact assessment, while quantitative approaches, such as microsimulation models, provide a good approximation to actual impacts. However, time and budgetary constraints mean that quantitative and mixed methods should be applied mainly to the impacts most relevant to program success. © The Author(s) 2014.
Shi, Yin; Zong, Min; Xu, Xiaoquan; Zou, Yuefen; Feng, Yang; Liu, Wei; Wang, Chuanbing; Wang, Dehang
2015-04-01
To quantitatively evaluate nerve roots by measuring fractional anisotropy (FA) values in healthy volunteers and sciatica patients, visualize nerve roots by tractography, and compare the diagnostic efficacy between conventional magnetic resonance imaging (MRI) and DTI. Seventy-five sciatica patients and thirty-six healthy volunteers underwent MR imaging using DTI. FA values for L5-S1 lumbar nerve roots were calculated at three levels from DTI images. Tractography was performed on L3-S1 nerve roots. ROC analysis was performed for FA values. The lumbar nerve roots were visualized and FA values were calculated in all subjects. FA values decreased in compressed nerve roots and declined from proximal to distal along the compressed nerve tracts. Mean FA values were more sensitive and specific than MR imaging for differentiating compressed nerve roots, especially in the far lateral zone at distal nerves. DTI can quantitatively evaluate compressed nerve roots, and DTT enables visualization of abnormal nerve tracts, providing vivid anatomic information and localization of probable nerve compression. DTI has great potential utility for evaluating lumbar nerve compression in sciatica. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
How to quantitatively evaluate safety of driver behavior upon accident? A biomechanical methodology.
Zhang, Wen; Cao, Jieer; Xu, Jun
2017-01-01
How to evaluate driver spontaneous reactions in various collision patterns in a quantitative way is one of the most important topics in vehicle safety. Firstly, this paper constructs representative numerical crash scenarios described by impact velocity, impact angle and contact position based on finite element (FE) computation platform. Secondly, a driver cabin model is extracted and described in the well validated multi-rigid body (MB) model to compute the value of weighted injury criterion to quantitatively assess drivers' overall injury under certain circumstances. Furthermore, based on the coupling of FE and MB, parametric studies on various crash scenarios are conducted. It is revealed that the WIC (Weighted Injury Criteria) value variation law under high impact velocities is quite distinct comparing with the one in low impact velocities. In addition, the coupling effect can be elucidated by the fact that the difference of WIC value among three impact velocities under smaller impact angles tends to be distinctly higher than that under larger impact angles. Meanwhile, high impact velocity also increases the sensitivity of WIC under different collision positions and impact angles. Results may provide a new methodology to quantitatively evaluate driving behaviors and serve as a significant guiding step towards collision avoidance for autonomous driving vehicles.
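The abstract does not spell out the WIC definition used in the study; purely as an illustration of a weighted injury criterion, here is a sketch combining normalized occupant injury measures into one scalar. The weights are hypothetical, and the limits are FMVSS 208-style reference values used only for normalization:

```python
# Hypothetical occupant injury measures from one simulated crash case.
measures = {"HIC15": 420.0, "chest_deflection_mm": 28.0, "femur_load_kN": 5.2}
limits = {"HIC15": 700.0, "chest_deflection_mm": 63.0, "femur_load_kN": 10.0}
weights = {"HIC15": 0.5, "chest_deflection_mm": 0.35, "femur_load_kN": 0.15}

# Weighted sum of measures normalized by their regulatory limits.
wic = sum(weights[k] * measures[k] / limits[k] for k in measures)
print(f"weighted injury criterion: {wic:.3f}")   # higher = worse overall injury
```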
Safety evaluation methodology for advanced coal extraction systems
NASA Technical Reports Server (NTRS)
Zimmerman, W. F.
1981-01-01
Qualitative and quantitative evaluation methods for coal extraction systems were developed. The analysis examines the soundness of the design, whether or not the major hazards have been eliminated or reduced, and how the reduction would be accomplished. The quantitative methodology establishes the approximate impact of hazards on injury levels. The results are weighted by peculiar geological elements, specialized safety training, peculiar mine environmental aspects, and reductions in labor force. The outcome is compared with injury level requirements based on similar, safer industries to get a measure of the new system's success in reducing injuries. This approach provides a more detailed and comprehensive analysis of hazards and their effects than existing safety analyses.
78 FR 52132 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-22
...: National Institute of Standards and Technology (NIST) Title: NIST MEP Advanced Manufacturing Jobs and... to provide Congress with quantitative information required for Government-supported programs. The... Recipient Evaluation. Analysis and Research. Reports to Stakeholders. Continuous Improvement. Knowledge...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-25
... ask the client to evaluate the U.S. Commercial Service on its customer service provision. Results from... will enrich the quantitative survey data by providing insights and a descriptive context to explain the...
Capital Budgeting Guidelines: How to Decide Whether to Fund a New Dorm or an Upgraded Computer Lab.
ERIC Educational Resources Information Center
Swiger, John; Klaus, Allen
1996-01-01
A process for college and university decision making and budgeting for capital outlays that focuses on evaluating the qualitative and quantitative benefits of each proposed project is described and illustrated. The process provides a means to solicit suggestions from those involved and provide detailed information for cost-benefit analysis. (MSE)
ERIC Educational Resources Information Center
Leger, Lawrence A.; Glass, Karligash; Katsiampa, Paraskevi; Liu, Shibo; Sirichand, Kavita
2017-01-01
We evaluate feedback methods for oral presentations used in training non-quantitative research skills (literature review and various associated tasks). Training is provided through a credit-bearing module taught to MSc students of banking, economics and finance in the UK. Monitoring oral presentations and providing "best practice"…
Cole, Kenneth D; Waite, Martha S; Nichols, Linda O
2003-01-01
For a nationwide Geriatric Interdisciplinary Team Training (GITT) program evaluation of 8 sites and 26 teams, team evaluators developed a quantitative and qualitative team observation scale (TOS), examining structure, process, and outcome, with specific focus on the training function. Qualitative data provided an important expansion of quantitative data, highlighting positive effects that were not statistically significant, such as role modeling and training occurring within the clinical team. Qualitative data could also identify "too much" of a coded variable, such as time spent in individual team members' assessments and treatment plans. As healthcare organizations have increasing demands for productivity and changing reimbursement, traditional models of teamwork, with large teams and structured meetings, may no longer be as functional as they once were. To meet these constraints and to train students in teamwork, teams of the future will have to make choices, from developing and setting specific models to increasing the use of information technology to create virtual teams. Both quantitative and qualitative data will be needed to evaluate these new types of teams and the important outcomes they produce.
Shrivastava, Sajal; Sohn, Il-Yung; Son, Young-Min; Lee, Won-Il; Lee, Nae-Eung
2015-12-14
Although real-time label-free fluorescent aptasensors based on nanomaterials are increasingly recognized as a useful strategy for the detection of target biomolecules with high fidelity, the lack of an imaging-based quantitative measurement platform limits their implementation with biological samples. Here we introduce an ensemble strategy for a real-time label-free fluorescent graphene (Gr) aptasensor platform. This platform employs aptamer length-dependent tunability, thus enabling the reagentless quantitative detection of biomolecules through computational processing coupled with real-time fluorescence imaging data. We demonstrate that this strategy effectively delivers dose-dependent quantitative readouts of adenosine triphosphate (ATP) concentration on chemical vapor deposited (CVD) Gr and reduced graphene oxide (rGO) surfaces, thereby providing cytotoxicity assessment. Compared with conventional fluorescence spectrometry methods, our highly efficient, universally applicable, and rational approach will facilitate broader implementation of imaging-based biosensing platforms for the quantitative evaluation of a range of target molecules.
Liu, Hueiming; Lindley, Richard; Alim, Mohammed; Felix, Cynthia; Gandhi, Dorcas B C; Verma, Shweta J; Tugnawat, Deepak Kumar; Syrigapu, Anuradha; Ramamurthy, Ramaprabhu Krishnappa; Pandian, Jeyaraj D; Walker, Marion; Forster, Anne; Anderson, Craig S; Langhorne, Peter; Murthy, Gudlavalleti Venkata Satyanarayana; Shamanna, Bindiganavale Ramaswamy; Hackett, Maree L; Maulik, Pallab K; Harvey, Lisa A; Jan, Stephen
2016-09-15
We are undertaking a randomised controlled trial (fAmily led rehabiliTaTion aftEr stroke in INDia, ATTEND) evaluating the training of a family carer to enable maximal rehabilitation of patients with stroke-related disability, as a potentially affordable, culturally acceptable and effective intervention for use in India. A process evaluation is needed to understand how and why this complex intervention may be effective, and to capture important barriers and facilitators to its implementation. We describe the protocol for our process evaluation to encourage the development of in-process evaluation methodology and transparency in reporting. The realist and RE-AIM (Reach, Effectiveness, Adoption, Implementation and Maintenance) frameworks informed the design. Mixed methods include semistructured interviews with health providers, patients and their carers, analysis of quantitative process data describing the fidelity and dose of the intervention, observations of trial set-up and implementation, and analysis of cost data from the perspective of patients and their families and of programme budgets. These qualitative and quantitative data will be analysed iteratively prior to knowing the quantitative outcomes of the trial, and then triangulated with the results from the primary outcome evaluation. The process evaluation has received ethical approval for all sites in India. In low-income and middle-income countries, the available human capital can form an approach to reducing the evidence-practice gap, compared with the high-cost alternatives available in established market economies. This process evaluation will provide insights into how such a programme can be implemented in practice and brought to scale. Through local stakeholder engagement and dissemination of findings globally, we hope to build on patient-centred, cost-effective and sustainable models of stroke rehabilitation. Trial registration: CTRI/2013/04/003557. Published by the BMJ Publishing Group Limited.
Title I ESEA, High School; English as a Second Language: 1979-1980. OEE Evaluation Report.
ERIC Educational Resources Information Center
New York City Board of Education, Brooklyn, NY. Office of Educational Evaluation.
The report is an evaluation of the 1979-80 High School Title I English as a Second Language Program. Two types of information are presented: (1) a narrative description of the program which provides qualitative data regarding the program, and (2) a statistical analysis of test results which consists of quantitative, city-wide data. By integrating…
Hey girlfriend: an evaluation of AIDS prevention among women in the sex industry.
Dorfman, L E; Derish, P A; Cohen, J B
1992-01-01
Increasingly, acquired immunodeficiency syndrome (AIDS) prevention programs have been developed to reach and influence street-based populations. Standard methods of evaluation do not fit the conditions of such programs. This article describes a process and outcome evaluation of an AIDS prevention program for sex workers in which qualitative and quantitative methods were combined in order to mediate research problems endemic to street-based populations. Methods included epidemiological questionnaires, open-ended interviews with participants, and ethnographic field notes. Process evaluation findings show that field staff who were indigenous to the neighborhood and population readily gained access to the community of sex workers and simultaneously became role models for positive behavior change. Outcome findings show that sex workers do feel at risk for AIDS, but usually from clients rather than from husbands or boyfriends. Accordingly, they use condoms more frequently with clients than with steady partners. Increasing condom use among sex workers with their steady partners remains an important challenge for AIDS prevention. Combining qualitative and quantitative research data provided a more comprehensive assessment of how to reach sex workers with effective AIDS risk reduction messages than either method could have provided alone.
Jha, Abhinav K; Caffo, Brian; Frey, Eric C
2016-04-07
The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended upon the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest. Results showed that the proposed technique provided accurate ranking of the reconstruction methods for 97.5% of the 50 noise realizations. Further, the technique was robust to the choice of evaluated reconstruction methods. The simulation study pointed to possible violations of the assumptions made in the NGS technique under clinical scenarios. However, numerical experiments indicated that the NGS technique was robust in ranking methods even when there was some degree of such violation.
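The noise-to-slope ratio lends itself to a compact illustration. The sketch below assumes the NGS fit has already produced a slope and a noise standard deviation per method; the method names and numbers are hypothetical:

```python
def noise_to_slope_ratio(slope, noise_sd):
    # Figure of merit for ranking methods by precision: lower NSR is better
    return noise_sd / slope

# Hypothetical (slope, noise SD) pairs estimated by a no-gold-standard fit
methods = {"FBP": (0.85, 0.19), "OSEM": (0.92, 0.11), "OSEM+PSF": (0.97, 0.08)}
ranking = sorted(methods, key=lambda m: noise_to_slope_ratio(*methods[m]))
print(ranking)  # most precise method first
```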
Shilling, Val; Morris, Christopher; Thompson-Coon, Jo; Ukoumunne, Obioha; Rogers, Morwenna; Logan, Stuart
2013-07-01
To review the qualitative and quantitative evidence of the benefits of peer support for parents of children with disabling conditions in the context of health, well-being, impact on family, and economic and service implications. We comprehensively searched multiple databases. Eligible studies evaluated parent-to-parent support and reported on the psychological health and experience of giving or receiving support. There were no limits on the child's condition, study design, language, date, or setting. We sought to aggregate quantitative data; findings of qualitative studies were combined using thematic analysis. Qualitative and quantitative data were brought together in a narrative synthesis. Seventeen papers were included: nine qualitative studies, seven quantitative studies, and one mixed-methods evaluation. Four themes were identified from qualitative studies: (1) shared social identity, (2) learning from the experiences of others, (3) personal growth, and (4) supporting others. Some quantitative studies reported a positive effect of peer support on psychological health and other outcomes; however, this was not consistently confirmed. It was not possible to aggregate data across studies. No costing data were identified. Qualitative studies strongly suggest that parents perceive benefit from peer support programmes, an effect seen across different types of support and conditions. However, quantitative studies provide inconsistent evidence of positive effects. Further research should explore whether this dissonance is substantive or an artefact of how outcomes have been measured. © The Authors. Developmental Medicine & Child Neurology © 2013 Mac Keith Press.
Mamou, Jonathan; Wa, Christianne A; Yee, Kenneth M P; Silverman, Ronald H; Ketterling, Jeffrey A; Sadun, Alfredo A; Sebag, J
2015-01-22
Clinical evaluation of floaters lacks quantitative assessment of vitreous structure. This study used quantitative ultrasound (QUS) to measure vitreous opacities. Since floaters reduce contrast sensitivity (CS) and quality of life (Visual Function Questionnaire [VFQ]), it is hypothesized that QUS will correlate with CS and VFQ in patients with floaters. Twenty-two eyes (22 subjects; age = 57 ± 19 years) with floaters were evaluated with Freiburg acuity contrast testing (FrACT; %Weber) and VFQ. Ultrasonography used a customized probe (15-MHz center frequency, 20-mm focal length, 7-mm aperture) with longitudinal and transverse scans taken in primary gaze and a horizontal longitudinal scan through premacular vitreous in temporal gaze. Each scan set had 100 frames of log-compressed envelope data. Within each frame, two regions of interest (ROIs) were analyzed (whole-central and posterior vitreous) to yield three parameters (energy, E; mean amplitude, M; and percentage of vitreous filled by echodensities, P50) averaged over the entire 100-frame dataset. Statistical analyses evaluated E, M, and P50 correlations with CS and VFQ. Contrast sensitivity ranged from 1.19%W (normal) to 5.59%W. All QUS parameters in two scan positions within the whole-central ROI correlated with CS (R > 0.67, P < 0.001). P50 in the nasal longitudinal position had R = 0.867 (P < 0.001). Correlations with VFQ ranged from R = 0.52 (P < 0.013) to R = 0.65 (P < 0.001). Quantitative ultrasound provides quantitative measures of vitreous echodensity that correlate with CS and VFQ, providing objective assessment of vitreous structure underlying the functional disturbances induced by floaters, useful to quantify vitreous disease severity and the response to therapy. Copyright 2015 The Association for Research in Vision and Ophthalmology, Inc.
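As a rough sketch of how such per-frame parameters might be computed from log-compressed envelope data, consider the following; the exact definitions, noise floor, and scaling used in the study are not reproduced here, so treat them as assumptions:

```python
import numpy as np

def qus_parameters(frames, roi_mask, floor_db=-50.0):
    # frames: (n_frames, h, w) log-compressed envelope data in dB
    # roi_mask: boolean (h, w) mask (whole-central or posterior vitreous ROI)
    roi = frames[:, roi_mask]                          # (n_frames, n_pixels)
    E = np.mean(np.sum(10.0 ** (roi / 10.0), axis=1))  # energy, linear scale
    M = np.mean(roi)                                   # mean amplitude, dB
    P50 = 100.0 * np.mean(roi > floor_db)              # % pixels above floor
    return E, M, P50

# Hypothetical 100-frame scan set and a rectangular ROI
frames = np.random.default_rng(0).uniform(-60.0, 0.0, (100, 128, 128))
mask = np.zeros((128, 128), dtype=bool)
mask[30:90, 40:100] = True
print(qus_parameters(frames, mask))
```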
Sadeghi, N.; Namjoshi, D.; Irfanoglu, M. O.; Wellington, C.; Diaz-Arrastia, R.
2017-01-01
Diffuse axonal injury (DAI) is a hallmark of traumatic brain injury (TBI) pathology. Recently, the Closed Head Injury Model of Engineered Rotational Acceleration (CHIMERA) was developed to generate an experimental model of DAI in a mouse. The characterization of DAI using diffusion tensor magnetic resonance imaging (MRI; diffusion tensor imaging, DTI) may provide a useful set of outcome measures for preclinical and clinical studies. The objective of this study was to identify the complex neurobiological underpinnings of DTI features following DAI using a comprehensive and quantitative evaluation of DTI and histopathology in the CHIMERA mouse model. A consistent neuroanatomical pattern of pathology in specific white matter tracts was identified across ex vivo DTI maps and photomicrographs of histology. These observations were confirmed by voxelwise and regional analysis of DTI maps, demonstrating reduced fractional anisotropy (FA) in distinct regions such as the optic tract. Similar regions were identified by quantitative histology and exhibited axonal damage as well as robust gliosis. Additional analysis using a machine-learning algorithm was performed to identify regions and metrics important for injury classification in a manner free from potential user bias. This analysis found that diffusion metrics were able to identify injured brains almost with the same degree of accuracy as the histology metrics. Good agreement between regions detected as abnormal by histology and MRI was also found. The findings of this work elucidate the complexity of cellular changes that give rise to imaging abnormalities and provide a comprehensive and quantitative evaluation of the relative importance of DTI and histological measures to detect brain injury. PMID:28966972
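The classification step can be illustrated schematically: given regional diffusion metrics per animal, ask how well injured brains are separated from shams under cross-validation. The data below are synthetic, and the random-forest choice is an assumption rather than the study's algorithm:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic regional FA values (e.g., optic tract, corpus callosum, brachium)
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0.45, 0.04, (12, 3)),    # injured: reduced FA
               rng.normal(0.52, 0.04, (12, 3))])   # sham
y = np.array([1] * 12 + [0] * 12)

scores = cross_val_score(RandomForestClassifier(n_estimators=200), X, y, cv=4)
print(scores.mean())  # accuracy achievable from diffusion metrics alone
```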
The role of quantitative safety evaluation in regulatory decision making of drugs.
Chakravarty, Aloka G; Izem, Rima; Keeton, Stephine; Kim, Clara Y; Levenson, Mark S; Soukup, Mat
2016-01-01
Evaluation of safety is a critical component of drug review at the US Food and Drug Administration (FDA). Statisticians are playing an increasingly visible role in quantitative safety evaluation and regulatory decision-making. This article reviews the history and the recent events relating to quantitative drug safety evaluation at the FDA. The article then focuses on five active areas of quantitative drug safety evaluation and the role Division of Biometrics VII (DBVII) plays in these areas, namely meta-analysis for safety evaluation, large safety outcome trials, post-marketing requirements (PMRs), the Sentinel Initiative, and the evaluation of risk from extended/long-acting opioids. This article will focus chiefly on developments related to quantitative drug safety evaluation and not on the many additional developments in drug safety in general.
A Stereological Method for the Quantitative Evaluation of Cartilage Repair Tissue.
Foldager, Casper Bindzus; Nyengaard, Jens Randel; Lind, Martin; Spector, Myron
2015-04-01
To implement stereological principles to develop an easily applicable algorithm for unbiased and quantitative evaluation of cartilage repair. Design-unbiased sampling was performed by systematically sectioning the defect perpendicular to the joint surface in parallel planes providing 7 to 10 hematoxylin-eosin-stained histological sections. Counting windows were systematically selected and converted into image files (40-50 per defect). The quantification was performed by two-step point counting: (1) calculation of defect volume and (2) quantitative analysis of tissue composition. Step 2 was performed by assigning each point to one of the following categories based on validated and easily distinguishable morphological characteristics: (1) hyaline cartilage (rounded cells in lacunae in hyaline matrix), (2) fibrocartilage (rounded cells in lacunae in fibrous matrix), (3) fibrous tissue (elongated cells in fibrous tissue), (4) bone, (5) scaffold material, and (6) others. The ability to discriminate between the tissue types was determined using conventional or polarized light microscopy, and the interobserver variability was evaluated. We describe the application of the stereological method. In the example, we assessed the defect repair tissue volume to be 4.4 mm³ (CE = 0.01). The tissue fractions were subsequently evaluated. Polarized light illumination of the slides improved discrimination between hyaline cartilage and fibrocartilage and increased the interobserver agreement compared with conventional transmitted light. We have applied a design-unbiased method for quantitative evaluation of cartilage repair, and we propose this algorithm as a natural supplement to existing descriptive semiquantitative scoring systems. We also propose that polarized light is effective for discrimination between hyaline cartilage and fibrocartilage.
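The two point-counting steps map directly onto the Cavalieri volume estimator and simple category fractions. A minimal sketch, with section spacing, point-grid area, and counts all hypothetical:

```python
from collections import Counter

def cavalieri_volume(points_per_section, section_spacing_mm, area_per_point_mm2):
    # Unbiased volume estimate: V = t * a(p) * sum(P) (Cavalieri principle)
    return section_spacing_mm * area_per_point_mm2 * sum(points_per_section)

def tissue_fractions(point_labels):
    # Step 2: fraction of repair tissue assigned to each morphological category
    counts = Counter(point_labels)
    total = sum(counts.values())
    return {tissue: n / total for tissue, n in counts.items()}

print(cavalieri_volume([21, 34, 40, 38, 29, 18, 9], 0.7, 0.04))
print(tissue_fractions(["hyaline"] * 55 + ["fibrocartilage"] * 30 + ["fibrous"] * 15))
```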
Intrinsic and Extrinsic Evaluation of Spatiotemporal Text Representations in Twitter Streams
DOE Office of Scientific and Technical Information (OSTI.GOV)
Phillips, Lawrence A.; Shaffer, Kyle J.; Arendt, Dustin L.
Language in social media is a dynamic system, constantly evolving and adapting, with words and concepts rapidly emerging, disappearing, and changing their meaning. These changes can be estimated using word representations in context, over time and across locations. A number of methods have been proposed to track these spatiotemporal changes but no general method exists to evaluate the quality of these representations. Previous work largely focused on qualitative evaluation, which we improve on by proposing a set of visualizations that highlight changes in text representation over both space and time. We demonstrate the usefulness of novel spatiotemporal representations to explore and characterize specific aspects of the corpus of tweets collected from European countries over a two-week period centered around the terrorist attacks in Brussels in March 2016. In addition, we quantitatively evaluate spatiotemporal representations by feeding them into a downstream classification task, event type prediction. Thus, our work is the first to provide both intrinsic (qualitative) and extrinsic (quantitative) evaluation of text representations for spatiotemporal trends.
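The extrinsic evaluation amounts to feeding the representations into a supervised classifier and scoring it. A schematic sketch with synthetic embeddings and labels (the paper's features, labels, and classifier are not reproduced):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic per-(location, time-window) text representations and event labels
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 64))     # spatiotemporal embedding vectors
y = rng.integers(0, 3, size=200)   # hypothetical event-type classes

clf = LogisticRegression(max_iter=1000)
print(cross_val_score(clf, X, y, cv=5).mean())  # downstream accuracy
```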
Quantitative proteomics in cardiovascular research: global and targeted strategies
Shen, Xiaomeng; Young, Rebeccah; Canty, John M.; Qu, Jun
2014-01-01
Extensive technical advances in the past decade have substantially expanded quantitative proteomics in cardiovascular research. This has great promise for elucidating the mechanisms of cardiovascular diseases (CVD) and the discovery of cardiac biomarkers used for diagnosis and treatment evaluation. Global and targeted proteomics are the two major avenues of quantitative proteomics. While global approaches enable unbiased discovery of altered proteins via relative quantification at the proteome level, targeted techniques provide higher sensitivity and accuracy, and are capable of multiplexed absolute quantification in numerous clinical/biological samples. While promising, technical challenges need to be overcome to enable full utilization of these techniques in cardiovascular medicine. Here we discuss recent advances in quantitative proteomics and summarize applications in cardiovascular research with an emphasis on biomarker discovery and elucidating molecular mechanisms of disease. We propose the integration of global and targeted strategies as a high-throughput pipeline for cardiovascular proteomics. Targeted approaches enable rapid, extensive validation of biomarker candidates discovered by global proteomics. These approaches provide a promising alternative to immunoassays and other low-throughput means currently used for limited validation. PMID:24920501
Manolov, Rumen; Losada, José L; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana
2016-01-01
Two-phase single-case designs, including baseline evaluation followed by an intervention, represent the most clinically straightforward option for combining professional practice and research. However, unless they are part of a multiple-baseline schedule, such designs do not allow demonstration of a causal relation between the intervention and the behavior. Although the statistical options reviewed here cannot help to overcome this methodological limitation, we aim to make practitioners and applied researchers aware of the available appropriate options for extracting maximum information from the data. In the current paper, we suggest that the evaluation of behavioral change should include visual and quantitative analyses, complementing the substantive criteria regarding the practical importance of the behavioral change. Specifically, we emphasize the need to use structured criteria for visual analysis, such as the ones summarized in the What Works Clearinghouse Standards, especially if such criteria are complemented by visual aids, as illustrated here. For quantitative analysis, we focus on the non-overlap of all pairs and the slope and level change procedure, as they offer straightforward information and have shown reasonable performance. An illustration is provided of the use of these three pieces of information: visual, quantitative, and substantive. To make the use of visual and quantitative analysis feasible, open source software is referred to and demonstrated. In order to provide practitioners and applied researchers with a more complete guide, several analytical alternatives are commented on, pointing out the situations (aims, data patterns) for which these are potentially useful.
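Of the two quantitative procedures named above, the non-overlap of all pairs is simple enough to sketch directly; ties are conventionally credited 0.5, and the session scores below are hypothetical:

```python
def nap(baseline, intervention):
    # Non-overlap of All Pairs: share of (A, B) pairs in which the
    # intervention value exceeds the baseline value; ties count 0.5
    pairs = [(a, b) for a in baseline for b in intervention]
    wins = sum(1.0 if b > a else 0.5 if b == a else 0.0 for a, b in pairs)
    return wins / len(pairs)

print(nap([3, 4, 4, 5], [6, 7, 5, 8, 7]))  # 0.975 for these made-up scores
```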
Hildon, Zoe; Allwood, Dominique; Black, Nick
2012-02-01
Displays comparing the performance of healthcare providers are largely based on common sense. Objective: to review the literature on the impact of compositional format and content of quantitative data displays on people's comprehension, choice and preference. Data sources: Ovid databases, expert recommendations and snowballing techniques. Study selection: evaluations of the impact of different formats (bar charts, tables and pictographs) and content (ordering, explanatory visual cues, etc.) of quantitative data displays meeting defined quality criteria. Data extraction: type of decision; decision-making domains; audiences; formats; content; methodology; findings. Results: most of the 30 studies used quantitative (n = 26) methods with patients or public groups (n = 28) rather than with professionals (n = 2). Bar charts were the most frequent format, followed by pictographs and tables. As regards format, tables and pictographs appeared better understood than bar charts despite the latter being preferred. Although accessible to less numerate and older populations, pictographs tended to lead to more risk avoidance. Tables appeared accessible to all. Aspects of content enhancing the impact of data displays included giving visual explanatory cues and contextual information while still attempting simplicity ('less is more'); ordering data; consistency. Icons rather than numbers were more user-friendly but could lead to over-estimation of risk. Uncertainty was not widely understood, nor well represented. Conclusions: though heterogeneous and limited in scope, there is sufficient research evidence to inform the presentation of quantitative data that compares the performance of healthcare providers. The impact of new formats, such as funnel plots, needs to be evaluated.
Fischbach-Boulanger, C; Fitsiori, A; Noblet, V; Baloglu, S; Oesterle, H; Draghici, S; Philippi, N; Duron, E; Hanon, O; Dietemann, J-L; Blanc, F; Kremer, S
2018-05-01
Magnetic resonance imaging is part of the diagnostic criteria for Alzheimer's disease (AD) through the evaluation of hippocampal atrophy. The objective of this study was to evaluate which sequence of T1-weighted (T1WI) and T2-weighted (T2WI) imaging allowed the best visual evaluation of hippocampal atrophy. Visual qualitative ratings of the hippocampus of 100 patients with mild cognitive impairment (MCI) and 50 patients with AD were made independently by four operators according to the medial temporal lobe atrophy score based either on T1WI or T2WI. These two evaluations were compared in terms of interobserver reproducibility, concordance with a quantitative volumetric measure, discrimination power between AD and MCI groups, and correlation with several neuropsychological tests. The medial temporal lobe atrophy score evaluated on either T1WI or T2WI exhibited similar interobserver variability and concordance with the quantitative volumetric evaluation. However, the visual evaluation on T2WI seemed to provide better discrimination power between AD and MCI groups for both the left (T1WI, P = 0.0001; T2WI, P = 7.072 × 10⁻⁵) and right (T1WI, P = 0.008; T2WI, P = 0.001) hippocampus, and a higher overall correlation with neuropsychological tests. The present study suggests that T2WI provides a more adequate visual rating of hippocampal atrophy. © 2018 EAN.
Use of Nucleic Acid-Based Tools for Monitoring Biostimulation and Bioaugmentation
2011-01-01
dechlorination is a promising process for biodegradation of chlorinated solvents. The successful field evaluation and implementation of the reductive...These specialized bacteria use the chlorinated ethenes as electron acceptors and gain energy for growth from the reductive dechlorination reactions...protocol addresses the use of MBTs to quantitatively assess the Dhc population at chlorinated ethene sites and aims at providing guidance to evaluate
Assessment of Scheduling and Plan Execution of Apollo 14 Lunar Surface Operations
NASA Technical Reports Server (NTRS)
Marquez, Jessica J.
2010-01-01
Although over forty years have passed since the first landing on the Moon, there is not yet a comprehensive, quantitative assessment of Apollo extravehicular activities (EVAs). Quantitatively evaluating lunar EVAs will provide a better understanding of the challenges involved with surface operations. This first evaluation of a surface EVA centers on comparing the planned and as-run timelines, specifically collecting data on discrepancies between durations that were estimated versus executed. Differences were summarized by task category in order to gain insight into the types of surface operation activities that were most challenging. One Apollo 14 EVA was assessed using the described methodology. The selected metrics and task categorizations were effective, and limitations of the process were identified.
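The core of such an assessment is an aggregation of planned-versus-executed durations by task category. A toy sketch; the categories and minutes are invented, not Apollo 14 data:

```python
from collections import defaultdict

# Hypothetical timeline rows: (task category, planned min, as-run min)
timeline = [("traverse", 25, 41), ("geology station", 30, 36),
            ("ALSEP deploy", 45, 52), ("traverse", 20, 33)]

overrun = defaultdict(float)
for category, planned, executed in timeline:
    overrun[category] += executed - planned  # accumulated discrepancy

print(dict(overrun))  # categories with the largest overruns stand out
```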
Polarization variations in installed fibers and their influence on quantum key distribution systems.
Ding, Yu-Yang; Chen, Hua; Wang, Shuang; He, De-Yong; Yin, Zhen-Qiang; Chen, Wei; Zhou, Zheng; Guo, Guang-Can; Han, Zheng-Fu
2017-10-30
Polarization variations in installed fibers are complex and volatile, and can severely affect the performance of polarization-sensitive quantum key distribution (QKD) systems. Based on recorded data about polarization variations in different installed fibers, we establish an analytical methodology to quantitatively evaluate the influence of polarization variations on polarization-sensitive QKD systems. Using the increased quantum bit error rate induced by polarization variations as a key criterion, we propose two parameters, polarization drift time and required tracking speed, to characterize polarization variations. For field buried and aerial fibers of different lengths, we quantitatively evaluate the influence of polarization variations, and also provide requirements and suggestions for the polarization basis alignment modules of QKD systems deployed in different kinds of fiber.
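A minimal sketch of how a drift trace might be turned into a polarization drift time, assuming a simple sin² model for the misalignment-induced QBER (the paper's exact model and thresholds are not reproduced):

```python
import numpy as np

def induced_qber(theta_rad):
    # Extra QBER from a polarization rotation of theta (sin^2 approximation)
    return np.sin(theta_rad) ** 2

def drift_time(times, thetas, qber_budget=0.01):
    # Time until polarization drift alone consumes the allotted QBER budget
    for t, th in zip(times, thetas):
        if induced_qber(th) > qber_budget:
            return t
    return times[-1]

# Hypothetical drift trace for a buried fiber: angle grows roughly as sqrt(t)
t = np.linspace(0.0, 3600.0, 361)   # seconds
theta = 0.002 * np.sqrt(t)          # radians (assumed)
print(drift_time(t, theta))         # seconds before realignment is needed
```

The required tracking speed follows analogously, from how quickly the angle must be compensated to stay within the same budget.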
Ohtsuki, Sumio; Hirayama, Mio; Ito, Shingo; Uchida, Yasuo; Tachikawa, Masanori; Terasaki, Tetsuya
2014-06-01
The blood-brain barrier (BBB) is formed by brain capillary endothelial cells linked together via complex tight junctions, and serves to prevent entry of drugs into the brain. Multiple transporters are expressed at the BBB, where they control exchange of materials between the circulating blood and brain interstitial fluid, thereby supporting and protecting the CNS. An understanding of the BBB is necessary for efficient development of CNS-acting drugs and to identify potential drug targets for treatment of CNS diseases. Quantitative targeted proteomics can provide detailed information on protein expression levels at the BBB. The present review highlights the latest applications of quantitative targeted proteomics in BBB research, specifically to evaluate species and in vivo-in vitro differences, and to reconstruct in vivo transport activity. Such a BBB quantitative proteomics approach can be considered as pharmacoproteomics.
Thoma, Brent; Camorlinga, Paola; Chan, Teresa M; Hall, Andrew Koch; Murnaghan, Aleisha; Sherbino, Jonathan
2018-01-01
Quantitative research is one of the many research methods used to help educators advance their understanding of questions in medical education. However, little research has been done on how to succeed in publishing in this area. We conducted a scoping review to identify key recommendations and reporting guidelines for quantitative educational research and scholarship. Medline, ERIC, and Google Scholar were searched for English-language articles published between 2006 and January 2016 using the search terms "research design," "quantitative," "quantitative methods," and "medical education." A hand search was completed for additional references during the full-text review. Titles/abstracts were reviewed by two authors (BT, PC) and included if they focused on quantitative research in medical education and outlined reporting guidelines, or provided recommendations on conducting quantitative research. One hundred articles were reviewed in parallel, with the first 30 used for calibration and the subsequent 70 to calculate Cohen's kappa coefficient. Two reviewers (BT, PC) conducted a full-text review and extracted recommendations and reporting guidelines. A simple thematic analysis summarized the extracted recommendations. Sixty-one articles were reviewed in full, and 157 recommendations were extracted. The thematic analysis identified 86 items, 14 categories, and 3 themes. Fourteen quality evaluation tools and reporting guidelines were found. This paper provides guidance for junior researchers in the form of key quality markers and reporting guidelines. We hope that quantitative researchers in medical education will be informed by the results and that further work will be done to refine the list of recommendations.
Ramus, Claire; Hovasse, Agnès; Marcellin, Marlène; Hesse, Anne-Marie; Mouton-Barbosa, Emmanuelle; Bouyssié, David; Vaca, Sebastian; Carapito, Christine; Chaoui, Karima; Bruley, Christophe; Garin, Jérôme; Cianférani, Sarah; Ferro, Myriam; Van Dorsselaer, Alain; Burlet-Schiltz, Odile; Schaeffer, Christine; Couté, Yohann; Gonzalez de Peredo, Anne
2016-01-30
Proteomic workflows based on nanoLC-MS/MS data-dependent-acquisition analysis have progressed tremendously in recent years. High-resolution and fast sequencing instruments have enabled the use of label-free quantitative methods, based either on spectral counting or on MS signal analysis, which appear as an attractive way to analyze differential protein expression in complex biological samples. However, the computational processing of the data for label-free quantification still remains a challenge. Here, we used a proteomic standard composed of an equimolar mixture of 48 human proteins (Sigma UPS1) spiked at different concentrations into a background of yeast cell lysate to benchmark several label-free quantitative workflows, involving different software packages developed in recent years. This experimental design allowed us to finely assess their performance in terms of sensitivity and false discovery rate, by measuring the numbers of true and false positives (UPS1 or yeast background proteins, respectively, found to be differential). The spiked standard dataset has been deposited in the ProteomeXchange repository with the identifier PXD001819 and can be used to benchmark other label-free workflows, adjust software parameter settings, improve algorithms for extraction of the quantitative metrics from raw MS data, or evaluate downstream statistical methods. Bioinformatic pipelines for label-free quantitative analysis must be objectively evaluated in their ability to detect variant proteins with good sensitivity and low false discovery rate in large-scale proteomic studies. This can be done through the use of complex spiked samples, for which the "ground truth" of variant proteins is known, allowing a statistical evaluation of the performance of the data processing workflow. We provide here such a controlled standard dataset and used it to evaluate the performance of several label-free bioinformatics tools (including MaxQuant, Skyline, MFPaQ, IRMa-hEIDI and Scaffold) in different workflows, for detection of variant proteins with different absolute expression levels and fold change values. The dataset presented here can be useful for tuning software tool parameters, for testing new algorithms for label-free quantitative analysis, or for evaluating downstream statistical methods. Copyright © 2015 Elsevier B.V. All rights reserved.
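Because the ground truth is known by construction, sensitivity and false discovery rate reduce to set arithmetic over the flagged proteins. A sketch with hypothetical identifiers:

```python
def benchmark(flagged, ups1_proteins, background_proteins):
    # Sensitivity and FDR when the spiked (variant) proteins are known a priori
    flagged = set(flagged)
    tp = len(flagged & set(ups1_proteins))        # spiked proteins recovered
    fp = len(flagged & set(background_proteins))  # yeast background flagged
    sensitivity = tp / len(ups1_proteins)
    fdr = fp / max(len(flagged), 1)
    return sensitivity, fdr

print(benchmark({"P02768ups", "P00915ups", "YDR155C"},
                {"P02768ups", "P00915ups", "P62988ups"},
                {"YDR155C", "YLR044C"}))  # -> (0.666..., 0.333...)
```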
A multicenter study benchmarks software tools for label-free proteome quantification.
Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan
2016-11-01
Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.
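LFQbench-style precision and accuracy metrics can be caricatured in a few lines: for proteins of one species spiked at a known ratio, accuracy is the deviation of the measured log-ratios from the expected value and precision is their spread. This is a simplification of what the R package reports, with invented numbers:

```python
import numpy as np

def lfq_metrics(log2_ratios, expected_log2):
    # Accuracy: median deviation from the expected log2 ratio
    # Precision: standard deviation of the measured log2 ratios
    r = np.asarray(log2_ratios, dtype=float)
    return np.median(r) - expected_log2, np.std(r, ddof=1)

# Hypothetical measured ratios for yeast proteins spiked at 2:1 (log2 = 1)
print(lfq_metrics([0.9, 1.1, 1.05, 0.8, 1.2, 0.95], 1.0))
```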
Knowles, D.B.
1955-01-01
The objective of the Ground Water Branch is to evaluate the occurrence, availability, and quality of ground water. The science of ground-water hydrology is applied toward attaining that goal. Although many ground-water investigations are of a qualitative nature, quantitative studies are necessarily an integral component of the complete evaluation of occurrence and availability. The worth of an aquifer as a fully developed source of water depends largely on two inherent characteristics: its ability to store, and its ability to transmit water. Furthermore, quantitative knowledge of these characteristics facilitates measurement of hydrologic entities such as recharge, leakage, evapotranspiration, etc. It is recognized that these two characteristics, referred to as the coefficients of storage and transmissibility, generally provide the very foundation on which quantitative studies are constructed. Within the science of ground-water hydrology, ground-water hydraulics methods are applied to determine these constants from field data.
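The classic quantitative link between these two coefficients and field data is the Theis solution for drawdown around a pumped well, s = Q W(u) / (4πT) with u = r²S/(4Tt). A sketch with invented pumping-test values:

```python
import numpy as np
from scipy.special import exp1  # Theis well function W(u) = E1(u)

def theis_drawdown(r, t, Q, T, S):
    # Drawdown s(r, t) for transmissibility T and storage coefficient S
    u = r**2 * S / (4.0 * T * t)
    return Q / (4.0 * np.pi * T) * exp1(u)

# Hypothetical test: Q in m^3/day, T in m^2/day, S dimensionless
print(theis_drawdown(r=50.0, t=1.0, Q=500.0, T=300.0, S=2e-4))  # metres
```

Fitting this curve to observed drawdowns is one standard way the two coefficients are determined from field data.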
Austin, John H. M.; Hogg, James C.; Grenier, Philippe A.; Kauczor, Hans-Ulrich; Bankier, Alexander A.; Barr, R. Graham; Colby, Thomas V.; Galvin, Jeffrey R.; Gevenois, Pierre Alain; Coxson, Harvey O.; Hoffman, Eric A.; Newell, John D.; Pistolesi, Massimo; Silverman, Edwin K.; Crapo, James D.
2015-01-01
The purpose of this statement is to describe and define the phenotypic abnormalities that can be identified on visual and quantitative evaluation of computed tomographic (CT) images in subjects with chronic obstructive pulmonary disease (COPD), with the goal of contributing to a personalized approach to the treatment of patients with COPD. Quantitative CT is useful for identifying and sequentially evaluating the extent of emphysematous lung destruction, changes in airway walls, and expiratory air trapping. However, visual assessment of CT scans remains important to describe patterns of altered lung structure in COPD. The classification system proposed and illustrated in this article provides a structured approach to visual and quantitative assessment of COPD. Emphysema is classified as centrilobular (subclassified as trace, mild, moderate, confluent, and advanced destructive emphysema), panlobular, and paraseptal (subclassified as mild or substantial). Additional important visual features include airway wall thickening, inflammatory small airways disease, tracheal abnormalities, interstitial lung abnormalities, pulmonary arterial enlargement, and bronchiectasis. © RSNA, 2015 PMID:25961632
NASA Astrophysics Data System (ADS)
Wu, Tao; Cheung, Tak-Hong; Yim, So-Fan; Qu, Jianan Y.
2010-03-01
A quantitative colposcopic imaging system for the diagnosis of early cervical cancer is evaluated in a clinical study. This imaging technology based on 3-D active stereo vision and motion tracking extracts diagnostic information from the kinetics of acetowhitening process measured from the cervix of human subjects in vivo. Acetowhitening kinetics measured from 137 cervical sites of 57 subjects are analyzed and classified using multivariate statistical algorithms. Cross-validation methods are used to evaluate the performance of the diagnostic algorithms. The results show that an algorithm for screening precancer produced 95% sensitivity (SE) and 96% specificity (SP) for discriminating normal and human papillomavirus (HPV)-infected tissues from cervical intraepithelial neoplasia (CIN) lesions. For a diagnostic algorithm, 91% SE and 90% SP are achieved for discriminating normal tissue, HPV infected tissue, and low-grade CIN lesions from high-grade CIN lesions. The results demonstrate that the quantitative colposcopic imaging system could provide objective screening and diagnostic information for early detection of cervical cancer.
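The reported screening figures come from cross-validated classification of kinetic features. A schematic sketch with synthetic two-dimensional features standing in for the measured acetowhitening kinetics; the classifier and feature choices are assumptions:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_predict

# Synthetic kinetic features per cervical site; 0 = normal/HPV, 1 = CIN
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 1.0, (80, 2)), rng.normal(1.5, 1.0, (57, 2))])
y = np.array([0] * 80 + [1] * 57)

pred = cross_val_predict(LinearDiscriminantAnalysis(), X, y, cv=LeaveOneOut())
print(np.mean(pred[y == 1] == 1),   # sensitivity on lesion sites
      np.mean(pred[y == 0] == 0))   # specificity on normal/HPV sites
```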
Criteria for quantitative and qualitative data integration: mixed-methods research methodology.
Lee, Seonah; Smith, Carrol A M
2012-05-01
Many studies have emphasized the need and importance of a mixed-methods approach for evaluation of clinical information systems. However, those studies had no criteria to guide integration of multiple data sets. Integrating different data sets puts into practice the paradigm that a mixed-methods approach advocates; criteria are therefore needed to guide the integration of quantitative and qualitative data. The first author organized a set of criteria from a literature search on the integration of multiple data sets in mixed-methods research. The purpose of this article was to reorganize the identified criteria. Through critical appraisal of the reasons for designing mixed-methods research, three criteria resulted: validation, complementarity, and discrepancy. In applying the criteria to empirical data of a previous mixed-methods study, integration of quantitative and qualitative data was achieved in a systematic manner. It helped us obtain a better-organized understanding of the results. The criteria of this article offer the potential to produce insightful analyses of mixed-methods evaluations of health information systems.
Friedman, Karen A; Raimo, John; Spielmann, Kelly; Chaudhry, Saima
2016-01-01
Under the Next Accreditation System, programs need to find ways to collect and assess meaningful reportable information on their residents to assist the program director regarding resident milestone progression. This paper discusses the process that one large Internal Medicine Residency Program used to provide both quantitative and qualitative data to its clinical competency committee (CCC) through the creation of a resident dashboard. Program leadership at a large university-based program developed four new end-of-rotation evaluations based on the American Board of Internal Medicine (ABIM) and Accreditation Council for Graduate Medical Education's (ACGME) 22 reportable milestones. A resident dashboard was then created to pull together both milestone- and non-milestone-based quantitative data and qualitative data compiled from faculty, nurses, peers, staff, and patients. Dashboards were distributed to the members of the CCC in preparation for the semiannual CCC meeting. CCC members adjudicated quantitative and qualitative data to present their cohort of residents at the CCC meeting. Based on the committee's response, evaluation scores remained the same or were adjusted. Final milestone scores were then entered into the accreditation data system (ADS) on the ACGME website. The process of resident assessment is complex and should comprise both quantitative and qualitative data. The dashboard is a valuable tool for program leadership to use both when evaluating house staff on a semiannual basis at the CCC and when presenting feedback to the resident in person.
Kisingo, Alex; Rollins, Rick; Murray, Grant; Dearden, Phil; Clarke, Marlea
2016-10-01
Protected areas (PAs) can provide important benefits to conservation and to communities. A key factor in the effective delivery of these benefits is the role of governance. There has been a growth in research developing frameworks to evaluate 'good' PA governance, usually drawing on a set of principles that are associated with groups of indicators. In contrast to dominant qualitative approaches, this paper describes the development of a quantitative method for measuring the effectiveness of protected area governance, as perceived by stakeholders in the Greater Serengeti Ecosystem in Tanzania. The instrument comprised a set of 65 statements related to governance principles, developed from a literature review, and was administered to 389 individuals from communities located near PAs in the Greater Serengeti Ecosystem. The results of a factor analysis suggest that statements load onto 10 factors that demonstrate high psychometric validity as measured by factor loadings, explained variance, and Cronbach's alpha reliability. The ten common factors that were extracted were: 1) legitimacy, 2) transparency and accountability, 3) responsiveness, 4) fairness, 5) participation, 6) ecosystem based management (EBM) and connectivity, 7) resilience, 8) achievements, 9) consensus orientation, and 10) power. The paper concludes that quantitative surveys can be used to evaluate governance of protected areas from a community-level perspective. Copyright © 2016 Elsevier Ltd. All rights reserved.
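Cronbach's alpha, one of the psychometric checks cited above, is easy to state concretely. A sketch with hypothetical Likert responses to one factor's statements:

```python
import numpy as np

def cronbach_alpha(items):
    # items: (n_respondents, n_items) scores for the statements of one factor
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)

# Hypothetical responses from six respondents to a four-statement factor
print(cronbach_alpha([[4, 5, 4, 4], [2, 2, 3, 2], [5, 5, 4, 5],
                      [3, 3, 3, 4], [4, 4, 5, 4], [1, 2, 2, 1]]))
```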
Rosedale, Mary; Malaspina, Dolores; Malamud, Daniel; Strauss, Shiela M; Horne, Jaclyn D; Abouzied, Salman; Cruciani, Ricardo A; Knotkova, Helena
2012-01-01
This article reports and discusses how quantitative (physiological and behavioral) and qualitative methods are being combined in an open-label pilot feasibility study. The study evaluates safety, tolerability, and acceptability of a protocol to treat depression in HIV-infected individuals, using a 2-week block of transcranial direct current stimulation (tDCS) over the dorsolateral prefrontal cortex. Major depressive disorder (MDD) is the second most prevalent psychiatric disorder after substance abuse among HIV-positive adults, and novel antidepressant treatments are needed for this vulnerable population. The authors describe the challenges and contributions derived from different research perspectives and methodological approaches and provide a philosophical framework for combining quantitative and qualitative measurements for a fuller examination of the disorder. Four methodological points are presented: (1) the value of combining quantitative and qualitative approaches; (2) the need for context-specific measures when studying patients with medical and psychiatric comorbidities; (3) the importance of research designs that integrate physiological, behavioral, and qualitative approaches when evaluating novel treatments; and (4) the need to explore the relationships between biomarkers, clinical symptom assessments, patient self-evaluations, and patient experiences when developing new, patient-centered protocols. The authors conclude that the complexity of studying novel treatments in complex and new patient populations requires complex research designs to capture the richness of data that inform translational research.
Niwa, Masahiro; Hiraishi, Yasuhiro
2014-01-30
Tablets are the most common form of solid oral dosage produced by pharmaceutical industries. There are several challenges to successful and consistent tablet manufacturing. One well-known quality issue is visible surface defects, which generally occur due to insufficient physical strength, causing breakage or abrasion during processing, packaging, or shipping. Techniques that allow quantitative evaluation of surface strength and the risk of surface defect would greatly aid in quality control. Here terahertz pulsed imaging (TPI) was employed to evaluate the surface properties of core tablets with visible surface defects of varying severity after film coating. Other analytical methods, such as tensile strength measurements, friability testing, and scanning electron microscopy (SEM), were used to validate TPI results. Tensile strength and friability provided no information on visible surface defect risk, whereas the TPI-derived unique parameter terahertz electric field peak strength (TEFPS) provided spatial distribution of surface density/roughness information on core tablets, which helped in estimating tablet abrasion risk prior to film coating and predicting the location of the defects. TPI also revealed the relationship between surface strength and blending condition and is a nondestructive, quantitative approach to aid formulation development and quality control that can reduce visible surface defect risk in tablets. Copyright © 2013 Elsevier B.V. All rights reserved.
Fuentes, Ramón; Arias, Alain; Lezcano, María Florencia; Saravia, Diego; Kuramochi, Gisaku; Dias, Fernando José
2017-01-01
Masticatory movements have been studied for decades in odontology; a better understanding of them could improve dental treatments. The aim of this study was to describe an innovative, accurate, and systematic method of analyzing masticatory cycles, generating comparable quantitative data. The masticatory cycles of 5 volunteers (Class I, 19 ± 1.7 years) without articular or dental occlusion problems were evaluated using 3D electromagnetic articulography supported by MATLAB software. The method allows the trajectory morphology of the set of chewing cycles to be analyzed from different views and angles. It was also possible to individualize the trajectory of each cycle providing accurate quantitative data, such as number of cycles, cycle areas in frontal view, and the ratio between each cycle area and the frontal mandibular border movement area. There was a moderate negative correlation (-0.61) between the area and the number of cycles: the greater the cycle area, the smaller the number of repetitions. Finally it was possible to evaluate the area of the cycles through time, which did not reveal a standardized behavior. The proposed method provided reproducible, intelligible, and accurate quantitative and graphical data, suggesting that it is promising and may be applied in different clinical situations and treatments.
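Two of the reported quantities, the frontal-view area of a cycle and its correlation with the number of cycles, can be sketched directly; the trajectory and per-subject values below are invented:

```python
import numpy as np

def cycle_area(x, y):
    # Shoelace area of one chewing cycle's closed frontal-plane trajectory
    return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

# Toy rectangle as a stand-in for one digitized cycle: area = 12.0
print(cycle_area(np.array([0.0, 4.0, 4.0, 0.0]), np.array([0.0, 0.0, 3.0, 3.0])))

# Hypothetical per-subject summaries: mean cycle area (mm^2) vs cycle count
areas = np.array([38.2, 51.7, 29.4, 60.3, 44.8])
n_cycles = np.array([17, 12, 21, 10, 14])
print(np.corrcoef(areas, n_cycles)[0, 1])  # negative, as in the reported trend
```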
Tataw, David Besong; Ekúndayò, Olúgbémiga T
2017-01-01
This article reports on the use of a sequential and integrated mixed-methods approach in a focused population and small-area analysis. The study framework integrates focus groups, survey research, and community engagement strategies in a search for evidence related to prostate cancer screening services utilization as a component of cancer prevention planning in a marginalized African American community in the United States. Research and data analysis methods are synthesized by aggregation, configuration, and interpretive analysis. The results of synthesis show that qualitative and quantitative data validate and complement each other in advancing our knowledge of population characteristics, variable associations, the complex context in which variables exist, and the best options for prevention and service planning. Synthesis of findings and interpretive analysis provided two important explanations for results that seemed inexplicable in regression outputs: (a) Focus group data on the limitations of the church as an educational source explain the negative association between preferred educational channels and screening behavior found in quantitative analysis. (b) Focus group data on unwelcoming provider environments explain the inconsistent relationship between knowledge of local sites and screening services utilization found in quantitative analysis. The findings suggest that planners, evaluators, and scientists should grow their planning and evaluation evidence from the community they serve.
Quanjel, Tessa C C; Spreeuwenberg, Marieke D; Struijs, Jeroen N; Baan, Caroline A; Ruwaard, Dirk
2017-09-06
In an attempt to deal with the pressures on the health-care system and to guarantee sustainability, changes are needed. This study focuses on a cardiology primary care plus intervention. Primary care plus (PC+) is a new health-care delivery model focused on substitution of specialist care in the hospital setting with specialist care in the primary care setting. The intervention consists of a cardiology PC+ centre in which cardiologists, supported by other health-care professionals, provide consultations in a primary care setting. The PC+ centre aims to improve the health of the population and the quality of care as experienced by patients, and to reduce the number of referrals to hospital-based outpatient specialist care in order to reduce health-care costs. These aims reflect the Triple Aim principle. Hence, the objectives of the study are to evaluate the cardiology PC+ centre in terms of the Triple Aim outcomes and to evaluate the process of the introduction of PC+. The study is a practice-based, quantitative study with a longitudinal observational design, complemented by a qualitative study to supplement, interpret and improve the quantitative findings. The study population of the quantitative part will consist of adult patients (≥18 years) with non-acute and low-complexity cardiology-related health complaints, who will be referred to the cardiology PC+ centre (intervention group) or hospital-based outpatient cardiology care (control group). All eligible patients will be asked to complete questionnaires at three different time points, consisting of questions about their demographics, health status and experience of care. Additionally, quantitative data will be collected about health-care utilization and related health-care costs at the PC+ centre and the hospital. The qualitative part, consisting of semi-structured interviews, focus groups, and observations, is designed to evaluate the process as well as to amplify, clarify and explain quantitative results. This study will evaluate a cardiology PC+ centre using quantitative and supplementary qualitative methods. The findings of both sub-studies will fill a gap in knowledge about the effects of PC+ and in particular whether PC+ is able to pursue the Triple Aim outcomes. NTR6629 (Data registered: 25-08-2017) (registered retrospectively).
US EPA’s ToxCast research program evaluates bioactivity for thousands of chemicals utilizing high-throughput screening assays to inform chemical testing decisions. Vala Sciences provides high content, multiplexed assays that utilize quantitative cell-based digital image analysis....
Methods for Evaluating Practice Change Toward a Patient-Centered Medical Home
Jaén, Carlos Roberto; Crabtree, Benjamin F.; Palmer, Raymond F.; Ferrer, Robert L.; Nutting, Paul A.; Miller, William L.; Stewart, Elizabeth E.; Wood, Robert; Davila, Marivel; Stange, Kurt C.
2010-01-01
PURPOSE Understanding the transformation of primary care practices to patient-centered medical homes (PCMHs) requires making sense of the change process, multilevel outcomes, and context. We describe the methods used to evaluate the country’s first national demonstration project of the PCMH concept, with an emphasis on the quantitative measures and lessons for multimethod evaluation approaches. METHODS The National Demonstration Project (NDP) was a group-randomized clinical trial of facilitated and self-directed implementation strategies for the PCMH. An independent evaluation team developed an integrated package of quantitative and qualitative methods to evaluate the process and outcomes of the NDP for practices and patients. Data were collected by an ethnographic analyst and a research nurse who visited each practice, and from multiple data sources including a medical record audit, patient and staff surveys, direct observation, interviews, and text review. Analyses aimed to provide real-time feedback to the NDP implementation team and lessons that would be transferable to the larger practice, policy, education, and research communities. RESULTS Real-time analyses and feedback appeared to be helpful to the facilitators. Medical record audits provided data on process-of-care outcomes. Patient surveys contributed important information about patient-rated primary care attributes and patient-centered outcomes. Clinician and staff surveys provided important practice experience and organizational data. Ethnographic observations supplied insights about the process of practice development. Most practices were not able to provide detailed financial information. CONCLUSIONS A multimethod approach is challenging, but feasible and vital to understanding the process and outcome of a practice development process. Additional longitudinal follow-up of NDP practices and their patients is needed. PMID:20530398
Tanaka, Yohei; Tsunemi, Yuichiro; Kawashima, Makoto; Tatewaki, Naoto; Nishida, Hiroshi
2013-01-01
Background Near-infrared has been shown to penetrate deeper than optical light sources independent of skin color, allowing safer treatment for the Asian skin type. Many studies have indicated the efficacy of various types of devices, but have not included a sufficiently objective evaluation. In this study, we used three-dimensional imaging for objective evaluation of facial skin tightening using a water-filtered near-infrared device. Methods Twenty Japanese patients were treated with the water-filtered near-infrared (1,000–1,800 nm) device using a contact-cooling and nonfreezing gel stored in a freezer. Three-dimensional imaging was performed, and quantitative volume measurements were taken to evaluate the change in post-treatment volume. The patients then provided their subjective assessments. Results Objective assessments of the treated cheek volume evaluated by a three-dimensional color schematic representation with quantitative volume measurements showed significant improvement 3 months after treatment. The mean volume reduction at the last post-treatment visit was 2.554 ± 0.999 mL. The post-treatment volume was significantly reduced compared with the pretreatment volume in all patients (P < 0.0001). Eighty-five percent of patients reported satisfaction with the improvement of skin laxity, and 80% of patients reported satisfaction with improvement of rhytids, such as the nasolabial folds. Side effects, such as epidermal burns and scar formation, were not observed throughout the study. Conclusion The advantages of this water-filtered near-infrared treatment are its high efficacy for skin tightening, associated with a minimal level of discomfort and minimal side effects. Together, these characteristics facilitate our ability to administer repeated treatments and provide alternative or adjunctive treatment for patients, with improved results. This study provides a qualitative and quantitative volumetric assessment, establishing the ability of this technology to reduce volume through noninvasive skin tightening. PMID:23837000
Blew, Robert M; Lee, Vinson R; Farr, Joshua N; Schiferl, Daniel J; Going, Scott B
2014-02-01
Peripheral quantitative computed tomography (pQCT) is an essential tool for assessing bone parameters of the limbs, but subject movement and its impact on image quality remains a challenge to manage. The current approach to determine image viability is by visual inspection, but pQCT lacks a quantitative evaluation. Therefore, the aims of this study were to (1) examine the reliability of a qualitative visual inspection scale and (2) establish a quantitative motion assessment methodology. Scans were performed on 506 healthy girls (9-13 years) at diaphyseal regions of the femur and tibia. Scans were rated for movement independently by three technicians using a linear, nominal scale. Quantitatively, a ratio of movement to limb size (%Move) provided a measure of movement artifact. A repeat-scan subsample (n = 46) was examined to determine %Move's impact on bone parameters. Agreement between measurers was strong (intraclass correlation coefficient = 0.732 for tibia, 0.812 for femur), but greater variability was observed in scans rated 3 or 4, the delineation between repeat and no repeat. The quantitative approach found ≥95% of subjects had %Move <25%. Comparison of initial and repeat scans by groups above and below 25% initial movement showed significant differences in the >25% grouping. A pQCT visual inspection scale can be a reliable metric of image quality, but technicians may periodically mischaracterize subject motion. The presented quantitative methodology yields more consistent movement assessment and could unify procedure across laboratories. Data suggest a delineation of 25% movement for determining whether a diaphyseal scan is viable or requires repeat.
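A minimal sketch of the movement screen described above: %Move is a ratio of movement artifact to limb size, and the 25% delineation decides whether a diaphyseal scan must be repeated. The input values below are illustrative, and the exact measurement of movement extent is not reproduced from the paper.

```python
def percent_move(movement_extent_mm: float, limb_diameter_mm: float) -> float:
    """%Move: movement artifact extent relative to limb size, in percent."""
    return 100.0 * movement_extent_mm / limb_diameter_mm

def needs_repeat(movement_extent_mm: float, limb_diameter_mm: float,
                 threshold_pct: float = 25.0) -> bool:
    """Apply the 25% delineation suggested by the study for diaphyseal scans."""
    return percent_move(movement_extent_mm, limb_diameter_mm) > threshold_pct

print(percent_move(4.0, 22.0))     # ~18.2 -> viable
print(needs_repeat(6.5, 22.0))     # True (~29.5%) -> repeat scan
```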
NASA Astrophysics Data System (ADS)
Narita, Y.; Iida, H.; Ebert, S.; Nakamura, T.
1997-12-01
Two independent scatter correction techniques, transmission-dependent convolution subtraction (TDCS) and the triple-energy window (TEW) method, were evaluated in terms of quantitative accuracy and noise properties using Monte Carlo simulation (EGS4). Emission projections (primary, scatter, and scatter plus primary) were simulated for three numerical phantoms for ²⁰¹Tl. Data were reconstructed with an ordered-subset EM algorithm including attenuation correction based on noiseless transmission data. The accuracy of the TDCS and TEW scatter corrections was assessed by comparison with simulated true primary data. The uniform cylindrical phantom simulation demonstrated better quantitative accuracy with TDCS than with TEW (-2.0% vs. 16.7%) and better S/N (6.48 vs. 5.05). A uniform ring myocardial phantom simulation demonstrated better homogeneity with TDCS than TEW in the myocardium; i.e., anterior-to-posterior wall count ratios were 0.99 and 0.76 with TDCS and TEW, respectively. For the MCAT phantom, TDCS provided good visual and quantitative agreement with the simulated true primary image without noticeably increasing the noise after scatter correction. Overall, TDCS proved to be more accurate and less noisy than TEW, facilitating quantitative assessment of physiological functions with SPECT.
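The TEW method evaluated above estimates scatter in the photopeak window from two narrow flanking windows. A minimal sketch of the standard trapezoidal per-pixel estimate follows; the window widths and counts are illustrative, and the dual-photopeak ²⁰¹Tl setup of the study is not reproduced.

```python
import numpy as np

def tew_primary(c_main, c_lower, c_upper, w_main=20.0, w_side=3.0):
    """Triple-energy window (TEW) scatter correction for one projection.

    Scatter under the photopeak is approximated by the trapezoid spanned by
    the two narrow side windows; the estimate is subtracted from the
    main-window counts. Inputs are per-pixel count arrays; widths in keV.
    """
    scatter = (c_lower / w_side + c_upper / w_side) * w_main / 2.0
    return np.clip(c_main - scatter, 0.0, None)  # no negative counts

# Illustrative single pixel: 100 counts in a 20 keV main window,
# 4 and 2 counts in the 3 keV side windows -> 80 primary counts.
print(tew_primary(np.array([100.0]), np.array([4.0]), np.array([2.0])))
```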
A method for evaluating the murine pulmonary vasculature using micro-computed tomography.
Phillips, Michael R; Moore, Scott M; Shah, Mansi; Lee, Clara; Lee, Yueh Z; Faber, James E; McLean, Sean E
2017-01-01
Significant mortality and morbidity are associated with alterations in the pulmonary vasculature. While techniques have been described for quantitative morphometry of whole-lung arterial trees in larger animals, no methods have been described in mice. We report a method for the quantitative assessment of murine pulmonary arterial vasculature using high-resolution computed tomography scanning. Mice were harvested at 2 weeks, 4 weeks, and 3 months of age. The pulmonary artery vascular tree was pressure perfused to maximal dilation with a radio-opaque casting material with viscosity and pressure set to prevent capillary transit and venous filling. The lungs were fixed and scanned on a specimen computed tomography scanner at 8-μm resolution, and the vessels were segmented. Vessels were grouped into categories based on lumen diameter and branch generation. Robust high-resolution segmentation was achieved, permitting detailed quantitation of pulmonary vascular morphometrics. As expected, postnatal lung development was associated with progressive increase in small-vessel number and arterial branching complexity. These methods for quantitative analysis of the pulmonary vasculature in postnatal and adult mice provide a useful tool for the evaluation of mouse models of disease that affect the pulmonary vasculature. Copyright © 2016 Elsevier Inc. All rights reserved.
Krizova, Aneta; Collakova, Jana; Dostal, Zbynek; Kvasnica, Lukas; Uhlirova, Hana; Zikmund, Tomas; Vesely, Pavel; Chmelik, Radim
2015-01-01
Quantitative phase imaging (QPI) brought innovation to noninvasive observation of live cell dynamics seen as cell behavior. Unlike the Zernike phase contrast or differential interference contrast, QPI provides quantitative information about cell dry mass distribution. We used such data for objective evaluation of live cell behavioral dynamics by the advanced method of dynamic phase differences (DPDs). The DPDs method is considered a rational instrument offered by QPI. By subtracting the antecedent from the subsequent image in a time-lapse series, only the changes in mass distribution in the cell are detected. The result is either visualized as a two-dimensional color-coded projection of these two states of the cell or as a time dependence of changes quantified in picograms. Then in a series of time-lapse recordings, the chain of cell mass distribution changes that would otherwise escape attention is revealed. Consequently, new salient features of live cell behavior should emerge. Construction of the DPDs method and results exhibiting the approach are presented. The advantage of the DPDs application is demonstrated on cells exposed to an osmotic challenge. For time-lapse acquisition of quantitative phase images, the recently developed coherence-controlled holographic microscope was employed.
Chen, Y; Mao, J; Lin, J; Yu, H; Peters, S; Shebley, M
2016-01-01
This subteam under the Drug Metabolism Leadership Group (Innovation and Quality Consortium) investigated the quantitative role of circulating inhibitory metabolites in drug–drug interactions using physiologically based pharmacokinetic (PBPK) modeling. Three drugs with major circulating inhibitory metabolites (amiodarone, gemfibrozil, and sertraline) were systematically evaluated in addition to the literature review of recent examples. The application of PBPK modeling in drug interactions by inhibitory parent–metabolite pairs is described and guidance on strategic application is provided. PMID:27642087
National Security Technology Incubator Evaluation Process
DOE Office of Scientific and Technical Information (OSTI.GOV)
None, None
This report describes the process by which the National Security Technology Incubator (NSTI) will be evaluated. The technology incubator is being developed as part of the National Security Preparedness Project (NSPP), funded by a Department of Energy (DOE)/National Nuclear Security Administration (NNSA) grant. This report includes a brief description of the components, steps, and measures of the proposed evaluation process. The purpose of the NSPP is to promote national security technologies through business incubation, technology demonstration and validation, and workforce development. The NSTI will focus on serving businesses with national security technology applications by nurturing them through critical stages of early development. An effective evaluation process of the NSTI is an important step as it can provide qualitative and quantitative information on incubator performance over a given period. The vision of the NSTI is to be a successful incubator of technologies and private enterprise that assist the NNSA in meeting new challenges in national safety and security. The mission of the NSTI is to identify, incubate, and accelerate technologies with national security applications at various stages of development by providing hands-on mentoring and business assistance to small businesses and emerging or growing companies. To achieve success for both incubator businesses and the NSTI program, an evaluation process is essential to effectively measure results and implement corrective processes in the incubation design if needed. The evaluation process design will collect and analyze qualitative and quantitative data through a performance evaluation system.
Comparative Evaluation of Quantitative Test Methods for Gases on a Hard Surface
2017-02-01
ECBC-TR-1426 (Vipin Rastogi et al.). Experimental design: each quantitative method was performed three times on three consecutive days. For the CD runs, three …
DWI at MR Enterography for Evaluating Bowel Inflammation in Crohn Disease.
Park, Seong Ho
2016-07-01
The purpose of this article is to provide a comprehensive review regarding DWI enterography used for evaluating Crohn disease and to summarize the relevant evidence. Active bowel inflammation in Crohn disease causes restricted diffusion on MR enterography with DWI. Enterographic DWI to evaluate Crohn disease is increasingly drawing attention for both academic research and clinical practice and has shown potential as a quantitative tool for assessing bowel inflammation. DWI enterography also has multiple unresolved issues and limitations.
Utilization potential evaluation of plant resources in the dry-hot valley of Jinsha River
NASA Astrophysics Data System (ADS)
Xi, Rong; Xu, Naizhong; Liu, Shengxiang; Ren, Tingyan
2017-08-01
Plant resources in the dry-hot valley of Jinsha River are endemic to this distinctive type of region. The article adopts the analytic hierarchy process (AHP) to evaluate the exploitation and utilization potential of thirty typical plant resources on the basis of their characteristics in the dry-hot valley of Jinsha River, providing scientific evidence for the quantitative evaluation of regional plant resources; we also suggest pathways for their protection and development.
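The core of an AHP evaluation is deriving criterion weights from a pairwise comparison matrix via its principal eigenvector, with a consistency check on the judgments. A minimal sketch follows; the three-criterion matrix is hypothetical, since the study's actual criteria and judgments are not reproduced here.

```python
import numpy as np

# Hypothetical pairwise comparison matrix on Saaty's 1-9 scale for three
# evaluation criteria (e.g., economic value, ecological value, abundance).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)              # principal eigenvalue index
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                             # priority weights (sum to 1)

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)     # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]      # Saaty's random index
print("weights:", w.round(3), "CR:", round(ci / ri, 3))  # CR < 0.1 acceptable
```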
The use of a battery of tracking tests in the quantitative evaluation of neurological function
NASA Technical Reports Server (NTRS)
Repa, B. S.; Albers, J. W.; Potvin, A. R.; Tourtellotte, W. W.
1972-01-01
A tracking test battery has been applied in a drug trial designed to compare the efficacy of L-DOPA and amantadine to that of L-DOPA and placebo in the treatment of 28 patients with Parkinson's disease. The drug trial provided an ideal opportunity for objectively evaluating the usefulness of tracking tests in assessing changes in neurologic function. Evaluating changes in patient performance resulting from disease progression and controlled clinical trials is of great importance in establishing effective treatment programs.
Patient-Centered Communication and Health Assessment with Youth
Munro, Michelle L.; Darling-Fisher, Cynthia S.; Ronis, David L.; Villarruel, Antonia M.; Pardee, Michelle; Faleer, Hannah; Fava, Nicole M.
2014-01-01
Background Patient-centered communication is the hallmark of care that incorporates the perspective of patients to provide tailored care that meets their needs and desires. However, at this time there has been limited evaluation of patient-provider communication involving youth. Objectives This manuscript will report on results from secondary analysis of data obtained during a participatory research-based randomized controlled trial designed to test a sexual risk event history calendar intervention with youth to address the following research questions: (a) Based on the event history calendar's (EHC) inclusion of contextual factors, does the EHC demonstrate improved communication outcomes (i.e., amount, satisfaction, mutuality, client involvement, client satisfaction, patient-provider interaction, and patient-centeredness) when compared to the Guidelines for Adolescent Preventive Services (GAPS) tool? and (b) How do patients and providers describe the characteristics of each tool with regard to patient-centered communication? Method This report will utilize a sequential explanatory mixed methods approach to evaluate communication. A split-plot design with one between factor (i.e., communication structure between EHC and GAPS) and one within factor (i.e., time between pretest and posttest) was used for analyses of data collected from male and female youth (n=186) and providers (n=9). Quantitative analysis of survey data evaluated changes in communication from pre-test to post-test. Qualitative data collected from open-ended questions, audio-taped visits, and exit interviews were employed to enhance interpretation of quantitative findings. Results Patient-centered communication using assessment tools (EHC and GAPS) with youth demonstrated improved communication outcomes both quantitatively and qualitatively. Additional analyses with subgroups of males and Arab-Americans demonstrated better post-intervention scores among the EHC group in certain aspects of communication. Qualitative results revealed that the EHC demonstrated improved outcomes in the four components of patient-centered communication including: validation of the patient's perspective; viewing the patient within context; reaching a shared understanding on needs and preferences; and helping the patient share power in the healthcare interaction. Discussion Though both tools provided a framework from which to conduct a clinical visit, the integrated time-linked assessment captured by the EHC enhanced the patient-centered communication in select groups compared to GAPS. PMID:24165214
Alvarez Del Arco, D; Rodríguez Rieiro, C; Sanchidrián De Blás, C; Alejos, B; Plá Mestre, R
2012-01-01
We examined the usefulness of incorporating a qualitative phase in the evaluation of the quality of care in a high-resolution medical service carried out with quantitative methods. Quantitative research was performed using a structured questionnaire, selecting interviewees by systematic randomized sampling methods (n=320). In addition, qualitative research was carried out through semi-structured interviews with patients selected by convenience criteria (n=11), observations in the care assistance circuit, and a group interview with health professionals working in the service. A multidisciplinary research team conducted an individual analysis of the information collected in both the quantitative and qualitative phases. Subsequently, three meetings based on group brainstorming techniques were held to identify the diverse contributions of each of the methodologies employed in the research, using affinity graphs to analyse the different results obtained in both phases and evaluate possible bias arising from the use of qualitative methods. Qualitative research allowed examining specific aspects of the health care service that had been collected in the quantitative phase, harmonizing the results obtained in the previous phase, giving in-depth data on the reasons for patient dissatisfaction with specific aspects, such as waiting times and available infrastructure, and identifying emerging issues of the service which had not been previously assessed. Overall, the qualitative phase enriched the results of the research. It is appropriate and recommendable to incorporate this methodological approach in research aimed at evaluating the quality of the service in specific health care settings, since it is provided first hand, by the voice of the customer. Copyright © 2011 SECA. Published by Elsevier Espana. All rights reserved.
Hetrick, Evan M; Kramer, Timothy T; Risley, Donald S
2017-03-17
Based on a column-screening exercise, a column ranking system was developed for sample mixtures containing any combination of 26 sugar and sugar alcohol analytes using 16 polar stationary phases in the HILIC mode with acetonitrile/water or acetone/water mobile phases. Each analyte was evaluated on the HILIC columns with gradient elution and the subsequent chromatography data was compiled into a statistical software package where any subset of the analytes can be selected and the columns are then ranked by the greatest separation. Since these analytes lack chromophores, aerosol-based detectors, including an evaporative light scattering detector (ELSD) and a charged aerosol detector (CAD) were employed for qualitative and quantitative detection. Example qualitative applications are provided to illustrate the practicality and efficiency of this HILIC column ranking. Furthermore, the design-space approach was used as a starting point for a quantitative method for the trace analysis of glucose in trehalose samples in a complex matrix. Knowledge gained from evaluating the design-space led to rapid development of a capable method as demonstrated through validation of the following parameters: specificity, accuracy, precision, linearity, limit of quantitation, limit of detection, and range. Copyright © 2017 Elsevier B.V. All rights reserved.
Prognostic Value of Quantitative Stress Perfusion Cardiac Magnetic Resonance.
Sammut, Eva C; Villa, Adriana D M; Di Giovine, Gabriella; Dancy, Luke; Bosio, Filippo; Gibbs, Thomas; Jeyabraba, Swarna; Schwenke, Susanne; Williams, Steven E; Marber, Michael; Alfakih, Khaled; Ismail, Tevfik F; Razavi, Reza; Chiribiri, Amedeo
2018-05-01
This study sought to evaluate the prognostic usefulness of visual and quantitative perfusion cardiac magnetic resonance (CMR) ischemic burden in an unselected group of patients and to assess the validity of consensus-based ischemic burden thresholds extrapolated from nuclear studies. There are limited data on the prognostic value of assessing myocardial ischemic burden by CMR, and there are none using quantitative perfusion analysis. Patients with suspected coronary artery disease referred for adenosine-stress perfusion CMR were included (n = 395; 70% male; age 58 ± 13 years). The primary endpoint was a composite of cardiovascular death, nonfatal myocardial infarction, aborted sudden death, and revascularization after 90 days. Perfusion scans were assessed visually and with quantitative analysis. Cross-validated Cox regression analysis and net reclassification improvement were used to assess the incremental prognostic value of visual or quantitative perfusion analysis over a baseline clinical model, initially as continuous covariates, then using accepted thresholds of ≥2 segments or ≥10% myocardium. After a median 460 days (interquartile range: 190 to 869 days) follow-up, 52 patients reached the primary endpoint. At 2 years, the addition of ischemic burden was found to increase prognostic value over a baseline model of age, sex, and late gadolinium enhancement (baseline model area under the curve [AUC]: 0.75; visual AUC: 0.84; quantitative AUC: 0.85). Dichotomized quantitative ischemic burden performed better than visual assessment (net reclassification improvement 0.043 vs. 0.003 against baseline model). This study was the first to address the prognostic benefit of quantitative analysis of perfusion CMR and to support the use of consensus-based ischemic burden thresholds by perfusion CMR for prognostic evaluation of patients with suspected coronary artery disease. Quantitative analysis provided incremental prognostic value to visual assessment and established risk factors, potentially representing an important step forward in the translation of quantitative CMR perfusion analysis to the clinical setting. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
Freed, Melanie; de Zwart, Jacco A; Hariharan, Prasanna; Myers, Matthew R; Badano, Aldo
2011-10-01
To develop a dynamic lesion phantom that is capable of producing physiological kinetic curves representative of those seen in human dynamic contrast-enhanced MRI (DCE-MRI) data. The objective of this phantom is to provide a platform for the quantitative comparison of DCE-MRI protocols to aid in the standardization and optimization of breast DCE-MRI. The dynamic lesion consists of a hollow, plastic mold with inlet and outlet tubes to allow flow of a contrast agent solution through the lesion over time. Border shape of the lesion can be controlled using the lesion mold production method. The configuration of the inlet and outlet tubes was determined using fluid transfer simulations. The total fluid flow rate was determined using x-ray images of the lesion for four different flow rates (0.25, 0.5, 1.0, and 1.5 ml/s) to evaluate the resultant kinetic curve shape and homogeneity of the contrast agent distribution in the dynamic lesion. High spatial and temporal resolution x-ray measurements were used to estimate the true kinetic curve behavior in the dynamic lesion for benign and malignant example curves. DCE-MRI example data were acquired of the dynamic phantom using a clinical protocol. The optimal inlet and outlet tube configuration for the lesion molds was two inlet tubes separated by 30° and a single outlet tube directly between the two inlet tubes. X-ray measurements indicated that 1.0 ml/s was an appropriate total fluid flow rate and provided truth for comparison with MRI data of kinetic curves representative of benign and malignant lesions. DCE-MRI data demonstrated the ability of the phantom to produce realistic kinetic curves. The authors have constructed a dynamic lesion phantom, demonstrated its ability to produce physiological kinetic curves, and provided estimations of its true kinetic curve behavior. This lesion phantom provides a tool for the quantitative evaluation of DCE-MRI protocols, which may lead to improved discrimination of breast cancer lesions.
Statistical modeling for visualization evaluation through data fusion.
Chen, Xiaoyu; Jin, Ran
2017-11-01
There is a high demand for data visualization that provides insights to users in various applications. However, a consistent, online visualization evaluation method to quantify mental workload or user preference is lacking, which leads to an inefficient visualization and user interface design process. Recently, the advancement of interactive and sensing technologies has made electroencephalogram (EEG) signals, eye movements, and visualization logs available in user-centered evaluation. This paper proposes a data fusion model and the application procedure for quantitative and online visualization evaluation. Fifteen participants joined the study based on three different visualization designs. The results provide a regularized regression model which can accurately predict the user's evaluation of task complexity, and indicate the significance of all three types of sensing data sets for visualization evaluation. This model can be widely applied to data visualization evaluation, and to other user-centered design evaluation and data analysis in human factors and ergonomics. Copyright © 2016 Elsevier Ltd. All rights reserved.
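The fused-sensor model above is a regularized regression predicting rated task complexity. A minimal sketch, assuming ridge regularization and synthetic stand-ins for the EEG, eye-movement, and log features; the study's actual feature set and choice of regularizer are not specified here.

```python
import numpy as np
from sklearn.linear_model import RidgeCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-ins for fused features per trial: e.g., EEG band power,
# fixation count, saccade rate, dwell time, interaction-log event counts.
X = rng.normal(size=(300, 6))
true_w = np.array([0.8, 0.5, 0.0, 0.3, 0.0, 0.2])
y = X @ true_w + rng.normal(scale=0.3, size=300)   # rated task complexity

model = make_pipeline(StandardScaler(),
                      RidgeCV(alphas=np.logspace(-3, 3, 13)))
model.fit(X, y)
print("R^2 on training data:", round(model.score(X, y), 3))
```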
Potyrailo, Radislav A; Chisholm, Bret J; Morris, William G; Cawse, James N; Flanagan, William P; Hassib, Lamyaa; Molaison, Chris A; Ezbiansky, Karin; Medford, George; Reitz, Hariklia
2003-01-01
Coupling of combinatorial chemistry methods with high-throughput (HT) performance testing and measurements of resulting properties has provided a powerful set of tools for the 10-fold accelerated discovery of new high-performance coating materials for automotive applications. Our approach replaces labor-intensive steps with automated systems for evaluation of adhesion of 8 x 6 arrays of coating elements that are discretely deposited on a single 9 x 12 cm plastic substrate. Performance of coatings is evaluated with respect to their resistance to adhesion loss, because this parameter is one of the primary considerations in end-use automotive applications. Our HT adhesion evaluation provides previously unavailable capabilities of high speed and reproducibility of testing by using a robotic automation, an expanded range of types of tested coatings by using the coating tagging strategy, and an improved quantitation by using high signal-to-noise automatic imaging. Upon testing, the coatings undergo changes that are impossible to quantitatively predict using existing knowledge. Using our HT methodology, we have developed several coatings leads. These HT screening results for the best coating compositions have been validated on the traditional scales of coating formulation and adhesion loss testing. These validation results have confirmed the superb performance of combinatorially developed coatings over conventional coatings on the traditional scale.
Hintersteiner, Ingrid; Himmelsbach, Markus; Buchberger, Wolfgang W
2015-02-01
In recent years, the development of reliable methods for the quantitation of microplastics in different samples, including evaluating the particles' adverse effects in the marine environment, has become a great concern. Because polyolefins are the most prevalent type of polymer in personal-care products containing microplastics, this study presents a novel approach for their quantitation. The method is suitable for aqueous and hydrocarbon-based products, and includes a rapid sample clean-up involving twofold density separation and a subsequent quantitation with high-temperature gel-permeation chromatography. In contrast with previous procedures, both errors caused by weighing after insufficient separation of plastics and matrix and time-consuming visual sorting are avoided. In addition to reliable quantitative results, in this investigation a comprehensive characterization of the polymer particles isolated from the product matrix, covering size, shape, molecular weight distribution and stabilization, is provided. Results for seven different personal-care products are presented. Recoveries of this method were in the range of 92-96 %.
How important is aspirin adherence when evaluating effectiveness of low-dose aspirin?
Navaratnam, Kate; Alfirevic, Zarko; Pirmohamed, Munir; Alfirevic, Ana
2017-12-01
Low-dose aspirin (LDA) is advocated for women at high risk of pre-eclampsia, providing a modest (10%) reduction in risk. Cardiology meta-analyses demonstrate an 18% reduction in serious vascular events with LDA. Non-responsiveness to aspirin (sometimes termed aspirin resistance) and variable clinical effectiveness are often attributed to suboptimal adherence. The aim of this review was to identify the scope of adherence assessments in RCTs evaluating aspirin effectiveness in cardiology and obstetrics and to discuss the quality of information provided by current methods. We searched MEDLINE, EMBASE and the Cochrane Library, limited to humans and English language, for RCTs evaluating aspirin in cardiology (14/03/13-13/03/16) and pregnancy (1957-13/03/16). Search terms 'aspirin', 'acetylsalicylic acid' appearing adjacent to 'myocardial infarction' or 'pregnancy', 'pregnant', 'obstetric' were used. 38% (25/68) of obstetric and 32% (20/62) of cardiology RCTs assessed aspirin adherence, and 24% (6/25) and 29% (6/21) of obstetric and cardiology RCTs, respectively, defined acceptable adherence. Semi-quantitative methods (pill counts, medication weighing) prevailed in obstetric RCTs (93%); qualitative methods (interviews, questionnaires) were more frequent in cardiology (67%). Two obstetric RCTs quantified serum thromboxane B2 and salicylic acid, but no quantitative methods were used in cardiology. Aspirin has proven efficacy, but suboptimal adherence is widespread and difficult to accurately quantify. Little is currently known about aspirin adherence in pregnancy. RCTs evaluating aspirin effectiveness show over-reliance on qualitative adherence assessments vulnerable to inherent inaccuracies. Reliable adherence data are important to assess and optimise the clinical effectiveness of LDA. We propose that adherence should be formally assessed in future trials and that development of quantitative assessments may prove valuable for trial protocols. Copyright © 2017 Elsevier B.V. All rights reserved.
Wang, Shuo; Poon, Gregory M K; Wilson, W David
2015-01-01
Biosensor-surface plasmon resonance (SPR) technology has emerged as a powerful label-free approach for the study of nucleic acid interactions in real time. The method provides simultaneous equilibrium and kinetic characterization for biomolecular interactions with low sample requirements and without the need for external probes. A detailed and practical guide for protein-DNA interaction analyses using biosensor-SPR methods is presented. Details of SPR technology and basic fundamentals are described with recommendations on the preparation of the SPR instrument, sensor chips and samples, experimental design, quantitative and qualitative data analyses and presentation. A specific example of the interaction of a transcription factor with DNA is provided with results evaluated by both kinetic and steady-state SPR methods.
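For the steady-state SPR analysis mentioned above, equilibrium responses at several analyte concentrations are fitted to a 1:1 interaction isotherm to recover the dissociation constant. A minimal sketch with invented response values; the transcription-factor system of the chapter is not reproduced.

```python
import numpy as np
from scipy.optimize import curve_fit

def steady_state(conc, rmax, kd):
    """1:1 interaction isotherm: equilibrium response vs. concentration."""
    return rmax * conc / (kd + conc)

# Invented equilibrium responses (RU) at six analyte concentrations (M).
conc = np.array([1, 3, 10, 30, 100, 300]) * 1e-9
resp = np.array([3.1, 8.7, 23.5, 45.2, 68.9, 83.4])

(rmax, kd), _ = curve_fit(steady_state, conc, resp, p0=[100.0, 3e-8])
print(f"Rmax = {rmax:.1f} RU, KD = {kd:.2g} M")
```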
Zhao, Cheng; Trudeau, Beth; Xie, Helen; Prostko, John; Fishpaugh, Jeffrey; Ramsay, Carol
2014-06-01
The absolute quantitation of targeted proteins using MS provides a promising method to evaluate/verify biomarkers used in clinical diagnostics. In this study, a cardiac biomarker, troponin I (TnI), was used as a model protein for method development. The epitope peptide of TnI was characterized by epitope excision followed by an LC/MS/MS method and acted as the surrogate peptide for the targeted protein quantitation. The MRM-based MS assay used a stable internal standard, which improved the selectivity, specificity, and sensitivity of the protein quantitation. Also, plasma albumin depletion and affinity enrichment of TnI by anti-TnI mAb-coated microparticles reduced the sample complexity, enhanced the dynamic range, and further improved the detection sensitivity of the targeted protein in the biological matrix. Therefore, quantitation of TnI, a low-abundance protein in human plasma, has demonstrated the applicability of the targeted protein quantitation strategy through its epitope peptide determined by an epitope mapping method. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Knudsen, Anders Dahl; Bennike, Tue; Kjeldal, Henrik; Birkelund, Svend; Otzen, Daniel Erik; Stensballe, Allan
2014-05-30
We describe Condenser, a freely available, comprehensive open-source tool for merging multidimensional quantitative proteomics data from the Matrix Science Mascot Distiller Quantitation Toolbox into a common format ready for subsequent bioinformatic analysis. A number of different relative quantitation technologies, such as metabolic ¹⁵N and amino acid stable isotope incorporation, label-free and chemical-label quantitation are supported. The program features multiple options for curative filtering of the quantified peptides, allowing the user to choose data quality thresholds appropriate for the current dataset, and ensure the quality of the calculated relative protein abundances. Condenser also features optional global normalization, peptide outlier removal, multiple testing and calculation of t-test statistics for highlighting and evaluating proteins with significantly altered relative protein abundances. Condenser provides an attractive addition to the gold-standard quantitative workflow of Mascot Distiller, allowing easy handling of larger multi-dimensional experiments. Source code, binaries, test data set and documentation are available at http://condenser.googlecode.com/. Copyright © 2014 Elsevier B.V. All rights reserved.
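As a sketch of the post-processing steps named above (global normalization and t-test statistics on relative abundances), the fragment below median-normalizes log-transformed peptide ratios and tests one protein for abundance change. The values and the one-sample test are illustrative, not Condenser's exact implementation.

```python
import numpy as np
from scipy import stats

# Invented heavy/light peptide ratios for one protein: 2 peptides x 4 replicates.
ratios = np.array([[1.9, 2.2, 1.8, 2.4],
                   [2.1, 1.7, 2.3, 2.0]])
log_ratios = np.log2(ratios)

# Global normalization: subtract the median log2 ratio of the whole run
# (a single invented constant here; in practice computed from all peptides).
log_ratios -= 0.15

protein = log_ratios.mean(axis=0)        # per-replicate protein estimate
t, p = stats.ttest_1samp(protein, 0.0)   # H0: no change in relative abundance
print(f"mean log2 ratio = {protein.mean():.2f}, t = {t:.2f}, p = {p:.3f}")
```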
Buckler, Andrew J; Bresolin, Linda; Dunnick, N Reed; Sullivan, Daniel C; Aerts, Hugo J W L; Bendriem, Bernard; Bendtsen, Claus; Boellaard, Ronald; Boone, John M; Cole, Patricia E; Conklin, James J; Dorfman, Gary S; Douglas, Pamela S; Eidsaunet, Willy; Elsinger, Cathy; Frank, Richard A; Gatsonis, Constantine; Giger, Maryellen L; Gupta, Sandeep N; Gustafson, David; Hoekstra, Otto S; Jackson, Edward F; Karam, Lisa; Kelloff, Gary J; Kinahan, Paul E; McLennan, Geoffrey; Miller, Colin G; Mozley, P David; Muller, Keith E; Patt, Rick; Raunig, David; Rosen, Mark; Rupani, Haren; Schwartz, Lawrence H; Siegel, Barry A; Sorensen, A Gregory; Wahl, Richard L; Waterton, John C; Wolf, Walter; Zahlmann, Gudrun; Zimmerman, Brian
2011-06-01
Quantitative imaging biomarkers could speed the development of new treatments for unmet medical needs and improve routine clinical care. However, it is not clear how the various regulatory and nonregulatory (eg, reimbursement) processes (often referred to as pathways) relate, nor is it clear which data need to be collected to support these different pathways most efficiently, given the time- and cost-intensive nature of doing so. The purpose of this article is to describe current thinking regarding these pathways emerging from diverse stakeholders interested and active in the definition, validation, and qualification of quantitative imaging biomarkers and to propose processes to facilitate the development and use of quantitative imaging biomarkers. A flexible framework is described that may be adapted for each imaging application, providing mechanisms that can be used to develop, assess, and evaluate relevant biomarkers. From this framework, processes can be mapped that would be applicable to both imaging product development and to quantitative imaging biomarker development aimed at increasing the effectiveness and availability of quantitative imaging. http://radiology.rsna.org/lookup/suppl/doi:10.1148/radiol.10100800/-/DC1. RSNA, 2011
General Science Education Billed Inadequate.
ERIC Educational Resources Information Center
Chemical and Engineering News, 1980
1980-01-01
Reviews findings of a government report commissioned by the White House to evaluate the quantitative and qualitative adequacy of professional scientists and engineers at all degree levels, as well as the capacity of the United States educational system to provide good scientific and technical education for all Americans. (CS)
Simple X-ray diffraction algorithm for direct determination of cotton crystallinity
USDA-ARS?s Scientific Manuscript database
Traditionally, XRD has been used to study the crystalline structure of cotton celluloses. Despite considerable efforts in developing the curve-fitting protocol to evaluate the crystallinity index (CI), in its present state, XRD measurement can only provide a qualitative or semi-quantitative assessme...
Traditional toxicity testing provides insight into the mechanisms underlying toxicological responses but requires a high investment in a large number of resources. The new paradigm of testing approaches involves rapid screening studies able to evaluate thousands of chemicals acro...
Networked Resources, Assessment and Collection Development
ERIC Educational Resources Information Center
Samson, Sue; Derry, Sebastian; Eggleston, Holly
2004-01-01
This project provides a critical evaluation of networked resources as they relate to the library's collection development policy, identifies areas of the curriculum not well represented, establishes a reliable method of assessing usage across all resources, and develops a framework of quantitative data for collection development decision making.
NASA Astrophysics Data System (ADS)
Pohl, L.; Kaiser, M.; Ketelhut, S.; Pereira, S.; Goycoolea, F.; Kemper, Björn
2016-03-01
Digital holographic microscopy (DHM) enables high-resolution non-destructive inspection of technical surfaces and minimally-invasive label-free live cell imaging. However, the analysis of confluent cell layers represents a challenge, as quantitative DHM phase images in this case do not provide sufficient information for image segmentation, determination of the cellular dry mass, or calculation of the cell thickness. We present novel strategies for the analysis of confluent cell layers with quantitative DHM phase contrast utilizing a histogram-based evaluation procedure. The applicability of our approach is illustrated by quantification of drug-induced cell morphology changes, and it is shown that the method is capable of reliably quantifying global morphology changes of confluent cell layers.
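A minimal sketch in the spirit of the histogram-based evaluation described above: pooled phase-value histograms are reduced to a scalar (here, the histogram mean) so that drug-induced global morphology changes of a confluent layer can be tracked without segmenting single cells. The bin settings and phase maps are illustrative, not the authors' exact procedure.

```python
import numpy as np

def histogram_mean_phase(phase_img, bins=256, value_range=(0.0, 8.0)):
    """Mean phase of a quantitative phase image via its pooled histogram."""
    h, edges = np.histogram(phase_img, bins=bins, range=value_range)
    centers = 0.5 * (edges[:-1] + edges[1:])
    return np.sum(h * centers) / np.sum(h)

def global_morphology_shift(phase_pre, phase_post, **kw):
    """Histogram-mean difference as a scalar measure of global change."""
    return (histogram_mean_phase(phase_post, **kw)
            - histogram_mean_phase(phase_pre, **kw))

# Stand-in phase maps (radians) before/after a drug-induced change.
rng = np.random.default_rng(0)
pre = rng.normal(2.0, 0.3, size=(256, 256))
post = rng.normal(2.4, 0.3, size=(256, 256))
print(round(global_morphology_shift(pre, post), 2))   # ~0.4 rad
```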
Barr, R Graham; Berkowitz, Eugene A; Bigazzi, Francesca; Bode, Frederick; Bon, Jessica; Bowler, Russell P; Chiles, Caroline; Crapo, James D; Criner, Gerard J; Curtis, Jeffrey L; Dass, Chandra; Dirksen, Asger; Dransfield, Mark T; Edula, Goutham; Erikkson, Leif; Friedlander, Adam; Galperin-Aizenberg, Maya; Gefter, Warren B; Gierada, David S; Grenier, Philippe A; Goldin, Jonathan; Han, MeiLan K; Hanania, Nicola A; Hansel, Nadia N; Jacobson, Francine L; Kauczor, Hans-Ulrich; Kinnula, Vuokko L; Lipson, David A; Lynch, David A; MacNee, William; Make, Barry J; Mamary, A James; Mann, Howard; Marchetti, Nathaniel; Mascalchi, Mario; McLennan, Geoffrey; Murphy, James R; Naidich, David; Nath, Hrudaya; Newell, John D; Pistolesi, Massimo; Regan, Elizabeth A; Reilly, John J; Sandhaus, Robert; Schroeder, Joyce D; Sciurba, Frank; Shaker, Saher; Sharafkhaneh, Amir; Silverman, Edwin K; Steiner, Robert M; Strange, Charlton; Sverzellati, Nicola; Tashjian, Joseph H; van Beek, Edwin J R; Washington, Lacey; Washko, George R; Westney, Gloria; Wood, Susan A; Woodruff, Prescott G
2012-04-01
The purposes of this study were: to describe chest CT findings in normal non-smoking controls and cigarette smokers with and without COPD; to compare the prevalence of CT abnormalities with severity of COPD; and to evaluate concordance between visual and quantitative chest CT (QCT) scoring. Volumetric inspiratory and expiratory CT scans of 294 subjects, including normal non-smokers, smokers without COPD, and smokers with GOLD Stage I-IV COPD, were scored at a multi-reader workshop using a standardized worksheet. There were 58 observers (33 pulmonologists, 25 radiologists); each scan was scored by 9-11 observers. Interobserver agreement was calculated using kappa statistic. Median score of visual observations was compared with QCT measurements. Interobserver agreement was moderate for the presence or absence of emphysema and for the presence of panlobular emphysema; fair for the presence of centrilobular, paraseptal, and bullous emphysema subtypes and for the presence of bronchial wall thickening; and poor for gas trapping, centrilobular nodularity, mosaic attenuation, and bronchial dilation. Agreement was similar for radiologists and pulmonologists. The prevalence on CT readings of most abnormalities (e.g. emphysema, bronchial wall thickening, mosaic attenuation, expiratory gas trapping) increased significantly with greater COPD severity, while the prevalence of centrilobular nodularity decreased. Concordances between visual scoring and quantitative scoring of emphysema, gas trapping and airway wall thickening were 75%, 87% and 65%, respectively. Despite substantial inter-observer variation, visual assessment of chest CT scans in cigarette smokers provides information regarding lung disease severity; visual scoring may be complementary to quantitative evaluation.
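The interobserver-agreement analysis above relies on the kappa statistic. As a minimal sketch, the two-reader (Cohen) form is computed below; the study pooled 9-11 observers per scan, for which a multi-reader generalization such as Fleiss' kappa would apply, and the example scores are invented.

```python
import numpy as np

def cohens_kappa(r1, r2):
    """Chance-corrected agreement between two readers' categorical ratings."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    po = np.mean(r1 == r2)                              # observed agreement
    pe = sum(np.mean(r1 == c) * np.mean(r2 == c)        # expected by chance
             for c in np.union1d(r1, r2))
    return (po - pe) / (1.0 - pe)

# Invented presence/absence scores for emphysema from two readers, 10 scans.
print(round(cohens_kappa([1, 1, 0, 1, 0, 0, 1, 1, 0, 1],
                         [1, 0, 0, 1, 0, 1, 1, 1, 0, 1]), 2))
```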
Using normalization 3D model for automatic clinical brain quantitative analysis and evaluation
NASA Astrophysics Data System (ADS)
Lin, Hong-Dun; Yao, Wei-Jen; Hwang, Wen-Ju; Chung, Being-Tau; Lin, Kang-Ping
2003-05-01
Functional medical imaging, such as PET or SPECT, is capable of revealing physiological functions of the brain and has been broadly used in diagnosing brain disorders by clinically quantitative analysis for many years. In routine procedures, physicians manually select desired ROIs from structural MR images and then obtain physiological information from the correspondent functional PET or SPECT images. The accuracy of quantitative analysis thus relies on that of the subjectively selected ROIs. Therefore, standardizing the analysis procedure is fundamental and important in improving the analysis outcome. In this paper, we propose and evaluate a normalization procedure with a standard 3D brain model to achieve precise quantitative analysis. In the normalization process, the mutual information registration technique was applied for realigning functional medical images to standard structural medical images. Then, the standard 3D brain model, which shows well-defined brain regions, was used to replace the manual ROIs in the objective clinical analysis. To validate the performance, twenty cases of I-123 IBZM SPECT images were used in practical clinical evaluation. The results show that the quantitative analysis outcomes obtained from this automated method agree with the clinical diagnosis evaluation score, with less than 3% error on average. To sum up, the method takes advantage of the well-defined standard 3D brain model to obtain precise VOI information automatically, sparing the manual, slice-by-slice drawing of ROIs on structural medical images required in the traditional procedure. That is, the method can not only provide precise analysis results but also improve the processing rate for large volumes of medical images in clinical practice.
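The mutual information criterion used in the registration step can be estimated from the joint intensity histogram of the two images; registration then searches for the transform of the functional image that maximizes it against the structural image. A minimal sketch (bin count and test images illustrative):

```python
import numpy as np

def mutual_information(img_a, img_b, bins=64):
    """MI of two images of equal shape, from their joint intensity histogram."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)        # marginal of image A
    py = pxy.sum(axis=0, keepdims=True)        # marginal of image B
    nz = pxy > 0                               # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

# A noisy copy of an image scores high MI; misaligned images score lower.
rng = np.random.default_rng(0)
a = rng.normal(size=(128, 128))
print(round(mutual_information(a, a + 0.2 * rng.normal(size=a.shape)), 2))
```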
Wang, Lei; Pedersen, Peder C; Strong, Diane M; Tulu, Bengisu; Agu, Emmanuel; Ignotz, Ron; He, Qian
2015-08-07
For individuals with type 2 diabetes, foot ulcers represent a significant health issue. The aim of this study is to design and evaluate a wound assessment system to help wound clinics assess patients with foot ulcers in a way that complements their current visual examination and manual measurements of their foot ulcers. The physical components of the system consist of an image capture box, a smartphone for wound image capture and a laptop for analyzing the wound image. The wound image assessment algorithms calculate the overall wound area, color segmented wound areas, and a healing score, to provide a quantitative assessment of the wound healing status both for a single wound image and comparisons of subsequent images to an initial wound image. The system was evaluated by assessing foot ulcers for 12 patients in the Wound Clinic at University of Massachusetts Medical School. As performance measures, the Matthews correlation coefficient (MCC) value for the wound area determination algorithm tested on 32 foot ulcer images was 0.68. The clinical validity of our healing score algorithm relative to the experienced clinicians was measured by Krippendorff's alpha coefficient (KAC) and ranged from 0.42 to 0.81. Our system provides a promising real-time method for wound assessment based on image analysis. Clinical comparisons indicate that the optimized mean-shift-based algorithm is well suited for wound area determination. Clinical evaluation of our healing score algorithm shows its potential to provide clinicians with a quantitative method for evaluating wound healing status. © 2015 Diabetes Technology Society.
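The MCC reported above summarizes pixel-level agreement between the algorithmic and ground-truth wound areas; it is computed from the confusion counts as below (the counts here are invented for illustration).

```python
import math

def mcc(tp, tn, fp, fn):
    """Matthews correlation coefficient from pixel-level confusion counts."""
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0

# Invented counts: algorithm-segmented wound pixels vs. clinician outline.
print(round(mcc(tp=4200, tn=90000, fp=1500, fn=1900), 3))
```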
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1983-03-01
This Handbook deals with environmental characterization information for a range of energy-supply systems and provides supplementary information on environmental controls applicable to a select group of environmentally characterized energy systems. Environmental residuals, physical-resource requirements, and discussion of applicable standards are the principal information provided. The quantitative and qualitative data provided are useful for evaluating alternative policy and technical strategies and for assessing the environmental impact of facility siting, energy production, and environmental controls.
NASA Astrophysics Data System (ADS)
Prat, O. P.; Nelson, B. R.; Nickl, E.; Ferraro, R. R.
2017-12-01
This study evaluates the ability of different satellite-based precipitation products to capture daily precipitation extremes over the entire globe. The satellite products considered are the datasets belonging to the Reference Environmental Data Records (REDRs) program (PERSIANN-CDR, GPCP, CMORPH, AMSU-A,B Hydrologic bundle). These products provide long-term global records of daily adjusted Quantitative Precipitation Estimates (QPEs) that range from a 20-year (CMORPH-CDR) to a 35-year (PERSIANN-CDR, GPCP) record of daily adjusted global precipitation. The AMSU-A,B Hydro-bundle is an 11-year record of daily rain rate over land and ocean, snow cover and surface temperature over land, and sea ice concentration, cloud liquid water, and total precipitable water over ocean, among others. The evaluation also includes comparison with in-situ data sets at the daily scale from the Global Historical Climatology Network (GHCN-Daily), the Global Precipitation Climatology Centre (GPCC) gridded full data daily product, and the US Climate Reference Network (USCRN). In addition, while the products mentioned above only provide QPEs, the AMSU-A,B hydro-bundle provides additional hydrological information (precipitable water, cloud liquid water, snow cover, sea ice concentration). We will also present an analysis of those additional variables available from global satellite measurements and their relevance and complementarity in the context of long-term hydrological and climate studies.
Implementation of the Hammersmith Infant Neurological Exam in a High-Risk Infant Follow-Up Program
Maitre, Nathalie L; Chorna, Olena; Romeo, Domenico M; Guzzetta, Andrea
2017-01-01
Background: High-Risk Infant Follow-Up (HRIF) programs provide early identification and referral for treatment of neurodevelopmental delays and impairments. In these programs, a standardized neurological exam is a critical component of evaluation for clinical and research purposes. Implementation: To address the primary challenges of provider educational diversity and standardized documentation, we designed an approach to training and implementation of the Hammersmith Infant Neurological Exam (HINE) with pre-course materials, a workshop model, and adaptation of the electronic medical record. Provider completion and documentation of a neurologic exam were evaluated before and after HINE training. Conclusions: Standardized training and implementation of the HINE in a large HRIF program is feasible and effective and allows for quantitative evaluation of neurological findings and developmental trajectories. PMID:27765470
The Application of a Residual Risk Evaluation Technique Used for Expendable Launch Vehicles
NASA Technical Reports Server (NTRS)
Latimer, John A.
2009-01-01
This presentation provides a Residual Risk Evaluation Technique (RRET) developed by the Kennedy Space Center (KSC) Safety and Mission Assurance (S&MA) Launch Services Division. This technique is one of many procedures used by S&MA at KSC to evaluate residual risks for each Expendable Launch Vehicle (ELV) mission. RRET is a straightforward technique that incorporates the proven methodologies of risk management, fault tree analysis, and reliability prediction. RRET derives a system reliability impact indicator from the system baseline reliability and the system residual risk reliability values. The system reliability impact indicator provides a quantitative measure of the reduction in the system baseline reliability due to the identified residual risks associated with the designated ELV mission. An example is discussed to provide insight into the application of RRET.
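The abstract does not give the exact RRET formula; purely as an illustration, here is a sketch assuming the impact indicator is the fractional reduction of baseline reliability, which matches the qualitative description above.

```python
def reliability_impact(r_baseline: float, r_residual: float) -> float:
    """Fractional reduction in mission reliability attributable to residual risks.

    r_baseline: baseline system reliability (0-1)
    r_residual: system reliability after accounting for identified residual risks (0-1)
    NOTE: assumed illustrative form; the actual RRET formulation may differ.
    """
    return (r_baseline - r_residual) / r_baseline

# Example: a 0.98 baseline degraded to 0.95 -> roughly a 3% reliability impact
print(reliability_impact(0.98, 0.95))
```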
Towards Measurement of Confidence in Safety Cases
NASA Technical Reports Server (NTRS)
Denney, Ewen; Pai, Ganesh J.; Habli, Ibrahim
2011-01-01
Arguments in safety cases are predominantly qualitative. This is partly attributed to the lack of sufficient design and operational data necessary to measure the achievement of high-dependability targets, particularly for safety-critical functions implemented in software. The subjective nature of many forms of evidence, such as expert judgment and process maturity, also contributes to the overwhelming dependence on qualitative arguments. However, where data for quantitative measurements are systematically collected, quantitative arguments provide far greater benefit than qualitative arguments in assessing confidence in the safety case. In this paper, we propose a basis for developing and evaluating integrated qualitative and quantitative safety arguments based on the Goal Structuring Notation (GSN) and Bayesian Networks (BNs). The approach we propose identifies structures within GSN-based arguments where uncertainties can be quantified. BNs are then used to provide a means to reason about confidence in a probabilistic way. We illustrate our approach using a fragment of a safety case for an unmanned aerial system and conclude with some preliminary observations.
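To make the GSN-plus-BN idea concrete, here is a toy sketch propagating confidence from two evidence nodes to a goal by enumeration; all probabilities and the conditional table are invented for illustration and are not from the paper's case study.

```python
# Confidence in a GSN goal G supported by evidence items E1 and E2.
p_e1, p_e2 = 0.9, 0.8                  # confidence that each evidence item holds
p_g_given = {(True, True): 0.99,       # P(G | E1, E2), an invented table
             (True, False): 0.60,
             (False, True): 0.55,
             (False, False): 0.05}

p_goal = sum(p_g_given[(e1, e2)]
             * (p_e1 if e1 else 1 - p_e1)
             * (p_e2 if e2 else 1 - p_e2)
             for e1 in (True, False) for e2 in (True, False))
print(f"P(goal holds) = {p_goal:.3f}")
```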
A preliminary study of DTI Fingerprinting on stroke analysis.
Ma, Heather T; Ye, Chenfei; Wu, Jun; Yang, Pengfei; Chen, Xuhui; Yang, Zhengyi; Ma, Jingbo
2014-01-01
DTI (Diffusion Tensor Imaging) is a well-known MRI (Magnetic Resonance Imaging) technique that provides useful structural information about the human brain. However, quantitative measurement of the physiological variation among subtypes of ischemic stroke is not available. An automatic, quantitative method for DTI analysis would enhance the application of DTI in clinics. In this study, we proposed a DTI Fingerprinting technology to quantitatively analyze white matter tissue, which was applied to stroke classification. The TBSS (Tract Based Spatial Statistics) method was employed to generate masks automatically. To evaluate the clustering performance of the automatic method, lesion ROIs (Regions of Interest) were manually drawn on the DWI images as a reference. The results from DTI Fingerprinting were compared with those obtained from the reference ROIs. They indicate that DTI Fingerprinting could identify different states of ischemic stroke and has promising potential to provide a more comprehensive measure of the DTI data. Further development should be carried out to improve DTI Fingerprinting technology for clinical use.
Tian, Hua; Wang, Xueying; Shu, Gequn; Wu, Mingqiang; Yan, Nanhua; Ma, Xiaonan
2017-09-15
A mixture of hydrocarbon and carbon dioxide shows excellent cycle performance in the Organic Rankine Cycle (ORC) used for engine waste heat recovery, but unavoidable leakage in practical applications is a safety threat due to its flammability. In this work, a quantitative risk assessment system (QR-AS) is established to provide a general method of risk assessment for flammable working fluid leakage. The QR-AS covers three main aspects: analysis of concentration distribution based on CFD simulations, explosion risk assessment based on the TNT equivalent method, and risk mitigation based on the evaluation results. A typical case of a propane/carbon dioxide mixture leaking from an ORC is investigated to illustrate the application of QR-AS. Based on the assessment results, a proper ventilation speed, a safe mixture ratio, and locations for gas-detecting devices are proposed to ensure safety in case of leakage. The results reveal that the QR-AS is reliable for practical application and that the evaluation results can provide valuable guidance for the design of mitigation measures to improve the safety performance of ORC systems. Copyright © 2017 Elsevier B.V. All rights reserved.
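For readers unfamiliar with the TNT equivalent method used in the explosion-risk step, here is a minimal sketch; the yield factor and heat of combustion are typical literature values, not parameters reported in the paper.

```python
E_TNT = 4.68e6  # J/kg, specific blast energy of TNT

def tnt_equivalent_mass(m_fuel_kg: float, dHc_J_per_kg: float,
                        yield_factor: float = 0.05) -> float:
    """Equivalent TNT mass for a vapor-cloud explosion of leaked flammable fluid."""
    return yield_factor * m_fuel_kg * dHc_J_per_kg / E_TNT

# Example: 2 kg of leaked propane (heat of combustion ~46.3 MJ/kg)
print(tnt_equivalent_mass(2.0, 46.3e6))  # ~1 kg TNT at a 5% yield factor
```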
CASTIN: a system for comprehensive analysis of cancer-stromal interactome.
Komura, Daisuke; Isagawa, Takayuki; Kishi, Kazuki; Suzuki, Ryohei; Sato, Reiko; Tanaka, Mariko; Katoh, Hiroto; Yamamoto, Shogo; Tatsuno, Kenji; Fukayama, Masashi; Aburatani, Hiroyuki; Ishikawa, Shumpei
2016-11-09
The cancer microenvironment plays a vital role in cancer development and progression, and cancer-stromal interactions have been recognized as important targets for cancer therapy. However, identifying relevant and druggable cancer-stromal interactions is challenging due to the lack of quantitative methods to analyze the whole cancer-stromal interactome. We present CASTIN (CAncer-STromal INteractome analysis), a novel framework for the evaluation of the cancer-stromal interactome from RNA-Seq data using cancer xenograft models. For each ligand-receptor interaction derived from a curated protein-protein interaction database, CASTIN summarizes gene expression profiles of cancer and stroma into three evaluation indices. These indices provide quantitative evaluation and comprehensive visualization of the interactome, and thus enable identification of critical cancer-microenvironment interactions, which are potential drug targets. We applied CASTIN to a pancreatic ductal adenocarcinoma dataset, successfully characterized the individual cancers in terms of cancer-stromal relationships, and identified both well-known and less-characterized druggable interactions. CASTIN provides a comprehensive view of the cancer-stromal interactome and is useful for identifying critical interactions that may serve as potential drug targets in the cancer microenvironment. CASTIN is available at: http://github.com/tmd-gpat/CASTIN .
Nie, Quandeng; Xu, Xiaoyi; Zhang, Qi; Ma, Yuying; Yin, Zheng; Shang, Luqing
2018-06-07
A three-dimensional quantitative structure-activity relationship (3D-QSAR) model of enterovirus A71 3C protease inhibitors was constructed in this study. The protein-ligand interaction fingerprint was analyzed to generate a pharmacophore model. A predictive and reliable 3D-QSAR model was built based on the Flexible Alignment of AutoGPA. Moreover, three novel compounds (I-III) were designed and evaluated for their biochemical activity against 3C protease and anti-enterovirus A71 activity in vitro. Compound III exhibited excellent inhibitory activity (IC50 = 0.031 ± 0.005 μM, EC50 = 0.036 ± 0.007 μM). Thus, this study provides a useful QSAR model for developing potent inhibitors of the enterovirus A71 3C protease. This article is protected by copyright. All rights reserved.
NASA Astrophysics Data System (ADS)
Nebogatikov, V.; Nikitiuk, A.; Konysheva, A.; Ignatyev, P.; Grishko, V.; Naimark, O.
2017-09-01
Quantitative phase microscopy is a new method for measuring and evaluating microlevel processes; it is characterized by high resolution and provides ample opportunity to quantitatively analyze various parameters, including those of specimens of biological matter. In this study, a laser interference microscope was used to evaluate the state of cancer cells (living and apoptotic). Apoptotic cancer cells were obtained by treating MCF-7 cells with a betulin-based α-bromomethyl ketone (BMK) derivative. Using the microscope, the main differences between living and apoptotic cells in morphometric parameters such as height, diameter, perimeter, area, and volume were appraised. Criteria that can be used as markers of apoptosis activation were identified.
Aslan, Kerim; Gunbey, Hediye Pinar; Tomak, Leman; Ozmen, Zafer; Incesu, Lutfi
The aim of this study was to investigate whether the use of quantitative metrics (mamillopontine distance [MPD], pontomesencephalic angle, and mesencephalon anterior-posterior/medial-lateral diameter ratio) in combination with qualitative signs (dural enhancement, subdural collections/hematoma, venous engorgement, pituitary gland enlargement, and tonsillar herniation) provides a more accurate diagnosis of intracranial hypotension (IH). The quantitative metrics and qualitative signs of 34 patients and 34 control subjects were assessed by 2 independent observers. Receiver operating characteristic (ROC) curves were used to evaluate the diagnostic performance of the quantitative metrics and qualitative signs, and optimum cutoff values of the quantitative metrics for the diagnosis of IH were found with ROC analysis. Combined ROC curves were computed for combinations of quantitative metrics and qualitative signs to determine diagnostic accuracy; sensitivity, specificity, and positive and negative predictive values were calculated, and the best-performing combination was identified. Whereas MPD and pontomesencephalic angle were significantly lower in patients with IH than in the control group (P < 0.001), the mesencephalon anterior-posterior/medial-lateral diameter ratio was significantly higher (P < 0.001). Among qualitative signs, the highest individual distinctive power was dural enhancement, with an area under the ROC curve (AUC) of 0.838. Among quantitative metrics, the highest individual distinctive power was MPD, with an AUC of 0.947. The best accuracy in the diagnosis of IH was obtained by the combination of dural enhancement, venous engorgement, and MPD, with an AUC of 1.00. This study showed that the combined use of dural enhancement, venous engorgement, and MPD had a diagnostic accuracy of 100% for the diagnosis of IH. Therefore, a more accurate IH diagnosis can be provided by combining quantitative metrics with qualitative signs.
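For context on the AUC values above, here is a sketch of how a single metric's AUC can be computed with the rank (Mann-Whitney) formulation; the orientation assumes lower values indicate disease, as reported for MPD.

```python
import numpy as np

def auc_lower_indicates_disease(patients: np.ndarray, controls: np.ndarray) -> float:
    """AUC as the probability that a random patient scores below a random control."""
    lower = patients[:, None] < controls[None, :]
    ties = patients[:, None] == controls[None, :]
    return float((lower.sum() + 0.5 * ties.sum()) / lower.size)

# mpd_patients, mpd_controls = ...  # measured mamillopontine distances (mm)
# print(auc_lower_indicates_disease(mpd_patients, mpd_controls))
```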
2017-01-01
Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to increased interest in using quantitative imaging biomarkers (QIBs) to measure changes in these features. Critical to the performance of a QIB in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in design, analysis methods, and the metrics used to assess a QIB for clinical use. It is therefore difficult or impossible to integrate results from different studies or to use reported results to design new studies. The Radiological Society of North America (RSNA) and the Quantitative Imaging Biomarker Alliance (QIBA), with technical, radiological, and statistical experts, developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of QIB performance studies so that results from multiple studies can be compared, contrasted, or combined. PMID:24919831
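Two of the metrology quantities named above are easy to state concretely; here is a sketch of bias against a known truth and of the test-retest repeatability coefficient (RC = 1.96 * sqrt(2) * wSD, approximately 2.77 * wSD), using the standard metrological definitions rather than code from the document.

```python
import numpy as np

def bias(measured: np.ndarray, truth: np.ndarray) -> float:
    """Mean measurement error against a known reference value."""
    return float(np.mean(measured - truth))

def repeatability_coefficient(test: np.ndarray, retest: np.ndarray) -> float:
    """95% of repeat differences are expected to fall within +/- RC."""
    wsd = np.sqrt(np.mean((test - retest) ** 2) / 2.0)  # within-subject SD
    return 2.77 * wsd
```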
Tree Testing of Hierarchical Menu Structures for Health Applications
Le, Thai; Chaudhuri, Shomir; Chung, Jane; Thompson, Hilaire J; Demiris, George
2014-01-01
To address the need for greater evidence-based evaluation of Health Information Technology (HIT) systems, we introduce a method of usability testing termed tree testing. In a tree test, participants are presented with an abstract hierarchical tree of the system taxonomy and asked to navigate through the tree to complete representative tasks. We apply tree testing to a commercially available health application, demonstrating a use case and providing a comparison with more traditional in-person usability testing methods. Online tree tests (N=54) and in-person usability tests (N=15) were conducted from August to September 2013. Tree testing provided a method to quantitatively evaluate the information structure of a system using various navigational metrics, including completion time, task accuracy, and path length. The results of the analyses compared favorably with the results of the traditional usability test. Tree testing provides a flexible, evidence-based approach for researchers to evaluate the information structure of HIT systems. In addition, remote tree testing provides a quick, flexible, and high-volume method of acquiring feedback in a structured format that allows for quantitative comparisons. With the diverse nature and often large quantities of health information available, addressing issues of terminology and concept classification during the early development of a health information system will improve navigation through the system and save future resources. Tree testing is a usability method that can be used to quickly and easily assess the information hierarchy of health information systems. PMID:24582924
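Here is a sketch of how the navigational metrics named above can be aggregated from per-task logs; the record layout is hypothetical, not the authors' logging format.

```python
from statistics import mean

# (task_completed, seconds, nodes_visited, optimal_path_length) per trial
trials = [(True, 42.0, 5, 4), (False, 67.5, 9, 4), (True, 35.2, 4, 4)]

accuracy   = mean(1.0 if ok else 0.0 for ok, *_ in trials)
mean_time  = mean(t for _, t, *_ in trials)
directness = mean(opt / visited for *_, visited, opt in trials)  # 1.0 = no backtracking
print(accuracy, mean_time, directness)
```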
Evaluation of Historical and Projected Agricultural Climate Risk Over the Continental US
NASA Astrophysics Data System (ADS)
Zhu, X.; Troy, T. J.; Devineni, N.
2016-12-01
Food demands are rising due to an increasing population with changing food preferences, which places pressure on agricultural systems. In addition, climate extremes in the past decade have highlighted the vulnerability of agricultural production to climate variability. Quantitative analyses in the climate-agriculture research field have been performed in many studies. However, climate risk remains difficult to evaluate at large scales, even though such evaluation shows great potential to help us better understand historical climate change impacts and assess future risk given climate projections. In this study, we developed a framework to evaluate climate risk quantitatively by applying statistical methods such as Bayesian regression, distribution fitting, and Monte Carlo simulation. We applied the framework over different climate regions in the continental US, both historically and for modeled climate projections. The relative importance of each major growing-season climate index, such as maximum dry period or heavy precipitation, was evaluated to determine which climate indices play a role in affecting crop yields. The statistical modeling framework was applied using county yields, with irrigated and rainfed yields separated to evaluate differences in risk. This framework provides estimates of the climate risk facing agricultural production in the near term that account for the full uncertainty of climate occurrences, the range of crop responses, and spatial correlation in climate. In particular, the method provides robust estimates of the importance of irrigation in mitigating agricultural climate risk. The results of this study can contribute to decision making about crop choice and water use in an uncertain climate.
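Here is a toy sketch of the regress-then-simulate logic described above: fit yield anomalies on one growing-season climate index, then Monte Carlo the index under a shifted climate. All coefficients and distributions are invented for illustration.

```python
import numpy as np
rng = np.random.default_rng(42)

index_hist = rng.normal(0.0, 1.0, 30)   # e.g., standardized maximum dry period
yield_hist = 1.2 - 0.35 * index_hist + rng.normal(0.0, 0.2, 30)

slope, intercept = np.polyfit(index_hist, yield_hist, 1)
index_future = rng.normal(0.5, 1.2, 10_000)  # hypothetical projected shift
yield_sim = intercept + slope * index_future
print("P(yield anomaly < 0) =", np.mean(yield_sim < 0.0))
```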
LABORATORY EVALUATION OF NONSTICK COATINGS TO REDUCE INK ADHESION TO PRINTING PRESS COMPONENTS
The report gives results of a project to identify surface coatings or preparations that might reduce and/or eliminate the need for solvent cleaning of ink-feed-system components of printing equipment. The study was designed to provide qualitative, but not quantitative, results. T...
ERIC Educational Resources Information Center
Brown, K. M.; Elliott, S. J.; Leatherdale, S. T.; Robertson-Wilson, J.
2015-01-01
The environments in which population health interventions occur shape both their implementation and outcomes. Hence, when evaluating these interventions, we must explore both intervention content and context. Mixed methods (integrating quantitative and qualitative methods) provide this opportunity. However, although criteria exist for establishing…
In the Face of Fallible AWE Feedback: How Do Students Respond?
ERIC Educational Resources Information Center
Bai, Lifang; Hu, Guangwei
2017-01-01
Automated writing evaluation (AWE) systems can provide immediate computer-generated quantitative assessments and qualitative diagnostic feedback on an enormous number of submitted essays. However, limited research attention has been paid to locally designed AWE systems used in English as a foreign language (EFL) classroom contexts. This study…
ERIC Educational Resources Information Center
Faupel-Badger, Jessica M.; Raue, Kimberley; Nelson, David E.; Tsakraklides, Sophia
2015-01-01
Published evaluations of career preparation of alumni from long-standing postdoctoral fellowship programs in the biomedical sciences are limited and often focus on quantitative analysis of data from extant publicly available sources. Qualitative methods provide the opportunity to gather robust information about specific program elements from…
Evaluating Sustainable Development Solutions Quantitatively: Competence Modelling for GCE and ESD
ERIC Educational Resources Information Center
Böhm, Marko; Eggert, Sabina; Barkmann, Jan; Bögeholz, Susanne
2016-01-01
To comprehensively address global environmental challenges such as biodiversity loss, citizens need an understanding of the socio-economic fundamentals of human behaviour in relation to natural resources. We argue that Global Citizenship Education and Education for Sustainable Development provide a core set of socio-economic competencies that can…
Environmental indices for common Michigan trees and shrubs.
Gary J. Brand
1985-01-01
Plants are indicators of environmental factors like moisture, nutrients, heat, and light. Semi-quantitative indices for these four factors were developed for 90 Michigan trees and shrubs. The indices and a tally of species present provide a simple evaluation of the environment of a forest stand and a useful management aid.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-06
... Office of Research and Development. EPA is releasing this draft assessment for the purposes of public... health assessment program that evaluates quantitative and qualitative risk information on effects that..., EPA provides the highest quality science-based human health assessments to support the Agency's...
Using laser scanning cytometry to measure PPAR-mediated peroxisome proliferation and beta oxidation.
Pruimboom-Brees, Ingrid M; Brees, Dominique J J E; Shen, Amy C; Keener, Mary; Francone, Omar; Amacher, David E; Loy, James K; Kerlin, Roy L
2005-01-01
Laser scanning cytometry (LSC) is a new technology that combines the properties and advantages of flow cytometry (FC) and immunohistochemistry (IHC), thus providing qualitative and quantitative information on protein expression with the additional perspective provided by cell and tissue localization. Formalin-fixed, paraffin embedded liver sections from rats exposed to a Peroxisome Proliferator Activated Receptor (PPAR) agonist were stained with antibodies against peroxisomal targeting signal-1 (PTS-1) (a highly conserved tripeptide contained within all peroxisomal enzymes), Acyl CoA oxidase (AOX) (the rate limiting enzyme of peroxisomal beta oxidation), and catalase (an inducible peroxisomal antioxidant enzyme) to evaluate peroxisomal beta oxidation, oxidative stress, and peroxisome proliferation. The LSC showed increased AOX, catalase, and PTS-1 expression in centrilobular hepatocytes that correlated favorably with the microscopic observation of centrilobular hepatocellular hypertrophy and with the palmitoyl CoA biochemical assay for peroxisomal beta oxidation, and provided additional morphologic information about peroxisome proliferation and tissue patterns of activation. Therefore, the LSC provides qualitative and quantitative evaluation of peroxisome activity with similar sensitivity but higher throughput than the traditional biochemical methods. The additional benefits of the LSC include the direct correlation between histopathologic observations and peroxisomal alterations and the potential utilization of archived formalin-fixed tissues from a variety of organs and species.
Geisler, B P; Widerberg, K F; Berghöfer, A; Willich, S N
2010-01-01
This paper's aim is to identify existing concepts and develop new concepts of organization, management, and leadership at a large European university hospital, and to evaluate whether mixed qualitative-quantitative methods with both internal and external input can provide helpful views of the possible future of large health care providers. Using the Delphi method in semi-structured, semi-quantitative interviews, with managers and employees as experts, the authors performed a vertical and a horizontal internal analysis. In addition, input from innovative faculties in other countries was obtained through structured power questions. These two sources were used to create three final scenarios, which were evaluated using traditional strategic planning methods. The analysis yielded a collaboration scenario, in which faculty and hospital are separated; a split scenario, which divides the organization into three independent hospitals; and a corporation scenario, in which corporate activities are bundled in three separate entities. In complex mergers of knowledge-driven organizations, the organization's own employees (in addition to external consultants) might be tapped as a knowledge resource for developing successful future business models. The paper uses a real-world consulting case to present a new set of methods for strategic planning in large health care provider organizations.
Wengert, G J; Helbich, T H; Woitek, R; Kapetas, P; Clauser, P; Baltzer, P A; Vogl, W-D; Weber, M; Meyer-Baese, A; Pinker, Katja
2016-11-01
To evaluate the inter- and intra-observer agreement of BI-RADS-based subjective visual estimation of the amount of fibroglandular tissue (FGT) with magnetic resonance imaging (MRI), and to investigate whether FGT assessment benefits from an automated, observer-independent, quantitative MRI measurement by comparing both approaches. Eighty women with no imaging abnormalities (BI-RADS 1 and 2) were included in this institutional review board (IRB)-approved prospective study. All women underwent unenhanced breast MRI. Four radiologists independently assessed FGT with MRI by subjective visual estimation according to BI-RADS. Automated observer-independent quantitative measurement of FGT with MRI was performed using a previously described measurement system. Inter- and intra-observer agreement of qualitative and quantitative FGT measurements was assessed using Cohen's kappa (k). Inexperienced readers achieved moderate inter- and intra-observer agreement, and experienced readers substantial inter-observer and perfect intra-observer agreement, for subjective visual estimation of FGT. Practice and experience reduced observer dependency. Automated observer-independent quantitative measurement of FGT was successfully performed and revealed only fair to moderate agreement (k = 0.209-0.497) with subjective visual estimations of FGT. Subjective visual estimation of FGT with MRI shows moderate intra- and inter-observer agreement, which can be improved by practice and experience. Automated observer-independent quantitative measurements of FGT are necessary to allow a standardized risk evaluation. • Subjective FGT estimation with MRI shows moderate intra-/inter-observer agreement in inexperienced readers. • Inter-observer agreement can be improved by practice and experience. • Automated observer-independent quantitative measurements can provide reliable and standardized assessment of FGT with MRI.
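For context on the agreement statistics reported above, here is a minimal sketch of Cohen's kappa for two raters assigning the four BI-RADS FGT categories (coded 0-3); the coding is illustrative.

```python
import numpy as np

def cohens_kappa(rater1: np.ndarray, rater2: np.ndarray, k: int = 4) -> float:
    """Cohen's kappa from two raters' categorical assignments (values 0..k-1)."""
    conf = np.zeros((k, k))
    for a, b in zip(rater1, rater2):
        conf[a, b] += 1
    n = conf.sum()
    p_obs = np.trace(conf) / n                          # observed agreement
    p_exp = (conf.sum(0) * conf.sum(1)).sum() / n**2    # chance agreement
    return float((p_obs - p_exp) / (1 - p_exp))
```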
NASA Astrophysics Data System (ADS)
Kahveci, Ajda
2010-07-01
In this study, multiple thematically based and quantitative analysis procedures were utilized to explore the effectiveness of Turkish chemistry and science textbooks in terms of their reflection of reform. The themes of gender equity, questioning level, science vocabulary load, and readability level provided the conceptual framework for the analyses. An unobtrusive research method, content analysis, was used by coding the manifest content and counting the frequency of words, photographs, drawings, and questions by cognitive level. The context was an undergraduate chemistry teacher preparation program at a large public university in a metropolitan area in northwestern Turkey. Forty preservice chemistry teachers were guided to analyze 10 middle school science and 10 high school chemistry textbooks. Overall, the textbooks included unfair gender representations, a considerably higher number of input- and processing-level than output-level questions, and a high load of science terminology. The textbooks failed to provide sufficient empirical evidence to be considered gender equitable and inquiry-based. The quantitative approach employed for evaluation contrasts with a more interpretive approach and has the potential to depict textbook profiles more reliably, complementing the commonly employed qualitative procedures. Further work in this line is needed to calibrate the analysis procedures with science textbooks used in different international settings. The procedures could be modified and improved to meet specific evaluation needs. In the Turkish context, a next step would be to analyze the science textbooks being rewritten for the reform-based curricula, to make cross-comparisons and evaluate possible progression.
A whole-cell bioreporter assay for quantitative genotoxicity evaluation of environmental samples.
Jiang, Bo; Li, Guanghe; Xing, Yi; Zhang, Dayi; Jia, Jianli; Cui, Zhisong; Luan, Xiao; Tang, Hui
2017-10-01
Whole-cell bioreporters have emerged as promising tools for genotoxicity evaluation, due to their rapidity, cost-effectiveness, sensitivity, and selectivity. In this study, a method for detecting genotoxicity in environmental samples was developed using the bioluminescent whole-cell bioreporter Escherichia coli recA::luxCDABE. To further test its performance in a real-world scenario, the E. coli bioreporter was applied in two cases: i) soil samples collected from chromium(VI)-contaminated sites; and ii) crude-oil-contaminated seawater collected after the Jiaozhou Bay oil spill, which occurred in 2013. The chromium(VI)-contaminated soils were pretreated by water extraction and exposed to the bioreporter in two phases: aqueous soil extract (water phase) and soil supernatant (solid phase). The results indicated that both extractable and soil-particle-fixed chromium(VI) were bioavailable to the bioreporter, and the solid-phase contact bioreporter assay provided a more precise evaluation of soil genotoxicity. For the crude-oil-contaminated seawater, the response of the bioreporter clearly illustrated the spatial and temporal changes in genotoxicity surrounding the spill site, suggesting that the crude oil degradation process decreased the genotoxic risk to the ecosystem. In addition, the performance of the bioreporter was simulated by a modified cross-regulation gene expression model, which quantitatively described the DNA damage response of the E. coli bioreporter. Accordingly, the bioluminescent response of the bioreporter was calculated as a mitomycin C equivalent, enabling quantitative comparison of genotoxicities between different environmental samples. This bioreporter assay provides a rapid and sensitive screening tool for direct genotoxicity assessment of environmental samples. Copyright © 2017. Published by Elsevier Ltd.
An anthropomorphic phantom for quantitative evaluation of breast MRI.
Freed, Melanie; de Zwart, Jacco A; Loud, Jennifer T; El Khouli, Riham H; Myers, Kyle J; Greene, Mark H; Duyn, Jeff H; Badano, Aldo
2011-02-01
In this study, the authors aim to develop a physical, tissue-mimicking phantom for quantitative evaluation of breast MRI protocols. The objective of this phantom is to address the need for improved standardization in breast MRI and provide a platform for evaluating the influence of image protocol parameters on lesion detection and discrimination. Quantitative comparisons between patient and phantom image properties are presented. The phantom is constructed using a mixture of lard and egg whites, resulting in a random structure with separate adipose- and glandular-mimicking components. T1 and T2 relaxation times of the lard and egg components of the phantom were estimated at 1.5 T from inversion recovery and spin-echo scans, respectively, using maximum-likelihood methods. The image structure was examined quantitatively by calculating and comparing spatial covariance matrices of phantom and patient images. A static, enhancing lesion was introduced by creating a hollow mold with stereolithography and filling it with a gadolinium-doped water solution. Measured phantom relaxation values fall within 2 standard errors of human values from the literature and are reasonably stable over 9 months of testing. Comparison of the covariance matrices of phantom and patient data demonstrates that the phantom and patient data have similar image structure. Their covariance matrices are the same to within error bars in the anterior-posterior direction and to within about two error bars in the right-left direction. The signal from the phantom's adipose-mimicking material can be suppressed using active fat-suppression protocols. A static, enhancing lesion can also be included with the ability to change morphology and contrast agent concentration. The authors have constructed a phantom and demonstrated its ability to mimic human breast images in terms of key physical properties that are relevant to breast MRI. This phantom provides a platform for the optimization and standardization of breast MRI imaging protocols for lesion detection and characterization.
Salisbury, Chris; Thomas, Clare; O'Cathain, Alicia; Rogers, Anne; Pope, Catherine; Yardley, Lucy; Hollinghurst, Sandra; Fahey, Tom; Lewis, Glyn; Large, Shirley; Edwards, Louisa; Rowsell, Alison; Segar, Julia; Brownsell, Simon; Montgomery, Alan A
2015-02-06
To develop a conceptual model for effective use of telehealth in the management of chronic health conditions, and to use this to develop and evaluate an intervention for people with two exemplar conditions: raised cardiovascular disease risk and depression. The model was based on several strands of evidence: a metareview and realist synthesis of quantitative and qualitative evidence on telehealth for chronic conditions; a qualitative study of patients' and health professionals' experience of telehealth; a quantitative survey of patients' interest in using telehealth; and review of existing models of chronic condition management and evidence-based treatment guidelines. Based on these evidence strands, a model was developed and then refined at a stakeholder workshop. Then a telehealth intervention ('Healthlines') was designed by incorporating strategies to address each of the model components. The model also provided a framework for evaluation of this intervention within parallel randomised controlled trials in the two exemplar conditions, and the accompanying process evaluations and economic evaluations. Primary care. The TElehealth in CHronic Disease (TECH) model proposes that attention to four components will offer interventions the best chance of success: (1) engagement of patients and health professionals, (2) effective chronic disease management (including subcomponents of self-management, optimisation of treatment, care coordination), (3) partnership between providers and (4) patient, social and health system context. Key intended outcomes are improved health, access to care, patient experience and cost-effective care. A conceptual model has been developed based on multiple sources of evidence which articulates how telehealth may best provide benefits for patients with chronic health conditions. It can be used to structure the design and evaluation of telehealth programmes which aim to be acceptable to patients and providers, and cost-effective. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.
Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco
2018-01-01
One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities, and it supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the tool hazard analysis and critical control points. This tool provides the possibility of finding the steps in an analytical procedure with the highest impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated critical control points, we conclude that the analytical methodology with the lowest risk of performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events caused by residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies require the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, this technique is challenging to implement in a quality control laboratory: it is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of future events, and their effects on method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and to assess and manage method performance risk. This paper discusses, considering the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool makes it possible to find the procedural steps with the highest impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the best alternative analytical methodology for residual cellular DNA analysis. © PDA, Inc. 2018.
[Reconstituting evaluation methods based on both qualitative and quantitative paradigms].
Miyata, Hiroaki; Okubo, Suguru; Yoshie, Satoru; Kai, Ichiro
2011-01-01
Debate about the relationship between quantitative and qualitative paradigms is often muddled and confusing, and the clutter of terms and arguments has resulted in the concepts becoming obscure and unrecognizable. In this study we conducted a content analysis of evaluation methods for qualitative healthcare research. We extracted descriptions of four types of evaluation paradigm (validity/credibility, reliability/dependability, objectivity/confirmability, and generalizability/transferability), and classified them into subcategories. In quantitative research, many evaluation methods have been based on qualitative paradigms, and vice versa. Thus, it may not be useful to treat the evaluation methods of the qualitative paradigm as isolated from those of quantitative methods. Choosing practical evaluation methods based on the situation and prior conditions of each study is an important approach for researchers.
2017-01-01
Chemical standardization, along with morphological and DNA analysis ensures the authenticity and advances the integrity evaluation of botanical preparations. Achievement of a more comprehensive, metabolomic standardization requires simultaneous quantitation of multiple marker compounds. Employing quantitative 1H NMR (qHNMR), this study determined the total isoflavone content (TIfCo; 34.5–36.5% w/w) via multimarker standardization and assessed the stability of a 10-year-old isoflavone-enriched red clover extract (RCE). Eleven markers (nine isoflavones, two flavonols) were targeted simultaneously, and outcomes were compared with LC-based standardization. Two advanced quantitative measures in qHNMR were applied to derive quantities from complex and/or overlapping resonances: a quantum mechanical (QM) method (QM-qHNMR) that employs 1H iterative full spin analysis, and a non-QM method that uses linear peak fitting algorithms (PF-qHNMR). A 10 min UHPLC-UV method provided auxiliary orthogonal quantitation. This is the first systematic evaluation of QM and non-QM deconvolution as qHNMR quantitation measures. It demonstrates that QM-qHNMR can account successfully for the complexity of 1H NMR spectra of individual analytes and how QM-qHNMR can be built for mixtures such as botanical extracts. The contents of the main bioactive markers were in good agreement with earlier HPLC-UV results, demonstrating the chemical stability of the RCE. QM-qHNMR advances chemical standardization by its inherent QM accuracy and the use of universal calibrants, avoiding the impractical need for identical reference materials. PMID:28067513
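The core quantitation step in qHNMR with an internal calibrant follows a standard relation between integrals, proton counts, molar masses, and weighed masses; here is a generic sketch (the variable names are not from the study).

```python
def qhnmr_content(I_a, I_cal, N_a, N_cal, M_a, M_cal, m_cal, m_sample, P_cal=1.0):
    """Analyte content (w/w fraction) from 1H NMR integrals.

    I = signal integral, N = number of protons under the signal,
    M = molar mass (g/mol), m = weighed mass (g), P_cal = calibrant purity.
    """
    return (I_a / I_cal) * (N_cal / N_a) * (M_a / M_cal) * (m_cal / m_sample) * P_cal
```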
Development of Nomarski microscopy for quantitative determination of surface topography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hartman, J. S.; Gordon, R. L.; Lessor, D. L.
1979-01-01
The use of Nomarski differential interference contrast (DIC) microscopy has been extended to provide nondestructive, quantitative analysis of a sample's surface topography. Theoretical modeling has determined the dependence of the image intensity on the microscope's optical components, the sample's optical properties, and the sample's surface orientation relative to the microscope. Results include expressions to allow the inversion of image intensity data to determine sample surface slopes. A commercial Nomarski system has been modified and characterized to allow the evaluation of the optical model. Data have been recorded with smooth, planar samples that verify the theoretical predictions.
Methods for collecting algal samples as part of the National Water-Quality Assessment Program
Porter, Stephen D.; Cuffney, Thomas F.; Gurtz, Martin E.; Meador, Michael R.
1993-01-01
Benthic algae (periphyton) and phytoplankton communities are characterized in the U.S. Geological Survey's National Water-Quality Assessment Program as part of an integrated physical, chemical, and biological assessment of the Nation's water quality. This multidisciplinary approach provides multiple lines of evidence for evaluating water-quality status and trends, and for refining an understanding of the factors that affect water-quality conditions locally, regionally, and nationally. Water quality can be characterized by evaluating the results of qualitative and quantitative measurements of the algal community. Qualitative periphyton samples are collected to develop a list of taxa present in the sampling reach. Quantitative periphyton samples are collected to measure algal community structure within selected habitats. These samples of benthic algal communities are collected from natural substrates, using the sampling methods that are most appropriate for the habitat conditions. Phytoplankton samples may be collected in large nonwadeable streams and rivers to meet specific program objectives. Estimates of algal biomass (chlorophyll content and ash-free dry mass) also are optional measures that may be useful for interpreting water-quality conditions. A nationally consistent approach provides guidance on site, reach, and habitat selection, as well as information on methods and equipment for qualitative and quantitative sampling. Appropriate quality-assurance and quality-control guidelines are used to maximize the ability to analyze data locally, regionally, and nationally.
Quantitative Imaging Biomarkers of NAFLD
Kinner, Sonja; Reeder, Scott B.
2016-01-01
Conventional imaging modalities, including ultrasonography (US), computed tomography (CT), and magnetic resonance (MR), play an important role in the diagnosis and management of patients with nonalcoholic fatty liver disease (NAFLD) by allowing noninvasive diagnosis of hepatic steatosis. However, conventional imaging modalities are limited as biomarkers of NAFLD for various reasons. Multi-parametric quantitative MRI techniques overcome many of the shortcomings of conventional imaging and allow comprehensive and objective evaluation of NAFLD. MRI can provide unconfounded biomarkers of hepatic fat, iron, and fibrosis in a single examination: a virtual biopsy has become a clinical reality. In this article, we review the utility and limitations of conventional US, CT, and MR imaging for the diagnosis of NAFLD. Recent advances in imaging biomarkers of NAFLD are also discussed, with an emphasis on multi-parametric quantitative MRI. PMID:26848588
Popelka, Stanislav; Stachoň, Zdeněk; Šašinka, Čeněk; Doležalová, Jitka
2016-01-01
The mixed research design is a progressive methodological discourse that combines the advantages of quantitative and qualitative methods. Its possibilities of application are, however, dependent on the efficiency with which the particular research techniques are used and combined. The aim of the paper is to introduce the possible combination of Hypothesis with EyeTribe tracker. The Hypothesis is intended for quantitative data acquisition and the EyeTribe is intended for qualitative (eye-tracking) data recording. In the first part of the paper, Hypothesis software is described. The Hypothesis platform provides an environment for web-based computerized experiment design and mass data collection. Then, evaluation of the accuracy of data recorded by EyeTribe tracker was performed with the use of concurrent recording together with the SMI RED 250 eye-tracker. Both qualitative and quantitative results showed that data accuracy is sufficient for cartographic research. In the third part of the paper, a system for connecting EyeTribe tracker and Hypothesis software is presented. The interconnection was performed with the help of developed web application HypOgama. The created system uses open-source software OGAMA for recording the eye-movements of participants together with quantitative data from Hypothesis. The final part of the paper describes the integrated research system combining Hypothesis and EyeTribe.
Quantitative methods for analysing cumulative effects on fish migration success: a review.
Johnson, J E; Patterson, D A; Martins, E G; Cooke, S J; Hinch, S G
2012-07-01
It is often recognized, but seldom addressed, that a quantitative assessment of the cumulative effects, both additive and non-additive, of multiple stressors on fish survival would provide a more realistic representation of the factors that influence fish migration. This review presents a compilation of analytical methods applied to a well-studied fish migration, a more general review of quantitative multivariable methods, and a synthesis on how to apply new analytical techniques in fish migration studies. A compilation of adult migration papers from Fraser River sockeye salmon Oncorhynchus nerka revealed a limited number of multivariable methods being applied and the sub-optimal reliance on univariable methods for multivariable problems. The literature review of fisheries science, general biology and medicine identified a large number of alternative methods for dealing with cumulative effects, with a limited number of techniques being used in fish migration studies. An evaluation of the different methods revealed that certain classes of multivariable analyses will probably prove useful in future assessments of cumulative effects on fish migration. This overview and evaluation of quantitative methods gathered from the disparate fields should serve as a primer for anyone seeking to quantify cumulative effects on fish migration survival. © 2012 The Authors. Journal of Fish Biology © 2012 The Fisheries Society of the British Isles.
Comparison of two laboratory-based systems for evaluation of halos in intraocular lenses
Alexander, Elsinore; Wei, Xin; Lee, Shinwook
2018-01-01
Purpose: Multifocal intraocular lenses (IOLs) can be associated with unwanted visual phenomena, including halos. Predicting the potential for halos is desirable when designing new multifocal IOLs. Halo images from 6 IOL models were compared using the Optikos modulation transfer function bench system and a new high dynamic range (HDR) system. Materials and methods: One monofocal, 1 extended depth of focus, and 4 multifocal IOLs were evaluated. An off-the-shelf optical bench was used to simulate a distant (>50 m) car headlight and record images. A custom HDR system was constructed using an imaging photometer to simulate headlight images and to measure quantitative halo luminance data. A metric was developed to characterize halo luminance properties. Clinical relevance was investigated by correlating halo measurements to visual outcomes questionnaire data. Results: The Optikos system produced halo images useful for visual comparisons; however, measurements were relative and not quantitative. The HDR halo system provided objective and quantitative measurements used to create a metric from the area under the curve (AUC) of the logarithmic normalized halo profile. This proposed metric differentiated between IOL models, and linear regression analysis found strong correlations between AUC and subjective clinical ratings of halos. Conclusion: The HDR system produced quantitative, preclinical metrics that correlated with patients' subjective perception of halos. PMID:29503526
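One illustrative reading of the AUC metric described above: integrate the log of the peak-normalized halo luminance profile over radius. The exact normalization used by the authors may differ.

```python
import numpy as np

def halo_auc(radius_deg: np.ndarray, luminance: np.ndarray) -> float:
    """Area under the log10 peak-normalized halo luminance profile."""
    profile = np.log10(luminance / luminance.max())
    return float(np.trapz(profile, radius_deg))
```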
de Certaines, J D; Henriksen, O; Spisni, A; Cortsen, M; Ring, P B
1993-01-01
Quantitative magnetic resonance imaging may offer unique potential for tissue characterization in vivo. In this connection, texture analysis of quantitative MR images may be of special importance. Because evaluation of texture analysis requires large amounts of data, multicenter approaches become mandatory. Within the framework of the BME Concerted Action on Tissue Characterization by MRI and MRS, a pilot multicenter study was launched to evaluate the technical problems, including the comparability of relaxation time measurements carried out at the individual sites. Human brain, skeletal muscle, and liver were used as models. A total of 218 healthy volunteers were studied. Fifteen MRI scanners with field strengths ranging from 0.08 T to 1.5 T were included. Measurement accuracy was tested on the Eurospin relaxation time test object (TO5), and the obtained calibration curve was used for correction of the in vivo data. The results established that, by following a standardized procedure, comparable quantitative measurements can be obtained in vivo from a number of MR sites. The overall coefficient of variation in vivo was of the same order of magnitude as in ex vivo relaxometry. Thus, it is possible to carry out international multicenter studies on quantitative imaging, provided that quality control with respect to measurement accuracy and calibration of the MR equipment is performed.
Zheng, Xiujuan; Wei, Wentao; Huang, Qiu; Song, Shaoli; Wan, Jieqing; Huang, Gang
2017-01-01
Objective and quantitative analysis of longitudinal single photon emission computed tomography (SPECT) images is important for treatment monitoring in brain disorders. Therefore, a computer-aided analysis (CAA) method is introduced to extract a change-rate map (CRM) as a parametric image quantifying the changes in regional cerebral blood flow (rCBF) across longitudinal SPECT brain images. The performance of the CAA-CRM approach in treatment monitoring is evaluated by computer simulations and clinical applications. The results of the computer simulations show that the derived CRMs have high similarity with their ground truths when the lesion size is larger than the system's spatial resolution and the change rate is higher than 20%. In clinical applications, the CAA-CRM approach is used to assess the treatment of 50 patients with brain ischemia. The results demonstrate that the CAA-CRM approach has 93.4% accuracy in localizing recovered regions. Moreover, the quantitative indexes of recovered regions derived from CRM are all significantly different among the groups and highly correlated with the experienced clinical diagnosis. In conclusion, the proposed CAA-CRM approach provides a convenient solution for generating a parametric image and deriving quantitative indexes from longitudinal SPECT brain images for treatment monitoring.
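The core of a change-rate map can be sketched as a voxelwise relative difference between co-registered, intensity-normalized volumes; the paper's full CRM pipeline may include additional steps, so this shows only the central idea.

```python
import numpy as np

def change_rate_map(baseline: np.ndarray, followup: np.ndarray,
                    brain_mask: np.ndarray, eps: float = 1e-6) -> np.ndarray:
    """Voxelwise rCBF change rate between two longitudinal SPECT volumes."""
    crm = np.zeros_like(baseline, dtype=float)
    crm[brain_mask] = (followup[brain_mask] - baseline[brain_mask]) / \
                      (baseline[brain_mask] + eps)
    return crm
```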
Extensions and evaluations of a general quantitative theory of forest structure and dynamics
Enquist, Brian J.; West, Geoffrey B.; Brown, James H.
2009-01-01
Here, we present the second part of a quantitative theory for the structure and dynamics of forests under demographic and resource steady state. The theory is based on individual-level allometric scaling relations for how trees use resources, fill space, and grow. These scale up to determine emergent properties of diverse forests, including size–frequency distributions, spacing relations, canopy configurations, mortality rates, population dynamics, successional dynamics, and resource flux rates. The theory uniquely makes quantitative predictions for both stand-level scaling exponents and normalizations. We evaluate these predictions by compiling and analyzing macroecological datasets from several tropical forests. The close match between theoretical predictions and data suggests that forests are organized by a set of very general scaling rules. Our mechanistic theory is based on allometric scaling relations, is complementary to “demographic theory,” but is fundamentally different in approach. It provides a quantitative baseline for understanding deviations from predictions due to other factors, including disturbance, variation in branching architecture, asymmetric competition, resource limitation, and other sources of mortality, which are not included in the deliberately simplified theory. The theory should apply to a wide range of forests despite large differences in abiotic environment, species diversity, and taxonomic and functional composition. PMID:19363161
Primary production in the Delta: Then and now
Cloern, James E.; Robinson, April; Richey, Amy; Grenier, Letitia; Grossinger, Robin; Boyer, Katharyn E.; Burau, Jon; Canuel, Elizabeth A.; DeGeorge, John F.; Drexler, Judith Z.; Enright, Chris; Howe, Emily R.; Kneib, Ronald; Mueller-Solger, Anke; Naiman, Robert J.; Pinckney, James L.; Safran, Samuel M.; Schoellhamer, David H.; Simenstad, Charles A.
2016-01-01
To evaluate the role of restoration in the recovery of the Delta ecosystem, we need to have clear targets and performance measures that directly assess ecosystem function. Primary production is a crucial ecosystem process, which directly limits the quality and quantity of food available for secondary consumers such as invertebrates and fish. The Delta has a low rate of primary production, but it is unclear whether this was always the case. Recent analyses from the Historical Ecology Team and Delta Landscapes Project provide quantitative comparisons of the areal extent of 14 habitat types in the modern Delta versus the historical Delta (pre-1850). Here we describe an approach for using these metrics of land use change to: (1) produce the first quantitative estimates of how Delta primary production and the relative contributions from five different producer groups have been altered by large-scale drainage and conversion to agriculture; (2) convert these production estimates into a common currency so the contributions of each producer group reflect their food quality and efficiency of transfer to consumers; and (3) use simple models to discover how tidal exchange between marshes and open water influences primary production and its consumption. Application of this approach could inform Delta management in two ways. First, it would provide a quantitative estimate of how large-scale conversion to agriculture has altered the Delta's capacity to produce food for native biota. Second, it would provide restoration practitioners with a new approach—based on ecosystem function—to evaluate the success of restoration projects and gauge the trajectory of ecological recovery in the Delta region.
Reference condition approach to restoration planning
Nestler, J.M.; Theiling, C.H.; Lubinski, S.J.; Smith, D.L.
2010-01-01
Ecosystem restoration planning requires quantitative rigor to evaluate alternatives, define end states, report progress and perform environmental benefits analysis (EBA). Unfortunately, existing planning frameworks are, at best, semi-quantitative. In this paper, we: (1) describe a quantitative restoration planning approach based on a comprehensive but simple mathematical framework that can be used to effectively apply knowledge and evaluate alternatives, (2) use the approach to derive a simple but precisely defined lexicon based on the reference condition concept and allied terms and (3) illustrate the approach with an example from the Upper Mississippi River System (UMRS) using hydrologic indicators. The approach supports the development of a scalable restoration strategy that, in theory, can be expanded to ecosystem characteristics such as hydraulics, geomorphology, habitat and biodiversity. We identify three reference condition types, best achievable condition (A_BAC), measured magnitude (MM_i, which can be determined at one or many times and places) and desired future condition (A_DFC), that, when used with the mathematical framework, provide a complete system of accounts useful for goal-oriented system-level management and restoration. Published in 2010 by John Wiley & Sons, Ltd.
Using a Smart Phone as a Standalone Platform for Detection and Monitoring of Pathological Tremors
Daneault, Jean-François; Carignan, Benoit; Codère, Carl Éric; Sadikot, Abbas F.; Duval, Christian
2013-01-01
Introduction: Smart phones are becoming ubiquitous and their computing capabilities are ever increasing. Consequently, more attention is geared toward their potential use in research and medical settings. For instance, their built-in hardware can provide quantitative data for different movements. Therefore, the goal of the current study was to evaluate the capabilities of a standalone smart phone platform to characterize tremor. Methods: A smart phone application for tremor quantification and online analysis was developed. Then, smart phone results were compared to those obtained simultaneously with a laboratory accelerometer. Finally, results from the smart phone were compared to clinical tremor assessments. Results: Algorithms for tremor recording and online analysis can be implemented within a smart phone. The smart phone provides reliable time- and frequency-domain tremor characteristics. The smart phone can also provide medically relevant tremor assessments. Discussion: Smart phones have the potential to provide researchers and clinicians with quantitative short- and long-term tremor assessments that are currently not easily available. PMID:23346053
Comparison of 18F-FDG PET/CT and PET/MRI in patients with multiple myeloma
Sachpekidis, Christos; Hillengass, Jens; Goldschmidt, Hartmut; Mosebach, Jennifer; Pan, Leyun; Schlemmer, Heinz-Peter; Haberkorn, Uwe; Dimitrakopoulou-Strauss, Antonia
2015-01-01
PET/MRI represents a promising hybrid imaging modality with several potential clinical applications. Although PET/MRI seems highly attractive in the diagnostic approach to multiple myeloma (MM), its role has not yet been evaluated. The aims of this prospective study are to evaluate the feasibility of 18F-FDG PET/MRI in the detection of MM lesions, and to investigate the reproducibility of bone marrow lesion detection and quantitative measures of 18F-FDG uptake between the functional (PET) component of PET/CT and PET/MRI in MM patients. The study includes 30 MM patients. All patients initially underwent 18F-FDG PET/CT (60 min p.i.), followed by PET/MRI (120 min p.i.). PET/CT and PET/MRI data were assessed and compared based on qualitative (lesion detection) and quantitative (SUV) evaluation. The hybrid PET/MRI system provided good image quality in all cases without artefacts. PET/MRI identified 65 of the 69 lesions that were detectable with PET/CT (94.2%). Quantitative PET evaluation showed the following mean values in MM lesions: SUVaverage=5.5 and SUVmax=7.9 for PET/CT; SUVaverage=3.9 and SUVmax=5.8 for PET/MRI. Both SUVaverage and SUVmax were significantly higher on PET/CT than on PET/MRI. Spearman correlation analysis demonstrated a strong correlation between both lesional SUVaverage (r=0.744) and lesional SUVmax (r=0.855) values derived from PET/CT and PET/MRI. Regarding detection of myeloma skeletal lesions, PET/MRI exhibited performance equivalent to PET/CT. In terms of tracer uptake quantitation, a significant correlation between the two techniques was demonstrated, despite the statistically significant differences in lesional SUVs between PET/CT and PET/MRI. PMID:26550538
Applications of Doppler ultrasound in clinical vascular disease
NASA Technical Reports Server (NTRS)
Barnes, R. W.; Hokanson, D. E.; Sumner, D. S.; Strandness, D. E., Jr.
1975-01-01
Doppler ultrasound has become the most useful and versatile noninvasive technique for objective evaluation of clinical vascular disease. Commercially available continuous-wave instruments provide qualitative and quantitative assessment of venous and arterial disease. Pulsed Doppler ultrasound was developed to provide longitudinal and transverse cross-sectional images of the arterial lumen with a resolution approaching that of conventional X-ray techniques. Application of Doppler ultrasound in venous, peripheral arterial, and cerebrovascular diseases is reviewed.
Towards SDS (Strategic Defense System) Testing and Evaluation: A collection of Relevant Topics
1989-07-01
the proof of the next. The Piton project is the first instance of stacking two verified components. In 1985 Warren...Accelerated? In the long term, a vast amount of work needs to be done. Below are some miscellaneous, fairly near-term projects which would seem to provide...and predictions for the current project. It provides a quantitative analysis of the environment and a model of the
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schmittroth, F.
1979-09-01
A documentation of the FERRET data analysis code is given. The code provides a way to combine related measurements and calculations in a consistent evaluation. Basically a very general least-squares code, it is oriented towards problems frequently encountered in nuclear data and reactor physics. A strong emphasis is on the proper treatment of uncertainties and correlations and on providing quantitative uncertainty estimates. Documentation includes a review of the method, the structure of the code, input formats, and examples.
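The core operation described here, combining related measurements with full covariance, reduces to generalized least squares. A minimal sketch under that reading of the abstract, with two correlated measurements of a single quantity (values and covariance invented, not FERRET's actual input format):

```python
import numpy as np

# Generalized least-squares combination of two correlated measurements y
# of the same quantity x:
#   x_hat = (1^T V^-1 y) / (1^T V^-1 1),   var(x_hat) = 1 / (1^T V^-1 1)
y = np.array([10.2, 9.6])          # hypothetical measurements
V = np.array([[0.25, 0.10],        # covariance matrix: variances on the
              [0.10, 0.16]])       # diagonal, correlation off-diagonal
ones = np.ones_like(y)
Vinv = np.linalg.inv(V)
w = Vinv @ ones / (ones @ Vinv @ ones)   # GLS weights (sum to 1)
x_hat = w @ y
var = 1.0 / (ones @ Vinv @ ones)
print(f"combined value = {x_hat:.3f} +/- {np.sqrt(var):.3f}")
```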
Chen, Jing; Wang, Shu-Mei; Meng, Jiang; Sun, Fei; Liang, Sheng-Wang
2013-05-01
To establish a new method for quality evaluation and validate its feasibility by simultaneous quantitative assay of five alkaloids in Sophora flavescens. The new quality evaluation method, quantitative analysis of multi-components by single marker (QAMS), was established and validated with S. flavescens. Five main alkaloids, oxymatrine, sophocarpine, matrine, oxysophocarpine and sophoridine, were selected as analytes to evaluate the quality of the rhizome of S. flavescens, and the relative correction factors showed good repeatability. Their contents in 21 batches of samples, collected from different areas, were determined by both the external standard method and QAMS. The method was evaluated by comparing the quantitative results of the external standard method and QAMS. No significant differences were found in the quantitative results for the five alkaloids in the 21 batches of S. flavescens determined by the two methods. It is feasible and suitable to evaluate the quality of the rhizome of S. flavescens by QAMS.
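The QAMS calculation itself is simple: relative correction factors are established from standards, then only the marker is quantified by external standard in each sample run and the other analytes follow from their peak-area ratios. A minimal sketch, with matrine assumed as the marker and all factors and areas invented:

```python
# QAMS sketch. With marker m and analyte i (A = peak area, C = concentration):
#   f_i = (A_i / C_i) / (A_m / C_m)                (from mixed standards)
#   C_i(sample) = A_i(sample) / (f_i * A_m(sample) / C_m(sample))

# Relative correction factors from a standard run (assumed values)
f = {"oxymatrine": 1.12, "sophocarpine": 0.95,
     "oxysophocarpine": 1.05, "sophoridine": 0.88}

# Sample run: marker (matrine, assumed) quantified by external standard
A_marker, C_marker = 1520.0, 0.410   # peak area, concentration in mg/mL
areas = {"oxymatrine": 2300.0, "sophocarpine": 640.0,
         "oxysophocarpine": 910.0, "sophoridine": 480.0}

response_m = A_marker / C_marker     # marker response factor in this run
for name, A in areas.items():
    C = A / (f[name] * response_m)   # concentration via QAMS
    print(f"{name}: {C:.3f} mg/mL")
```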
Fujiwara, Yasuhiro; Maruyama, Hirotoshi; Toyomaru, Kanako; Nishizaka, Yuri; Fukamatsu, Masahiro
2018-06-01
Magnetic resonance imaging (MRI) is widely used to detect carotid atherosclerotic plaques. Although it is important to evaluate vulnerable carotid plaques containing lipids and intra-plaque hemorrhages (IPHs) using T1-weighted images, the image contrast changes depending on the imaging settings. Moreover, to distinguish between a thrombus and a hemorrhage, it is useful to evaluate the iron content of the plaque using both T1-weighted and T2*-weighted images. Therefore, a quantitative evaluation of carotid atherosclerotic plaques using T1 and T2* values may be necessary for the accurate evaluation of plaque components. The purpose of this study was to determine whether the multi-echo phase-sensitive inversion recovery (mPSIR) sequence can improve T1 contrast while simultaneously providing accurate T1 and T2* values of an IPH. T1 and T2* values measured using mPSIR were compared to values from conventional methods in phantom and in vivo studies. In the phantom study, the T1 and T2* values estimated using mPSIR were linearly correlated with those of conventional methods. In the in vivo study, mPSIR demonstrated higher T1 contrast between the IPH phantom and sternocleidomastoid muscle than the conventional method. Moreover, the T1 and T2* values of the blood vessel wall and sternocleidomastoid muscle estimated using mPSIR were correlated with values measured by conventional methods and with values reported previously. The mPSIR sequence improved T1 contrast while simultaneously providing accurate T1 and T2* values of the neck region. Although further study is required to evaluate the clinical utility, mPSIR may improve carotid atherosclerotic plaque detection and provide detailed information about plaque components.
Using Mixed Methods and Collaboration to Evaluate an Education and Public Outreach Program (Invited)
NASA Astrophysics Data System (ADS)
Shebby, S.; Shipp, S. S.
2013-12-01
Traditional indicators (such as the number of participants or Likert-type ratings of participant perceptions) are often used to provide stakeholders with basic information about program outputs and to justify funding decisions. However, use of qualitative methods can strengthen the reliability of these data and provide stakeholders with more meaningful information about program challenges, successes, and ultimate impacts (Stern, Stame, Mayne, Forss, Davis & Befani, 2012). In this session, presenters will discuss how they used a mixed methods evaluation to determine the impact of an education and public outreach (EPO) program. EPO efforts were intended to foster more effective, sustainable, and efficient utilization of science discoveries and learning experiences through three main goals: 1) increase engagement and support by leveraging resources, expertise, and best practices; 2) organize a portfolio of resources for accessibility, connectivity, and strategic growth; and 3) develop an infrastructure to support coordination. The evaluation team used a mixed methods design to conduct the evaluation. Presenters will first discuss five potential benefits of mixed methods designs: triangulation of findings, development, complementarity, initiation, and value diversity (Greene, Caracelli & Graham, 1989). They will next demonstrate how a 'mix' of methods, including artifact collection, surveys, interviews, focus groups, and vignettes, was included in the EPO project's evaluation design, providing specific examples of how alignment between the program theory and the evaluation plan was best achieved with a mixed methods approach. The presentation will also include an overview of different mixed methods approaches and information about important considerations when using a mixed methods design, such as selection of data collection methods and sources, and the timing and weighting of quantitative and qualitative methods (Creswell, 2003). Ultimately, this presentation will provide insight into how a mixed methods approach was used to provide stakeholders with important information about progress toward program goals. Creswell, J. W. (2003). Research design: Qualitative, quantitative, and mixed methods approaches. Thousand Oaks, CA: Sage. Greene, J. C., Caracelli, V. J., & Graham, W. D. (1989). Toward a conceptual framework for mixed-method evaluation designs. Educational Evaluation and Policy Analysis, 11(3), 255-274. Stern, E., Stame, N., Mayne, J., Forss, K., Davis, R., & Befani, B. (2012). Broadening the range of designs and methods for impact evaluation. London: Department for International Development.
NASA Astrophysics Data System (ADS)
Gusev, E. V.; Mukhametzyanov, Z. R.; Razyapov, R. V.
2017-11-01
The limitations of existing methods for determining which technologically interlinked construction processes and activities can be combined are considered under the modern construction conditions of various facilities. The need to identify common parameters that characterize the interaction of all technologically related construction and installation processes and activities is shown. The technologies of construction and installation processes for buildings and structures were studied with the goal of determining a common parameter for evaluating the relationship between technologically interconnected processes and construction works. The result of this research was a quantitative evaluation of the interaction of construction and installation processes and activities: the minimum technologically necessary volume of a preceding process that allows one to plan and organize the execution of a subsequent, technologically interconnected process. This quantitative evaluation is used as the basis for calculating the optimal range over which processes and activities can be combined. The calculation method is based on graph theory. The authors applied a generic characterization parameter to reveal the technological links between construction and installation processes, and the proposed technique has adaptive properties that are key for its wide use in forming organizational decisions. The article describes the practical significance of the developed technique.
Does team training work? Principles for health care.
Salas, Eduardo; DiazGranados, Deborah; Weaver, Sallie J; King, Heidi
2008-11-01
Teamwork is integral to a working environment conducive to patient safety and care. Team training is one methodology designed to equip team members with the competencies necessary for optimizing teamwork. There is evidence of team training's effectiveness in highly complex and dynamic work environments, such as aviation and health care. However, most quantitative evaluations of training do not offer any insight into the actual reasons why, how, and when team training is effective. To address this gap in understanding, and to provide guidance for members of the health care community interested in implementing team training programs, this article presents both quantitative results and a specific qualitative review and content analysis of team training implemented in health care. Based on this review, we offer eight evidence-based principles for effective planning, implementation, and evaluation of team training programs specific to health care.
ERIC Educational Resources Information Center
Jitendra, Asha K.; Lein, Amy E.; Im, Soo-hyun; Alghamdi, Ahmed A.; Hefte, Scott B.; Mouanoutoua, John
2018-01-01
This meta-analysis is the first to provide a quantitative synthesis of empirical evaluations of mathematical intervention programs implemented in secondary schools for students with learning disabilities and mathematics difficulties. Included studies used a treatment-control group design. A total of 19 experimental and quasi-experimental studies…
ERIC Educational Resources Information Center
Thomas, Sarah; Grimes, Darren
2003-01-01
Graduate apprenticeships in a British college's hospitality management course involved integration of key skills and National Vocational Qualifications units. Qualitative and quantitative data from seven students indicated they felt that integration enabled formal recognition of competency, provided valuable managerial experience, and facilitated…
ERIC Educational Resources Information Center
Levitt-Merin, Marta; Sutter, Sharon Kingdon
This final report provides a descriptive overview of three approaches which the Hawaii Demonstration Project initiated to reduce unintended teenage pregnancies. Project evaluation findings are summarized; both qualitative and quantitative data are presented for a comprehensive picture of the project and its input. Project limitations and successes…
50 CFR 600.310 - National Standard 1-Optimum Yield.
Code of Federal Regulations, 2012 CFR
2012-10-01
.... At the time a stock complex is established, the FMP should provide a full and explicit description of... complex. When indicator stock(s) are used, periodic re-evaluation of available quantitative or qualitative... sufficiently to allow rebuilding within an acceptable time frame (also see paragraph (j)(3)(ii) of this section...
50 CFR 600.310 - National Standard 1-Optimum Yield.
Code of Federal Regulations, 2010 CFR
2010-10-01
.... At the time a stock complex is established, the FMP should provide a full and explicit description of... complex. When indicator stock(s) are used, periodic re-evaluation of available quantitative or qualitative... sufficiently to allow rebuilding within an acceptable time frame (also see paragraph (j)(3)(ii) of this section...
50 CFR 600.310 - National Standard 1-Optimum Yield.
Code of Federal Regulations, 2011 CFR
2011-10-01
.... At the time a stock complex is established, the FMP should provide a full and explicit description of... complex. When indicator stock(s) are used, periodic re-evaluation of available quantitative or qualitative... sufficiently to allow rebuilding within an acceptable time frame (also see paragraph (j)(3)(ii) of this section...
50 CFR 600.310 - National Standard 1-Optimum Yield.
Code of Federal Regulations, 2014 CFR
2014-10-01
.... At the time a stock complex is established, the FMP should provide a full and explicit description of... complex. When indicator stock(s) are used, periodic re-evaluation of available quantitative or qualitative... sufficiently to allow rebuilding within an acceptable time frame (also see paragraph (j)(3)(ii) of this section...
50 CFR 600.310 - National Standard 1-Optimum Yield.
Code of Federal Regulations, 2013 CFR
2013-10-01
.... At the time a stock complex is established, the FMP should provide a full and explicit description of... complex. When indicator stock(s) are used, periodic re-evaluation of available quantitative or qualitative... sufficiently to allow rebuilding within an acceptable time frame (also see paragraph (j)(3)(ii) of this section...
The Fathering Indicators Framework: A Tool for Quantitative and Qualitative Analysis.
ERIC Educational Resources Information Center
Gadsden, Vivian, Ed.; Fagan, Jay, Ed.; Ray, Aisha, Ed.; Davis, James Earl, Ed.
The Fathering Indicators Framework (FIF) is an evaluation tool designed to help researchers, practitioners, and policymakers conceptualize, examine, and measure change in fathering behaviors in relation to child and family well-being. This report provides a detailed overview of the research and theory informing the development of the FIF. The FIF…
ERIC Educational Resources Information Center
Rivizzigno, Victoria L.
This exercise teaches undergraduate geography students to use the Lorenz Curve and the Index of Dissimilarity to assess the spatial distributions of the White, Black, and American Indian populations of the United States in 1980. Specific procedures for implementing the exercise are provided; solutions to the exercise are also included. Students…
Assessing Motivation To Read. Instructional Resource No. 14.
ERIC Educational Resources Information Center
Gambrell, Linda B.; And Others
The Motivation to Read Profile (MRP) is a public-domain instrument designed to provide teachers with an efficient and reliable way to assess reading motivation qualitatively and quantitatively by evaluating students' self-concept as readers and the value they place on reading. The MRP consists of two basic instruments: the Reading Survey (a…
ERIC Educational Resources Information Center
Calgary Univ. (Alberta).
This report describes a pilot energy conservation project in Grande Prairie (Alberta) School District No. 2357. Extensive data collection and analysis were undertaken to provide a sound, quantitative basis for evaluation of the program. Energy conserving measures requiring capital outlays were not considered. During the project, electric demand…
Relationships and Dropping Out: The Voice of At-Risk Youth
ERIC Educational Resources Information Center
Henry, Robert Thomas
2009-01-01
The practice of retaining students in grade has been studied, researched, discussed, criticized and yet it continues. Dropping out of school prior to graduation has been studied, researched, discussed, written about and continues to be practiced by our youth. Policymakers are often provided quantitative data to consider as they explore, evaluate,…
"Economics with Training Wheels": Using Blogs in Teaching and Assessing Introductory Economics
ERIC Educational Resources Information Center
Cameron, Michael P.
2012-01-01
Blogs provide a dynamic interactive medium for online discussion, consistent with communal constructivist pedagogy. The author of this article describes and evaluates a blog assignment used in the teaching and assessment of a small (40-60 students) introductory economics course. Using qualitative and quantitative data collected across four…
Reliability and precision of pellet-group counts for estimating landscape-level deer density
David S. deCalesta
2013-01-01
This study provides hitherto unavailable methodology for reliably and precisely estimating deer density within forested landscapes, enabling quantitative rather than qualitative deer management. Reliability and precision of the deer pellet-group technique were evaluated in 1 small and 2 large forested landscapes. Density estimates, adjusted to reflect deer harvest and...
The Toxicity Reference Database (ToxRefDB) is a publicly accessible resource that contains 40+ years of in vivo dose-response toxicological studies. ToxRefDB provides curated in vivo toxicity data for systematic evaluation of a continuously expanding catalog of chemicals, and co...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-21
... Environmental Assessment (NCEA) within the EPA Office of Research and Development (ORD). EPA is releasing this... Program is a human health assessment program that evaluates quantitative and qualitative risk information... IRIS Program, EPA provides the highest quality science-based human health assessments to support the...
ON THE QUANTITATIVE EVALUATION OF THE TERMINOLOGY OF A VOCABULARY.
ERIC Educational Resources Information Center
KRAVETS, L.G.
The creation of an industrial system of machine translation with automatic indexing of the translated materials presumes the development of dictionaries which provide for the identification of key words and word combinations, followed by their translation into the descriptors of the search language. Three signs which show that a given word is a…
Poeppl, Timm B.; Langguth, Berthold; Laird, Angela R.; Eickhoff, Simon B.
2016-01-01
Reproductive behavior is mandatory for conservation of species and mediated by a state of sexual arousal (SA), involving both complex mental processes and bodily reactions. An early neurobehavioral model of SA proposes cognitive, emotional, motivational, and autonomic components. In a comprehensive quantitative meta-analysis on previous neuroimaging findings, we provide here evidence for distinct brain networks underlying psychosexual and physiosexual arousal. Psychosexual (i.e., mental sexual) arousal recruits brain areas crucial for cognitive evaluation, top-down modulation of attention and exteroceptive sensory processing, relevance detection and affective evaluation, as well as regions implicated in the representation of urges and in triggering autonomic processes. In contrast, physiosexual (i.e., physiological sexual) arousal is mediated by regions responsible for regulation and monitoring of initiated autonomic processes and emotions and for somatosensory processing. These circuits are interconnected by subcortical structures (putamen and claustrum) that provide exchange of sensorimotor information and crossmodal processing between and within the networks. Brain deactivations may imply attenuation of introspective processes and social cognition, but be necessary to release intrinsic inhibition of SA. PMID:23674246
Model-assisted development of a laminography inspection system
NASA Astrophysics Data System (ADS)
Grandin, R.; Gray, J.
2012-05-01
Traditional computed tomography (CT) is an effective method of determining the internal structure of an object through non-destructive means; however, inspection of certain objects, such as those with planar geometries or with limited access, requires an alternate approach. One alternative is laminography, which has been the focus of a number of researchers in the past decade for both medical and industrial inspections. Many research efforts rely on geometrically simple analytical models, such as the Shepp-Logan phantom, for the development of their algorithms. Recent work at the Center for Non-Destructive Evaluation makes extensive use of a forward model, XRSIM, to study artifacts arising from the reconstruction method, the effects of complex geometries, and known issues such as high-density features on the laminography reconstruction process. The use of a model provides full knowledge of all aspects of the geometry and provides a means to quantitatively evaluate the impact of methods designed to reduce artifacts generated by the reconstruction methods or that result from the part geometry. We will illustrate the use of forward simulations to quantitatively assess reconstruction algorithm development and artifact reduction.
Pollen preservation and Quaternary environmental history in the southeastern United States
DOE Office of Scientific and Technical Information (OSTI.GOV)
Delcourt, P.A.; Delcourt, H.R.
Reconstructions of Quaternary environmental history based upon modern pollen/vegetation/climate calibrations are more tenable if the factors responsible for variation in pollen assemblages are evaluated. Examination of the state of preservation of Quaternary palynomorphs provides quantitative data concerning the degree of information loss due to alteration of pollen assemblages by syndepositional and post-depositional deterioration. The percentage, concentration, and influx values for total indeterminable pollen are useful criteria in providing an objective and quantitative basis for evaluating the comparability of pollen spectra within and between sites. Supporting data concerning sediment particle-size distribution, organic matter content, and the concentration, influx, and taxonomic composition of both determinable pollen and plant macrofossils aid in reconstructing past depositional environments. The potential is high for deterioration of pollen in sediments from the southeastern United States, although considerable variation is found in both kind and degree of deterioration between lacustrine and alluvial sites of different ages and in different latitudes. Modern analogs are a basis for late Quaternary environmental reconstructions when pollen deterioration has not significantly biased the information content of fossil pollen assemblages.
Olimpo, Jeffrey T.; Pevey, Ryan S.; McCabe, Thomas M.
2018-01-01
Course-based undergraduate research experiences (CUREs) provide an avenue for student participation in authentic scientific opportunities. Within the context of such coursework, students are often expected to collect, analyze, and evaluate data obtained from their own investigations. Yet, limited research has been conducted that examines mechanisms for supporting students in these endeavors. In this article, we discuss the development and evaluation of an interactive statistics workshop that was expressly designed to provide students with an open platform for graduate teaching assistant (GTA)-mentored data processing, statistical testing, and synthesis of their own research findings. Mixed methods analyses of pre/post-intervention survey data indicated a statistically significant increase in students’ reasoning and quantitative literacy abilities in the domain, as well as enhancement of student self-reported confidence in and knowledge of the application of various statistical metrics to real-world contexts. Collectively, these data reify an important role for scaffolded instruction in statistics in preparing emergent scientists to be data-savvy researchers in a globally expansive STEM workforce. PMID:29904549
Generalized PSF modeling for optimized quantitation in PET imaging.
Ashrafinia, Saeed; Mohy-Ud-Din, Hassan; Karakatsanis, Nicolas A; Jha, Abhinav K; Casey, Michael E; Kadrmas, Dan J; Rahmim, Arman
2017-06-21
Point-spread function (PSF) modeling offers the ability to account for resolution-degrading phenomena within the PET image generation framework. PSF modeling improves resolution and enhances contrast, but at the same time significantly alters image noise properties and induces an edge overshoot effect. Thus, studying the effect of PSF modeling on quantitation task performance can be very important. Frameworks explored in the past involved a dichotomy of PSF versus no-PSF modeling. By contrast, the present work focuses on quantitative performance evaluation of standard uptake value (SUV) PET images, while incorporating a wide spectrum of PSF models, including those that under- and over-estimate the true PSF, for the potential of enhanced quantitation of SUVs. The developed framework first analytically models the true PSF, considering a range of resolution degradation phenomena (including photon non-collinearity, inter-crystal penetration and scattering) as present in data acquisitions with modern commercial PET systems. In the context of oncologic liver FDG PET imaging, we generated 200 noisy datasets per image-set (with clinically realistic noise levels) using an XCAT anthropomorphic phantom with liver tumours of varying sizes. These were subsequently reconstructed using the OS-EM algorithm with varying PSF-modelled kernels. We focused on quantitation of both SUVmean and SUVmax, including assessment of contrast recovery coefficients, as well as noise-bias characteristics (including both image roughness and coefficient of variability), for different tumours/iterations/PSF kernels. It was observed that an overestimated PSF yielded more accurate contrast recovery for a range of tumours, and typically improved quantitative performance. For a clinically reasonable number of iterations, edge enhancement due to PSF modeling (especially due to an over-estimated PSF) was in fact seen to lower SUVmean bias in small tumours. Overall, the results indicate that exactly matched PSF modeling does not offer optimized PET quantitation, and that PSF overestimation may provide enhanced SUV quantitation. Furthermore, generalized PSF modeling may provide a valuable approach for quantitative tasks such as treatment-response assessment and prognostication.
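The central effect, how the kernel width assumed during reconstruction shifts contrast recovery, can be reproduced in miniature with a 1-D ML-EM loop. The sketch below is a toy under Gaussian-PSF, noise-free assumptions, not the authors' OS-EM/XCAT framework:

```python
import numpy as np

def gauss_kernel(sigma, n=21):
    # Normalized 1-D Gaussian PSF
    x = np.arange(n) - n // 2
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

def mlem(data, kernel, n_iter=50):
    # ML-EM with a shift-invariant (convolutional) system model
    est = np.ones_like(data)
    sens = np.convolve(np.ones_like(data), kernel, mode="same")
    for _ in range(n_iter):
        fwd = np.convolve(est, kernel, mode="same")
        ratio = data / np.maximum(fwd, 1e-12)
        est = est * (np.convolve(ratio, kernel, mode="same") / sens)
    return est

truth = np.full(200, 1.0)
truth[90:110] = 11.0                                        # hot "tumour"
data = np.convolve(truth, gauss_kernel(3.0), mode="same")   # blur by true PSF

for sigma in (2.0, 3.0, 4.0):          # under-, matched, over-estimated PSF
    rec = mlem(data, gauss_kernel(sigma))
    crc = (rec[90:110].mean() - 1.0) / 10.0   # contrast recovery coefficient
    print(f"assumed sigma = {sigma}: CRC = {crc:.2f}")
```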
Manned Versus Unmanned Risk and Complexity Considerations for Future Midsized X-Planes
NASA Technical Reports Server (NTRS)
Lechniak, Jason A.; Melton, John E.
2017-01-01
The objective of this work was to identify and estimate complexity and risks associated with the development and testing of new low-cost medium-scale X-plane aircraft primarily focused on air transport operations. The piloting modes evaluated for this task were manned, remotely piloted, and unmanned flight research programs. This analysis was conducted early in the data collection period for X-plane concept vehicles, before preliminary designs were complete. Over 50 different aircraft and system topics were used to evaluate the three piloting control modes. Expert group evaluations from a diverse set of pilots, engineers, and other experts at Aeronautics Research Mission Directorate centers within the National Aeronautics and Space Administration provided qualitative reasoning on the many issues surrounding the decisions regarding piloting modes. The group evaluations were numerically rated to evaluate each topic quantitatively and were used to provide independent criteria for vehicle complexity and risk. An Edwards Air Force Base instruction document emerged as a source of the effects found in our qualitative and quantitative data. The study showed that a manned aircraft was the best choice to align with test activities for transport aircraft flight research from a low-complexity and low-risk perspective. The study concluded that a manned aircraft option would minimize the risk and complexity to improve flight-test efficiency and bound the cost of the flight-test portion of the program. Several key findings and discriminators between the three modes are discussed in detail.
[Evaluation by case managers dementia : An explorative practice based study on types and content].
Ketelaar, Nicole A B M; Jukema, Jan S; van Bemmel, Marlies; Adriaansen, Marian J M; Smits, Carolien H M
2017-06-01
This practice-based explorative study aims to provide insight into the ways in which case managers shape and fill in the evaluation phase of their support of the informal care network of persons with dementia. A combination of quantitative and qualitative research methods was used. A group of 57 case managers of persons with dementia in three different organisational networks took part in this study. Results from the quantitative and qualitative data are organized into four themes: (1) attitude towards evaluation, (2) forms of evaluation, (3) implementation of evaluation and (4) content of evaluation. Case managers shape evaluation, and fill in its content, in different ways. The importance of interim and final evaluation is recognized, but is difficult to realize in a methodical way. Barriers experienced by the case managers include various factors associated with both clients and professionals. Case managers evaluate continuously and informally to assess whether the extent of their assistance is meeting the needs of the client and the informal network. Case managers do not use systematic evaluation to measure the quality of care they offer to persons with dementia and their caregivers. The findings call for a discussion, at the client level as well as the professional and societal levels, about the way case managers should evaluate their support.
Snijder, Mieke; Shakeshaft, Anthony; Wagemakers, Annemarie; Stephens, Anne; Calabria, Bianca
2015-11-21
Community development is a health promotion approach identified as having great potential to improve Indigenous health, because of its potential for extensive community participation. There has been no systematic examination of the extent of community participation in community development projects and little analysis of their effectiveness. This systematic review aims to identify the extent of community participation in community development projects implemented in Australian Indigenous communities, critically appraise the qualitative and quantitative methods used in their evaluation, and summarise their outcomes. Ten electronic peer-reviewed databases and two electronic grey literature databases were searched for relevant studies published between 1990 and 2015. The level of community participation and the methodological quality of the qualitative and quantitative components of the studies were assessed against standardised criteria. Thirty-one evaluation studies of community development projects were identified. Community participation varied between different phases of project development: it was generally high during project implementation, but low during the evaluation phase. For the majority of studies, methodological quality was low and the methods were poorly described. Although positive qualitative or quantitative outcomes were reported in all studies, only two studies reported statistically significant outcomes. Partnerships between researchers, community members and service providers have great potential to improve methodological quality and community participation when research skills and community knowledge are integrated to design, implement and evaluate community development projects. The methodological quality of studies evaluating Australian Indigenous community development projects is currently too weak to confidently determine the cost-effectiveness of community development projects in improving the health and wellbeing of Indigenous Australians. Higher quality studies evaluating community development projects would strengthen the evidence base.
Study On The Application Of CBERS-02B To Quantitative Soil Erosion Monitoring
NASA Astrophysics Data System (ADS)
Shi, Mingchang; Xu, Jing; Wang, Lei; Wang, Xiaoyun; Mu, Jing
2010-10-01
Currently, the reduction of soil erosion is an important prerequisite for achieving ecological security. Since real-time, quantitative evaluation of regional soil erosion plays a significant role in reducing soil erosion, soil erosion models are more and more widely used. Based on the RUSLE model, this paper carries out quantitative soil erosion monitoring in the Xi River Basin and its surrounding areas using CBERS-02B CCD, DEM, TRMM and other data. It also validates the monitoring results against remote sensing survey results from 2005. The monitoring results show that in 2009, the total amount of soil erosion in the study area was 1.94×10^6 t, the erosion area was 2055.2 km2 (54.06% of the total area), and the average soil erosion modulus was 509.7 t km-2 a-1. As a case using CBERS-02B data for quantitative soil erosion monitoring, this study provides experience with the application of CBERS-02B data to quantitative soil erosion monitoring and supports local soil erosion management.
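For reference, RUSLE is multiplicative per cell: A = R·K·L·S·C·P. The one-cell sketch below shows the arithmetic only; all factor values are placeholders, not values derived from the study's CBERS-02B, DEM or TRMM inputs.

```python
# Per-cell RUSLE sketch: A = R * K * L * S * C * P.
# All factor values below are illustrative placeholders.
R = 2800.0       # rainfall erosivity (e.g. derived from TRMM precipitation)
K = 0.032        # soil erodibility
L, S = 1.4, 2.1  # slope length and steepness factors (from the DEM)
C = 0.18         # cover-management factor (e.g. from CBERS-02B CCD imagery)
P = 0.8          # support practice factor

A = R * K * L * S * C * P   # soil loss per unit area per year
print(f"soil erosion modulus: {A:.1f} t ha-1 a-1")
print(f"                    = {A * 100:.1f} t km-2 a-1")
```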
The Comprehensive Benefit Evaluation of Take Shared Bicycles as Connecting to Public Transit
NASA Astrophysics Data System (ADS)
Zhang, J. Y.; Sun, H.; Li, P. F.; Li, C. C.
2017-10-01
Shared bicycles are an important way of connecting to public transport, yet few studies evaluate their effectiveness. Taking Beijing as an example, this paper evaluates the six types of travel combinations commonly used by city residents. The authors select four quantitative indicators: fuel consumption, economic cost, total time spent, and CO2 emission; and two qualitative indicators: degree of comfort and convenience. The relative efficiency of the quantitative indicators is obtained by data envelopment analysis (DEA) and fuzzification, and is then combined with the qualitative indicators in a fuzzy synthetic evaluation. It was found that the shared bicycle + subway + shared bicycle combination, and shared bicycles alone, have good comprehensive benefits for medium-distance travel. The findings also suggest that shared bicycle + subway + shared bicycle is the best choice for utilitarian trips. The conclusions not only provide suggestions for travellers selecting travel modes, but also indicate which factors can be adjusted to increase the proportion of green travel.
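The final aggregation step described here, combining DEA-derived efficiencies with qualitative scores, is a weighted fuzzy synthetic evaluation. A minimal sketch of that step only; the weights, membership degrees, and mode names are invented, and the DEA stage is assumed to have already produced the quantitative memberships:

```python
import numpy as np

# Fuzzy synthetic evaluation: combine indicator membership degrees under
# indicator weights. All numbers are illustrative, not the paper's data.
indicators = ["fuel", "cost", "time", "CO2", "comfort", "convenience"]
weights = np.array([0.20, 0.20, 0.20, 0.15, 0.10, 0.15])  # assumed weights

# Membership degrees in [0, 1] for two travel combinations; the first four
# are assumed to come from DEA efficiencies after fuzzification.
modes = {
    "bike + subway + bike": np.array([0.95, 0.85, 0.70, 0.95, 0.60, 0.75]),
    "car only":             np.array([0.30, 0.40, 0.90, 0.25, 0.90, 0.85]),
}
for name, m in modes.items():
    score = float(weights @ m)   # weighted-average fuzzy operator
    print(f"{name}: composite benefit = {score:.2f}")
```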
NIST Efforts to Quality-Assure Gunpowder Measurements
NASA Technical Reports Server (NTRS)
MacCrehan, William A.; Reardon, Michelle R.
2000-01-01
In the past few years, the National Institute of Standards and Technology (NIST) has been promoting the idea of quantitatively determining the additives in smokeless gunpowder using micellar capillary electrophoresis as a means of investigating the criminal use of handguns and pipe bombs. As a part of this effort, we have evaluated both supercritical fluid and ultrasonic solvent extractions for the quantitative recovery of nitroglycerin (NG), diphenylamine (DPA), N-nitrosodiphenylamine (NnDPA), and ethyl centralite (EC) from gunpowder. Recoveries were evaluated by repeat extraction and matrix spiking experiments. The final extraction protocol provides greater than 95 percent recoveries. To help other researchers validate their own analytical methods for additive determinations, NIST is exploring the development of a standard reference material, Additives in Smokeless Gunpowder. The evaluated method is being applied to two double-base (NG-containing) powders, one stabilized with diphenylamine and the other with ethyl centralite. As part of this reference material development effort, we are conducting an interlaboratory comparison exercise among the forensic and military gunpowder measurement community.
Student Experiments on the Effects of Dam Removal on the Elwha River
NASA Astrophysics Data System (ADS)
Sandland, T. O.; Grack Nelson, A. L.
2006-12-01
The National Center for Earth Surface Dynamics (NCED) is an NSF funded Science and Technology Center devoted to developing a quantitative, predictive science of the ecological and physical processes that define and shape rivers and river networks. The Science Museum of Minnesota's (SMM) Earthscapes River Restoration classes provide K-12 students, teachers, and the public opportunities to explore NCED concepts and, like NCED scientists, move from a qualitative to a quantitative understanding of river systems. During a series of classes, students work with an experimental model of the Elwha River in Washington State to gain an understanding of the processes that define and shape river systems. Currently, two large dams on the Elwha are scheduled for removal to restore salmon habitat. Students design different dam removal scenarios to test and make qualitative observations describing and comparing how the modeled system evolves over time. In a following session, after discussing the ambiguity of the previous session's qualitative data, student research teams conduct a quantitative experiment to collect detailed measurements of the system. Finally, students interpret, critique, and compare the data the groups collected and ultimately develop and advocate a recommendation for the "ideal" dam removal scenario. SMM is currently conducting a formative evaluation of River Restoration classes to improve their educational effectiveness and guide development of an educator's manual. As of August 2006, pre- and post-surveys have been administered to 167 students to gauge student learning and engagement. The surveys have found the program successful in teaching students why scientists use river models and what processes and phenomena are at work in river systems. Most notable is the increase in student awareness of sediment in river systems. A post-visit survey was also administered to 20 teachers who used the models in their classrooms. This survey provided feedback about teachers' experience with the program and will help inform the development of a future educator's manual. All teachers found the program to be effective at providing opportunities for students to make qualitative observations and most (95%) found the program effective at providing students opportunities to make quantitative measurements. A full summary of evaluation results will be shared at the meeting.
Nieuwsma, Jason A; Jackson, George L; DeKraai, Mark B; Bulling, Denise J; Cantrell, William C; Rhodes, Jeffrey E; Bates, Mark J; Ethridge, Keith; Lane, Marian E; Tenhula, Wendy N; Batten, Sonja V; Meador, Keith G
2014-12-01
Recognizing that clergy and spiritual care providers are a key part of mental health care systems, the Department of Veterans Affairs (VA) and Department of Defense (DoD) jointly examined chaplains' current and potential roles in caring for veterans and service members with mental health needs. Our aim was to evaluate the intersection of chaplain and mental health care practices in VA and DoD in order to determine if improvement is needed, and if so, to develop actionable recommendations as indicated by evaluation findings. A 38-member multidisciplinary task group partnered with researchers in designing, implementing, and interpreting a mixed methods study that included: 1) a quantitative survey of VA and DoD chaplains; and 2) qualitative interviews with mental health providers and chaplains. Quantitative: the survey included all full-time VA chaplains and all active duty military chaplains (n = 2,163 completed of 3,464 invited; 62 % response rate). Qualitative: a total of 291 interviews were conducted with mental health providers and chaplains during site visits to 33 VA and DoD facilities. Quantitative: the online survey assessed intersections between chaplaincy and mental health care and took an average of 37 min to complete. Qualitative: the interviews assessed current integration of mental health and chaplain services and took an average of 1 h to complete. When included on interdisciplinary mental health care teams, chaplains feel understood and valued (82.8-100 % of chaplains indicated this, depending on the team). However, findings from the survey and site visits suggest that integration of services is often lacking and can be improved. Closely coordinating with a multidisciplinary task group in conducting a mixed method evaluation of chaplain-mental health integration in VA and DoD helped to ensure that researchers assessed relevant domains and that findings could be rapidly translated into actionable recommendations.
Evaluation of TRMM Ground-Validation Radar-Rain Errors Using Rain Gauge Measurements
NASA Technical Reports Server (NTRS)
Wang, Jianxin; Wolff, David B.
2009-01-01
Ground-validation (GV) radar-rain products are often utilized for validation of the Tropical Rainfall Measuring Mission (TRMM) space-based rain estimates, and hence quantitative evaluation of the GV radar-rain product error characteristics is vital. This study uses quality-controlled gauge data to compare with TRMM GV radar rain rates in an effort to provide such error characteristics. The results show that significant differences between concurrent radar and gauge rain rates exist at various time scales ranging from 5 min to 1 day, despite a low overall long-term bias. However, the differences between the radar area-averaged rain rates and gauge point rain rates cannot be explained as due to radar error only. The error variance separation method is adapted to partition the variance of radar-gauge differences into the gauge area-point error variance and the radar rain estimation error variance. The results provide a relatively reliable quantitative uncertainty evaluation of TRMM GV radar rain estimates at various time scales, and are helpful for better understanding the differences between measured radar and gauge rain rates. It is envisaged that this study will contribute to better utilization of GV radar rain products to validate versatile space-based rain estimates from TRMM, as well as the proposed Global Precipitation Measurement mission and other satellites.
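The partition used here rests on the identity var(R - G) = var(radar estimation error) + var(gauge area-point error), assuming the two error sources are independent. A minimal synthetic sketch (all distributions and variances invented):

```python
import numpy as np

# Error variance separation: the observable variance of radar-minus-gauge
# differences is the sum of the radar estimation error variance and the
# gauge area-point error variance (errors assumed independent), so
#   var_radar_error = var(R - G) - var_area_point
rng = np.random.default_rng(0)
n = 5000
truth = rng.gamma(2.0, 2.0, n)             # "true" areal rain rates
radar = truth + rng.normal(0, 1.0, n)      # radar estimate of areal rain
gauge = truth + rng.normal(0, 0.6, n)      # point gauge vs areal truth

var_diff = np.var(radar - gauge)           # observable
var_area_point = 0.6 ** 2                  # e.g. from gauge-cluster analysis
var_radar = var_diff - var_area_point
print(f"var(R-G) = {var_diff:.2f}, inferred radar error var = {var_radar:.2f}")
```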
Nanoscale Structure of Type I Collagen Fibrils: Quantitative Measurement of D-spacing
Erickson, Blake; Fang, Ming; Wallace, Joseph M.; Orr, Bradford G.; Les, Clifford M.; Holl, Mark M. Banaszak
2012-01-01
This paper details a quantitative method to measure the D-periodic spacing of Type I collagen fibrils using Atomic Force Microscopy coupled with analysis using a 2D Fast Fourier Transform approach. Instrument calibration, data sampling and data analysis are all discussed, and comparisons of the data to the complementary methods of electron microscopy and X-ray scattering are made. Examples of the application of this new approach to the analysis of Type I collagen morphology in disease models of estrogen depletion and Osteogenesis Imperfecta are provided. We demonstrate that it is the D-spacing distribution, not the D-spacing mean, that showed statistically significant differences in estrogen depletion associated with early-stage Osteoporosis and in Osteogenesis Imperfecta. The ability to quantitatively characterize nanoscale morphological features of Type I collagen fibrils will provide important structural information regarding Type I collagen in many research areas, including tissue aging and disease, tissue engineering, and gene knockout studies. Furthermore, we also envision potential clinical applications including evaluation of tissue collagen integrity under the impact of diseases or drug treatments. PMID:23027700
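The measurement reduces to locating the fundamental spatial frequency of the fibril banding. A minimal 1-D sketch of the Fourier step (the paper works on 2D AFM images; the profile, calibration, and noise below are synthetic):

```python
import numpy as np

# Estimate the D-period of a synthetic fibril height profile by finding
# the dominant spatial frequency in its Fourier spectrum.
rng = np.random.default_rng(0)
nm_per_px = 1.95            # assumed AFM pixel calibration, nm per pixel
n = 2048
x = np.arange(n) * nm_per_px
d_true = 67.0               # nm, nominal collagen D-period
profile = np.sin(2 * np.pi * x / d_true) + 0.1 * rng.standard_normal(n)

spec = np.abs(np.fft.rfft(profile * np.hanning(n)))   # windowed spectrum
freqs = np.fft.rfftfreq(n, d=nm_per_px)               # cycles per nm
peak = np.argmax(spec[1:]) + 1                        # skip the DC bin
print(f"measured D-spacing: {1.0 / freqs[peak]:.1f} nm")
```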
Advances in Imaging Approaches to Fracture Risk Evaluation
Manhard, Mary Kate; Nyman, Jeffry S.; Does, Mark D.
2016-01-01
Fragility fractures are a growing problem worldwide, and current methods for diagnosing osteoporosis do not always identify individuals who require treatment to prevent a fracture and may misidentify those not at risk. Traditionally, fracture risk is assessed using dual-energy X-ray absorptiometry, which provides measurements of areal bone mineral density (BMD) at sites prone to fracture. Recent advances in imaging show promise in adding new information that could improve the prediction of fracture risk in the clinic. As reviewed herein, advances in quantitative computed tomography (QCT) predict hip and vertebral body strength; high-resolution peripheral QCT (HR-pQCT) and micro-magnetic resonance imaging (μMRI) assess the micro-architecture of trabecular bone; quantitative ultrasound (QUS) measures the modulus or tissue stiffness of cortical bone; and quantitative ultra-short echo time MRI methods quantify the concentrations of bound water and pore water in cortical bone, which reflect a variety of mechanical properties of bone. Each of these technologies provides unique characteristics of bone and may improve fracture risk diagnoses and reduce the prevalence of fractures by helping to guide treatment decisions. PMID:27816505
Evaluating biomarkers for prognostic enrichment of clinical trials.
Kerr, Kathleen F; Roth, Jeremy; Zhu, Kehao; Thiessen-Philbrook, Heather; Meisner, Allison; Wilson, Francis Perry; Coca, Steven; Parikh, Chirag R
2017-12-01
A potential use of biomarkers is to assist in prognostic enrichment of clinical trials, where only patients at relatively higher risk for an outcome of interest are eligible for the trial. We investigated methods for evaluating biomarkers for prognostic enrichment. We identified five key considerations when evaluating a biomarker and a screening threshold for prognostic enrichment: (1) clinical trial sample size, (2) calendar time to enroll the trial, (3) total patient screening costs and total per-patient trial costs, (4) generalizability of trial results, and (5) ethical evaluation of trial eligibility criteria. Items (1)-(3) are amenable to quantitative analysis. We developed the Biomarker Prognostic Enrichment Tool for evaluating biomarkers for prognostic enrichment at varying levels of screening stringency. We demonstrate that both modestly prognostic and strongly prognostic biomarkers can improve trial metrics using the Biomarker Prognostic Enrichment Tool, which is available as a webtool at http://prognosticenrichment.com and as a package for the R statistical computing platform. In some clinical settings, even biomarkers with modest prognostic performance can be useful for prognostic enrichment. In addition to the quantitative analysis provided by the Biomarker Prognostic Enrichment Tool, investigators must consider the generalizability of trial results and evaluate the ethics of trial eligibility criteria.
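Consideration (1) can be made concrete with a standard two-proportion sample-size formula: raising the screening threshold raises the event rate among enrollees and shrinks the required trial. The sketch below is illustrative arithmetic only, not the webtool's actual calculations; the event rates, effect size, and thresholds are assumed.

```python
import math

def n_per_arm(p0, rrr, alpha=0.05, power=0.80):
    # Two-proportion sample size per arm, two-sided alpha, for detecting a
    # relative risk reduction rrr from control event rate p0.
    z_a, z_b = 1.96, 0.8416    # normal quantiles for alpha=0.05, power=0.80
    p1 = p0 * (1 - rrr)
    pbar = (p0 + p1) / 2
    num = (z_a * math.sqrt(2 * pbar * (1 - pbar))
           + z_b * math.sqrt(p0 * (1 - p0) + p1 * (1 - p1))) ** 2
    return math.ceil(num / (p0 - p1) ** 2)

# Event rate among patients passing the biomarker screen rises with
# screening stringency (assumed values), shrinking the trial:
for threshold, event_rate in [("none", 0.10), ("top 50%", 0.16),
                              ("top 25%", 0.24)]:
    print(f"screen {threshold}: event rate {event_rate:.2f}, "
          f"n per arm = {n_per_arm(event_rate, rrr=0.30)}")
```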
Nondestructive evaluation using dipole model analysis with a scan type magnetic camera
NASA Astrophysics Data System (ADS)
Lee, Jinyi; Hwang, Jiseong
2005-12-01
Large structures such as nuclear power, thermal power, and chemical and petroleum refining plants are drawing interest with regard to the economics of extending component life, given the harsh environment created by high pressure, high temperature, and fatigue, the need to secure safety against corrosion, and operation beyond the designated life span. Therefore, technology that accurately calculates and predicts the degradation and defects of aging materials is extremely important. Among the methods available, nondestructive testing using magnetic methods is effective in predicting and evaluating defects on or near the surface of ferromagnetic structures. Estimating the distribution of magnetic field intensity is important for magnetic methods applied in industrial nondestructive evaluation. A magnetic camera provides the distribution of a quantitative magnetic field with homogeneous lift-off and spatial resolution. The distribution of the magnetic field can be interpreted by introducing a dipole model. This study proposed an algorithm for nondestructive evaluation using dipole model analysis with a scan-type magnetic camera. Numerical and experimental considerations of the quantitative evaluation of cracks of several sizes and shapes using magnetic field images from the magnetic camera were examined.
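The dipole model referred to here gives the leakage field of a small defect in closed form, B(r) = (mu0/4 pi)(3(m·r̂)r̂ - m)/|r|^3, which can then be sampled on the camera plane. A minimal sketch with an assumed moment and lift-off (not the paper's parameters):

```python
import numpy as np

# Point magnetic dipole field, B(r) = (mu0/4pi)(3(m.r_hat)r_hat - m)/|r|^3,
# sampled along a scan line at fixed lift-off like a magnetic camera row.
mu0 = 4e-7 * np.pi

def dipole_B(m, r):
    rn = np.linalg.norm(r)
    r_hat = r / rn
    return mu0 / (4 * np.pi) * (3 * np.dot(m, r_hat) * r_hat - m) / rn ** 3

m = np.array([0.0, 0.0, 1e-6])   # assumed dipole moment (A m^2), normal to surface
lift_off = 1e-3                  # assumed sensor lift-off, 1 mm

# Normal component Bz across the scan line (the quantity the camera images)
for x_mm in (-2, -1, 0, 1, 2):
    r = np.array([x_mm * 1e-3, 0.0, lift_off])
    Bz = dipole_B(m, r)[2]
    print(f"x = {x_mm:+d} mm: Bz = {Bz * 1e3:.4f} mT")
```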
Nimpagaritse, Manassé; Korachais, Catherine; Roberfroid, Dominique; Kolsteren, Patrick; Zine Eddine El Idrissi, Moulay Driss; Meessen, Bruno
2016-06-14
Malnutrition is a huge problem in Burundi. In order to improve the provision of services at hospital, health centre and community levels, the Ministry of Health is piloting the introduction of malnutrition prevention and care indicators within its performance-based financing (PBF) scheme. Paying for units of services and for qualitative indicators is expected to enhance the provision and quality of these nutrition services, as PBF has done, in Burundi and elsewhere, for several other services. This paper presents the protocol for the impact evaluation of the PBF scheme applied to malnutrition. The research design consists of a mixed methods model adopting a sequential explanatory design. The quantitative component is a cluster-randomized controlled evaluation design: among the 90 health centres selected for the study, half receive payment related to their results in malnutrition activities, while the other half get a budget allocation. Qualitative research will be carried out both during the intervention period and at the end of the quantitative evaluation. Data are collected from 1) baseline and follow-up surveys of 90 health centres and 6,480 households with children aged 6 to 23 months, 2) logbooks filled in weekly in health centres, and 3) in-depth interviews and focus group discussions. The evaluation aims to provide the best estimate of the impact of the project on malnutrition outcomes in the community as well as outputs at the health centre level (malnutrition care outputs), and to describe quantitatively and qualitatively the changes that took place (or did not take place) within health centres as a result of the program. Although PBF schemes are blooming in low-income countries, there is still a need for evidence, especially on the impact of revising the list of remunerated indicators. It is expected that this impact evaluation will be helpful for the national policy dialogue in Burundi, but it will also provide key evidence for countries with an existing PBF scheme that are confronted with malnutrition problems, on the appropriateness of extending the strategy to nutrition services. ClinicalTrials.gov PRS Identifier: NCT02721160; registered March 2016.
The clinical use of dynamic posturography in the elderly.
Shepard, N T
1989-12-01
We provide an overview of the clinical uses of dynamic posturography. Although the equipment described to perform this testing is expensive, the concepts, especially those for sensory organization, can be applied for about $20.00. To apply the six sensory organization conditions, one merely needs some way to disrupt proprioceptive information by maintaining ankle angle and a means of providing visual conflict stimuli. We found that proprioceptive information can be disrupted easily by asking the patient to stand on a thick (4-inch) dense piece of foam rubber like that used in cushions for furniture. Visual stabilization conflict can be provided by having the patient wear a 19- to 20-inch Japanese lantern with a head-mounting system in the center so that the patient's movements are not reflected in relative movements of the visual environment. With use of these two simple tools, the six sensory organization tests can be approximated in a clinical situation in a short time and can provide some relative information about a patient's postural control capabilities. With minor additional work, a quantitative measure of output that gives indications of the amount of anterior-posterior sway also can be provided. For elderly patients with a variety of problems ranging from general unsteadiness to frank vertigo, the risk of falling can be devastating, and it is important to provide a thorough investigation of the total balance system. The systematic investigation, qualitative or quantitative, of the integration of sensory input and motor outputs provides a dimension that typically has been lacking in the routine "dizzy patient workup" for all ages but especially for elderly patients. Therefore, the application of the postural maintenance theory with the above-described procedures or variations on these procedures appears to have a great deal of clinical relevance in the evaluation of patients with gait and balance disorders. These types of evaluations represent an adjunct or addition to the evaluation of the vestibular system and the vestibulo-ocular reflexes and by no means should be considered a substitute for that traditional evaluation. It is the combination of information that can provide the clinician with a more global picture of the entire balance system and its functional capabilities.
NASA Astrophysics Data System (ADS)
Bhave, Ajay; Dessai, Suraje; Conway, Declan; Stainforth, David
2016-04-01
Deep uncertainty in future climate change and socio-economic conditions necessitates the use of assess-risk-of-policy approaches over predict-then-act approaches for adaptation decision making. Robust Decision Making (RDM) approaches embody this principle and help evaluate the ability of adaptation options to satisfy stakeholder preferences under wide-ranging future conditions. This study involves the simultaneous application of two RDM approaches, one qualitative and one quantitative, in the Cauvery River Basin in Karnataka (population ~23 million), India. The study aims to (a) determine robust water resources adaptation options for the 2030s and 2050s and (b) compare the usefulness of a qualitative stakeholder-driven approach with a quantitative modelling approach. To develop a large set of future scenarios, a combination of climatic and socio-economic narratives was used: climatic narratives were developed through structured expert elicitation with a group of experts on the Indian Summer Monsoon, and socio-economic narratives were developed to reflect potential future urban and agricultural water demand. In the qualitative RDM approach, a stakeholder workshop helped elicit key vulnerabilities, water resources adaptation options and performance criteria for evaluating options. During a second workshop, stakeholders discussed and evaluated adaptation options against the performance criteria for a large number of scenarios of climatic and socio-economic change in the basin. In the quantitative RDM approach, a Water Evaluation And Planning (WEAP) model was forced by precipitation and evapotranspiration data, coherent with the climatic narratives, together with water demand data based on the socio-economic narratives. We find that, compared to business-as-usual conditions, options addressing urban water demand satisfy performance criteria across scenarios and provide co-benefits such as energy savings and reduced groundwater depletion, while options reducing agricultural water demand significantly affect downstream water availability. Water demand options demonstrate potential to improve environmental flow conditions and satisfy legal water supply requirements for downstream riparian states. On the other hand, currently planned large-scale infrastructural projects demonstrate reduced value in certain scenarios, illustrating the lock-in effects of large-scale infrastructure. From a methodological perspective, we find that while the stakeholder-driven approach revealed robust options in a resource-light manner and helped initiate much-needed interaction amongst stakeholders, the modelling approach provides complementary quantitative information. The study reveals robust adaptation options for this important basin and provides a strong methodological basis for future studies that support adaptation decision making.
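As an illustration of the scenario-robustness scoring such an RDM evaluation implies, the following Python sketch scores each adaptation option by the fraction of scenarios in which it satisfies every performance criterion. All names and numbers are hypothetical; in the study itself these outcomes come from stakeholder judgments and the WEAP model, not random draws.

```python
import numpy as np

# Hypothetical robustness scoring: an option is "robust" in a scenario if it
# satisfies every performance criterion there; its score is the fraction of
# scenarios it passes.
rng = np.random.default_rng(0)

n_options, n_scenarios, n_criteria = 4, 100, 3
# Simulated pass/fail outcomes (placeholders for model or stakeholder scores).
passed = rng.random((n_options, n_scenarios, n_criteria)) > 0.3

satisfies_all = passed.all(axis=2)        # meets all criteria in a scenario
robustness = satisfies_all.mean(axis=1)   # fraction of scenarios satisfied

for i, r in enumerate(robustness):
    print(f"option {i}: robust in {r:.0%} of scenarios")
```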
Advancing student nurse knowledge of the biomedical sciences: A mixed methods study.
Craft, Judy; Christensen, Martin; Bakon, Shannon; Wirihana, Lisa
2017-01-01
Nursing students' ability to learn, integrate and apply bioscience knowledge to their clinical practice remains a concern. To evaluate the implementation, influence, and student perspective of a team-teaching workshop to integrate bioscience theory with clinical nursing practice. The team-teaching workshop was offered prior to commencement of the university semester as a refresher course at an Australian university. This study employed a sequential explanatory mixed methods design incorporating both quantitative and qualitative items. An evaluation survey with quantitative and qualitative items and a focus group were employed. The qualitative data were analysed using a thematic approach. The quantitative data were combined with the emergent themes in the qualitative data. Participants were final year nursing students. Nine students attended the workshop. All students completed the evaluation (N=9) and 44.4% (N=4) attended the focus group. The results revealed six themes: (1) lectures are an inadequate teaching strategy for bioscience; (2) teaching strategies which incorporate active learning engage students; (3) the team-teaching workshop provides an effective learning environment; (4) the workshop content should be expanded; (5) pharmacology should relate to bioscience, and bioscience should relate to nursing; and (6) team-teaching was effective in integrating pharmacology with bioscience, and then translating this into nursing practice. Students felt there was a disconnect between pharmacology and bioscience, and between bioscience and nursing care, within their undergraduate studies. The workshop, which was based on team-teaching, bridged those gaps, utilised active learning strategies and provided an effective learning environment. Team-teaching that employs active learning strategies is an effective approach to assist nursing students to integrate bioscience knowledge into their nursing practice. Copyright © 2016 Elsevier Ltd. All rights reserved.
Hammer, Simone; Uller, Wibke; Manger, Florentine; Fellner, Claudia; Zeman, Florian; Wohlgemuth, Walter A
2017-01-01
Quantitative evaluation of hemodynamic characteristics of arteriovenous and venous malformations using time-resolved magnetic resonance angiography (MRA) at 3.0 Tesla. Time-resolved MRA with interleaved stochastic trajectories (TWIST) at 3.0 Tesla was studied in 83 consecutive patients with venous malformations (VM) and arteriovenous malformations (AVM). Enhancement characteristics were calculated as the percentage increase of signal intensity above baseline over time. Maximum percentage signal intensity increase (signal_max), the time interval between onset of arterial enhancement and lesion enhancement (t_onset), and the time interval between the beginning of lesion enhancement and maximum percentage of lesion enhancement (t_max) were analyzed. All AVMs showed a high-flow hemodynamic pattern. Two significantly different (p < 0.001) types of venous malformations emerged: VMs with arteriovenous fistulas (AVFs) (median signal_max 737%, IQR [interquartile range] = 511-1182%; median t_onset 5 s, IQR = 5-10 s; median t_max 35 s, IQR = 26-40 s) and without AVFs (median signal_max 284%, IQR = 177-432%; median t_onset 23 s, IQR = 15-30 s; median t_max 60 s, IQR = 55-75 s). Quantitative evaluation of time-resolved MRA at 3.0 Tesla provides hemodynamic characterization of vascular malformations. VMs can be subclassified into two hemodynamic subgroups according to the presence or absence of AVFs. • Time-resolved MRA at 3.0 Tesla provides quantitative hemodynamic characterization of vascular malformations. • Malformations differ significantly in time courses of enhancement and signal intensity increase. • AVMs show a distinctive high-flow hemodynamic pattern. • Two significantly different types of VMs emerged: VMs with and without AVFs.
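The three enhancement metrics named above can be computed from a lesion time-intensity curve, as in this sketch. The 20% threshold used to define lesion enhancement onset is an assumption for illustration, not the authors' stated criterion.

```python
import numpy as np

def enhancement_metrics(times, signal, t_arterial, onset_frac=0.2):
    """Percentage enhancement metrics from a lesion time-intensity curve.

    times      : acquisition times (s)
    signal     : mean lesion signal intensity per time point
    t_arterial : time of arterial enhancement onset (s)
    onset_frac : fraction of baseline rise defining lesion onset (assumed)
    """
    baseline = signal[0]
    pct = 100.0 * (signal - baseline) / baseline     # % increase above baseline

    signal_max = pct.max()
    onset_idx = np.argmax(pct > onset_frac * 100.0)  # first sample above threshold
    t_lesion_onset = times[onset_idx]

    t_onset = t_lesion_onset - t_arterial            # arterial-to-lesion delay
    t_max = times[np.argmax(pct)] - t_lesion_onset   # onset-to-peak interval
    return signal_max, t_onset, t_max

times = np.arange(0, 90, 5.0)                         # 5 s temporal resolution
signal = 100 + 400 / (1 + np.exp(-(times - 30) / 8))  # synthetic enhancement curve
print(enhancement_metrics(times, signal, t_arterial=10.0))
```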
Aoki, Takatoshi; Yamaguchi, Shinpei; Kinoshita, Shunsuke; Hayashida, Yoshiko; Korogi, Yukunori
2016-09-01
To determine the reproducibility of the quantitative chemical shift-based water-fat separation method with a multiecho gradient echo sequence [iterative decomposition of water and fat with echo asymmetry and least-squares estimation quantitation sequence (IDEAL-IQ)] for assessing bone marrow fat fraction (FF); to evaluate variation of FF at different bone sites; and to investigate its association with age and menopause. 31 consecutive females who underwent pelvic IDEAL-IQ at 3-T MRI were included in this study. Quantitative FFs of four bone sites were analyzed using IDEAL-IQ. The coefficients of variation (CV) at each site were evaluated over 10 repeated measurements to assess reproducibility. Correlations between FF and age were evaluated at each site, and FFs were compared between pre- and post-menopausal groups. The CV in the quantification of marrow FF ranged from 0.69% to 1.70%. A statistically significant correlation was established between FF and age in the lumbar vertebral body, ilium and intertrochanteric region of the femur (p < 0.001). The average FF of post-menopausal females was significantly higher than that of pre-menopausal females at these sites (p < 0.05). In the greater trochanter of the femur, there was no significant correlation between FF and age. In vivo IDEAL-IQ would provide reliable quantification of bone marrow fat. IDEAL-IQ is simple to perform in a short time and may be practical for providing information on bone quality in clinical settings.
The Evaluator's Perspective: Evaluating the State Capacity Building Program.
ERIC Educational Resources Information Center
Madey, Doren L.
A historical antagonism between the advocates of quantitative evaluation methods and the proponents of qualitative evaluation methods has stymied the recognition of the value to be gained by utilizing both methodologies in the same study. The integration of quantitative and qualitative methods within a single evaluation has synergistic effects in…
Ferreira, Fabiano G; Nouer, Darcy F; Silva, Nelson P; Garbui, Ivana U; Correr-Sobrinho, Lourenço; Nouer, Paulo R A
2014-09-01
The aim of this study was to undertake a qualitative and quantitative evaluation of changes on enamel surfaces after debonding of brackets followed by finishing procedures, using a high-resolution three-dimensional optical profiler and to investigate the accuracy of the technique. The labial surfaces of 36 extracted upper central incisors were examined. Before bonding, the enamel surfaces were subjected to profilometry, recording four amplitude parameters. Brackets were then bonded using two types of light-cured orthodontic adhesive: composite resin and resin-modified glass ionomer cement. Finishing was performed by three different methods: pumice on a rubber cup, fine and ultrafine aluminum oxide discs, and microfine diamond cups followed by silicon carbide brushes. The samples were subsequently re-analyzed by profilometry. Wilcoxon signed-rank test, Kruskal-Wallis test (p < 0.05) and a posteriori Mann-Whitney U test with Bonferroni correction (p < 0.0167) revealed a significant reduction of enamel roughness when diamond cups followed by silicon carbide brushes were used to finish surfaces that had remnants of resin-modified glass ionomer adhesive and when pumice was used to finish surfaces that had traces of composite resin. Enamel loss was minimal. The 3D optical profilometry technique was able to provide accurate qualitative and quantitative assessment of changes on the enamel surface after debonding. Morphological changes in the topography of dental surfaces, especially if related to enamel loss and roughness, are of considerable clinical importance. The quantitative evaluation method used herein enables a more comprehensive understanding of the effects of orthodontic bonding on teeth.
Providing web-based mental health services to at-risk women.
Lipman, Ellen L; Kenny, Meghan; Marziali, Elsa
2011-08-19
We examined the feasibility of providing web-based mental health services, including synchronous internet video conferencing of an evidence-based support/education group, to at-risk women, specifically poor lone mothers. The objectives of this study were to: (i) adapt a face-to-face support/education group intervention to a web-based format for lone mothers, and (ii) evaluate lone mothers' response to web-based services, including an online video conferencing group intervention program. Participating mothers were recruited through advertisements. To adapt the face-to-face intervention to a web-based format, we evaluated participant motivation through focus group/key informant interviews (n = 7), adapted the intervention training manual for a web-based environment and provided a computer training manual. To evaluate response to web-based services, we provided the intervention to two groups of lone mothers (n = 15). Pre-post quantitative evaluation of mood, self-esteem, social support and parenting was done. Post-intervention follow-up interviews explored responses to the group and to using technology to access a health service. Participants received $20 per occasion of data collection. Interviews were taped, transcribed, and content analysis was used to code and interpret the data. Adherence to the intervention protocol was evaluated. Mothers participating in this project experienced multiple difficulties, including financial and mood problems. We adapted the intervention training manual for use in a web-based group environment and ensured adherence to the intervention protocol based on viewing videoconferencing group sessions and discussion with the leaders. Participant responses to the group intervention included decreased isolation, and increased knowledge and confidence in themselves and their parenting; the responses closely matched those of mothers who obtained the same service in face-to-face groups. Pre- and post-group quantitative evaluations did not show significant improvements on measures, although the study was not powered to detect these. We demonstrated that an evidence-based group intervention program for lone mothers, developed and evaluated in a face-to-face context, transferred well to an online video conferencing format, both in terms of group process and outcomes.
QUANTITATIVE MAGNETIC RESONANCE IMAGING OF ARTICULAR CARTILAGE AND ITS CLINICAL APPLICATIONS
Li, Xiaojuan; Majumdar, Sharmila
2013-01-01
Cartilage is one of the most essential tissues for healthy joint function and is compromised in degenerative and traumatic joint diseases. There have been tremendous advances during the past decade using quantitative MRI techniques as a non-invasive tool for evaluating cartilage, with a focus on assessing cartilage degeneration during osteoarthritis (OA). In this review, after a brief overview of cartilage composition and degeneration, we discuss techniques that grade and quantify morphologic changes as well as the techniques that quantify changes in the extracellular matrix. The basic principles, in vivo applications, advantages and challenges for each technique are discussed. Recent studies using the OA Initiative (OAI) data are also summarized. Quantitative MRI provides non-invasive measures of cartilage degeneration at the earliest stages of joint degeneration, which is essential for efforts towards prevention and early intervention in OA. PMID:24115571
Development and Implementation of a Design Metric for Systems Containing Long-Term Fluid Loops
NASA Technical Reports Server (NTRS)
Steele, John W.
2016-01-01
John Steele, a chemist and technical fellow from United Technologies Corporation, provided a water quality module to assist engineers and scientists with a metric tool to evaluate risks associated with the design of space systems with fluid loops. This design metric is a methodical, quantitative, lessons-learned-based means to evaluate the robustness of a long-term fluid loop system design. The tool was developed by a cross-section of engineering disciplines with decades of experience and problem-resolution expertise.
Danet, Alina; Prieto Rodríguez, María Ángeles; Gamboa Moreno, Estibaliz; Ochoa de Retana Garcia, Lourdes; March Cerdà, Joan Carles
2016-10-01
To evaluate a peer training strategy for patients with type 2 diabetes mellitus, developed in two training programmes in the Basque Country and Andalusia. Quantitative pre- and post-intervention and qualitative evaluation, developed between 2012 and 2014. The Basque Country and Andalusia. A total of 409 patients and trainer-patients participating in self-management peer training programmes. Intentional sample of 44 patients for the qualitative study. Bivariate analysis and net gains for common variables used in questionnaires in the Basque Country and Andalusia: self-reported health, daily activities, physical activity, use of health services, and self-management. Content analysis of 8 focus groups with patients and trainer-patients, including coding, categorisation, and triangulation of results. Peer training has a positive impact on physical activity, the use of health services, and self-management, with some gender differences. The peer-training strategy is considered positive, as it strengthens the patient-health provider relationship, generates group support and self-confidence, and improves emotional management. Patients identify two areas of potential improvement: access and continuity of training strategies, and more support and recognition from health providers and institutions. The positive impact on health and quality of life that this patient peer-training provides requires the collaboration of health professionals and institutions, which should improve access, continuity and adaptation to patient needs and expectations. Copyright © 2015 Elsevier España, S.L.U. All rights reserved.
Freed, Melanie; de Zwart, Jacco A.; Hariharan, Prasanna; R. Myers, Matthew; Badano, Aldo
2011-01-01
Purpose: To develop a dynamic lesion phantom that is capable of producing physiological kinetic curves representative of those seen in human dynamic contrast-enhanced MRI (DCE-MRI) data. The objective of this phantom is to provide a platform for the quantitative comparison of DCE-MRI protocols to aid in the standardization and optimization of breast DCE-MRI. Methods: The dynamic lesion consists of a hollow, plastic mold with inlet and outlet tubes to allow flow of a contrast agent solution through the lesion over time. The border shape of the lesion can be controlled using the lesion mold production method. The configuration of the inlet and outlet tubes was determined using fluid transfer simulations. The total fluid flow rate was determined using x-ray images of the lesion for four different flow rates (0.25, 0.5, 1.0, and 1.5 ml/s) to evaluate the resultant kinetic curve shape and homogeneity of the contrast agent distribution in the dynamic lesion. High spatial and temporal resolution x-ray measurements were used to estimate the true kinetic curve behavior in the dynamic lesion for benign and malignant example curves. DCE-MRI example data were acquired of the dynamic phantom using a clinical protocol. Results: The optimal inlet and outlet tube configuration for the lesion molds was two inlet tubes separated by 30° and a single outlet tube directly between the two inlet tubes. X-ray measurements indicated that 1.0 ml/s was an appropriate total fluid flow rate and provided truth for comparison with MRI data of kinetic curves representative of benign and malignant lesions. DCE-MRI data demonstrated the ability of the phantom to produce realistic kinetic curves. Conclusions: The authors have constructed a dynamic lesion phantom, demonstrated its ability to produce physiological kinetic curves, and provided estimations of its true kinetic curve behavior. This lesion phantom provides a tool for the quantitative evaluation of DCE-MRI protocols, which may lead to improved discrimination of breast cancer lesions. PMID:21992378
Global scaling for semi-quantitative analysis in FP-CIT SPECT.
Kupitz, D; Apostolova, I; Lange, C; Ulrich, G; Amthauer, H; Brenner, W; Buchert, R
2014-01-01
Semi-quantitative characterization of dopamine transporter availability from single photon emission computed tomography (SPECT) with 123I-ioflupane (FP-CIT) is based on uptake ratios relative to a reference region. The aim of this study was to evaluate the whole brain as the reference region for semi-quantitative analysis of FP-CIT SPECT. The rationale was that this might reduce the statistical noise associated with the estimation of non-displaceable FP-CIT uptake. 150 FP-CIT SPECTs were categorized as neurodegenerative or non-neurodegenerative by an expert. Semi-quantitative analysis of specific binding ratios (SBR) was performed with a custom-made tool based on the Statistical Parametric Mapping software package using predefined regions of interest (ROIs) in the anatomical space of the Montreal Neurological Institute. The following reference regions were compared: predefined ROIs for the frontal lobe, the occipital lobe, and the whole brain (without striata, thalamus and brainstem). Tracer uptake in the reference region was characterized by the mean, median or 75th percentile of its voxel intensities. The area under the receiver operating characteristic curve (AUC) was used as the performance measure. The highest AUC of 0.973 was achieved by the SBR of the putamen with the 75th percentile in the whole brain as reference. The lowest AUC for the putamen SBR, 0.937, was obtained with the mean in the frontal lobe as reference. We recommend the 75th percentile in the whole brain as the reference for semi-quantitative analysis in FP-CIT SPECT. This combination provided the best agreement of the semi-quantitative analysis with visual evaluation of the SPECT images by an expert and, therefore, is appropriate to support less experienced physicians.
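A minimal sketch of the recommended SBR computation, with the non-displaceable uptake summarized by the 75th percentile of whole-brain voxel intensities; the voxel data below are synthetic placeholders.

```python
import numpy as np

def specific_binding_ratio(striatal_voxels, reference_voxels, pct=75):
    """SBR = (striatal uptake - non-displaceable uptake) / non-displaceable uptake.

    Following the study's recommendation, the non-displaceable (reference)
    uptake is summarized by the 75th percentile of whole-brain voxel
    intensities (striata, thalamus and brainstem excluded).
    """
    ref = np.percentile(reference_voxels, pct)
    return striatal_voxels.mean() / ref - 1.0

rng = np.random.default_rng(1)
whole_brain = rng.normal(100, 15, 50_000)   # synthetic reference voxels
putamen = rng.normal(280, 30, 400)          # synthetic putaminal voxels
print(f"putamen SBR: {specific_binding_ratio(putamen, whole_brain):.2f}")
```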
Chen, Teresa C.
2009-01-01
Purpose: To demonstrate that video-rate spectral domain optical coherence tomography (SDOCT) can qualitatively and quantitatively evaluate optic nerve head (ONH) and retinal nerve fiber layer (RNFL) glaucomatous structural changes. To correlate quantitative SDOCT parameters with disc photography and visual fields. Methods: SDOCT images from 4 glaucoma eyes (4 patients) with varying stages of open-angle glaucoma (ie, early, moderate, late) were qualitatively contrasted with 2 age-matched normal eyes (2 patients). Of 61 other consecutive patients recruited in an institutional setting, 53 eyes (33 patients) met inclusion/exclusion criteria for quantitative studies. Images were obtained using two experimental SDOCT systems, one utilizing a superluminescent diode and the other a titanium:sapphire laser source, with axial resolutions of about 6 μm and 3 μm, respectively. Results: Classic glaucomatous ONH and RNFL structural changes were seen in SDOCT images. An SDOCT reference plane 139 μm above the retinal pigment epithelium yielded cup-disc ratios that best correlated with masked physician disc photography cup-disc ratio assessments. The minimum distance band, a novel SDOCT neuroretinal rim parameter, showed good correlation with physician cup-disc ratio assessments, visual field mean deviation, and pattern standard deviation (P value range, .0003–.024). RNFL and retinal thickness maps correlated well with disc photography and visual field testing. Conclusions: To our knowledge, this thesis presents the first comprehensive qualitative and quantitative evaluation of SDOCT images of the ONH and RNFL in glaucoma. This pilot study provides a basis for developing the more automated quantitative SDOCT-specific glaucoma algorithms needed for future prospective multicenter national trials. PMID:20126502
QFASAR: Quantitative fatty acid signature analysis with R
Bromaghin, Jeffrey F.
2017-01-01
Knowledge of predator diets provides essential insights into their ecology, yet diet estimation is challenging and remains an active area of research. Quantitative fatty acid signature analysis (QFASA) is a popular method of estimating diet composition that continues to be investigated and extended. However, software to implement QFASA has only recently become publicly available. I summarize a new R package, qfasar, for diet estimation using QFASA methods. The package also provides functionality to evaluate and potentially improve the performance of a library of prey signature data, compute goodness-of-fit diagnostics, and support simulation-based research. Several procedures in the package have not previously been published. qfasar makes traditional and recently published QFASA diet estimation methods accessible to ecologists for the first time. Use of the package is illustrated with signature data from Chukchi Sea polar bears and potential prey species.
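qfasar itself is an R package; the sketch below illustrates the core QFASA idea in Python under simplifying assumptions: a predator's fatty acid signature is matched to a convex mixture of mean prey signatures using a squared-error distance. Published QFASA variants use other distance measures and calibration coefficients.

```python
import numpy as np
from scipy.optimize import minimize

def estimate_diet(predator_sig, prey_sigs):
    """Estimate diet proportions by matching a predator's fatty acid
    signature to a convex mixture of mean prey signatures (sketch only)."""
    n_prey = prey_sigs.shape[0]

    def objective(p):
        mixture = p @ prey_sigs
        return np.sum((predator_sig - mixture) ** 2)

    constraints = {"type": "eq", "fun": lambda p: p.sum() - 1.0}
    bounds = [(0.0, 1.0)] * n_prey
    x0 = np.full(n_prey, 1.0 / n_prey)
    res = minimize(objective, x0, bounds=bounds, constraints=constraints,
                   method="SLSQP")
    return res.x

prey_sigs = np.array([[0.5, 0.3, 0.2],     # mean signatures of 3 prey types
                      [0.2, 0.6, 0.2],
                      [0.1, 0.2, 0.7]])
true_diet = np.array([0.6, 0.3, 0.1])
predator_sig = true_diet @ prey_sigs       # noiseless predator signature
print(estimate_diet(predator_sig, prey_sigs).round(3))
```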
NASA Astrophysics Data System (ADS)
Steinberg, Idan; Gannot, Israel; Eyal, Avishay
2015-03-01
Osteoporosis is a widespread disease that has a catastrophic impact on patients' lives and overwhelming related healthcare costs. In recent work, we have developed a multi-spectral, frequency-domain photoacoustic method for the evaluation of bone pathologies. This method has great advantages over pure ultrasonic or optical methods, as it provides both molecular information from the bone absorption spectrum and bone mechanical status from the characteristics of the ultrasound propagation. These characteristics include both the speed of sound (SOS) and broadband ultrasonic attenuation (BUA). To test the method's quantitative predictions, we have constructed a combined ultrasound and photoacoustic setup. Here, we present this dual-modality experimental system and compare the two methods on bone samples in vitro. The differences between the two modalities are shown to provide valuable insight into bone structure and functional status.
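The two ultrasound characteristics named above have conventional definitions that can be sketched as follows. The substitution SOS estimate and the 0.2-0.6 MHz BUA band are standard choices in bone ultrasonometry, not necessarily the exact pipeline of this setup; all numbers are illustrative.

```python
import numpy as np

def speed_of_sound(thickness_m, tof_sample_s, tof_reference_s, c_water=1482.0):
    """Substitution estimate of SOS through a sample immersed in water:
    compare time of flight with and without the sample in the path,
    using delta_t = d * (1/c_water - 1/c_sample)."""
    delta_t = tof_reference_s - tof_sample_s
    return 1.0 / (1.0 / c_water - delta_t / thickness_m)

def broadband_ultrasonic_attenuation(freqs_hz, atten_db, f_lo=0.2e6, f_hi=0.6e6):
    """BUA = slope of attenuation vs. frequency (dB/MHz) over the
    conventional 0.2-0.6 MHz band used in bone ultrasonometry."""
    band = (freqs_hz >= f_lo) & (freqs_hz <= f_hi)
    slope, _ = np.polyfit(freqs_hz[band] / 1e6, atten_db[band], 1)
    return slope

freqs = np.linspace(0.1e6, 1.0e6, 50)
atten = 5.0 + 60.0 * freqs / 1e6            # synthetic linear attenuation
print(broadband_ultrasonic_attenuation(freqs, atten))  # ~60 dB/MHz
print(speed_of_sound(0.03, 19.5e-6, 20.0e-6))
```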
Fuzzy Logic Approaches to Multi-Objective Decision-Making in Aerospace Applications
NASA Technical Reports Server (NTRS)
Hardy, Terry L.
1994-01-01
Fuzzy logic allows for the quantitative representation of multi-objective decision-making problems which have vague or fuzzy objectives and parameters. As such, fuzzy logic approaches are well-suited to situations where alternatives must be assessed by using criteria that are subjective and of unequal importance. This paper presents an overview of fuzzy logic and provides sample applications from the aerospace industry. Applications include an evaluation of vendor proposals, an analysis of future space vehicle options, and the selection of a future space propulsion system. On the basis of the results provided in this study, fuzzy logic provides a unique perspective on the decision-making process, allowing the evaluator to assess the degree to which each option meets the evaluation criteria. Future decision-making should take full advantage of fuzzy logic methods to complement existing approaches in the selection of alternatives.
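A minimal sketch of fuzzy multi-objective scoring of the kind described: each alternative receives a membership degree per criterion ("degree to which the option meets the criterion"), and the degrees are aggregated with criterion weights. All names and numbers are illustrative, not drawn from the paper's applications.

```python
import numpy as np

criteria = ["cost", "reliability", "performance"]
weights = np.array([0.5, 0.3, 0.2])          # relative importance (sums to 1)

membership = np.array([                      # rows: alternatives A, B, C
    [0.9, 0.4, 0.6],
    [0.5, 0.8, 0.7],
    [0.7, 0.6, 0.9],
])

# Weighted-average aggregation; a conservative variant is the weighted
# minimum, min_j(membership ** weights), which emphasizes the weakest
# criterion rather than allowing strong criteria to compensate.
scores = membership @ weights
best = np.argmax(scores)
print(dict(zip("ABC", scores.round(2))), "-> best:", "ABC"[best])
```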
Maitre, Nathalie L; Chorna, Olena; Romeo, Domenico M; Guzzetta, Andrea
2016-12-01
High-risk infant follow-up programs provide early identification and referral for treatment of neurodevelopmental delays and impairments. In these programs, a standardized neurological examination is a critical component of evaluation for clinical and research purposes. To address the primary challenges of provider educational diversity and standardized documentation, we designed an approach to training and implementation of the Hammersmith Infant Neurological Examination with precourse materials, a workshop model, and adaptation of the electronic medical record. Provider completion and documentation of a neurological examination were evaluated before and after Hammersmith Infant Neurological Examination training. Standardized training and implementation of the Hammersmith Infant Neurological Examination in a large high-risk infant follow-up program is feasible and effective, and allows for quantitative evaluation of neurological findings and developmental trajectories. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
Guan, Xiaojun; Huang, Peiyu; Zeng, Qiaoling; Liu, Chunlei; Wei, Hongjiang; Xuan, Min; Gu, Quanquan; Xu, Xiaojun; Wang, Nian; Yu, Xinfeng; Luo, Xiao; Zhang, Minming
2018-02-07
Myelinated white matter, which shows diamagnetic susceptibility, is important for information transfer in the brain. In Parkinson's disease (PD), the white matter also undergoes degenerative alterations. Quantitative susceptibility mapping (QSM) is a novel technique for noninvasive assessment of regional white matter ultrastructure, and provides information on white matter that complements standard diffusion tensor imaging (DTI). In this study, we used QSM to detect spatial white matter alterations in PD patients (n = 65) and age- and sex-matched normal controls (n = 46). Voxel-wise tract-based spatial statistics were performed to analyze the QSM and DTI data. QSM showed extensive white matter involvement in PD patients, including regions adjacent to the frontal, parietal, and temporal lobes, which was more widespread than that observed using DTI. Both QSM and DTI showed similar alterations in the left inferior longitudinal fasciculus and right cerebellar hemisphere. Further, alterations in the white matter were correlated with motor impairment and global disease severity in PD patients. We suggest that QSM may provide a novel approach for detecting white matter alterations and underlying network disruptions in PD. Further, the combination of QSM and DTI would provide a more complete evaluation of the diseased brain by analyzing different biological tissue properties.
A Privacy-Preserving Platform for User-Centric Quantitative Benchmarking
NASA Astrophysics Data System (ADS)
Herrmann, Dominik; Scheuer, Florian; Feustel, Philipp; Nowey, Thomas; Federrath, Hannes
We propose a centralised platform for quantitative benchmarking of key performance indicators (KPI) among mutually distrustful organisations. Our platform offers users the opportunity to request an ad-hoc benchmarking for a specific KPI within a peer group of their choice. Architecture and protocol are designed to provide anonymity to its users and to hide the sensitive KPI values from other clients and the central server. To this end, we integrate user-centric peer group formation, exchangeable secure multi-party computation protocols, short-lived ephemeral key pairs as pseudonyms, and attribute certificates. We show by empirical evaluation of a prototype that the performance is acceptable for reasonably sized peer groups.
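As one minimal illustration of the secure multi-party computation component, the sketch below uses additive secret sharing so that a peer group's mean KPI can be reconstructed without any single party disclosing its own value. This is an illustrative primitive only, not the paper's actual protocol stack, which additionally involves peer-group formation, ephemeral pseudonyms, and attribute certificates.

```python
import secrets

P = 2**61 - 1  # public prime modulus; arithmetic is over Z_P

def share(value, n_parties):
    """Split a KPI value into n additive shares that sum to it mod P.
    Any n-1 shares together reveal nothing about the value."""
    shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

def benchmark_mean(kpis):
    """Each participant shares its KPI among all peers; only the sum
    (hence the mean) of the whole group is ever reconstructed."""
    n = len(kpis)
    all_shares = [share(k, n) for k in kpis]
    # Party j publishes only the sum of the shares it received.
    partial_sums = [sum(all_shares[i][j] for i in range(n)) % P for j in range(n)]
    total = sum(partial_sums) % P
    return total / n

print(benchmark_mean([120, 95, 143]))  # 119.33..., no single KPI disclosed
```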
2009-01-01
Quantitation of gene expression requires that an accurate measurement of a specific transcript is made. In this paper, a quantitative reverse transcription-polymerase chain reaction (RT-PCR) by competition for tilapia growth hormone receptor type I is designed and validated. This experimental procedure was used to determine the abundance of the growth hormone receptor type I transcript in different tilapia tissues. The results obtained with this developed competitive RT-PCR were similar to real-time PCR results reported recently. This protocol provides a reliable but less expensive alternative to real-time PCR for quantifying specific genes. PMID:19495916
ERIC Educational Resources Information Center
Micceri, Ted
2005-01-01
Although numerous quality ratings exist in today's media-centric environment (Money Magazine, U.S. News and World Report, etc.), it is quite difficult to provide any reasonably meaningful estimates of institutional quality, either qualitative or quantitative. Global ratings of university "quality" abound, despite the fact that there…
ERIC Educational Resources Information Center
Grover-Bisker, Edna M.
2011-01-01
This study examined the relationship between cooperative education (co-op) and the early career outcomes of graduates at Missouri University of Science and Technology, a science and technological research university in the Midwest. The study's primary purpose was to provide university leadership with a quantitative evaluation of the university's…
ERIC Educational Resources Information Center
Lee, Jeong W.
Quantitative financial measures were applied to evaluate the performance of the North Dakota Public Employee Retirement System (NDPERS) pension fund portfolios and the Teachers Insurance and Annuity Association (TIAA)/College Retirement Equities Fund (CREF) portfolios, thus providing a relative performance assessment. Ten years of data were…
A Quantitative Evaluation of the Flipped Classroom in a Large Lecture Principles of Economics Course
ERIC Educational Resources Information Center
Balaban, Rita A.; Gilleskie, Donna B.; Tran, Uyen
2016-01-01
This research provides evidence that the flipped classroom instructional format increases student final exam performance, relative to the traditional instructional format, in a large lecture principles of economics course. The authors find that the flipped classroom directly improves performance by 0.2 to 0.7 standard deviations, depending on…
ERIC Educational Resources Information Center
Sondergeld, Toni A.; Koskey, Kristin L. K.
2011-01-01
An abundance of comprehensive school reform (CSR) literature exists illustrating that CSRs are effective in improving student outcomes. However, much of this research reports on top-down reforms, focuses on academic outcomes, and uses quantitative methods alone. Many educational researchers have argued for the use of mixed methods for providing a…
ERIC Educational Resources Information Center
Yan, Zi; Sendall, Patricia
2016-01-01
While many American colleges and universities are providing a First Year Experience (FYE) course or program for their first year students, those programs are not often customized to take into account international students' (IS) unique challenges. Using quantitative and qualitative methods, this study evaluated a FYE course that was customized for…
Application of Simulation to Individualized Self-Paced Training. Final Report. TAEG Report No. 11-2.
ERIC Educational Resources Information Center
Lindahl, William H.; Gardner, James H.
Computer simulation is recognized as a valuable systems analysis research tool which enables the detailed examination, evaluation, and manipulation, under stated conditions, of a system without direct action on the system. This technique provides management with quantitative data on system performance and capabilities which can be used to compare…
USDA-ARS's Scientific Manuscript database
Despite considerable efforts in developing the curve-fitting protocol to evaluate the crystallinity index (CI) from X-ray diffraction (XRD) measurements, in its present state the XRD procedure can only provide a qualitative or semi-quantitative assessment of the amounts of crystalline or amorphous po...
A Meta-Analysis of Interventions for Bereaved Children and Adolescents
ERIC Educational Resources Information Center
Rosner, Rita; Kruse, Joachim; Hagl, Maria
2010-01-01
The main objective of this review was to provide a quantitative and methodologically sound evaluation of existing treatments for bereavement and grief reactions in children and adolescents. Two meta-analyses were conducted: 1 on controlled studies and 1 on uncontrolled studies. The 2 meta-analyses were based on a total of 27 treatment studies…
ERIC Educational Resources Information Center
Whiteman, Rodney S.
2015-01-01
Purpose: Mixed methods research can provide a fruitful line of inquiry for educational leadership, program evaluation, and policy analysis; however, mixed methods research requires a metatheory that allows for mixing what have traditionally been considered incompatible qualitative and quantitative inquiry. The purpose of this paper is to apply…
What's the Big Deal? Collection Evaluation at the National Level
ERIC Educational Resources Information Center
Jurczyk, Eva; Jacobs, Pamela
2014-01-01
This article discusses a project undertaken to assess the journals in a Big Deal package by applying a weighted value algorithm measuring quality, utility, and value of individual titles. Carried out by a national library consortium in Canada, the project confirmed the value of the Big Deal package while providing a quantitative approach for…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luria, Paolo; Aspinall, Peter A
2003-08-01
The aim of this paper is to describe a new approach to major industrial hazard assessment, which has recently been studied by the authors in conjunction with the Italian Environmental Protection Agency ('ARPAV'). The real opportunity for developing a different approach arose from the need of the Italian EPA to provide the Venice Port Authority with an appropriate estimation of major industrial hazards in Porto Marghera, an industrial estate near Venice (Italy). However, the standard model, quantitative risk analysis (QRA), only provided a list of individual quantitative risk values related to single locations. The experimental model is based on a multi-criteria approach, the Analytic Hierarchy Process, which introduces the use of expert opinions, complementary skills and expertise from different disciplines in conjunction with traditional quantitative analysis. This permitted the generation of quantitative data on risk assessment from a series of qualitative assessments of the present situation and of three other future scenarios, and the use of this information as indirect quantitative measures, which could be aggregated to obtain the global risk rate. This approach is in line with the main concepts proposed by the latest European directive on Major Hazard Accidents, which recommends increasing the participation of operators, taking the other players into account and, moreover, paying more attention to the concepts of 'urban control', 'subjective risk' (risk perception) and intangible factors (factors not directly quantifiable).
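The Analytic Hierarchy Process step can be sketched as follows: priority weights are taken as the principal eigenvector of a pairwise-comparison matrix and checked with Saaty's consistency ratio. The judgment matrix here is hypothetical, not one of the study's actual expert matrices.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from an AHP pairwise-comparison matrix via the
    principal eigenvector, plus Saaty's consistency ratio."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    ci = (eigvals[k].real - n) / (n - 1)           # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)   # Saaty's random indices
    return w, ci / ri                              # weights, consistency ratio

# Hypothetical judgments: criterion 1 is 3x as important as 2, 5x as 3, etc.
A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
w, cr = ahp_weights(A)
print(w.round(3), f"CR = {cr:.3f}")  # CR < 0.10 indicates acceptable consistency
```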
2017-01-01
Background The Information Assessment Method (IAM) allows clinicians to report the cognitive impact, clinical relevance, intention to use, and expected patient health benefits associated with clinical information received by email. More than 15,000 Canadian physicians and pharmacists use the IAM in continuing education programs. In addition, information providers can use IAM ratings and feedback comments from clinicians to improve their products. Objective Our general objective was to validate the IAM questionnaire for the delivery of educational material (ecological and logical content validity). Our specific objectives were to measure the relevance and evaluate the representativeness of IAM items for assessing information received by email. Methods A 3-part mixed methods study was conducted (convergent design). In part 1 (quantitative longitudinal study), the relevance of IAM items was measured. Participants were 5596 physician members of the Canadian Medical Association who used the IAM. A total of 234,196 ratings were collected in 2012. The relevance of IAM items with respect to their main construct was calculated using descriptive statistics (relevance ratio R). In part 2 (qualitative descriptive study), the representativeness of IAM items was evaluated. A total of 15 family physicians completed semistructured face-to-face interviews. For each construct, we evaluated the representativeness of IAM items using a deductive-inductive thematic qualitative data analysis. In part 3 (mixing quantitative and qualitative parts), results from quantitative and qualitative analyses were reviewed, juxtaposed in a table, discussed with experts, and integrated. Thus, our final results are derived from the views of users (ecological content validation) and experts (logical content validation). Results Of the 23 IAM items, 21 were validated for content, while 2 were removed. In part 1 (quantitative results), 21 items were deemed relevant, while 2 items were deemed not relevant (R=4.86% [N=234,196] and R=3.04% [n=45,394], respectively). In part 2 (qualitative results), 22 items were deemed representative, while 1 item was not representative. In part 3 (mixing quantitative and qualitative results), the content validity of 21 items was confirmed, and the 2 nonrelevant items were excluded. A fully validated version was generated (IAM-v2014). Conclusions This study produced a content validated IAM questionnaire that is used by clinicians and information providers to assess the clinical information delivered in continuing education programs. PMID:28292738
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-11
... quantitative research and evaluation process that forecasts economic excess sector returns (over/under the... proprietary SectorSAM quantitative research and evaluation process. The following convictions constitute... Allocation Methodology" ("SectorSAM"), which is a proprietary quantitative analysis, to forecast each...
Wu, Shulian; Huang, Yudian; Tang, Qinggong; Li, Zhifang; Horng, Hannah; Li, Jiatian; Wu, Zaihua; Chen, Yu; Li, Hui
2018-03-01
Preoperative neoadjuvant treatment in locally advanced breast cancer is recognized as an effective adjuvant therapy, as it improves treatment outcomes. However, potential complications remain a threat, so there is an urgent clinical need to assess both the tumor response and changes in the tumor microenvironment using non-invasive and precise identification techniques. Here, two-photon microscopy was employed to detect morphological alterations during breast cancer progression and regression throughout chemotherapy. The changes in structure were analyzed based on the autofluorescence and collagen of differing statuses. Parameters including the optical redox ratio, the ratio of second harmonic generation to autofluorescence signal, collagen density, and collagen shape orientation were studied. Results indicate that these parameters are potential indicators for evaluating breast tumors and their microenvironment changes during progression and chemotherapy. Combined analyses of these parameters could provide a quantitative, novel method for monitoring tumor therapy.
Lewtas, J
1983-01-01
Incomplete combustion of fuel in motor vehicles results in the emission of submicron carbonaceous particles which, after cooling and dilution, contain varying quantities of extractable organic constituents. These organics are mutagenic in bacteria. Confirmatory bioassays in mammalian cells provide the capability of detecting chromosomal and DNA damage in addition to gene mutations. In order to evaluate the mutagenicity of these organics in mammalian cells, extractable organics from particle emissions from several diesel and gasoline vehicles were compared in a battery of microbial, mammalian cell and in vivo bioassays. The mammalian cell mutagenicity bioassays were selected to detect gene mutations, DNA damage, and chromosomal effects. Carcinogenesis bioassays conducted included short-term assays for oncogenic transformation and skin tumorigenesis. The results in different assay systems are compared both qualitatively and quantitatively. Good quantitative correlations were observed between several mutagenesis and carcinogenesis bioassays for this series of diesel and gasoline emissions. PMID:6186475
Estimating sub-surface dispersed oil concentration using acoustic backscatter response.
Fuller, Christopher B; Bonner, James S; Islam, Mohammad S; Page, Cheryl; Ojo, Temitope; Kirkey, William
2013-05-15
The recent Deepwater Horizon disaster resulted in a dispersed oil plume at an approximate depth of 1000 m. Several methods were used to characterize this plume with respect to concentration and spatial extent, including surface-supported sampling and autonomous underwater vehicles with in situ instrument payloads. Additionally, echo sounders were used to track the plume location, demonstrating the potential for remote detection using acoustic backscatter (ABS). This study evaluated the use of an Acoustic Doppler Current Profiler (ADCP) to quantitatively detect oil-droplet suspensions from the ABS response in a controlled laboratory setting. Results from this study showed log-linear ABS responses to oil-droplet volume concentration. However, the inability to reproduce ABS response factors suggests the difficulty of developing meaningful calibration factors for quantitative field analysis. Evaluation of theoretical ABS intensity derived from the particle size distribution provided insight regarding method sensitivity in the presence of interfering ambient particles. Copyright © 2013 Elsevier Ltd. All rights reserved.
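The log-linear response reported above suggests a calibration of the following form. The coefficients and readings in this sketch are illustrative, not the instrument-specific response factors whose poor reproducibility the study discusses.

```python
import numpy as np

# Sketch of a log-linear ABS calibration: backscatter intensity (dB) varies
# linearly with log10 of oil-droplet volume concentration.
conc = np.array([5, 10, 50, 100, 500])             # ppm, known standards
abs_db = np.array([62.1, 65.2, 71.9, 75.3, 82.0])  # measured backscatter

slope, intercept = np.polyfit(np.log10(conc), abs_db, 1)

def estimate_concentration(abs_measured_db):
    """Invert the fitted response to estimate concentration from ABS."""
    return 10 ** ((abs_measured_db - intercept) / slope)

print(f"ABS = {slope:.1f} * log10(C) + {intercept:.1f}")
print(f"estimated C at 70 dB: {estimate_concentration(70.0):.0f} ppm")
```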
Quantitative evaluation of statistical errors in small-angle X-ray scattering measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sedlak, Steffen M.; Bruetzel, Linda K.; Lipfert, Jan
A new model is proposed for the measurement errors incurred in typical small-angle X-ray scattering (SAXS) experiments, which takes into account the setup geometry and physics of the measurement process. The model accurately captures the experimentally determined errors from a large range of synchrotron and in-house anode-based measurements. Its most general formulation gives for the variance of the buffer-subtracted SAXS intensity σ²(q) = [I(q) + const.]/(kq), where I(q) is the scattering intensity as a function of the momentum transfer q; k and const. are fitting parameters that are characteristic of the experimental setup. The model gives a concrete procedure for calculating realistic measurement errors for simulated SAXS profiles. In addition, the results provide guidelines for optimizing SAXS measurements, which are in line with established procedures for SAXS experiments, and enable a quantitative evaluation of measurement errors.
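Given the stated model, realistic errors for a simulated profile follow directly, as in this sketch; the toy profile and the fit parameters k and const. are made up for illustration.

```python
import numpy as np

def saxs_sigma(q, intensity, k, const):
    """Measurement error for a simulated SAXS profile using the paper's
    model: sigma^2(q) = [I(q) + const] / (k * q)."""
    return np.sqrt((intensity + const) / (k * q))

q = np.linspace(0.01, 0.5, 200)               # momentum transfer (1/Angstrom)
intensity = 1e4 * np.exp(-(q * 30) ** 2 / 3)  # toy Guinier-like profile
# k and const are setup-specific fit parameters; values here are made up.
sigma = saxs_sigma(q, intensity, k=5e4, const=50.0)

rng = np.random.default_rng(2)
noisy = intensity + rng.normal(0.0, sigma)    # simulated noisy measurement
print(noisy[:3].round(2))
```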
NASA Astrophysics Data System (ADS)
Ge, Yuanzheng; Chen, Bin; liu, Liang; Qiu, Xiaogang; Song, Hongbin; Wang, Yong
2018-02-01
Individual-based computational environments provide an effective solution for studying complex social events by reconstructing scenarios. Challenges remain in reconstructing the virtual scenarios and reproducing the complex evolution. In this paper, we propose a framework to reconstruct a synthetic computational environment, reproduce an epidemic outbreak, and evaluate management interventions in a virtual university. The reconstructed computational environment includes 4 fundamental components: the synthetic population, behavior algorithms, multiple social networks, and the geographic campus environment. In the virtual university, influenza H1N1 transmission experiments are conducted, and gradually enhanced interventions are evaluated and compared quantitatively. The experimental results indicate that the reconstructed virtual environment provides a solution for reproducing complex emergencies and evaluating policies to be executed in the real world.
Peng, Qiyu; Park, Hyung-Soon; Shah, Parag; Wilson, Nicole; Ren, Yupeng; Wu, Yi-Ning; Liu, Jie; Gaebler-Spira, Deborah J; Zhang, Li-Qun
2011-01-01
Spasticity and contracture are major sources of disability in people with neurological impairments and have been evaluated using various instruments: the Modified Ashworth Scale, tendon reflex scale, pendulum test, mechanical perturbations, and passive joint range of motion (ROM). These measures generally are either convenient to use in clinics but not quantitative, or quantitative but difficult to use conveniently in clinics. We have developed a manual spasticity evaluator (MSE) to evaluate spasticity/contracture quantitatively and conveniently, with ankle ROM and stiffness measured at a controlled low velocity, and joint resistance and Tardieu catch angle measured at several higher velocities. We found that the Tardieu catch angle was linearly related to velocity, indicating that the increased resistance at higher velocities was felt at stiffer positions further into the range of motion, and thus that the velocity dependence of spasticity may also be position-dependent. This finding indicates the need to control velocity in spasticity evaluation, which is achieved with the MSE. Quantitative measurements of spasticity, stiffness, and ROM can lead to more accurate characterizations of pathological conditions and outcome evaluations of interventions, potentially contributing to better healthcare services for patients with neurological disorders such as cerebral palsy, spinal cord injury, traumatic brain injury, and stroke.
Villa, C A; Finlayson, S; Limpus, C; Gaus, C
2015-04-15
Biomonitoring of blood is commonly used to identify and quantify occupational or environmental exposure to chemical contaminants. Increasingly, this technique has been applied to wildlife contaminant monitoring, including for green turtles, allowing for the non-lethal evaluation of chemical exposure in their nearshore environment. The sources, composition, bioavailability and toxicity of metals in the marine environment are, however, often unknown and influenced by numerous biotic and abiotic factors. These factors can vary considerably across time and space making the selection of the most informative elements for biomonitoring challenging. This study aimed to validate an ICP-MS multi-element screening method for green turtle blood in order to identify and facilitate prioritisation of target metals for subsequent fully quantitative analysis. Multi-element screening provided semiquantitative results for 70 elements, 28 of which were also determined through fully quantitative analysis. Of the 28 comparable elements, 23 of the semiquantitative results had an accuracy between 67% and 112% relative to the fully quantified values. In lieu of any available turtle certified reference materials (CRMs), we evaluated the use of human blood CRMs as a matrix surrogate for quality control, and compared two commonly used sample preparation methods for matrix related effects. The results demonstrate that human blood provides an appropriate matrix for use as a quality control material in the fully quantitative analysis of metals in turtle blood. An example for the application of this screening method is provided by comparing screening results from blood of green turtles foraging in an urban and rural region in Queensland, Australia. Potential targets for future metal biomonitoring in these regions were identified by this approach. Copyright © 2014 Elsevier B.V. All rights reserved.
Schneider, Anna; Weigl, Matthias
2018-01-01
Emergency departments (ED) are complex and dynamic work environments with various psychosocial work stressors that increase risks to providers' well-being. Yet, no systematic review is available which synthesizes the current research base and quantitatively aggregates data on associations between ED work factors and provider well-being outcomes. We aimed at synthesizing the current research base on quantitative associations between psychosocial work factors (classified into patient-/task-related, organizational, and social factors) and mental well-being of ED providers (classified into positive well-being outcomes, affective symptoms and negative psychological functioning, cognitive-behavioural outcomes, and psychosomatic health complaints). A systematic literature search in eight databases was conducted in December 2017. Original studies were extracted following a stepwise procedure and predefined inclusion criteria. A standardized assessment of methodological quality and risk of bias was conducted for each study with the Quality Assessment Tool for Quantitative Studies from the Effective Public Health Practice Project. In addition to a systematic compilation of included studies, the frequency and strength of quantitative associations were synthesized by means of harvest plots. Subgroup analyses for ED physicians and nurses were conducted. N = 1956 records were retrieved. After removal of duplicates, 1473 records were screened for titles and abstracts. 199 studies were eligible for full-text review. Finally, 39 original studies were included, of which 37 reported cross-sectional surveys. Concerning the methodological quality of included studies, the majority were evaluated as weak to moderate with considerable risk of bias. The most frequently surveyed provider outcomes were affective symptoms (e.g., burnout) and positive well-being outcomes (e.g., job satisfaction). 367 univariate associations and 370 multivariate associations were extracted, with the majority being weak to moderate. Strong associations were mostly reported for social and organizational work factors. To the best of our knowledge, this review is the first to provide a quantitative summary of the research base on associations of psychosocial ED work factors and provider well-being. Conclusive results reveal that peer support, well-designed organizational structures, and employee reward systems balance the negative impact of adverse work factors on ED providers' well-being. This review identifies avenues for future research in this field, including methodological advances through the use of quasi-experimental and prospective designs, representative samples, and adequate confounder control. Protocol registration number: PROSPERO 2016 CRD42016037220.
Designing a ticket to ride with the Cognitive Work Analysis Design Toolkit.
Read, Gemma J M; Salmon, Paul M; Lenné, Michael G; Jenkins, Daniel P
2015-01-01
Cognitive work analysis has been applied in the design of numerous sociotechnical systems. The process used to translate analysis outputs into design concepts, however, is not always clear. Moreover, structured processes for translating the outputs of ergonomics methods into concrete designs are lacking. This paper introduces the Cognitive Work Analysis Design Toolkit (CWA-DT), a design approach which has been developed specifically to provide a structured means of incorporating cognitive work analysis outputs in design using design principles and values derived from sociotechnical systems theory. This paper outlines the CWA-DT and describes its application in a public transport ticketing design case study. Qualitative and quantitative evaluations of the process provide promising early evidence that the toolkit fulfils the evaluation criteria identified for its success, with opportunities for improvement also highlighted. The Cognitive Work Analysis Design Toolkit has been developed to provide ergonomics practitioners with a structured approach for translating the outputs of cognitive work analysis into design solutions. This paper demonstrates an application of the toolkit and provides evaluation findings.
Methodology for the evaluation of the Stephanie Alexander Kitchen Garden program.
Gibbs, L; Staiger, P K; Townsend, M; Macfarlane, S; Gold, L; Block, K; Johnson, B; Kulas, J; Waters, E
2013-04-01
Community and school cooking and gardening programs have recently increased internationally. However, despite promising indications, there is limited evidence of their effectiveness. This paper presents the evaluation framework and methods negotiated and developed to meet the information needs of all stakeholders for the Stephanie Alexander Kitchen Garden (SAKG) program, a combined cooking and gardening program implemented in selectively funded primary schools across Australia. The evaluation used multiple aligned theoretical frameworks and models, including a public health ecological approach, principles of effective health promotion and models of experiential learning. The evaluation is a non-randomised comparison of six schools receiving the program (intervention) and six comparison schools (all government-funded primary schools) in urban and rural areas of Victoria, Australia. A mixed-methods approach was used, relying on qualitative measures to understand changes in school cultures and the experiential impacts on children, families, teachers, parents and volunteers, and quantitative measures at baseline and one-year follow-up to provide supporting information regarding patterns of change. The evaluation study design addressed the limitations of many existing evaluation studies of cooking or garden programs. The multistrand approach to the mixed methodology maintained the rigour of the respective methods and provided an opportunity to explore complexity in the findings. Limited sensitivity of some of the quantitative measures was identified, as well as the potential for bias in the coding of the open-ended questions. The SAKG evaluation methodology will address the need for appropriate evaluation approaches for school-based kitchen garden programs. It demonstrates the feasibility of a meaningful, comprehensive evaluation of school-based programs and also demonstrates the central role qualitative methods can have in a mixed-method evaluation. So what? This paper contributes to the debate about appropriate evaluation approaches to meet the information needs of all stakeholders and will support the sharing of measures and potential comparisons between program outcomes for comparable population groups and settings.
Quantitative analysis of comparative genomic hybridization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Manoir, S. du; Bentz, M.; Joos, S.
1995-01-01
Comparative genomic hybridization (CGH) is a new molecular cytogenetic method for the detection of chromosomal imbalances. Following cohybridization of DNA prepared from a sample to be studied and control DNA to normal metaphase spreads, probes are detected via different fluorochromes. The ratio of the test and control fluorescence intensities along a chromosome reflects the relative copy number of segments of a chromosome in the test genome. Quantitative evaluation of CGH experiments is required for the determination of low copy changes, e.g., monosomy or trisomy, and for the definition of the breakpoints involved in unbalanced rearrangements. In this study, a program for quantitation of CGH preparations is presented. This program is based on the extraction of the fluorescence ratio profile along each chromosome, followed by averaging of individual profiles from several metaphase spreads. Objective parameters critical for quantitative evaluations were tested, and the criteria for selection of suitable CGH preparations are described. The granularity of the chromosome painting and the regional inhomogeneity of fluorescence intensities in metaphase spreads proved to be crucial parameters. The coefficient of variation of the ratio value for chromosomes in balanced state (CVBS) provides a general quality criterion for CGH experiments. Different cutoff levels (thresholds) of average fluorescence ratio values were compared for their specificity and sensitivity with regard to the detection of chromosomal imbalances. 27 refs., 15 figs., 1 tab.
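As a concrete illustration of the ratio-profile pipeline this abstract describes, the sketch below computes per-chromosome fluorescence ratio profiles, averages them over metaphase spreads, derives the CVBS quality criterion, and applies fixed ratio thresholds. It assumes intensities have already been sampled along each chromosome axis; the 1.25/0.75 cutoffs are illustrative, not the paper's values.

```python
import numpy as np

def ratio_profile(test, control, eps=1e-9):
    """Fluorescence ratio profile along one chromosome axis."""
    return np.asarray(test, float) / (np.asarray(control, float) + eps)

def average_profiles(profiles):
    """Average per-chromosome profiles over several metaphase spreads."""
    return np.mean(np.vstack(profiles), axis=0)

def cvbs(balanced_profile):
    """Coefficient of variation for chromosomes in balanced state."""
    p = np.asarray(balanced_profile, float)
    return p.std() / p.mean()

def call_imbalances(profile, gain_thr=1.25, loss_thr=0.75):
    """Flag gains/losses by fixed average-ratio thresholds (illustrative cutoffs)."""
    p = np.asarray(profile, float)
    return np.where(p > gain_thr, "gain", np.where(p < loss_thr, "loss", "balanced"))

# Toy usage: two spreads of one chromosome sampled at 6 positions.
spread1 = ratio_profile([1.0, 1.1, 1.6, 1.5, 0.9, 1.0], [1.0] * 6)
spread2 = ratio_profile([1.0, 0.9, 1.4, 1.6, 1.0, 1.1], [1.0] * 6)
avg = average_profiles([spread1, spread2])
print(call_imbalances(avg), f"CVBS={cvbs(avg[[0, 1, 4, 5]]):.3f}")
```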
Lu, Min-Hua; Mao, Rui; Lu, Yin; Liu, Zheng; Wang, Tian-Fu; Chen, Si-Ping
2012-01-01
Indentation testing is a widely used approach to evaluate mechanical characteristics of soft tissues quantitatively. Young's modulus of soft tissue can be calculated from the force-deformation data with known tissue thickness and Poisson's ratio using Hayes' equation. Our group previously developed a noncontact indentation system using a water jet as a soft indenter as well as the coupling medium for the propagation of high-frequency ultrasound. The novel system has shown its ability to detect the early degeneration of articular cartilage. However, there is still a lack of a quantitative method to extract the intrinsic mechanical properties of soft tissue from water jet indentation. The purpose of this study is to investigate the relationship between the loading-unloading curves and the mechanical properties of soft tissues to provide an imaging technique of tissue mechanical properties. A 3D finite element model of water jet indentation was developed with consideration of the finite deformation effect. An improved Hayes' equation has been derived by introducing a new scaling factor which is dependent on the Poisson's ratio v, aspect ratio a/h (the radius of the indenter/the thickness of the test tissue), and deformation ratio d/h. With this model, the Young's modulus of soft tissue can be quantitatively evaluated and imaged with an error of no more than 2%. PMID:22927890
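A minimal sketch of the Young's modulus calculation via Hayes' equation for a rigid cylindrical indenter on an elastic layer, rearranged to solve for E. The scaling factor kappa is supplied by the caller; in the paper's improved form it depends on v, a/h and d/h and comes from the finite element model. All numbers below are illustrative, not the study's.

```python
def youngs_modulus_hayes(force, deformation, a, nu, kappa):
    """
    Hayes' equation for a rigid cylindrical indenter on an elastic layer,
    P = 2*a*E/(1 - nu**2) * kappa * w, rearranged for E.
    kappa is the scaling factor (here a plain number supplied by the caller;
    the paper derives it from a 3D finite element model).
    """
    return force * (1.0 - nu**2) / (2.0 * a * kappa * deformation)

# Illustrative values only: 50 mN load, 0.2 mm deformation, 1 mm jet radius.
E = youngs_modulus_hayes(force=0.05, deformation=0.2e-3, a=1e-3, nu=0.45, kappa=1.8)
print(f"estimated Young's modulus: {E / 1e3:.1f} kPa")
```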
Quantitative metal magnetic memory reliability modeling for welded joints
NASA Astrophysics Data System (ADS)
Xing, Haiyan; Dang, Yongbin; Wang, Ben; Leng, Jiancheng
2016-03-01
Metal magnetic memory (MMM) testing has been widely used to inspect welded joints. However, load levels, the environmental magnetic field, and measurement noise make MMM data dispersive and difficult to evaluate quantitatively. In order to promote the development of quantitative MMM reliability assessment, a new MMM model is presented for welded joints. Steel Q235 welded specimens were tested along longitudinal and horizontal lines by a TSC-2M-8 instrument in tensile fatigue experiments. X-ray testing was carried out synchronously to verify the MMM results. It was found that MMM testing can detect hidden cracks earlier than X-ray testing. Moreover, the MMM gradient vector sum K_vs is sensitive to the damage degree, especially at the early and hidden damage stages. Considering the dispersion of MMM data, the statistical law of K_vs was investigated, showing that K_vs obeys a Gaussian distribution. K_vs is therefore a suitable MMM parameter for establishing a reliability model of welded joints. Finally, an original quantitative MMM reliability model is presented, based on improved stress-strength interference theory. It is shown that the reliability degree R gradually decreases with decreasing residual life ratio T, and the maximal error between the predicted reliability degree R1 and the verification reliability degree R2 is 9.15%. The presented method provides a novel tool for reliability testing and evaluation of welded joints in practical engineering.
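The reliability model builds on an improved form of stress-strength interference theory with a Gaussian-distributed K_vs. A sketch of the classical Gaussian interference calculation, with illustrative inputs rather than the paper's parameters, is:

```python
from math import sqrt
from scipy.stats import norm

def interference_reliability(mu_strength, sd_strength, mu_stress, sd_stress):
    """
    Classical stress-strength interference for independent Gaussian variables:
    R = P(strength > stress) = Phi((mu_S - mu_s) / sqrt(sd_S**2 + sd_s**2)).
    In the paper's model the Gaussian-distributed gradient vector sum K_vs
    plays the role of the stress-side variable; the improved theory it uses
    goes beyond this classical form.
    """
    z = (mu_strength - mu_stress) / sqrt(sd_strength**2 + sd_stress**2)
    return norm.cdf(z)

# Illustrative units only:
print(f"R = {interference_reliability(400.0, 25.0, 300.0, 30.0):.4f}")
```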
Zhu, Pengyu; Wang, Chenguang; Huang, Kunlun; Luo, Yunbo; Xu, Wentao
2016-03-18
Digital polymerase chain reaction (PCR) has developed rapidly since it was first reported in the 1990s. However, pretreatments are often required during preparation for digital PCR, which can increase operation error. The single-plex amplification of both the target and reference genes may cause uncertainties due to the different reaction volumes and the matrix effect. In the current study, a quantitative detection system based on pretreatment-free duplex chamber digital PCR was developed. The dynamic range, limit of quantitation (LOQ), sensitivity and specificity were evaluated using the GA21 event as the experimental subject. Moreover, to determine the factors that may influence the stability of the duplex system, we evaluated whether the pretreatments, the primary and secondary structures of the probes and the SNP effect influence the detection. The results showed that the LOQ was 0.5% and the sensitivity was 0.1%. We also found that genome digestion and single nucleotide polymorphism (SNP) sites affect the detection results, whereas unspecific hybridization between different probes had little adverse effect. This indicated that the detection system is suited for both chamber-based and droplet-based digital PCR. In conclusion, we have provided a simple and flexible way of achieving absolute quantitation for genetically modified organism (GMO) genome samples using commercial digital PCR detection systems.
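For readers unfamiliar with the underlying arithmetic, absolute quantitation in chamber or droplet digital PCR follows from Poisson statistics over partitions, with the GMO content taken as the target/reference copy ratio from the duplex run. A minimal sketch, with illustrative partition counts and volume rather than data from this study:

```python
import numpy as np

def dpcr_copies_per_ul(positive, total, partition_volume_nl):
    """
    Absolute quantitation for one colour channel: Poisson correction
    lambda = -ln(1 - p) gives mean copies per partition, then scale
    by partition volume to copies per microlitre.
    """
    p = positive / total
    lam = -np.log(1.0 - p)
    return lam / (partition_volume_nl * 1e-3)

def gmo_content(pos_target, pos_reference, total, partition_volume_nl):
    """Event content as the target/reference copy ratio from a duplex run."""
    t = dpcr_copies_per_ul(pos_target, total, partition_volume_nl)
    r = dpcr_copies_per_ul(pos_reference, total, partition_volume_nl)
    return t / r

# Illustrative counts only (20,000 partitions of 0.85 nL):
print(f"GA21 content: {gmo_content(312, 15000, 20000, 0.85):.2%}")
```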
Evaluation Criteria for Nursing Student Application of Evidence-Based Practice: A Delphi Study.
Bostwick, Lina; Linden, Lois
2016-06-01
Core clinical evaluation criteria do not exist for measuring prelicensure baccalaureate nursing students' application of evidence-based practice (EBP) during direct care assignments. The study objective was to achieve consensus among EBP nursing experts to create clinical criteria for faculty to use in evaluating students' application of EBP principles. A three-round Delphi method was used. Experts were invited to participate in Web-based surveys. Data were analyzed using qualitative coding and categorizing. Quantitative analyses were descriptive calculations for rating and ranking. Expert consensus occurred in the Delphi rounds. The study provides a set of 10 core clinical evaluation criteria for faculty evaluating students' progression toward competency in their application of EBP. A baccalaureate program curriculum requiring the use of Bostwick's EBP Core Clinical Evaluation Criteria will provide a clear definition for understanding basic core EBP competence as expected for the assessment of student learning. [J Nurs Educ. 2016;55(5):336-341.]. Copyright 2016, SLACK Incorporated.
Lindén, Rolf O; Eronen, Ville-Pekka; Aittokallio, Tero
2011-03-24
High-throughput genetic screening approaches have enabled systematic means to study how interactions among gene mutations contribute to quantitative fitness phenotypes, with the aim of providing insights into the functional wiring diagrams of genetic interaction networks on a global scale. However, it is poorly known how well these quantitative interaction measurements agree across the screening approaches, which hinders their integrated use toward improving the coverage and quality of the genetic interaction maps in yeast and other organisms. Using large-scale data matrices from epistatic miniarray profiling (E-MAP), genetic interaction mapping (GIM), and synthetic genetic array (SGA) approaches, we carried out here a systematic comparative evaluation among these quantitative maps of genetic interactions in yeast. The relatively low association between the original interaction measurements or their customized scores could be improved using a matrix-based modelling framework, which enables the use of single- and double-mutant fitness estimates and measurements, respectively, when scoring genetic interactions. Toward an integrative analysis, we show how the detections from the different screening approaches can be combined to suggest novel positive and negative interactions which are complementary to those obtained using any single screening approach alone. The matrix approximation procedure has been made available to support the design and analysis of the future screening studies. We have shown here that even if the correlation between the currently available quantitative genetic interaction maps in yeast is relatively low, their comparability can be improved by means of our computational matrix approximation procedure, which will enable integrative analysis and detection of a wider spectrum of genetic interactions using data from the complementary screening approaches.
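The matrix-based scoring idea can be illustrated with the standard multiplicative epistasis definition, in which the interaction score is the deviation of double-mutant fitness from the product of single-mutant fitnesses; the paper's actual matrix approximation procedure is more involved than this. A toy sketch:

```python
import numpy as np

def epistasis_scores(double_fitness, single_fitness):
    """
    Multiplicative-model genetic interaction scores:
    eps_ab = W_ab - W_a * W_b, computed for every gene pair from a
    matrix of double-mutant fitness values and a vector of
    single-mutant fitness values.
    """
    w = np.asarray(single_fitness, float)
    expected = np.outer(w, w)          # expected double-mutant fitness
    return np.asarray(double_fitness, float) - expected

# Toy fitness data for three genes (illustrative, not from any screen):
W_single = np.array([0.9, 0.7, 1.0])
W_double = np.array([[0.81, 0.40, 0.90],
                     [0.40, 0.49, 0.72],
                     [0.90, 0.72, 1.00]])
print(epistasis_scores(W_double, W_single))  # genes 1-2 interact negatively
```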
Williams, Katherine M; Kirsh, Susan; Aron, David; Au, David; Helfrich, Christian; Lambert-Kerzner, Anne; Lowery, Julie; Battaglia, Catherine; Graham, Glenn D; Doukas, Michael; Jain, Rajiv; Ho, P Michael
2017-07-01
The Veterans Affairs Office of Specialty Care (OSC) launched four national initiatives (Electronic-Consults [e-Consults], Specialty Care Access Networks-Extension for Community Healthcare Outcomes [SCAN-ECHO], Mini-Residencies, and Specialty Care Neighborhood) to improve specialty care delivery and funded a center to evaluate the initiatives. The evaluation, guided by two implementation frameworks, provides formative (administrator/provider interviews and surveys) and summative data (quantitative data on patterns of use) about the initiatives to OSC. Evaluation of initiative implementation is assessed through CFIR (Consolidated Framework for Implementation Research)-grounded qualitative interviews to identify barriers/facilitators. Depending on high or low implementation, factors such as receiving workload credit, protected time, existing workflow/systems compatibility, leadership engagement, and access to information/resources were considered implementation barriers or facilitators. Findings were shared with OSC and used to further refine implementation at additional sites. Evaluation of other initiatives is ongoing. The mixed-methods approach has provided timely information to OSC about initiative effects and has shaped OSC policies on implementation at additional sites.
Quantification, validation, and follow-up of small bowel motility in Crohn's disease
NASA Astrophysics Data System (ADS)
Cerrolaza, Juan J.; Peng, Jennifer Q.; Safdar, Nabile M.; Conklin, Laurie; Sze, Raymond; Linguraru, Marius George
2015-03-01
The use of magnetic resonance enterography (MRE) has become a mainstay in the evaluation, assessment and follow up of inflammatory bowel diseases, such as Crohn's disease (CD), thanks to its high image quality and its non-ionizing nature. In particular, the advent of faster MRE sequences less sensitive to image-motion artifacts offers the possibility to obtain visual, structural and functional information of the patient's small bowel. However, the inherent subjectivity of the mere visual inspection of these images often hinders the accurate identification and monitoring of the pathological areas. In this paper, we present a framework that provides quantitative and objective motility information of the small bowel from free-breathing MRE dynamic sequences. After compensating for the breathing motion of the patient, we create personalized peristaltic activity maps via optical flow analysis. The result is the creation of a new set of images providing objective and precise functional information of the small bowel. The accuracy of the new method was also evaluated from two different perspectives: objective accuracy (1.1 ± 0.6 mm/s of error), i.e., the ability of the system to provide quantitative and accurate information about the motility of moving bowel landmarks, and subjective accuracy (avg. difference of 0.7 ± 0.7 in a range of 1 to 5), i.e., the degree of agreement with the subjective evaluation of an expert. Finally, the practical utility of the new method was successfully evaluated in a preliminary study with 32 studies of healthy and CD cases, showing its potential for the fast and accurate assessment and follow up of CD in the small bowel.
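A simplified stand-in for the peristaltic activity maps described here: average the per-pixel optical-flow magnitude over a breathing-compensated dynamic sequence. This sketch assumes OpenCV's Farneback dense optical flow and frames already registered to a common reference; the parameters are generic defaults, not the authors' settings.

```python
import cv2
import numpy as np

def peristaltic_activity_map(frames):
    """
    Mean optical-flow magnitude per pixel over a breathing-compensated
    dynamic sequence: a simplified stand-in for personalised peristaltic
    activity maps. `frames` is a list of 2-D uint8 arrays registered to
    a common reference.
    """
    acc = np.zeros(frames[0].shape, np.float64)
    for prev, nxt in zip(frames[:-1], frames[1:]):
        # Farneback dense flow; positional args are (pyr_scale, levels,
        # winsize, iterations, poly_n, poly_sigma, flags), generic values.
        flow = cv2.calcOpticalFlowFarneback(prev, nxt, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        acc += np.sqrt(flow[..., 0] ** 2 + flow[..., 1] ** 2)
    return acc / (len(frames) - 1)   # average displacement per frame pair
```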
Reconstruction of the dynamics of the climatic system from time-series data
Nicolis, C.; Nicolis, G.
1986-01-01
The oxygen isotope record of the last million years, as provided by a deep sea core sediment, is analyzed by a method recently developed in the theory of dynamical systems. The analysis suggests that climatic variability is the manifestation of a chaotic dynamics described by an attractor of fractal dimensionality. A quantitative measure of the limited predictability of the climatic system is provided by the evaluation of the time-correlation function and the largest positive Lyapounov exponent of the system. PMID:16593650
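The dimensionality analysis referred to here rests on time-delay embedding and the Grassberger-Procaccia correlation integral, whose small-radius slope estimates the attractor's fractal dimension. A toy sketch on a synthetic series (the real analysis would use the isotope record and careful choices of embedding dimension, delay and scaling region):

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Time-delay embedding of a scalar record into dim dimensions."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def correlation_integral(points, r):
    """Grassberger-Procaccia C(r): fraction of point pairs closer than r."""
    diff = points[:, None, :] - points[None, :, :]
    d = np.sqrt((diff ** 2).sum(axis=-1))
    iu = np.triu_indices(len(points), k=1)
    return (d[iu] < r).mean()

# Correlation dimension ~ small-r slope of log C(r) vs log r.
rng = np.random.default_rng(1)
x = np.sin(np.linspace(0, 60, 1200)) + 0.05 * rng.standard_normal(1200)  # toy series
pts = delay_embed(x, dim=4, tau=10)
radii = np.logspace(-0.8, 0.2, 8)
C = np.array([correlation_integral(pts, r) for r in radii])
ok = C > 0                                  # guard against empty shells
slope = np.polyfit(np.log(radii[ok]), np.log(C[ok]), 1)[0]
print(f"estimated correlation dimension: {slope:.2f}")
```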
Disselhorst-Klug, Catherine; Heinze, Franziska; Breitbach-Faller, Nico; Schmitz-Rode, Thomas; Rau, Günter
2012-04-01
Coordination between perception and action is required to interact with the environment successfully. This is already trained by very young infants, who perform spontaneous movements to learn how their body interacts with the environment. The strategies used by the infants for this purpose change with age. Therefore, very early progress in action control made by infants can be investigated by monitoring the development of spontaneous motor activity. In this paper, an objective method is introduced which allows the quantitative evaluation of the development of spontaneous motor activity in newborns. The introduced methodology is based on the acquisition of spontaneous movement trajectories of the feet by 3D movement analysis and subsequent calculation of specific movement parameters from them. With these movement-based parameters, it was possible to provide an objective description of age-dependent developmental steps in healthy newborns younger than 6 months. Furthermore, it has been shown that pathologies like infantile cerebral palsy influence the development of motor activity significantly. Since the introduced methodology is objective and quantitative, it is suitable for monitoring how newborns train the cognitive processes that will enable them to cope with their environment by motor interaction.
Zhao, Dan; Liu, Wei; Cai, Ailu; Li, Jingyu; Chen, Lizhu; Wang, Bing
2013-02-01
The purpose of this study was to investigate the effectiveness of quantitative evaluation of the cerebellar vermis using three-dimensional (3D) ultrasound and to establish a nomogram for Chinese fetal vermis measurements during gestation. Sonographic examinations were performed in normal fetuses and in cases with suspected vermian rotation. 3D median planes were obtained with both OMNIVIEW and tomographic ultrasound imaging. Measurements of the cerebellar vermis were highly correlated between two-dimensional and 3D median planes. The diameter of the cerebellar vermis follows a growth pattern approximately predicted by a quadratic regression equation. The normal vermis was almost parallel to the brain stem, with an average angle of <2° in normal fetuses. The average angle in the 9 cases of vermian rotation was >5°. Three-dimensional median planes are obtained more easily than two-dimensional ones, and allow accurate measurements of the cerebellar vermis. The 3D approach may enable rapid assessment of fetal cerebral anatomy in standard examination. Measurements of the cerebellar vermis may provide a quantitative index for prenatal diagnosis of posterior fossa malformations. © 2012 John Wiley & Sons, Ltd.
Manlove, Kezia; Stanley, Laura; Peck, Alyssa
2015-10-01
Quantitative evaluation of vehicle occupant protection programs is critical for ensuring efficient government resource allocation, but few methods exist for conducting evaluation across multiple programs simultaneously. Here we present an analysis of occupant protection efficacy in the state of Montana. This approach relies on seat belt compliance rates as measured by the National Occupant Protection Usage Survey (NOPUS). A hierarchical logistic regression model is used to estimate the impacts of four Montana Department of Transportation (MDT)-funded occupant protection programs used in the state of Montana, following adjustment for a suite of potential confounders. Activity from two programs, Buckle Up coalitions and media campaigns, are associated with increased seat belt use in Montana, whereas the impact of another program, Selective Traffic Enforcement, is potentially masked by other program activity. A final program, Driver's Education, is not associated with any shift in seat belt use. This method allows for a preliminary quantitative estimation of program impacts without requiring states to obtain any new seat belt use data. This approach provides states a preliminary look at program impacts, and a means for carefully planning future program allocation and investigation. Copyright © 2015 Elsevier Ltd. All rights reserved.
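As a rough sketch of the regression setup, the following fits a logistic model of individual belt use on county-level program-activity covariates. The data are synthetic, and fixed county effects only approximate the paper's hierarchical (random-effect) structure, which would need dedicated mixed-model software.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for NOPUS-style observations: one row per observed
# occupant, with county-level program-activity covariates attached.
rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "county": rng.integers(0, 10, n),
    "media": rng.integers(0, 2, n),      # media campaign active in county
    "coalition": rng.integers(0, 2, n),  # Buckle Up coalition active
})
logit_p = -0.2 + 0.4 * df["media"] + 0.3 * df["coalition"] + 0.1 * (df["county"] % 3)
df["belted"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit_p))).astype(int)

# Fixed county effects stand in for the paper's hierarchical structure.
fit = smf.logit("belted ~ media + coalition + C(county)", data=df).fit(disp=False)
print(fit.params[["media", "coalition"]])   # program-activity effect estimates
```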
Conti, Andrea A
2008-11-01
The study of the use of English for medicine has become a continual source of enquiry. The aim of this survey was the systematic evaluation of the qualitative and quantitative perception, translation and current use of English terms on the part of Italian health operators. Eight English terms directly connected with the health scenario or related to it ("compliance", "imaging", "likelihood", "odds ratio", "outcome", "stent", "test", "trial") were selected and, by means of a paper registration form, administered to forty Italian health professionals (non-physicians) already active in the health sector and attending specialised health degree courses. The participants were asked to furnish up to two translational proposals for every single English term, and the written registration was followed by a structured oral discussion of the translation, perception and everyday use of the English terms in the working reality of the participants. This survey provides a scientific "real world" experience, and its qualitative and quantitative findings are of use in evaluating the level of correctness in the adoption of English language terms on the part of health operators.
Yoshida, Tomokatsu; Yasuda, Rei; Mizuta, Ikuko; Nakagawa, Masanori; Mizuno, Toshiki
2017-01-01
Brain MRI in adult patients with Alexander disease (AxD) mainly shows atrophy in the medulla oblongata. However, currently there is no quantitative standard for assessing this atrophy. In this study, we quantitatively evaluated the brain stem of AxD patients with glial fibrillary acidic protein (GFAP) mutation using conventional MRI to evaluate its usefulness as an aid to diagnosing AxD in daily clinical practice. Nineteen AxD patients with GFAP mutation were compared with 14 patients negative for GFAP mutation in whom AxD was suspected due to "atrophy of the medulla oblongata." In the GFAP mutation-positive group, the sagittal diameter of the medulla oblongata, the ratio of the diameter of the medulla oblongata to that of the midbrain (MO/MB), and the ratio of the sagittal diameter of the medulla oblongata to that of the pons (MO/Po) were significantly smaller compared to those of the GFAP mutation-negative group (p < 0.01). The sensitivity and specificity of each parameter were 87.5 and 92.3%, 91.7 and 81.3%, and 88.2 and 100% with a sagittal diameter of the medulla oblongata <9.0 mm, MO/MB <0.60, and sagittal MO/Po <0.46, respectively. These parameters can provide very useful information to differentially diagnose AxD from other disorders associated with brain stem atrophy in adult patients. © 2017 S. Karger AG, Basel.
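The cut-offs reported in this abstract translate directly into a simple screening check; a sketch with illustrative measurements in millimetres:

```python
def axd_mri_flags(mo_sagittal_mm, midbrain_mm, pons_sagittal_mm):
    """
    Apply the cut-offs reported for GFAP-mutation-positive AxD:
    medulla oblongata sagittal diameter < 9.0 mm, MO/MB < 0.60,
    and sagittal MO/Po < 0.46. Returns the three boolean flags.
    """
    return {
        "MO < 9.0 mm": mo_sagittal_mm < 9.0,
        "MO/MB < 0.60": mo_sagittal_mm / midbrain_mm < 0.60,
        "MO/Po < 0.46": mo_sagittal_mm / pons_sagittal_mm < 0.46,
    }

# Illustrative measurements, not patient data:
print(axd_mri_flags(mo_sagittal_mm=8.2, midbrain_mm=15.0, pons_sagittal_mm=19.5))
```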
[Classical and molecular methods for identification and quantification of domestic moulds].
Fréalle, E; Bex, V; Reboux, G; Roussel, S; Bretagne, S
2017-12-01
To study the impact of the constant and inevitable inhalation of moulds, it is necessary to sample, identify and count the spores. Environmental sampling methods can be separated into three categories: surface sampling, which is easy to perform but non-quantitative; air sampling, which is easy to calibrate but provides time-limited information; and dust sampling, which is more representative of long-term exposure to moulds. The sampling strategy depends on the objectives (evaluation of the risk of exposure for individuals; quantification of household contamination; evaluation of the efficacy of remediation). The mould colonies obtained in culture are identified using microscopy, MALDI-TOF, and/or DNA sequencing. Electrostatic dust collectors are an alternative to older methods for identifying and quantifying household mould spores. They are easy to use and relatively cheap. Colony counting should be progressively replaced by quantitative real-time PCR, which is already validated, while waiting for more standardised high-throughput sequencing methods for assessment of mould contamination without technical bias. Despite some technical recommendations for obtaining reliable and comparable results, the huge diversity of environmental moulds, the variable quantity of spores inhaled and the association with other allergens (mites, plants) make the evaluation of their impact on human health difficult. Hence there is a need for reliable and generally applicable quantitative methods. Copyright © 2017 SPLF. Published by Elsevier Masson SAS. All rights reserved.
Evaluation of board performance in Iran's universities of medical sciences.
Sajadi, Haniye Sadat; Maleki, Mohammadreza; Ravaghi, Hamid; Farzan, Homayoun; Aminlou, Hasan; Hadi, Mohammad
2014-10-01
The critical role that the board plays in governance of universities clarifies the necessity of evaluating its performance. This study was aimed to evaluate the performance of the boards of medical universities and provide solutions to enhance its performance. The first phase of the present study was a qualitative research in which data were collected through face-to-face semi-structured interviews. Data were analyzed by thematic approach. The second phase was a mixed qualitative and quantitative study, with the quantitative part in cross-sectional format and the qualitative part in content analysis format. In the quantitative part, data were collected through the Ministry of Health and Medical Education (MoHME). In the qualitative part, the content of 2,148 resolutions that were selected by using a stratified sampling method were analyzed. Participants believed that the boards had no acceptable performance for a long time. Results also indicated the increasing number of meetings and resolutions of the boards in these 21 years. The boards' resolutions were mostly operational in domain and administrative in nature. The share of specific resolutions was more than the general ones. Given the current pace of change and development and the need to respond to them in a timely manner, it is recommended to accelerate the slow pace of the improvement process of the boards. It appears that more delegation and strengthening the position of the boards are effective strategies to speed up this process.
Han, Xue; Jiang, Hong; Han, Li; Xiong, Xi; He, Yanan; Fu, Chaomei; Xu, Runchun; Zhang, Dingkun; Lin, Junzhi; Yang, Ming
2018-03-01
Traditional Chinese herbs (TCH) are currently gaining attention in disease prevention and health care plans. However, their generally bitter taste hinders their use. Despite the development of a variety of taste evaluation methods, it is still a major challenge to establish a quantitative detection technique that is objective, authentic and sensitive. Based on the two-bottle preference test (TBP), we proposed a novel quantitative strategy using a standardized animal test and a unified quantitative benchmark. To reduce variability in the results, the TBP methodology was optimized. The relationship between the concentration of quinine and the animal preference index (PI) was obtained. The PI of each TCH was then measured through TBP, and the bitterness results were converted into a unified numerical system using the relationship between concentration and PI. To verify the authenticity and sensitivity of the quantified results, human sensory testing and electronic tongue testing were applied. The quantified results showed good discrimination ability. For example, the bitterness of Coptidis Rhizoma was equal to 0.0579 mg/mL quinine, and that of Nelumbinis Folium was equal to 0.0001 mg/mL. The validation results proved that the new assessment method for TCH was objective and reliable. In conclusion, this study provides an option for the quantification of bitterness and the evaluation of taste-masking effects.
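The conversion from preference index to a quinine-equivalent bitterness can be sketched as a calibration-curve fit and inversion; the calibration points below are illustrative, not the study's data.

```python
import numpy as np

def fit_quinine_curve(conc_mg_ml, pref_index):
    """Linear fit of preference index vs log10 quinine concentration."""
    slope, intercept = np.polyfit(np.log10(conc_mg_ml), pref_index, 1)
    return slope, intercept

def quinine_equivalent(pi_sample, slope, intercept):
    """Invert the calibration to express a sample's bitterness as an
    equivalent quinine concentration (mg/mL)."""
    return 10 ** ((pi_sample - intercept) / slope)

# Illustrative calibration: PI drops as quinine concentration rises.
conc = np.array([1e-4, 1e-3, 1e-2, 1e-1])
pi = np.array([0.85, 0.60, 0.35, 0.10])
s, b = fit_quinine_curve(conc, pi)
print(f"sample with PI 0.30 ~ {quinine_equivalent(0.30, s, b):.4f} mg/mL quinine")
```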
NASA Astrophysics Data System (ADS)
Boehm, H. F.; Fink, C.; Becker, C.; Reiser, M.
2007-03-01
Reliable and accurate methods for objective quantitative assessment of parenchymal alterations in the lung are necessary for diagnosis, treatment and follow-up of pulmonary diseases. Two major types of alteration are pulmonary emphysema and fibrosis, emphysema being characterized by abnormal enlargement of the air spaces distal to the terminal, nonrespiratory bronchiole, accompanied by destructive changes of the alveolar walls. The main characteristic of fibrosis is coarsening of the interstitial fibers and compaction of the pulmonary tissue. With the ability to display anatomy free from superimposing structures and with greater visual clarity, Multi-Detector CT has been shown to be more sensitive than the chest radiograph in identifying alterations of lung parenchyma. In automated evaluation of pulmonary CT scans, quantitative image processing techniques are applied for objective evaluation of the data. A number of methods have been proposed in the past, most of which utilize simple densitometric tissue features based on the mean X-ray attenuation coefficients expressed in terms of Hounsfield Units [HU]. Due to partial volume effects, most of the density-based methodologies tend to fail, particularly in cases where emphysema and fibrosis occur within narrow spatial limits. In this study, we propose a methodology based upon the topological assessment of the gray-level distribution in 3D image data of lung tissue, which provides a way of improving quantitative CT evaluation. Results are compared to the more established density-based methods.
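For contrast with the topological approach proposed here, the established density-based scores the abstract mentions reduce to simple voxel statistics. A sketch using conventional literature thresholds, which are not taken from this paper:

```python
import numpy as np

def density_indices(hu_volume, lung_mask, emph_thr=-950, fib_thr=-700):
    """
    Simple densitometric scores: fraction of lung voxels below -950 HU
    (an emphysema index) and above -700 HU (a crude high-attenuation /
    fibrosis proxy), plus the mean lung density. Thresholds are common
    literature values and purely illustrative here.
    """
    vox = np.asarray(hu_volume, float)[np.asarray(lung_mask, bool)]
    return {
        "emphysema_index": float(np.mean(vox < emph_thr)),
        "high_attenuation_fraction": float(np.mean(vox > fib_thr)),
        "mean_lung_density_HU": float(vox.mean()),
    }

# Tiny synthetic volume: mostly normal lung with one emphysematous voxel.
vol = np.full((4, 4, 4), -850.0)
vol[0, 0, 0] = -970.0
print(density_indices(vol, np.ones_like(vol, bool)))
```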
Evaluation of board performance in Iran’s universities of medical sciences
Sajadi, Haniye Sadat; Maleki, Mohammadreza; Ravaghi, Hamid; Farzan, Homayoun; Aminlou, Hasan; Hadi, Mohammad
2014-01-01
Background: The critical role that the board plays in governance of universities clarifies the necessity of evaluating its performance. This study was aimed to evaluate the performance of the boards of medical universities and provide solutions to enhance its performance. Methods: The first phase of the present study was a qualitative research in which data were collected through face-to-face semi-structured interviews. Data were analyzed by thematic approach. The second phase was a mixed qualitative and quantitative study, with the quantitative part in cross-sectional format and the qualitative part in content analysis format. In the quantitative part, data were collected through the Ministry of Health and Medical Education (MoHME). In the qualitative part, the content of 2,148 resolutions that were selected by using a stratified sampling method were analyzed. Results: Participants believed that the boards had no acceptable performance for a long time. Results also indicated the increasing number of meetings and resolutions of the boards in these 21 years. The boards’ resolutions were mostly operational in domain and administrative in nature. The share of specific resolutions was more than the general ones. Conclusion: Given the current pace of change and development and the need to respond to them in a timely manner, it is recommended to accelerate the slow pace of the improvement process of the boards. It appears that more delegation and strengthening the position of the boards are effective strategies to speed up this process. PMID:25337597
Laparoscopic training using a quantitative assessment and instructional system.
Yamaguchi, T; Nakamura, R
2018-04-28
Laparoscopic surgery requires complex surgical skills; hence, surgeons require regular training to improve their surgical techniques. The quantitative assessment of a surgeon's skills and the provision of feedback are important processes for conducting effective training. The aim of this study was to develop an inexpensive training system that provides automatic technique evaluation and feedback. We detected the instrument using image processing of commercial web camera images and calculated the motion analysis parameters (MAPs) of the instrument to quantify performance features. Upon receiving the results, we developed a method of evaluating the surgeon's skill level. The feedback system was developed using MAPs-based radar charts and scores for determining the skill level. These methods were evaluated using the videos of 38 surgeons performing a suturing task. There were significant differences in MAPs among surgeons; therefore, MAPs can be effectively used to quantify a surgeon's performance features. The results of skill evaluation and feedback differed greatly between skilled and unskilled surgeons, and it was possible to indicate points of improvement for the procedure performed in this study. Furthermore, the results obtained for certain novice surgeons were similar to those obtained for skilled surgeons. This system can be used to assess the skill level of surgeons, independent of the years of experience, and provide an understanding of the individual's current surgical skill level effectively. We conclude that our system is useful as an inexpensive laparoscopic training system that might aid in skill improvement.
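The motion analysis parameters (MAPs) are not itemised in the abstract; common laparoscopic skill metrics of this kind can be computed from a tracked 2-D instrument trajectory as in the hedged sketch below, which uses hypothetical pixel coordinates.

```python
import numpy as np

def motion_analysis_parameters(xy, fps):
    """
    Simple instrument-tip metrics from a tracked 2-D trajectory (pixels):
    path length, mean speed, and an RMS-jerk smoothness score. These are
    generic skill metrics, not necessarily the authors' MAP set.
    """
    xy = np.asarray(xy, float)
    dt = 1.0 / fps
    steps = np.diff(xy, axis=0)
    speed = np.linalg.norm(steps, axis=1) / dt
    accel = np.diff(speed) / dt
    jerk = np.diff(accel) / dt
    return {
        "path_length_px": float(np.linalg.norm(steps, axis=1).sum()),
        "mean_speed_px_s": float(speed.mean()),
        "rms_jerk": float(np.sqrt(np.mean(jerk**2))),
    }

# Hypothetical web-camera track of the instrument tip at 30 frames/s:
track = [(100, 100), (102, 101), (105, 103), (109, 104), (112, 108)]
print(motion_analysis_parameters(track, fps=30))
```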
Procedures to evaluate the efficiency of protective clothing worn by operators applying pesticide.
Espanhol-Soares, Melina; Nociti, Leticia A S; Machado-Neto, Joaquim Gonçalves
2013-10-01
The evaluation of the efficiency of whole-body protective clothing against pesticides has already been carried out through field tests and procedures defined by international standards, but there is a need to determine the useful life of these garments to ensure worker safety. The aim of this article is to compare the procedures for evaluating the efficiency of two whole-body protective garments, both new and previously used by applicators of herbicides, using a laboratory test with a mannequin and a field test with the operator. The evaluation of the efficiency of protective clothing used both quantitative and qualitative methodologies, leading to a proposal for classification according to efficiency, and determination of the useful life of protective clothing for use against pesticides, based on a quantitative assessment. The procedures used were in accordance with the standards of the modified American Society for Testing and Materials (ASTM) F 1359:2007 and International Organization for Standardization 17491-4. The protocol used in the field was World Health Organization Vector Biology and Control (VBC)/82.1. The clothing tested was personal protective clothing, both water-repellent and pesticide-protective. Two varieties of fabric were tested: Beige (100% cotton) and Camouflaged (31% polyester and 69% cotton). The efficiency in exposure control of the personal protective clothing was measured before use and after 5, 10, 20, and 30 uses and washes under field conditions. The personal protective clothing was worn by workers in the field during the application of the herbicide glyphosate on weed species in mature sugar cane plantations using a knapsack sprayer. The modified ASTM 1359:2007 procedure was chosen as the most appropriate due to its greater repeatability (lower coefficient of variation). This procedure provides the quantitative evaluation needed to determine the efficiency and useful life of individual protective clothing, not just at specific points of failure, but according to dermal protection as a whole. The qualitative assessment, which is suitable for verification of garment design and stitching flaws, does not aid in determining useful life, but does complement the quantitative evaluation. The proposed classification is appropriate and accurate for determining the useful life of personal protective clothing against pesticide materials relative to the number of uses and washes after each use. For example, the Beige garment had a useful life of 30 uses and washes, while the Camouflaged garment had a useful life of 5 uses and washes. The quantitative evaluation aids in determining the efficiency and useful life of individual protective clothing according to dermal protection as a whole, not just at specific points of failure.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-23
... has developed a proprietary SectorSAM(TM) quantitative research and evaluation process that forecasts... and short portfolios as dictated by its proprietary SectorSAM quantitative research and evaluation... a proprietary quantitative analysis, to forecast each sector's excess return within a specific time...
Fisher, Elliott S; Shortell, Stephen M; Kreindler, Sara A; Van Citters, Aricca D; Larson, Bridget K
2012-11-01
The implementation of accountable care organizations (ACOs), a new health care payment and delivery model designed to improve care and lower costs, is proceeding rapidly. We build on our experience tracking early ACOs to identify the major factors (such as contract characteristics; structure, capabilities, and activities; and local context) that would be likely to influence ACO formation, implementation, and performance. We then propose how an ACO evaluation program could be structured to guide policy makers and payers in improving the design of ACO contracts, while providing insights for providers on approaches to care transformation that are most likely to be successful in different contexts. We also propose key activities to support evaluation of ACOs in the near term, including tracking their formation, developing a set of performance measures across all ACOs and payers, aggregating those performance data, conducting qualitative and quantitative research, and coordinating different evaluation activities.
Quantitative evaluation methods of skin condition based on texture feature parameters.
Pang, Hui; Chen, Tianhua; Wang, Xiaoyi; Chang, Zhineng; Shao, Siqi; Zhao, Jing
2017-03-01
In order to quantitatively evaluate the improvement of skin condition after the use of skin care products and beauty treatments, a quantitative evaluation method for skin surface state and texture is presented which is convenient, fast and non-destructive. Human skin images were collected by image sensors. Firstly, a median filter with a 3 × 3 window is applied, and the locations of hair pixels on the skin are accurately detected according to the gray mean value and color information. Bilinear interpolation is then used to modify the gray value of the hair pixels in order to eliminate the negative effect of noise and tiny hairs on the texture. After this pretreatment, the gray level co-occurrence matrix (GLCM) is calculated. On this basis, four characteristic parameters (the second moment, contrast, entropy and correlation) and their mean values are calculated at 45° intervals. A quantitative evaluation model of skin texture based on the GLCM is established, which can calculate comprehensive parameters of skin condition. Experiments show that evaluation of skin condition with this method agrees both with evaluation methods based on biochemical indicators and with human visual experience. The method overcomes the skin damage and long waiting times of biochemical evaluation methods, as well as the subjectivity and fuzziness of visual evaluation, achieving non-destructive, rapid and quantitative evaluation of skin condition. It can be used for health assessment or classification of skin condition, and can quantitatively evaluate subtle improvements in skin condition after the use of skin care products or beauty treatments.
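A sketch of the GLCM feature extraction step using scikit-image, computing the four parameters named above at 45° intervals; entropy is computed by hand since graycoprops offers no entropy property. The hair-removal preprocessing is omitted, and the 64-level quantisation is an assumption, not the paper's setting.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
# (older scikit-image releases spell these greycomatrix/greycoprops)

def skin_texture_features(gray_img, levels=64):
    """
    GLCM features averaged over the four 45-degree directions: angular
    second moment (ASM), contrast, correlation, and entropy.
    `gray_img` is assumed to be an 8-bit grayscale image.
    """
    img = (gray_img / 256 * levels).astype(np.uint8)   # quantise to `levels`
    angles = [0, np.pi / 4, np.pi / 2, 3 * np.pi / 4]
    glcm = graycomatrix(img, distances=[1], angles=angles,
                        levels=levels, symmetric=True, normed=True)
    feats = {p: float(graycoprops(glcm, p).mean())
             for p in ("ASM", "contrast", "correlation")}
    p = glcm[:, :, 0, :]                               # (levels, levels, angle)
    feats["entropy"] = float(np.mean(-np.sum(p * np.log2(p + 1e-12), axis=(0, 1))))
    return feats

# Toy input standing in for a preprocessed skin image:
rng = np.random.default_rng(0)
print(skin_texture_features(rng.integers(0, 256, (64, 64), dtype=np.uint8)))
```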
The Numbers Tell It All: Students Don't Like Numbers!
Uttl, Bob; White, Carmela A.; Morin, Alain
2013-01-01
Undergraduate students' interest in taking quantitative vs. non-quantitative courses has received limited attention even though it has important consequences for higher education. Previous studies have collected course interest ratings at the end of the courses as part of student evaluation of teaching (SET) ratings, which may confound prior interest in taking these courses with students' actual experience of taking them. This study is the first to examine undergraduate students' interest in quantitative vs. non-quantitative courses in their first year of studies, before they have taken any quantitative courses. Three hundred and forty students were presented with descriptions of 44 psychology courses and asked to rate their interest in taking each course. Student interest in taking quantitative vs. non-quantitative courses was very low; the mean interest in statistics courses was nearly 6 SDs below the mean interest in non-quantitative courses. Moreover, women were less interested in taking quantitative courses than men. Our findings have several far-reaching implications. First, evaluating professors teaching quantitative vs. non-quantitative courses against the same SET standard may be inappropriate. Second, if the same SET standard is used for the evaluation of faculty teaching quantitative vs. non-quantitative courses, faculty are likely to teach to SETs rather than focus on student learning. Third, universities interested primarily in student satisfaction may want to expunge quantitative courses from their curricula. In contrast, universities interested in student learning may want to abandon SETs as a primary measure of faculty teaching effectiveness. Fourth, undergraduate students who are not interested in taking quantitative courses are unlikely to pursue graduate studies in quantitative psychology and unlikely to be able to competently analyze data independently. PMID:24358284
NASA Astrophysics Data System (ADS)
Li, Shunhe; Rao, Jianhua; Gui, Lin; Zhang, Weimin; Liu, Degang
2017-11-01
The result of a remanufacturing evaluation is the basis for judging whether a heavy-duty machine tool can be remanufactured at the end-of-life (EOL) stage of the machine tool lifecycle. The objectivity and accuracy of the evaluation are the key to the evaluation method. In this paper, the catastrophe progression method is introduced into the quantitative evaluation of heavy-duty machine tool remanufacturing, and the results are modified by the comprehensive adjustment method, which makes the evaluation results accord with conventional human judgment. The catastrophe progression method is used to establish a quantitative evaluation model for heavy-duty machine tools and to evaluate the remanufacturing of a retired TK6916 CNC floor milling-boring machine. The evaluation process is simple and highly quantitative, and the results are objective.
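A minimal sketch of a flat variant of catastrophe progression scoring, assuming indicators already normalised to [0, 1] and ordered by importance; the paper's comprehensive adjustment step is not reproduced here.

```python
def catastrophe_progression(indicators):
    """
    Flat catastrophe progression scoring: the i-th control variable
    (normalised to [0, 1], most important first) is transformed by a
    root of increasing order (x1**(1/2), x2**(1/3), x3**(1/4), ...),
    and the transformed values are averaged.
    """
    vals = [x ** (1.0 / (i + 2)) for i, x in enumerate(indicators)]
    return sum(vals) / len(vals)

# Illustrative normalised criteria for a retired machine tool:
print(round(catastrophe_progression([0.8, 0.6, 0.7]), 3))
```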
Synthesising quantitative and qualitative research in evidence-based patient information.
Goldsmith, Megan R; Bankhead, Clare R; Austoker, Joan
2007-03-01
Systematic reviews have, in the past, focused on quantitative studies and clinical effectiveness, while excluding qualitative evidence. Qualitative research can inform evidence-based practice independently of other research methodologies but methods for the synthesis of such data are currently evolving. Synthesising quantitative and qualitative research in a single review is an important methodological challenge. This paper describes the review methods developed and the difficulties encountered during the process of updating a systematic review of evidence to inform guidelines for the content of patient information related to cervical screening. Systematic searches of 12 electronic databases (January 1996 to July 2004) were conducted. Studies that evaluated the content of information provided to women about cervical screening or that addressed women's information needs were assessed for inclusion. A data extraction form and quality assessment criteria were developed from published resources. A non-quantitative synthesis was conducted and a tabular evidence profile for each important outcome (eg "explain what the test involves") was prepared. The overall quality of evidence for each outcome was then assessed using an approach published by the GRADE working group, which was adapted to suit the review questions and modified to include qualitative research evidence. Quantitative and qualitative studies were considered separately for every outcome. 32 papers were included in the systematic review following data extraction and assessment of methodological quality. The review questions were best answered by evidence from a range of data sources. The inclusion of qualitative research, which was often highly relevant and specific to many components of the screening information materials, enabled the production of a set of recommendations that will directly affect policy within the NHS Cervical Screening Programme. A practical example is provided of how quantitative and qualitative data sources might successfully be brought together and considered in one review.
Rota, Cristina; Biondi, Marco; Trenti, Tommaso
2011-09-26
Aution Max AX-4030, a test strip analyzer recently introduced to the market, represents an upgrade of the Aution Max AX-4280 widely employed for urinalysis. This new instrument model can accommodate two different test strips at the same time. In the present study, the two instruments were compared together with the Uriflet 9UB and the recently produced Aution Sticks 10PA urine strips, the latter presenting an additional test area for the measurement of urinary creatinine. Imprecision and correlation between instruments and strips were evaluated for chemical-physical parameters. Accuracy was evaluated for protein, glucose and creatinine by comparing the semi-quantitative results to those obtained by quantitative methods. The well-known interference effect of high ascorbic acid levels on urine glucose test strip determination was evaluated; the influence of ascorbic acid on protein and creatinine determination was also evaluated. The two instruments demonstrated comparable performance: precision and correlation between instruments and strips, evaluated for chemical-physical parameters, were always good. Furthermore, accuracy was always very good: results of protein and glucose semi-quantitative measurements were found to be highly correlated with those obtained by quantitative methods. Moreover, the semi-quantitative measurements of creatinine, employing Aution Sticks 10PA urine strips, were highly comparable with quantitative results. 10PA urine strips are eligible for urine creatinine determination, with the possibility of correcting urinalysis results for urinary creatinine concentration whenever necessary and calculating the protein/creatinine ratio. Further studies should be carried out to evaluate the effectiveness and appropriateness of the use of creatinine semi-quantitative analysis.
Microstencils to generate defined, multi-species patterns of bacteria
Timm, Collin M.; Hansen, Ryan R.; Doktycz, Mitchel J.; ...
2015-11-12
Microbial communities are complex heterogeneous systems that are influenced by physical and chemical interactions with their environment, host, and community members. Techniques that facilitate the quantitative evaluation of how microscale organization influences the morphogenesis of multispecies communities could provide valuable insights into the dynamic behavior and organization of natural communities, the design of synthetic environments for multispecies culture, and the engineering of artificial consortia. In this work, we demonstrate a method for patterning microbes into simple arrangements that allow the quantitative measurement of growth dynamics as a function of their proximity to one another. The method combines parylene-based liftoff techniques with microfluidic delivery to simultaneously pattern multiple bacterial species with high viability using low-cost, customizable methods. Furthermore, quantitative measurements of bacterial growth for two competing isolates demonstrate that spatial coordination can play a critical role in multispecies growth and structure.
Application of remote sensing to monitoring and studying dispersion in ocean dumping
NASA Technical Reports Server (NTRS)
Johnson, R. W.; Ohlhorst, C. W.
1981-01-01
Remotely sensed wide area synoptic data provides information on ocean dumping that is not readily available by other means. A qualitative approach has been used to map features, such as river plumes. Results of quantitative analyses have been used to develop maps showing quantitative distributions of one or more water quality parameters, such as suspended solids or chlorophyll a. Joint NASA/NOAA experiments have been conducted at designated dump areas in the U.S. coastal zones to determine the applicability of aircraft remote sensing systems to map plumes resulting from ocean dumping of sewage sludge and industrial wastes. A second objective is related to the evaluation of previously developed quantitative analysis techniques for studying dispersion of materials in these plumes. It was found that plumes resulting from dumping of four waste materials have distinctive spectral characteristics. The development of a technology for use in a routine monitoring system, based on remote sensing techniques, is discussed.
Woskie, Susan R; Bello, Dhimiter; Gore, Rebecca J; Stowe, Meredith H; Eisen, Ellen A; Liu, Youcheng; Sparer, Judy A; Redlich, Carrie A; Cullen, Mark R
2008-09-01
Because many occupational epidemiologic studies use exposure surrogates rather than quantitative exposure metrics, the UMass Lowell and Yale study of autobody shop workers provided an opportunity to evaluate the relative utility of surrogates and quantitative exposure metrics in an exposure-response analysis of cross-week change in respiratory function. A task-based exposure assessment was used to develop several metrics of inhalation exposure to isocyanates. The metrics included the surrogates job title, counts of spray painting events during the day, and counts of spray and bystander exposure events, plus a quantitative exposure metric that incorporated exposure determinant models based on task sampling and a personal workplace protection factor for respirator use, combined with a daily task checklist. The result of the quantitative exposure algorithm was an estimate of the daily time-weighted average respirator-corrected total NCO exposure (µg/m³). In general, agreement among these four metrics was variable, as assessed using measures such as weighted kappa and Spearman correlation. A logistic model for a 10% drop in FEV1 from Monday morning to Thursday morning was used to evaluate the utility of each exposure metric. The quantitative exposure metric was the most favorable, producing the best model fit as well as the greatest strength and magnitude of association. This finding supports the reports of others that reducing exposure misclassification can improve risk estimates that otherwise would be biased toward the null. Although detailed and quantitative exposure assessment can be more time consuming and costly, it can improve exposure-disease evaluations and is more useful for risk assessment purposes. The task-based exposure modeling method successfully produced estimates of daily time-weighted average exposures in the complex and changing autobody shop work environment. The ambient TWA exposures of all of the office workers and technicians and 57% of the painters were found to be below the current U.K. Health and Safety Executive occupational exposure limit (OEL) for total NCO of 20 µg/m³. When respirator use was incorporated, all personal daily exposures were below the U.K. OEL.
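The daily respirator-corrected TWA can be sketched as a task-weighted average in which concentrations are divided by a workplace protection factor whenever a respirator is worn. The task records and WPF value below are illustrative, not the study's determinant models.

```python
def respirator_corrected_twa(tasks, workplace_protection_factor=10.0):
    """
    Daily time-weighted average exposure from task records.
    `tasks` is a list of (concentration_ug_m3, minutes, respirator_worn)
    tuples; the concentration is divided by the workplace protection
    factor for tasks performed with a respirator. The WPF of 10 is an
    illustrative assumption.
    """
    total_dose = sum(c / (workplace_protection_factor if worn else 1.0) * t
                     for c, t, worn in tasks)
    total_time = sum(t for _, t, _ in tasks)
    return total_dose / total_time

# An illustrative 8-hour shift:
day = [(150.0, 30, True),    # spray painting, respirator on
       (25.0, 60, False),    # bystander exposure, no respirator
       (0.0, 390, False)]    # non-exposure tasks
print(f"TWA: {respirator_corrected_twa(day):.1f} ug/m3")  # below the 20 ug/m3 OEL
```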
Diffusion Lung Imaging with Hyperpolarized Gas MRI
Yablonskiy, Dmitriy A; Sukstanskii, Alexander L; Quirk, James D
2015-01-01
Lung imaging using conventional 1H MRI presents great challenges due to the low density of lung tissue, lung motion, and very fast lung tissue transverse relaxation (typical T2* is about 1-2 ms). MRI with hyperpolarized gases (3He and 129Xe) provides a valuable alternative due to the very strong signal originating from inhaled gas residing in the lung airspaces and relatively slow gas T2* relaxation (typical T2* is about 20-30 ms). In vivo human experiments, though, must be done very fast, usually during a single breath-hold. In this review we describe recent developments in diffusion lung MRI with hyperpolarized gases. We show that a combination of modeling results of gas diffusion in lung airspaces and diffusion measurements with variable diffusion-sensitizing gradients allows extracting quantitative information on the lung microstructure at the alveolar level. This approach, called in vivo lung morphometry, provides, from a less than 15-second MRI scan, quantitative values and spatial distributions of the same physiological parameters as are measured by means of the “standard” invasive stereology (mean linear intercept, surface-to-volume ratio, density of alveoli, etc.). Besides, the approach makes it possible to evaluate some advanced Weibel parameters characterizing lung microstructure: average radii of alveolar sacs and ducts, as well as the depth of their alveolar sleeves. Such measurements, providing in vivo information on the integrity of pulmonary acinar airways and their changes in different diseases, are of great importance and interest to a broad range of physiologists and clinicians. We also discuss a new type of experiment based on the in vivo lung morphometry technique combined with quantitative CT measurements as well as with gradient-echo MRI measurements of hyperpolarized gas transverse relaxation in the lung airspaces. Such experiments provide additional information on the blood vessel volume fraction, specific gas volume, and the length of acinar airways, and allow evaluation of lung parenchymal and non-parenchymal tissue. PMID:26676342
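The measurements behind lung morphometry start from diffusion attenuation of the hyperpolarized-gas signal across several diffusion-sensitizing gradient strengths. A minimal log-linear ADC fit is sketched below with illustrative b-values and signals; the full morphometry model, which maps anisotropic diffusivities onto acinar geometry, goes well beyond this.

```python
import numpy as np

def fit_adc(b_values, signals):
    """
    Log-linear fit of S(b) = S0 * exp(-b * ADC), the simplest reading of
    a variable-gradient diffusion experiment. Returns ADC in cm^2/s when
    b is given in s/cm^2.
    """
    coeffs = np.polyfit(np.asarray(b_values, float),
                        np.log(np.asarray(signals, float)), 1)
    return -coeffs[0]

b = [0.0, 1.6, 3.2, 4.8, 6.4]          # s/cm^2, a typical 3He range
s = [1.00, 0.72, 0.52, 0.37, 0.27]     # illustrative signal decay
print(f"ADC ~ {fit_adc(b, s):.3f} cm^2/s")
```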
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-18
... gear, which consists of large nets towed through the water by the vessel. At times, Chinook salmon and... vessel does its best to avoid Chinook salmon at all times while fishing for pollock and that collectively... provide a qualitative evaluation and some quantitative information on the effectiveness of the IPAs. Each...
ERIC Educational Resources Information Center
De La Rue, Lisa; Polanin, Joshua R.; Espelage, Dorothy L.; Pigott, Terri D.
2017-01-01
The incidence of violence in dating relationships has a significant impact on young people, including decreased mental and physical health. This review is the first to provide a quantitative synthesis of empirical evaluations of school-based programs implemented in middle and high schools that sought to prevent or reduce incidents of dating…
ERIC Educational Resources Information Center
Echon, Roger M.
2014-01-01
Purpose/Objectives: The purpose of this paper is to provide baseline data and characteristics of food served and consumed prior to the recently mandated nutrition standards as authorized by the Healthy, Hunger-Free Kids Act of 2010 (HHFKA). Methods: Over 600,000 school lunch menus with associated food production records from 61 elementary schools…
ERIC Educational Resources Information Center
Guardino, Caroline
2015-01-01
A national survey was conducted to determine the needs of teachers and service providers working with students who are deaf and hard of hearing with disabilities (DWD). Quantitative and qualitative questions were asked regarding knowledge of, training with, and strategies used with students who are DWD. Responses from 264 professionals working…
Assessment of umbilical artery flow and fetal heart rate to predict delivery time in bitches.
Giannico, Amália Turner; Garcia, Daniela Aparecida Ayres; Gil, Elaine Mayumi Ueno; Sousa, Marlos Gonçalves; Froes, Tilde Rodrigues
2016-10-15
The aim of this study was to quantitatively investigate the oscillation of the fetal heart rate (HR) in advance of normal delivery and to determine whether this index could be used to indicate impending delivery. In addition, fetal HR oscillation and the umbilical artery resistive index (RI) were correlated to determine whether the combination of these parameters provided a more accurate prediction of the time of delivery. Sonographic evaluation was performed in 11 pregnant bitches to evaluate the fetal HR and umbilical artery RI at the following antepartum times: 120 to 96 hours, 72 to 48 hours, 24 to 12 hours, and 12 to 1 hours. Statistical analysis indicated a correlation between the oscillation of fetal HR and the umbilical artery RI. As delivery approached, a considerable reduction in the umbilical artery RI was documented and greater oscillations between maximum and minimum HRs occurred. We conclude that the quantitative analysis of fetal HR oscillations may be used to predict the time of delivery in bitches. The combination of fetal HR and umbilical artery RI may provide a more accurate prediction of the time of delivery. Copyright © 2016 Elsevier Inc. All rights reserved.
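Both indices used in this study are simple to compute from the sonographic measurements; a sketch with illustrative values follows. The RI formula is the standard Doppler definition; the example numbers are not from the study.

```python
import numpy as np

def resistive_index(psv, edv):
    """Umbilical artery RI = (peak systolic - end diastolic) / peak systolic."""
    return (psv - edv) / psv

def hr_oscillation(fetal_hr_bpm):
    """Max-min spread of serial fetal HR measurements within one exam."""
    hr = np.asarray(fetal_hr_bpm, float)
    return hr.max() - hr.min()

# Illustrative measurements as delivery approaches:
print(f"RI = {resistive_index(psv=45.0, edv=18.0):.2f}")        # 0.60
print(f"HR spread = {hr_oscillation([228, 215, 190, 176, 232]):.0f} bpm")
```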
Delmore, Kira E; Liedvogel, Miriam
2016-01-01
The amazing accuracy of migratory orientation performance across the animal kingdom is facilitated by the use of magnetic and celestial compass systems that provide individuals with both directional and positional information. Quantitative genetics analyses in several animal systems suggest that migratory orientation has a strong genetic component. Nevertheless, the exact identity of the genes controlling orientation remains largely unknown, making it difficult to obtain an accurate understanding of this fascinating behavior on the molecular level. Here, we provide an overview of the molecular genetic techniques employed thus far, highlight the pros and cons of various approaches, generalize results from species-specific studies whenever possible, and evaluate how far the field has come since early quantitative genetics studies. We emphasize the importance of examining different levels of molecular control, and outline how future studies can take advantage of high-resolution tracking and sequencing techniques to characterize the genomic architecture of migratory orientation.
Griffey, Susan; Piccinino, Linda; Gallivan, Joanne; Lotenberg, Lynne Doner; Tuncer, Diane
2015-02-01
Since the 1970s, the federal government has spearheaded major national education programs to reduce the burden of chronic diseases in the United States. These prevention and disease management programs communicate critical information to the public, those affected by the disease, and health care providers. The National Diabetes Education Program (NDEP), the leading federal program on diabetes sponsored by the National Institutes of Health (NIH) and the Centers for Disease Control and Prevention (CDC), uses primary and secondary quantitative data and qualitative audience research to guide program planning and evaluation. Since 2006, the NDEP has filled the gaps in existing quantitative data sources by conducting its own population-based survey, the NDEP National Diabetes Survey (NNDS). The NNDS is conducted every 2–3 years and tracks changes in knowledge, attitudes and practice indicators in key target audiences. This article describes how the NDEP has used the NNDS as a key component of its evaluation framework and how it applies the survey results for strategic planning and program improvement. The NDEP's use of the NNDS illustrates how a program evaluation framework that includes periodic population-based surveys can serve as an evaluation model for similar national health education programs.
Quantitative Decision Support Requires Quantitative User Guidance
NASA Astrophysics Data System (ADS)
Smith, L. A.
2009-12-01
Is it conceivable that models run on 2007 computer hardware could provide robust and credible probabilistic information for decision support and user guidance at the ZIP code level for sub-daily meteorological events in 2060? In 2090? Retrospectively, how informative would output from today’s models have proven in 2003? Or in the 1930s? Consultancies in the United Kingdom, including the Met Office, are offering services to “future-proof” their customers from climate change. How is a US or European based user or policy maker to determine the extent to which exciting new Bayesian methods are relevant here, or when a commercial supplier is vastly overselling the insights of today’s climate science? How are policy makers and academic economists to make the closely related decisions facing them? How can we communicate deep uncertainty in the future at small length-scales without undermining the firm foundation established by climate science regarding global trends? Three distinct aspects of communicating the uses of climate model output to users and policy makers, as well as to other specialist adaptation scientists, are discussed. First, a brief scientific evaluation of the length and time scales at which climate model output is likely to become uninformative is provided, including a note on the applicability of the latest Bayesian methodology to output from current state-of-the-art general circulation models. Second, the language often employed in communicating climate model output is critically evaluated: this language accurately states that models are “better”, have “improved”, and now “include” and “simulate” relevant meteorological processes, without clearly identifying where the current information is thought to be uninformative or misleading, both for the current climate and as a function of the state of each climate simulation. Third, a general approach for evaluating the relevance of quantitative climate model output for a given problem is presented. Based on climate science, meteorology, and the details of the question in hand, this approach identifies necessary (never sufficient) conditions required for the rational use of climate model output in quantitative decision support tools. Inasmuch as climate forecasting is a problem of extrapolation, there will always be harsh limits on our ability to establish where a model is fit for purpose; this does not, however, prevent us from identifying model noise as such, and thereby avoiding some cases of the misapplication and over-interpretation of model output. It is suggested that failure to clearly communicate the limits of today’s climate models in providing quantitative, decision-relevant climate information to today’s users of climate information would risk the credibility of tomorrow’s climate science, and science-based policy more generally.
A Computational Framework for Quantitative Evaluation of Movement during Rehabilitation
NASA Astrophysics Data System (ADS)
Chen, Yinpeng; Duff, Margaret; Lehrer, Nicole; Sundaram, Hari; He, Jiping; Wolf, Steven L.; Rikakis, Thanassis
2011-06-01
This paper presents a novel generalized computational framework for quantitative kinematic evaluation of movement in a rehabilitation clinic setting. The framework integrates clinical knowledge and computational data-driven analysis together in a systematic manner. The framework provides three key benefits to rehabilitation: (a) the resulting continuous normalized measure allows the clinician to monitor movement quality on a fine scale and easily compare impairments across participants, (b) the framework reveals the effect of individual movement components on the composite movement performance, helping the clinician decide the training foci, and (c) the evaluation runs in real-time, which allows the clinician to constantly track a patient's progress and make appropriate adaptations to the therapy protocol. The creation of such an evaluation is difficult because of the sparse amount of recorded clinical observations, the high dimensionality of movement, and high variations in subjects' performance. We address these issues by modeling the evaluation function as a linear combination of multiple normalized kinematic attributes, y = Σ_i w_i φ_i(x_i), and estimating each attribute normalization function φ_i(·) by integrating distributions of idealized movement and deviated movement. The weights w_i are derived from a therapist's pairwise comparisons using a modified RankSVM algorithm. We have applied this framework to evaluate upper limb movement for stroke survivors with excellent results—the evaluation results are highly correlated to the therapist's observations.
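A minimal sketch of the composite evaluation y = Σ_i w_i φ_i(x_i) described above. The normalization functions and weights here are illustrative stand-ins: the paper derives φ_i by integrating distributions of idealized and deviated movement and learns w_i from therapist pairwise comparisons via a modified RankSVM, neither of which is reproduced here.

```python
import numpy as np

def normalize_attribute(x, ideal_mean, ideal_std):
    """Hypothetical stand-in for phi_i: maps a raw kinematic attribute to
    [0, 1], with 1 = indistinguishable from idealized movement. A Gaussian
    kernel around the idealized distribution is assumed for illustration."""
    return float(np.exp(-0.5 * ((x - ideal_mean) / ideal_std) ** 2))

def movement_score(raw_attributes, ideal_params, weights):
    """Composite evaluation y = sum_i w_i * phi_i(x_i)."""
    phis = [normalize_attribute(x, m, s)
            for x, (m, s) in zip(raw_attributes, ideal_params)]
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                       # weights normalized to sum to 1
    return float(np.dot(w, phis)), phis   # composite score and per-attribute terms

# Example: three attributes (e.g., reach duration, peak speed, trajectory error)
ideal = [(1.2, 0.2), (0.8, 0.15), (0.02, 0.01)]   # hypothetical (mean, std)
score, per_attr = movement_score([1.5, 0.7, 0.05], ideal, [0.5, 0.3, 0.2])
print(f"composite score = {score:.2f}, per-attribute = {per_attr}")
```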
Facial asymmetry quantitative evaluation in oculoauriculovertebral spectrum.
Manara, Renzo; Schifano, Giovanni; Brotto, Davide; Mardari, Rodica; Ghiselli, Sara; Gerunda, Antonio; Ghirotto, Cristina; Fusetti, Stefano; Piacentile, Katherine; Scienza, Renato; Ermani, Mario; Martini, Alessandro
2016-03-01
Facial asymmetries in oculoauriculovertebral spectrum (OAVS) patients might require surgical corrections that are mostly based on a qualitative approach and the surgeon's experience. The present study aimed to develop a quantitative 3D CT imaging-based procedure suitable for maxillo-facial surgery planning in OAVS patients. Thirteen OAVS patients (mean age 3.5 ± 4.0 years; range 0.2-14.2; 6 females) and 13 controls (mean age 7.1 ± 5.3 years; range 0.6-15.7; 5 females) who underwent head CT examination were retrospectively enrolled. Eight bilateral anatomical facial landmarks were defined on 3D CT images (porion, orbitale, most anterior point of the frontozygomatic suture, most superior point of the temporozygomatic suture, most posterior-lateral point of the maxilla, gonion, condylion, mental foramen), and distance from orthogonal planes (in millimeters) was used to evaluate the asymmetry on each axis and to calculate a global asymmetry index for each anatomical landmark. Mean asymmetry values and relative confidence intervals were obtained from the control group. OAVS patients showed 2.5 ± 1.8 landmarks above the confidence interval when considering the global asymmetry values; 12 patients (92%) showed at least one pathologically asymmetric landmark. Considering each axis, the mean number of pathologically asymmetric landmarks increased to 5.5 ± 2.6 (p = 0.002), and all patients presented at least one significant landmark asymmetry. Modern CT-based 3D reconstructions allow accurate assessment of facial bone asymmetries in patients affected by OAVS. The evaluation as a global score and in different orthogonal axes provides precise quantitative data suitable for maxillo-facial surgical planning. CT-based 3D reconstruction might allow a quantitative approach to planning and following up maxillo-facial surgery in OAVS patients.
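As an illustration of landmark-based asymmetry scoring, the sketch below assumes (since the abstract does not give the exact formula) that per-axis asymmetry is the absolute difference of the left and right landmarks' unsigned distances from the corresponding orthogonal plane, and that the global index is the Euclidean norm of the three per-axis values; all names and numbers are hypothetical.

```python
import numpy as np

def landmark_asymmetry(left_xyz, right_xyz):
    """Per-axis and global asymmetry for one bilateral landmark pair.

    left_xyz, right_xyz: landmark coordinates (mm) in a head-centered frame
    whose coordinate planes are the mid-sagittal, coronal, and axial planes.
    Assumed definition: per-axis asymmetry = |d_left - d_right|, where d is
    the unsigned distance from the respective orthogonal plane."""
    d_left = np.abs(np.asarray(left_xyz, dtype=float))
    d_right = np.abs(np.asarray(right_xyz, dtype=float))
    per_axis = np.abs(d_left - d_right)             # mm, per axis
    global_index = float(np.linalg.norm(per_axis))  # mm, combined
    return per_axis, global_index

# Example: gonion pair with 2 mm of x-asymmetry and 1 mm of z-asymmetry
per_axis, g = landmark_asymmetry([-48.0, 10.0, -30.0], [50.0, 10.0, -29.0])
print(per_axis, f"global index = {g:.2f} mm")
```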
NASA Astrophysics Data System (ADS)
Wei, Jun; Zhou, Chuan; Chan, Heang-Ping; Chughtai, Aamer; Agarwal, Prachi; Kuriakose, Jean; Hadjiiski, Lubomir; Patel, Smita; Kazerooni, Ella
2015-03-01
We are developing a computer-aided detection system to assist radiologists in detection of non-calcified plaques (NCPs) in coronary CT angiograms (cCTA). In this study, we performed quantitative analysis of arterial flow properties in each vessel branch and extracted flow information to differentiate the presence and absence of stenosis in a vessel segment. Under rest conditions, blood flow in a single vessel branch was assumed to follow Poiseuille's law. For a uniform pressure distribution, two quantitative flow features, the normalized arterial compliance per unit length (Cu) and the normalized volumetric flow (Q) along the vessel centerline, were calculated based on the parabolic Poiseuille solution. The flow features were evaluated for a two-class classification task to differentiate NCP candidates obtained by prescreening as true NCPs and false positives (FPs) in cCTA. For evaluation, a data set of 83 cCTA scans was retrospectively collected from 83 patient files with IRB approval. A total of 118 NCPs were identified by experienced cardiothoracic radiologists. The correlation between the two flow features was 0.32. The discriminatory ability of the flow features evaluated as the area under the ROC curve (AUC) was 0.65 for Cu and 0.63 for Q in comparison with AUCs of 0.56-0.69 from our previous luminal features. With stepwise LDA feature selection, volumetric flow (Q) was selected in addition to three other luminal features. With FROC analysis, the test results indicated a reduction of the FP rates to 3.14, 1.98, and 1.32 FPs/scan at sensitivities of 90%, 80%, and 70%, respectively. The study indicated that quantitative blood flow analysis has the potential to provide useful features for the detection of NCPs in cCTA.
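For reference, the parabolic Poiseuille solution underlying the two flow features gives the volumetric flow through a cylindrical vessel segment. The sketch below shows only the textbook relation Q = πΔP·r⁴/(8μL); the paper's centerline normalizations for Cu and Q are not reproduced, and the example values are hypothetical.

```python
import math

def poiseuille_flow(radius_m, length_m, delta_p_pa, viscosity_pa_s=3.5e-3):
    """Volumetric flow Q = pi * dP * r^4 / (8 * mu * L) for steady laminar
    flow in a rigid cylindrical vessel (Poiseuille's law). Default viscosity
    is a typical value for blood, ~3.5 mPa*s."""
    return math.pi * delta_p_pa * radius_m ** 4 / (8.0 * viscosity_pa_s * length_m)

# Example: 0.5-mm-radius segmental branch, 20 mm long, 400 Pa pressure drop
q = poiseuille_flow(0.5e-3, 20e-3, 400.0)
print(f"Q = {q * 1e6 * 60:.1f} mL/min")   # m^3/s -> mL/min
```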
Stockwell-Smith, G; Moyle, W; Kellett, U
2018-05-13
To evaluate the effect of a targeted community-based psychosocial intervention on self-efficacy outcomes for care recipient/carer dyads living with early-stage dementia. There is increasing interest in the role of self-efficacy and self-management structures in determining positive outcomes for people with dementia. The assumption is that care recipient/carer dyads who receive early support to identify and adjust to dementia-related changes will cope better in the long term. An explanatory sequential mixed-method design was employed: primarily quantitative, with qualitative data in a supportive secondary role to expand on and illuminate the quantitative findings. Eighty-eight dyads were recruited and allocated on a regional basis to an intervention or control group. Intervention group dyads received the Early Diagnosis Dyadic Intervention. Control group dyads received two information manuals. Quantitative data were collected at three time points. Qualitative data were collected via evaluation questionnaires and semi-structured interviews. Intervention structure, content, and delivery were acceptable to the dyads, but few quantitative self-efficacy findings reached statistical significance. Improvements in self-efficacy were evident in the post-intervention qualitative evaluation responses, where dyads expressed greater confidence in identifying and accessing community support. There is an urgent need for effective psychosocial interventions to help reduce the impact of dementia symptoms on patients, carers, and society. This study makes an important contribution to our understanding of the capacity of psychosocial interventions to improve self-efficacy outcomes for care recipient/carer dyads with early-stage dementia, while also illustrating the challenges associated with measuring self-efficacy in the early stages of the condition. This article is protected by copyright. All rights reserved.
Retinal status analysis method based on feature extraction and quantitative grading in OCT images.
Fu, Dongmei; Tong, Hejun; Zheng, Shuang; Luo, Ling; Gao, Fulin; Minar, Jiri
2016-07-22
Optical coherence tomography (OCT) is widely used in ophthalmology for viewing the morphology of the retina, which is important for disease detection and assessing therapeutic effect. The diagnosis of retinal diseases is based primarily on the subjective analysis of OCT images by trained ophthalmologists. This paper describes an automatic OCT image analysis method for computer-aided disease diagnosis, a critical part of eye fundus diagnosis. The study analyzed 300 OCT images acquired by an Optovue Avanti RTVue XR (Optovue Corp., Fremont, CA). First, a normal retinal reference model based on retinal boundaries was presented. Subsequently, two kinds of quantitative methods, based on geometric features and morphological features, were proposed. The paper puts forward a retinal-abnormality grading decision-making method, which was used in the analysis and evaluation of multiple OCT images, and shows the detailed analysis process for four retinal OCT images with different degrees of abnormality. The final grading results verified that the analysis method can distinguish abnormal severity and lesion regions. In a simulation on 150 test images, the analysis of retinal status achieved a sensitivity of 0.94 and a specificity of 0.92. The proposed method can speed up the diagnostic process and objectively evaluate retinal status. It obtains parameters and features associated with retinal morphology; quantitative analysis and evaluation of these features, combined with the reference model, enable abnormality judgment of the target image and provide a reference for disease diagnosis.
Goel, Utsav O; Maddox, Michael M; Elfer, Katherine N; Dorsey, Philip J; Wang, Mei; McCaslin, Ian Ross; Brown, J Quincy; Lee, Benjamin R
2014-01-01
Reduction of warm ischemia time during partial nephrectomy (PN) is critical to minimizing ischemic damage and improving postoperative kidney function, while maintaining tumor resection efficacy. Recently, methods for localizing the effects of warm ischemia to the region of the tumor via selective clamping of higher-order segmental artery branches have been shown to have superior outcomes compared with clamping the main renal artery. However, artery identification can prolong operative time, increase blood loss, and reduce the positive effects of selective ischemia. Quantitative diffuse reflectance spectroscopy (DRS) can provide a convenient, real-time means to aid in artery identification during laparoscopic PN. The feasibility of quantitative DRS for real-time longitudinal measurement of tissue perfusion and vascular oxygenation in laparoscopic nephrectomy was investigated in vivo in six Yorkshire swine kidneys (n = 3 animals). DRS allowed for rapid identification of ischemic areas after selective vessel occlusion. In addition, the rates of ischemia induction and recovery were compared for main renal artery versus tertiary segmental artery occlusion, and it was found that tertiary segmental artery occlusion trends toward faster recovery after ischemia, which suggests a potential benefit of selective ischemia. Quantitative DRS could provide a convenient and fast tool for artery identification and evaluation of the depth, spatial extent, and duration of selective tissue ischemia in laparoscopic PN.
Correcting power and p-value calculations for bias in diffusion tensor imaging.
Lauzon, Carolyn B; Landman, Bennett A
2013-07-01
Diffusion tensor imaging (DTI) provides quantitative parametric maps sensitive to tissue microarchitecture (e.g., fractional anisotropy, FA). These maps are estimated through computational processes and are subject to random distortions including variance and bias. Traditional statistical procedures commonly used for study planning (including power analyses and p-value/alpha-rate thresholds) specifically model variability but neglect the potential impacts of bias. Herein, we quantitatively investigate the impacts of bias in DTI on hypothesis test properties (power and alpha rate) using a two-sided hypothesis testing framework. We present a theoretical evaluation of bias on hypothesis test properties, evaluate the bias estimation technique SIMEX for DTI hypothesis testing using simulated data, and evaluate the impacts of bias on spatially varying power and alpha rates in an empirical study of 21 subjects. Bias is shown to inflate alpha rates, distort the power curve, and cause significant power loss even in empirical settings where the expected difference in bias between groups is zero. These adverse effects can be attenuated by properly accounting for bias in the calculation of power and p-values. Copyright © 2013 Elsevier Inc. All rights reserved.
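A minimal numerical sketch of the effect described above, for a two-sided one-sample z-test whose statistic carries a standardized bias b = bias/SE (names are hypothetical; the paper treats the DTI-specific case). With b = 0 the rejection rate equals the nominal alpha; any nonzero b inflates it and shifts the power curve.

```python
from scipy.stats import norm

def actual_alpha(bias_over_se, nominal_alpha=0.05):
    """Rejection rate under the null for a two-sided z-test whose statistic
    is shifted by a standardized bias b = bias/SE: P(|Z + b| > z_{a/2})."""
    z = norm.ppf(1 - nominal_alpha / 2)
    b = bias_over_se
    return norm.cdf(-z - b) + 1 - norm.cdf(z - b)

def actual_power(effect_over_se, bias_over_se, nominal_alpha=0.05):
    """Power for a standardized true effect d = effect/SE with added bias b."""
    z = norm.ppf(1 - nominal_alpha / 2)
    shift = effect_over_se + bias_over_se
    return norm.cdf(-z - shift) + 1 - norm.cdf(z - shift)

for b in (0.0, 0.5, 1.0):
    print(f"b={b:.1f}: alpha={actual_alpha(b):.3f}, "
          f"power(d=2)={actual_power(2.0, b):.3f}")
```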
An Integrated MRI and MRS Approach to Evaluation of Multiple Sclerosis with Cognitive Impairment
NASA Astrophysics Data System (ADS)
Liang, Zhengrong; Li, Lihong; Lu, Hongbing; Huang, Wei; Tudorica, Alina; Krupp, Lauren
Magnetic resonance imaging and spectroscopy (MRI/MRS) plays a unique role in multiple sclerosis (MS) evaluation because of its ability to provide both high image contrast and sensitivity to chemical changes among brain tissues. The image contrast makes it possible to quantify tissue volumetric and texture variations, e.g., cerebral atrophy and its progression speed, reflecting the ongoing destructive pathologic processes. Chemical changes reflect early signs of pathological alteration, e.g., decreased N-acetyl aspartate (NAA) in lesions and normal-appearing white matter, related to axonal damage or dysfunction. Both MRI and MRS encounter the partial volume (PV) effect, which compromises quantitative capability, especially for MRS. This work aims to develop a statistical framework to segment the tissue mixtures inside each image element, theoretically eliminating the PV effect, and to apply the framework to the evaluation of MS with cognitive impairment. The quantitative measures from MRI/MRS neuroimaging are strongly correlated with the qualitative neuropsychological scores of the Brief Repeatable Battery (BRB) test of cognitive impairment, demonstrating the usefulness of the PV image segmentation framework in this clinically significant problem.
Campbell, Norm R C; Ordunez, Pedro; DiPette, Donald J; Giraldo, Gloria P; Angell, Sonia Y; Jaffe, Marc G; Lackland, Dan; Martinez, Ramón; Valdez, Yamilé; Maldonado Figueredo, Javier I; Paccot, Melanie; Santana, Maria J; Whelton, Paul K
2018-06-01
The Pan American Health Organization (PAHO)-World Hypertension League (WHL) Hypertension Monitoring and Evaluation Framework is summarized. Standardized indicators are provided for monitoring and evaluating national or subnational hypertension control programs. Five core indicators from the World Health Organization HEARTS initiative and a single PAHO-WHL core indicator are recommended for use in all hypertension control programs. In addition, hypertension control programs are encouraged to select from 14 optional qualitative and 33 quantitative indicators to facilitate progress towards enhanced hypertension control. The intention is for hypertension programs to select quantitative indicators based on the surveillance mechanisms currently available and what is feasible, and to use the framework process indicators as a guide to program management. Programs may wish to increase or refine the number of indicators they use over time. With adaptation, the indicators can also be implemented at a community or clinic level. The standardized indicators are being pilot tested in Cuba, Colombia, Chile, and Barbados. ©2018 Wiley Periodicals, Inc.
Matsunaga, Tomoko M; Ogawa, Daisuke; Taguchi-Shiobara, Fumio; Ishimoto, Masao; Matsunaga, Sachihiro; Habu, Yoshiki
2017-06-01
Leaf color is an important indicator when evaluating plant growth and responses to biotic/abiotic stress. Acquisition of images by digital cameras allows analysis and long-term storage of the acquired images. However, under field conditions, where light intensity can fluctuate and other factors (shade, reflection, background, etc.) vary, stable and reproducible measurement and quantification of leaf color are hard to achieve. Digital scanners provide fixed conditions for obtaining image data, allowing stable and reliable comparison among samples, but require detached plant materials to capture images, and the destructive processes involved often induce deformation of the plant materials (curled leaves, faded colors, etc.). In this study, using a lightweight digital scanner connected to a mobile computer, we obtained digital image data from intact plant leaves grown in natural-light greenhouses without detaching the targets. We took images of soybean leaves infected by Xanthomonas campestris pv. glycines, and distinctively quantified two disease symptoms (brown lesions and yellow halos) using freely available image-processing software. The image data were amenable to quantitative and statistical analyses, allowing precise and objective evaluation of disease resistance.
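A minimal sketch of the kind of color-threshold symptom quantification described above, assuming hypothetical RGB ranges for the two symptom classes; the study used freely available image-processing software, and real thresholds would be tuned on actual scanner images.

```python
import numpy as np

def quantify_symptoms(rgb_image):
    """Fraction of leaf pixels classified as brown lesions or yellow halos.

    rgb_image: H x W x 3 uint8 array. All threshold ranges are hypothetical
    stand-ins for values that would be tuned on real scanner images."""
    img = rgb_image.astype(float)
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    leaf = (g > 40) | (r > 40)                          # crude background mask
    brown = leaf & (r > 90) & (r < 180) & (g > 40) & (g < 120) & (b < 80)
    yellow = leaf & (r > 160) & (g > 140) & (b < 100) & ~brown
    n_leaf = max(int(leaf.sum()), 1)
    return brown.sum() / n_leaf, yellow.sum() / n_leaf

# Example on a synthetic 100x100 "leaf" patch
img = np.zeros((100, 100, 3), dtype=np.uint8)
img[...] = (30, 120, 40)             # healthy green
img[40:60, 40:60] = (140, 80, 40)    # brown lesion
img[30:40, 40:60] = (200, 180, 60)   # yellow halo
lesion_frac, halo_frac = quantify_symptoms(img)
print(f"lesion: {lesion_frac:.2%}, halo: {halo_frac:.2%}")
```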
Li, Yuan; Mao, Liangang; Yan, Dongdong; Ma, Taotao; Shen, Jin; Guo, Meixia; Wang, Qiuxia; Ouyang, Canbin; Cao, Aocheng
2014-11-01
Two soil fumigants, chloropicrin (CP) and dimethyl disulfide (DMDS), were used to control Fusarium wilt disease (FWD) which caused large economic losses in strawberries. The fumigants were evaluated alone and in combination in a laboratory study and in strawberry greenhouses. Laboratory tests found that combinations of CP and DMDS indicated a positive synergistic activity on Fusarium oxysporum. A newly developed quantitative assay for F. oxysporum involving real-time PCR was used successfully to evaluate F. oxysporum control by the fumigants; it provided similar results to the selective medium but was less time-consuming and less labor intensive. Greenhouse trials revealed that the combination of CP and DMDS successfully suppressed the incidence of FWD and sharply reduced the population density of F. oxysporum, which significantly increased fruit branch number and maintained a good strawberry yield, higher than methyl bromide (MB) treatment. All of the treatments provided significantly better results than the non-treated control. This study confirms that the newly developed real-time PCR quantitative assay for F. oxysporum was suitable for the control efficacy evaluation of soil fumigants and that the novel fumigant combination of CP and DMDS offers a promising effective alternative to MB for the control of F. oxysporum in strawberry greenhouses. © 2013 Society of Chemical Industry.
Vosbergen, Sandra; Laan, Eva K; Colkesen, Ersen B; Niessen, Maurice A J; Kraaijenhagen, Roderik A; Essink-Bot, Marie-Louise; Peek, Niels
2012-10-30
Web technology is increasingly being used to provide individuals with health risk assessments (HRAs) with tailored feedback. End-user satisfaction is an important determinant of the potential impact of HRAs, as this influences program attrition and adherence to behavioral advice. The aim of this study was to evaluate end-user satisfaction with a web-based HRA with tailored feedback applied in worksite settings, using mixed (quantitative and qualitative) methods. Employees of seven companies in the Netherlands participated in a commercial, web-based HRA with tailored feedback. The HRA consisted of four components: 1) a health and lifestyle assessment questionnaire, 2) a biometric evaluation, 3) a laboratory evaluation, and 4) tailored feedback consisting of a personal health risk profile and lifestyle behavior advice communicated through a web portal. HRA respondents received an evaluation questionnaire after six weeks. Satisfaction with different parts of the HRA was measured on 5-point Likert scales. A free-text field provided the opportunity to make additional comments. In total, 2289 employees participated in the HRA program, of which 637 (27.8%) completed the evaluation questionnaire. Quantitative analysis showed that 85.6% of the respondents evaluated the overall HRA positively. The free-text field was filled in by 29.7% of the respondents (189 out of 637), who made 315 separate remarks. Qualitative evaluation of these data showed that these respondents made critical remarks. Respondents felt restricted by the answer categories of the health and lifestyle assessment questionnaire, which resulted in the feeling that the corresponding feedback could be inadequate. Some respondents perceived the personal risk profile as unnecessarily alarming or suggested providing more explanations, reference values, and a justification of the behavioral advice given. Respondents also requested the opportunity to discuss the feedback with a health professional. Most people were satisfied with the web-based HRA with tailored feedback. Sources of dissatisfaction were limited opportunities for providing additional health information outside of the predefined health and lifestyle assessment questionnaire and insufficient transparency on the generation of the feedback. Information regarding the aim and content of the HRA should be clear and accurate to prevent unrealistic expectations among end-users. Involving trusted health professionals in the implementation of web-based HRAs may enhance the use of and confidence in the HRA.
An online credit evaluation method based on AHP and SPA
NASA Astrophysics Data System (ADS)
Xu, Yingtao; Zhang, Ying
2009-07-01
Online credit evaluation is the foundation for the establishment of trust and for the management of risk between buyers and sellers in e-commerce. In this paper, a new credit evaluation method based on the analytic hierarchy process (AHP) and set pair analysis (SPA) is presented to determine the credibility of electronic commerce participants. It addresses some of the drawbacks found in classical credit evaluation methods and broadens the scope of current approaches. Both qualitative and quantitative indicators are considered in the proposed method, and an overall credit score is then obtained from the optimal perspective. Finally, a case analysis of China Garment Network is provided for illustrative purposes.
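A minimal sketch of the AHP weighting step assumed by such a method: criterion weights are taken as the normalized principal eigenvector of a pairwise comparison matrix, checked with Saaty's consistency ratio. The criteria and matrix values below are hypothetical.

```python
import numpy as np

RANDOM_INDEX = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}  # Saaty's RI table

def ahp_weights(pairwise):
    """Criterion weights from an AHP pairwise comparison matrix via the
    principal eigenvector, plus Saaty's consistency ratio CR = CI / RI."""
    A = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)                  # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                 # normalize to sum to 1
    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)         # consistency index
    cr = ci / RANDOM_INDEX[n] if RANDOM_INDEX[n] > 0 else 0.0
    return w, cr

# Example: 3 credit criteria, e.g. transaction history, complaints, delivery
A = [[1.0,   3.0, 5.0],
     [1/3.0, 1.0, 2.0],
     [1/5.0, 1/2.0, 1.0]]
w, cr = ahp_weights(A)
print(f"weights = {np.round(w, 3)}, CR = {cr:.3f}  (CR < 0.1 is acceptable)")
```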
Performance evaluation methodology for historical document image binarization.
Ntirogiannis, Konstantinos; Gatos, Basilis; Pratikakis, Ioannis
2013-02-01
Document image binarization is of great importance in the document image analysis and recognition pipeline since it affects further stages of the recognition process. The evaluation of a binarization method aids in studying its algorithmic behavior, as well as verifying its effectiveness, by providing qualitative and quantitative indications of its performance. This paper presents a pixel-based binarization evaluation methodology for historical handwritten/machine-printed document images. In the proposed evaluation scheme, the recall and precision evaluation measures are properly modified using a weighting scheme that diminishes any potential evaluation bias. Additional performance metrics of the proposed evaluation scheme consist of the percentage rates of broken and missed text, false alarms, background noise, character enlargement, and merging. Several experiments conducted in comparison with other pixel-based evaluation measures demonstrate the validity of the proposed evaluation scheme.
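For orientation, a minimal sketch of unweighted pixel-based recall, precision, and F-measure against a ground-truth image; the paper's weighting scheme, which modifies these counts to diminish evaluation bias, is not reproduced here.

```python
import numpy as np

def binarization_scores(result, ground_truth):
    """Pixel-based recall, precision, and F-measure for a binarized image.
    Both inputs are boolean arrays with True = foreground (text) pixels."""
    tp = np.logical_and(result, ground_truth).sum()
    fp = np.logical_and(result, ~ground_truth).sum()
    fn = np.logical_and(~result, ground_truth).sum()
    recall = tp / (tp + fn) if tp + fn else 0.0
    precision = tp / (tp + fp) if tp + fp else 0.0
    f = (2 * recall * precision / (recall + precision)
         if recall + precision else 0.0)
    return recall, precision, f

# Example: 1-pixel dilation error around a small ground-truth stroke
gt = np.zeros((10, 10), dtype=bool); gt[4:6, 2:8] = True
res = np.zeros_like(gt); res[3:6, 2:8] = True
print("recall=%.2f precision=%.2f F=%.2f" % binarization_scores(res, gt))
```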
Development and application of the RE-AIM QuEST mixed methods framework for program evaluation.
Forman, Jane; Heisler, Michele; Damschroder, Laura J; Kaselitz, Elizabeth; Kerr, Eve A
2017-06-01
To increase the likelihood of successful implementation of interventions and promote dissemination across real-world settings, it is essential to evaluate outcomes related to dimensions other than Effectiveness alone. Glasgow and colleagues' RE-AIM framework specifies four additional types of outcomes that are important to decision-makers: Reach, Adoption, Implementation (including cost), and Maintenance. To further strengthen RE-AIM, we propose integrating qualitative assessments in an expanded framework: RE-AIM Qualitative Evaluation for Systematic Translation (RE-AIM QuEST), a mixed methods framework. RE-AIM QuEST guides formative evaluation to identify real-time implementation barriers and explain how implementation context may influence translation to additional settings. RE-AIM QuEST was used to evaluate a pharmacist-led hypertension management intervention at 3 VA facilities in 2008-2009. We systematically reviewed each of the five RE-AIM dimensions and created open-ended companion questions to quantitative measures and identified qualitative and quantitative data sources, measures, and analyses. To illustrate use of the RE-AIM QuEST framework, we provide examples of real-time, coordinated use of quantitative process measures and qualitative methods to identify site-specific issues, and retrospective use of these data sources and analyses to understand variation across sites and explain outcomes. For example, in the Reach dimension, we conducted real-time measurement of enrollment across sites and used qualitative data to better understand and address barriers at a low-enrollment site. The RE-AIM QuEST framework may be a useful tool for improving interventions in real-time, for understanding retrospectively why an intervention did or did not work, and for enhancing its sustainability and translation to other settings.
Tu, Chengjian; Shen, Shichen; Sheng, Quanhu; Shyr, Yu; Qu, Jun
2017-01-30
Reliable quantification of low-abundance proteins in complex proteomes is challenging, largely owing to the limited number of spectra/peptides identified. In this study we developed a straightforward method to improve the quantitative accuracy and precision of proteins by strategically retrieving the less confident peptides that were previously filtered out using the standard target-decoy search strategy. The filtered-out MS/MS spectra matched to confidently identified proteins were recovered, and the peptide-spectrum-match (PSM) FDR was recalculated and controlled at a confident level of FDR ≤ 1%, while protein FDR was maintained at ~1%. We evaluated the performance of this strategy in both spectral count- and ion current-based methods. Increases of >60% in total quantified spectra/peptides were achieved for a spike-in sample set and a public CPTAC dataset, respectively. Incorporating the peptide retrieval strategy significantly improved quantitative accuracy and precision, especially for low-abundance proteins (e.g., one-hit proteins). Moreover, the capacity to confidently discover significantly altered proteins was also enhanced substantially, as demonstrated with two spike-in datasets. In summary, improved quantitative performance was achieved by this peptide recovery strategy without compromising the confidence of protein identification; it can be readily implemented in a broad range of quantitative proteomics techniques, including label-free and labeling approaches. We hypothesized that more quantifiable spectra and peptides per protein, even including less confident peptides, could help reduce variation and improve protein quantification. Hence the peptide retrieval strategy was developed and evaluated in two spike-in sample sets with different LC-MS/MS variations using both MS1- and MS2-based quantitative approaches. The list of confidently identified proteins from the standard target-decoy search strategy was fixed, and additional, less confident spectra/peptides matched to these confident proteins were retrieved. The total PSM FDR after retrieval was still controlled at a confident level of FDR ≤ 1%. As expected, the penalty for occasionally incorporating incorrect peptide identifications is negligible compared with the improvements in quantitative performance. More quantifiable peptides, a lower missing-value rate, and better quantitative accuracy and precision were achieved for the same protein identifications by this simple strategy. This strategy is theoretically applicable to any quantitative approach in proteomics and thereby provides more quantitative information, especially on low-abundance proteins. Published by Elsevier B.V.
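A minimal sketch of the target-decoy FDR control underlying the retrieval step, assuming the simple estimator FDR ≈ #decoys/#targets above a score threshold (names and scores are hypothetical; the study additionally restricts retrieved PSMs to proteins already identified at ~1% protein FDR).

```python
def psm_fdr_threshold(psms, max_fdr=0.01):
    """Find the lowest score threshold at which estimated PSM FDR <= max_fdr.

    psms: list of (score, is_decoy) tuples from a target-decoy search.
    Returns (threshold, accepted target PSMs). FDR is estimated as
    #decoys / #targets among PSMs scoring at or above the threshold."""
    ranked = sorted(psms, key=lambda p: p[0], reverse=True)
    best_threshold, n_targets, n_decoys = None, 0, 0
    for score, is_decoy in ranked:
        n_decoys += is_decoy
        n_targets += not is_decoy
        if n_targets and n_decoys / n_targets <= max_fdr:
            best_threshold = score          # deepest score still under max_fdr
    accepted = [p for p in ranked
                if best_threshold is not None
                and p[0] >= best_threshold and not p[1]]
    return best_threshold, accepted

# Example: retrieving previously filtered-out PSMs at a relaxed cut
psms = [(9.1, False), (8.7, False), (8.2, True), (7.9, False),
        (7.5, False), (7.1, False), (6.8, True), (6.5, False)] * 30
thr, kept = psm_fdr_threshold(psms, max_fdr=0.2)
print(f"threshold = {thr}, accepted targets = {len(kept)}")
```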
Towards evidence-based practice in medical training: making evaluations more meaningful.
Drescher, Uta; Warren, Fiona; Norton, Kingsley
2004-12-01
The evaluation of training is problematic and the evidence base inconclusive. This situation may arise for 2 main reasons: training is not understood as a complex intervention and, related to this, the evaluation methods applied are often overly simplistic. This paper makes the case for construing training, especially in the field of specialist medical education, as a complex intervention. It also selectively reviews the available literature in order to match evaluative techniques with the demonstrated complexity. Construing training as a complex intervention can provide a framework for selecting the most appropriate methodology to evaluate a given training intervention and to appraise the evidence base for training fairly, choosing from among both quantitative and qualitative approaches and applying measurement at multiple levels of training impact.
Evaluating digital libraries in the health sector. Part 2: measuring impacts and outcomes.
Cullen, Rowena
2004-03-01
This is the second part of a two-part paper which explores methods that can be used to evaluate digital libraries in the health sector. Part 1 focuses on approaches to evaluation that have been proposed for mainstream digital information services. This paper investigates evaluative models developed for some innovative digital library projects, and some major national and international electronic health information projects. The value of ethnographic methods to provide qualitative data to explore outcomes, adding to quantitative approaches based on inputs and outputs is discussed. The paper concludes that new 'post-positivist' models of evaluation are needed to cover all the dimensions of the digital library in the health sector, and some ways of doing this are outlined.
Medical privacy protection based on granular computing.
Wang, Da-Wei; Liau, Churn-Jung; Hsu, Tsan-Sheng
2004-10-01
Based on granular computing methodology, we propose two criteria to quantitatively measure privacy invasion. The total cost criterion measures the effort needed for a data recipient to find private information. The average benefit criterion measures the benefit a data recipient obtains when receiving the released data. These two criteria remedy the inadequacy of the deterministic privacy formulation proposed in Proceedings of Asia Pacific Medical Informatics Conference, 2000 and Int J Med Inform 2003;71:17-23. Granular computing methodology provides a unified framework for these quantitative measurements and the previous bin-size and logical approaches. The two new criteria are implemented in the prototype system Cellsecu 2.0. A preliminary system performance evaluation is conducted and reviewed.
Gámez-Cenzano, Cristina; Pino-Sorroche, Francisco
2014-04-01
There is a growing interest in using quantification in FDG-PET/CT in oncology, especially for evaluating response to therapy. Complex, fully quantitative procedures with blood sampling and dynamic scanning have been clinically replaced by the use of standardized uptake value (SUV) measurements, which provide an index of regional tracer uptake normalized to the administered dose of FDG. Some approaches have been proposed for assessing quantitative metabolic response, such as the EORTC and PERCIST criteria in solid tumors. When using the SUV in clinical routine and multicenter trials, standardization of protocols and quality-control procedures for instrumentation is required. Copyright © 2014 Elsevier Inc. All rights reserved.
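For reference, the body-weight-normalized SUV can be computed as below; a minimal sketch assuming an 18F tracer and decay correction of the injected dose to scan time (variable names are hypothetical).

```python
import math

F18_HALF_LIFE_MIN = 109.77  # physical half-life of 18F

def suv_bw(tissue_kbq_per_ml, injected_dose_mbq, body_weight_kg,
           minutes_since_injection=0.0):
    """Body-weight SUV = tissue activity concentration / (decay-corrected
    injected dose / body weight). With kBq/mL, MBq, and kg, the units
    cancel because 1 mL of tissue is taken as ~1 g."""
    decayed_dose_kbq = (injected_dose_mbq * 1000.0 *
                        math.exp(-math.log(2) * minutes_since_injection
                                 / F18_HALF_LIFE_MIN))
    return tissue_kbq_per_ml / (decayed_dose_kbq / (body_weight_kg * 1000.0))

# Example: 5 kBq/mL lesion, 370 MBq injected, 70 kg patient, 60 min uptake
print(f"SUV = {suv_bw(5.0, 370.0, 70.0, 60.0):.2f}")
```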
Quantitative experiments to explain the change of seasons
NASA Astrophysics Data System (ADS)
Testa, Italo; Busarello, Gianni; Puddu, Emanuella; Leccia, Silvio; Merluzzi, Paola; Colantonio, Arturo; Moretti, Maria Ida; Galano, Silvia; Zappia, Alessandro
2015-03-01
The science education literature shows that students have difficulty understanding what causes the seasons. Incorrect explanations are often due to a lack of knowledge about the physical mechanisms underlying this phenomenon. To address this, we present a module in which students engage in quantitative measurements with a photovoltaic panel to explain changes in the flux of solar radiation at Earth's surface over the year. The activities also provide examples of energy transfers between the incoming radiation and the environment to introduce basic features of Earth's climate. The module was evaluated with 45 secondary school students (aged 17-18) using a pre-/post-test research design. Analysis of students' learning outcomes supports the effectiveness of the proposed activities.
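A minimal sketch of the quantitative relation such a module targets: the noon-time flux on a horizontal surface scales with the sine of the solar elevation, which varies over the year with the solar declination. The formulas are standard approximations, not the module's own materials.

```python
import math

def solar_declination_deg(day_of_year):
    """Approximate solar declination (degrees) for a given day of year."""
    return -23.44 * math.cos(2 * math.pi * (day_of_year + 10) / 365.0)

def noon_relative_flux(latitude_deg, day_of_year):
    """Relative flux on a horizontal surface at local solar noon:
    sin(elevation), with elevation = 90 - |latitude - declination|."""
    decl = solar_declination_deg(day_of_year)
    elevation = 90.0 - abs(latitude_deg - decl)
    return max(0.0, math.sin(math.radians(elevation)))

# Naples (~41 N): compare the solstices
for name, day in (("summer solstice", 172), ("winter solstice", 355)):
    print(f"{name}: relative flux = {noon_relative_flux(40.8, day):.2f}")
```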
Aggregating post-publication peer reviews and ratings
Florian, Răzvan V.
2012-01-01
Allocating funding for research often entails the review of the publications authored by a scientist or a group of scientists. For practical reasons, in many cases this review cannot be performed by a sufficient number of specialists in the core domain of the reviewed publications. Meanwhile, each scientist thoroughly reads, on average, about 88 scientific articles per year, and the evaluative information that scientists could provide about these articles is currently lost. I suggest that aggregating in an online database reviews or ratings of the publications that scientists read anyhow can provide important information that can revolutionize the evaluation processes that support funding decisions. I also suggest that such aggregation of reviews can be encouraged by a system that would provide a publicly available review portfolio for each scientist, without prejudicing the anonymity of reviews. I provide some quantitative estimates of the number and distribution of reviews and ratings that can be obtained. PMID:22661941
Assaying macrophage activity in a murine model of inflammatory bowel disease using fluorine-19 MRI
Kadayakkara, Deepak K; Ranganathan, Sarangarajan; Young, Won-Bin; Ahrens, Eric T
2012-01-01
Macrophages have an important role in the pathogenesis of most chronic inflammatory diseases. A means of non-invasively quantifying macrophage migration would contribute significantly towards our understanding of chronic inflammatory processes and aid the evaluation of novel therapeutic strategies. We describe the use of a perfluorocarbon tracer reagent and in vivo 19F magnetic resonance imaging (MRI) to quantify macrophage burden longitudinally. We apply these methods to evaluate the severity and three-dimensional distribution of macrophages in a murine model of inflammatory bowel disease (IBD). MRI results were validated by histological analysis, immunofluorescence and quantitative real-time polymerase chain reaction. Selective depletion of macrophages in vivo was also performed, further validating that macrophage accumulation of perfluorocarbon tracers was the basis of 19F MRI signals observed in the bowel. We tested the effects of two common clinical drugs, dexamethasone and cyclosporine A, on IBD progression. Whereas cyclosporine A provided mild therapeutic effect, unexpectedly dexamethasone enhanced colon inflammation, especially in the descending colon. Overall, 19F MRI can be used to evaluate early-stage inflammation in IBD and is suitable for evaluating putative therapeutics. Due to its high macrophage specificity and quantitative ability, we envisage 19F MRI having an important role in evaluating a wide range of chronic inflammatory conditions mediated by macrophages. PMID:22330343
Quantitative image quality evaluation of MR images using perceptual difference models
Miao, Jun; Huo, Donglai; Wilson, David L.
2008-01-01
The authors are using a perceptual difference model (Case-PDM) to quantitatively evaluate the image quality of the thousands of test images that can be created when optimizing fast magnetic resonance (MR) imaging strategies and reconstruction techniques. In this validation study, they compared human evaluation of MR images from multiple organs and from multiple image reconstruction algorithms to Case-PDM and similar models. The authors found that Case-PDM compared very favorably to human observers in double-stimulus continuous-quality scale and functional measurement theory studies over a large range of image quality. The Case-PDM threshold for nonperceptible differences in a 2-alternative forced choice study varied with the type of image under study, but was ≈1.1 for diffuse image effects, providing a rule of thumb. Ordering the image quality evaluation models, the authors found, overall, Case-PDM ≈ IDM (Sarnoff Corporation) ≈ SSIM [Wang et al. IEEE Trans. Image Process. 13, 600–612 (2004)] > mean squared error ≈ NR [Wang et al. (2004) (unpublished)] > DCTune (NASA) > IQM (MITRE Corporation). The authors conclude that Case-PDM is very useful in MR image evaluation but that one should probably restrict studies to similar images and similar processing, normally not a limitation in image reconstruction studies. PMID:18649487
NASA Astrophysics Data System (ADS)
Eck, Brendan L.; Fahmi, Rachid; Levi, Jacob; Fares, Anas; Wu, Hao; Li, Yuemeng; Vembar, Mani; Dhanantwari, Amar; Bezerra, Hiram G.; Wilson, David L.
2016-03-01
Myocardial perfusion imaging using CT (MPI-CT) has the potential to provide quantitative measures of myocardial blood flow (MBF) that can aid the diagnosis of coronary artery disease. We evaluated the quantitative accuracy of MPI-CT in a porcine model of balloon-induced LAD coronary artery ischemia guided by fractional flow reserve (FFR). We quantified MBF at baseline (FFR = 1.0) and under moderate ischemia (FFR = 0.7) using MPI-CT and compared to fluorescent microsphere-based MBF from high-resolution cryo-images. Dynamic, contrast-enhanced CT images were obtained using a spectral detector CT (Philips Healthcare). Projection-based mono-energetic images were reconstructed and processed to obtain MBF. Three MBF quantification approaches were evaluated: singular value decomposition (SVD) with fixed Tikhonov regularization (ThSVD), SVD with regularization determined by the L-curve criterion (LSVD), and Johnson-Wilson parameter estimation (JW). All three approaches over-estimated MBF compared to cryo-images. JW produced the most accurate MBF, with average error 33.3 ± 19.2 mL/min/100g, whereas LSVD and ThSVD showed greater over-estimation, 59.5 ± 28.3 mL/min/100g and 78.3 ± 25.6 mL/min/100g, respectively. Relative blood flow as assessed by a flow ratio of LAD-to-remote myocardium was strongly correlated between JW and cryo-imaging, with R2 = 0.97, compared to R2 = 0.88 and 0.78 for LSVD and ThSVD, respectively. We assessed the tissue impulse response functions (IRFs) from each approach for sources of error. While JW was constrained to physiologic solutions, both LSVD and ThSVD produced IRFs with non-physiologic properties due to noise. The L-curve provided noise-adaptive regularization but did not eliminate non-physiologic IRF properties or optimize for MBF accuracy. These findings suggest that model-based MPI-CT approaches may be more appropriate for quantitative MBF estimation and that cryo-imaging can support the development of MPI-CT by providing spatial distributions of MBF.
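A minimal sketch of the SVD deconvolution step shared by the compared approaches, assuming the discrete convolution model C_tissue = Δt·A·k with A built from the arterial input function and k = MBF·R(t); the singular-value truncation below is a simple stand-in for the Tikhonov and L-curve filters evaluated in the study, and all data are synthetic.

```python
import numpy as np

def mbf_svd(aif, tissue_curve, dt_s, rel_threshold=0.05):
    """Estimate myocardial blood flow (MBF) by regularized SVD deconvolution.

    Model: C_tissue = dt * A @ k, where A is the lower-triangular convolution
    matrix built from the arterial input function (AIF) and k(t) = MBF * R(t)
    is the flow-scaled impulse response. Singular values below
    rel_threshold * s_max are zeroed (simple truncation regularization)."""
    n = len(aif)
    A = dt_s * np.array([[aif[i - j] if i >= j else 0.0
                          for j in range(n)] for i in range(n)])
    U, s, Vt = np.linalg.svd(A)
    s_inv = np.where(s > rel_threshold * s.max(), 1.0 / s, 0.0)
    k = Vt.T @ (s_inv * (U.T @ tissue_curve))  # flow-scaled impulse response
    return k.max() * 60.0 * 100.0              # 1/s -> mL/min/100g (unit density)

# Synthetic noise-free example: gamma-variate AIF, exponential residue,
# true MBF = 1.0 mL/g/min (i.e., 100 mL/min/100g)
dt = 1.0                                       # s
t = np.arange(0.0, 30.0, dt)
aif = 8.0 * (t / 6.0) ** 3 * np.exp(-t / 2.0)
k_true = (1.0 / 60.0) * np.exp(-t / 8.0)       # MBF * R(t), per second
tissue = dt * np.convolve(aif, k_true)[:len(t)]
print(f"estimated MBF = {mbf_svd(aif, tissue, dt):.0f} mL/min/100g")
```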
Evaluation of background parenchymal enhancement on breast MRI: a systematic review
Signori, Alessio; Valdora, Francesca; Rossi, Federica; Calabrese, Massimo; Durando, Manuela; Mariscotto, Giovanna; Tagliafico, Alberto
2017-01-01
Objective: To perform a systematic review of the methods used for background parenchymal enhancement (BPE) evaluation on breast MRI. Methods: Studies dealing with BPE assessment on breast MRI were retrieved from major medical libraries independently by four reviewers up to 6 October 2015. The keywords used for database searching were “background parenchymal enhancement”, “parenchymal enhancement”, “MRI” and “breast”. Studies were included if qualitative and/or quantitative methods for BPE assessment were described. Results: Of the 420 studies identified, a total of 52 articles were included in the systematic review. 28 studies performed only a qualitative assessment of BPE, 13 studies performed only a quantitative assessment and 11 studies performed both qualitative and quantitative assessments. A wide heterogeneity was found in the MRI sequences and in the quantitative methods used for BPE assessment. Conclusion: Wide variability exists in the quantitative evaluation of BPE on breast MRI. More studies focused on a reliable and comparable method for quantitative BPE assessment are needed. Advances in knowledge: More studies focused on quantitative BPE assessment are needed. PMID:27925480
Design and control of RUPERT: a device for robotic upper extremity repetitive therapy.
Sugar, Thomas G; He, Jiping; Koeneman, Edward J; Koeneman, James B; Herman, Richard; Huang, H; Schultz, Robert S; Herring, D E; Wanberg, J; Balasubramanian, Sivakumar; Swenson, Pete; Ward, Jeffrey A
2007-09-01
The structural design, control system, and integrated biofeedback for a wearable exoskeletal robot for upper extremity stroke rehabilitation are presented. Assisted by clinical evaluation, designers, engineers, and scientists have built a device for robotic-assisted upper extremity repetitive therapy (RUPERT). Intense, repetitive physical rehabilitation has been shown to be beneficial in overcoming upper extremity deficits, but the therapy is labor-intensive, expensive, and difficult to evaluate quantitatively and objectively. RUPERT was developed to provide a low-cost, safe, easy-to-use robotic device to assist the patient and therapist in achieving more systematic therapy at home or in the clinic. RUPERT has four actuated degrees of freedom driven by compliant and safe pneumatic muscles (PMs) at the shoulder, elbow, and wrist. They are programmed to actuate the device to extend the arm and move the arm in 3-D space. It is important to note that gravity is not compensated, so daily tasks are practiced in a natural setting. Because the device is wearable and lightweight, it can be worn standing or sitting, providing therapy tasks that better mimic activities of daily living. The sensors feed back position and force information for quantitative evaluation of task performance. The device can also provide real-time, objective assessment of functional improvement. We have tested the device on stroke survivors performing two critical activities of daily living (ADL): reaching out and self-feeding. Future improvements to the device involve increased degrees of freedom and interactive control to adapt to a user's physical condition.
Hayes Lane, Susan; Serafica, Reimund; Huffman, Carolyn; Cuddy, Alyssa
2016-01-01
In the current healthcare environment, nurses must have a basic understanding of research to lead change and implement evidence-based practice. The purpose of this study was to evaluate the effectiveness of an educational intervention, formulated on the framework of the Great American Cookie Experiment, measuring nurses' research knowledge, attitudes, and practice using mobile-device gaming. This multisite quantitative study provides insight into the promotion of research and offers information about best practices in innovative teaching strategies for nurses.
Parametric evaluation of the cost effectiveness of Shuttle payload vibroacoustic test plans
NASA Technical Reports Server (NTRS)
Stahle, C. V.; Gongloff, H. R.; Keegan, W. B.; Young, J. P.
1978-01-01
Consideration is given to alternate vibroacoustic test plans for sortie and free flyer Shuttle payloads. Statistical decision models for nine test plans provide a viable method of evaluating the cost effectiveness of alternate vibroacoustic test plans and the associated test levels. The methodology is a major step toward the development of a useful tool for the quantitative tailoring of vibroacoustic test programs to sortie and free flyer payloads. A broader application of the methodology is now possible by the use of the OCTAVE computer code.
Code of Federal Regulations, 2013 CFR
2013-01-01
..., a “measurement” shall be defined as an evaluation where quantitative data is obtained (e.g... evaluation where no quantitative data is obtained (e.g., a check mark indicating conformance is usually the...
Code of Federal Regulations, 2014 CFR
2014-01-01
..., a “measurement” shall be defined as an evaluation where quantitative data is obtained (e.g... evaluation where no quantitative data is obtained (e.g., a check mark indicating conformance is usually the...
Code of Federal Regulations, 2010 CFR
2010-01-01
..., a “measurement” shall be defined as an evaluation where quantitative data is obtained (e.g... evaluation where no quantitative data is obtained (e.g., a check mark indicating conformance is usually the...
Code of Federal Regulations, 2012 CFR
2012-01-01
..., a “measurement” shall be defined as an evaluation where quantitative data is obtained (e.g... evaluation where no quantitative data is obtained (e.g., a check mark indicating conformance is usually the...
Code of Federal Regulations, 2011 CFR
2011-01-01
..., a “measurement” shall be defined as an evaluation where quantitative data is obtained (e.g... evaluation where no quantitative data is obtained (e.g., a check mark indicating conformance is usually the...
Impact of immersion oils and mounting media on the confocal imaging of dendritic spines
Peterson, Brittni M.; Mermelstein, Paul G.; Meisel, Robert L.
2015-01-01
Background Structural plasticity, such as changes in dendritic spine morphology and density, reflect changes in synaptic connectivity and circuitry. Procedural variables used in different methods for labeling dendritic spines have been quantitatively evaluated for their impact on the ability to resolve individual spines in confocal microscopic analyses. In contrast, there have been discussions, though no quantitative analyses, of the potential effects of choosing specific mounting media and immersion oils on dendritic spine resolution. New Method Here we provide quantitative data measuring the impact of these variables on resolving dendritic spines in 3D confocal analyses. Medium spiny neurons from the rat striatum and nucleus accumbens are used as examples. Results Both choice of mounting media and immersion oil affected the visualization of dendritic spines, with choosing the appropriate immersion oil as being more imperative. These biologic data are supported by quantitative measures of the 3D diffraction pattern (i.e. point spread function) of a point source of light under the same mounting medium and immersion oil combinations. Comparison with Existing Method Although not a new method, this manuscript provides quantitative data demonstrating that different mounting media and immersion oils can impact the ability to resolve dendritic spines. These findings highlight the importance of reporting which mounting medium and immersion oil are used in preparations for confocal analyses, especially when comparing published results from different laboratories. Conclusion Collectively, these data suggest that choosing the appropriate immersion oil and mounting media is critical for obtaining the best resolution, and consequently more accurate measures of dendritic spine densities. PMID:25601477
NASA Astrophysics Data System (ADS)
Sun, Chi-Kuang; Wei, Ming-Liang; Su, Yu-Hsiang; Weng, Wei-Hung; Liao, Yi-Hua
2017-02-01
Harmonic generation microscopy is a noninvasive, repeatable imaging technique that provides real-time 3D microscopic images of human skin with sub-femtoliter resolution and penetration down to the reticular dermis. In this talk, we show that, through a strong resonance effect, the third-harmonic-generation (THG) modality provides enhanced contrast on melanin and allows not only differential diagnosis of various pigmented skin lesions but also quantitative imaging for long-term tracking. This unique capability makes THG microscopy the only label-free technique capable of identifying the active melanocytes in human skin and of imaging their different dendriticity patterns. We will review our recent efforts to image melanin distribution in vivo and to quantitatively diagnose pigmented skin lesions using label-free harmonic generation biopsy. The talk will first cover the spectroscopic study of the melanin-enhanced THG effect in human cells and the calibration strategy inside human skin for quantitative imaging. We will then review our recent clinical trials, including a study of differential diagnosis capability for pigmented skin tumors and a quantitative virtual-biopsy study of pre- and post-treatment evaluation of melasma and solar lentigo. Our study indicates the unmatched capability of harmonic generation microscopy to perform virtual biopsy for noninvasive histopathological diagnosis of various pigmented skin tumors, as well as its unsurpassed capability to noninvasively reveal the pathological origin of different hyperpigmentary diseases of the human face and to monitor the efficacy of laser depigmentation treatments. This work is sponsored by the National Health Research Institutes.
NASA Astrophysics Data System (ADS)
Bindschadler, Michael; Modgil, Dimple; Branch, Kelley R.; La Riviere, Patrick J.; Alessio, Adam M.
2014-04-01
Myocardial blood flow (MBF) can be estimated from dynamic contrast enhanced (DCE) cardiac CT acquisitions, leading to quantitative assessment of regional perfusion. The need for low radiation dose and the lack of consensus on MBF estimation methods motivate this study to refine the selection of acquisition protocols and models for CT-derived MBF. DCE cardiac CT acquisitions were simulated for a range of flow states (MBF = 0.5, 1, 2, 3 ml min^-1 g^-1; cardiac output = 3, 5, 8 L min^-1). Patient kinetics were generated by a mathematical model of iodine exchange incorporating numerous physiological features including heterogeneous microvascular flow, permeability and capillary contrast gradients. CT acquisitions were simulated for multiple realizations of realistic x-ray flux levels. CT acquisitions that reduce radiation exposure were implemented by varying both temporal sampling (1, 2, and 3 s sampling intervals) and tube currents (140, 70, and 25 mAs). For all acquisitions, we compared three quantitative MBF estimation methods (a two-compartment model, an axially distributed model, and the adiabatic approximation to the tissue homogeneous model) and a qualitative slope-based method. In total, over 11 000 time attenuation curves were used to evaluate MBF estimation in multiple patient and imaging scenarios. After iodine-based beam hardening correction, the slope method consistently underestimated flow by on average 47.5%, while the quantitative models provided estimates with less than 6.5% average bias and increasing variance with increasing dose reductions. The three quantitative models performed equally well, offering estimates with essentially identical root mean squared error (RMSE) for matched acquisitions. MBF estimates using the qualitative slope method were inferior in terms of bias and RMSE compared to the quantitative methods. MBF estimate error was equal at matched dose reductions for all quantitative methods across the range of techniques evaluated. This suggests that there is no particular advantage among the quantitative estimation methods, nor in performing dose reduction via tube current reduction rather than via temporal sampling reduction. These data are important for optimizing implementation of cardiac dynamic CT in clinical practice and in prospective CT MBF trials.
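The figures of merit reported above are straightforward to reproduce. A minimal Python sketch of percent bias and RMSE for a set of MBF estimates against a simulated ground truth (the function and all values below are illustrative, not the study's code or data):

```python
import numpy as np

def bias_and_rmse(estimates, truth):
    """Percent bias and RMSE of flow estimates against a known ground truth.

    estimates : array of MBF estimates, ml min^-1 g^-1
    truth     : scalar true MBF used in the simulation
    """
    estimates = np.asarray(estimates, dtype=float)
    errors = estimates - truth
    percent_bias = 100.0 * errors.mean() / truth
    rmse = np.sqrt(np.mean(errors ** 2))
    return percent_bias, rmse

# Hypothetical estimates for a simulated true flow of 2.0 ml min^-1 g^-1
slope_estimates = [1.0, 1.1, 1.05, 0.95]   # qualitative slope-based method
model_estimates = [1.95, 2.1, 1.9, 2.05]   # e.g., a two-compartment model
print(bias_and_rmse(slope_estimates, 2.0))  # large negative bias
print(bias_and_rmse(model_estimates, 2.0))  # small bias
```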
Lassere, Marissa N; Johnson, Kent R; Boers, Maarten; Tugwell, Peter; Brooks, Peter; Simon, Lee; Strand, Vibeke; Conaghan, Philip G; Ostergaard, Mikkel; Maksymowych, Walter P; Landewe, Robert; Bresnihan, Barry; Tak, Paul-Peter; Wakefield, Richard; Mease, Philip; Bingham, Clifton O; Hughes, Michael; Altman, Doug; Buyse, Marc; Galbraith, Sally; Wells, George
2007-03-01
There are clear advantages to using biomarkers and surrogate endpoints, but concerns about clinical and statistical validity, and the absence of systematic methods to evaluate these aspects, hinder their efficient application. Our objective was to review the literature on biomarkers and surrogates to develop a hierarchical schema that systematically evaluates and ranks the surrogacy status of biomarkers and surrogates, and to obtain feedback from stakeholders. After a systematic search of Medline and Embase on biomarkers, surrogate (outcomes, endpoints, markers, indicators), intermediate endpoints, and leading indicators, a quantitative surrogate validation schema was developed and subsequently evaluated at a stakeholder workshop. The search identified several classification schemata and definitions. Components of these were incorporated into a new quantitative surrogate validation level-of-evidence schema that evaluates biomarkers along 4 domains: Target, Study Design, Statistical Strength, and Penalties. Scores derived from 3 domains (the Target that the marker is being substituted for, the Design of the best evidence, and the Statistical strength) are additive. Penalties are then applied if there is serious counterevidence. A total score (0 to 15) determines the level of evidence, with Level 1 the strongest and Level 5 the weakest. It was proposed that the term "surrogate" be restricted to markers attaining Levels 1 or 2 only. Most stakeholders agreed that this operationalization of the National Institutes of Health definitions of biomarker, surrogate endpoint, and clinical endpoint was useful. Further development and application of this schema provides incentives and guidance for effective biomarker and surrogate endpoint research, and more efficient drug discovery, development, and approval.
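To make the additive scoring concrete, here is a minimal sketch of the level-of-evidence lookup. The abstract gives the 0-15 total and the five levels but not the band cutoffs, so the 3-point bands below are an assumption for illustration only:

```python
def surrogacy_level(target, design, statistical, penalties=0):
    """Total score (0-15) -> level of evidence (1 strongest .. 5 weakest).

    Domain scores are additive and penalties are subtracted, per the schema;
    the 3-point bands used here are illustrative assumptions, since the
    abstract does not state the exact cutoffs.
    """
    total = min(max(0, target + design + statistical - penalties), 15)
    # Assumed bands: 13-15 -> 1, 10-12 -> 2, 7-9 -> 3, 4-6 -> 4, 0-3 -> 5
    for level, cutoff in enumerate((13, 10, 7, 4), start=1):
        if total >= cutoff:
            return level
    return 5

level = surrogacy_level(target=5, design=5, statistical=4, penalties=1)
print(level, "-> may be termed a 'surrogate'" if level <= 2 else "-> biomarker only")
```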
Wilson, Annabelle M; Magarey, Anthea M; Dollman, James; Jones, Michelle; Mastersson, Nadia
2010-08-01
To describe the rationale, development and implementation of the quantitative component of evaluation of a multi-setting, multi-strategy, community-based childhood obesity prevention project (the eat well be active (ewba) Community Programs), together with the challenges associated with this process and some potential solutions. ewba has a quasi-experimental design with intervention and comparison communities. Baseline data were collected in 2006 and post-intervention measures will be taken from a non-matched cohort in 2009. Schoolchildren aged 10-12 years were chosen as one litmus group for evaluation purposes. Thirty-nine primary schools in two metropolitan and two rural communities in South Australia. A total of 1732 10-12-year-old school students completed a nutrition and/or a physical activity questionnaire and 1637 had anthropometric measures taken; 983 parents, 286 teachers, thirty-six principals, twenty-six canteen workers and thirteen out-of-school-hours care (OSHC) workers completed Program-specific questionnaires developed for each of these target groups. The overall child response rate for the study was 49%. Sixty-five per cent of parents, 43% of teachers, 90% of principals, 90% of canteen workers and 68% of OSHC workers completed and returned questionnaires. A number of practical, logistical and methodological challenges were encountered during this data collection. Lessons from the process of quantitative baseline data collection for the ewba Community Programs can provide insights for other researchers planning similar studies with similar methods, particularly those evaluating multi-strategy programmes across multiple settings.
Oostendorp, Corien; Meyer, Sarah; Sobrio, Monia; van Arendonk, Joyce; Reichmann, Ernst; Daamen, Willeke F; van Kuppevelt, Toin H
2017-05-01
Treatment of full-thickness skin defects with split-thickness skin grafts is generally associated with contraction and scar formation, and cellular skin substitutes have been developed to improve skin regeneration. The evaluation of cultured skin substitutes is generally based on qualitative parameters focusing on histology. In this study we focused on quantitative evaluation to provide a template for comparison of human bio-engineered skin substitutes between clinical and/or research centers, and to supplement histological data. We focused on extracellular matrix proteins, since these components play an important role in skin regeneration. As a model we analyzed the human dermal substitute denovoDerm and the dermo-epidermal skin substitute denovoSkin. The quantification of the extracellular matrix proteins type III collagen and laminin 5 in tissue homogenates using western blotting and ELISA was not successful. The same was true for assaying lysyl oxidase, an enzyme involved in crosslinking of matrix molecules. As an alternative, gene expression levels were measured using qPCR. Various RNA isolation procedures were tested. The gene expression profiles of specific dermal and epidermal genes could be measured reliably and reproducibly. Differences caused by changes in the cell culture conditions could easily be detected. The number of cells in the skin substitutes was measured using the PicoGreen dsDNA assay, which was found to be highly quantitative and reproducible. The (dis)advantages of the assays used for quantitative evaluation of skin substitutes are discussed. Copyright © 2016 Elsevier Ltd and ISBI. All rights reserved.
NASA Astrophysics Data System (ADS)
Prabhat, Prashant; Peet, Michael; Erdogan, Turan
2016-03-01
In order to design a fluorescence experiment, typically the spectra of a fluorophore and of a filter set are overlaid on a single graph and the spectral overlap is evaluated intuitively. However, in a typical fluorescence imaging system the fluorophores and optical filters are not the only wavelength-dependent variables; even the excitation light sources have been changing. For example, LED Light Engines may have a significantly different spectral response compared to the traditional metal-halide lamps. Therefore, for a more accurate assessment of fluorophore-to-filter-set compatibility, all sources of spectral variation should be taken into account simultaneously. Additionally, intuitive or qualitative evaluation of many spectra does not necessarily provide a realistic assessment of system performance. "SearchLight" is a freely available web-based spectral plotting and analysis tool that can be used to address the need for accurate, quantitative spectral evaluation of fluorescence measurement systems. This tool is available at: http://searchlight.semrock.com/. Based on a detailed mathematical framework [1], SearchLight calculates signal, noise, and signal-to-noise ratio for multiple combinations of fluorophores, filter sets, light sources and detectors. SearchLight allows for qualitative and quantitative evaluation of the compatibility of filter sets with fluorophores, analysis of bleed-through, identification of optimized spectral edge locations for a set of filters under specific experimental conditions, and guidance regarding labeling protocols in multiplexed imaging assays. Entire SearchLight sessions can be shared with colleagues and collaborators and saved for future reference. [1] Anderson, N., Prabhat, P. and Erdogan, T., Spectral Modeling in Fluorescence Microscopy, http://www.semrock.com (2010).
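The calculation behind such a tool is essentially a set of spectral overlap integrals. A minimal sketch of the signal term, using Gaussian toy spectra and a flat detector response as stand-ins for the measured curves a tool like SearchLight would use:

```python
import numpy as np

# Wavelength grid (nm) and toy normalized spectra; a real analysis would
# substitute measured source, filter, fluorophore and detector curves.
wl = np.linspace(400, 700, 601)
gauss = lambda c, w: np.exp(-0.5 * ((wl - c) / w) ** 2)

source   = gauss(470, 15)           # LED excitation spectrum
ex_filt  = (wl > 455) & (wl < 485)  # excitation bandpass (ideal edges)
em_filt  = (wl > 500) & (wl < 550)  # emission bandpass (ideal edges)
absorb   = gauss(490, 20)           # fluorophore absorption
emission = gauss(520, 25)           # fluorophore emission
detector = 0.8 * np.ones_like(wl)   # flat detector quantum efficiency

# Signal is proportional to the excitation and collection overlap integrals.
excitation = np.trapz(source * ex_filt * absorb, wl)
collection = np.trapz(emission * em_filt * detector, wl)
print(f"relative signal: {excitation * collection:.3g}")
```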
NASA Astrophysics Data System (ADS)
Mansfield, C. D.; Rutt, H. N.
2002-02-01
The possible generation of spurious results due to coincident absorption bands, arising when infrared spectroscopic techniques are applied to the measurement of carbon isotope ratios in breath, has been re-examined. An earlier investigation, which approached the problem qualitatively, provided an unambiguous assurance that 13C16O2/12C16O2 ratios can be confidently measured for isotopic breath tests using instruments based on infrared absorption. Although this conclusion still stands, subsequent quantitative investigation has revealed an important exception that necessitates strict adherence to the sample collection protocol. The results show that the concentrations and decay rates of the coincident breath trace compounds acetonitrile and carbon monoxide, found in the breath of a heavy smoker, can produce spurious results. Hence, findings from this investigation justify the concern that breath trace compounds present a risk to the accurate measurement of carbon isotope ratios in breath when using broadband, non-dispersive, ground-state absorption infrared spectroscopy. The study provides recommendations on the length of smoking abstention required to avoid spurious results and also reaffirms, through quantitative argument, the validity of using infrared absorption spectroscopy to measure CO2 isotope ratios in breath.
NASA Astrophysics Data System (ADS)
Xie, Yijing; Thom, Maria; Ebner, Michael; Wykes, Victoria; Desjardins, Adrien; Miserocchi, Anna; Ourselin, Sebastien; McEvoy, Andrew W.; Vercauteren, Tom
2017-11-01
In high-grade glioma surgery, tumor resection is often guided by intraoperative fluorescence imaging. 5-aminolevulinic acid-induced protoporphyrin IX (PpIX) provides fluorescent contrast between normal brain tissue and glioma tissue, thus achieving improved tumor delineation and prolonged patient survival compared with conventional white-light-guided resection. However, commercially available fluorescence imaging systems rely solely on visual assessment of fluorescence patterns by the surgeon, which makes the resection more subjective than necessary. We developed a wide-field spectrally resolved fluorescence imaging system utilizing a Generation II scientific CMOS camera and an improved computational model for the precise reconstruction of the PpIX concentration map. In our model, the tissue's optical properties and illumination geometry, which distort the fluorescent emission spectra, are considered. We demonstrate that the CMOS-based system can detect low PpIX concentration at short camera exposure times, while providing high-pixel resolution wide-field images. We show that total variation regularization improves the contrast-to-noise ratio of the reconstructed quantitative concentration map by approximately twofold. Quantitative comparison between the estimated PpIX concentration and tumor histopathology was also investigated to further evaluate the system.
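The twofold improvement quoted above refers to the contrast-to-noise ratio (CNR) of the reconstructed concentration map, a metric that is simple to compute once target and background regions are defined. A minimal sketch on a synthetic map (the region coordinates and intensities are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(0)
conc = rng.normal(0.0, 1.0, (128, 128))   # noisy background, arbitrary units
conc[60:68, 60:68] += 5.0                 # synthetic high-PpIX "tumor" region

roi = np.zeros_like(conc, dtype=bool); roi[60:68, 60:68] = True
bg  = np.zeros_like(conc, dtype=bool); bg[:20, :20] = True

def cnr(image, roi_mask, bg_mask):
    """Contrast-to-noise ratio of a target region against background."""
    return (image[roi_mask].mean() - image[bg_mask].mean()) / image[bg_mask].std()

print(f"CNR = {cnr(conc, roi, bg):.1f}")
```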
Cells and Stripes: A novel quantitative photo-manipulation technique
Mistrik, Martin; Vesela, Eva; Furst, Tomas; Hanzlikova, Hana; Frydrych, Ivo; Gursky, Jan; Majera, Dusana; Bartek, Jiri
2016-01-01
Laser micro-irradiation is a technology widely used in the DNA damage response, checkpoint signaling, chromatin remodeling and related research fields, to assess chromatin modifications and recruitment of diverse DNA damage sensors, mediators and repair proteins to sites of DNA lesions. While this approach has aided numerous discoveries related to cell biology, maintenance of genome integrity, aging and cancer, it has so far been limited by a tedious manual definition of laser-irradiated subcellular regions, with the ensuing restriction to only a small number of cells treated and analyzed in a single experiment. Here, we present an improved and versatile alternative to the micro-irradiation approach: Quantitative analysis of photo-manipulated samples using innovative settings of standard laser-scanning microscopes. Up to 200 cells are simultaneously exposed to a laser beam in a defined pattern of collinear rays. The induced striation pattern is then automatically evaluated by a simple algorithm, which provides a quantitative assessment of various laser-induced phenotypes in live or fixed cells. Overall, this new approach represents a more robust alternative to existing techniques, and provides a versatile tool for a wide range of applications in biomedicine. PMID:26777522
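The automated readout amounts to comparing signal inside the irradiated stripes with the inter-stripe background. A minimal sketch assuming horizontal stripes of known period and width; the geometry, image, and function are illustrative stand-ins, not the authors' published algorithm:

```python
import numpy as np

def stripe_ratio(image, period=20, width=5):
    """Mean intensity inside vs. outside horizontal stripes of known geometry.

    Assumes the collinear laser rays produce stripes every `period` rows,
    each `width` rows wide, starting at row 0 (illustrative values).
    """
    rows = np.arange(image.shape[0])
    in_stripe = (rows % period) < width
    return image[in_stripe].mean() / image[~in_stripe].mean()

# Synthetic example: repair-factor signal enriched in the irradiated stripes
rng = np.random.default_rng(1)
img = rng.poisson(10, (200, 200)).astype(float)
img[(np.arange(200) % 20) < 5, :] += 25
print(f"stripe/background ratio: {stripe_ratio(img):.2f}")
```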
I PREPARE: development and clinical utility of an environmental exposure history mnemonic.
Paranzino, Grace K; Butterfield, Patricia; Nastoff, Teresa; Ranger, Cherryll
2005-01-01
The I PREPARE environmental exposure history mnemonic is a quick reference tool created for primary care providers. Health care providers (N = 159) were asked to evaluate a prototype mnemonic, to suggest new health history questions, and to propose the deletion of less relevant questions. The goal of this evaluation was to create a practical and clinically relevant mnemonic, rather than to obtain quantitative estimates of validity. The final I PREPARE mnemonic cues the provider to "Investigate potential exposures;" ask questions related to "Present work," "Residence," "Environmental concerns," "Past work," and "Activities;" provide "Referrals and resources;" and "Educate" the patient by reviewing a checklist of strategies to prevent or minimize exposures. The sequence of I PREPARE makes intuitive sense by cueing the provider to ask specific questions and provide educational materials to the patient. National improvements in the quality of environmental exposure history data are predicated in part on the creation of simple and convenient tools for use in clinical practice.
Online versus Paper Evaluations: Differences in Both Quantitative and Qualitative Data
ERIC Educational Resources Information Center
Burton, William B.; Civitano, Adele; Steiner-Grossman, Penny
2012-01-01
This study sought to determine if differences exist in the quantitative and qualitative data collected with paper and online versions of a medical school clerkship evaluation form. Data from six-and-a-half years of clerkship evaluations were used, some collected before and some after the conversion from a paper to an online evaluation system. The…
Research on the use of data fusion technology to evaluate the state of electromechanical equipment
NASA Astrophysics Data System (ADS)
Lin, Lin
2018-04-01
Aiming at the problems of heterogeneous test-information modes and the coexistence of quantitative and qualitative information in the state evaluation of electromechanical equipment, this paper proposes the use of data fusion technology to evaluate the state of electromechanical equipment. The paper introduces the state evaluation process for electromechanical equipment in detail, uses Dempster-Shafer (D-S) evidence theory to fuse the decision-making layers of the state evaluation, and carries out simulation tests. The simulation results show that applying data fusion technology to the state evaluation of electromechanical equipment is feasible and effective. After the decision-making information provided by the different evaluation methods is fused and the useful information extracted, the fuzziness of the judgment can be reduced and the credibility of the state evaluation improved.
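For readers unfamiliar with decision-level D-S fusion, the core operation is Dempster's rule of combination. A minimal sketch with two hypothetical evaluation methods assigning belief mass over equipment states (the mass values are invented for illustration):

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions over the same
    frame of discernment. Masses map frozenset hypotheses to belief mass;
    conflicting mass is removed and the remainder renormalized."""
    combined, conflict = {}, 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {h: w / (1.0 - conflict) for h, w in combined.items()}

# Two evaluation methods assessing equipment state {normal, degraded, faulty}
N, D, F = frozenset("N"), frozenset("D"), frozenset("F")
vibration = {N: 0.6, D: 0.3, F: 0.1}
thermal   = {N: 0.5, D: 0.4, F: 0.1}
print(dempster_combine(vibration, thermal))  # fused, less ambiguous masses
```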
Evaluation of osteoarthritis progression using polarization-sensitive optical coherence tomography
NASA Astrophysics Data System (ADS)
Nassif, Nader A.; Pierce, Mark C.; Park, B. Hyle; Cense, Barry; de Boer, Johannes F.
2004-07-01
Osteoarthritis is a prevalent medical condition that presents a diagnostic and therapeutic challenge to physicians today because of the inability to assess the integrity of the articular cartilage early in the disease. Polarization-sensitive optical coherence tomography (PS-OCT) is a high-resolution, non-contact imaging modality that provides cross-sectional images with additional information regarding the integrity of the collagen matrix. Imaging with PS-OCT provides information on the thickness of the articular cartilage and an index of biochemical changes based on alterations in the optical properties (i.e., birefringence) of the tissue. We demonstrate initial experiments performed on specimens collected following total knee replacement surgery. Articular cartilage was imaged using a 1310 nm PS-OCT system in which both intensity and phase images were acquired. PS-OCT images were compared with histology, and the changes in tissue optical properties were characterized. Analysis of the intensity images demonstrates differences in surface appearance and thickness between healthy and diseased cartilage. Phase maps of the tissue demonstrated distinct differences between healthy and diseased tissue. PS-OCT was able to image a gradual loss of birefringence as the tissue became more diseased. In this way, determining the rate of change of the phase provides a quantitative measure of pathology. Thus, imaging and evaluation of osteoarthritis using PS-OCT can be a useful means of quantitative assessment of the disease.
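Because birefringence appears as the depth rate of change of phase retardation, a simple quantitative proxy is the least-squares slope of retardation versus depth. A minimal sketch on synthetic retardation profiles (the slopes and noise level are illustrative, not measured values):

```python
import numpy as np

def retardation_slope(depth_um, phase_deg):
    """Least-squares slope of phase retardation vs. depth (deg/um); a proxy
    for birefringence, which falls as the collagen matrix degrades."""
    slope, _ = np.polyfit(depth_um, phase_deg, 1)
    return slope

depth = np.arange(0, 500, 10.0)  # depth into cartilage, um
healthy  = 0.30 * depth + np.random.normal(0, 5, depth.size)   # steep slope
diseased = 0.05 * depth + np.random.normal(0, 5, depth.size)   # flattened
print(retardation_slope(depth, healthy), retardation_slope(depth, diseased))
```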
Tin-tungsten mineralizing processes in tungsten vein deposits: Panasqueira, Portugal
NASA Astrophysics Data System (ADS)
Lecumberri-Sanchez, P.; Pinto, F.; Vieira, R.; Wälle, M.; Heinrich, C. A.
2015-12-01
Tungsten has high heat resistance, density and hardness, which make it widely used in industry (e.g., steels and tungsten carbides). Tungsten deposits are typically magmatic-hydrothermal systems. Despite the economic significance of tungsten, there are no modern quantitative analytical studies of the fluids responsible for the formation of its highest-grade deposit type, tungsten vein deposits. Panasqueira (Portugal) is a tungsten vein deposit, one of the leading tungsten producers in Europe and one of the best geologically characterized tungsten vein deposits. In this study, the compositions of the mineralizing fluids at Panasqueira have been determined through a combination of detailed petrography, microthermometric measurements and LA-ICPMS analyses, and geochemical modeling has been used to determine the processes that lead to tungsten mineralization. We characterized the fluids related to the various mineralizing stages in the system: the oxide stage (tin and tungsten mineralization), the sulfide stage (chalcopyrite and sphalerite mineralization) and the carbonate stage. Our results therefore provide information on the properties of fluids related to specific paragenetic stages. Furthermore, we used those fluid compositions in combination with host rock mineralogy and chemistry to evaluate the controlling factors in the mineralizing process. This study provides the first quantitative analytical data on fluid composition for tungsten vein deposits and evaluates the controlling mineralization processes, helping to determine the mechanisms of formation of the Panasqueira tin-tungsten deposit and providing additional geochemical constraints on the local distribution of mineralization.
Jian, Jianbo; Zhang, Wenxue; Yang, Hao; Zhao, Xinyan; Xuan, Ruijiao; Li, Dongyue; Hu, Chunhong
2017-01-01
Capillarization of sinusoids and change of trabecular thickness are the main histologic features in hepatocellular carcinoma (HCC). Of particular interest are the three-dimensional (3D) visualization and quantitative evaluation of such alterations in HCC progression. X-ray phase-contrast computed tomography (PCCT) is an emerging imaging method that provides excellent image contrast for soft tissues. This study aimed to explore the potential of in-line PCCT in microstructure imaging of capillarized sinusoids and trabecular structure in human HCC tissues and to quantitatively evaluate the alterations of those fine structures during the development of HCC. This project was designed as an ex vivo experimental study. The study was approved by the institutional review board, and informed consent was obtained from the patients. Eight human resected HCC tissue samples were imaged using in-line PCCT. After histologic processing, PCCT images and histopathologic data were matched. Fine structures in HCC tissues were revealed. Quantitative analyses of capillarized sinusoids (i.e., percentage of sinusoidal area [PSA] and sinusoidal volume) and trabecular structure (i.e., trabecular thickness and surface-area-to-volume ratio [SA/V]) in low-grade (well or moderately differentiated) and high-grade (poorly differentiated) HCC groups were performed. Using PCCT, the alterations of capillarized sinusoids and trabecular structure were clearly observed in 3D geometry, which was confirmed by the corresponding histologic sections. The 3D quantitative analyses of sinusoids in the high-grade HCC group were significantly different (P < 0.05) in PSA (7.8 ± 2.5%) and sinusoidal volume (2.9 ± 0.6 × 10^7 µm^3) from those in the low-grade HCC group (PSA, 12.9 ± 2.2%; sinusoidal volume, 2.4 ± 0.3 × 10^7 µm^3). Moreover, the 3D quantitative evaluation of the trabecular structure in the high-grade HCC group showed a significant change (P < 0.05) in trabecular thickness (87.8 ± 15.6 µm) and SA/V (2.2 ± 1.3 × 10^3 µm^-1) compared to the low-grade HCC group (trabecular thickness, 75.9 ± 7.1 µm; SA/V, 7.5 ± 1.3 × 10^3 µm^-1). This study provides insights into the 3D alterations of microstructures such as capillarized sinusoids and the trabecular structure at the micrometer level, which might allow for an improved understanding of the development of HCC. Copyright © 2017 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
Addison, Prue F E; Flander, Louisa B; Cook, Carly N
2017-08-01
Protected area management effectiveness (PAME) evaluation is increasingly undertaken to evaluate governance, assess conservation outcomes and inform evidence-based management of protected areas (PAs). Within PAME, quantitative approaches to assess biodiversity outcomes are now emerging, where biological monitoring data are directly assessed against quantitative (numerically defined) condition categories (termed quantitative condition assessments). More commonly, however, qualitative condition assessments are employed in PAME; these use descriptive condition categories and are evaluated largely with expert judgement, which can be subject to a range of biases, such as linguistic uncertainty and overconfidence. Despite the benefits of increased transparency and repeatability of evaluations, quantitative condition assessments are rarely used in PAME. To understand why, we interviewed practitioners from all Australian marine protected area (MPA) networks, which have access to long-term biological monitoring data and are developing or conducting PAME evaluations. Our research revealed that there is a desire within management agencies to implement quantitative condition assessment of biodiversity outcomes in Australian MPAs. However, practitioners report many challenges in transitioning from qualitative to quantitative condition assessments of biodiversity outcomes, which are hampering progress. Challenges include a lack of agency capacity (staff numbers and money), knowledge gaps, and diminishing public and political support for PAs. We point to opportunities for targeted strategies that will assist agencies in overcoming these challenges, including new decision support tools, approaches to better finance conservation efforts, and promotion of more management-relevant science. While a single solution is unlikely to achieve full evidence-based conservation, we suggest ways for agencies to target strategies and advance PAME evaluations toward best practice. Copyright © 2017 Elsevier Ltd. All rights reserved.
Evaluation of the Earth System CoG Infrastructure in Supporting a Model Intercomparison Project
NASA Astrophysics Data System (ADS)
Wallis, J. C.; Rood, R. B.; Murphy, S.; Cinquini, L.; DeLuca, C.
2013-12-01
Earth System CoG is a web-based collaboration environment that combines data services with metadata and project management services. The environment is particularly suited to support software development and model intercomparison projects. CoG was recently used to support the National Climate Predictions and Projections Platform (NCPP) Quantitative Evaluation of Downscaling (QED-2013) workshop. QED-2013 was a workshop with a community approach for the objective, quantitative evaluation of techniques to downscale climate model predictions and projections. This paper will present a brief introduction to CoG, QED-2013, and findings from an ethnographic evaluation of how CoG supported QED-2013. The QED-2013 workshop focused on real-world application problems drawn from several sectors, and contributed to the informed use of downscaled data. This workshop is a part of a larger effort by NCPP and partner organizations to develop a standardized evaluation framework for local and regional climate information. The main goals of QED-2013 were to a) coordinate efforts for quantitative evaluation, b) develop software infrastructure, c) develop a repository of information, d) develop translational and guidance information, e) identify and engage key user communities, and f) promote collaboration and interoperability. CoG was a key player in QED-2013 support. NCPP was an early adopter of the CoG platform, providing valuable recommendations for overall development plus specific workshop-related requirements. New CoG features developed for QED-2013 included the ability to publish images and associated metadata contained within XML files to the associated data node and to combine both artifacts into an integrated display. The ability to organize data search facets into scientifically relevant groups and to display dynamic lists of workshop participants and their interests was also added to the interface. During the workshop, the QED-2013 project page on CoG provided meeting logistics, meeting materials, shared spaces and resources, and data services. The evaluation of CoG tools was focused on the usability of products rather than metrics, such as number of independent hits to a web site. We wanted to know how well CoG tools supported the workshop participants and their tasks. For instance, what workshop tasks could be performed within the CoG environment? Were these tasks performed there or with alternative tools? And do participants plan to use the tools after the workshop for other projects? Ultimately, we wanted to know if CoG contributed to NCPP's need for a flexible and extensible evaluation platform, and whether it supported the integration of dispersed resources, quantitative evaluation of climate projections, and the generation and management of interpretive information. Evaluation of the workshop and activity occurred during, at the end of, and after the workshop. During the workshop, an ethnographer observed and participated in the workshop, and collected short, semi-structured interviews with a subset of the participants. At the end of the workshop, an exit survey was administered to all the participants. After the workshop, a variety of methods were used to capture the impact of the workshop.
Phillips, Brett T; Fourman, Mitchell S; Rivara, Andrew; Dagum, Alexander B; Huston, Tara L; Ganz, Jason C; Bui, Duc T; Khan, Sami U
2014-01-01
Several devices exist today to assist the intraoperative determination of skin flap perfusion. Laser-Assisted Indocyanine Green Dye Angiography (LAICGA) has been shown to accurately predict mastectomy skin flap necrosis using quantitative perfusion values. The laser properties of the latest LAICGA device (SPY Elite) differ significantly from those of its predecessor system (SPY 2001), preventing direct translation of previously published data. The purpose of this study was to establish a mathematical relationship between the perfusion values of these 2 devices. Breast reconstruction patients were prospectively enrolled into a clinical trial in which skin flap evaluation and excision were based on quantitative SPY Q values previously established in the literature. Initial study patients underwent mastectomy skin flap evaluation using both SPY systems simultaneously. Absolute perfusion unit (APU) values at identical locations on the breast were then compared graphically. A total of 210 data points were identified on the same patients (n = 4) using both SPY systems. A linear relationship (y = 2.9883x + 12.726) was identified with a high level of correlation (R^2 = 0.744). Previously published values using the SPY 2001 (APU 3.7) corresponded to a value of 23.8 APU on the SPY Elite. In addition, postoperative necrosis in these patients correlated with regions of skin identified by the SPY Elite with APU less than 23.8. Intraoperative comparison of LAICGA systems has provided a direct correlation of perfusion values predictive of necrosis that were previously established in the literature. An APU value of 3.7 from the SPY 2001 correlates to a SPY Elite APU value of 23.8.
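The published fit makes the cross-device conversion trivial to apply; a minimal sketch using the reported coefficients (the function name is ours):

```python
def spy2001_to_elite(apu_2001):
    """Map SPY 2001 absolute perfusion units onto the SPY Elite scale using
    the linear fit reported in the study (y = 2.9883x + 12.726, R^2 = 0.744)."""
    return 2.9883 * apu_2001 + 12.726

print(round(spy2001_to_elite(3.7), 1))  # 23.8 -> the published necrosis cutoff
```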
Prediction of trabecular bone qualitative properties using scanning quantitative ultrasound
Qin, Yi-Xian; Lin, Wei; Mittra, Erik; Xia, Yi; Cheng, Jiqi; Judex, Stefan; Rubin, Clint; Müller, Ralph
2012-01-01
Microgravity-induced bone loss represents a critical health problem for astronauts; it occurs particularly in the weight-supporting skeleton and leads to osteopenia and increased fracture risk. The lack of a suitable evaluation modality makes it difficult to monitor skeletal status in long-term space missions and increases the potential risk of complications. Such disuse osteopenia and osteoporosis compromise trabecular bone density as well as architectural and mechanical properties. While X-ray based imaging would not be practical in space, quantitative ultrasound may provide advantages for characterizing bone density and strength through wave propagation in complex trabecular structure. This study used a scanning confocal acoustic diagnostic and navigation system (SCAN) to evaluate trabecular bone quality in 60 cubic trabecular samples harvested from adult sheep. Ultrasound-image-based SCAN measurements of structural and strength properties were validated by μCT and compressive mechanical testing. The results indicated a moderately strong negative correlation between broadband ultrasonic attenuation (BUA) and μCT-determined bone volume fraction (BV/TV, R2=0.53). Strong correlations were observed between ultrasound velocity (UV) and bone's mechanical strength and structural parameters, i.e., bulk Young's modulus (R2=0.67) and BV/TV (R2=0.85). The predictions for bone density and mechanical strength were significantly improved by using a linear combination of both BUA and UV, yielding R2=0.92 for BV/TV and R2=0.71 for bulk Young's modulus. These results imply that quantitative ultrasound can characterize trabecular structural and mechanical properties through measurements of particular ultrasound parameters, and can potentially provide an excellent estimation of bone's structural integrity. PMID:23976803
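The combined BUA+UV prediction is an ordinary two-predictor linear regression. A minimal sketch of the fitting step on synthetic data (the coefficients and noise below are invented; only the model form follows the study):

```python
import numpy as np

# Synthetic stand-ins for the measured data: BUA, UV and uCT-derived BV/TV.
rng = np.random.default_rng(2)
bua = rng.uniform(5, 40, 60)                    # dB/MHz
uv = rng.uniform(1500, 2200, 60)                # m/s
bvtv = 0.002 * uv - 0.004 * bua - 2.5 + rng.normal(0, 0.02, 60)

# Least-squares fit of BV/TV on a linear combination of BUA and UV.
X = np.column_stack([bua, uv, np.ones_like(bua)])
coef, *_ = np.linalg.lstsq(X, bvtv, rcond=None)
pred = X @ coef
r2 = 1 - np.sum((bvtv - pred) ** 2) / np.sum((bvtv - bvtv.mean()) ** 2)
print(f"R^2 of combined BUA+UV model: {r2:.2f}")
```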
Indicators of Family Care for Development for Use in Multicountry Surveys
Kariger, Patricia; Engle, Patrice; Britto, Pia M. Rebello; Sywulka, Sara M.; Menon, Purnima
2012-01-01
Indicators of family care for development are essential for ascertaining whether families are providing their children with an environment that leads to positive developmental outcomes. This project aimed to develop indicators from a set of items, measuring family care practices and resources important for caregiving, for use in epidemiologic surveys in developing countries. A mixed-method (quantitative and qualitative) design was used for item selection and evaluation. Qualitative and quantitative analyses were conducted to examine the validity of candidate items in several country samples. Qualitative methods included the use of global expert panels to identify and evaluate the performance of each candidate item as well as in-country focus groups to test the content validity of the items. The quantitative methods included analyses of item-response distributions using bivariate techniques. The selected items measured two family care practices (support for a learning/stimulating environment and limit-setting techniques) and caregiving resources (adequacy of the alternate caregiver when the mother worked). Six play-activity items, indicative of support for a learning/stimulating environment, were included in the core module of UNICEF's Multiple Indicator Cluster Survey 3. The other items were included in optional modules. This project provided, for the first time, a globally relevant set of items for assessing family care practices and resources in epidemiological surveys. These items have multiple uses, including national monitoring and cross-country comparisons of the status of family care for development. The information obtained will reinforce efforts to improve support for children's development. PMID:23304914
Lin, Yuting; Nouizi, Farouk; Kwong, Tiffany C.; Gulsen, Gultekin
2016-01-01
Conventional fluorescence tomography (FT) can recover the distribution of fluorescent agents within a highly scattering medium. However, poor spatial resolution remains its foremost limitation. Previously, we introduced a new fluorescence imaging technique termed "temperature-modulated fluorescence tomography" (TM-FT), which provides high-resolution images of fluorophore distribution. TM-FT is a multimodality technique that combines fluorescence imaging with focused ultrasound to locate thermo-sensitive fluorescence probes using a priori spatial information to drastically improve the resolution of conventional FT. In this paper, we present an extensive simulation study to evaluate the performance of the TM-FT technique on complex phantoms with multiple fluorescent targets of various sizes located at different depths. In addition, the performance of the TM-FT is tested in the presence of background fluorescence. The results obtained using our new method are systematically compared with those obtained with the conventional FT. Overall, TM-FT provides higher resolution and superior quantitative accuracy, making it an ideal candidate for in vivo preclinical and clinical imaging. For example, a 4 mm diameter inclusion positioned in the middle of a synthetic slab geometry phantom (D: 40 mm × W: 100 mm) is recovered as an elongated object in the conventional FT (x = 4.5 mm; y = 10.4 mm), while TM-FT recovers it successfully in both directions (x = 3.8 mm; y = 4.6 mm). As a result, the quantitative accuracy of the TM-FT is superior because it recovers the concentration of the agent with a 22% error, which is in contrast with the 83% error of the conventional FT. PMID:26368884
Lu, Pei-Pei; Ting, Shing-Shiang; Chen, Mei-Ling; Tang, Woung-Ru
2005-12-01
The purpose of this study is to discuss the historical context of qualitative and quantitative research, so as to explain the principles of qualitative study and examine the positioning of nursing research within academic study as a whole. This paper guides readers through the historical context of empirical science, discusses the influences of qualitative and quantitative research on nursing research, investigates the nature of research paradigms, examines the positioning of nursing research with respect to fields such as the natural sciences, the humanities and social studies, and lastly presents the research standard proposed by Yardley in 2000. The research paradigms include Positivism, Postpositivism, Criticism, and Constructivism, which can be compared in terms of Ontology, Epistemology, and Methodology. The nature of a paradigm is determined by its assumptions at the levels of Ontology, Epistemology, and Methodology. The paradigm determines how the researcher views the world and decides what to answer, how to research, and how to answer. The difference in academic environment is reflected in the long-term dialogue between qualitative and quantitative studies, as well as in the standards for criticism. This paper introduces Yardley's (2000) criteria for evaluating the quality of qualitative study, namely sensitivity to context; commitment and rigour; transparency and coherence; and impact and importance. The paper is intended to provide a guideline for readers in evaluating the quality of qualitative studies.
Using mixed methods to develop and evaluate an online weight management intervention.
Bradbury, Katherine; Dennison, Laura; Little, Paul; Yardley, Lucy
2015-02-01
This article illustrates the use of mixed methods in the development and evaluation of the Positive Online Weight Reduction (POWeR) programme, an e-health intervention designed to support sustainable weight loss. The studies outlined also explore how human support might enhance intervention usage and weight loss. Mixed methods were used to develop and evaluate POWeR. In the development phase, we drew on both quantitative and qualitative findings to plan and gain feedback on the intervention. Next, a feasibility trial, with a nested qualitative study, explored what level of human support might lead to the most sustainable weight loss. Finally, a large community-based trial of POWeR, with a nested qualitative study, explored whether the addition of brief telephone coaching enhances usage. Findings suggest that POWeR is acceptable and potentially effective. Providing human support enhanced usage in our trials, but was not unproblematic. Interestingly, there were some indications that more basic (brief) human support may produce more sustainable weight loss outcomes than more regular support. Qualitative interviews suggested that more regular support might foster reliance, meaning patients cannot sustain their weight losses when support ends. Qualitative findings in the community trial also suggested explanations for why many people may not take up the opportunity for human support. Integrating findings from both our qualitative and quantitative studies provided far richer insights than would have been gained using only a single method of inquiry. Further research should investigate the optimum delivery of human support needed to maximize sustainable weight loss in online interventions. Statement of contribution: What is already known on this subject? There is evidence that human support may increase the effectiveness of e-health interventions. It is unclear what level of human support might be optimal or how human support improves effectiveness. Triangulation of quantitative and qualitative methods can be used to inform the design and implementation of interventions. What does this study add? This paper demonstrates the value of a mixed methods approach when developing and evaluating an intervention. Qualitative methods provided complementary insights into the optimal level of human support. Brief human support is valued by some and may enhance usage and outcomes of an e-health intervention for weight loss. © 2014 The British Psychological Society.
Evaluation of patient centered medical home practice transformation initiatives.
Crabtree, Benjamin F; Chase, Sabrina M; Wise, Christopher G; Schiff, Gordon D; Schmidt, Laura A; Goyzueta, Jeanette R; Malouin, Rebecca A; Payne, Susan M C; Quinn, Michael T; Nutting, Paul A; Miller, William L; Jaén, Carlos Roberto
2011-01-01
The patient-centered medical home (PCMH) has become a widely cited solution to the deficiencies in primary care delivery in the United States. To achieve the magnitude of change being called for in primary care, quality improvement interventions must focus on whole-system redesign, and not just isolated parts of medical practices. Investigators participating in 9 different evaluations of Patient Centered Medical Home implementation shared experiences, methodological strategies, and evaluation challenges for evaluating primary care practice redesign. A year-long iterative process of sharing and reflecting on experiences produced consensus on 7 recommendations for future PCMH evaluations: (1) look critically at models being implemented and identify aspects requiring modification; (2) include embedded qualitative and quantitative data collection to detail the implementation process; (3) capture details concerning how different PCMH components interact with one another over time; (4) understand and describe how and why physician and staff roles do, or do not, evolve; (5) identify the effectiveness of individual PCMH components and how they are used; (6) capture how primary care practices interface with other entities such as specialists, hospitals, and referral services; and (7) measure resources required for initiating and sustaining innovations. Broad-based longitudinal, mixed-methods designs that provide for shared learning among practice participants, program implementers, and evaluators are necessary to evaluate the novelty and promise of the PCMH model. All PCMH evaluations should be as comprehensive as possible, and at a minimum should include a combination of brief observations and targeted qualitative interviews along with quantitative measures.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dewanjee, M.K.; Fuster, V.; Rao, S.A.
1983-05-01
A noninvasive technique has been developed in the dog model for imaging, with a gamma camera, the platelet deposition on Bjoerk-Shiley mitral valve prostheses early postoperatively. At 25 hours after implantation of the prosthesis and 24 hours after intravenous administration of 400 to 500 microCi of platelets labeled with indium-111, the platelet deposition in the sewing ring and perivalvular cardiac tissue can be clearly delineated in a scintiphotograph. An in vitro technique was also developed for quantitation of visceral microemboli in brain, lungs, kidneys, and other tissues. Biodistribution of the labeled platelets was quantitated, and the tissue/blood radioactivity ratio was determined in 22 dogs in four groups: unoperated normal dogs, sham-operated dogs, prosthesis-implanted dogs, and prosthesis-implanted dogs treated with dipyridamole before and aspirin and dipyridamole immediately after operation. Fifteen to 20% of total platelets were consumed as a consequence of the surgical procedure. On quantitation, we found that platelet deposition on the components of the prostheses was significantly reduced in prosthesis-implanted animals treated with dipyridamole and aspirin when compared with prosthesis-implanted, untreated dogs. All prosthesis-implanted animals considered together had a twofold to fourfold increase in tissue/blood radioactivity ratio in comparison with unoperated and sham-operated animals, an indication that the viscera work as filters and trap platelet microemboli that are presumably produced in the region of the mitral valve prostheses. In the dog model, indium-111-labeled platelets thus provide a sensitive marker for noninvasive imaging of platelet deposition on mechanical mitral valve prostheses, in vitro evaluation of platelet microembolism in viscera, in vitro quantitation of surgical consumption of platelets, and evaluation of platelet-inhibitor drugs.
2011-06-10
Sharan Merriam, there are six standard approaches to qualitative research: phenomenology, grounded theory, ethnography, narrative analysis, critical... 69 Merriam, Qualitative Research, 21-22. 70 Grounded Theory Institute, "What is Grounded Theory," http... as to the available methods of research, qualitative and quantitative, and why the qualitative methodology was selected. It also provided the reader
Gentry, S V; Powers, E F J; Azim, N; Maidrag, M
2018-07-01
Voluntary befriending schemes operate in many countries, promoting public health by supporting vulnerable individuals and families. Use of third sector and voluntary services to complement health and social care provision is increasingly important globally in the context of economic and demographic challenges, but the evidence base around such collaborations is limited. This article reports the results of operational evaluation research seeking to use robust routine work to generate transferable findings for use by those commissioning and providing services. The subject of our evaluation research is 'Home-Start Suffolk' (HSS) in Suffolk County, UK, an example of a third sector organisation commissioned to support the public health offer to local families. This evaluation research used the Donabedian framework, which assesses structure, process and outcome in the delivery of health services. Methods included a cross-sectional stakeholder survey with qualitative and quantitative elements (n = 96), qualitative interviews (n = 41) and quantitative analysis of the service's routine data (5740 visits) for the period from 01 July 2014 to 01 July 2016. Triangulation of data from each component revealed that HSS was perceived by diverse stakeholders to successfully support families in need of additional help. HSS service users perceived the service to offer greater flexibility, to be tailored to their needs and to be more trustworthy and supportive than statutory services. Volunteering with HSS enabled people to feel productive in their community and gain new skills. Managers of social care services perceived that HSS activity decreased the burden on their staff. These benefits were facilitated through a long-standing organisational HSS structure and relationships between HSS and social care. Challenges posed by service provision by a third sector organisation included the need for volunteers to negotiate the boundary between being a friend and being a professional, outside of a professional framework. Quantitative analysis of impact was limited by the poor quality of routinely collected administrative data, highlighting the importance of planning processes for data collection with evaluation in mind. We believe that the results of this evaluation research provide transferable lessons. They demonstrate how a third sector organisation with a long-standing structure and relationships with statutory services was able to reduce perceived service burden while also offering support in a more flexible and tailored way that was greatly valued by service users. Copyright © 2018 The Royal Society for Public Health. All rights reserved.
Tu, Chengjian; Li, Jun; Sheng, Quanhu; Zhang, Ming; Qu, Jun
2014-04-04
Survey-scan-based label-free methods have shown no compelling benefit over fragment-ion (MS2)-based approaches when low-resolution mass spectrometry (MS) was used; however, the growing prevalence of high-resolution analyzers may have changed the game. This necessitates an updated, comparative investigation of these approaches for data acquired by high-resolution MS. Here, we compared survey-scan-based (ion current, IC) and MS2-based abundance features, including spectral count (SpC) and MS2 total ion current (MS2-TIC), for quantitative analysis using various high-resolution LC/MS data sets. Key discoveries include: (i) a study with seven different biological data sets revealed that only IC achieved high reproducibility for lower-abundance proteins; (ii) evaluation with 5-replicate analyses of a yeast sample showed IC provided much higher quantitative precision and less missing data; (iii) IC, SpC, and MS2-TIC all showed good quantitative linearity (R2 > 0.99) over a >1000-fold concentration range; (iv) both MS2-TIC and IC showed good linear response to various protein loading amounts, but SpC did not; (v) quantification using a well-characterized CPTAC data set showed that IC exhibited markedly higher quantitative accuracy, higher sensitivity, and lower false-positives/false-negatives than both SpC and MS2-TIC. Therefore, IC achieved an overall superior performance over the MS2-based strategies in terms of reproducibility, missing data, quantitative dynamic range, quantitative accuracy, and biomarker discovery.
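The compared abundance features are easy to state precisely: SpC counts identified MS2 spectra per protein, while IC sums survey-scan precursor intensity per protein. A minimal sketch on a toy peptide-spectrum-match table (column names and values are hypothetical, not from the study's pipeline):

```python
import pandas as pd

# Hypothetical peptide-spectrum-match table; a real pipeline would read this
# from a search-engine export.
psms = pd.DataFrame({
    "protein": ["P1", "P1", "P1", "P2", "P2"],
    "precursor_intensity": [1.2e7, 8.0e6, 5.5e6, 3.1e5, 2.2e5],
})

# Spectral count (SpC): number of identified MS2 spectra per protein.
spc = psms.groupby("protein").size()

# Ion current (IC): summed survey-scan precursor intensity per protein,
# the feature the study found most reproducible for low-abundance proteins.
ic = psms.groupby("protein")["precursor_intensity"].sum()

print(pd.DataFrame({"SpC": spc, "IC": ic}))
```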
Ethnographic process evaluation in primary care: explaining the complexity of implementation.
Bunce, Arwen E; Gold, Rachel; Davis, James V; McMullen, Carmit K; Jaworski, Victoria; Mercer, MaryBeth; Nelson, Christine
2014-12-05
The recent growth of implementation research in care delivery systems has led to a renewed interest in methodological approaches that deliver not only intervention outcome data but also deep understanding of the complex dynamics underlying the implementation process. We suggest that an ethnographic approach to process evaluation, when informed by and integrated with quantitative data, can provide this nuanced insight into intervention outcomes. The specific methods used in such ethnographic process evaluations are rarely presented in detail; our objective is to stimulate a conversation around the successes and challenges of specific data collection methods in health care settings. We use the example of a translational clinical trial among 11 community clinics in Portland, OR that are implementing an evidence-based, health-information technology (HIT)-based intervention focused on patients with diabetes. Our ethnographic process evaluation employed weekly diaries by clinic-based study employees, observation, informal and formal interviews, document review, surveys, and group discussions to identify barriers and facilitators to implementation success, provide insight into the quantitative study outcomes, and uncover lessons potentially transferable to other implementation projects. These methods captured the depth and breadth of factors contributing to intervention uptake, while minimizing disruption to clinic work and supporting mid-stream shifts in implementation strategies. A major challenge is the amount of dedicated researcher time required. The deep understanding of the 'how' and 'why' behind intervention outcomes that can be gained through an ethnographic approach improves the credibility and transferability of study findings. We encourage others to share their own experiences with ethnography in implementation evaluation and health services research, and to consider adapting the methods and tools described here for their own research.
ERIC Educational Resources Information Center
Teater, Barbra; Roy, Jessica; Carpenter, John; Forrester, Donald; Devaney, John; Scourfield, Jonathan
2017-01-01
Students in the United Kingdom (UK) are found to lack knowledge and skills in quantitative research methods. To address this gap, a quantitative research method and statistical analysis curriculum comprising 10 individual lessons was developed, piloted, and evaluated at two universities. The evaluation found that BSW students' (N = 81)…
Heijtel, D F R; Mutsaerts, H J M M; Bakker, E; Schober, P; Stevens, M F; Petersen, E T; van Berckel, B N M; Majoie, C B L M; Booij, J; van Osch, M J P; Vanbavel, E; Boellaard, R; Lammertsma, A A; Nederveen, A J
2014-05-15
Measurements of the cerebral blood flow (CBF) and cerebrovascular reactivity (CVR) provide useful information about cerebrovascular condition and regional metabolism. Pseudo-continuous arterial spin labeling (pCASL) is a promising non-invasive MRI technique to quantitatively measure the CBF, whereas additional hypercapnic pCASL measurements are currently showing great promise to quantitatively assess the CVR. However, the introduction of pCASL at a larger scale awaits further evaluation of the exact accuracy and precision compared to the gold standard. ¹⁵O H₂O positron emission tomography (PET) is currently regarded as the most accurate and precise method to quantitatively measure both CBF and CVR, though it is one of the more invasive methods as well. In this study we therefore assessed the accuracy and precision of quantitative pCASL-based CBF and CVR measurements by performing a head-to-head comparison with ¹⁵O H₂O PET, based on quantitative CBF measurements during baseline and hypercapnia. We demonstrate that pCASL CBF imaging is accurate during both baseline and hypercapnia with respect to ¹⁵O H₂O PET, with comparable precision. These results pave the way for quantitative usage of pCASL MRI in both clinical and research settings. Copyright © 2014 Elsevier Inc. All rights reserved.
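Accuracy (bias) and precision (spread of paired differences) in such a head-to-head comparison are commonly summarized with a Bland-Altman analysis; the sketch below assumes paired regional CBF values and is our illustration, not the paper's statistical pipeline. Running it separately on baseline and hypercapnic scans, and on their difference (the CVR), mirrors the comparison described above.

    import numpy as np

    def bland_altman(cbf_pcasl, cbf_pet):
        # bias = mean paired difference (accuracy); the 95% limits of
        # agreement capture the precision of pCASL against the PET reference
        diff = np.asarray(cbf_pcasl, float) - np.asarray(cbf_pet, float)
        bias = diff.mean()
        sd = diff.std(ddof=1)
        return bias, (bias - 1.96 * sd, bias + 1.96 * sd)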
Direct Measurements of the Convective Recycling of the Upper Troposphere
NASA Technical Reports Server (NTRS)
Bertram, Timothy H.; Perring, Anne E.; Wooldridge, Paul J.; Crounse, John D.; Kwan, Alan J.; Wennberg, Paul O.; Scheuer, Eric; Dibb, Jack; Avery, Melody; Sachse, Glen;
2007-01-01
We present a statistical representation of the aggregate effects of deep convection on the chemistry and dynamics of the Upper Troposphere (UT) based on direct aircraft observations of the chemical composition of the UT over the Eastern United States and Canada during summer. These measurements provide new and unique observational constraints on the chemistry occurring downwind of convection and the rate at which air in the UT is recycled, previously only the province of model analyses. These results provide quantitative measures that can be used to evaluate global climate and chemistry models.
Volume versus surface-mediated recombination in anatase TiO2 nanoparticles
NASA Astrophysics Data System (ADS)
Cavigli, Lucia; Bogani, Franco; Vinattieri, Anna; Faso, Valentina; Baldi, Giovanni
2009-09-01
We present an experimental study of the radiative recombination dynamics in size-controlled anatase TiO2 nanoparticles in the range 20-130 nm. From time-integrated photoluminescence spectra and picosecond time-resolved experiments as a function of the nanoparticle size, excitation density, and temperature, we show that the photoluminescence originates from both bulk and surface radiative recombination. The spectral shift and the different time dynamics provide a clear distinction between them. Moreover, the intrinsic nature of the emission is also proven, providing a quantitative evaluation of volume and surface contributions.
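A common way to quantify the two contributions from time-resolved data is a two-component decay fit; the model, parameter names, and starting values below are illustrative assumptions, not the authors' analysis.

    import numpy as np
    from scipy.optimize import curve_fit

    def two_component_decay(t, a_bulk, tau_bulk, a_surf, tau_surf):
        # bulk and surface radiative channels modeled as single exponentials
        return a_bulk * np.exp(-t / tau_bulk) + a_surf * np.exp(-t / tau_surf)

    t = np.linspace(0.01, 5.0, 200)                       # ns, hypothetical
    pl = two_component_decay(t, 1.0, 0.15, 0.4, 1.2)      # synthetic trace
    popt, _ = curve_fit(two_component_decay, t, pl, p0=[1, 0.1, 0.5, 1.0])
    a_b, tau_b, a_s, tau_s = popt
    volume_fraction = a_b * tau_b / (a_b * tau_b + a_s * tau_s)
    print(f"volume contribution ~ {volume_fraction:.2f}")  # integrated weight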
Kim, Song Soo; Lee, Ho Yun; Nevrekar, Dipti V.; Forssen, Anna V.; Crapo, James D.; Schroeder, Joyce D.; Lynch, David A.
2013-01-01
Purpose: To provide a new detailed visual assessment scheme of computed tomography (CT) for chronic obstructive pulmonary disease (COPD) by using standard reference images and to compare this visual assessment method with quantitative CT and several physiologic parameters. Materials and Methods: This research was approved by the institutional review board of each institution. CT images of 200 participants in the COPDGene study were evaluated. Four thoracic radiologists performed independent, lobar analysis of volumetric CT images for type (centrilobular, panlobular, and mixed) and extent (on a six-point scale) of emphysema, the presence of bronchiectasis, airway wall thickening, and tracheal abnormalities. Standard images for each finding, generated by two radiologists, were used for reference. The extent of emphysema, airway wall thickening, and luminal area were quantified at the lobar level by using commercial software. Spearman rank test and simple and multiple regression analyses were performed to compare the results of visual assessment with physiologic and quantitative parameters. Results: The type of emphysema, determined by four readers, showed good agreement (κ = 0.63). The extent of the emphysema in each lobe showed good agreement (mean weighted κ = 0.70) and correlated with findings at quantitative CT (r = 0.75), forced expiratory volume in 1 second (FEV1) (r = −0.68), and FEV1/forced vital capacity (FVC) ratio (r = −0.74) (P < .001). Agreement for airway wall thickening was fair (mean κ = 0.41), and the number of lobes with thickened bronchial walls correlated with FEV1 (r = −0.60) and FEV1/FVC ratio (r = −0.60) (P < .001). Conclusion: Visual assessment of emphysema and airways disease in individuals with COPD can provide reproducible, physiologically substantial information that may complement that provided by quantitative CT assessment. © RSNA, 2012 Supplemental material: http://radiology.rsna.org/lookup/suppl/doi:10.1148/radiol.12120385/-/DC1 PMID:23220894
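The agreement and correlation statistics quoted above come from standard tools; a sketch with fabricated placeholder scores (linear weights are one common choice for weighted kappa; the study's exact weighting is not stated here):

    import numpy as np
    from scipy.stats import spearmanr
    from sklearn.metrics import cohen_kappa_score

    # hypothetical six-point (0-5) lobar emphysema scores from two readers
    reader1 = np.array([0, 1, 2, 2, 3, 4, 5, 1, 0, 3])
    reader2 = np.array([0, 1, 1, 2, 3, 5, 5, 2, 0, 3])
    kappa = cohen_kappa_score(reader1, reader2, weights="linear")

    # rank correlation of the visual score against quantitative CT (% emphysema)
    qct = np.array([1.0, 4.2, 8.5, 7.9, 15.0, 22.3, 35.1, 5.0, 0.8, 14.2])
    rho, p = spearmanr(reader1, qct)
    print(f"weighted kappa = {kappa:.2f}, Spearman r = {rho:.2f}")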
Quantitative Muscle Ultrasonography in Carpal Tunnel Syndrome.
Lee, Hyewon; Jee, Sungju; Park, Soo Ho; Ahn, Seung-Chan; Im, Juneho; Sohn, Min Kyun
2016-12-01
To assess the reliability of quantitative muscle ultrasonography (US) in healthy subjects and to evaluate the correlation between quantitative muscle US findings and electrodiagnostic study results in patients with carpal tunnel syndrome (CTS). The clinical significance of quantitative muscle US in CTS was also assessed. Twenty patients with CTS and 20 age-matched healthy volunteers were recruited. All control and CTS subjects underwent a bilateral median and ulnar nerve conduction study (NCS) and quantitative muscle US. Transverse US images of the abductor pollicis brevis (APB) and abductor digiti minimi (ADM) were obtained to measure muscle cross-sectional area (CSA), thickness, and echo intensity (EI). EI was determined using computer-assisted, grayscale analysis. Inter-rater and intra-rater reliability for quantitative muscle US in control subjects, and differences in muscle thickness, CSA, and EI between the CTS patient and control groups were analyzed. Relationships between quantitative US parameters and electrodiagnostic study results were evaluated. Quantitative muscle US had high inter-rater and intra-rater reliability in the control group. Muscle thickness and CSA were significantly decreased, and EI was significantly increased in the APB of the CTS group (all p<0.05). EI demonstrated a significant positive correlation with latency of the median motor and sensory NCS in CTS patients (p<0.05). These findings suggest that quantitative muscle US parameters may be useful for detecting muscle changes in CTS. Further study involving patients with other neuromuscular diseases is needed to evaluate peripheral muscle change using quantitative muscle US.
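Computer-assisted grayscale analysis of this kind essentially averages pixel intensities inside a traced region of interest; a minimal sketch, with the function name and pixel-calibration parameter ours rather than the study software's:

    import numpy as np

    def muscle_us_metrics(image, mask, pixel_area_mm2):
        # mean echo intensity (EI) on the 0-255 grayscale inside the ROI,
        # and cross-sectional area (CSA) from pixel count times pixel size
        pixels = np.asarray(image, float)[np.asarray(mask, bool)]
        ei = pixels.mean()
        csa = np.count_nonzero(mask) * pixel_area_mm2
        return ei, csa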
Electron-Beam Diagnostic Methods for Hypersonic Flow Diagnostics
NASA Technical Reports Server (NTRS)
1994-01-01
The purpose of this work was the evaluation of the use of electron-beam fluorescence for flow measurements during hypersonic flight. Both analytical and numerical models were developed in this investigation to evaluate quantitatively flow field imaging concepts based upon the electron beam fluorescence technique for use in flight research and wind tunnel applications. Specific models were developed for: (1) fluorescence excitation/emission for nitrogen, (2) rotational fluorescence spectrum for nitrogen, (3) single and multiple scattering of electrons in a variable density medium, (4) spatial and spectral distribution of fluorescence, (5) measurement of rotational temperature and density, (6) optical filter design for fluorescence imaging, and (7) temperature accuracy and signal acquisition time requirements. Application of these models to a typical hypersonic wind tunnel flow is presented. In particular, the capability of simulating the fluorescence resulting from electron impact ionization in a variable density nitrogen or air flow makes it possible to evaluate the design of imaging instruments for flow field mapping. The result of this analysis is a recommendation that quantitative measurement of hypersonic flow fields using electron-beam fluorescence is tractable with electron beam energies of 100 keV. With lower electron energies, electron scattering increases, producing significant beam divergence that makes quantitative imaging difficult. The potential application of the analytical and numerical models developed in this work is in the design of a flow field imaging instrument for use in hypersonic wind tunnels or onboard a flight research vehicle.
Benchmarking on the evaluation of major accident-related risk assessment.
Fabbri, Luciano; Contini, Sergio
2009-03-15
This paper summarises the main results of the European project BEQUAR (Benchmarking Exercise in Quantitative Area Risk Assessment in Central and Eastern European Countries). This project is among the first attempts to explore how independent evaluations of the same risk study associated with a certain chemical establishment could differ from each other and the consequent effects on the resulting area risk estimate. The exercise specifically aimed at exploring the manner and degree to which independent experts may disagree on the interpretation of quantitative risk assessments for the same entity. The project first compared the results of a number of independent expert evaluations of a quantitative risk assessment study for the same reference chemical establishment. This effort was then followed by a study of the impact of the different interpretations on the estimate of the overall risk in the area concerned. In order to improve the inter-comparability of the results, this exercise was conducted using a single tool for area risk assessment based on the ARIPAR methodology. The results of this study are expected to contribute to an improved understanding of the inspection criteria and practices used by the different national authorities responsible for the implementation of the Seveso II Directive in their countries. The activity was funded under the Enlargement and Integration Action of the Joint Research Centre (JRC), which aims at providing scientific and technological support for promoting integration of the New Member States and assisting the Candidate Countries on their way towards accession to the European Union.
Assis, Sofia; Costa, Pedro; Rosas, Maria Jose; Vaz, Rui; Silva Cunha, Joao Paulo
2016-08-01
Intraoperative evaluation of the efficacy of Deep Brain Stimulation (DBS) includes evaluation of the effect on rigidity. A subjective semi-quantitative scale is used, dependent on the examiner's perception and experience. A system was proposed previously, aiming to tackle this subjectivity, using quantitative data and providing real-time feedback of the computed rigidity reduction, hence supporting the physician's decision. This system comprised a gyroscope-based motion sensor in a textile band, placed on the patient's hand, which communicated its measurements to a laptop. The latter computed a signal descriptor from the angular velocity of the hand during wrist flexion in DBS surgery. The first approach relied on a general rigidity-reduction model, regardless of the initial severity of the symptom. Thus, to enhance the performance of the previously presented system, we aimed to develop separate models for high and low baseline rigidity, according to the examiner's assessment before any stimulation. This allows a more patient-oriented approach. Additionally, usability was improved by performing the processing in situ on a smartphone instead of a computer. The system has proven reliable, presenting an accuracy of 82.0% and a mean error of 3.4%. Relative to previous results, the performance was similar, further supporting the importance of considering cogwheel rigidity to better infer the reduction in rigidity. Overall, we present a simple, wearable, mobile system, suitable for intraoperative conditions during DBS, supporting the physician in decision-making when setting stimulation parameters.
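The abstract does not spell out the signal descriptor, so the sketch below substitutes a plausible stand-in (mean absolute angular velocity during passive wrist flexion) to show how a percentage change against the pre-stimulation baseline could be computed.

    import numpy as np

    def rigidity_reduction(angular_velocity, baseline_value):
        # illustrative descriptor: mean absolute angular velocity of the
        # gyroscope trace; a stiffer wrist moves more slowly under passive
        # flexion, so a rise against baseline suggests reduced rigidity
        descriptor = np.mean(np.abs(angular_velocity))
        return 100.0 * (descriptor - baseline_value) / baseline_value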
The Nuclear Renaissance — Implications on Quantitative Nondestructive Evaluations
NASA Astrophysics Data System (ADS)
Matzie, Regis A.
2007-03-01
The world demand for energy is growing rapidly, particularly in developing countries that are trying to raise the standard of living for billions of people, many of whom do not even have access to electricity. With this increased energy demand and the high and volatile price of fossil fuels, nuclear energy is experiencing a resurgence. This so-called nuclear renaissance is broad based, reaching across Asia, the United States, and Europe, as well as selected countries in Africa and South America. Some countries, such as Italy, that have actually turned away from nuclear energy are reconsidering the advisability of this decision. This renaissance provides the opportunity to deploy reactor designs more advanced than those operating today, with improved safety, economy, and operations. In this keynote address, I will briefly present three such advanced reactor designs in whose development Westinghouse is participating. These designs include the advanced passive PWR, AP1000, which recently received design certification from the US Nuclear Regulatory Commission; the Pebble Bed Modular Reactor (PBMR), which is being demonstrated in South Africa; and the International Reactor Innovative and Secure (IRIS), which was showcased in the US Department of Energy's recently announced Global Nuclear Energy Partnership (GNEP) program. The salient features of these designs that impact future requirements on quantitative nondestructive evaluations will be discussed. Such features as reactor vessel materials, operating temperature regimes, and new geometric configurations will be described, and mention will be made of the impact on quantitative nondestructive evaluation (NDE) approaches.
Badran, Hani; Pluye, Pierre; Grad, Roland
2017-03-14
The Information Assessment Method (IAM) allows clinicians to report the cognitive impact, clinical relevance, intention to use, and expected patient health benefits associated with clinical information received by email. More than 15,000 Canadian physicians and pharmacists use the IAM in continuing education programs. In addition, information providers can use IAM ratings and feedback comments from clinicians to improve their products. Our general objective was to validate the IAM questionnaire for the delivery of educational material (ecological and logical content validity). Our specific objectives were to measure the relevance and evaluate the representativeness of IAM items for assessing information received by email. A 3-part mixed methods study was conducted (convergent design). In part 1 (quantitative longitudinal study), the relevance of IAM items was measured. Participants were 5596 physician members of the Canadian Medical Association who used the IAM. A total of 234,196 ratings were collected in 2012. The relevance of IAM items with respect to their main construct was calculated using descriptive statistics (relevance ratio R). In part 2 (qualitative descriptive study), the representativeness of IAM items was evaluated. A total of 15 family physicians completed semistructured face-to-face interviews. For each construct, we evaluated the representativeness of IAM items using a deductive-inductive thematic qualitative data analysis. In part 3 (mixing quantitative and qualitative parts), results from quantitative and qualitative analyses were reviewed, juxtaposed in a table, discussed with experts, and integrated. Thus, our final results are derived from the views of users (ecological content validation) and experts (logical content validation). Of the 23 IAM items, 21 were validated for content, while 2 were removed. In part 1 (quantitative results), 21 items were deemed relevant, while 2 items were deemed not relevant (R=4.86% [N=234,196] and R=3.04% [n=45,394], respectively). In part 2 (qualitative results), 22 items were deemed representative, while 1 item was not representative. In part 3 (mixing quantitative and qualitative results), the content validity of 21 items was confirmed, and the 2 nonrelevant items were excluded. A fully validated version was generated (IAM-v2014). This study produced a content validated IAM questionnaire that is used by clinicians and information providers to assess the clinical information delivered in continuing education programs. ©Hani Badran, Pierre Pluye, Roland Grad. Originally published in JMIR Medical Education (http://mededu.jmir.org), 14.03.2017.
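As we read it, the relevance ratio R is a simple proportion of ratings; a one-line sketch under that assumption, with a numerator chosen only to reproduce the abstract's R = 4.86% for illustration:

    def relevance_ratio(n_selected, n_total):
        # share of all collected ratings in which the item was selected
        # relative to its main construct, expressed as a percentage
        return 100.0 * n_selected / n_total

    # e.g. an item selected in 11,382 of 234,196 ratings (hypothetical count)
    print(f"R = {relevance_ratio(11382, 234196):.2f}%")   # -> R = 4.86%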
Assessing healthcare professionals' experiences of integrated care: do surveys tell the full story?
Stephenson, Matthew D; Campbell, Jared M; Lisy, Karolina; Aromataris, Edoardo C
2017-09-01
Integrated care is the combination of different healthcare services with the goal of providing comprehensive, seamless, effective and efficient patient care. Assessing the experiences of healthcare professionals (HCPs) is an important aspect when evaluating integrated care strategies. The aim of this rapid review was to investigate if quantitative surveys used to assess HCPs' experiences with integrated care capture all the aspects highlighted as being important in qualitative research, with a view to informing future survey development. The review considered all types of health professionals in primary care, and hospital and specialist services, with a specific focus on the provision of integrated care aimed at improving the patient journey. PubMed, CINAHL and grey literature sources were searched for relevant surveys/program evaluations and qualitative research studies. Full text articles deemed to be of relevance to the review were appraised for methodological quality using abridged critical appraisal instruments from the Joanna Briggs Institute. Data were extracted from included studies using standardized data extraction templates. Findings from included studies were grouped into domains based on similarity of meaning. Similarities and differences in the domains covered in quantitative surveys and those identified as being important in qualitative research were explored. A total of 37 studies (19 quantitative surveys, 14 qualitative studies and four mixed-method studies) were included in the review. A range of healthcare professions participated in the included studies, the majority being primary care providers. Common domains identified from quantitative surveys and qualitative studies included Communication, Agreement on Clear Roles and Responsibilities, Facilities, Information Systems, and Coordination of Care and Access. Qualitative research highlighted domains identified by HCPs as being relevant to their experiences with integrated care that have not routinely been surveyed, including Workload, Clear Leadership/Decision-Making, Management, Flexibility of Integrated Care Model, Engagement, Usefulness of Integrated Care and Collaboration, and Positive Impact/Clinical Benefits/Practice Level Benefits. There were several domains identified from qualitative research that are not routinely included in quantitative surveys to assess health professionals' experiences of integrated care. In addition, the qualitative findings suggest that the experiences of HCPs are often impacted by deeper aspects than those measured by existing surveys. Incorporation of targeted items within these domains in the design of surveys should enhance the capture of data that are relevant to the experiences of HCPs with integrated care, which may assist in more comprehensive evaluation and subsequent improvement of integrated care programs.
Clinical Utility of Quantitative Gleason Grading in Prostate Biopsies and Prostatectomy Specimens.
Sauter, Guido; Steurer, Stefan; Clauditz, Till Sebastian; Krech, Till; Wittmer, Corinna; Lutz, Florian; Lennartz, Maximilian; Janssen, Tim; Hakimi, Nayira; Simon, Ronald; von Petersdorff-Campen, Mareike; Jacobsen, Frank; von Loga, Katharina; Wilczak, Waldemar; Minner, Sarah; Tsourlakis, Maria Christina; Chirico, Viktoria; Haese, Alexander; Heinzer, Hans; Beyer, Burkhard; Graefen, Markus; Michl, Uwe; Salomon, Georg; Steuber, Thomas; Budäus, Lars Henrik; Hekeler, Elena; Malsy-Mink, Julia; Kutzera, Sven; Fraune, Christoph; Göbel, Cosima; Huland, Hartwig; Schlomm, Thorsten
2016-04-01
Gleason grading is the strongest prognostic parameter in prostate cancer. Gleason grading is categorized as Gleason ≤ 6, 3 + 4, 4 + 3, 8, and 9-10, but there is variability within these subgroups. For example, Gleason 4 components may range from 5-45% in a Gleason 3 + 4 = 7 cancer. To assess the clinical relevance of the fractions of Gleason patterns. Prostatectomy specimens from 12823 consecutive patients and 2971 matched preoperative biopsies for which clinical data with an annual follow-up between 2005 and 2014 were available from the Martini-Klinik database. To evaluate the utility of quantitative grading, the fractions of Gleason 3, 4, and 5 patterns seen in biopsies and prostatectomies were recorded. Gleason grade fractions were compared with prostatectomy findings and prostate-specific antigen recurrence. Our data suggest a striking utility of quantitative Gleason grading. In prostatectomy specimens, there was a continuous increase of the risk of prostate-specific antigen recurrence with increasing percentage of Gleason 4 fractions with remarkably small differences in outcome at clinically important thresholds (0% vs 5%; 40% vs 60% Gleason 4), distinguishing traditionally established prognostic groups. Also, in biopsies, the quantitative Gleason scoring identified various intermediate risk groups with respect to Gleason findings in corresponding prostatectomies. Quantitative grading may also reduce the clinical impact of interobserver variability because borderline findings such as tumors with 5%, 40%, or 60% Gleason 4 fractions and very small Gleason 5 fractions (with pivotal impact on the Gleason score) are disclaimed. Quantitative Gleason pattern data should routinely be provided in addition to Gleason score categories, both in biopsies and in prostatectomy specimens. Gleason score is the most important prognostic parameter in prostate cancer, but prone to interobserver variation. The results of our study show that morphological aspects that define the Gleason grade in prostate cancer represent a continuum. Quantitation of Gleason patterns provides clinically relevant information beyond the traditional Gleason grading categories ≤ 3 + 3, 3 + 4, 4 + 3, 8, 9-10. Quantitative Gleason scoring can help to minimize variations between different pathologists and substantially aid in optimized therapy decision-making. Copyright © 2015 European Association of Urology. Published by Elsevier B.V. All rights reserved.
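To make the continuum point concrete: when only patterns 3 and 4 are present, traditional grading collapses the pattern-4 fraction into categories as below (thresholds follow standard grading convention; the helper is ours), so tumors with 40% and 60% Gleason 4 land in different groups despite the small quantitative difference.

    def gleason_category_3_4(pct_g4):
        # traditional category from the percentage of Gleason 4 pattern,
        # assuming only patterns 3 and 4 are present; quantitative grading
        # would report pct_g4 itself (e.g. 5% vs 40% vs 60%)
        if pct_g4 == 0:
            return "3+3=6"
        if pct_g4 < 50:
            return "3+4=7"    # pattern 3 predominant
        if pct_g4 < 100:
            return "4+3=7"    # pattern 4 predominant
        return "4+4=8"

    print(gleason_category_3_4(40), gleason_category_3_4(60))  # 3+4=7 4+3=7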
Statistical shape analysis using 3D Poisson equation--A quantitatively validated approach.
Gao, Yi; Bouix, Sylvain
2016-05-01
Statistical shape analysis has been an important area of research with applications in biology, anatomy, neuroscience, agriculture, paleontology, etc. Unfortunately, the proposed methods are rarely quantitatively evaluated, and as shown in recent studies, when they are evaluated, significant discrepancies exist in their outputs. In this work, we concentrate on the problem of finding the consistent location of deformation between two populations of shapes. We propose a new shape analysis algorithm along with a framework to perform a quantitative evaluation of its performance. Specifically, the algorithm constructs a Signed Poisson Map (SPoM) by solving two Poisson equations on the volumetric shapes of arbitrary topology, and statistical analysis is then carried out on the SPoMs. The method is quantitatively evaluated on synthetic shapes and applied on real shape data sets in brain structures. Copyright © 2016 Elsevier B.V. All rights reserved.
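As a sketch of the volumetric Poisson solve underlying such a map (a simple Jacobi iteration of our own, not the authors' SPoM implementation), solving laplacian(u) = -1 inside a binary shape with u = 0 outside:

    import numpy as np

    def poisson_interior(mask, n_iter=500):
        # Jacobi iterations for laplacian(u) = -1 on a unit grid inside a
        # binary volumetric shape; assumes the shape does not touch the
        # array border (np.roll wraps around)
        mask = np.asarray(mask, bool)
        u = np.zeros(mask.shape)
        for _ in range(n_iter):
            nb = sum(np.roll(u, s, axis) for axis in (0, 1, 2) for s in (1, -1))
            u = np.where(mask, (nb + 1.0) / 6.0, 0.0)
        return u

    # demo on a ball of radius 0.8 in a 32^3 grid
    g = np.mgrid[-1:1:32j, -1:1:32j, -1:1:32j]
    u = poisson_interior((g ** 2).sum(axis=0) < 0.8 ** 2)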
Marini, N; Bevilacqua, C B; Büttow, M V; Raseira, M C B; Bonow, S
2017-05-25
Selecting and validating reference genes are the first steps in studying gene expression by reverse transcriptase-quantitative polymerase chain reaction (RT-qPCR). The present study aimed to evaluate the stability of five reference genes for the purpose of normalization when studying gene expression in various cultivars of Prunus persica with different chilling requirements. Flower bud tissues of nine peach genotypes with different chilling requirements, drawn from Embrapa's peach breeding program, were used, and the expression stability of five candidate reference genes was evaluated from the RT-qPCR data using the geNorm, NormFinder, and BestKeeper software packages. The results indicated that, among the genes tested, the most stable genes for use as references are Act and UBQ10. This study is the first survey of the stability of reference genes in peaches under chilling stress and provides guidelines for more accurate RT-qPCR results.
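geNorm's stability measure M, one of the three used above, follows directly from its published definition (Vandesompele et al., 2002): for each gene, the average standard deviation of pairwise log-ratios with every other candidate. The input layout and function name below are ours.

    import numpy as np

    def genorm_m_values(rq):
        # rq: (n_samples, n_genes) relative quantities from RT-qPCR;
        # M_j = mean over other genes k of the std-dev across samples of
        # log2(rq_j / rq_k); a lower M means a more stable reference gene
        logs = np.log2(np.asarray(rq, float))
        n_genes = logs.shape[1]
        return np.array([
            np.mean([np.std(logs[:, j] - logs[:, k], ddof=1)
                     for k in range(n_genes) if k != j])
            for j in range(n_genes)
        ])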
Zhao, Ke; Ji, Yaoyao; Li, Yan; Li, Ting
2018-01-21
Near-infrared spectroscopy (NIRS) has become widely accepted as a valuable tool for noninvasively monitoring hemodynamics for clinical and diagnostic purposes. Baseline shift has attracted great attention in the field, but there has been little quantitative study on baseline removal. Here, we aimed to study the baseline characteristics of an in-house-built portable medical NIRS device over a long duration (>3.5 h). We found that the measured baselines all formed near-perfect polynomial functions in phantom tests mimicking human bodies, as identified by recent NIRS studies. More importantly, our study shows that the fourth-order polynomial function gave the best-distinguished performance, with stable, low-computation-burden fitting calibration (R-square > 0.99 for all probes), among the second- to sixth-order polynomials, as evaluated by R-square, the sum of squares due to error, and the residuals. This study provides a straightforward, efficient, and quantitatively evaluated solution for online baseline removal for hemodynamic monitoring using NIRS devices.
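A minimal version of that calibration, using numpy's polynomial fit; the order-4 default reflects the paper's finding, and the R-square computation mirrors the goodness-of-fit criterion (function name and interface are ours).

    import numpy as np

    def remove_polynomial_baseline(t, signal, order=4):
        # fit a polynomial baseline over the long recording and subtract it
        coeffs = np.polyfit(t, signal, order)
        baseline = np.polyval(coeffs, t)
        ss_res = np.sum((signal - baseline) ** 2)
        ss_tot = np.sum((signal - np.mean(signal)) ** 2)
        r_square = 1.0 - ss_res / ss_tot   # goodness of the baseline model
        return signal - baseline, r_square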
Boydell, K M; Everett, B
1992-01-01
Supported housing (as distinct from supportive housing) emphasizes the values of consumer choice; independence; participation; permanence; normalcy; and flexible, ongoing supports. As a model, it has only recently become popular in the literature and therefore little is known of its effectiveness in serving people with long-term psychiatric backgrounds. In 1989, Homeward Projects, a community mental health agency located in Metropolitan Toronto, established a supported housing project. Homeward included an evaluative component in its program from the outset. In order to give equal weight to the tenants' opinions, both quantitative and qualitative methodologies were employed. In the quantitative component, residential milieu, social support, and service delivery were examined. The qualitative component involved an ethnographic study which allowed the tenants to voice their experiences of living in such a setting. Results provided a rich understanding of the model. Overall, the tenants eventually came to describe their house as a home.
Cai, Xiang; Shen, Liguo; Zhang, Meijia; Chen, Jianrong; Hong, Huachang; Lin, Hongjun
2017-11-01
Quantitatively evaluating the interaction energy between two randomly rough surfaces is a prerequisite for quantitatively understanding and controlling membrane fouling in membrane bioreactors (MBRs). In this study, a new unified approach to construct rough topographies and to quantify the interaction energy between a randomly rough particle and a randomly rough membrane was proposed. It was found that natural rough topographies of both foulants and membrane could be well constructed by a modified two-variable Weierstrass-Mandelbrot (WM) function drawn from fractal theory. Spatial differential relationships between the two constructed surfaces were accordingly established. Thereafter, a new approach combining these relationships with the surface element integration (SEI) approach and the composite Simpson's rule was derived to calculate the interaction energy between two randomly rough surfaces in a submerged MBR. The obtained results indicate the profound effects of surface morphology on interaction energy and membrane fouling. This study provides a basic approach to investigate membrane fouling and interface behaviors. Copyright © 2017 Elsevier Ltd. All rights reserved.
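For concreteness, one common form of a modified two-variable Weierstrass-Mandelbrot function (the Yan-Komvopoulos variant with random phases) for constructing a randomly rough topography with fractal dimension 2 < D < 3 is sketched below; all parameter values are illustrative, not taken from the paper.

    import numpy as np

    def wm_rough_surface(n=128, L=1.0, D=2.4, G=1e-3, gamma=1.5,
                         M=10, n_max=20, seed=0):
        # superpose ridges at M azimuthal orientations and n_max+1 spatial
        # frequencies gamma**k, with amplitudes decaying as gamma**((D-3)*k)
        rng = np.random.default_rng(seed)
        x = np.linspace(-L / 2, L / 2, n)
        X, Y = np.meshgrid(x, x)
        r, theta = np.hypot(X, Y), np.arctan2(Y, X)
        z = np.zeros_like(X)
        for m in range(1, M + 1):
            phi = rng.uniform(0, 2 * np.pi, n_max + 1)   # random phases
            for k in range(n_max + 1):
                arg = (2 * np.pi * gamma ** k * r / L
                       * np.cos(theta - np.pi * m / M) + phi[k])
                z += gamma ** ((D - 3) * k) * (np.cos(phi[k]) - np.cos(arg))
        return L * (G / L) ** (D - 2) * np.sqrt(np.log(gamma) / M) * z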
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-17
... Evaluation and Research (CBER) and suggestions for further development. The public workshop will include... Evaluation and Research (HFM-210), Food and Drug Administration, 1401 Rockville Pike, suite 200N, Rockville... models to generate quantitative estimates of the benefits and risks of influenza vaccination. The public...