Sample records for the search term "quantitatively evaluated based"

  1. [Reconstituting evaluation methods based on both qualitative and quantitative paradigms].

    PubMed

    Miyata, Hiroaki; Okubo, Suguru; Yoshie, Satoru; Kai, Ichiro

    2011-01-01

    Debate about the relationship between quantitative and qualitative paradigms is often muddled and confusing, and the clutter of terms and arguments has resulted in the concepts becoming obscure and unrecognizable. In this study we conducted a content analysis of evaluation methods for qualitative healthcare research. We extracted descriptions of four types of evaluation paradigm (validity/credibility, reliability/dependability, objectivity/confirmability, and generalizability/transferability), and classified them into subcategories. In quantitative research, there have been many evaluation methods based on qualitative paradigms, and vice versa. Thus, it may not be useful to treat evaluation methods of the qualitative paradigm as isolated from those of quantitative methods. Choosing practical evaluation methods based on the situation and prior conditions of each study is an important approach for researchers.

  2. Brain Injury Lesion Imaging Using Preconditioned Quantitative Susceptibility Mapping without Skull Stripping.

    PubMed

    Soman, S; Liu, Z; Kim, G; Nemec, U; Holdsworth, S J; Main, K; Lee, B; Kolakowsky-Hayner, S; Selim, M; Furst, A J; Massaband, P; Yesavage, J; Adamson, M M; Spincemaille, P; Moseley, M; Wang, Y

    2018-04-01

    Identifying cerebral microhemorrhage burden can aid in the diagnosis and management of traumatic brain injury, stroke, hypertension, and cerebral amyloid angiopathy. MR imaging susceptibility-based methods are more sensitive than CT for detecting cerebral microhemorrhage, but methods other than quantitative susceptibility mapping provide results that vary with field strength and TE, require additional phase maps to distinguish blood from calcification, and depict cerebral microhemorrhages as bloom artifacts. Quantitative susceptibility mapping provides universal quantification of tissue magnetic property without these constraints but traditionally requires a mask generated by skull-stripping, which can pose challenges at tissue interfaces. We evaluated the preconditioned quantitative susceptibility mapping MR imaging method, which does not require skull-stripping, for improved depiction of brain parenchyma and pathology. Fifty-six subjects underwent brain MR imaging with a 3D multiecho gradient recalled echo acquisition. Mask-based quantitative susceptibility mapping images were created using a commonly used mask-based quantitative susceptibility mapping method, and preconditioned quantitative susceptibility images were made using precondition-based total field inversion. All images were reviewed by a neuroradiologist and a radiology resident. Ten subjects (18%), all with traumatic brain injury, demonstrated blood products on 3D gradient recalled echo imaging. All lesions were visible on preconditioned quantitative susceptibility mapping, while 6 were not visible on mask-based quantitative susceptibility mapping. Thirty-one subjects (55%) demonstrated brain parenchyma and/or lesions that were visible on preconditioned quantitative susceptibility mapping but not on mask-based quantitative susceptibility mapping. Six subjects (11%) demonstrated pons artifacts on preconditioned quantitative susceptibility mapping and mask-based quantitative susceptibility mapping; they were worse on preconditioned quantitative susceptibility mapping. Preconditioned quantitative susceptibility mapping MR imaging can bring the benefits of quantitative susceptibility mapping imaging to clinical practice without the limitations of mask-based quantitative susceptibility mapping, especially for evaluating cerebral microhemorrhage-associated pathologies, such as traumatic brain injury. © 2018 by American Journal of Neuroradiology.

  3. Quantitative evaluation methods of skin condition based on texture feature parameters.

    PubMed

    Pang, Hui; Chen, Tianhua; Wang, Xiaoyi; Chang, Zhineng; Shao, Siqi; Zhao, Jing

    2017-03-01

    In order to quantitatively evaluate the improvement of skin condition after the use of skin care products and beauty treatments, a quantitative evaluation method for skin surface state and texture is presented, which is convenient, fast and non-destructive. Human skin images were collected by image sensors. Firstly, a median filter with a 3 × 3 window is applied, and then the locations of hairy pixels on the skin are accurately detected according to the gray mean value and color information. Bilinear interpolation is used to modify the gray value of the hairy pixels in order to eliminate the negative effect of noise and tiny hairs on the texture. After this pretreatment, the gray level co-occurrence matrix (GLCM) is calculated. On this basis, four characteristic parameters, including the second moment, contrast, entropy and correlation, and their mean values are calculated at 45° intervals. A quantitative evaluation model of skin texture based on the GLCM is established, which can calculate comprehensive parameters of skin condition. Experiments show that the evaluations produced by this method agree with those based on biochemical indicators and are also fully consistent with human visual assessment. The method overcomes the shortcomings of biochemical evaluation (skin damage and long waiting times) as well as the subjectivity and fuzziness of visual evaluation, achieving non-destructive, rapid and quantitative evaluation of skin condition. It can be used for health assessment or classification of skin condition, and can also quantitatively evaluate subtle improvements after the use of skin care products or beauty treatments.
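
    As an illustration of the texture pipeline this abstract describes, the sketch below computes the four GLCM parameters averaged over the four 45° directions. It assumes an 8-bit grayscale image with hairs already inpainted and uses scipy/scikit-image; a hedged reconstruction, not the authors' code.

    ```python
    import numpy as np
    from scipy.ndimage import median_filter
    from skimage.feature import graycomatrix, graycoprops

    def skin_texture_features(gray_u8):
        """gray_u8: 2-D uint8 skin image (hairy pixels already corrected)."""
        smoothed = median_filter(gray_u8, size=3)          # 3x3 median pre-filter
        # GLCM at 45-degree intervals, distance of 1 pixel
        angles = [0, np.pi / 4, np.pi / 2, 3 * np.pi / 4]
        glcm = graycomatrix(smoothed, distances=[1], angles=angles,
                            levels=256, symmetric=True, normed=True)
        asm = graycoprops(glcm, "ASM").mean()              # (second) angular moment
        contrast = graycoprops(glcm, "contrast").mean()
        correlation = graycoprops(glcm, "correlation").mean()
        p = glcm.astype(float)
        # Entropy is not provided by graycoprops, so compute it directly
        entropy = (-p * np.log2(p, where=p > 0,
                                out=np.zeros_like(p))).sum(axis=(0, 1)).mean()
        return {"second_moment": asm, "contrast": contrast,
                "entropy": entropy, "correlation": correlation}
    ```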

  4. Evaluation of a web based informatics system with data mining tools for predicting outcomes with quantitative imaging features in stroke rehabilitation clinical trials

    NASA Astrophysics Data System (ADS)

    Wang, Ximing; Kim, Bokkyu; Park, Ji Hoon; Wang, Erik; Forsyth, Sydney; Lim, Cody; Ravi, Ragini; Karibyan, Sarkis; Sanchez, Alexander; Liu, Brent

    2017-03-01

    Quantitative imaging biomarkers are used widely in clinical trials for tracking and evaluation of medical interventions. Previously, we have presented a web-based informatics system utilizing quantitative imaging features for predicting outcomes in stroke rehabilitation clinical trials. The system integrates imaging features extraction tools and a web-based statistical analysis tool. The tools include a generalized linear mixed model (GLMM) that can investigate potential significance and correlation based on features extracted from clinical data and quantitative biomarkers. The imaging features extraction tools allow the user to collect imaging features and the GLMM module allows the user to select clinical data and imaging features such as stroke lesion characteristics from the database as regressors and regressands. This paper discusses the application scenario and evaluation results of the system in a stroke rehabilitation clinical trial. The system was utilized to manage clinical data and extract imaging biomarkers including stroke lesion volume, location and ventricle/brain ratio. The GLMM module was validated and the efficiency of data analysis was also evaluated.
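
    A minimal sketch of the kind of mixed-model analysis the GLMM module performs, using the linear mixed model from statsmodels as a stand-in; the column names (motor_score, lesion_volume, ventricle_brain_ratio, lesion_location, subject_id) and the CSV file are hypothetical.

    ```python
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical export of clinical data plus extracted imaging biomarkers
    df = pd.read_csv("trial_data.csv")

    # Outcome regressed on imaging features, with random intercepts per subject
    model = smf.mixedlm(
        "motor_score ~ lesion_volume + ventricle_brain_ratio + lesion_location",
        data=df,
        groups=df["subject_id"],
    )
    result = model.fit()
    print(result.summary())
    ```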

  5. Poem Generator: A Comparative Quantitative Evaluation of a Microworlds-Based Learning Approach for Teaching English

    ERIC Educational Resources Information Center

    Jenkins, Craig

    2015-01-01

    This paper is a comparative quantitative evaluation of an approach to teaching poetry in the subject domain of English that employs a "guided discovery" pedagogy using computer-based microworlds. It uses a quasi-experimental design in order to measure performance gains in computational thinking and poetic thinking following a…

  6. Evaluation of empirical rule of linearly correlated peptide selection (ERLPS) for proteotypic peptide-based quantitative proteomics.

    PubMed

    Liu, Kehui; Zhang, Jiyang; Fu, Bin; Xie, Hongwei; Wang, Yingchun; Qian, Xiaohong

    2014-07-01

    Precise protein quantification is essential in comparative proteomics. Currently, quantification bias is inevitable when using a proteotypic peptide-based quantitative proteomics strategy, owing to differences in peptide measurability. To improve quantification accuracy, we proposed an "empirical rule for linearly correlated peptide selection (ERLPS)" in quantitative proteomics in our previous work. However, a systematic evaluation of the general application of ERLPS in quantitative proteomics under diverse experimental conditions needed to be conducted. In this study, the practical workflow of ERLPS is explicitly illustrated; different experimental variables, such as different MS systems, sample complexities, sample preparations, elution gradients, matrix effects, loading amounts, and other factors were comprehensively investigated to evaluate the applicability, reproducibility, and transferability of ERLPS. The results demonstrated that ERLPS was highly reproducible and transferable within appropriate loading amounts and that linearly correlated response peptides should be selected for each specific experiment. ERLPS was applied to proteome samples from yeast, mouse, and human, and in quantitative methods from label-free to 18O/16O-labeled and SILAC analysis, and enabled accurate measurements for all proteotypic peptide-based quantitative proteomics over a large dynamic range. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
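
    The published ERLPS rule is not spelled out in this abstract, so the sketch below shows one plausible reading under stated assumptions: keep the peptides of a protein whose intensity profiles across runs correlate linearly (Pearson r) with the protein's median profile. The threshold and the median-reference choice are assumptions, not the published rule.

    ```python
    import numpy as np

    def select_linear_peptides(intensities, r_min=0.95):
        """intensities: (n_peptides, n_runs) array for one protein.
        Returns indices of peptides whose profiles track the median profile."""
        reference = np.median(intensities, axis=0)
        keep = []
        for i, profile in enumerate(intensities):
            r = np.corrcoef(profile, reference)[0, 1]
            if r >= r_min:
                keep.append(i)
        return keep
    ```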

  7. The Quantitative Evaluation of the Clinical and Translational Science Awards (CTSA) Program Based on Science Mapping and Scientometric Analysis

    PubMed Central

    Zhang, Yin; Wang, Lei

    2013-01-01

    The Clinical and Translational Science Awards (CTSA) program is one of the most important initiatives in translational medical funding. Quantitative evaluation of the efficiency and performance of the CTSA program has significant reference value for decision making in global translational medical funding. Using science mapping and scientometric analytic tools, this study quantitatively analyzed the scientific articles funded by the CTSA program. The results showed that the quantitative productivity of the CTSA program has increased steadily since 2008. In addition, the emerging trends of the research funded by the CTSA program covered both clinical and basic medical research fields. The academic benefits of the CTSA program included helping its members build a robust academic home for clinical and translational science and attract other financial support. This study provided a quantitative evaluation of the CTSA program based on science mapping and scientometric analysis. Further research is required to compare and optimize other quantitative methods and to integrate various research results. PMID:24330689

  8. The quantitative evaluation of the Clinical and Translational Science Awards (CTSA) program based on science mapping and scientometric analysis.

    PubMed

    Zhang, Yin; Wang, Lei; Diao, Tianxi

    2013-12-01

    The Clinical and Translational Science Awards (CTSA) program is one of the most important initiatives in translational medical funding. Quantitative evaluation of the efficiency and performance of the CTSA program has significant reference value for decision making in global translational medical funding. Using science mapping and scientometric analytic tools, this study quantitatively analyzed the scientific articles funded by the CTSA program. The results showed that the quantitative productivity of the CTSA program has increased steadily since 2008. In addition, the emerging trends of the research funded by the CTSA program covered both clinical and basic medical research fields. The academic benefits of the CTSA program included helping its members build a robust academic home for clinical and translational science and attract other financial support. This study provided a quantitative evaluation of the CTSA program based on science mapping and scientometric analysis. Further research is required to compare and optimize other quantitative methods and to integrate various research results. © 2013 Wiley Periodicals, Inc.

  9. Evaluating Inquiry-Based Learning as a Means to Advance Individual Student Achievement

    ERIC Educational Resources Information Center

    Ziemer, Cherilyn G.

    2013-01-01

    Although inquiry-based learning has been debated throughout the greater educational community and demonstrated with some effect in modern classrooms, little quantitative analysis has been performed to empirically validate sustained benefits. This quantitative study focused on whether inquiry-based pedagogy actually brought about sustained and…

  10. Pansharpening on the Narrow VNIR and SWIR Spectral Bands of Sentinel-2

    NASA Astrophysics Data System (ADS)

    Vaiopoulos, A. D.; Karantzalos, K.

    2016-06-01

    In this paper, results from the evaluation of several state-of-the-art pansharpening techniques are presented for the VNIR and SWIR bands of Sentinel-2. A pansharpening procedure is also proposed that aims to respect the closest spectral similarities between the higher and lower resolution bands. The evaluation included 21 different fusion algorithms and three evaluation frameworks based both on standard quantitative image similarity indexes and on qualitative evaluation by remote sensing experts. The overall analysis of the evaluation results indicated that the remote sensing experts disagreed with the outcomes and method ranking of the quantitative assessment. The image quality similarity indexes employed, and the quantitative evaluation frameworks from the literature based on both full- and reduced-resolution data, did not adequately capture the spatial information that was injected into the lower resolution images. Regarding the SWIR bands, none of the methods managed to deliver significantly better results than standard bicubic interpolation of the original low resolution bands.
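
    One standard similarity index of the kind used in such evaluations is the spectral angle mapper (SAM), sketched below for a fused image against a reference; whether SAM was among the 21 indexes evaluated here is an assumption.

    ```python
    import numpy as np

    def mean_sam_degrees(reference, fused, eps=1e-12):
        """reference, fused: (H, W, bands) arrays; returns the mean
        per-pixel spectral angle in degrees (0 = identical spectra)."""
        ref = reference.reshape(-1, reference.shape[-1]).astype(float)
        fus = fused.reshape(-1, fused.shape[-1]).astype(float)
        dot = (ref * fus).sum(axis=1)
        norms = np.linalg.norm(ref, axis=1) * np.linalg.norm(fus, axis=1) + eps
        angles = np.arccos(np.clip(dot / norms, -1.0, 1.0))
        return np.degrees(angles).mean()
    ```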

  11. Classification of cassava genotypes based on qualitative and quantitative data.

    PubMed

    Oliveira, E J; Oliveira Filho, O S; Santos, V S

    2015-02-02

    We evaluated the genetic variation of cassava accessions based on qualitative (binomial and multicategorical) and quantitative traits (continuous). We characterized 95 accessions obtained from the Cassava Germplasm Bank of Embrapa Mandioca e Fruticultura; we evaluated these accessions for 13 continuous, 10 binary, and 25 multicategorical traits. First, we analyzed the accessions based only on quantitative traits; next, we conducted joint analysis (qualitative and quantitative traits) based on the Ward-MLM method, which performs clustering in two stages. According to the pseudo-F, pseudo-t2, and maximum likelihood criteria, we identified five and four groups based on quantitative trait and joint analysis, respectively. The smaller number of groups identified based on joint analysis may be related to the nature of the data. On the other hand, quantitative data are more subject to environmental effects in the phenotype expression; this results in the absence of genetic differences, thereby contributing to greater differentiation among accessions. For most of the accessions, the maximum probability of classification was >0.90, independent of the trait analyzed, indicating a good fit of the clustering method. Differences in clustering according to the type of data implied that analysis of quantitative and qualitative traits in cassava germplasm might explore different genomic regions. On the other hand, when joint analysis was used, the means and ranges of genetic distances were high, indicating that the Ward-MLM method is very useful for clustering genotypes when there are several phenotypic traits, such as in the case of genetic resources and breeding programs.

  12. Longitudinal studies of the 18F-FDG kinetics after ipilimumab treatment in metastatic melanoma patients based on dynamic FDG PET/CT.

    PubMed

    Sachpekidis, Christos; Anwar, Hoda; Winkler, Julia K; Kopp-Schneider, Annette; Larribere, Lionel; Haberkorn, Uwe; Hassel, Jessica C; Dimitrakopoulou-Strauss, Antonia

    2018-06-05

    Immunotherapy has raised the issue of appropriate treatment response evaluation, due to the unique mechanism of action of immunotherapeutic agents. The aim of this analysis is to evaluate the potential role of quantitative analysis of 2-deoxy-2-[18F]fluoro-D-glucose (18F-FDG) positron emission tomography/computed tomography (PET/CT) data in the monitoring of patients with metastatic melanoma undergoing ipilimumab therapy. Twenty-five patients with unresectable metastatic melanoma underwent dynamic PET/CT (dPET/CT) of the thorax and upper abdomen as well as static, whole-body PET/CT with 18F-FDG before the start of ipilimumab treatment (baseline PET/CT), after two cycles of treatment (interim PET/CT) and at the end of treatment after four cycles (late PET/CT). The evaluation of dPET/CT studies was based on semi-quantitative (standardized uptake value, SUV) calculation as well as quantitative analysis based on two-tissue compartment modeling and a fractal approach. Patients' best clinical response, assessed at a mean of 59 weeks, was used as reference. According to their best clinical response, patients were dichotomized into those demonstrating clinical benefit (CB, n = 16 patients) and those demonstrating no clinical benefit (no-CB, n = 9 patients). No statistically significant differences were observed between CB and no-CB regarding either semi-quantitative or quantitative parameters in any scan. On the contrary, the application of the recently introduced PET response evaluation criteria for immunotherapy (PERCIMT) led to a correct classification rate of 84% (21/25 patients). Quantitative analysis of 18F-FDG PET data does not provide additional information in treatment response evaluation of metastatic melanoma patients receiving ipilimumab. PERCIMT criteria correlated better with clinical response.
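
    The semi-quantitative SUV mentioned above follows a standard definition; a minimal body-weight-normalized computation (decay correction omitted for brevity):

    ```python
    def suv_body_weight(tissue_kbq_per_ml, injected_dose_mbq, body_weight_kg):
        """SUV = tissue activity concentration / (injected dose / body weight)."""
        dose_kbq = injected_dose_mbq * 1000.0   # MBq -> kBq
        weight_g = body_weight_kg * 1000.0      # kg -> g (1 g of tissue ~ 1 ml)
        return tissue_kbq_per_ml / (dose_kbq / weight_g)

    print(suv_body_weight(5.0, 350.0, 75.0))    # ~1.07 for these example numbers
    ```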

  13. Towards quantitative condition assessment of biodiversity outcomes: Insights from Australian marine protected areas.

    PubMed

    Addison, Prue F E; Flander, Louisa B; Cook, Carly N

    2017-08-01

    Protected area management effectiveness (PAME) evaluation is increasingly undertaken to evaluate governance, assess conservation outcomes and inform evidence-based management of protected areas (PAs). Within PAME, quantitative approaches to assess biodiversity outcomes are now emerging, where biological monitoring data are directly assessed against quantitative (numerically defined) condition categories (termed quantitative condition assessments). However, more commonly qualitative condition assessments are employed in PAME, which use descriptive condition categories and are evaluated largely with expert judgement that can be subject to a range of biases, such as linguistic uncertainty and overconfidence. Despite the benefits of increased transparency and repeatability of evaluations, quantitative condition assessments are rarely used in PAME. To understand why, we interviewed practitioners from all Australian marine protected area (MPA) networks, which have access to long-term biological monitoring data and are developing or conducting PAME evaluations. Our research revealed that there is a desire within management agencies to implement quantitative condition assessment of biodiversity outcomes in Australian MPAs. However, practitioners report many challenges in transitioning from undertaking qualitative to quantitative condition assessments of biodiversity outcomes, which are hampering progress. Challenges include a lack of agency capacity (staff numbers and money), knowledge gaps, and diminishing public and political support for PAs. We point to opportunities to target strategies that will assist agencies overcome these challenges, including new decision support tools, approaches to better finance conservation efforts, and to promote more management relevant science. While a single solution is unlikely to achieve full evidence-based conservation, we suggest ways for agencies to target strategies and advance PAME evaluations toward best practice. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. Quantitative Evaluation of Heavy Duty Machine Tools Remanufacturing Based on Modified Catastrophe Progression Method

    NASA Astrophysics Data System (ADS)

    Shunhe, Li; Jianhua, Rao; Lin, Gui; Weimin, Zhang; Degang, Liu

    2017-11-01

    The result of a remanufacturing evaluation is the basis for judging whether a heavy duty machine tool can be remanufactured in the end-of-life (EOL) stage of the machine tool lifecycle. The objectivity and accuracy of the evaluation are the key to the evaluation method. In this paper, the catastrophe progression method is introduced into the quantitative evaluation of heavy duty machine tool remanufacturing, and the results are modified by the comprehensive adjustment method, which brings the evaluation results into line with conventional human judgement. The catastrophe progression method is used to establish a quantitative evaluation model for heavy duty machine tools and to evaluate the remanufacturing of a retired TK6916 CNC floor milling-boring machine. The evaluation process is simple and highly quantitative, and the results are objective.

  15. Agent-based modeling as a tool for program design and evaluation.

    PubMed

    Lawlor, Jennifer A; McGirr, Sara

    2017-12-01

    Recently, systems thinking and systems science approaches have gained popularity in the field of evaluation; however, there has been relatively little exploration of how evaluators could use quantitative tools to assist in the implementation of systems approaches therein. The purpose of this paper is to explore potential uses of one such quantitative tool, agent-based modeling, in evaluation practice. To this end, we define agent-based modeling and offer potential uses for it in typical evaluation activities, including: engaging stakeholders, selecting an intervention, modeling program theory, setting performance targets, and interpreting evaluation results. We provide demonstrative examples from published agent-based modeling efforts both inside and outside the field of evaluation for each of the evaluative activities discussed. We further describe potential pitfalls of this tool and offer cautions for evaluators who may choose to implement it in their practice. Finally, the article concludes with a discussion of the future of agent-based modeling in evaluation practice and a call for more formal exploration of this tool as well as other approaches to simulation modeling in the field. Copyright © 2017 Elsevier Ltd. All rights reserved.
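
    For readers unfamiliar with the tool, a toy agent-based model of program adoption is sketched below; the ring network, threshold, and seeding are arbitrary assumptions chosen only to illustrate the mechanics, not a model from the paper.

    ```python
    import random

    def run_adoption_model(n_agents=100, threshold=0.3, seed_fraction=0.1,
                           steps=20, seed=42):
        rng = random.Random(seed)
        # Seed a fraction of agents as initial adopters
        adopted = [rng.random() < seed_fraction for _ in range(n_agents)]
        for _ in range(steps):
            nxt = list(adopted)
            for i in range(n_agents):
                neighbors = [(i - 1) % n_agents, (i + 1) % n_agents]  # ring network
                share = sum(adopted[j] for j in neighbors) / len(neighbors)
                if share >= threshold:       # adopt once enough neighbors have
                    nxt[i] = True
            adopted = nxt
        return sum(adopted) / n_agents

    print(f"final adoption rate: {run_adoption_model():.2f}")
    ```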

  16. Evaluating the Effectiveness of Remedial Reading Courses at Community Colleges: A Quantitative Study

    ERIC Educational Resources Information Center

    Lavonier, Nicole

    2014-01-01

    The present study evaluated the effectiveness of two instructional approaches for remedial reading courses at a community college. The instructional approaches were strategic reading and traditional, textbook-based instruction. The two research questions that guided the quantitative, quasi-experimental study were: (a) what is the effect of…

  17. Real-time label-free quantitative fluorescence microscopy-based detection of ATP using a tunable fluorescent nano-aptasensor platform

    NASA Astrophysics Data System (ADS)

    Shrivastava, Sajal; Sohn, Il-Yung; Son, Young-Min; Lee, Won-Il; Lee, Nae-Eung

    2015-11-01

    Although real-time label-free fluorescent aptasensors based on nanomaterials are increasingly recognized as a useful strategy for the detection of target biomolecules with high fidelity, the lack of an imaging-based quantitative measurement platform limits their implementation with biological samples. Here we introduce an ensemble strategy for a real-time label-free fluorescent graphene (Gr) aptasensor platform. This platform employs aptamer length-dependent tunability, thus enabling the reagentless quantitative detection of biomolecules through computational processing coupled with real-time fluorescence imaging data. We demonstrate that this strategy effectively delivers dose-dependent quantitative readouts of adenosine triphosphate (ATP) concentration on chemical vapor deposited (CVD) Gr and reduced graphene oxide (rGO) surfaces, thereby providing cytotoxicity assessment. Compared with conventional fluorescence spectrometry methods, our highly efficient, universally applicable, and rational approach will facilitate broader implementation of imaging-based biosensing platforms for the quantitative evaluation of a range of target molecules. Electronic supplementary information (ESI) available. See DOI: 10.1039/c5nr05839b

  18. A Quantitative Corpus-Based Approach to English Spatial Particles: Conceptual Symmetry and Its Pedagogical Implications

    ERIC Educational Resources Information Center

    Chen, Alvin Cheng-Hsien

    2014-01-01

    The present study aims to investigate how conceptual symmetry plays a role in the use of spatial particles in English and to further examine its pedagogical implications via a corpus-based evaluation of the course books in senior high schools in Taiwan. More specifically, we adopt a quantitative corpus-based approach to investigate whether bipolar…

  19. Evaluation of Quantitative Performance of Sequential Immobilized Metal Affinity Chromatographic Enrichment for Phosphopeptides

    PubMed Central

    Sun, Zeyu; Hamilton, Karyn L.; Reardon, Kenneth F.

    2014-01-01

    We evaluated a sequential elution protocol from immobilized metal affinity chromatography (SIMAC) employing gallium-based immobilized metal affinity chromatography (IMAC) in conjunction with titanium-dioxide-based metal oxide affinity chromatography (MOAC). The quantitative performance of this SIMAC enrichment approach, assessed in terms of repeatability, dynamic range, and linearity, was evaluated using a mixture composed of tryptic peptides from caseins, bovine serum albumin, and phosphopeptide standards. While our data demonstrate the overall consistent performance of the SIMAC approach under various loading conditions, the results also revealed that the method had limited repeatability and linearity for most phosphopeptides tested, and different phosphopeptides were found to have different linear ranges. These data suggest that, unless additional strategies are used, SIMAC should be regarded as a semi-quantitative method when used in large-scale phosphoproteomics studies in complex backgrounds. PMID:24096195

  20. Metrics for Performance Evaluation of Patient Exercises during Physical Therapy.

    PubMed

    Vakanski, Aleksandar; Ferguson, Jake M; Lee, Stephen

    2017-06-01

    The article proposes a set of metrics for evaluating patient performance in physical therapy exercises. A taxonomy is employed that classifies the metrics into quantitative and qualitative categories, based on the level of abstraction of the captured motion sequences. Further, the quantitative metrics are classified into model-less and model-based metrics, according to whether the evaluation employs the raw measurements of patient-performed motions or is based on a mathematical model of the motions. The reviewed metrics include root-mean-square distance, Kullback-Leibler divergence, log-likelihood, heuristic consistency, Fugl-Meyer Assessment, and similar. The metrics are evaluated for a set of five human motions captured with a Kinect sensor. The metrics can potentially be integrated into a system that employs machine learning for modelling and assessing the consistency of patient performance in a home-based therapy setting. Automated performance evaluation can overcome the inherent subjectivity of human-performed therapy assessment, increase adherence to prescribed therapy plans, and reduce healthcare costs.
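
    Two of the model-less metrics named above admit compact definitions; the sketch below computes the RMS distance between a patient sequence and a reference, and the mean Gaussian log-likelihood of patient frames under a model fit to the reference. It assumes (frames, joints) NumPy arrays and is an illustration, not the authors' implementation.

    ```python
    import numpy as np
    from scipy.stats import multivariate_normal

    def rms_distance(patient, reference):
        """patient, reference: (frames, joints) arrays of equal shape."""
        return np.sqrt(np.mean((patient - reference) ** 2))

    def gaussian_loglik(patient, reference):
        """Mean log-likelihood of patient frames under a Gaussian fit
        to the reference motion (ridge added for numerical stability)."""
        mu = reference.mean(axis=0)
        cov = np.cov(reference, rowvar=False) + 1e-6 * np.eye(reference.shape[1])
        return multivariate_normal(mu, cov).logpdf(patient).mean()
    ```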

  1. Quantitative Residual Strain Analyses on Strain Hardened Nickel Based Alloy

    NASA Astrophysics Data System (ADS)

    Yonezawa, Toshio; Maeguchi, Takaharu; Goto, Toru; Juan, Hou

    Many papers have reported the effects of strain hardening by cold rolling, grinding, welding, etc. on the stress corrosion cracking susceptibility of nickel based alloys and austenitic stainless steels for LWR pipings and components. However, the residual strain introduced by cold rolling, grinding, welding, etc. has not been evaluated quantitatively to the same extent.

  2. Nondestructive evaluation of degradation in papaya fruit using intensity based algorithms

    NASA Astrophysics Data System (ADS)

    Kumari, Shubhashri; Nirala, Anil Kumar

    2018-05-01

    In the proposed work, degradation in papaya fruit has been evaluated nondestructively using the laser biospeckle technique. The biospeckle activity inside the fruit has been evaluated qualitatively and quantitatively, from maturity through the degradation stage, using intensity based algorithms. The co-occurrence matrix (COM) has been used for qualitative analysis, whereas Inertia Moment (IM), Absolute Value Difference (AVD) and Autocovariance methods have been used for quantitative analysis. The biospeckle activity was found to first increase and then decrease during the five-day study period. In addition, granulometric size distribution (GSD) has been used for the first time to evaluate degradation of the papaya. It is concluded that the degradation process of papaya fruit can be evaluated nondestructively using all the mentioned algorithms.
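
    A hedged sketch of two of the intensity-based quantifiers named above, computed from a time history speckle pattern (THSP): a co-occurrence matrix of successive intensities, then the Inertia Moment and the Absolute Value Difference following their usual definitions; details of the authors' pipeline (THSP construction, normalization) are assumptions.

    ```python
    import numpy as np

    def biospeckle_activity(thsp):
        """thsp: (pixels, time) uint8 array of intensities along time."""
        com = np.zeros((256, 256), dtype=float)
        for row in thsp:
            for t in range(len(row) - 1):
                com[row[t], row[t + 1]] += 1     # co-occurrence of successive frames
        # Normalize each row so entries behave like transition probabilities
        com /= com.sum(axis=1, keepdims=True).clip(min=1)
        i, j = np.meshgrid(np.arange(256), np.arange(256), indexing="ij")
        im = (com * (i - j) ** 2).sum()          # Inertia Moment
        avd = (com * np.abs(i - j)).sum()        # Absolute Value Difference
        return im, avd
    ```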

  3. Comprehensive evaluation of direct injection mass spectrometry for the quantitative profiling of volatiles in food samples

    PubMed Central

    2016-01-01

    Although qualitative strategies based on direct injection mass spectrometry (DIMS) have recently emerged as an alternative for the rapid classification of food samples, the potential of these approaches in quantitative tasks has scarcely been addressed to date. In this paper, the applicability of different multivariate regression procedures to data collected by DIMS from simulated mixtures has been evaluated. The most relevant factors affecting quantitation, such as random noise, the number of calibration samples, type of validation, mixture complexity and similarity of mass spectra, were also considered and comprehensively discussed. Based on the conclusions drawn from simulated data, and as an example of application, experimental mass spectral fingerprints collected by direct thermal desorption coupled to mass spectrometry were used for the quantitation of major volatiles in Thymus zygis subsp. zygis chemotypes. The results obtained, validated with the direct thermal desorption coupled to gas chromatography–mass spectrometry method here used as a reference, show the potential of DIMS approaches for the fast and precise quantitative profiling of volatiles in foods. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644978

  4. Evaluation of a real-time quantitative PCR method with propidium monoazide treatment for analyses of viable fecal indicator bacteria in wastewater samples

    EPA Science Inventory

    The U.S. EPA is currently evaluating rapid, real-time quantitative PCR (qPCR) methods for determining recreational water quality based on measurements of fecal indicator bacteria DNA sequences. In order to potentially use qPCR for other Clean Water Act needs, such as updating cri...

  5. QUANTITATIVE PROCEDURES FOR NEUROTOXICOLOGY RISK ASSESSMENT

    EPA Science Inventory

    In this project, previously published information on biologically based dose-response model for brain development was used to quantitatively evaluate critical neurodevelopmental processes, and to assess potential chemical impacts on early brain development. This model has been ex...

  6. Acoustics based assessment of respiratory diseases using GMM classification.

    PubMed

    Mayorga, P; Druzgalski, C; Morelos, R L; Gonzalez, O H; Vidales, J

    2010-01-01

    The focus of this paper is to present a method utilizing lung sounds for the quantitative assessment of patient health as it relates to respiratory disorders. To accomplish this, applicable traditional techniques from the speech processing domain were used to evaluate lung sounds obtained with a digital stethoscope. Traditional methods for the evaluation of asthma involve auscultation and spirometry, but the more sensitive electronic stethoscopes now available, combined with quantitative signal analysis methods, offer opportunities for improved diagnosis. In particular, we propose an acoustic evaluation methodology based on Gaussian Mixture Models (GMM), which should assist in broader analysis, identification, and diagnosis of asthma based on frequency-domain analysis of wheezing and crackles.
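
    A minimal sketch of the GMM classification scheme described above: fit one Gaussian mixture per class on acoustic feature vectors (e.g., MFCCs) and classify a recording by the higher average log-likelihood. Feature extraction is left abstract; the shapes and component count are assumptions.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    def train_gmms(features_healthy, features_asthma, n_components=8):
        """Each argument: (n_frames, n_features) array pooled over recordings."""
        gmm_h = GaussianMixture(n_components, covariance_type="diag").fit(features_healthy)
        gmm_a = GaussianMixture(n_components, covariance_type="diag").fit(features_asthma)
        return gmm_h, gmm_a

    def classify(recording_features, gmm_h, gmm_a):
        ll_h = gmm_h.score(recording_features)   # mean log-likelihood per frame
        ll_a = gmm_a.score(recording_features)
        return "asthma" if ll_a > ll_h else "healthy"
    ```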

  7. Real-time label-free quantitative fluorescence microscopy-based detection of ATP using a tunable fluorescent nano-aptasensor platform.

    PubMed

    Shrivastava, Sajal; Sohn, Il-Yung; Son, Young-Min; Lee, Won-Il; Lee, Nae-Eung

    2015-12-14

    Although real-time label-free fluorescent aptasensors based on nanomaterials are increasingly recognized as a useful strategy for the detection of target biomolecules with high fidelity, the lack of an imaging-based quantitative measurement platform limits their implementation with biological samples. Here we introduce an ensemble strategy for a real-time label-free fluorescent graphene (Gr) aptasensor platform. This platform employs aptamer length-dependent tunability, thus enabling the reagentless quantitative detection of biomolecules through computational processing coupled with real-time fluorescence imaging data. We demonstrate that this strategy effectively delivers dose-dependent quantitative readouts of adenosine triphosphate (ATP) concentration on chemical vapor deposited (CVD) Gr and reduced graphene oxide (rGO) surfaces, thereby providing cytotoxicity assessment. Compared with conventional fluorescence spectrometry methods, our highly efficient, universally applicable, and rational approach will facilitate broader implementation of imaging-based biosensing platforms for the quantitative evaluation of a range of target molecules.

  8. Evaluating a Dutch cardiology primary care plus intervention on the Triple Aim outcomes: study design of a practice-based quantitative and qualitative research.

    PubMed

    Quanjel, Tessa C C; Spreeuwenberg, Marieke D; Struijs, Jeroen N; Baan, Caroline A; Ruwaard, Dirk

    2017-09-06

    In an attempt to deal with the pressures on the health-care system and to guarantee sustainability, changes are needed. This study focuses on a cardiology primary care plus intervention. Primary care plus (PC+) is a new health-care delivery model focused on substitution of specialist care in the hospital setting with specialist care in the primary care setting. The intervention consists of a cardiology PC+ centre in which cardiologists, supported by other health-care professionals, provide consultations in a primary care setting. The PC+ centre aims to improve the health of the population and quality of care as experienced by patients, and reduce the number of referrals to hospital-based outpatient specialist care in order to reduce health-care costs. These aims reflect the Triple Aim principle. Hence, the objectives of the study are to evaluate the cardiology PC+ centre in terms of the Triple Aim outcomes and to evaluate the process of the introduction of PC+. The study is a practice-based, quantitative study with a longitudinal observational design, and an additional qualitative study to supplement, interpret and improve the quantitative study. The study population of the quantitative part will consist of adult patients (≥18 years) with non-acute and low-complexity cardiology-related health complaints, who will be referred to the cardiology PC+ centre (intervention group) or hospital-based outpatient cardiology care (control group). All eligible patients will be asked to complete questionnaires at three different time points consisting of questions about their demographics, health status and experience of care. Additionally, quantitative data will be collected about health-care utilization and related health-care costs at the PC+ centre and the hospital. The qualitative part, consisting of semi-structured interviews, focus groups, and observations, is designed to evaluate the process as well as to amplify, clarify and explain quantitative results. This study will evaluate a cardiology PC+ centre using quantitative and supplementary qualitative methods. The findings of both sub-studies will fill a gap in knowledge about the effects of PC+ and in particular whether PC+ is able to pursue the Triple Aim outcomes. NTR6629 (Data registered: 25-08-2017) (registered retrospectively).

  9. A new quantitative model of ecological compensation based on ecosystem capital in Zhejiang Province, China

    PubMed Central

    Jin, Yan; Huang, Jing-feng; Peng, Dai-liang

    2009-01-01

    Ecological compensation is becoming one of the key, multidisciplinary issues in the field of resources and environmental management. Considering the changing relationship between gross domestic product (GDP) and ecological capital (EC) based on remote sensing estimation, we construct a new quantitative estimation model for ecological compensation, using the county as the study unit, and determine a standard value in order to evaluate ecological compensation from 2001 to 2004 in Zhejiang Province, China. Spatial differences in ecological compensation were significant among the counties and districts. This model fills a gap in the field of quantitative evaluation of regional ecological compensation and provides a feasible way to reconcile conflicts among benefits in the economic, social, and ecological sectors. PMID:19353749

  10. A new quantitative model of ecological compensation based on ecosystem capital in Zhejiang Province, China.

    PubMed

    Jin, Yan; Huang, Jing-feng; Peng, Dai-liang

    2009-04-01

    Ecological compensation is becoming one of the key, multidisciplinary issues in the field of resources and environmental management. Considering the changing relationship between gross domestic product (GDP) and ecological capital (EC) based on remote sensing estimation, we construct a new quantitative estimation model for ecological compensation, using the county as the study unit, and determine a standard value in order to evaluate ecological compensation from 2001 to 2004 in Zhejiang Province, China. Spatial differences in ecological compensation were significant among the counties and districts. This model fills a gap in the field of quantitative evaluation of regional ecological compensation and provides a feasible way to reconcile conflicts among benefits in the economic, social, and ecological sectors.

  11. Evaluation of Virtual Laboratory Package on Nigerian Secondary School Physics Concepts

    ERIC Educational Resources Information Center

    Falode, Oluwole Caleb; Gambari, Amosa Isiaka

    2017-01-01

    The study evaluated the accessibility, flexibility, cost and learning effectiveness of a researcher-developed virtual laboratory package for Nigerian secondary school physics. Based on these issues, four research questions were raised and answered. The study was a quantitative evaluation study. The sample for the study included 24 physics…

  12. Quantitative non-destructive evaluation of composite materials based on ultrasonic parameters

    NASA Technical Reports Server (NTRS)

    Miller, James G.

    1987-01-01

    Research into the nondestructive evaluation of advanced reinforced composite laminates is summarized. The applicability of the Kramers-Kronig relations to the nondestructive evaluation of composite materials is described.

  13. Quantitative Evaluation of a Planetary Renderer for Terrain Relative Navigation

    NASA Astrophysics Data System (ADS)

    Amoroso, E.; Jones, H.; Otten, N.; Wettergreen, D.; Whittaker, W.

    2016-11-01

    A ray-tracing renderer based on LOLA and LROC elevation models is presented and quantitatively compared to LRO WAC and NAC images for photometric accuracy. We investigated using rendered images for terrain relative navigation.

  14. Resident dashboards: helping your clinical competency committee visualize trainees' key performance indicators.

    PubMed

    Friedman, Karen A; Raimo, John; Spielmann, Kelly; Chaudhry, Saima

    2016-01-01

    Introduction: Under the Next Accreditation System, programs need to find ways to collect and assess meaningful reportable information on their residents to assist the program director regarding resident milestone progression. This paper discusses the process that one large Internal Medicine Residency Program used to provide both quantitative and qualitative data to its clinical competency committee (CCC) through the creation of a resident dashboard. Methods: Program leadership at a large university-based program developed four new end-of-rotation evaluations based on the American Board of Internal Medicine (ABIM) and Accreditation Council for Graduate Medical Education's (ACGME) 22 reportable milestones. A resident dashboard was then created to pull together both milestone- and non-milestone-based quantitative data and qualitative data compiled from faculty, nurses, peers, staff, and patients. Results: Dashboards were distributed to the members of the CCC in preparation for the semiannual CCC meeting. CCC members adjudicated quantitative and qualitative data to present their cohort of residents at the CCC meeting. Based on the committee's response, evaluation scores remained the same or were adjusted. Final milestone scores were then entered into the accreditation data system (ADS) on the ACGME website. Conclusions: The process of resident assessment is complex and should comprise both quantitative and qualitative data. The dashboard is a valuable tool for program leadership to use both when evaluating house staff on a semiannual basis at the CCC and to the resident in person.

  15. Resident dashboards: helping your clinical competency committee visualize trainees' key performance indicators.

    PubMed

    Friedman, Karen A; Raimo, John; Spielmann, Kelly; Chaudhry, Saima

    2016-01-01

    Under the Next Accreditation System, programs need to find ways to collect and assess meaningful reportable information on their residents to assist the program director regarding resident milestone progression. This paper discusses the process that one large Internal Medicine Residency Program used to provide both quantitative and qualitative data to its clinical competency committee (CCC) through the creation of a resident dashboard. Program leadership at a large university-based program developed four new end-of-rotation evaluations based on the American Board of Internal Medicine (ABIM) and Accreditation Council for Graduate Medical Education's (ACGME) 22 reportable milestones. A resident dashboard was then created to pull together both milestone- and non-milestone-based quantitative data and qualitative data compiled from faculty, nurses, peers, staff, and patients. Dashboards were distributed to the members of the CCC in preparation for the semiannual CCC meeting. CCC members adjudicated quantitative and qualitative data to present their cohort of residents at the CCC meeting. Based on the committee's response, evaluation scores remained the same or were adjusted. Final milestone scores were then entered into the accreditation data system (ADS) on the ACGME website. The process of resident assessment is complex and should comprise both quantitative and qualitative data. The dashboard is a valuable tool for program leadership to use both when evaluating house staff on a semiannual basis at the CCC and to the resident in person.

  16. A computer system to be used with laser-based endoscopy for quantitative diagnosis of early gastric cancer.

    PubMed

    Miyaki, Rie; Yoshida, Shigeto; Tanaka, Shinji; Kominami, Yoko; Sanomura, Yoji; Matsuo, Taiji; Oka, Shiro; Raytchev, Bisser; Tamaki, Toru; Koide, Tetsushi; Kaneda, Kazufumi; Yoshihara, Masaharu; Chayama, Kazuaki

    2015-02-01

    The aim of this study was to evaluate the usefulness of a newly devised computer system for use with laser-based endoscopy in differentiating between early gastric cancer, reddened lesions, and surrounding tissue. Narrow-band imaging based on laser light illumination has recently come into use. We devised a support vector machine (SVM)-based analysis system to be used with the newly devised endoscopy system to quantitatively identify gastric cancer on images obtained by magnifying endoscopy with blue-laser imaging (BLI). We evaluated the usefulness of the computer system in combination with the new endoscopy system. We evaluated the system as applied to 100 consecutive early gastric cancers in 95 patients examined by BLI magnification at Hiroshima University Hospital. We produced a set of images from the 100 early gastric cancers; 40 flat or slightly depressed, small, reddened lesions; and surrounding tissues, and we attempted to identify gastric cancer, reddened lesions, and surrounding tissue quantitatively. The average SVM output value was 0.846 ± 0.220 for cancerous lesions, 0.381 ± 0.349 for reddened lesions, and 0.219 ± 0.277 for surrounding tissue, with the SVM output value for cancerous lesions being significantly greater than that for reddened lesions or surrounding tissue. The average SVM output value was 0.840 ± 0.207 for differentiated-type cancer and 0.865 ± 0.259 for undifferentiated-type cancer. Although further development is needed, we conclude that our computer-based analysis system used with BLI will identify gastric cancers quantitatively.
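
    A hedged sketch of producing an SVM output value in [0, 1] for lesion feature vectors, as described above, using scikit-learn; the feature extraction from BLI magnification images is omitted, and the file names and labels are hypothetical.

    ```python
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline

    X_train = np.load("bli_features.npy")   # hypothetical (n_samples, n_features)
    y_train = np.load("labels.npy")         # 1 = cancerous lesion, 0 = other

    # RBF-kernel SVM with probability calibration, so outputs fall in [0, 1]
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
    clf.fit(X_train, y_train)

    X_new = np.load("new_lesion_features.npy")
    output_values = clf.predict_proba(X_new)[:, 1]  # probability-like score per lesion
    ```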

  17. Distributed Simulation as a modelling tool for the development of a simulation-based training programme for cardiovascular specialties.

    PubMed

    Kelay, Tanika; Chan, Kah Leong; Ako, Emmanuel; Yasin, Mohammad; Costopoulos, Charis; Gold, Matthew; Kneebone, Roger K; Malik, Iqbal S; Bello, Fernando

    2017-01-01

    Distributed Simulation is the concept of portable, high-fidelity immersive simulation. Here, it is used for the development of a simulation-based training programme for cardiovascular specialities. We present an evidence base for how accessible, portable and self-contained simulated environments can be effectively utilised for the modelling, development and testing of a complex training framework and assessment methodology. Iterative user feedback through mixed-methods evaluation techniques resulted in the implementation of the training programme. Four phases were involved in the development of our immersive simulation-based training programme: (1) an initial conceptual stage for mapping structural criteria and parameters of the simulation training framework and scenario development (n = 16), (2) training facility design using Distributed Simulation, (3) test cases with clinicians (n = 8) and collaborative design, where evaluation and user feedback involved a mixed-methods approach featuring (a) quantitative surveys to evaluate the realism and perceived educational relevance of the simulation format and framework for training and (b) qualitative semi-structured interviews to capture detailed feedback including changes and scope for development. Refinements were made iteratively to the simulation framework based on user feedback, resulting in (4) the transition towards implementation of the simulation training framework, involving consistent quantitative evaluation techniques for clinicians (n = 62). For comparative purposes, clinicians' initial quantitative mean evaluation scores for realism of the simulation training framework, realism of the training facility and relevance for training (n = 8) are presented longitudinally, alongside feedback throughout the development stages from concept to delivery, including the implementation stage (n = 62). Initially, mean evaluation scores fluctuated from low to average, rising incrementally. This corresponded with the qualitative component, which augmented the quantitative findings; trainees' user feedback was used to perform iterative refinements to the simulation design and components (collaborative design), resulting in higher mean evaluation scores leading up to the implementation phase. Through the application of innovative Distributed Simulation techniques, collaborative design, and consistent evaluation techniques from the conceptual, development, and implementation stages, fully immersive simulation techniques for cardiovascular specialities are achievable and have the potential to be implemented more broadly.

  18. Taking stock of four decades of quantitative research on stakeholder participation and evaluation use: a systematic map.

    PubMed

    Daigneault, Pierre-Marc

    2014-08-01

    Stakeholder participation and evaluation use have attracted a lot of attention from practitioners, theorists and researchers. A common hypothesis is that participation is positively associated with evaluation use. Whereas the number of empirical studies conducted on this topic is impressive, quantitative research has held a minority position within this scientific production. This study mobilizes systematic review methods to 'map' the empirical literature that has quantitatively studied participation and use. The goal is to take stock and assess the strength of evidence of this literature (but not to synthesize the findings) and, based on this assessment, to provide directions for future research. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. [Development and validation of event-specific quantitative PCR method for genetically modified maize LY038].

    PubMed

    Mano, Junichi; Masubuchi, Tomoko; Hatano, Shuko; Futo, Satoshi; Koiwa, Tomohiro; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Akiyama, Hiroshi; Teshima, Reiko; Kurashima, Takeyo; Takabatake, Reona; Kitta, Kazumi

    2013-01-01

    In this article, we report a novel real-time PCR-based analytical method for quantitation of the GM maize event LY038. We designed LY038-specific and maize endogenous reference DNA-specific PCR amplifications. After confirming the specificity and linearity of the LY038-specific PCR amplification, we determined the conversion factor required to calculate the weight-based content of GM organism (GMO) in a multilaboratory evaluation. Finally, in order to validate the developed method, an interlaboratory collaborative trial according to the internationally harmonized guidelines was performed with blind DNA samples containing LY038 at the mixing levels of 0, 0.5, 1.0, 5.0 and 10.0%. The precision of the method was evaluated as the RSD of reproducibility (RSDR), and the values obtained were all less than 25%. The limit of quantitation of the method was judged to be 0.5% based on the definition of ISO 24276 guideline. The results from the collaborative trial suggested that the developed quantitative method would be suitable for practical testing of LY038 maize.
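
    The weight-based GMO content follows from the measured copy-number ratio and the conversion factor (Cf) determined in the multilaboratory evaluation; a minimal calculation of this kind is sketched below with made-up example values, not data from the study.

    ```python
    def gmo_content_percent(gm_copies, endogenous_copies, cf):
        """GMO % = (GM target copies / endogenous reference copies) / Cf * 100."""
        return (gm_copies / endogenous_copies) / cf * 100.0

    # Illustrative numbers only: a copy ratio of 0.005 with Cf = 0.5 gives 1.0%
    print(gmo_content_percent(gm_copies=120.0, endogenous_copies=24000.0, cf=0.5))
    ```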

  20. Teaching Single-Case Evaluation to Graduate Social Work Students: A Replication

    ERIC Educational Resources Information Center

    Wong, Stephen E.; O'Driscoll, Janice

    2017-01-01

    A course teaching graduate social work students to use an evidence-based model and to evaluate their own practice was replicated and evaluated. Students conducted a project in which they reviewed published research to achieve a clinical goal, applied quantitative measures for ongoing assessment, implemented evidence-based interventions, and…

  1. Evaluation of Multimedia Authoring Instruction Based in a Behaviorist-Cognitive-Constructivist Continuum.

    ERIC Educational Resources Information Center

    Sherry, Annette C.

    1998-01-01

    This evaluative case study examines the learning experiences of graduate students studying effective multimedia authoring. Continuum-based instructional design, behaviorism, cognitivism, constructivism, collaboration, the role of a matrix, transfer of training, and qualitative and quantitative results are discussed. (LRW)

  2. Optimized protocol for quantitative multiple reaction monitoring-based proteomic analysis of formalin-fixed, paraffin embedded tissues

    PubMed Central

    Kennedy, Jacob J.; Whiteaker, Jeffrey R.; Schoenherr, Regine M.; Yan, Ping; Allison, Kimberly; Shipley, Melissa; Lerch, Melissa; Hoofnagle, Andrew N.; Baird, Geoffrey Stuart; Paulovich, Amanda G.

    2016-01-01

    Despite a clinical, economic, and regulatory imperative to develop companion diagnostics, precious few new biomarkers have been successfully translated into clinical use, due in part to inadequate protein assay technologies to support large-scale testing of hundreds of candidate biomarkers in formalin-fixed paraffin embedded (FFPE) tissues. While the feasibility of using targeted, multiple reaction monitoring-mass spectrometry (MRM-MS) for quantitative analyses of FFPE tissues has been demonstrated, protocols have not been systematically optimized for robust quantification across a large number of analytes, nor has the performance of peptide immuno-MRM been evaluated. To address this gap, we used a test battery approach coupled to MRM-MS with the addition of stable isotope labeled standard peptides (targeting 512 analytes) to quantitatively evaluate the performance of three extraction protocols in combination with three trypsin digestion protocols (i.e. 9 processes). A process based on RapiGest buffer extraction and urea-based digestion was identified to enable similar quantitation results from FFPE and frozen tissues. Using the optimized protocols for MRM-based analysis of FFPE tissues, median precision was 11.4% (across 249 analytes). There was excellent correlation between measurements made on matched FFPE and frozen tissues, both for direct MRM analysis (R² = 0.94) and immuno-MRM (R² = 0.89). The optimized process enables highly reproducible, multiplex, standardizable, quantitative MRM in archival tissue specimens. PMID:27462933

  3. Test of Antifibrotic Drugs in a Cellular Model of Fibrosis Based on Muscle-Derived Fibroblasts from Duchenne Muscular Dystrophy Patients.

    PubMed

    Zanotti, Simona; Mora, Marina

    2018-01-01

    An in vitro model of muscle fibrosis, based on primary human fibroblasts isolated from muscle biopsies of patients affected by Duchenne muscular dystrophy (DMD) and cultivated in monolayer and 3D conditions, is used to test the potential antifibrotic activity of pirfenidone (PFD). This in vitro model may also be useful to evaluate the toxicity and efficacy of other candidate molecules for the treatment of fibrosis. Drug toxicity is evaluated using a colorimetric assay based on the conversion of tetrazolium salt (MTT) to insoluble formazan, while the effect of the drug on cell proliferation is measured with the bromodeoxyuridine incorporation assay. The efficacy of the drug is evaluated in fibroblast monolayers by quantitating the synthesis and deposition of intracellular collagen with a spectrophotometric picrosirius red-based assay, and by quantitating cell migration using a "scratch" assay. The efficacy of PFD as an antifibrotic drug is also evaluated in a 3D fibroblast model by measuring the diameters and number of nodules.

  4. LC–MS/MS Quantitation of Esophagus Disease Blood Serum Glycoproteins by Enrichment with Hydrazide Chemistry and Lectin Affinity Chromatography

    PubMed Central

    2015-01-01

    Changes in glycosylation have been shown to have a profound correlation with development/malignancy in many cancer types. Currently, two major enrichment techniques have been widely applied in glycoproteomics, namely, lectin affinity chromatography (LAC)-based and hydrazide chemistry (HC)-based enrichments. Here we report the LC–MS/MS quantitative analyses of human blood serum glycoproteins and glycopeptides associated with esophageal diseases by LAC- and HC-based enrichment. Separate and complementary qualitative and quantitative data analyses of protein glycosylation were performed using both enrichment techniques. Chemometric and statistical evaluations (PCA plots and ANOVA tests, respectively) were employed to determine and confirm candidate cancer-associated glycoprotein/glycopeptide biomarkers. Out of 139 identified glycoproteins, 59 (a 42% overlap) were common to both enrichment techniques. This overlap is very similar to previously published studies. The quantitation and evaluation of significantly changed glycoproteins/glycopeptides are complementary between LAC and HC enrichments. LC–ESI–MS/MS analyses indicated that 7 glycoproteins enriched by LAC and 11 glycoproteins enriched by HC showed significantly different abundances between disease-free and disease cohorts. Multiple reaction monitoring quantitation showed 13 glycopeptides by LAC enrichment and 10 glycosylation sites by HC enrichment to be statistically different among disease cohorts. PMID:25134008
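
    A minimal sketch of the chemometric workflow named above, under the assumption of a samples-by-glycoproteins abundance matrix (all data and dimensions invented): PCA for cohort visualization and a per-glycoprotein one-way ANOVA:

      import numpy as np
      from scipy.stats import f_oneway
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(1)
      # Hypothetical matrix: 30 serum samples (3 cohorts x 10) x 59 glycoproteins.
      X = rng.lognormal(size=(30, 59))
      cohorts = np.repeat([0, 1, 2], 10)

      # PCA on log-transformed abundances (fit_transform centers the data).
      scores = PCA(n_components=2).fit_transform(np.log10(X))

      # One-way ANOVA per glycoprotein across the three cohorts.
      pvals = np.array([f_oneway(*(np.log10(X[cohorts == c, j]) for c in (0, 1, 2))).pvalue
                        for j in range(X.shape[1])])
      print((pvals < 0.05).sum(), "glycoproteins differ at p < 0.05 (before correction)")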

  5. Prospects for Public Library Evaluation.

    ERIC Educational Resources Information Center

    Van House, Nancy A.; Childers, Thomas

    1991-01-01

    Discusses methods of evaluation that can be used to measure public library effectiveness, based on a conference sponsored by the Council on Library Resources. Topics discussed include the Public Library Effectiveness Study (PLES), quantitative and qualitative evaluation, using evaluative information for resource acquisition and resource…

  6. Comparison of culture-based, vital stain and PMA-qPCR methods for the quantitative detection of viable hookworm ova.

    PubMed

    Gyawali, P; Sidhu, J P S; Ahmed, W; Jagals, P; Toze, S

    2017-06-01

    Accurate quantitative measurement of viable hookworm ova from environmental samples is the key to controlling hookworm re-infections in the endemic regions. In this study, the accuracy of three quantitative detection methods [culture-based, vital stain and propidium monoazide-quantitative polymerase chain reaction (PMA-qPCR)] was evaluated by enumerating 1,000 ± 50 Ancylostoma caninum ova in the laboratory. The culture-based method was able to quantify an average of 397 ± 59 viable hookworm ova. Similarly, vital stain and PMA-qPCR methods quantified 644 ± 87 and 587 ± 91 viable ova, respectively. The numbers of viable ova estimated by the culture-based method were significantly (P < 0.05) lower than vital stain and PMA-qPCR methods. Therefore, both PMA-qPCR and vital stain methods appear to be suitable for the quantitative detection of viable hookworm ova. However, PMA-qPCR would be preferable over the vital stain method in scenarios where ova speciation is needed.

  7. Development and application of a new grey dynamic hierarchy analysis system (GDHAS) for evaluating urban ecological security.

    PubMed

    Shao, Chaofeng; Tian, Xiaogang; Guan, Yang; Ju, Meiting; Xie, Qiang

    2013-05-21

    Selecting indicators based on the characteristics and development trends of a given study area is essential for building a framework for assessing urban ecological security. However, few studies have focused on how to select the representative indicators systematically, and quantitative research is lacking. We developed an innovative quantitative modeling approach called the grey dynamic hierarchy analytic system (GDHAS) for both the procedures of indicator selection and quantitative assessment of urban ecological security. Next, a systematic methodology based on the GDHAS is developed to assess urban ecological security comprehensively and dynamically. This assessment includes indicator selection, driving force-pressure-state-impact-response (DPSIR) framework building, and quantitative evaluation. We applied this systematic methodology to assess the urban ecological security of Tianjin, a typical coastal super-megalopolis and industrial base in China. This case study highlights the key features of our approach. First, 39 representative indicators are selected for the evaluation index system from the 62 alternatives available through the GDHAS. Second, the DPSIR framework is established based on the indicators selected, and the quantitative assessment of the eco-security of Tianjin is conducted. The results illustrate the following: urban ecological security of Tianjin in 2008 was at the alert level but not very stable; the driving force and pressure subsystems were in good condition, but the eco-security levels of the remaining subsystems were relatively low; the pressure subsystem was the key to urban ecological security; and 10 indicators are defined as the key indicators for the five subsystems. These results can be used as the basis for urban eco-environmental management.

  9. Quantitative Assessment of Commutability for Clinical Viral Load Testing Using a Digital PCR-Based Reference Standard

    PubMed Central

    Tang, L.; Sun, Y.; Buelow, D.; Gu, Z.; Caliendo, A. M.; Pounds, S.

    2016-01-01

    Given recent advances in the development of quantitative standards, particularly WHO international standards, efforts to better understand the commutability of reference materials have been made. Existing approaches to evaluating commutability include prediction intervals and correspondence analysis; however, the results obtained from existing approaches may be ambiguous. We have developed a "deviation-from-ideal" (DFI) approach to evaluate the commutability of standards and applied it to the assessment of Epstein-Barr virus (EBV) load testing in four quantitative PCR assays, treating digital PCR as a reference assay. We then discuss advantages and limitations of the DFI approach as well as experimental design to best evaluate the commutability of an assay in practice. PMID:27076654
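
    The record does not reproduce the DFI formula; a minimal sketch, assuming deviation-from-ideal is summarized as each assay's departure from the digital PCR reference on the log10 scale (values invented):

      import numpy as np

      # Hypothetical EBV panel: log10 load by digital PCR (reference) and one qPCR assay.
      dpcr  = np.array([2.1, 3.0, 4.2, 5.1, 6.0])
      assay = np.array([2.4, 3.2, 4.1, 5.6, 6.3])

      # Assumed DFI-style summary: per-sample deviation from the reference assay,
      # reported as mean bias and spread across the panel.
      dev = assay - dpcr
      print(f"mean deviation: {dev.mean():+.2f} log10 IU/mL, SD: {dev.std(ddof=1):.2f}")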

  10. Cartilage Repair Surgery: Outcome Evaluation by Using Noninvasive Cartilage Biomarkers Based on Quantitative MRI Techniques?

    PubMed Central

    Jungmann, Pia M.; Baum, Thomas; Bauer, Jan S.; Karampinos, Dimitrios C.; Link, Thomas M.; Li, Xiaojuan; Trattnig, Siegfried; Rummeny, Ernst J.; Woertler, Klaus; Welsch, Goetz H.

    2014-01-01

    Background. New quantitative magnetic resonance imaging (MRI) techniques are increasingly applied as outcome measures after cartilage repair. Objective. To review the current literature on the use of quantitative MRI biomarkers for evaluation of cartilage repair at the knee and ankle. Methods. Using a PubMed literature search, studies on biochemical, quantitative MR imaging of cartilage repair were identified and reviewed. Results. Quantitative MR biomarkers detect early degeneration of articular cartilage, mainly represented by increasing water content, collagen disruption, and proteoglycan loss. Recently, the feasibility of biochemical MR imaging of cartilage repair tissue and surrounding cartilage was demonstrated. Ultrastructural properties of the tissue after different repair procedures result in differences in imaging characteristics. T2 mapping, T1rho mapping, delayed gadolinium-enhanced MRI of cartilage (dGEMRIC), and diffusion-weighted imaging (DWI) are applicable on most clinical 1.5 T and 3 T MR scanners. Currently, a standard of reference is difficult to define and knowledge is limited concerning the correlation of clinical and MR findings. The lack of histological correlations complicates the identification of the exact tissue composition. Conclusions. A multimodal approach combining several quantitative MRI techniques in addition to morphological and clinical evaluation might be promising. Further investigations are required to demonstrate the potential for outcome evaluation after cartilage repair. PMID:24877139

  11. Association between quantitative measures obtained using fluorescence-based methods and activity status of occlusal caries lesions in primary molars.

    PubMed

    Novaes, Tatiane Fernandes; Reyes, Alessandra; Matos, Ronilza; Antunes-Pontes, Laura Regina; Marques, Renata Pereira de Samuel; Braga, Mariana Minatel; Diniz, Michele Baffi; Mendes, Fausto Medeiros

    2017-05-01

    Fluorescence-based methods (FBM) can add objectivity to caries diagnosis strategies. Few studies, however, have focused on the evaluation of caries activity. The aim was to evaluate the association between quantitative measures obtained with FBM, clinical parameters acquired from the patients, caries detection, and assessment of activity status on occlusal surfaces of primary molars. Six hundred and six teeth from 113 children (4-14 years) were evaluated. The presence of biofilm, caries experience, and the number of active lesions were recorded. The teeth were assessed using FBM: the DIAGNOdent pen (LFpen) and quantitative light-induced fluorescence (QLF). As the reference standard, all teeth were evaluated using the ICDAS (International Caries Detection and Assessment System) associated with clinical activity assessments. Multilevel regressions compared the FBM values and evaluated the association between the FBM measures and clinical variables related to caries activity. The measures from the FBM were higher in cavitated lesions. Only ∆F values distinguished active from inactive lesions. The LFpen measures were higher in active lesions at the cavitated threshold (56.95 ± 29.60). Following regression analyses, only the presence of visible biofilm on occlusal surfaces (adjusted prevalence ratio = 1.43) and ∆R values of the teeth (adjusted prevalence ratio = 1.02) were associated with caries activity. Some quantitative measures from FBM parameters are associated with caries activity evaluation, similar to the clinical evaluation of the presence of visible biofilm. © 2016 BSPD, IAPD and John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  12. MR Imaging-based Semi-quantitative Methods for Knee Osteoarthritis

    PubMed Central

    JARRAYA, Mohamed; HAYASHI, Daichi; ROEMER, Frank Wolfgang; GUERMAZI, Ali

    2016-01-01

    Magnetic resonance imaging (MRI)-based semi-quantitative (SQ) methods applied to knee osteoarthritis (OA) have been introduced during the last decade and have fundamentally changed our understanding of knee OA pathology since then. Several epidemiological studies and clinical trials have used MRI-based SQ methods to evaluate different outcome measures. Interest in MRI-based SQ scoring systems has led to their continuous update and refinement. This article reviews the different SQ approaches for MRI-based whole-organ assessment of knee OA and also discusses practical aspects of whole-joint assessment. PMID:26632537

  13. Dissociative conceptual and quantitative problem solving outcomes across interactive engagement and traditional format introductory physics

    NASA Astrophysics Data System (ADS)

    McDaniel, Mark A.; Stoen, Siera M.; Frey, Regina F.; Markow, Zachary E.; Hynes, K. Mairin; Zhao, Jiuqing; Cahill, Michael J.

    2016-12-01

    The existing literature indicates that interactive-engagement (IE) based general physics classes improve conceptual learning relative to more traditional lecture-oriented classrooms. Very little research, however, has examined quantitative problem-solving outcomes from IE-based relative to traditional lecture-based physics classes. The present study included both pre- and post-course conceptual-learning assessments and a new quantitative physics problem-solving assessment that included three representative conservation of energy problems from a first-semester calculus-based college physics course. Scores for problem translation, plan coherence, solution execution, and evaluation of solution plausibility were extracted for each problem. Over 450 students in three IE-based sections and two traditional lecture sections taught at the same university during the same semester participated. As expected, the IE-based course produced more robust gains on a Force Concept Inventory than did the lecture course. By contrast, when the full sample was considered, gains in quantitative problem solving were significantly greater for lecture than for IE-based physics; when students were matched on pre-test scores, there was still no advantage for IE-based physics on gains in quantitative problem solving. Further, the association between performance on the concept inventory and quantitative problem solving was minimal. These results highlight that improved conceptual understanding does not necessarily support improved quantitative physics problem solving, and that the instructional method appears to have less bearing on gains in quantitative problem solving than do the kinds of problems emphasized in the course and homework and the degree to which those problems overlap with the assessment.

  14. Development and application of absolute quantitative detection by duplex chamber-based digital PCR of genetically modified maize events without pretreatment steps.

    PubMed

    Zhu, Pengyu; Fu, Wei; Wang, Chenguang; Du, Zhixin; Huang, Kunlun; Zhu, Shuifang; Xu, Wentao

    2016-04-15

    The possibility of the absolute quantitation of GMO events by digital PCR was recently reported. However, most absolute quantitation methods based on digital PCR require pretreatment steps. Meanwhile, singleplex detection cannot meet the demands of absolute quantitation of GMO events, which is based on the ratio of foreign fragments to reference genes. Thus, to promote the absolute quantitative detection of different GMO events by digital PCR, we developed a quantitative detection method based on duplex digital PCR without pretreatment. Moreover, we tested 7 GMO events in our study to evaluate the fitness of our method. The optimized combination of foreign and reference primers, the limit of quantitation (LOQ), the limit of detection (LOD), and specificity were validated. The results showed that the LOQ of our method for different GMO events was 0.5%, while the LOD was 0.1%. Additionally, we found that duplex digital PCR achieved detection results with lower RSD than singleplex digital PCR. In summary, the duplex digital PCR detection system is a simple and stable way to achieve the absolute quantitation of different GMO events. Moreover, the LOQ and LOD indicated that this method is suitable for the daily detection and quantitation of GMO events. Copyright © 2016 Elsevier B.V. All rights reserved.
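
    The arithmetic underlying chamber-based digital PCR quantitation is standard: each target's copy number follows from a Poisson correction of the positive-partition fraction, and GM content is the ratio of foreign-fragment to reference-gene copies. A sketch with invented partition counts:

      import math

      def copies_per_reaction(n_positive: int, n_total: int) -> float:
          """Poisson-corrected mean copies per partition, scaled to the whole reaction."""
          p = n_positive / n_total
          return -math.log(1.0 - p) * n_total

      # Hypothetical duplex read-out from one 20,000-partition chip.
      event_copies = copies_per_reaction(n_positive=312, n_total=20000)    # foreign fragment
      ref_copies   = copies_per_reaction(n_positive=15000, n_total=20000)  # endogenous reference

      gmo_percent = 100.0 * event_copies / ref_copies
      print(f"GM content: {gmo_percent:.2f}%")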

  15. Development and Evaluation of a Multimedia e-Learning Resource for Electrolyte and Acid-Base Disorders

    ERIC Educational Resources Information Center

    Davids, Mogamat Razeen; Chikte, Usuf M. E.; Halperin, Mitchell L.

    2011-01-01

    This article reports on the development and evaluation of a Web-based application that provides instruction and hands-on practice in managing electrolyte and acid-base disorders. Our teaching approach, which focuses on concepts rather than details, encourages quantitative analysis and a logical problem-solving approach. Identifying any dangers to…

  16. An Evaluation of High School Curricula Employing the Element-Based Curriculum Development Model

    ERIC Educational Resources Information Center

    Aslan, Dolgun; Günay, Rafet

    2016-01-01

    This study was conducted with the aim of evaluating the curricula that constitute the basis of education provision at high schools in Turkey from the perspective of the teachers involved. A descriptive survey model, a quantitative research method, was employed in this study. An item-based curriculum evaluation model was employed as part of the…

  17. Study on quantitative risk assessment model of the third party damage for natural gas pipelines based on fuzzy comprehensive assessment

    NASA Astrophysics Data System (ADS)

    Qiu, Zeyang; Liang, Wei; Wang, Xue; Lin, Yang; Zhang, Meng

    2017-05-01

    As an important part of the national energy supply system, transmission pipelines for natural gas can cause serious environmental pollution and loss of life and property in case of accident. Third party damage is one of the most significant causes of natural gas pipeline system accidents, and it is very important to establish an effective quantitative risk assessment model of third party damage for reducing the number of gas pipeline operation accidents. Because third party damage accidents have characteristics such as diversity, complexity, and uncertainty, this paper establishes a quantitative risk assessment model of third party damage based on the Analytic Hierarchy Process (AHP) and Fuzzy Comprehensive Evaluation (FCE). Firstly, risk sources of third party damage are identified; next, the weights of the factors are determined via an improved AHP; finally, the importance of each factor is calculated with the fuzzy comprehensive evaluation model. The results show that the quantitative risk assessment model is suitable for third party damage to natural gas pipelines, and improvement measures can be put forward to avoid accidents based on the importance of each factor.
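
    A compact sketch of the two computational steps named above (the pairwise-comparison and fuzzy membership matrices are invented for illustration): AHP weights from the principal eigenvector with a consistency check, followed by a weighted fuzzy comprehensive evaluation B = W·R:

      import numpy as np

      # Hypothetical AHP pairwise-comparison matrix for three third-party-damage factors.
      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])

      # Weights = normalized principal eigenvector of A.
      vals, vecs = np.linalg.eig(A)
      w = np.real(vecs[:, np.argmax(np.real(vals))])
      w = w / w.sum()

      # Consistency ratio (random index RI = 0.58 for a 3x3 matrix).
      lam_max = np.max(np.real(vals))
      cr = (lam_max - len(A)) / (len(A) - 1) / 0.58
      print("weights:", np.round(w, 3), " CR:", round(cr, 3))

      # Hypothetical fuzzy membership matrix R: factors x risk grades (low/medium/high).
      R = np.array([[0.1, 0.3, 0.6],
                    [0.2, 0.5, 0.3],
                    [0.5, 0.4, 0.1]])

      # Weighted-average fuzzy composition; the largest component gives the risk grade.
      B = w @ R
      print("evaluation vector:", np.round(B, 3),
            "-> grade:", ["low", "medium", "high"][int(np.argmax(B))])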

  18. A novel approach for evaluating the performance of real time quantitative loop-mediated isothermal amplification-based methods.

    PubMed

    Nixon, Gavin J; Svenstrup, Helle F; Donald, Carol E; Carder, Caroline; Stephenson, Judith M; Morris-Jones, Stephen; Huggett, Jim F; Foy, Carole A

    2014-12-01

    Molecular diagnostic measurements are currently underpinned by the polymerase chain reaction (PCR). There are also a number of alternative nucleic acid amplification technologies which, unlike PCR, work at a single temperature. These 'isothermal' methods reportedly offer potential advantages over PCR, such as simplicity, speed, and resistance to inhibitors, and could also be used for quantitative molecular analysis. However, there are currently limited mechanisms to evaluate their quantitative performance, which would assist assay development and study comparisons. This study uses a sexually transmitted infection diagnostic model in combination with an adapted metric termed the isothermal doubling time (IDT), akin to PCR efficiency, to compare quantitative PCR and quantitative loop-mediated isothermal amplification (qLAMP) assays, and to quantify the impact of matrix interference. The performance metric described here facilitates the comparison of qLAMP assays, which could assist assay development and validation activities.
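
    The paper's exact IDT definition is not reproduced in this record; a minimal sketch, assuming the isothermal doubling time is estimated from two fluorescence readings in the exponential phase of the real-time amplification curve:

      import math

      def isothermal_doubling_time(t1, f1, t2, f2):
          """Doubling time (min) assuming exponential growth between two readings."""
          return (t2 - t1) * math.log(2) / math.log(f2 / f1)

      # Hypothetical qLAMP exponential-phase readings: 8-fold rise over 6 minutes.
      print(f"IDT = {isothermal_doubling_time(18.0, 1200.0, 24.0, 9600.0):.2f} min")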

  19. Quantitative AOP-based predictions for two aromatase inhibitors evaluating the influence of bioaccumulation on prediction accuracy

    EPA Science Inventory

    The adverse outcome pathway (AOP) framework can be used to support the use of mechanistic toxicology data as a basis for risk assessment. For certain risk contexts this includes defining, quantitative linkages between the molecular initiating event (MIE) and subsequent key events...

  20. Behavioral Changes Based on a Course in Agroecology: A Mixed Methods Study

    ERIC Educational Resources Information Center

    Harms, Kristyn; King, James; Francis, Charles

    2009-01-01

    This study evaluated and described student perceptions of a course in agroecology to determine whether participants experienced changed perceptions and behaviors resulting from the Agroecosystems Analysis course. A triangulation (validating quantitative data) mixed-methods approach included a written survey composed of both quantitative and open-ended…

  1. New Statistical Techniques for Evaluating Longitudinal Models.

    ERIC Educational Resources Information Center

    Murray, James R.; Wiley, David E.

    A basic methodological approach in developmental studies is the collection of longitudinal data. Behavioral data can take at least two forms, qualitative (or discrete) and quantitative. Both types are fallible. Measurement errors can occur in quantitative data, and measures of these are based on error variance. Qualitative or discrete data can…

  2. Virtualising the Quantitative Research Methods Course: An Island-Based Approach

    ERIC Educational Resources Information Center

    Baglin, James; Reece, John; Baker, Jenalle

    2015-01-01

    Many recent improvements in pedagogical practice have been enabled by the rapid development of innovative technologies, particularly for teaching quantitative research methods and statistics. This study describes the design, implementation, and evaluation of a series of specialised computer laboratory sessions. The sessions combined the use of an…

  3. 76 FR 37620 - Risk-Based Capital Standards: Advanced Capital Adequacy Framework-Basel II; Establishment of a...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-28

    ... systems. E. Quantitative Methods for Comparing Capital Frameworks The NPR sought comment on how the... industry while assessing levels of capital. This commenter points out maintaining reliable comparative data over time could make quantitative methods for this purpose difficult. For example, evaluating asset...

  4. Evaluation of a Community-Based Participatory Research Consortium from the Perspective of Academics and Community Service Providers Focused on Child Health and Well-Being

    ERIC Educational Resources Information Center

    Pivik, Jayne R.; Goelman, Hillel

    2011-01-01

    A process evaluation of a consortium of academic researchers and community-based service providers focused on the health and well-being of children and families provides empirical and practice-based evidence of those factors important for community-based participatory research (CBPR). This study draws on quantitative ratings of 33 factors…

  5. Understanding Acid-Base Concepts: Evaluating the Efficacy of a Senior High School Student-Centred Instructional Program in Indonesia

    ERIC Educational Resources Information Center

    Rahayu, Sri; Chandrasegaran, A. L.; Treagust, David F.; Kita, Masakazu; Ibnu, Suhadi

    2011-01-01

    This study was a mixed quantitative-qualitative study to evaluate the efficacy of a designed student-centred instructional (DSCI) program for teaching about acids and bases. The teaching innovation was designed based on constructivist, hands-on inquiry, and context-based approaches and implemented in seven 45-min lessons with a class of 36 grade…

  6. [Reconsidering evaluation criteria regarding health care research: toward an integrative framework of quantitative and qualitative criteria].

    PubMed

    Miyata, Hiroaki; Kai, Ichiro

    2006-05-01

    Debate about the relationship between quantitative and qualitative paradigms is often muddled and confused, and the clutter of terms and arguments has resulted in the concepts becoming obscure and unrecognizable. It is therefore very important to reconsider evaluation criteria regarding rigor in social science. As Lincoln & Guba have already compared quantitative paradigms (validity, reliability, neutrality, generalizability) with qualitative paradigms (credibility, dependability, confirmability, transferability), we discuss the use of evaluation criteria from a pragmatic perspective. Validity/credibility concerns the observational framework, while reliability/dependability refers to the range of stability in observations, neutrality/confirmability reflects influences between observers and subjects, and generalizability/transferability captures epistemological differences in the way findings are applied. Qualitative studies, however, do not always choose the qualitative paradigms. If some degree of stability can be assumed, it is better to use the quantitative paradigm (reliability). Moreover, as a quantitative study cannot always guarantee a perfect observational framework, with stability in all phases of observation, it is useful to use qualitative paradigms to enhance the rigor of the study.

  7. Visual salience metrics for image inpainting

    NASA Astrophysics Data System (ADS)

    Ardis, Paul A.; Singhal, Amit

    2009-01-01

    Quantitative metrics for successful image inpainting currently do not exist, with researchers instead relying upon qualitative human comparisons to evaluate their methodologies and techniques. In an attempt to rectify this situation, we propose two new metrics to capture the notions of noticeability and visual intent in order to evaluate inpainting results. The proposed metrics use a quantitative measure of visual salience based upon a computational model of human visual attention. We demonstrate how these two metrics repeatably correlate with qualitative opinion in a human observer study, correctly identify the optimum uses for exemplar-based inpainting (as specified in the original publication), and match qualitative opinion in published examples.
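
    The authors' attention model is not specified here; a rough proxy in the same spirit scores an inpainted region by its mean saliency relative to the whole image, using the classic spectral-residual saliency map (all data invented):

      import numpy as np
      from scipy.ndimage import uniform_filter, gaussian_filter

      def spectral_residual_saliency(gray: np.ndarray) -> np.ndarray:
          """Spectral-residual saliency map for a 2D grayscale image in [0, 1]."""
          spectrum = np.fft.fft2(gray)
          log_amp = np.log(np.abs(spectrum) + 1e-9)
          residual = log_amp - uniform_filter(log_amp, size=3)
          sal = np.abs(np.fft.ifft2(np.exp(residual + 1j * np.angle(spectrum)))) ** 2
          sal = gaussian_filter(sal, sigma=2.5)
          return (sal - sal.min()) / (sal.max() - sal.min() + 1e-9)

      def noticeability(gray: np.ndarray, mask: np.ndarray) -> float:
          """Mean saliency inside the inpainted region relative to the whole image."""
          sal = spectral_residual_saliency(gray)
          return float(sal[mask].mean() / sal.mean())

      # Hypothetical usage: a random image with an inpainted square region.
      img = np.random.default_rng(2).random((128, 128))
      mask = np.zeros_like(img, dtype=bool)
      mask[40:60, 40:60] = True
      print(f"noticeability score: {noticeability(img, mask):.2f}  (< 1 = inconspicuous)")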

  8. A novel multi-walled carbon nanotube-based antibody conjugate for quantitative and semi-quantitative lateral flow assays.

    PubMed

    Sun, Wenjuan; Hu, Xiaolong; Liu, Jia; Zhang, Yurong; Lu, Jianzhong; Zeng, Libo

    2017-10-01

    In this study, multi-walled carbon nanotubes (MWCNTs) were applied in lateral flow strips (LFS) for semi-quantitative and quantitative assays. Firstly, the solubility of MWCNTs was improved using various surfactants to enhance their biocompatibility for practical application. The dispersed MWCNTs were conjugated with the methamphetamine (MET) antibody in a non-covalent manner and then manufactured into the LFS for the quantitative detection of MET. The MWCNT-based lateral flow assay (MWCNTs-LFA) exhibited an excellent linear relationship between the test line values and MET when its concentration ranged from 62.5 to 1500 ng/mL. The sensitivity of the LFS was evaluated by conjugating MWCNTs with the HCG antibody, and the MWCNT-conjugated method proved 10 times more sensitive than one conjugated with classical colloidal gold nanoparticles. Taken together, our data demonstrate that MWCNTs-LFA is a more sensitive and reliable assay for semi-quantitative and quantitative detection, which can be used in forensic analysis.

  9. Framework for quantitative evaluation of 3D vessel segmentation approaches using vascular phantoms in conjunction with 3D landmark localization and registration

    NASA Astrophysics Data System (ADS)

    Wörz, Stefan; Hoegen, Philipp; Liao, Wei; Müller-Eschner, Matthias; Kauczor, Hans-Ulrich; von Tengg-Kobligk, Hendrik; Rohr, Karl

    2016-03-01

    We introduce a framework for quantitative evaluation of 3D vessel segmentation approaches using vascular phantoms. Phantoms are designed using a CAD system and created with a 3D printer, and comprise realistic shapes including branches and pathologies such as abdominal aortic aneurysms (AAA). To transfer ground truth information to the 3D image coordinate system, we use a landmark-based registration scheme utilizing fiducial markers integrated in the phantom design. For accurate 3D localization of the markers we developed a novel 3D parametric intensity model that is directly fitted to the markers in the images. We also performed a quantitative evaluation of different vessel segmentation approaches for a phantom of an AAA.
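
    A minimal sketch of the landmark-based registration step (fiducial coordinates invented): the least-squares rigid transform between corresponding 3D marker sets via SVD, i.e., the Kabsch algorithm:

      import numpy as np

      def rigid_landmark_registration(P, Q):
          """Least-squares rotation R and translation t mapping points P onto Q (Kabsch)."""
          cp, cq = P.mean(axis=0), Q.mean(axis=0)
          H = (P - cp).T @ (Q - cq)
          U, _, Vt = np.linalg.svd(H)
          D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
          R = Vt.T @ D @ U.T
          return R, cq - R @ cp

      # Hypothetical fiducial markers in CAD (phantom) and image coordinates.
      cad = np.array([[0, 0, 0], [50, 0, 0], [0, 50, 0], [0, 0, 50]], dtype=float)
      theta = np.deg2rad(30)
      R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                         [np.sin(theta),  np.cos(theta), 0],
                         [0, 0, 1]])
      img = cad @ R_true.T + np.array([10.0, -5.0, 2.0])

      R, t = rigid_landmark_registration(cad, img)
      print("fiducial registration error:",
            np.linalg.norm(cad @ R.T + t - img, axis=1).max())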

  10. A Database Evaluation Based on Information Needs of Academic Social Scientists.

    ERIC Educational Resources Information Center

    Buterbaugh, Nancy Toth

    This study evaluates two databases, "Historical Abstracts" and REESWeb, to determine their effectiveness in supporting academic social science research. While many performance evaluations gather quantitative data from isolated query and response transactions, this study is a qualitative evaluation of the databases in the context of…

  11. Consequences of No Child Left Behind on Evaluation Purpose, Design, and Impact

    ERIC Educational Resources Information Center

    Mabry, Linda

    2008-01-01

    As an outgrowth of No Child Left Behind's narrow definition of scientifically based research, the priority given to certain quantitative evaluation designs has sparked debate among those in the evaluation community. Federal mandates for particular evaluation methodologies run counter to evaluation practice and to the direction of most evaluation…

  12. Retinal status analysis method based on feature extraction and quantitative grading in OCT images.

    PubMed

    Fu, Dongmei; Tong, Hejun; Zheng, Shuang; Luo, Ling; Gao, Fulin; Minar, Jiri

    2016-07-22

    Optical coherence tomography (OCT) is widely used in ophthalmology for viewing the morphology of the retina, which is important for disease detection and for assessing therapeutic effect. The diagnosis of retinal diseases is based primarily on the subjective analysis of OCT images by trained ophthalmologists. This paper describes an automatic OCT image analysis method for computer-aided disease diagnosis, a critical part of eye fundus diagnosis. This study analyzed 300 OCT images acquired by Optovue Avanti RTVue XR (Optovue Corp., Fremont, CA). Firstly, a normal retinal reference model based on retinal boundaries was presented. Subsequently, two kinds of quantitative methods, based on geometric features and morphological features, were proposed. The paper puts forward a retinal abnormality grading decision-making method, used in the analysis and evaluation of multiple OCT images, and shows the detailed analysis process for four retinal OCT images with different degrees of abnormality. The final grading results verified that the analysis method can distinguish abnormal severity and lesion regions. In a simulation with the 150 test images, analysis of retinal status showed a sensitivity of 0.94 and a specificity of 0.92. The proposed method can speed up the diagnostic process and objectively evaluate retinal status. The method obtains parameters and features associated with retinal morphology; quantitative analysis and evaluation of these features, combined with the reference model, enable abnormality judgment of the target image and provide a reference for disease diagnosis.
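
    For reference, the reported figures follow the usual confusion-matrix definitions; the counts below are invented so as to reproduce them:

      # Hypothetical split of the 150 test images consistent with the reported figures.
      tp, fn = 47, 3    # abnormal images: correctly / incorrectly graded
      tn, fp = 92, 8    # normal images:   correctly / incorrectly graded

      sensitivity = tp / (tp + fn)   # 47/50 = 0.94
      specificity = tn / (tn + fp)   # 92/100 = 0.92
      print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}")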

  13. 77 FR 75167 - Agency Forms Undergoing Paperwork Reduction Act Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-19

    .... This allows us to proceed with confidence in the method, the contractor, and the survey design. The... a 30-minute, web-based survey. Data from the survey will then be quantitatively evaluated to... involved with GYT. This evaluation study will rely on a Web-based survey to be self- administered at home...

  14. How to quantitatively evaluate safety of driver behavior upon accident? A biomechanical methodology

    PubMed Central

    Zhang, Wen; Cao, Jieer

    2017-01-01

    How to evaluate driver spontaneous reactions in various collision patterns in a quantitative way is one of the most important topics in vehicle safety. Firstly, this paper constructs representative numerical crash scenarios described by impact velocity, impact angle and contact position based on a finite element (FE) computation platform. Secondly, a driver cabin model is extracted and described in a well-validated multi-rigid-body (MB) model to compute the value of a weighted injury criterion to quantitatively assess drivers' overall injury under certain circumstances. Furthermore, based on the coupling of FE and MB, parametric studies on various crash scenarios are conducted. It is revealed that the WIC (Weighted Injury Criteria) value variation law under high impact velocities is quite distinct compared with that at low impact velocities. In addition, the coupling effect can be elucidated by the fact that the difference in WIC value among three impact velocities under smaller impact angles tends to be distinctly higher than that under larger impact angles. Meanwhile, high impact velocity also increases the sensitivity of WIC under different collision positions and impact angles. Results may provide a new methodology to quantitatively evaluate driving behaviors and serve as a significant guiding step towards collision avoidance for autonomous driving vehicles. PMID:29240789

  16. Quantitative Appearance Inspection for Film Coated Tablets.

    PubMed

    Yoshino, Hiroyuki; Yamashita, Kazunari; Iwao, Yasunori; Noguchi, Shuji; Itai, Shigeru

    2016-01-01

    The decision criteria for the physical appearance of pharmaceutical products are subjective, qualitative means of evaluation based entirely on human interpretation. In this study, we have developed a comprehensive method for the quantitative analysis of the physical appearance of film coated tablets. Three different kinds of film coated tablets with considerable differences in their physical appearances were manufactured as models, and their surface roughness, contact angle, color measurements, and physicochemical properties were investigated as potential characteristics for the quantitative analysis of their physical appearance. All of these characteristics were useful for the quantitative evaluation of the physical appearances of the tablets, and could potentially be used to establish decision criteria to assess the quality of tablets. In particular, the analysis of the surface roughness and film coating properties of the tablets by terahertz spectroscopy allowed for an effective evaluation of the tablets' properties. These results indicated the possibility of inspecting the appearance of tablets during the film coating process.
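
    As one concrete example of the appearance characteristics listed above, a sketch (profile heights invented) of the standard arithmetic (Ra) and root-mean-square (Rq) roughness of a measured height profile:

      import numpy as np

      # Hypothetical surface height profile of a tablet film coating (micrometres).
      z = np.array([0.12, -0.05, 0.08, -0.11, 0.03, 0.09, -0.07, 0.02])

      dz = z - z.mean()                 # deviations from the mean line
      ra = np.mean(np.abs(dz))          # arithmetic mean roughness
      rq = np.sqrt(np.mean(dz ** 2))    # root-mean-square roughness
      print(f"Ra = {ra:.3f} um, Rq = {rq:.3f} um")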

  17. Conflicts Management Model in School: A Mixed Design Study

    ERIC Educational Resources Information Center

    Dogan, Soner

    2016-01-01

    The object of this study is to evaluate the reasons for conflicts occurring in school according to perceptions and views of teachers and resolution strategies used for conflicts and to build a model based on the results obtained. In the research, explanatory design including quantitative and qualitative methods has been used. The quantitative part…

  18. Recidivism, Disciplinary History, and Institutional Adjustment: A Quantitative Study Examining Correctional Education Programs

    ERIC Educational Resources Information Center

    Flamer, Eric, Sr.

    2012-01-01

    Establishing college-degree programs for prison inmates is an evidence-based effective instructional strategy in reducing recidivism. Evaluating academic arenas as a resource to improve behavior and levels of functioning within correctional facilities is a necessary component of inmate academic programs. The purpose of this quantitative,…

  19. Statistical thermodynamics unveils the dissolution mechanism of cellobiose.

    PubMed

    Nicol, Thomas W J; Isobe, Noriyuki; Clark, James H; Shimizu, Seishi

    2017-08-30

    In the study of the cellulose dissolution mechanism, opinion is still divided. Here, the solution interaction components of the most prominent hypotheses for the driving force of cellulose dissolution were evaluated quantitatively. Combining a rigorous statistical thermodynamic theory and cellobiose solubility data in the presence of chloride salts whose cations progress along the Hofmeister series (KCl, NaCl, LiCl and ZnCl2), we have shown that cellobiose solubilization is driven by the preferential accumulation of salts around the solutes, which is stronger than cellobiose hydration. Yet contrary to the classical chaotropy hypothesis, increasing salt concentration leads to cellobiose dehydration in the presence of the strongest solubilizer, ZnCl2. However, thanks to cellobiose dehydration, the cellobiose-salt interaction still remains preferential despite weakening salt accumulation. Based on such insights, the previous hypotheses based on hydrophobicity and polymer charging have also been evaluated quantitatively. Thus, our present study paves the way towards identifying the basic driving forces for cellulose solubilization in a quantitative manner for the first time. When combined with unit additivity methods, this quantitative information could lead to a full understanding of cellulose solubility.

  20. Systematic assessment of survey scan and MS2-based abundance strategies for label-free quantitative proteomics using high-resolution MS data.

    PubMed

    Tu, Chengjian; Li, Jun; Sheng, Quanhu; Zhang, Ming; Qu, Jun

    2014-04-04

    Survey-scan-based label-free methods have shown no compelling benefit over fragment ion (MS2)-based approaches when low-resolution mass spectrometry (MS) was used; however, the growing prevalence of high-resolution analyzers may have changed the game. This necessitates an updated, comparative investigation of these approaches for data acquired by high-resolution MS. Here, we compared survey scan-based (ion current, IC) and MS2-based abundance features, including spectral-count (SpC) and MS2 total-ion-current (MS2-TIC), for quantitative analysis using various high-resolution LC/MS data sets. Key discoveries include: (i) study with seven different biological data sets revealed that only IC achieved high reproducibility for lower-abundance proteins; (ii) evaluation with 5-replicate analyses of a yeast sample showed IC provided much higher quantitative precision and less missing data; (iii) IC, SpC, and MS2-TIC all showed good quantitative linearity (R2 > 0.99) over a >1000-fold concentration range; (iv) both MS2-TIC and IC showed good linear response to various protein loading amounts, but not SpC; (v) quantification using a well-characterized CPTAC data set showed that IC exhibited markedly higher quantitative accuracy, higher sensitivity, and lower false-positives/false-negatives than both SpC and MS2-TIC. Therefore, IC achieved an overall superior performance over the MS2-based strategies in terms of reproducibility, missing data, quantitative dynamic range, quantitative accuracy, and biomarker discovery.
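
    A sketch of the reproducibility comparison in discovery (i): per-protein coefficients of variation across five replicates for each abundance feature, compared by their medians (synthetic data; real IC and SpC values would come from the LC/MS pipeline):

      import numpy as np

      rng = np.random.default_rng(3)
      n_proteins, n_reps = 1000, 5

      # Synthetic replicate measurements: ion current (tighter) vs spectral counts (noisier).
      ic  = rng.lognormal(mean=10, sigma=0.08, size=(n_proteins, n_reps))
      spc = rng.poisson(lam=8, size=(n_proteins, n_reps)).astype(float) + 1

      for name, x in (("IC", ic), ("SpC", spc)):
          cv = x.std(axis=1, ddof=1) / x.mean(axis=1) * 100
          print(f"{name}: median CV = {np.median(cv):.1f}%")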

  2. Mechanical Model Analysis for Quantitative Evaluation of Liver Fibrosis Based on Ultrasound Tissue Elasticity Imaging

    NASA Astrophysics Data System (ADS)

    Shiina, Tsuyoshi; Maki, Tomonori; Yamakawa, Makoto; Mitake, Tsuyoshi; Kudo, Masatoshi; Fujimoto, Kenji

    2012-07-01

    Precise evaluation of the stage of chronic hepatitis C with respect to fibrosis has become an important issue to prevent the occurrence of cirrhosis and to initiate appropriate therapeutic intervention such as viral eradication using interferon. Ultrasound tissue elasticity imaging, i.e., elastography, can visualize tissue hardness/softness, and its clinical usefulness has been studied for detecting and evaluating tumors. We have recently reported that the texture of the elasticity image changes as fibrosis progresses. To evaluate fibrosis progression quantitatively on the basis of ultrasound tissue elasticity imaging, we introduced a mechanical model of fibrosis progression, simulated the process by which hepatic fibrosis affects elasticity images, and compared the results with those of clinical data analysis. As a result, it was confirmed that even in diffuse diseases like chronic hepatitis, the patterns of elasticity images are related to fibrous structural changes caused by hepatic disease and can be used to derive features for quantitative evaluation of fibrosis stage.

  3. Analytical insight into "breathing" crack-induced acoustic nonlinearity with an application to quantitative evaluation of contact cracks.

    PubMed

    Wang, Kai; Liu, Menglong; Su, Zhongqing; Yuan, Shenfang; Fan, Zheng

    2018-08-01

    To characterize fatigue cracks, in the undersized stage in particular, preferably in a quantitative and precise manner, a two-dimensional (2D) analytical model is developed for interpreting the modulation mechanism of a "breathing" crack on guided ultrasonic waves (GUWs). In conjunction with a modal decomposition method and a variational principle-based algorithm, the model is capable of analytically depicting the propagating and evanescent waves induced owing to the interaction of probing GUWs with a "breathing" crack, and further extracting linear and nonlinear wave features (e.g., reflection, transmission, mode conversion and contact acoustic nonlinearity (CAN)). With the model, a quantitative correlation between CAN embodied in acquired GUWs and crack parameters (e.g., location and severity) is obtained, whereby a set of damage indices is proposed via which the severity of the crack can be evaluated quantitatively. The evaluation, in principle, does not entail a benchmarking process against baseline signals. As validation, the results obtained from the analytical model are compared with those from finite element simulation, showing good consistency. This has demonstrated accuracy of the developed analytical model in interpreting contact crack-induced CAN, and spotlighted its application to quantitative evaluation of fatigue damage. Copyright © 2018 Elsevier B.V. All rights reserved.
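
    The paper's damage indices are not detailed in this record; a common CAN-style index in the same spirit extracts the fundamental and second-harmonic amplitudes of the received guided wave and forms the relative nonlinearity parameter A2/A1^2 (signal synthetic):

      import numpy as np

      fs, f0 = 10e6, 200e3                 # sampling rate (Hz), excitation frequency (Hz)
      t = np.arange(0, 200e-6, 1 / fs)

      # Synthetic received GUW: fundamental plus a weak crack-induced second harmonic.
      sig = 1.0 * np.sin(2 * np.pi * f0 * t) + 0.03 * np.sin(2 * np.pi * 2 * f0 * t)

      spec = np.abs(np.fft.rfft(sig * np.hanning(len(sig)))) / len(sig)
      freqs = np.fft.rfftfreq(len(sig), d=1 / fs)

      a1 = spec[np.argmin(np.abs(freqs - f0))]        # fundamental amplitude
      a2 = spec[np.argmin(np.abs(freqs - 2 * f0))]    # second-harmonic amplitude
      print(f"relative nonlinearity index A2/A1^2 = {a2 / a1**2:.4f}")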

  4. An Evaluation of the Private High School Curriculum in Turkey

    ERIC Educational Resources Information Center

    Aslan, Dolgun

    2016-01-01

    This study aims at evaluating the curricula of private high schools in line with the opinions of teachers working at the related high schools, and at identifying any related problems. A screening model is used as the quantitative research method in the study. The "element-based curriculum evaluation model" is taken as the basis for evaluation of the…

  5. Performance Evaluation of the Bioneer AccuPower® HIV-1 Quantitative RT-PCR kit: Comparison with the Roche COBAS® AmpliPrep/COBAS TaqMan® HIV-1 Test Ver.2.0 for Quantification of HIV-1 Viral Load in Indonesia.

    PubMed

    Kosasih, Agus Susanto; Sugiarto, Christine; Hayuanta, Hubertus Hosti; Juhaendi, Runingsih; Setiawan, Lyana

    2017-08-08

    Measurement of viral load in human immunodeficiency virus type 1 (HIV-1) infected patients is essential for the establishment of a therapeutic strategy. Several assays based on qPCR are available for the measurement of viral load; they differ in sample volume, technology applied, target gene, sensitivity and dynamic range. The Bioneer AccuPower® HIV-1 Quantitative RT-PCR is a novel commercial kit that has not been evaluated for its performance. This study aimed to evaluate the performance of the Bioneer AccuPower® HIV-1 Quantitative RT-PCR kit. In total, 288 EDTA plasma samples from the Dharmais Cancer Hospital were analyzed with the Bioneer AccuPower® HIV-1 Quantitative RT-PCR kit and the Roche COBAS® AmpliPrep/COBAS® TaqMan® HIV-1 version 2.0 (CAP/CTM v2.0). The performance of the Bioneer assay was then evaluated against the Roche CAP/CTM v2.0. Overall, there was good agreement between the two assays. The Bioneer assay showed significant linear correlation with CAP/CTM v2.0 (R2=0.963, p<0.001) for all samples (N=118) which were quantified by both assays, with high agreement (94.9%, 112/118) according to the Bland-Altman model. The mean difference between the quantitative values measured by the Bioneer assay and CAP/CTM v2.0 was 0.11 Log10 IU/mL (SD=0.26). Based on these results, the Bioneer assay can be used to quantify HIV-1 RNA in clinical laboratories.
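
    A minimal sketch of the Bland-Altman agreement analysis described above (paired log10 viral loads invented): the mean difference (bias) and 95% limits of agreement between the two assays:

      import numpy as np

      # Hypothetical paired log10 HIV-1 viral loads (IU/mL) from the two assays.
      bioneer = np.array([2.3, 3.1, 4.0, 4.8, 5.5, 6.1])
      roche   = np.array([2.2, 3.0, 3.8, 4.7, 5.5, 6.0])

      diff = bioneer - roche
      mean_diff, sd = diff.mean(), diff.std(ddof=1)
      loa = (mean_diff - 1.96 * sd, mean_diff + 1.96 * sd)
      print(f"bias = {mean_diff:+.2f} log10 IU/mL, 95% LoA = [{loa[0]:.2f}, {loa[1]:.2f}]")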

  6. Three-phase bone scintigraphy for diagnosis of Charcot neuropathic osteoarthropathy in the diabetic foot - does quantitative data improve diagnostic value?

    PubMed

    Fosbøl, M; Reving, S; Petersen, E H; Rossing, P; Lajer, M; Zerahn, B

    2017-01-01

    To investigate whether the inclusion of quantitative data on blood flow distribution, compared with visual qualitative evaluation, improves the reliability and diagnostic performance of 99mTc-hydroxymethylene diphosphate three-phase bone scintigraphy (TPBS) in patients suspected of Charcot neuropathic osteoarthropathy (CNO) of the foot. A retrospective cohort study of TPBS performed on 148 patients with suspected acute CNO referred from a single specialized diabetes care centre. The quantitative blood flow distribution was calculated based on the method described by Deutsch et al. All scintigraphies were re-evaluated by independent, blinded observers twice, with and without quantitative data on blood flow distribution at ankle and focus level, respectively. The diagnostic validity of TPBS was determined by subsequent review of clinical data and radiological examinations. A total of 90 patients (61%) had a confirmed diagnosis of CNO. The sensitivity, specificity and accuracy of three-phase bone scintigraphy without/with quantitative data were 89%/88%, 58%/62% and 77%/78%, respectively. The intra-observer agreement improved significantly upon adding quantitative data to the evaluation (kappa value 0.79/0.94). The interobserver agreement was not significantly improved. Adding quantitative data on blood flow distribution to the interpretation of TPBS improves intra-observer variation, whereas no difference in interobserver variation was observed. The sensitivity of TPBS in the diagnosis of CNO is high, but specificity is limited. Diagnostic performance does not improve using quantitative data in the evaluation. This may be due to the reference intervals applied in the study or the absence of a proper gold standard diagnostic procedure for comparison. © 2015 Scandinavian Society of Clinical Physiology and Nuclear Medicine. Published by John Wiley & Sons Ltd.

  7. Comparison of task-based exposure metrics for an epidemiologic study of isocyanate inhalation exposures among autobody shop workers.

    PubMed

    Woskie, Susan R; Bello, Dhimiter; Gore, Rebecca J; Stowe, Meredith H; Eisen, Ellen A; Liu, Youcheng; Sparer, Judy A; Redlich, Carrie A; Cullen, Mark R

    2008-09-01

    Because many occupational epidemiologic studies use exposure surrogates rather than quantitative exposure metrics, the UMass Lowell and Yale study of autobody shop workers provided an opportunity to evaluate the relative utility of surrogates and quantitative exposure metrics in an exposure response analysis of cross-week change in respiratory function. A task-based exposure assessment was used to develop several metrics of inhalation exposure to isocyanates. The metrics included the surrogates (job title, counts of spray painting events during the day, and counts of spray and bystander exposure events) and a quantitative exposure metric that incorporated exposure determinant models based on task sampling and a personal workplace protection factor for respirator use, combined with a daily task checklist. The result of the quantitative exposure algorithm was an estimate of the daily time-weighted average respirator-corrected total NCO exposure (µg/m3). In general, these four metrics were found to be variable in agreement using measures such as weighted kappa and Spearman correlation. A logistic model for a 10% drop in FEV1 from Monday morning to Thursday morning was used to evaluate the utility of each exposure metric. The quantitative exposure metric was the most favorable, producing the best model fit as well as the greatest strength and magnitude of association. This finding supports the reports of others that reducing exposure misclassification can improve risk estimates that otherwise would be biased toward the null. Although detailed and quantitative exposure assessment can be more time consuming and costly, it can improve exposure-disease evaluations and is more useful for risk assessment purposes. The task-based exposure modeling method successfully produced estimates of daily time-weighted average exposures in the complex and changing autobody shop work environment. The ambient TWA exposures of all of the office workers and technicians and 57% of the painters were found to be below the current U.K. Health and Safety Executive occupational exposure limit (OEL) for total NCO of 20 µg/m3. When respirator use was incorporated, all personal daily exposures were below the U.K. OEL.
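
    A sketch of the daily exposure algorithm as described, with invented task durations, concentrations, and an assumed respirator protection factor: a task-weighted TWA in which measured concentrations are divided by the protection factor whenever a respirator was worn:

      # Hypothetical task checklist for one worker-day:
      # (task, duration in minutes, total NCO concentration in ug/m3, respirator worn?)
      tasks = [("spray painting", 45, 180.0, True),
               ("bystander",      60,  25.0, False),
               ("office work",   375,   0.5, False)]

      APF = 10  # assumed assigned protection factor for a half-face respirator

      exposed = sum(minutes * (conc / APF if resp else conc)
                    for _, minutes, conc, resp in tasks)
      twa = exposed / sum(minutes for _, minutes, _, _ in tasks)
      print(f"respirator-corrected daily TWA = {twa:.1f} ug/m3")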

  8. Relationship between Affective Learning, Instructor Attractiveness and Instructor Evaluation in Videoconference-Based Distance Education Courses

    ERIC Educational Resources Information Center

    Aydin, Irem E.

    2012-01-01

    This paper is intended to reveal the results of a study in which the relationship between learners' perceptions of affective learning, instructors' attractiveness and instructor evaluations in a videoconference based distance education course was investigated. An online survey instrument was used to collect quantitative data. A series of Pearson…

  9. FE-ANN based modeling of 3D Simple Reinforced Concrete Girders for Objective Structural Health Evaluation : Tech Transfer Summary

    DOT National Transportation Integrated Search

    2017-06-01

    The objective of this study was to develop an objective, quantitative method for evaluating damage to bridge girders by using artificial neural networks (ANNs). This evaluation method, which is a supplement to visual inspection, requires only the res...

  10. Comparative evaluation of two quantitative test methods for determining the efficacy of liquid sporicides and sterilants on a hard surface: a precollaborative study.

    PubMed

    Tomasino, Stephen F; Hamilton, Martin A

    2007-01-01

    Two quantitative carrier-based test methods for determining the efficacy of liquid sporicides and sterilants on a hard surface, the Standard Quantitative Carrier Test Method-ASTM E 2111-00 and an adaptation of a quantitative micro-method as reported by Sagripanti and Bonifacino, were compared in this study. The methods were selected based on their desirable characteristics (e.g., well-developed protocol, previous use with spores, fully quantitative, and use of readily available equipment) for testing liquid sporicides and sterilants on a hard surface. In this paper, the Sagripanti-Bonifacino procedure is referred to as the Three Step Method (TSM). AOAC Official Method 966.04 was included in this study as a reference method. Three laboratories participated in the evaluation. Three chemical treatments were tested: (1) 3000 ppm sodium hypochlorite with pH adjusted to 7.0, (2) a hydrogen peroxide/peroxyacetic acid product, and (3) 3000 ppm sodium hypochlorite with pH unadjusted (pH of approximately 10.0). A fourth treatment, 6000 ppm sodium hypochlorite solution with pH adjusted to 7.0, was included only for Method 966.04 as a positive control (high level of efficacy). The contact time was 10 min for all chemical treatments except the 6000 ppm sodium hypochlorite treatment which was tested at 30 min. Each chemical treatment was tested 3 times using each of the methods. Only 2 of the laboratories performed the AOAC method. Method performance was assessed by the within-laboratory variance, between-laboratory variance, and total variance associated with the log reduction (LR) estimates generated by each quantitative method. The quantitative methods performed similarly, and the LR values generated by each method were not statistically different for the 3 treatments evaluated. Based on feedback from the participating laboratories, compared to the TSM, ASTM E 2111-00 was more resource demanding and required more set-up time. The logistical and resource concerns identified for ASTM E 2111-00 were largely associated with the filtration process and counting bacterial colonies on filters. Thus, the TSM was determined to be the most suitable method.
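
    For reference, the log reduction (LR) statistic on which the comparison rests is simple arithmetic (counts invented):

      import math

      # Hypothetical viable spore counts (CFU per carrier).
      control_count = 1.2e6   # untreated control carriers
      treated_count = 4.0e2   # carriers after a 10-min chemical treatment

      lr = math.log10(control_count) - math.log10(treated_count)
      print(f"log reduction = {lr:.2f}")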

  11. Automated characterization of normal and pathologic lung tissue by topological texture analysis of multidetector CT

    NASA Astrophysics Data System (ADS)

    Boehm, H. F.; Fink, C.; Becker, C.; Reiser, M.

    2007-03-01

    Reliable and accurate methods for objective quantitative assessment of parenchymal alterations in the lung are necessary for diagnosis, treatment and follow-up of pulmonary diseases. Two major types of alterations are pulmonary emphysema and fibrosis, emphysema being characterized by abnormal enlargement of the air spaces distal to the terminal, nonrespiratory bronchiole, accompanied by destructive changes of the alveolar walls. The main characteristic of fibrosis is coarsening of the interstitial fibers and compaction of the pulmonary tissue. With the ability to display anatomy free from superimposing structures and with greater visual clarity, Multi-Detector-CT has been shown to be more sensitive than the chest radiograph in identifying alterations of lung parenchyma. In automated evaluation of pulmonary CT scans, quantitative image processing techniques are applied for objective evaluation of the data. A number of methods have been proposed in the past, most of which utilize simple densitometric tissue features based on the mean X-ray attenuation coefficients expressed in Hounsfield Units [HU]. Due to partial volume effects, most density-based methodologies tend to fail, particularly in cases where emphysema and fibrosis occur within narrow spatial limits. In this study, we propose a methodology based upon the topological assessment of gray-level distribution in the 3D image data of lung tissue, which provides a way of improving quantitative CT evaluation. Results are compared to the more established density-based methods.
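
    For context, the sketch below implements the kind of density-based baseline the authors compare against: an emphysema index computed as the fraction of lung voxels below a Hounsfield threshold (around -950 HU is common in the literature). The paper's topological gray-level method is not reproduced here.

```python
# Hedged sketch of a density-based baseline, not the paper's method.
import numpy as np

def emphysema_index(hu_volume: np.ndarray, lung_mask: np.ndarray,
                    threshold_hu: float = -950.0) -> float:
    """Fraction of lung voxels with attenuation below threshold_hu."""
    lung_voxels = hu_volume[lung_mask]
    return float(np.mean(lung_voxels < threshold_hu))

# toy volume: a small block of air-like voxels in soft-tissue-range values
vol = np.full((64, 64, 64), -800.0)
vol[:4, :4, :4] = -980.0
mask = np.ones_like(vol, dtype=bool)
print(f"emphysema index: {emphysema_index(vol, mask):.4f}")
```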

  12. Advanced NDE techniques for quantitative characterization of aircraft

    NASA Technical Reports Server (NTRS)

    Heyman, Joseph S.; Winfree, William P.

    1990-01-01

    Recent advances in nondestructive evaluation (NDE) at NASA Langley Research Center and their applications that have resulted in quantitative assessment of material properties based on thermal and ultrasonic measurements are reviewed. Specific applications include ultrasonic determination of bolt tension, ultrasonic and thermal characterization of bonded layered structures, characterization of composite materials, and disbonds in aircraft skins.

  13. Quantitative electrophysiological monitoring of anti-histamine drug effects on live cells via reusable sensor platforms.

    PubMed

    Pham Ba, Viet Anh; Cho, Dong-Guk; Kim, Daesan; Yoo, Haneul; Ta, Van-Thao; Hong, Seunghun

    2017-08-15

    We demonstrated the quantitative electrophysiological monitoring of histamine and anti-histamine drug effects on live cells via reusable sensor platforms based on carbon nanotube transistors. This method enabled us to monitor the real-time electrophysiological responses of a single HeLa cell to histamine at different concentrations. The measured electrophysiological responses were attributed to histamine-induced activity of histamine type 1 receptors on the HeLa cell membrane. Furthermore, the effects of anti-histamine drugs such as cetirizine or chlorphenamine on the electrophysiological activities of HeLa cells were also evaluated quantitatively. Significantly, we utilized only a single device to monitor the responses of multiple HeLa cells to each drug, which allowed us to quantitatively analyze anti-histamine drug effects on live cells without errors from device-to-device variation in device characteristics. This quantitative evaluation capability promises versatile applications such as drug screening and nanoscale biosensor research. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. An electronic portfolio for quantitative assessment of surgical skills in undergraduate medical education.

    PubMed

    Sánchez Gómez, Serafín; Ostos, Elisa María Cabot; Solano, Juan Manuel Maza; Salado, Tomás Francisco Herrero

    2013-05-06

    We evaluated a newly designed electronic portfolio (e-Portfolio) that provided quantitative evaluation of surgical skills. Medical students at the University of Seville used the e-Portfolio on a voluntary basis for evaluation of their performance in undergraduate surgical subjects. Our new web-based e-Portfolio was designed to evaluate surgical practical knowledge and skills targets. Students recorded each activity on a form, attached evidence, and added their reflections. Students self-assessed their practical knowledge using qualitative criteria (yes/no), and graded their skills according to complexity (basic/advanced) and participation (observer/assistant/independent). A numerical value was assigned to each activity, and the values of all activities were summed to obtain the total score. The application automatically displayed quantitative feedback. We performed qualitative evaluation of the perceived usefulness of the e-Portfolio and quantitative evaluation of the targets achieved. Thirty-seven of 112 students (33%) used the e-Portfolio, of whom 87% reported that they understood the methodology of the portfolio. All students reported an improved understanding of their learning objectives resulting from the numerical visualization of progress, all students reported that the quantitative feedback encouraged their learning, and 79% of students felt that their teachers were more available because they were using the e-Portfolio. Only 51.3% of students reported that the reflective aspects of learning were useful. Individual students achieved a maximum of 65% of the total targets and 87% of the skills targets. The mean total score was 345 ± 38 points. For basic skills, 92% of students achieved the maximum score for participation as an independent operator, and all achieved the maximum scores for participation as an observer and assistant. For complex skills, 62% of students achieved the maximum score for participation as an independent operator, and 98% achieved the maximum scores for participation as an observer or assistant. Medical students reported that use of an electronic portfolio that provided quantitative feedback on their progress was useful when the number and complexity of targets were appropriate, but not when the portfolio offered only formative evaluations based on reflection. Students felt that use of the e-Portfolio guided their learning process by indicating knowledge gaps to themselves and teachers.
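
    A minimal sketch of the scoring idea follows: each logged activity receives a numerical value, and the values are summed into a total score. The point weights below are invented for illustration; the abstract does not specify the actual point table.

```python
# Hedged sketch of an activity-scoring scheme; all weights are hypothetical.
from dataclasses import dataclass

COMPLEXITY_POINTS = {"basic": 1, "advanced": 2}                    # assumed
PARTICIPATION_POINTS = {"observer": 1, "assistant": 2, "independent": 3}

@dataclass
class Activity:
    complexity: str      # "basic" | "advanced"
    participation: str   # "observer" | "assistant" | "independent"

def total_score(activities: list[Activity]) -> int:
    """Sum of per-activity values, as described in the abstract."""
    return sum(COMPLEXITY_POINTS[a.complexity] * PARTICIPATION_POINTS[a.participation]
               for a in activities)

log = [Activity("basic", "independent"), Activity("advanced", "assistant")]
print(total_score(log))  # -> 7 with the assumed weights
```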

  15. Quantitative evaluation of analyte transport on microfluidic paper-based analytical devices (μPADs).

    PubMed

    Ota, Riki; Yamada, Kentaro; Suzuki, Koji; Citterio, Daniel

    2018-02-07

    The transport efficiency during capillary flow-driven sample transport on microfluidic paper-based analytical devices (μPADs) made from filter paper has been investigated for a selection of model analytes (Ni²⁺, Zn²⁺, Cu²⁺, PO₄³⁻, bovine serum albumin, sulforhodamine B, amaranth) representing metal cations, complex anions, proteins and anionic molecules. For the first time, the transport of the analytical target compounds, rather than of the sample liquid, has been quantitatively evaluated by means of colorimetry and absorption spectrometry-based methods. The experiments have revealed that small paperfluidic channel dimensions, additional user operation steps (e.g. control of sample volume, sample dilution, washing step) as well as the introduction of sample liquid wicking areas make it possible to increase analyte transport efficiency. It is also shown that the interaction of analytes with the negatively charged cellulosic paper substrate surface is strongly influenced by the physico-chemical properties of the model analyte and can in some cases (Cu²⁺) result in nearly complete analyte depletion during sample transport. The quantitative information gained through these experiments is expected to contribute to the development of more sensitive μPADs.

  16. [Quality evaluation of rhubarb dispensing granules based on multi-component simultaneous quantitative analysis and bioassay].

    PubMed

    Tan, Peng; Zhang, Hai-Zhu; Zhang, Ding-Kun; Wu, Shan-Na; Niu, Ming; Wang, Jia-Bo; Xiao, Xiao-He

    2017-07-01

    This study attempts to evaluate the quality of Chinese formula granules by the combined use of multi-component simultaneous quantitative analysis and bioassay. Rhubarb dispensing granules were used as the model drug for this demonstrative study. An ultra-high performance liquid chromatography (UPLC) method was adopted for the simultaneous quantitative determination of 10 anthraquinone derivatives (such as aloe emodin-8-O-β-D-glucoside) in rhubarb dispensing granules; the purgative biopotency of different batches of rhubarb dispensing granules was determined based on a compound diphenoxylate tablets-induced mouse constipation model; the blood activating biopotency of different batches of rhubarb dispensing granules was determined based on an in vitro rat antiplatelet aggregation model; SPSS 22.0 statistical software was used for correlation analysis between the 10 anthraquinone derivatives and the purgative and blood activating biopotencies. The results of multi-component simultaneous quantitative analysis showed that there was a great difference in chemical characterization and certain differences in purgative biopotency and blood activating biopotency among 10 batches of rhubarb dispensing granules. The correlation analysis showed that the intensity of purgative biopotency was significantly correlated with the content of conjugated anthraquinone glycosides (P<0.01), and the intensity of blood activating biopotency was significantly correlated with the content of free anthraquinone (P<0.01). In summary, the combined use of multi-component simultaneous quantitative analysis and bioassay can achieve objective quantification and a more comprehensive reflection of the overall quality difference among different batches of rhubarb dispensing granules. Copyright© by the Chinese Pharmaceutical Association.

  17. Facial asymmetry quantitative evaluation in oculoauriculovertebral spectrum.

    PubMed

    Manara, Renzo; Schifano, Giovanni; Brotto, Davide; Mardari, Rodica; Ghiselli, Sara; Gerunda, Antonio; Ghirotto, Cristina; Fusetti, Stefano; Piacentile, Katherine; Scienza, Renato; Ermani, Mario; Martini, Alessandro

    2016-03-01

    Facial asymmetries in oculoauriculovertebral spectrum (OAVS) patients might require surgical corrections that are mostly based on a qualitative approach and the surgeon's experience. The present study aimed to develop a quantitative 3D CT imaging-based procedure suitable for maxillo-facial surgery planning in OAVS patients. Thirteen OAVS patients (mean age 3.5 ± 4.0 years; range 0.2-14.2, 6 females) and 13 controls (mean age 7.1 ± 5.3 years; range 0.6-15.7, 5 females) who underwent head CT examination were retrospectively enrolled. Eight bilateral anatomical facial landmarks were defined on 3D CT images (porion, orbitale, most anterior point of frontozygomatic suture, most superior point of temporozygomatic suture, most posterior-lateral point of the maxilla, gonion, condylion, mental foramen), and distance from orthogonal planes (in millimeters) was used to evaluate the asymmetry on each axis and to calculate a global asymmetry index of each anatomical landmark. Mean asymmetry values and relative confidence intervals were obtained from the control group. OAVS patients showed 2.5 ± 1.8 landmarks above the confidence interval when considering the global asymmetry values; 12 patients (92%) showed at least one pathologically asymmetric landmark. Considering each axis, the mean number of pathologically asymmetric landmarks increased to 5.5 ± 2.6 (p = 0.002), and all patients presented at least one significant landmark asymmetry. Modern CT-based 3D reconstructions allow accurate assessment of facial bone asymmetries in patients affected by OAVS. The evaluation as a global score and along different orthogonal axes provides precise quantitative data suitable for maxillo-facial surgical planning. CT-based 3D reconstruction might thus allow a quantitative approach for planning and following up maxillo-facial surgery in OAVS patients.
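
    The global asymmetry index described above can be illustrated as follows: per-axis left-right differences of landmark distances from the orthogonal reference planes, combined in quadrature. This follows the general idea in the abstract; the authors' exact formula may differ.

```python
# Hedged sketch of a per-landmark global asymmetry index.
import numpy as np

def global_asymmetry_index(left_xyz: np.ndarray, right_xyz: np.ndarray) -> float:
    """left_xyz, right_xyz: (3,) distances (mm) of a paired landmark
    from the three orthogonal reference planes."""
    per_axis = np.abs(left_xyz - right_xyz)       # asymmetry on each axis (mm)
    return float(np.sqrt(np.sum(per_axis ** 2)))  # combined index (mm)

left = np.array([42.0, 18.5, 30.2])   # hypothetical gonion, left side
right = np.array([44.5, 17.9, 29.0])  # hypothetical gonion, right side
print(f"GAI = {global_asymmetry_index(left, right):.2f} mm")
```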

  18. Are we missing the boat? Current uses of long-term biological monitoring data in the evaluation and management of marine protected areas.

    PubMed

    Addison, P F E; Flander, L B; Cook, C N

    2015-02-01

    Protected area management agencies are increasingly using management effectiveness evaluation (MEE) to better understand, learn from and improve conservation efforts around the globe. Outcome assessment is the final stage of MEE, where conservation outcomes are measured to determine whether management objectives are being achieved. When quantitative monitoring data are available, best-practice examples of outcome assessments demonstrate that data should be assessed against quantitative condition categories. Such assessments enable more transparent and repeatable integration of monitoring data into MEE, which can promote evidence-based management and improve public accountability and reporting. We interviewed key informants from marine protected area (MPA) management agencies to investigate how scientific data sources, especially long-term biological monitoring data, are currently informing conservation management. Our study revealed that even when long-term monitoring results are available, management agencies are not using them for quantitative condition assessment in MEE. Instead, many agencies conduct qualitative condition assessments, where monitoring results are interpreted using expert judgment only. Whilst we found substantial evidence for the use of long-term monitoring data in the evidence-based management of MPAs, MEE is rarely the sole mechanism that facilitates the knowledge transfer of scientific evidence to management action. This suggests that the first goal of MEE (to enable environmental accountability and reporting) is being achieved, but the second and arguably more important goal of facilitating evidence-based management is not. Given that many MEE approaches are in their infancy, recommendations are made to assist management agencies in realizing the full potential of long-term quantitative monitoring data for protected area evaluation and evidence-based management. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. Forage resource evaluation system for habitat—deer: an interactive deer habitat model

    Treesearch

    Thomas A. Hanley; Donald E. Spalinger; Kenrick J. Mock; Oran L. Weaver; Grant M. Harris

    2012-01-01

    We describe a food-based system for quantitatively evaluating habitat quality for deer called the Forage Resource Evaluation System for Habitat and provide its rationale and suggestions for use. The system was developed as a tool for wildlife biologists and other natural resource managers and planners interested in evaluating habitat quality and, especially, comparing...

  20. Method and platform standardization in MRM-based quantitative plasma proteomics.

    PubMed

    Percy, Andrew J; Chambers, Andrew G; Yang, Juncong; Jackson, Angela M; Domanski, Dominik; Burkhart, Julia; Sickmann, Albert; Borchers, Christoph H

    2013-12-16

    There exists a growing demand in the proteomics community to standardize experimental methods and liquid chromatography-mass spectrometry (LC/MS) platforms in order to enable the acquisition of more precise and accurate quantitative data. This necessity is heightened by the evolving trend of verifying and validating candidate disease biomarkers in complex biofluids, such as blood plasma, through targeted multiple reaction monitoring (MRM)-based approaches with stable isotope-labeled standards (SIS). Considering the lack of performance standards for quantitative plasma proteomics, we previously developed two reference kits to evaluate the MRM with SIS peptide approach using undepleted and non-enriched human plasma. The first kit tests the effectiveness of the LC/MRM-MS platform (kit #1), while the second evaluates the performance of an entire analytical workflow (kit #2). Here, these kits have been refined for practical use and then evaluated through intra- and inter-laboratory testing on 6 common LC/MS platforms. For an identical panel of 22 plasma proteins, similar concentrations were determined, regardless of the kit, instrument platform, and laboratory of analysis. These results demonstrate the value of the kit and reinforce the utility of standardized methods and protocols. The proteomics community needs standardized experimental protocols and quality control methods in order to improve the reproducibility of MS-based quantitative data. This need is heightened by the evolving trend for MRM-based validation of proposed disease biomarkers in complex biofluids such as blood plasma. We have developed two kits to assist in the inter- and intra-laboratory quality control of MRM experiments: the first kit tests the effectiveness of the LC/MRM-MS platform (kit #1), while the second evaluates the performance of an entire analytical workflow (kit #2). In this paper, we report the use of these kits in intra- and inter-laboratory testing on 6 common LC/MS platforms. This article is part of a Special Issue entitled: Standardization and Quality Control in Proteomics. © 2013.

  1. Linear Quantitative Profiling Method Fast Monitors Alkaloids of Sophora Flavescens That Was Verified by Tri-Marker Analyses.

    PubMed

    Hou, Zhifei; Sun, Guoxiang; Guo, Yong

    2016-01-01

    The present study demonstrated the use of the Linear Quantitative Profiling Method (LQPM) to evaluate the quality of Alkaloids of Sophora flavescens (ASF) based on chromatographic fingerprints in an accurate, economical and fast way. Both linear qualitative and quantitative similarities were calculated in order to monitor the consistency of the samples. The results indicate that the linear qualitative similarity (LQLS) is not sufficiently discriminating due to the predominant presence of three alkaloid compounds (matrine, sophoridine and oxymatrine) in the test samples; however, the linear quantitative similarity (LQTS) was shown to clearly discriminate the samples based on differences in the quantitative content of all the chemical components. In addition, the fingerprint analysis was also supported by quantitative analysis of three marker compounds. The LQTS was found to be highly correlated to the contents of the marker compounds, indicating that quantitative analysis of the marker compounds may be substituted with the LQPM based on the chromatographic fingerprints for the purpose of quantifying all chemicals of a complex sample system. Furthermore, once a reference fingerprint (RFP) has been developed from a standard preparation and the composition similarities have been calculated, the LQPM can employ a classical mathematical model to effectively quantify the multiple components of ASF samples without any chemical standards.
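
    For illustration only, the sketch below computes a generic shape (qualitative) similarity and a content-weighted (quantitative) similarity between a sample fingerprint and a reference fingerprint. These are stand-ins in the spirit of LQPM, not the authors' exact LQLS/LQTS definitions.

```python
# Hedged sketch: generic fingerprint similarity measures, not LQPM itself.
import numpy as np

def qualitative_similarity(sample: np.ndarray, rfp: np.ndarray) -> float:
    """Cosine similarity of peak-area vectors (shape-only comparison)."""
    return float(sample @ rfp / (np.linalg.norm(sample) * np.linalg.norm(rfp)))

def quantitative_similarity(sample: np.ndarray, rfp: np.ndarray) -> float:
    """Shape similarity weighted by relative total content."""
    return qualitative_similarity(sample, rfp) * sample.sum() / rfp.sum()

rfp = np.array([10.0, 55.0, 30.0, 5.0])   # reference fingerprint peak areas
batch = np.array([9.0, 50.0, 26.0, 4.0])  # a hypothetical test batch
print(qualitative_similarity(batch, rfp), quantitative_similarity(batch, rfp))
```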

  2. Benchmarking quantitative label-free LC-MS data processing workflows using a complex spiked proteomic standard dataset.

    PubMed

    Ramus, Claire; Hovasse, Agnès; Marcellin, Marlène; Hesse, Anne-Marie; Mouton-Barbosa, Emmanuelle; Bouyssié, David; Vaca, Sebastian; Carapito, Christine; Chaoui, Karima; Bruley, Christophe; Garin, Jérôme; Cianférani, Sarah; Ferro, Myriam; Van Dorssaeler, Alain; Burlet-Schiltz, Odile; Schaeffer, Christine; Couté, Yohann; Gonzalez de Peredo, Anne

    2016-01-30

    Proteomic workflows based on nanoLC-MS/MS data-dependent-acquisition analysis have progressed tremendously in recent years. High-resolution and fast sequencing instruments have enabled the use of label-free quantitative methods, based either on spectral counting or on MS signal analysis, which appear to be an attractive way to analyze differential protein expression in complex biological samples. However, the computational processing of the data for label-free quantification still remains a challenge. Here, we used a proteomic standard composed of an equimolar mixture of 48 human proteins (Sigma UPS1) spiked at different concentrations into a background of yeast cell lysate to benchmark several label-free quantitative workflows, involving different software packages developed in recent years. This experimental design allowed us to finely assess their performance in terms of sensitivity and false discovery rate, by counting true positives and false positives (UPS1 and yeast background proteins found as differential, respectively). The spiked standard dataset has been deposited to the ProteomeXchange repository with the identifier PXD001819 and can be used to benchmark other label-free workflows, adjust software parameter settings, improve algorithms for extraction of the quantitative metrics from raw MS data, or evaluate downstream statistical methods. Bioinformatic pipelines for label-free quantitative analysis must be objectively evaluated in their ability to detect variant proteins with good sensitivity and low false discovery rate in large-scale proteomic studies. This can be done through the use of complex spiked samples, for which the "ground truth" of variant proteins is known, allowing a statistical evaluation of the performance of the data processing workflow. We provide here such a controlled standard dataset and used it to evaluate the performance of several label-free bioinformatics tools (including MaxQuant, Skyline, MFPaQ, IRMa-hEIDI and Scaffold) in different workflows, for detection of variant proteins with different absolute expression levels and fold change values. The dataset presented here can be useful for tuning software tool parameters, and also for testing new algorithms for label-free quantitative analysis, or for evaluation of downstream statistical methods. Copyright © 2015 Elsevier B.V. All rights reserved.
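
    Scoring a workflow against such a spiked ground truth reduces to counting UPS1 proteins (true positives) and yeast proteins (false positives) among the differential calls. A minimal sketch, with a made-up input format; the PXD001819 data need their own parsing:

```python
# Hedged sketch: sensitivity and FDR against a spiked "ground truth".
import pandas as pd

results = pd.DataFrame({
    "protein":      ["P01_UPS", "P02_UPS", "YEL034W", "P03_UPS", "YGR192C"],
    "is_ups1":      [True, True, False, True, False],   # ground truth
    "differential": [True, True, True, False, False],   # workflow's calls
})

called = results[results["differential"]]
tp = int(called["is_ups1"].sum())        # UPS1 proteins called differential
fp = int((~called["is_ups1"]).sum())     # yeast proteins called differential
sensitivity = tp / int(results["is_ups1"].sum())
fdr = fp / max(len(called), 1)
print(f"sensitivity={sensitivity:.2f}  FDR={fdr:.2f}")
```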

  3. Quantitative evaluation of lipid concentration in atherosclerotic plaque phantom by near-infrared multispectral angioscope at wavelengths around 1200 nm

    NASA Astrophysics Data System (ADS)

    Matsui, Daichi; Ishii, Katsunori; Awazu, Kunio

    2015-07-01

    Atherosclerosis is a primary cause of critical ischemic diseases like heart infarction or stroke. A method that can provide detailed information about the stability of atherosclerotic plaques is required. We focused on spectroscopic techniques that can evaluate the chemical composition of lipid in plaques. A novel angioscope using multispectral imaging at wavelengths around 1200 nm for quantitative evaluation of atherosclerotic plaques was developed. The angioscope consists of a halogen lamp, an indium gallium arsenide (InGaAs) camera, 3 optical band-pass filters transmitting wavelengths of 1150, 1200, and 1300 nm, an image fiber with a 0.7 mm outer diameter, and an irradiation bundle consisting of 7 multimode fibers. Atherosclerotic plaque phantoms with 100, 60, and 20 vol.% lipid were prepared and measured by the multispectral angioscope. The acquired datasets were processed by the spectral angle mapper (SAM) method. As a result, simulated plaque areas in atherosclerotic plaque phantoms that could not be detected in a visible-light angioscopic image were clearly enhanced. In addition, quantitative evaluation of atherosclerotic plaque phantoms based on the lipid volume fractions was performed down to 20 vol.%. These results show the potential of a multispectral angioscope at wavelengths around 1200 nm for quantitative evaluation of the stability of atherosclerotic plaques.
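
    The SAM step lends itself to a short sketch: each pixel's three-band spectrum is compared with a reference lipid spectrum via the angle between the two vectors, and pixels below an angle threshold are flagged as lipid-rich. The reference values and threshold below are invented for illustration.

```python
# Hedged sketch of spectral angle mapper (SAM) classification on a
# three-band (1150/1200/1300 nm) image cube.
import numpy as np

def spectral_angle(pixel: np.ndarray, reference: np.ndarray) -> float:
    """Angle (radians) between pixel and reference spectra."""
    cos = pixel @ reference / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

lipid_ref = np.array([0.80, 0.35, 0.70])   # hypothetical lipid reflectances
cube = np.random.default_rng(1).uniform(0.2, 0.9, size=(4, 4, 3))

angles = np.apply_along_axis(spectral_angle, 2, cube, lipid_ref)
lipid_mask = angles < 0.15                 # angle threshold is a free choice
print(angles.round(2), lipid_mask.sum(), "pixels flagged as lipid-rich")
```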

  4. Quantitative evaluation of translational medicine based on scientometric analysis and information extraction.

    PubMed

    Zhang, Yin; Diao, Tianxi; Wang, Lei

    2014-12-01

    Designed to advance the two-way translational process between basic research and clinical practice, translational medicine has become one of the most important areas in biomedicine. The quantitative evaluation of translational medicine is valuable for decision making in global translational medical research and funding. Using scientometric analysis and information extraction techniques, this study quantitatively analyzed the scientific articles on translational medicine. The results showed that translational medicine had significant scientific output and impact, a specific core field and institute, and outstanding academic status and benefit. Although patent data were not considered in this study, they are another important indicator that should be integrated into related research in the future. © 2014 Wiley Periodicals, Inc.

  5. Critiquing qualitative research.

    PubMed

    Beck, Cheryl Tatano

    2009-10-01

    The ability to critique research is a valuable skill that is fundamental to a perioperative nurse's ability to base his or her clinical practice on evidence derived from research. Criteria differ for critiquing a quantitative versus a qualitative study (ie, statistics are evaluated in a quantitative study, but not in a qualitative study). This article provides guidelines for assessing qualitative research. Excerpts from a published qualitative research report are summarized and then critiqued. Questions are provided that help evaluate different sections of a research study (eg, sample, data collection methods, data analysis).

  6. To label or not to label: applications of quantitative proteomics in neuroscience research.

    PubMed

    Filiou, Michaela D; Martins-de-Souza, Daniel; Guest, Paul C; Bahn, Sabine; Turck, Christoph W

    2012-02-01

    Proteomics has provided researchers with a sophisticated toolbox of labeling-based and label-free quantitative methods. These are now being applied in neuroscience research where they have already contributed to the elucidation of fundamental mechanisms and the discovery of candidate biomarkers. In this review, we evaluate and compare labeling-based and label-free quantitative proteomic techniques for applications in neuroscience research. We discuss the considerations required for the analysis of brain and central nervous system specimens, the experimental design of quantitative proteomic workflows as well as the feasibility, advantages, and disadvantages of the available techniques for neuroscience-oriented questions. Furthermore, we assess the use of labeled standards as internal controls for comparative studies in humans and review applications of labeling-based and label-free mass spectrometry approaches in relevant model organisms and human subjects. Providing a comprehensive guide of feasible and meaningful quantitative proteomic methodologies for neuroscience research is crucial not only for overcoming current limitations but also for gaining useful insights into brain function and translating proteomics from bench to bedside. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Chemical purity using quantitative 1H-nuclear magnetic resonance: a hierarchical Bayesian approach for traceable calibrations

    NASA Astrophysics Data System (ADS)

    Toman, Blaza; Nelson, Michael A.; Lippa, Katrice A.

    2016-10-01

    Chemical purity assessment using quantitative 1H-nuclear magnetic resonance spectroscopy is a method based on ratio references of mass and signal intensity of the analyte species to that of chemical standards of known purity. As such, it is an example of a calculation using a known measurement equation with multiple inputs. Though multiple samples are often analyzed during purity evaluations in order to assess measurement repeatability, the uncertainty evaluation must also account for contributions from inputs to the measurement equation. Furthermore, there may be other uncertainty components inherent in the experimental design, such as independent implementation of multiple calibration standards. As such, the uncertainty evaluation is not purely bottom up (based on the measurement equation) or top down (based on the experimental design), but inherently contains elements of both. This hybrid form of uncertainty analysis is readily implemented with Bayesian statistical analysis. In this article we describe this type of analysis in detail and illustrate it using data from an evaluation of chemical purity and its uncertainty for a folic acid material.
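
    For reference, the measurement equation in question is commonly written in the qNMR literature as follows (quoted in its generic form; the paper's notation and exact input set may differ):

```latex
% Generic qHNMR purity equation: P = purity (mass fraction), I = integrated
% signal area, N = number of nuclei contributing to the signal, M = molar
% mass, m = weighed mass; subscript A = analyte, S = calibration standard.
P_A \;=\; P_S \cdot \frac{I_A}{I_S} \cdot \frac{N_S}{N_A}
      \cdot \frac{M_A}{M_S} \cdot \frac{m_S}{m_A}
% Each input carries its own uncertainty, which is why the evaluation mixes
% bottom-up (equation-based) and top-down (design-based) components.
```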

  8. Quantitative evaluation method of the threshold adjustment and the flat field correction performances of hybrid photon counting pixel detectors

    NASA Astrophysics Data System (ADS)

    Medjoubi, K.; Dawiec, A.

    2017-12-01

    A simple method is proposed in this work for quantitative evaluation of the quality of the threshold adjustment and the flat-field correction of Hybrid Photon Counting pixel (HPC) detectors. This approach is based on the Photon Transfer Curve (PTC), corresponding to the measurement of the standard deviation of the signal in flat-field images. Fixed pattern noise (FPN), easily identifiable in the curve, is linked to the residual threshold dispersion, sensor inhomogeneity and the remnant errors in flat-fielding techniques. The analytical expression of the signal-to-noise ratio curve is developed for HPC detectors and successfully used as a fit function applied to experimental data obtained with the XPAD detector. The quantitative evaluation of the FPN, described by the photon response non-uniformity (PRNU), is carried out for different configurations (threshold adjustment method and flat-fielding technique) and is shown to be useful for identifying the settings that yield the best image quality from a commercial or R&D detector.
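
    A commonly assumed PTC noise model for a counting detector makes the FPN term explicit (this is the generic literature form, not necessarily the paper's exact parameterization):

```latex
% Assumed PTC model for a photon counting pixel: Poisson shot noise
% (variance = mean counts S) plus signal-proportional fixed pattern noise.
\sigma_{\mathrm{tot}}^{2}(S) \;=\; \underbrace{S}_{\text{shot}}
  \;+\; \underbrace{(\mathrm{PRNU}\cdot S)^{2}}_{\text{FPN}}
\qquad\Longrightarrow\qquad
\mathrm{SNR}(S) \;=\; \frac{S}{\sqrt{S+(\mathrm{PRNU}\cdot S)^{2}}}
% At high signal the SNR saturates at 1/PRNU, which is what makes the
% residual non-uniformity directly readable from the curve.
```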

  9. Quantitative evaluation of dual-flip-angle T1 mapping on DCE-MRI kinetic parameter estimation in head and neck

    PubMed Central

    Chow, Steven Kwok Keung; Yeung, David Ka Wai; Ahuja, Anil T; King, Ann D

    2012-01-01

    Purpose To quantitatively evaluate the kinetic parameter estimation for head and neck (HN) dynamic contrast-enhanced (DCE) MRI with dual-flip-angle (DFA) T1 mapping. Materials and methods Clinical DCE-MRI datasets of 23 patients with HN tumors were included in this study. T1 maps were generated based on the multiple-flip-angle (MFA) method and different DFA combinations. Tofts model parameter maps of kep, Ktrans and vp based on MFA and DFAs were calculated and compared. Fitted parameters from MFA and DFAs were quantitatively evaluated in primary tumor, salivary gland and muscle. Results T1 mapping deviations by DFAs produced considerable kinetic parameter estimation deviations in head and neck tissues. In particular, the DFA of [2°, 7°] overestimated, while [7°, 12°] and [7°, 15°] underestimated, Ktrans and vp significantly (P<0.01). [2°, 15°] achieved the smallest, but still statistically significant, overestimation of Ktrans and vp in primary tumors, 32.1% and 16.2% respectively. kep fitting results by DFAs were relatively close to the MFA reference compared to Ktrans and vp. Conclusions T1 deviations induced by DFA could result in significant errors in kinetic parameter estimation, particularly of Ktrans and vp, through Tofts model fitting. The MFA method should be more reliable and robust for accurate quantitative pharmacokinetic analysis in head and neck. PMID:23289084
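
    The flip-angle methods rest on the standard spoiled gradient-echo signal model, which for two or more flip angles can be linearized to estimate T1:

```latex
% Spoiled gradient-echo signal as a function of flip angle \alpha
% (standard model underlying MFA and DFA T1 mapping):
S(\alpha) \;=\; M_0 \sin\alpha \,\frac{1-E_1}{1-E_1\cos\alpha},
\qquad E_1 = e^{-\mathrm{TR}/T_1}
% Linearized form: plotting S/\sin\alpha against S/\tan\alpha across the
% flip angles gives a line with slope E_1, from which T1 follows. With only
% two angles (DFA) the fit is exactly determined, hence more sensitive to
% noise and flip-angle errors than the MFA fit.
\frac{S_i}{\sin\alpha_i} \;=\; E_1\,\frac{S_i}{\tan\alpha_i} \;+\; M_0(1-E_1)
```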

  10. Student evaluations of teaching: teaching quantitative courses can be hazardous to one’s career

    PubMed Central

    Uttl, Bob; Smibert, Dylan

    2017-01-01

    Anonymous student evaluations of teaching (SETs) are used by colleges and universities to measure teaching effectiveness and to make decisions about faculty hiring, firing, re-appointment, promotion, tenure, and merit pay. Although numerous studies have found that SETs correlate with various teaching effectiveness irrelevant factors (TEIFs) such as subject, class size, and grading standards, it has been argued that such correlations are small and do not undermine the validity of SETs as measures of professors’ teaching effectiveness. However, previous research has generally used inappropriate parametric statistics and effect sizes to examine and to evaluate the significance of TEIFs on personnel decisions. Accordingly, we examined the influence of quantitative vs. non-quantitative courses on SET ratings and SET-based personnel decisions using 14,872 publicly posted class evaluations where each evaluation represents a summary of SET ratings provided by individual students responding in each class. In total, 325,538 individual student evaluations from a US mid-size university contributed to these class evaluations. The results demonstrate that class subject (math vs. English) is strongly associated with SET ratings, has a substantial impact on professors being labeled satisfactory vs. unsatisfactory and excellent vs. non-excellent, and the impact varies substantially depending on the criteria used to classify professors as satisfactory vs. unsatisfactory. Professors teaching quantitative courses are far more likely not to receive tenure, promotion, and/or merit pay when their performance is evaluated against common standards. PMID:28503380

  11. Student evaluations of teaching: teaching quantitative courses can be hazardous to one's career.

    PubMed

    Uttl, Bob; Smibert, Dylan

    2017-01-01

    Anonymous student evaluations of teaching (SETs) are used by colleges and universities to measure teaching effectiveness and to make decisions about faculty hiring, firing, re-appointment, promotion, tenure, and merit pay. Although numerous studies have found that SETs correlate with various teaching effectiveness irrelevant factors (TEIFs) such as subject, class size, and grading standards, it has been argued that such correlations are small and do not undermine the validity of SETs as measures of professors' teaching effectiveness. However, previous research has generally used inappropriate parametric statistics and effect sizes to examine and to evaluate the significance of TEIFs on personnel decisions. Accordingly, we examined the influence of quantitative vs. non-quantitative courses on SET ratings and SET-based personnel decisions using 14,872 publicly posted class evaluations where each evaluation represents a summary of SET ratings provided by individual students responding in each class. In total, 325,538 individual student evaluations from a US mid-size university contributed to these class evaluations. The results demonstrate that class subject (math vs. English) is strongly associated with SET ratings, has a substantial impact on professors being labeled satisfactory vs. unsatisfactory and excellent vs. non-excellent, and the impact varies substantially depending on the criteria used to classify professors as satisfactory vs. unsatisfactory. Professors teaching quantitative courses are far more likely not to receive tenure, promotion, and/or merit pay when their performance is evaluated against common standards.

  12. [Clinical research XXIII. From clinical judgment to meta-analyses].

    PubMed

    Rivas-Ruiz, Rodolfo; Castelán-Martínez, Osvaldo D; Pérez-Rodríguez, Marcela; Palacios-Cruz, Lino; Noyola-Castillo, Maura E; Talavera, Juan O

    2014-01-01

    Systematic reviews (SRs) are studies designed to answer clinical questions on the basis of original articles. Meta-analysis (MTA) is the mathematical analysis of SRs. These analyses are divided into two groups: those which evaluate the measured results of quantitative variables (for example, body mass index, BMI) and those which evaluate qualitative variables (for example, whether a patient is alive or dead, or improves or not). Quantitative variables are generally analyzed with the mean difference, and qualitative variables can be analyzed with several measures: odds ratio (OR), relative risk (RR), absolute risk reduction (ARR) and hazard ratio (HR). These analyses are represented through forest plots, which allow the evaluation of each individual study as well as the heterogeneity between studies and the overall effect of the intervention. These analyses are mainly based on Student's t test and chi-squared. To make appropriate decisions based on the MTA, it is important to understand the characteristics of statistical methods in order to avoid misinterpretations.
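
    For a 2x2 table of binary outcomes, the measures named above take the standard forms below (a, b: events and non-events in the treated group; c, d: events and non-events in the controls):

```latex
% Standard effect measures for a 2x2 table:
\mathrm{RR} \;=\; \frac{a/(a+b)}{c/(c+d)}, \qquad
\mathrm{OR} \;=\; \frac{a\,d}{b\,c}, \qquad
\mathrm{ARR} \;=\; \frac{c}{c+d}-\frac{a}{a+b}
% Quantitative outcomes are pooled instead as a (weighted) mean difference
% between groups.
```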

  13. Quantitative Evaluation of the Total Magnetic Moments of Colloidal Magnetic Nanoparticles: A Kinetics-based Method.

    PubMed

    Liu, Haiyi; Sun, Jianfei; Wang, Haoyao; Wang, Peng; Song, Lina; Li, Yang; Chen, Bo; Zhang, Yu; Gu, Ning

    2015-06-08

    A kinetics-based method is proposed to quantitatively characterize the collective magnetization of colloidal magnetic nanoparticles. The method is based on the relationship between the magnetic force on a colloidal droplet and the movement of the droplet under a gradient magnetic field. Through computational analysis of the kinetic parameters, such as displacement, velocity, and acceleration, the magnetization of colloidal magnetic nanoparticles can be calculated. In our experiments, the values measured by using our method exhibited a better linear correlation with magnetothermal heating than those obtained by using a vibrating sample magnetometer and magnetic balance. This finding indicates that this method may be more suitable for evaluating the collective magnetism of colloidal magnetic nanoparticles under low magnetic fields than the commonly used methods. Accurate evaluation of the magnetic properties of colloidal nanoparticles is of great importance for the standardization of magnetic nanomaterials and for their practical application in biomedicine. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
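
    In its simplest form, the idea reduces to Newton's second law: the droplet's measured acceleration in a known field gradient yields the force, hence the moment. The sketch below neglects drag and buoyancy, which a fuller kinetic analysis would account for; all numbers are hypothetical.

```python
# Hedged sketch: droplet moment from initial acceleration in a known
# field gradient, neglecting drag and buoyancy (a strong simplification).
rho = 1.05e3    # droplet density (kg/m^3), assumed
V = 2.0e-9      # droplet volume (m^3), assumed (2 uL)
a = 0.015       # measured initial acceleration (m/s^2), hypothetical
dBdz = 12.0     # field gradient (T/m), hypothetical

F = rho * V * a          # Newton's second law: net force on the droplet
m_moment = F / dBdz      # F = m * dB/dz for a point-like moment
M = m_moment / V         # volume magnetization (A/m)
print(f"moment = {m_moment:.3e} A*m^2, magnetization = {M:.2f} A/m")
```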

  14. Three-Dimensional Registration for Handheld Profiling Systems Based on Multiple Shot Structured Light

    PubMed Central

    Ayaz, Shirazi Muhammad; Kim, Min Young

    2018-01-01

    In this article, a multi-view registration approach for a 3D handheld profiling system based on the multiple-shot structured light technique is proposed. The multi-view registration approach comprises coarse registration and point cloud refinement using the iterative closest point (ICP) algorithm. Coarse registration of multiple point clouds was performed using relative orientation and translation parameters estimated via homography-based visual navigation. The proposed system was evaluated using an artificial human skull and a paper box object. For the quantitative evaluation of the accuracy of a single 3D scan, a paper box was reconstructed, and the mean errors in its height and breadth were found to be 9.4 μm and 23 μm, respectively. A comprehensive quantitative evaluation and comparison of the proposed algorithm with other variants of ICP was performed. The root mean square error of the ICP registration of a pair of point clouds of the skull object was also found to be less than 1 mm. PMID:29642552
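
    A coarse-to-fine pipeline of this kind can be sketched with the Open3D library: a navigation-derived initial pose (here a placeholder identity) followed by point-to-point ICP refinement. The file names and correspondence distance are assumptions, not values from the paper.

```python
# Hedged sketch: coarse-to-fine point cloud registration with Open3D.
import numpy as np
import open3d as o3d

source = o3d.io.read_point_cloud("scan_01.ply")   # hypothetical file names
target = o3d.io.read_point_cloud("scan_02.ply")

coarse_T = np.eye(4)  # stand-in for the navigation-derived initial pose

result = o3d.pipelines.registration.registration_icp(
    source, target,
    max_correspondence_distance=2.0,              # mm, problem-dependent
    init=coarse_T,
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint(),
)
print("fitness:", result.fitness, "inlier RMSE:", result.inlier_rmse)
source.transform(result.transformation)           # apply refined alignment
```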

  15. Lecturer's Gender and Their Valuation of Student Evaluation of Teaching

    ERIC Educational Resources Information Center

    Atek, Engku Suhaimi Engku; Salim, Hishamuddin; Halim, Zulazhan Ab.; Jusoh, Zailani; Yusuf, Mohd Ali Mohd

    2015-01-01

    Student evaluation of teaching (SET) is carried out every semester at Malaysian universities and lecturers are evaluated based on student ratings. But very little is researched about what lecturers actually think about SET and whether it serves any meaningful purpose at all. This quantitative study involving six public universities on the East…

  16. Evaluation of a Family-Centred Children's Weight Management Intervention

    ERIC Educational Resources Information Center

    Jinks, Annette; English, Sue; Coufopoulos, Anne

    2013-01-01

    Purpose: The purpose of this paper is to conduct an in-depth quantitative and qualitative evaluation of a family-based weight loss and healthy life style programme for clinically obese children in England. Design/methodology/approach: The mixed method case study evaluation used included obtaining pre and post measurements of anthropometry and a…

  17. On the Effects, Problems, and Countermeasures of Undergraduate Teaching Evaluation in Higher Education

    ERIC Educational Resources Information Center

    Xianjun, Liu; Yang, Yu; Junchao, Zhang; Shuguang, Wei; Ling, Ding

    2016-01-01

    The Undergraduate Teaching Evaluation of General Institutions of Higher Education from 2003 to 2008 was the largest-scale evaluation in Chinese higher education history. It exerted a tremendous influence as a key exploration of quality assurance with Chinese characteristics. Based on existing research, this study combines quantitative and…

  18. Effects of Teacher Evaluation on Teacher Job Satisfaction in Ohio

    ERIC Educational Resources Information Center

    Downing, Pamela R.

    2016-01-01

    The purpose of this quantitative study was to explore whether or not increased accountability measures found in the Ohio Teacher Evaluation System (OTES) impacted teacher job satisfaction. Student growth measures required by the OTES increased teacher accountability. Today, teachers are largely evaluated based on the results of what they do in the…

  19. Clinical study of quantitative diagnosis of early cervical cancer based on the classification of acetowhitening kinetics

    NASA Astrophysics Data System (ADS)

    Wu, Tao; Cheung, Tak-Hong; Yim, So-Fan; Qu, Jianan Y.

    2010-03-01

    A quantitative colposcopic imaging system for the diagnosis of early cervical cancer is evaluated in a clinical study. This imaging technology based on 3-D active stereo vision and motion tracking extracts diagnostic information from the kinetics of acetowhitening process measured from the cervix of human subjects in vivo. Acetowhitening kinetics measured from 137 cervical sites of 57 subjects are analyzed and classified using multivariate statistical algorithms. Cross-validation methods are used to evaluate the performance of the diagnostic algorithms. The results show that an algorithm for screening precancer produced 95% sensitivity (SE) and 96% specificity (SP) for discriminating normal and human papillomavirus (HPV)-infected tissues from cervical intraepithelial neoplasia (CIN) lesions. For a diagnostic algorithm, 91% SE and 90% SP are achieved for discriminating normal tissue, HPV infected tissue, and low-grade CIN lesions from high-grade CIN lesions. The results demonstrate that the quantitative colposcopic imaging system could provide objective screening and diagnostic information for early detection of cervical cancer.

  20. Laboratory Evaluations of the Enterococcus qPCR Method for Recreational Water Quality Testing: Method Performance and Sources of Uncertainty in Quantitative Measurements

    EPA Science Inventory

    The BEACH Act of 2000 directed the U.S. EPA to establish more expeditious methods for the detection of pathogen indicators in coastal waters, as well as new water quality criteria based on these methods. Progress has been made in developing a quantitative PCR (qPCR) method for en...

  1. Tendency for interlaboratory precision in the GMO analysis method based on real-time PCR.

    PubMed

    Kodama, Takashi; Kurosawa, Yasunori; Kitta, Kazumi; Naito, Shigehiro

    2010-01-01

    The Horwitz curve estimates interlaboratory precision as a function only of concentration, and is frequently used as a method performance criterion in food analysis with chemical methods. Quantitative biochemical methods based on real-time PCR require an analogous criterion to progressively promote method validation. We analyzed the tendency of precision using a simplex real-time PCR technique in 53 collaborative studies of seven genetically modified (GM) crops. The reproducibility standard deviation (S_R) and repeatability standard deviation (S_r) of the genetically modified organism (GMO) amount (%) were more or less independent of the GM crop (i.e., maize, soybean, cotton, oilseed rape, potato, sugar beet, and rice) and of the evaluation procedure steps. Some studies evaluated whole procedures consisting of DNA extraction and PCR quantitation, whereas others focused only on the PCR quantitation step by using DNA extraction solutions. Therefore, S_R and S_r for the GMO amount (%) are functions only of concentration, similar to the Horwitz curve. We proposed S_R = 0.1971C^0.8685 and S_r = 0.1478C^0.8424, where C is the GMO amount (%). We also proposed a method performance index for GMO quantitative methods that is analogous to the Horwitz ratio.
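
    The proposed power laws are easy to turn into a performance check: predict S_R at a given concentration and form a HorRat-style observed/predicted ratio. A minimal sketch using the coefficients quoted above:

```python
# Hedged sketch: predicted reproducibility SD and a HorRat-style index.
def predicted_sr(c_percent: float) -> float:
    """Reproducibility SD predicted from GMO amount C (%): 0.1971 * C**0.8685."""
    return 0.1971 * c_percent ** 0.8685

def horrat_like(observed_sr: float, c_percent: float) -> float:
    """Observed/predicted ratio; values near 1 indicate typical precision."""
    return observed_sr / predicted_sr(c_percent)

print(round(predicted_sr(1.0), 4))       # predicted S_R at 1% GMO -> 0.1971
print(round(horrat_like(0.25, 1.0), 2))  # a hypothetical observed S_R of 0.25
```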

  2. Prognostic Value of Quantitative Stress Perfusion Cardiac Magnetic Resonance.

    PubMed

    Sammut, Eva C; Villa, Adriana D M; Di Giovine, Gabriella; Dancy, Luke; Bosio, Filippo; Gibbs, Thomas; Jeyabraba, Swarna; Schwenke, Susanne; Williams, Steven E; Marber, Michael; Alfakih, Khaled; Ismail, Tevfik F; Razavi, Reza; Chiribiri, Amedeo

    2018-05-01

    This study sought to evaluate the prognostic usefulness of visual and quantitative perfusion cardiac magnetic resonance (CMR) ischemic burden in an unselected group of patients and to assess the validity of consensus-based ischemic burden thresholds extrapolated from nuclear studies. There are limited data on the prognostic value of assessing myocardial ischemic burden by CMR, and there are none using quantitative perfusion analysis. Patients with suspected coronary artery disease referred for adenosine-stress perfusion CMR were included (n = 395; 70% male; age 58 ± 13 years). The primary endpoint was a composite of cardiovascular death, nonfatal myocardial infarction, aborted sudden death, and revascularization after 90 days. Perfusion scans were assessed visually and with quantitative analysis. Cross-validated Cox regression analysis and net reclassification improvement were used to assess the incremental prognostic value of visual or quantitative perfusion analysis over a baseline clinical model, initially as continuous covariates, then using accepted thresholds of ≥2 segments or ≥10% myocardium. After a median 460 days (interquartile range: 190 to 869 days) follow-up, 52 patients reached the primary endpoint. At 2 years, the addition of ischemic burden was found to increase prognostic value over a baseline model of age, sex, and late gadolinium enhancement (baseline model area under the curve [AUC]: 0.75; visual AUC: 0.84; quantitative AUC: 0.85). Dichotomized quantitative ischemic burden performed better than visual assessment (net reclassification improvement 0.043 vs. 0.003 against baseline model). This study was the first to address the prognostic benefit of quantitative analysis of perfusion CMR and to support the use of consensus-based ischemic burden thresholds by perfusion CMR for prognostic evaluation of patients with suspected coronary artery disease. Quantitative analysis provided incremental prognostic value to visual assessment and established risk factors, potentially representing an important step forward in the translation of quantitative CMR perfusion analysis to the clinical setting. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  3. Quantitative methods to direct exploration based on hydrogeologic information

    USGS Publications Warehouse

    Graettinger, A.J.; Lee, J.; Reeves, H.W.; Dethan, D.

    2006-01-01

    Quantitatively Directed Exploration (QDE) approaches based on information such as model sensitivity, input data covariance and model output covariance are presented. Seven approaches for directing exploration are developed, applied, and evaluated on a synthetic hydrogeologic site. The QDE approaches evaluate input information uncertainty, subsurface model sensitivity and, most importantly, output covariance to identify the next location to sample. Spatial input parameter values and covariances are calculated with the multivariate conditional probability calculation from a limited number of samples. A variogram structure is used during data extrapolation to describe the spatial continuity, or correlation, of subsurface information. Model sensitivity can be determined by perturbing input data and evaluating output response or, as in this work, sensitivities can be programmed directly into an analysis model. Output covariance is calculated by the First-Order Second Moment (FOSM) method, which combines the covariance of input information with model sensitivity. A groundwater flow example, modeled in MODFLOW-2000, is chosen to demonstrate the seven QDE approaches. MODFLOW-2000 is used to obtain the piezometric head and the model sensitivity simultaneously. The seven QDE approaches are evaluated based on the accuracy of the modeled piezometric head after information from a QDE sample is added. For the synthetic site used in this study, the QDE approach that identifies the location of hydraulic conductivity that contributes the most to the overall piezometric head variance proved to be the best method to quantitatively direct exploration. © IWA Publishing 2006.
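
    The FOSM step referred to above is the standard first-order propagation of input covariance through the model sensitivities:

```latex
% First-Order Second Moment propagation: a first-order Taylor expansion of
% the model output y = f(x) about the mean input \bar{x} gives
\Sigma_y \;\approx\; J\,\Sigma_x\,J^{\mathsf{T}},
\qquad J_{ij} \;=\; \left.\frac{\partial f_i}{\partial x_j}\right|_{\bar{x}}
% \Sigma_x: covariance of the input data (e.g., hydraulic conductivity),
% J: model sensitivities (obtained here from MODFLOW-2000), \Sigma_y:
% covariance of the output (piezometric head), used to rank candidate
% sampling locations.
```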

  4. Bridging the qualitative-quantitative divide: Experiences from conducting a mixed methods evaluation in the RUCAS programme.

    PubMed

    Makrakis, Vassilios; Kostoulas-Makrakis, Nelly

    2016-02-01

    Quantitative and qualitative approaches to planning and evaluation in education for sustainable development have often been treated by practitioners from a single research paradigm. This paper discusses the utility of mixed method evaluation designs which integrate qualitative and quantitative data through a sequential transformative process. Sequential mixed method data collection strategies involve collecting data in an iterative process whereby data collected in one phase contribute to data collected in the next. This is done through examples from a programme addressing the 'Reorientation of University Curricula to Address Sustainability (RUCAS): A European Commission Tempus-funded Programme'. It is argued that the two approaches are complementary and that there are significant gains from combining both. Using methods from both research paradigms does not, however, mean that the inherent differences among epistemologies and methodologies should be neglected. Based on this experience, it is recommended that using a sequential transformative mixed method evaluation can produce more robust results than could be accomplished using a single approach in programme planning and evaluation focussed on education for sustainable development. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. Lunar-base construction equipment and methods evaluation

    NASA Technical Reports Server (NTRS)

    Boles, Walter W.; Ashley, David B.; Tucker, Richard L.

    1993-01-01

    A process for evaluating lunar-base construction equipment and methods concepts is presented. The process is driven by the need for more quantitative, systematic, and logical methods for assessing further research and development requirements in an area where uncertainties are high, dependence upon terrestrial heuristics is questionable, and quantitative methods are seldom applied. Decision theory concepts are used in determining the value of accurate information and the process is structured as a construction-equipment-and-methods selection methodology. Total construction-related, earth-launch mass is the measure of merit chosen for mathematical modeling purposes. The work is based upon the scope of the lunar base as described in the National Aeronautics and Space Administration's Office of Exploration's 'Exploration Studies Technical Report, FY 1989 Status'. Nine sets of conceptually designed construction equipment are selected as alternative concepts. It is concluded that the evaluation process is well suited for assisting in the establishment of research agendas in an approach that is first broad, with a low level of detail, followed by more-detailed investigations into areas that are identified as critical due to high degrees of uncertainty and sensitivity.

  6. Evaluation of a hydrophilic interaction liquid chromatography design space for sugars and sugar alcohols.

    PubMed

    Hetrick, Evan M; Kramer, Timothy T; Risley, Donald S

    2017-03-17

    Based on a column-screening exercise, a column ranking system was developed for sample mixtures containing any combination of 26 sugar and sugar alcohol analytes using 16 polar stationary phases in the HILIC mode with acetonitrile/water or acetone/water mobile phases. Each analyte was evaluated on the HILIC columns with gradient elution and the subsequent chromatography data was compiled into a statistical software package where any subset of the analytes can be selected and the columns are then ranked by the greatest separation. Since these analytes lack chromophores, aerosol-based detectors, including an evaporative light scattering detector (ELSD) and a charged aerosol detector (CAD) were employed for qualitative and quantitative detection. Example qualitative applications are provided to illustrate the practicality and efficiency of this HILIC column ranking. Furthermore, the design-space approach was used as a starting point for a quantitative method for the trace analysis of glucose in trehalose samples in a complex matrix. Knowledge gained from evaluating the design-space led to rapid development of a capable method as demonstrated through validation of the following parameters: specificity, accuracy, precision, linearity, limit of quantitation, limit of detection, and range. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Performance analysis of a film dosimetric quality assurance procedure for IMRT with regard to the employment of quantitative evaluation methods.

    PubMed

    Winkler, Peter; Zurl, Brigitte; Guss, Helmuth; Kindl, Peter; Stuecklschweiger, Georg

    2005-02-21

    A system for dosimetric verification of intensity-modulated radiotherapy (IMRT) treatment plans using absolute calibrated radiographic films is presented. At our institution this verification procedure is performed for all IMRT treatment plans prior to patient irradiation. Therefore clinical treatment plans are transferred to a phantom and recalculated. Composite treatment plans are irradiated to a single film. Film density to absolute dose conversion is performed automatically based on a single calibration film. A software application encompassing film calibration, 2D registration of measurement and calculated distributions, image fusion, and a number of visual and quantitative evaluation utilities was developed. The main topic of this paper is a performance analysis for this quality assurance procedure, with regard to the specification of tolerance levels for quantitative evaluations. Spatial and dosimetric precision and accuracy were determined for the entire procedure, comprising all possible sources of error. The overall dosimetric and spatial measurement uncertainties obtained thereby were 1.9% and 0.8 mm respectively. Based on these results, we specified 5% dose difference and 3 mm distance-to-agreement as our tolerance levels for patient-specific quality assurance for IMRT treatments.
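
    The 5% dose-difference and 3 mm distance-to-agreement tolerances are commonly combined into a gamma index. The sketch below is a brute-force 1D illustration of that metric on synthetic profiles, not the authors' evaluation software.

```python
# Hedged sketch: 1D gamma evaluation with 5%/3 mm tolerances.
import numpy as np

def gamma_1d(ref_dose, ref_x, eval_dose, eval_x, dd=0.05, dta_mm=3.0):
    """Per-point gamma of an evaluated profile against a reference.
    dd: dose tolerance as a fraction of the reference maximum;
    dta_mm: distance-to-agreement tolerance (mm). gamma <= 1 passes."""
    dmax = ref_dose.max()
    gammas = np.empty_like(eval_dose)
    for i in range(eval_dose.size):
        dist2 = ((eval_x[i] - ref_x) / dta_mm) ** 2
        dose2 = ((eval_dose[i] - ref_dose) / (dd * dmax)) ** 2
        gammas[i] = np.sqrt(np.min(dist2 + dose2))
    return gammas

x = np.linspace(0.0, 50.0, 101)                   # positions in mm
calc = np.exp(-((x - 25.0) / 10.0) ** 2)          # calculated dose profile
film = 1.02 * np.exp(-((x - 25.5) / 10.0) ** 2)   # film profile, small shift
g = gamma_1d(calc, x, film, x)
print(f"gamma pass rate: {100.0 * np.mean(g <= 1.0):.1f}%")
```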

  8. Quantitative non-destructive evaluation of composite materials based on ultrasonic wave propagation

    NASA Technical Reports Server (NTRS)

    Miller, J. G.

    1986-01-01

    The application and interpretation of specific ultrasonic nondestructive evaluation techniques are studied. The Kramers-Kronig or generalized dispersion relationships are applied to nondestructive techniques. Progress was made on an improved determination of material properties of composites inferred from elastic constant measurements.
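
    One acoustic Kramers-Kronig relation frequently used in this context is the nearly local O'Donnell-Jaynes-Miller form, quoted here as an assumed standard result rather than from this report:

```latex
% Nearly local acoustic Kramers-Kronig relation linking phase-velocity
% dispersion to the attenuation coefficient \alpha(\omega):
c(\omega_2)-c(\omega_1) \;\approx\; \frac{2\,c_0^{2}}{\pi}
\int_{\omega_1}^{\omega_2}\frac{\alpha(\omega')}{\omega'^{2}}\,d\omega'
% Measured attenuation spectra thus constrain the velocity dispersion that
% quantitative ultrasonic NDE must account for.
```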

  9. A Novel Pretreatment-Free Duplex Chamber Digital PCR Detection System for the Absolute Quantitation of GMO Samples.

    PubMed

    Zhu, Pengyu; Wang, Chenguang; Huang, Kunlun; Luo, Yunbo; Xu, Wentao

    2016-03-18

    Digital polymerase chain reaction (PCR) has developed rapidly since it was first reported in the 1990s. However, pretreatments are often required during preparation for digital PCR, which can increase operational error. The single-plex amplification of both the target and reference genes may cause uncertainties due to the different reaction volumes and the matrix effect. In the current study, a quantitative detection system based on pretreatment-free duplex chamber digital PCR was developed. The dynamic range, limit of quantitation (LOQ), sensitivity and specificity were evaluated using the GA21 event as the test case. Moreover, to determine the factors that may influence the stability of the duplex system, we evaluated whether the pretreatments, the primary and secondary structures of the probes, and SNP effects influence the detection. The results showed that the LOQ was 0.5% and the sensitivity was 0.1%. We also found that genome digestion and single nucleotide polymorphism (SNP) sites affect the detection results, whereas nonspecific hybridization between different probes had little adverse effect. This indicates that the detection system is suited for both chamber-based and droplet-based digital PCR. In conclusion, we have provided a simple and flexible way of achieving absolute quantitation for genetically modified organism (GMO) genome samples using commercial digital PCR detection systems.
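
    Absolute quantitation in chamber digital PCR rests on a Poisson correction of the positive-partition fraction. A minimal sketch, with a hypothetical chamber count and partition volume rather than the specifications of the system used in the paper:

```python
# Hedged sketch: Poisson-corrected absolute quantitation in digital PCR.
import math

def copies_per_microliter(positive: int, total: int, partition_nl: float) -> float:
    """Estimate target concentration from the fraction of positive partitions."""
    p = positive / total
    lam = -math.log(1.0 - p)            # mean copies per partition (Poisson)
    return lam / (partition_nl * 1e-3)  # copies per uL of partitioned volume

# e.g. 412 positives out of 20,000 partitions of 0.85 nL each
print(f"{copies_per_microliter(412, 20000, 0.85):.1f} copies/uL")
```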

  10. A Novel Pretreatment-Free Duplex Chamber Digital PCR Detection System for the Absolute Quantitation of GMO Samples

    PubMed Central

    Zhu, Pengyu; Wang, Chenguang; Huang, Kunlun; Luo, Yunbo; Xu, Wentao

    2016-01-01

    Digital polymerase chain reaction (PCR) has developed rapidly since it was first reported in the 1990s. However, pretreatments are often required during preparation for digital PCR, which can increase operational error. The single-plex amplification of both the target and reference genes may cause uncertainties due to the different reaction volumes and the matrix effect. In the current study, a quantitative detection system based on pretreatment-free duplex chamber digital PCR was developed. The dynamic range, limit of quantitation (LOQ), sensitivity and specificity were evaluated using the GA21 event as the experimental subject. Moreover, to determine the factors that may influence the stability of the duplex system, we evaluated whether the pretreatments, the primary and secondary structures of the probes, and SNP effects influence the detection. The results showed that the LOQ was 0.5% and the sensitivity was 0.1%. We also found that genome digestion and single nucleotide polymorphism (SNP) sites affect the detection results, whereas nonspecific hybridization between the different probes had little effect. This indicates that the detection system is suited to both chamber-based and droplet-based digital PCR. In conclusion, we have provided a simple and flexible way of achieving absolute quantitation for genetically modified organism (GMO) genome samples using commercial digital PCR detection systems. PMID:26999129

  11. [Quantitative classification-based occupational health management for electroplating enterprises in Baoan District of Shenzhen, China].

    PubMed

    Zhang, Sheng; Huang, Jinsheng; Yang, Baigbing; Lin, Binjie; Xu, Xinyun; Chen, Jinru; Zhao, Zhuandi; Tu, Xiaozhi; Bin, Haihua

    2014-04-01

    To improve the occupational health management levels in electroplating enterprises with quantitative classification measures and to provide a scientific basis for the prevention and control of occupational hazards in electroplating enterprises and the protection of workers' health. A quantitative classification table was created for the occupational health management in electroplating enterprises. The evaluation indicators included 6 items and 27 sub-items, with a total score of 100 points. Forty electroplating enterprises were selected and scored according to the quantitative classification table. These electroplating enterprises were classified into grades A, B, and C based on the scores. Among 40 electroplating enterprises, 11 (27.5%) had scores of >85 points (grade A), 23 (57.5%) had scores of 60–85 points (grade B), and 6 (15.0%) had scores of <60 points (grade C). Quantitative classification management for electroplating enterprises is a valuable attempt, which is helpful for the supervision and management by the health department and provides an effective method for the self-management of enterprises.

  12. Is quantitative PCR for the pneumolysin (ply) gene useful for detection of pneumococcal lower respiratory tract infection?

    PubMed

    Abdeldaim, G; Herrmann, B; Korsgaard, J; Olcén, P; Blomberg, J; Strålin, K

    2009-06-01

    The pneumolysin (ply) gene is widely used as a target in PCR assays for Streptococcus pneumoniae in respiratory secretions. However, false-positive results with conventional ply-based PCR have been reported. The aim here was to study the performance of a quantitative ply-based PCR for the identification of pneumococcal lower respiratory tract infection (LRTI). In a prospective study, fibreoptic bronchoscopy was performed in 156 hospitalized adult patients with LRTI and 31 controls who underwent bronchoscopy because of suspicion of malignancy. Among the LRTI patients and controls, the quantitative ply-based PCR applied to bronchoalveolar lavage (BAL) fluid was positive at ≥10^3 genome copies/mL in 61% and 71% of the subjects, at ≥10^5 genome copies/mL in 40% and 58% of the subjects, and at ≥10^7 genome copies/mL in 15% and 3.2% of the subjects, respectively. Using BAL fluid culture, blood culture, and/or a urinary antigen test, S. pneumoniae was identified in 19 LRTI patients. As compared with these diagnostic methods used in combination, quantitative ply-based PCR showed sensitivities and specificities of 89% and 43% at a cut-off of 10^3 genome copies/mL, of 84% and 66% at a cut-off of 10^5 genome copies/mL, and of 53% and 90% at a cut-off of 10^7 genome copies/mL, respectively. In conclusion, a high cut-off with the quantitative ply-based PCR was required to reach acceptable specificity. However, as a high cut-off resulted in low sensitivity, quantitative ply-based PCR does not appear to be clinically useful. Quantitative PCR methods for S. pneumoniae using alternative gene targets should be evaluated.
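
    The cut-off analysis above amounts to recomputing sensitivity and specificity of the copy-number test against the combined reference standard at each threshold; a minimal sketch (illustrative Python, not the study's code, with made-up example data):

    import numpy as np

    def sens_spec(copies, reference_positive, cutoff):
        test_positive = copies >= cutoff
        tp = np.sum(test_positive & reference_positive)
        fn = np.sum(~test_positive & reference_positive)
        tn = np.sum(~test_positive & ~reference_positive)
        fp = np.sum(test_positive & ~reference_positive)
        return tp / (tp + fn), tn / (tn + fp)

    # Made-up example: genome copies/mL and reference-standard diagnoses.
    copies = np.array([5e2, 2e3, 4e5, 1e7, 3e3, 8e6])
    reference_positive = np.array([False, False, True, True, False, True])
    for cutoff in (1e3, 1e5, 1e7):
        print(cutoff, sens_spec(copies, reference_positive, cutoff))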

  13. Quantitative Analysis of the Cervical Texture by Ultrasound and Correlation with Gestational Age.

    PubMed

    Baños, Núria; Perez-Moreno, Alvaro; Migliorelli, Federico; Triginer, Laura; Cobo, Teresa; Bonet-Carne, Elisenda; Gratacos, Eduard; Palacio, Montse

    2017-01-01

    Quantitative texture analysis has been proposed to extract robust features from the ultrasound image to detect subtle changes in the textures of the images. The aim of this study was to evaluate the feasibility of quantitative cervical texture analysis to assess cervical tissue changes throughout pregnancy. This was a cross-sectional study including singleton pregnancies between 20.0 and 41.6 weeks of gestation from women who delivered at term. Cervical length was measured, and a selected region of interest in the cervix was delineated. A model to predict gestational age based on features extracted from cervical images was developed following three steps: data splitting, feature transformation, and regression model computation. Seven hundred images, 30 per gestational week, were included for analysis. There was a strong correlation between the gestational age at which the images were obtained and the gestational age estimated by quantitative analysis of the cervical texture (R = 0.88). This study provides evidence that quantitative analysis of cervical texture can extract features from cervical ultrasound images that correlate with gestational age. Further research is needed to evaluate its applicability as a biomarker of the risk of spontaneous preterm birth, as well as its role in cervical assessment in other clinical situations in which cervical evaluation might be relevant. © 2016 S. Karger AG, Basel.
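
    The three modelling steps named above can be sketched as follows (Python; the texture features and gestational ages are replaced by random stand-ins, since the paper's exact descriptors are not given here):

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import StandardScaler
    from sklearn.linear_model import Ridge

    rng = np.random.default_rng(0)
    X = rng.normal(size=(700, 10))     # stand-in for texture feature vectors
    y = 20 + 22 * rng.random(700)      # stand-in gestational ages (weeks)

    # Step 1: data splitting.
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                              random_state=0)
    # Step 2: feature transformation.
    scaler = StandardScaler().fit(X_tr)
    # Step 3: regression model computation.
    model = Ridge().fit(scaler.transform(X_tr), y_tr)
    pred = model.predict(scaler.transform(X_te))
    print("R =", np.corrcoef(pred, y_te)[0, 1])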

  14. Linear Quantitative Profiling Method Fast Monitors Alkaloids of Sophora Flavescens That Was Verified by Tri-Marker Analyses

    PubMed Central

    Hou, Zhifei; Sun, Guoxiang; Guo, Yong

    2016-01-01

    The present study demonstrated the use of the Linear Quantitative Profiling Method (LQPM) to evaluate the quality of Alkaloids of Sophora flavescens (ASF) based on chromatographic fingerprints in an accurate, economical and fast way. Both linear qualitative and quantitative similarities were calculated in order to monitor the consistency of the samples. The results indicate that the linear qualitative similarity (LQLS) is not sufficiently discriminating due to the predominant presence of three alkaloid compounds (matrine, sophoridine and oxymatrine) in the test samples; however, the linear quantitative similarity (LQTS) was shown to clearly discriminate the samples based on differences in the quantitative content of all the chemical components. In addition, the fingerprint analysis was also supported by the quantitative analysis of three marker compounds. The LQTS was found to be highly correlated to the contents of the marker compounds, indicating that quantitative analysis of the marker compounds may be substituted with the LQPM based on the chromatographic fingerprints for the purpose of quantifying all chemicals of a complex sample system. Furthermore, once a reference fingerprint (RFP) has been developed from a standard preparation by direct detection and the composition similarities have been calculated, LQPM can employ the classical mathematical model to quantify the multiple components of ASF samples effectively without any chemical standard. PMID:27529425
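
    One common formulation of such fingerprint similarities (quoted for orientation; the paper's linear variants may differ in detail) takes the qualitative similarity as the cosine between the sample fingerprint vector x and the reference fingerprint x_R, and the quantitative similarity as a total-content ratio:

        S = \frac{\mathbf{x} \cdot \mathbf{x}_R}{\lVert \mathbf{x} \rVert\,\lVert \mathbf{x}_R \rVert}, \qquad P = \frac{\sum_i x_i}{\sum_i x_{R,i}} \times 100\%.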

  15. Quantitative Prediction of Drug–Drug Interactions Involving Inhibitory Metabolites in Drug Development: How Can Physiologically Based Pharmacokinetic Modeling Help?

    PubMed Central

    Chen, Y; Mao, J; Lin, J; Yu, H; Peters, S; Shebley, M

    2016-01-01

    This subteam under the Drug Metabolism Leadership Group (Innovation and Quality Consortium) investigated the quantitative role of circulating inhibitory metabolites in drug–drug interactions using physiologically based pharmacokinetic (PBPK) modeling. Three drugs with major circulating inhibitory metabolites (amiodarone, gemfibrozil, and sertraline) were systematically evaluated in addition to the literature review of recent examples. The application of PBPK modeling in drug interactions by inhibitory parent–metabolite pairs is described and guidance on strategic application is provided. PMID:27642087
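
    The static counterpart of such PBPK predictions shows why circulating metabolites matter: for reversible inhibition of a single elimination pathway with fraction metabolized f_m, both the parent (p) and the metabolite (m) contribute to the predicted AUC ratio (a standard mechanistic static equation, given for orientation):

        \mathrm{AUCR} = \left[ \frac{f_m}{1 + [I]_p/K_{i,p} + [I]_m/K_{i,m}} + (1 - f_m) \right]^{-1}.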

  16. A content analysis of kindergarten-12th grade school-based nutrition interventions: taking advantage of past learning.

    PubMed

    Roseman, Mary G; Riddell, Martha C; Haynes, Jessica N

    2011-01-01

    To review the literature, identifying proposed recommendations for school-based nutrition interventions, and to evaluate kindergarten through 12th grade school-based nutrition interventions conducted from 2000-2008. Proposed recommendations from school-based intervention reviews were developed and used in conducting a content analysis of 26 interventions. Twenty-six school-based nutrition interventions in the United States first published in peer-reviewed journals from 2000-2008. VARIABLE MEASURED: Ten proposed recommendations based on prior analyses of school-based nutrition interventions: (1) behaviorally focused, (2) multicomponent, (3) healthful food/school environment, (4) family involvement, (5) self-assessments, (6) quantitative evaluation, (7) community involvement, (8) ethnic/heterogeneous groups, (9) multimedia technology, and (10) sequential and sufficient duration. Descriptive statistics. The most frequently followed recommendations were (1) behaviorally focused components (100%) and (2) quantitative evaluation of food behaviors (96%). Only 15% of the interventions included community involvement or ethnic/heterogeneous groups, whereas 31% included anthropometric measures. Five of the 10 proposed recommendations were included in over 50% of the interventions. The rising trend in childhood overweight warrants synthesizing findings from previous studies to inform research and program development and to assist in identifying high-impact strategies and tactics. Copyright © 2011 Society for Nutrition Education. Published by Elsevier Inc. All rights reserved.

  17. 42 CFR 431.424 - Evaluation requirements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... evaluations. Demonstration evaluations will include the following: (1) Quantitative research methods. (i... of appropriate evaluation strategies (including experimental and other quantitative and qualitative... demonstration. (ii) CMS will consider alternative evaluation designs when quantitative designs are technically...

  18. 42 CFR 431.424 - Evaluation requirements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... evaluations. Demonstration evaluations will include the following: (1) Quantitative research methods. (i... of appropriate evaluation strategies (including experimental and other quantitative and qualitative... demonstration. (ii) CMS will consider alternative evaluation designs when quantitative designs are technically...

  19. 42 CFR 431.424 - Evaluation requirements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... evaluations. Demonstration evaluations will include the following: (1) Quantitative research methods. (i... of appropriate evaluation strategies (including experimental and other quantitative and qualitative... demonstration. (ii) CMS will consider alternative evaluation designs when quantitative designs are technically...

  20. Quantitative polarization and flow evaluation of choroid and sclera by multifunctional Jones matrix optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Sugiyama, S.; Hong, Y.-J.; Kasaragod, D.; Makita, S.; Miura, M.; Ikuno, Y.; Yasuno, Y.

    2016-03-01

    Quantitative evaluation of the optical properties of the choroid and sclera is performed by multifunctional optical coherence tomography. Five normal eyes, five glaucoma eyes and one choroidal atrophy eye were examined. Among normal eyes, refractive error was found to be correlated with choroidal birefringence, polarization uniformity, and flow, in addition to scleral birefringence. Significant differences between the normal and glaucoma eyes were observed in choroidal polarization uniformity, flow, and scleral birefringence. An automatic segmentation algorithm for the retinal pigment epithelium and chorioscleral interface based on multifunctional signals is also presented.

  1. Agent-Based Computational Modeling to Examine How Individual Cell Morphology Affects Dosimetry

    EPA Science Inventory

    Cell-based models utilizing high-content screening (HCS) data have applications for predictive toxicology. Evaluating concentration-dependent effects on cell fate and state response is a fundamental utilization of HCS data. Although HCS assays may capture quantitative readouts at ...

  2. Training of lay health educators to implement an evidence-based behavioral weight loss intervention in rural senior centers.

    PubMed

    Krukowski, Rebecca A; Lensing, Shelly; Love, Sharhonda; Prewitt, T Elaine; Adams, Becky; Cornell, Carol E; Felix, Holly C; West, Delia

    2013-02-01

    Lay health educators (LHEs) offer great promise for facilitating the translation of evidence-based health promotion programs to underserved areas; yet, there is little guidance on how to train LHEs to implement these programs, particularly in the crucial area of empirically validated obesity interventions. This article describes experiences in recruiting, training, and retaining 20 LHEs who delivered a 12-month evidence-based behavioral lifestyle intervention (based on the Diabetes Prevention Program) in senior centers across a rural state. A mixed methods approach was used, incorporating the collection of quantitative data on sociodemographic characteristics of LHEs; process data related to training, recruitment, intervention implementation, and retention of LHEs; and a quantitative program evaluation questionnaire, which was supplemented by a qualitative program evaluation questionnaire. Descriptive statistics were calculated for quantitative data, and qualitative data were analyzed using content analysis. The training program was well received, and the LHEs effectively recruited participants and implemented the lifestyle intervention in senior centers following a structured protocol. The methods used in this study produced excellent long-term retention of LHEs and good adherence to the intervention protocol, and as such may provide a model for others seeking to implement LHE-delivered health promotion programs.

  3. Development and evaluation of a model-based downscatter compensation method for quantitative I-131 SPECT

    PubMed Central

    Song, Na; Du, Yong; He, Bin; Frey, Eric C.

    2011-01-01

    Purpose: The radionuclide 131I has found widespread use in targeted radionuclide therapy (TRT), partly due to the fact that it emits photons that can be imaged to perform treatment planning or posttherapy dose verification as well as beta rays that are suitable for therapy. In both the treatment planning and dose verification applications, it is necessary to estimate the activity distribution in organs or tumors at several time points. In vivo estimates of the 131I activity distribution at each time point can be obtained from quantitative single-photon emission computed tomography (QSPECT) images and organ activity estimates can be obtained either from QSPECT images or quantification of planar projection data. However, in addition to the photon used for imaging, 131I decay results in emission of a number of other higher-energy photons with significant abundances. These higher-energy photons can scatter in the body, collimator, or detector and be counted in the 364 keV photopeak energy window, resulting in reduced image contrast and degraded quantitative accuracy; these photons are referred to as downscatter. The goal of this study was to develop and evaluate a model-based downscatter compensation method specifically designed for the compensation of high-energy photons emitted by 131I and detected in the imaging energy window. Methods: In the evaluation study, we used a Monte Carlo simulation (MCS) code that had previously been validated for other radionuclides. Thus, in preparation for the evaluation study, we first validated the code for 131I imaging simulation by comparison with experimental data. Next, we assessed the accuracy of the downscatter model by comparing downscatter estimates with MCS results. Finally, we combined the downscatter model with iterative reconstruction-based compensation for attenuation (A) and scatter (S) and the full (D) collimator-detector response of the 364 keV photons to form a comprehensive compensation method. We evaluated this combined method in terms of quantitative accuracy using the realistic 3D NCAT phantom and an activity distribution obtained from patient studies. We compared the accuracy of organ activity estimates in images reconstructed with and without addition of downscatter compensation from projections with and without downscatter contamination. Results: We observed that the proposed method provided substantial improvements in accuracy compared to no downscatter compensation and had accuracies comparable to reconstructions from projections without downscatter contamination. Conclusions: The results demonstrate that the proposed model-based downscatter compensation method is effective and may have a role in quantitative 131I imaging. PMID:21815394
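
    Schematically (our notation, not the paper's), such model-based compensation enters iterative reconstruction as an additive term d_i, the estimated downscatter in projection bin i, alongside a system matrix a_ij that models the attenuation, scatter, and collimator-detector response of the 364 keV photons; the ML-EM update then reads

        f_j^{(k+1)} = \frac{f_j^{(k)}}{\sum_i a_{ij}} \sum_i a_{ij}\, \frac{y_i}{\sum_{j'} a_{ij'} f_{j'}^{(k)} + d_i}.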

  4. Inter- and intra-observer agreement of BI-RADS-based subjective visual estimation of amount of fibroglandular breast tissue with magnetic resonance imaging: comparison to automated quantitative assessment.

    PubMed

    Wengert, G J; Helbich, T H; Woitek, R; Kapetas, P; Clauser, P; Baltzer, P A; Vogl, W-D; Weber, M; Meyer-Baese, A; Pinker, Katja

    2016-11-01

    To evaluate the inter-/intra-observer agreement of BI-RADS-based subjective visual estimation of the amount of fibroglandular tissue (FGT) with magnetic resonance imaging (MRI), and to investigate whether FGT assessment benefits from an automated, observer-independent, quantitative MRI measurement by comparing both approaches. Eighty women with no imaging abnormalities (BI-RADS 1 and 2) were included in this institutional review board (IRB)-approved prospective study. All women underwent un-enhanced breast MRI. Four radiologists independently assessed FGT with MRI by subjective visual estimation according to BI-RADS. Automated observer-independent quantitative measurement of FGT with MRI was performed using a previously described measurement system. Inter-/intra-observer agreements of qualitative and quantitative FGT measurements were assessed using Cohen's kappa (k). Inexperienced readers achieved moderate inter-/intra-observer agreement and experienced readers a substantial inter- and perfect intra-observer agreement for subjective visual estimation of FGT. Practice and experience reduced observer-dependency. Automated observer-independent quantitative measurement of FGT was successfully performed and revealed only fair to moderate agreement (k = 0.209-0.497) with subjective visual estimations of FGT. Subjective visual estimation of FGT with MRI shows moderate intra-/inter-observer agreement, which can be improved by practice and experience. Automated observer-independent quantitative measurements of FGT are necessary to allow a standardized risk evaluation. • Subjective FGT estimation with MRI shows moderate intra-/inter-observer agreement in inexperienced readers. • Inter-observer agreement can be improved by practice and experience. • Automated observer-independent quantitative measurements can provide reliable and standardized assessment of FGT with MRI.
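
    For reference, Cohen's kappa compares the observed agreement p_o with the agreement p_e expected by chance:

        \kappa = \frac{p_o - p_e}{1 - p_e}.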

  5. A simple hemostasis model for the quantitative evaluation of hydrogel-based local hemostatic biomaterials on tissue surface.

    PubMed

    Murakami, Yoshihiko; Yokoyama, Masayuki; Nishida, Hiroshi; Tomizawa, Yasuko; Kurosawa, Hiromi

    2008-09-01

    Several hemostat hydrogels are clinically used, and some other agents are being studied for safer, more facile, and more efficient hemostasis. In the present paper, we propose a novel method for evaluating local hemostat hydrogels on the tissue surface. The procedure consisted of the following steps: (step 1) a mouse was fixed on a cork board, and its abdomen was incised; (step 2) serous fluid was carefully removed because it affected the estimation of the weight gained by the filter paper, and parafilm and preweighed filter paper were placed beneath the liver (the parafilm prevented the filter paper from absorbing gradually oozing serous fluid); (step 3) the cork board was tilted and maintained at an angle of about 45 degrees so that blood would flow more easily from the liver toward the filter paper; and (step 4) the bleeding lasted for 3 min. In this step, a hemostat was applied to the liver wound immediately after the liver was pricked with a needle. We found that (1) a careful removal of serous fluid prior to bleeding and (2) a quantitative determination of the amount of excess aqueous solution that oozed out from a hemostat were important to a rigorous evaluation of hemostat efficacy. We successfully evaluated the efficacy of a fibrin-based hemostat hydrogel by using our method. The method proposed in the present study enabled the quantitative, accurate, and easy evaluation of the efficacy of a local hemostatic hydrogel that acts as a tissue-adhesive agent on biointerfaces.

  6. The Use and Evaluation of Videodiscs in the Chemistry Laboratory.

    ERIC Educational Resources Information Center

    Russell, Arlene A.; And Others

    1985-01-01

    Describes a quantitative evaluation of an interactive videodisc program in which students measure the temperature dependence of the solubility product of lead chloride by titration of chloride with silver nitrate using a Mohr titration. Student reaction (based on responses made using the program, quiz answers, and laboratory performance) was…

  7. Detection of Prostate Cancer: Quantitative Multiparametric MR Imaging Models Developed Using Registered Correlative Histopathology.

    PubMed

    Metzger, Gregory J; Kalavagunta, Chaitanya; Spilseth, Benjamin; Bolan, Patrick J; Li, Xiufeng; Hutter, Diane; Nam, Jung W; Johnson, Andrew D; Henriksen, Jonathan C; Moench, Laura; Konety, Badrinath; Warlick, Christopher A; Schmechel, Stephen C; Koopmeiners, Joseph S

    2016-06-01

    Purpose To develop multiparametric magnetic resonance (MR) imaging models to generate a quantitative, user-independent, voxel-wise composite biomarker score (CBS) for detection of prostate cancer by using coregistered correlative histopathologic results, and to compare performance of CBS-based detection with that of single quantitative MR imaging parameters. Materials and Methods Institutional review board approval and informed consent were obtained. Patients with a diagnosis of prostate cancer underwent multiparametric MR imaging before surgery for treatment. All MR imaging voxels in the prostate were classified as cancer or noncancer on the basis of coregistered histopathologic data. Predictive models were developed by using more than one quantitative MR imaging parameter to generate CBS maps. Model development and evaluation of quantitative MR imaging parameters and CBS were performed separately for the peripheral zone and the whole gland. Model accuracy was evaluated by using the area under the receiver operating characteristic curve (AUC), and confidence intervals were calculated with the bootstrap procedure. The improvement in classification accuracy was evaluated by comparing the AUC for the multiparametric model and the single best-performing quantitative MR imaging parameter at the individual level and in aggregate. Results Quantitative T2, apparent diffusion coefficient (ADC), volume transfer constant (Ktrans), reflux rate constant (kep), and area under the gadolinium concentration curve at 90 seconds (AUGC90) were significantly different between cancer and noncancer voxels (P < .001), with ADC showing the best accuracy (peripheral zone AUC, 0.82; whole gland AUC, 0.74). Four-parameter models demonstrated the best performance in both the peripheral zone (AUC, 0.85; P = .010 vs ADC alone) and whole gland (AUC, 0.77; P = .043 vs ADC alone). Individual-level analysis showed statistically significant improvement in AUC in 82% (23 of 28) and 71% (24 of 34) of patients with peripheral-zone and whole-gland models, respectively, compared with ADC alone. Model-based CBS maps for cancer detection showed improved visualization of cancer location and extent. Conclusion Quantitative multiparametric MR imaging models developed by using coregistered correlative histopathologic data yielded a voxel-wise CBS that outperformed single quantitative MR imaging parameters for detection of prostate cancer, especially when the models were assessed at the individual level. © RSNA, 2016. Online supplemental material is available for this article.
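
    As an illustration of how quantitative parameters can be combined into a voxel-wise score (the abstract does not spell out the model form, so a logistic combination with synthetic data stands in for the actual method):

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    # Synthetic stand-ins for voxel-wise T2, ADC, Ktrans, kep values.
    X = rng.normal(size=(500, 4))
    # Synthetic cancer/noncancer labels (in the paper: coregistered pathology).
    y = (X @ np.array([0.5, -1.0, 0.8, 0.6]) + rng.normal(size=500) > 0)

    model = LogisticRegression(max_iter=1000).fit(X, y)
    cbs = model.predict_proba(X)[:, 1]  # voxel-wise composite biomarker score
    print("AUC:", roc_auc_score(y, cbs))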

  8. Evaluation of health promotion in schools: a realistic evaluation approach using mixed methods.

    PubMed

    Pommier, Jeanine; Guével, Marie-Renée; Jourdan, Didier

    2010-01-28

    Schools are key settings for health promotion (HP) but the development of suitable approaches for evaluating HP in schools is still a major topic of discussion. This article presents a research protocol of a program developed to evaluate HP. After reviewing HP evaluation issues, the various possible approaches are analyzed and the importance of a realistic evaluation framework and a mixed methods (MM) design are demonstrated. The design is based on a systemic approach to evaluation, taking into account the mechanisms, context and outcomes, as defined in realistic evaluation, adjusted to our own French context using an MM approach. The characteristics of the design are illustrated through the evaluation of a nationwide HP program in French primary schools designed to enhance children's social, emotional and physical health by improving teachers' HP practices and promoting a healthy school environment. An embedded MM design is used in which a qualitative data set plays a supportive, secondary role in a study based primarily on a different quantitative data set. The way the qualitative and quantitative approaches are combined through the entire evaluation framework is detailed. This study is a contribution towards the development of suitable approaches for evaluating HP programs in schools. The systemic approach of the evaluation carried out in this research is appropriate since it takes account of the limitations of traditional evaluation approaches and considers suggestions made by the HP research community.

  9. The Effect of Radiation on Selected Photographic Film

    NASA Technical Reports Server (NTRS)

    Slater, Richard; Kinard, John; Firsov, Ivan

    2000-01-01

    We conducted this film test to evaluate several manufacturers' photographic films for their ability to acquire imagery on the International Space Station. We selected 25 motion picture, photographic slide, and negative films from three different film manufacturers. We based this selection on the fact that their films ranked highest in other similar film tests, and on their general acceptance by the international community. This test differed from previous tests in that the entire evaluation process leading up to the final selection was based on information derived after the original flight film was scanned to a digital file. Previously conducted tests were evaluated entirely on the basis of 8 x 10 prints produced from the film either directly or through the internegative process. This new evaluation procedure provided accurate quantitative data on granularity and contrast from the digital data. This test did not attempt to define which film was best visually, since such judgments are too often based on personal preference. However, the test results did group the films as good, marginal, or unacceptable. We developed, and included in this report, a template containing quantitative, graphical, and visual information for each film. These templates should be sufficient for comparing the different films tested and subsequently selecting a film or films to be used for experiments and general documentation on the International Space Station.

  10. Exploring a taxonomy for aggression against women: can it aid conceptual clarity?

    PubMed

    Cook, Sarah; Parrott, Dominic

    2009-01-01

    The assessment of aggression against women is demanding primarily because assessment strategies do not share a common language to describe reliably the wide range of forms of aggression women experience. The lack of a common language impairs efforts to describe these experiences, understand causes and consequences of aggression against women, and develop effective intervention and prevention efforts. This review accomplishes two goals. First, it applies a theoretically and empirically based taxonomy to behaviors assessed by existing measurement instruments. Second, it evaluates whether the taxonomy provides a common language for the field. Strengths of the taxonomy include its ability to describe and categorize all forms of aggression found in existing quantitative measures. The taxonomy also classifies numerous examples of aggression discussed in the literature but notably absent from quantitative measures. Although we use existing quantitative measures as a starting place to evaluate the taxonomy, its use is not limited to quantitative methods. Implications for theory, research, and practice are discussed.

  11. Multiple internal standard normalization for improving HS-SPME-GC-MS quantitation in virgin olive oil volatile organic compounds (VOO-VOCs) profile.

    PubMed

    Fortini, Martina; Migliorini, Marzia; Cherubini, Chiara; Cecchi, Lorenzo; Calamai, Luca

    2017-04-01

    The commercial value of virgin olive oils (VOOs) strongly depends on their classification, which is based in part on the aroma of the oils, usually evaluated by a panel test. A reliable analytical method is still needed to evaluate the volatile organic compounds (VOCs) and support the standard panel test method. To date, the use of HS-SPME sampling coupled to GC-MS is generally accepted for the analysis of VOCs in VOOs. However, VOO is a challenging matrix due to the simultaneous presence of: i) compounds at ppm and ppb concentrations; ii) molecules belonging to different chemical classes and iii) analytes with a wide range of molecular mass. Therefore, HS-SPME-GC-MS quantitation based on the external standard method, or on only a single internal standard (ISTD) for data normalization, may be troublesome. In this work, multiple internal standard normalization is proposed to overcome these problems and improve the quantitation of VOO-VOCs. As many as 11 ISTDs were used for the quantitation of 71 VOCs. For each of them the most suitable ISTD was selected, and good linearity over a wide calibration range was obtained. For all compounds except E-2-hexenal, the linear calibration range without an ISTD, or with an unsuitable one, was narrower than that obtained with a suitable ISTD, confirming the usefulness of multiple internal standard normalization for the correct quantitation of the VOC profile in VOOs. The method was validated for 71 VOCs, and then applied to a series of lampante virgin olive oils and extra virgin olive oils. In light of our results, we propose the application of this analytical approach for routine quantitative analyses and to support sensorial analysis for the evaluation of positive and negative VOO attributes. Copyright © 2017 Elsevier B.V. All rights reserved.
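
    The ISTD-selection step described above can be sketched as follows (Python; the data layout is hypothetical): for a given analyte, each candidate ISTD is scored by the linearity of its area-ratio calibration, and the best-scoring one is retained.

    import numpy as np

    def best_istd(conc, analyte_areas, istd_areas_by_name):
        """conc, analyte_areas: calibration levels and analyte peak areas;
        istd_areas_by_name: dict mapping ISTD name -> peak areas per level."""
        scores = {}
        for name, istd_areas in istd_areas_by_name.items():
            ratio = analyte_areas / istd_areas      # ISTD-normalized response
            slope, intercept = np.polyfit(conc, ratio, 1)
            fit = slope * conc + intercept
            ss_res = np.sum((ratio - fit) ** 2)
            ss_tot = np.sum((ratio - ratio.mean()) ** 2)
            scores[name] = 1.0 - ss_res / ss_tot    # calibration R^2
        return max(scores, key=scores.get), scores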

  12. CCTV Coverage Index Based on Surveillance Resolution and Its Evaluation Using 3D Spatial Analysis

    PubMed Central

    Choi, Kyoungah; Lee, Impyeong

    2015-01-01

    We propose a novel approach to evaluating how effectively a closed circuit television (CCTV) system can monitor a targeted area. With 3D models of the target area and the camera parameters of the CCTV system, the approach produces a surveillance coverage index, newly defined in this study as a quantitative measure of surveillance performance. This index indicates the proportion of the space being monitored with sufficient resolution relative to the entire space of the target area. It is determined by computing the surveillance resolution at every position and orientation, which indicates how closely a specific object can be monitored by a CCTV system. We present a full mathematical derivation of the resolution, which depends on the location and orientation of the object as well as the geometric model of the camera. With the proposed approach, we quantitatively evaluated the surveillance coverage of a CCTV system in an underground parking area. Our evaluation process provided various quantitative analysis results, enabling us to examine the design of the CCTV system prior to its installation and to understand the surveillance capability of an existing CCTV system. PMID:26389909
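
    In simplified form (a schematic version, not the paper's full derivation), the surveillance resolution of an object at distance d whose surface normal makes angle \theta with the viewing direction is, for a camera with focal length f and pixel pitch p, about R(d, \theta) \approx f\cos\theta/(p\,d) pixels per unit object length; the coverage index is then the monitored fraction of the target space \Omega:

        \mathrm{CI} = \frac{\lvert \{ \mathbf{x} \in \Omega : R(\mathbf{x}) \ge R_{\min} \} \rvert}{\lvert \Omega \rvert}.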

  13. SearchLight: a freely available web-based quantitative spectral analysis tool (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Prabhat, Prashant; Peet, Michael; Erdogan, Turan

    2016-03-01

    In order to design a fluorescence experiment, typically the spectra of a fluorophore and of a filter set are overlaid on a single graph and the spectral overlap is evaluated intuitively. However, in a typical fluorescence imaging system the fluorophores and optical filters are not the only wavelength-dependent components; even the excitation light sources have been changing. For example, LED Light Engines may have a significantly different spectral response compared to traditional metal-halide lamps. Therefore, for a more accurate assessment of fluorophore-to-filter-set compatibility, all sources of spectral variation should be taken into account simultaneously. Additionally, intuitive or qualitative evaluation of many spectra does not necessarily provide a realistic assessment of system performance. "SearchLight" is a freely available web-based spectral plotting and analysis tool that can be used to address the need for accurate, quantitative spectral evaluation of fluorescence measurement systems. This tool is available at: http://searchlight.semrock.com/. Based on a detailed mathematical framework [1], SearchLight calculates signal, noise, and signal-to-noise ratio for multiple combinations of fluorophores, filter sets, light sources and detectors. SearchLight allows for qualitative and quantitative evaluation of the compatibility of filter sets with fluorophores, analysis of bleed-through, identification of optimized spectral edge locations for a set of filters under specific experimental conditions, and guidance regarding labeling protocols in multiplexing imaging assays. Entire SearchLight sessions can be shared with colleagues and collaborators and saved for future reference. [1] Anderson, N., Prabhat, P. and Erdogan, T., Spectral Modeling in Fluorescence Microscopy, http://www.semrock.com (2010).
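
    In simplified form (after the framework in [1]; the actual model includes more terms), the detected signal for one source/filter-set/fluorophore/detector combination is a product of two spectral overlap integrals,

        S \propto \int L(\lambda)\, T_{\mathrm{ex}}(\lambda)\, A(\lambda)\,\mathrm{d}\lambda \;\times\; \int F(\lambda)\, T_{\mathrm{em}}(\lambda)\, \eta(\lambda)\,\mathrm{d}\lambda,

    where L is the source spectrum, T_ex and T_em the excitation and emission filter transmissions, A and F the fluorophore absorption and emission spectra, and \eta the detector quantum efficiency.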

  14. High School Students' Understanding of Acid-Base Concepts: An Ongoing Challenge for Teachers

    ERIC Educational Resources Information Center

    Damanhuri, Muhd Ibrahim Muhamad; Treagust, David F.; Won, Mihye; Chandrasegaran, A. L.

    2016-01-01

    Using a quantitative case study design, the "Acids-Bases Chemistry Achievement Test" ("ABCAT") was developed to evaluate the extent to which students in Malaysian secondary schools achieved the intended curriculum on acid-base concepts. Responses were obtained from 260 Form 5 (Grade 11) students from five schools to initially…

  15. Design of an impact evaluation using a mixed methods model--an explanatory assessment of the effects of results-based financing mechanisms on maternal healthcare services in Malawi.

    PubMed

    Brenner, Stephan; Muula, Adamson S; Robyn, Paul Jacob; Bärnighausen, Till; Sarker, Malabika; Mathanga, Don P; Bossert, Thomas; De Allegri, Manuela

    2014-04-22

    In this article we present a study design to evaluate the causal impact of providing supply-side performance-based financing incentives in combination with a demand-side cash transfer component on equitable access to and quality of maternal and neonatal healthcare services. This intervention is introduced to selected emergency obstetric care facilities and catchment area populations in four districts in Malawi. We here describe and discuss our study protocol with regard to the research aims, the local implementation context, and our rationale for selecting a mixed methods explanatory design with a quasi-experimental quantitative component. The quantitative research component consists of a controlled pre- and post-test design with multiple post-test measurements. This allows us to quantitatively measure 'equitable access to healthcare services' at the community level and 'healthcare quality' at the health facility level. Guided by a theoretical framework of causal relationships, we determined a number of input, process, and output indicators to evaluate both intended and unintended effects of the intervention. Overall causal impact estimates will result from a difference-in-difference analysis comparing selected indicators across intervention and control facilities/catchment populations over time. To further explain the heterogeneity of quantitatively observed effects and to understand the experiential dimensions of financial incentives on clients and providers, we designed a qualitative component in line with the overall explanatory mixed methods approach. This component consists of in-depth interviews and focus group discussions with providers, service users, non-users, and policy stakeholders. In this explanatory design, comprehensive understanding of expected and unexpected effects of the intervention on both access and quality will emerge through careful triangulation at two levels: across multiple quantitative elements and across quantitative and qualitative elements. Combining a traditional quasi-experimental controlled pre- and post-test design with an explanatory mixed methods model permits an additional assessment of organizational and behavioral changes affecting complex processes. Through this impact evaluation approach, our design will not only create robust evidence measures for the outcome of interest, but also generate insights into how and why the investigated interventions produce certain intended and unintended effects, and it allows for a more in-depth evaluation approach.
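
    The difference-in-difference estimator at the core of the quantitative component compares outcome changes across intervention (T) and control (C) groups:

        \widehat{\mathrm{DiD}} = (\bar{Y}_{T,\mathrm{post}} - \bar{Y}_{T,\mathrm{pre}}) - (\bar{Y}_{C,\mathrm{post}} - \bar{Y}_{C,\mathrm{pre}}),

    equivalently the coefficient \beta_3 in the regression Y = \beta_0 + \beta_1\,\mathrm{Treat} + \beta_2\,\mathrm{Post} + \beta_3\,(\mathrm{Treat} \times \mathrm{Post}) + \varepsilon.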

  16. Development and validation of open-source software for DNA mixture interpretation based on a quantitative continuous model

    PubMed Central

    Manabe, Sho; Morimoto, Chie; Hamano, Yuya; Fujimoto, Shuntaro

    2017-01-01

    In criminal investigations, forensic scientists need to evaluate DNA mixtures. The estimation of the number of contributors and evaluation of the contribution of a person of interest (POI) from these samples are challenging. In this study, we developed new open-source software, “Kongoh,” for interpreting DNA mixtures based on a quantitative continuous model. The model uses quantitative information of peak heights in the DNA profile and considers the effect of artifacts and allelic drop-out. By using this software, the likelihoods of 1–4 persons’ contributions are calculated, and the most optimal number of contributors is automatically determined; this differs from other open-source software. Therefore, we can eliminate the need to manually determine the number of contributors before the analysis. Kongoh also considers allele- or locus-specific effects of biological parameters based on the experimental data. We then validated Kongoh by calculating the likelihood ratio (LR) of a POI’s contribution in true contributors and non-contributors by using 2–4 person mixtures analyzed through a 15 short tandem repeat typing system. Most LR values obtained from Kongoh during true-contributor testing strongly supported the POI’s contribution even for small amounts of or degraded DNA. Kongoh correctly rejected a false hypothesis in the non-contributor testing, generated reproducible LR values, and demonstrated higher accuracy in estimating the number of contributors than other software based on a quantitative continuous model. Therefore, Kongoh is useful for accurately interpreting DNA evidence such as mixtures and small amounts of or degraded DNA. PMID:29149210
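
    The quantity reported in the validation is the standard forensic likelihood ratio, the probability of the evidence E under the prosecution hypothesis H_p (the POI contributed) relative to the defence hypothesis H_d (the POI did not contribute):

        \mathrm{LR} = \frac{\Pr(E \mid H_p)}{\Pr(E \mid H_d)}.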

  17. Development and validation of open-source software for DNA mixture interpretation based on a quantitative continuous model.

    PubMed

    Manabe, Sho; Morimoto, Chie; Hamano, Yuya; Fujimoto, Shuntaro; Tamaki, Keiji

    2017-01-01

    In criminal investigations, forensic scientists need to evaluate DNA mixtures. The estimation of the number of contributors and evaluation of the contribution of a person of interest (POI) from these samples are challenging. In this study, we developed new open-source software, "Kongoh," for interpreting DNA mixtures based on a quantitative continuous model. The model uses quantitative information of peak heights in the DNA profile and considers the effect of artifacts and allelic drop-out. By using this software, the likelihoods of 1-4 persons' contributions are calculated, and the most optimal number of contributors is automatically determined; this differs from other open-source software. Therefore, we can eliminate the need to manually determine the number of contributors before the analysis. Kongoh also considers allele- or locus-specific effects of biological parameters based on the experimental data. We then validated Kongoh by calculating the likelihood ratio (LR) of a POI's contribution in true contributors and non-contributors by using 2-4 person mixtures analyzed through a 15 short tandem repeat typing system. Most LR values obtained from Kongoh during true-contributor testing strongly supported the POI's contribution even for small amounts of or degraded DNA. Kongoh correctly rejected a false hypothesis in the non-contributor testing, generated reproducible LR values, and demonstrated higher accuracy in estimating the number of contributors than other software based on a quantitative continuous model. Therefore, Kongoh is useful for accurately interpreting DNA evidence such as mixtures and small amounts of or degraded DNA.

  18. Note-Taking Evaluation Using Network Illustrations Based on Term Co-Occurrence in a Blended Learning Environment

    ERIC Educational Resources Information Center

    Nakayama, Minoru; Mutsuura, Kouichi; Yamamoto, Hiroh

    2016-01-01

    Note contents taken by students during a blended learning course were evaluated, to improve the quality of university instruction. To conduct a quantitative comparison of the contents of all notes for effective instruction from lecturer to students to occur, the contents were mathematically compared and evaluated using two ways of summarizing the…

  19. How Teacher Evaluation Methods Matter for Accountability: A Comparative Analysis of Teacher Effectiveness Ratings by Principals and Teacher Value-Added Measures

    ERIC Educational Resources Information Center

    Harris, Douglas N.; Ingle, William K.; Rutledge, Stacey A.

    2014-01-01

    Policymakers are revolutionizing teacher evaluation by attaching greater stakes to student test scores and observation-based teacher effectiveness measures, but relatively little is known about why they often differ so much. Quantitative analysis of thirty schools suggests that teacher value-added measures and informal principal evaluations are…

  20. Extensions and evaluations of a general quantitative theory of forest structure and dynamics

    PubMed Central

    Enquist, Brian J.; West, Geoffrey B.; Brown, James H.

    2009-01-01

    Here, we present the second part of a quantitative theory for the structure and dynamics of forests under demographic and resource steady state. The theory is based on individual-level allometric scaling relations for how trees use resources, fill space, and grow. These scale up to determine emergent properties of diverse forests, including size–frequency distributions, spacing relations, canopy configurations, mortality rates, population dynamics, successional dynamics, and resource flux rates. The theory uniquely makes quantitative predictions for both stand-level scaling exponents and normalizations. We evaluate these predictions by compiling and analyzing macroecological datasets from several tropical forests. The close match between theoretical predictions and data suggests that forests are organized by a set of very general scaling rules. Our mechanistic theory is based on allometric scaling relations, is complementary to “demographic theory,” but is fundamentally different in approach. It provides a quantitative baseline for understanding deviations from predictions due to other factors, including disturbance, variation in branching architecture, asymmetric competition, resource limitation, and other sources of mortality, which are not included in the deliberately simplified theory. The theory should apply to a wide range of forests despite large differences in abiotic environment, species diversity, and taxonomic and functional composition. PMID:19363161
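
    A representative prediction of this class of theory (quoted for orientation, not as the paper's full result) is that, at demographic and resource steady state, the number of trees with stem radius r scales as

        N(r) \propto r^{-2}.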

  1. Experimental Assessment and Enhancement of Planar Laser-Induced Fluorescence Measurements of Nitric Oxide in an Inverse Diffusion Flame

    NASA Technical Reports Server (NTRS)

    Partridge, William P.; Laurendeau, Normand M.

    1997-01-01

    We have experimentally assessed the quantitative nature of planar laser-induced fluorescence (PLIF) measurements of NO concentration in a unique atmospheric pressure, laminar, axial inverse diffusion flame (IDF). The PLIF measurements were assessed relative to a two-dimensional array of separate laser saturated fluorescence (LSF) measurements. We demonstrated and evaluated several experimentally-based procedures for enhancing the quantitative nature of PLIF concentration images. Because these experimentally-based PLIF correction schemes require only the ability to make PLIF and LSF measurements, they produce a more broadly applicable PLIF diagnostic compared to numerically-based correction schemes. We experimentally assessed the influence of interferences on both narrow-band and broad-band fluorescence measurements at atmospheric and high pressures. Optimum excitation and detection schemes were determined for the LSF and PLIF measurements. Single-input and multiple-input, experimentally-based PLIF enhancement procedures were developed for application in test environments with both negligible and significant quench-dependent error gradients. Each experimentally-based procedure provides an enhancement of approximately 50% in the quantitative nature of the PLIF measurements, and results in concentration images nominally as quantitative as LSF point measurements. These correction procedures can be applied to other species, including radicals, for which no experimental data are available from which to implement numerically-based PLIF enhancement procedures.
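
    The single-input flavor of such an experimentally based correction can be sketched very simply (Python; names are illustrative, and the uniform rescaling is only valid where quench-dependent errors are roughly uniform):

    import numpy as np

    def correct_plif(plif_image, lsf_value, row, col):
        """Rescale a PLIF image so it matches one LSF point measurement."""
        return plif_image * (lsf_value / plif_image[row, col])

    # Illustrative usage with a synthetic image.
    plif = 1.0 + np.random.rand(128, 128)
    corrected = correct_plif(plif, lsf_value=2.0, row=64, col=64)

    A multiple-input variant would instead interpolate a spatially varying correction field from the full two-dimensional array of LSF points, which is what makes the approach usable where quench-dependent error gradients are significant.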

  2. Quantitative competitive (QC) PCR for quantification of porcine DNA.

    PubMed

    Wolf, C; Lüthy, J

    2001-02-01

    Many meat products nowadays may contain several species in different proportions. To protect consumers from fraud and misdeclaration, not only qualitative but also quantitative monitoring of the ingredients of complex food products is necessary. DNA-based techniques like the polymerase chain reaction (PCR) are widely used for species identification, but the techniques available to date give no answer as to the proportional amount of a given species. In this study we report the development and evaluation of a quantitative competitive polymerase chain reaction (QC-PCR) for the detection and quantification of porcine DNA, using a new porcine-specific PCR system based on the growth hormone gene of Sus scrofa. A DNA competitor differing by 30 bp in length from the porcine target sequence was constructed and used for PCR together with the target DNA. The specificity of the new primers was evaluated with DNA from cattle, sheep, chicken and turkey. The competitor concentration was adjusted to porcine DNA contents of 2 or 20% by coamplification of mixtures containing porcine and corresponding amounts of bovine DNA in defined ratios.
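
    The quantitation logic of QC-PCR can be stated compactly (schematic): because target and competitor share primers and amplify with near-identical efficiency, the ratio of their products preserves the ratio of the initial templates, so at the equivalence point, where the length-corrected band intensities are equal, the initial target copy number equals the known competitor input:

        \frac{I_T}{I_C} = 1 \;\Rightarrow\; N_T^{0} = N_C^{0}.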

  3. Reference condition approach to restoration planning

    USGS Publications Warehouse

    Nestler, J.M.; Theiling, C.H.; Lubinski, S.J.; Smith, D.L.

    2010-01-01

    Ecosystem restoration planning requires quantitative rigor to evaluate alternatives, define end states, report progress and perform environmental benefits analysis (EBA). Unfortunately, existing planning frameworks are, at best, semi-quantitative. In this paper, we: (1) describe a quantitative restoration planning approach based on a comprehensive, but simple mathematical framework that can be used to effectively apply knowledge and evaluate alternatives, (2) use the approach to derive a simple but precisely defined lexicon based on the reference condition concept and allied terms and (3) illustrate the approach with an example from the Upper Mississippi River System (UMRS) using hydrologic indicators. The approach supports the development of a scaleable restoration strategy that, in theory, can be expanded to ecosystem characteristics such as hydraulics, geomorphology, habitat and biodiversity. We identify three reference condition types, the best achievable condition (A_BAC), the measured magnitude (A_MMi, which can be determined at one or many times and places) and the desired future condition (A_DFC), that, when used with the mathematical framework, provide a complete system of accounts useful for goal-oriented system-level management and restoration. Published in 2010 by John Wiley & Sons, Ltd.

  4. Technique for Determination of Rational Boundaries in Combining Construction and Installation Processes Based on Quantitative Estimation of Technological Connections

    NASA Astrophysics Data System (ADS)

    Gusev, E. V.; Mukhametzyanov, Z. R.; Razyapov, R. V.

    2017-11-01

    The problems with existing methods for determining which technologically interlinked construction and installation processes can be combined are considered under the modern conditions of constructing diverse facilities. The necessity of identifying common parameters that characterize the nature of interaction among all technologically related construction and installation processes and activities is shown. The technologies of construction and installation processes for buildings and structures were studied with the goal of determining a common parameter for evaluating the relationship between technologically interconnected processes and construction works. The result of this research is a quantitative evaluation of the interaction of construction and installation processes: the minimum technologically necessary volume of a preceding process that allows a subsequent, technologically interconnected process to be planned and organized. This quantitative evaluation is used as the basis for calculating the optimal range over which processes and activities can be combined. The calculation method is based on graph theory. The authors applied a generic characterization parameter to reveal the technological links between construction and installation processes, and the proposed technique has adaptive properties that are key to its wide use in forming organizational decisions. The practical significance of the developed technique is also described.

  5. Multivariate analysis, mass balance techniques, and statistical tests as tools in igneous petrology: application to the Sierra de las Cruces volcanic range (Mexican Volcanic Belt).

    PubMed

    Velasco-Tapia, Fernando

    2014-01-01

    Magmatic processes have usually been identified and evaluated using qualitative or semiquantitative geochemical or isotopic tools based on a restricted number of variables. However, a more complete and quantitative view can be reached by applying multivariate analysis, mass balance techniques, and statistical tests. As an example, in this work a statistical and quantitative scheme is applied to analyze the geochemical features of the Sierra de las Cruces (SC) volcanic range (Mexican Volcanic Belt). In this locality, the volcanic activity (3.7 to 0.5 Ma) was dominantly dacitic, but the presence of spheroidal andesitic enclaves and/or diverse disequilibrium features in the majority of lavas confirms the operation of magma mixing/mingling. New discriminant-function-based multidimensional diagrams were used to discriminate the tectonic setting. Statistical tests of discordancy and significance were applied to evaluate the influence of the subducting Cocos plate, which seems to be rather negligible for the SC magmas with respect to several major and trace elements. A cluster analysis following Ward's linkage rule was carried out to classify the SC volcanic rocks into geochemical groups. Finally, two mass-balance schemes were applied for the quantitative evaluation of the proportions of the end-member components (dacitic and andesitic magmas) in the commingled lavas (binary mixtures).
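
    The two-end-member mass balance underlying such schemes is compact (illustrative notation): each element i in a commingled lava is modelled as a mixture of the dacitic and andesitic end members, and the mixing proportion x is estimated by least squares over all elements:

        C_i^{\mathrm{mix}} = x\, C_i^{\mathrm{dac}} + (1 - x)\, C_i^{\mathrm{and}}, \qquad \hat{x} = \frac{\sum_i (C_i^{\mathrm{mix}} - C_i^{\mathrm{and}})(C_i^{\mathrm{dac}} - C_i^{\mathrm{and}})}{\sum_i (C_i^{\mathrm{dac}} - C_i^{\mathrm{and}})^{2}}.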

  6. [Evaluation by case managers dementia : An explorative practice based study on types and content].

    PubMed

    Ketelaar, Nicole A B M; Jukema, Jan S; van Bemmel, Marlies; Adriaansen, Marian J M; Smits, Carolien H M

    2017-06-01

    This practice-based explorative study aims to provide insight into the ways in which case managers shape and fill the evaluation phase of their support of the informal care network of persons with dementia. A combination of quantitative and qualitative research methods was used. A group of 57 case managers of persons with dementia in three different organisational networks took part in this study. Results from the quantitative and qualitative data are organized into four themes: (1) attitude towards evaluation, (2) forms of evaluation, (3) implementation of evaluation and (4) content of evaluation. There are different ways of shaping evaluation and its content. The importance of interim and final evaluation is recognized, but is difficult to realize in a methodical way. Barriers experienced by the case managers include various factors associated with both clients and professionals. Case managers evaluate continuously and informally to assess whether the extent of their assistance is meeting the needs of the client and the informal network. Case managers do not use systematic evaluation to measure the quality of the care they offer to persons with dementia and their caregivers. The findings call for a discussion, at the level of clients as well as at the professional and societal levels, about the way case managers should evaluate their support.

  7. The effects of AVIRIS atmospheric calibration methodology on identification and quantitative mapping of surface mineralogy, Drum Mountains, Utah

    NASA Technical Reports Server (NTRS)

    Kruse, Fred A.; Dwyer, John L.

    1993-01-01

    The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) measures reflected light in 224 contiguous spectral bands in the 0.4 to 2.45 micron region of the electromagnetic spectrum. Numerous studies have used these data for mineralogic identification and mapping based on the presence of diagnostic spectral features. Quantitative mapping requires conversion of the AVIRIS data to physical units (usually reflectance) so that analysis results can be compared and validated with field and laboratory measurements. This study evaluated two different techniques for calibrating AVIRIS data to ground reflectance, an empirically based method and an atmospheric-model-based method, to determine their effects on quantitative scientific analyses. Expert system analysis and linear spectral unmixing were applied to both calibrated data sets to determine the effect of the calibration on the mineral identification and quantitative mapping results. Comparison of the image-map results and image reflectance spectra indicates that the model-calibrated data can be used with automated mapping techniques to produce accurate maps showing the spatial distribution and abundance of surface mineralogy. This has positive implications for future operational mapping using AVIRIS or similar imaging spectrometer data sets without requiring a priori knowledge.
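
    One flavor of the empirically based approach, the empirical line method, can be sketched as follows (Python; the data layout is hypothetical): per-band gain and offset mapping radiance to reflectance are fit from field targets of known reflectance.

    import numpy as np

    def empirical_line(target_radiance, target_reflectance):
        """target_radiance, target_reflectance: (n_targets, n_bands) arrays.
        Returns per-band gain and offset such that
        reflectance ~= gain * radiance + offset."""
        n_bands = target_radiance.shape[1]
        gain = np.empty(n_bands)
        offset = np.empty(n_bands)
        for b in range(n_bands):
            gain[b], offset[b] = np.polyfit(target_radiance[:, b],
                                            target_reflectance[:, b], 1)
        return gain, offset

    # Applied per pixel: reflectance_cube = gain * radiance_cube + offset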

  8. Histopathological image analysis of chemical-induced hepatocellular hypertrophy in mice.

    PubMed

    Asaoka, Yoshiji; Togashi, Yuko; Mutsuga, Mayu; Imura, Naoko; Miyoshi, Tomoya; Miyamoto, Yohei

    2016-04-01

    Chemical-induced hepatocellular hypertrophy is frequently observed in rodents, and is mostly caused by the induction of phase I and phase II drug metabolic enzymes and peroxisomal lipid metabolic enzymes. Liver weight is a sensitive and commonly used marker for detecting hepatocellular hypertrophy, but is also increased by a number of other factors. Histopathological observations subjectively detect changes such as hepatocellular hypertrophy based on the size of a hepatocyte. Therefore, quantitative microscopic observations are required to evaluate histopathological alterations objectively. In the present study, we developed a novel quantitative method for image analysis of hepatocellular hypertrophy using liver sections stained with hematoxylin and eosin, and demonstrated its usefulness for evaluating hepatocellular hypertrophy induced by phenobarbital (a phase I and phase II enzyme inducer) and clofibrate (a peroxisomal enzyme inducer) in mice. The algorithm of this image analysis was designed to recognize an individual hepatocyte through a combination of pixel-based and object-based analyses. Hepatocellular nuclei and the surrounding non-hepatocellular cells were recognized by the pixel-based analysis, and the areas of the recognized hepatocellular nuclei were then expanded by the object-based analysis until they ran against their expanding neighboring hepatocytes and the surrounding non-hepatocellular cells. The expanded area of each hepatocellular nucleus was regarded as the size of an individual hepatocyte. The results of this image analysis showed that changes in the sizes of hepatocytes corresponded with histopathological observations in phenobarbital- and clofibrate-treated mice, and revealed a correlation between hepatocyte size and liver weight. In conclusion, our novel image analysis method is very useful for quantitative evaluations of chemical-induced hepatocellular hypertrophy. Copyright © 2015 Elsevier GmbH. All rights reserved.
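
    The two-stage segmentation described above (pixel-based nucleus detection, then object-based expansion until neighbors meet) can be approximated with a marker-based watershed; the sketch below is an illustrative stand-in, not the authors' implementation:

    ```python
    import numpy as np
    from scipy import ndimage as ndi
    from skimage.segmentation import watershed

    # Toy stand-in for the pixel-based step: a binary mask of hepatocellular
    # nuclei and a mask of non-hepatocellular regions to exclude.
    nuclei = np.zeros((100, 100), bool)
    nuclei[20:24, 20:24] = nuclei[60:64, 70:74] = True
    excluded = np.zeros_like(nuclei)
    excluded[40:45, 40:45] = True

    # Object-based step: expand each labeled nucleus until it meets its
    # expanding neighbors, mimicked here by a watershed on the distance
    # transform (an assumption, not the paper's exact region-growing code).
    markers, _ = ndi.label(nuclei)
    distance = ndi.distance_transform_edt(~nuclei)
    cells = watershed(distance, markers, mask=~excluded)
    sizes = np.bincount(cells.ravel())[1:]   # area per "hepatocyte", in pixels
    print(sizes)
    ```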

  9. ANTONIA perfusion and stroke. A software tool for the multi-purpose analysis of MR perfusion-weighted datasets and quantitative ischemic stroke assessment.

    PubMed

    Forkert, N D; Cheng, B; Kemmling, A; Thomalla, G; Fiehler, J

    2014-01-01

    The objective of this work is to present the software tool ANTONIA, which has been developed to facilitate a quantitative analysis of perfusion-weighted MRI (PWI) datasets in general, as well as the subsequent multi-parametric analysis of additional datasets for the specific purpose of evaluating acute ischemic stroke patient datasets. Three different methods for the analysis of DSC or DCE PWI datasets are currently implemented in ANTONIA, which can be selected case-specifically based on the study protocol. These methods comprise a curve fitting method as well as a deconvolution-based and a deconvolution-free method integrating a previously defined arterial input function. The perfusion analysis is extended for the purpose of acute ischemic stroke analysis by additional methods that enable an automatic atlas-based selection of the arterial input function, an analysis of the perfusion-diffusion and DWI-FLAIR mismatch, as well as segmentation-based volumetric analyses. For reliability evaluation, the described software tool was used by two observers for quantitative analysis of 15 datasets from acute ischemic stroke patients to extract the acute lesion core volume, FLAIR ratio, perfusion-diffusion mismatch volume with manually as well as automatically selected arterial input functions, and follow-up lesion volume. The results of this evaluation revealed that the described software tool leads to highly reproducible results for all parameters if the automatic arterial input function selection method is used. Due to the broad selection of processing methods that are available in the software tool, ANTONIA is especially helpful to support image-based perfusion and acute ischemic stroke research projects.
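
    Deconvolution-based perfusion analysis of the kind mentioned above is commonly implemented as a truncated-SVD inversion of the arterial-input-function convolution matrix. A toy sketch of that standard approach, with synthetic curves rather than ANTONIA's code:

    ```python
    import numpy as np
    from scipy.linalg import toeplitz

    dt = 1.0                                    # seconds per sample
    t = np.arange(0, 60, dt)
    aif = np.exp(-((t - 10) ** 2) / 20)         # synthetic arterial input function
    resid = np.exp(-t / 4.0)                    # true tissue residue function
    A = dt * toeplitz(aif, np.zeros_like(aif))  # discrete convolution matrix
    tissue = A @ (0.6 * resid)                  # CBF-scaled tissue curve

    # Truncated SVD regularizes the ill-posed deconvolution; the 10%
    # singular-value threshold is a typical but assumed choice.
    U, s, Vt = np.linalg.svd(A)
    s_inv = np.where(s > 0.1 * s.max(), 1.0 / s, 0.0)
    k = Vt.T @ (s_inv * (U.T @ tissue))
    print(f"recovered CBF-scaled peak: {k.max():.2f}")  # near 0.6; truncation biases it slightly
    ```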

  10. Evaluation of an interactive, case-based review session in teaching medical microbiology.

    PubMed

    Blewett, Earl L; Kisamore, Jennifer L

    2009-08-27

    Oklahoma State University-Center for Health Sciences (OSU-CHS) has replaced its microbiology wet laboratory with a variety of tutorials including a case-based interactive session called Microbial Jeopardy!. The question remains whether the time spent by students and faculty on the interactive case-based tutorial is worthwhile. This study was designed to address this question by both analyzing student performance data and assessing students' perceptions regarding the tutorial. Both quantitative and qualitative data were used in the current study. Part One of the study involved assessing student performance using archival records of seven case-based exam questions used in the 2004, 2005, 2006, and 2007 OSU-CHS Medical Microbiology course. Two-sample t-tests for proportions were used to test for significant differences related to tutorial usage. Part Two used both quantitative and qualitative means to assess students' perceptions of the Microbial Jeopardy! session. First, a retrospective survey was administered to students who were enrolled in Medical Microbiology in 2006 or 2007. Second, responses to open-ended items from the 2008 course evaluations were reviewed for comments regarding the Microbial Jeopardy! session. Both student performance and student perception data support continued use of the tutorials. Quantitative and qualitative data converge to suggest that students like and learn from the interactive, case-based session. The case-based tutorial appears to improve student performance on case-based exam questions. Additionally, students perceived the tutorial as helpful in preparing for exam questions and reviewing the course material. The time commitment for use of the case-based tutorial appears to be justified.
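
    The two-sample test for proportions used here compares pass rates between cohorts with and without the tutorial. A minimal sketch with invented counts, since the archival numbers are not given in the abstract:

    ```python
    from statsmodels.stats.proportion import proportions_ztest

    # Hypothetical counts: students answering a case-based exam question
    # correctly, with vs. without the Microbial Jeopardy! tutorial.
    correct = [78, 61]      # successes in each cohort (assumed numbers)
    enrolled = [95, 92]     # cohort sizes (assumed numbers)

    stat, pvalue = proportions_ztest(correct, enrolled)
    print(f"z = {stat:.2f}, p = {pvalue:.4f}")
    ```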

  11. Evaluation of an interactive, case-based review session in teaching medical microbiology

    PubMed Central

    Blewett, Earl L; Kisamore, Jennifer L

    2009-01-01

    Background Oklahoma State University-Center for Health Sciences (OSU-CHS) has replaced its microbiology wet laboratory with a variety of tutorials including a case-based interactive session called Microbial Jeopardy!. The question remains whether the time spent by students and faculty on the interactive case-based tutorial is worthwhile. This study was designed to address this question by both analyzing student performance data and assessing students' perceptions regarding the tutorial. Methods Both quantitative and qualitative data were used in the current study. Part One of the study involved assessing student performance using archival records of seven case-based exam questions used in the 2004, 2005, 2006, and 2007 OSU-CHS Medical Microbiology course. Two-sample t-tests for proportions were used to test for significant differences related to tutorial usage. Part Two used both quantitative and qualitative means to assess students' perceptions of the Microbial Jeopardy! session. First, a retrospective survey was administered to students who were enrolled in Medical Microbiology in 2006 or 2007. Second, responses to open-ended items from the 2008 course evaluations were reviewed for comments regarding the Microbial Jeopardy! session. Results Both student performance and student perception data support continued use of the tutorials. Quantitative and qualitative data converge to suggest that students like and learn from the interactive, case-based session. Conclusion The case-based tutorial appears to improve student performance on case-based exam questions. Additionally, students perceived the tutorial as helpful in preparing for exam questions and reviewing the course material. The time commitment for use of the case-based tutorial appears to be justified. PMID:19712473

  12. The Use of Multi-Criteria Evaluation and Network Analysis in the Area Development Planning Process

    DTIC Science & Technology

    2013-03-01

    The purpose of this research was to develop improvements to the area development planning process. These plans are used to improve operations within an installation sub-section by altering the physical layout of facilities. One methodology was developed based on applying network analysis concepts to … layouts. The alternative layout scoring process, based on multi-criteria evaluation, returns a quantitative score for each alternative layout and a…

  13. Evaluation of quantitative image analysis criteria for the high-resolution microendoscopic detection of neoplasia in Barrett's esophagus

    NASA Astrophysics Data System (ADS)

    Muldoon, Timothy J.; Thekkek, Nadhi; Roblyer, Darren; Maru, Dipen; Harpaz, Noam; Potack, Jonathan; Anandasabapathy, Sharmila; Richards-Kortum, Rebecca

    2010-03-01

    Early detection of neoplasia in patients with Barrett's esophagus is essential to improve outcomes. The aim of this ex vivo study was to evaluate the ability of high-resolution microendoscopic imaging and quantitative image analysis to identify neoplastic lesions in patients with Barrett's esophagus. Nine patients with pathologically confirmed Barrett's esophagus underwent endoscopic examination with biopsies or endoscopic mucosal resection. Resected fresh tissue was imaged with fiber bundle microendoscopy; images were analyzed by visual interpretation or by quantitative image analysis to predict whether the imaged sites were non-neoplastic or neoplastic. The best-performing pair of quantitative features was chosen based on its ability to correctly classify the data into the two groups. Predictions were compared to the gold standard of histopathology. Subjective analysis of the images by expert clinicians achieved average sensitivity and specificity of 87% and 61%, respectively. The best-performing quantitative classification algorithm relied on two image textural features and achieved a sensitivity and specificity of 87% and 85%, respectively. This ex vivo pilot trial demonstrates that quantitative analysis of images obtained with a simple microendoscope system can distinguish neoplasia in Barrett's esophagus with good sensitivity and specificity when compared to histopathology and to subjective image interpretation.
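
    Selecting the best-performing feature pair by classification accuracy can be done by exhaustive search over pairs. The sketch below uses linear discriminant analysis and synthetic features, since the abstract does not name the classifier or features beyond "two image textural features":

    ```python
    import numpy as np
    from itertools import combinations
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_predict

    # Hypothetical texture-feature matrix (sites x features) and labels
    # (0 = non-neoplastic, 1 = neoplastic).
    rng = np.random.default_rng(1)
    X = rng.normal(size=(40, 6))
    y = (X[:, 0] + 0.8 * X[:, 3] + rng.normal(0, 0.5, 40) > 0).astype(int)

    # Exhaustively search feature pairs for the best cross-validated
    # accuracy, mirroring the paper's "best-performing pair" selection.
    best = max(
        combinations(range(X.shape[1]), 2),
        key=lambda pair: (cross_val_predict(LinearDiscriminantAnalysis(),
                                            X[:, list(pair)], y, cv=5) == y).mean(),
    )
    print("best feature pair:", best)
    ```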

  14. A Quantitative Investigation of Stakeholder Variation in Training Program Evaluation.

    ERIC Educational Resources Information Center

    Michalski, Greg V.

    A survey was conducted to investigate variation in stakeholder perceptions of training results and evaluation within the context of a high-technology product development firm (the case organization). A scannable questionnaire survey booklet was developed and scanned data were exported and analyzed. Based on an achieved sample of 280 (70% response…

  15. Innovations in Doctoral Education: Distance Education Methodology Applied

    ERIC Educational Resources Information Center

    Bettmann, Joanna; Thompson, Kimberly; Padykula, Nora; Berzoff, Joan

    2009-01-01

    This study evaluated the impact of a distance education program to meet the practice learning needs of first-year doctoral students. The program, a six-session case-based telephonic seminar, was taught to 19 first-year doctoral students. Evaluation of the program included self-report quantitative and qualitative data gathered pre- and postseminar,…

  16. DEVELOPMENT OF CRITERIA AND METHODS FOR EVALUATING TRAINER AIRCRAFT EFFECTIVENESS.

    ERIC Educational Resources Information Center

    KUSEWITT, J.B.

    The purpose of this study was to develop a method for determining objective measures of trainer aircraft effectiveness to evaluate program alternatives for training pilots for fleet fighter and attack-type aircraft. The training syllabus was based on average student ability. The basic problem was to establish quantitative time-difficulty…

  17. Methodology for Evaluating an Adaptation of Evidence-Based Drug Abuse Prevention in Alternative Schools

    ERIC Educational Resources Information Center

    Hopson, Laura M.; Steiker, Lori K. H.

    2008-01-01

    The purpose of this article is to set forth an innovative methodological protocol for culturally grounding interventions with high-risk youths in alternative schools. This study used mixed methods to evaluate original and adapted versions of a culturally grounded substance abuse prevention program. The qualitative and quantitative methods…

  18. An Evaluation of a Shared Leadership Training Program

    ERIC Educational Resources Information Center

    Allen, Lavonda Ann

    2010-01-01

    The purpose of the quantitative quasi-experimental equivalent time series study was to evaluate the effectiveness of a work-based shared leadership training program in a hospital with fewer than 200 beds in the rural south-central United States. Nursing shortages and the current emphasis on quality are factors that make recruitment and retention of nurses…

  19. Evaluation of Residential Consumers Knowledge of Wireless Network Security and Its Correlation with Identity Theft

    ERIC Educational Resources Information Center

    Kpaduwa, Fidelis Iheanyi

    2010-01-01

    This quantitative correlational study evaluated residential consumers' knowledge of wireless network security and its relationship with identity theft. Data analysis was based on a sample of 254 randomly selected students. All the study participants completed a survey questionnaire designed to measure their knowledge of…

  20. A GIS-BASED METHOD FOR MULTI-OBJECTIVE EVALUATION OF PARK VEGETATION. (R824766)

    EPA Science Inventory

    Abstract

    In this paper we describe a method for evaluating the concordance between a set of mapped landscape attributes and a set of quantitatively expressed management priorities. The method has proved to be useful in planning urban green areas, allowing objectively d...

  1. Dominant region: a basic feature for group motion analysis and its application to teamwork evaluation in soccer games

    NASA Astrophysics Data System (ADS)

    Taki, Tsuyoshi; Hasegawa, Jun-ichi

    1998-12-01

    This paper proposes a basic feature for quantitative measurement and evaluation of the group behavior of persons. This feature, called the 'dominant region', is a kind of sphere of influence for each person in the group. The dominant region is defined as the region where a person can arrive earlier than any other person, and can be formulated as a Voronoi region modified by replacing the distance function with a time function. This time function is calculated from a computational model of the person's moving ability. As an application of the dominant region, we present a motion analysis system for soccer games. The purpose of this system is to evaluate teamwork quantitatively based on the movement of all the players in the game. Experiments using motion pictures of actual games suggest that the proposed feature is useful for measurement and evaluation of group behavior in team sports. This basic feature may be applied to other team ball games, such as American football, basketball, handball and water polo.
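
    A minimal sketch of the dominant-region idea on a grid: each pitch cell is assigned to the player who reaches it first. Arrival time here is simply distance over an assumed top speed; the paper's model of moving ability is richer (it accounts for current motion):

    ```python
    import numpy as np

    # Pitch grid (105 m x 68 m) and three players with assumed positions
    # and top speeds.
    pitch_x, pitch_y = np.meshgrid(np.linspace(0, 105, 210),
                                   np.linspace(0, 68, 136))
    players = np.array([[30.0, 30.0], [60.0, 40.0], [80.0, 20.0]])
    speeds = np.array([7.5, 8.0, 7.0])   # m/s, assumed values

    # Time-to-arrive per player per cell; the cell's owner is the fastest.
    times = np.stack([
        np.hypot(pitch_x - px, pitch_y - py) / v
        for (px, py), v in zip(players, speeds)
    ])
    dominant = times.argmin(axis=0)
    areas = np.bincount(dominant.ravel()) * (105 / 210) * (68 / 136)
    print(np.round(areas, 1))   # approximate dominant area per player, m^2
    ```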

  2. Semi-quantitative methods yield greater inter- and intraobserver agreement than subjective methods for interpreting 99m technetium-hydroxymethylene-diphosphonate uptake in equine thoracic processi spinosi.

    PubMed

    van Zadelhoff, Claudia; Ehrle, Anna; Merle, Roswitha; Jahn, Werner; Lischer, Christoph

    2018-05-09

    Scintigraphy is a standard diagnostic method for evaluating horses with back pain due to suspected thoracic processus spinosus pathology. Lesion detection is based on subjective or semi-quantitative assessments of increased uptake. This retrospective, analytical study aimed to compare semi-quantitative and subjective methods in the evaluation of scintigraphic images of the processi spinosi in the equine thoracic spine. Scintigraphic images of 20 Warmblood horses, presented for assessment of orthopedic conditions between 2014 and 2016, were included in the study. Randomized, blinded image evaluation was performed by 11 veterinarians using subjective and semi-quantitative methods. Subjective grading was performed for the analysis of red-green-blue and grayscale scintigraphic images, which were presented in full size or as masked images. For the semi-quantitative assessment, observers placed regions of interest over each processus spinosus. The uptake ratio of each processus spinosus in comparison to a reference region of interest was determined. Subsequently, a modified semi-quantitative calculation was developed whereby only the highest counts per pixel within a specified number of pixels were processed. Inter- and intraobserver agreement was calculated using intraclass correlation coefficients. Inter- and intraobserver intraclass correlation coefficients were 41.65% and 71.39%, respectively, for the subjective image assessment. Additionally, a correlation between intraobserver agreement, experience, and grayscale images was identified. The inter- and intraobserver agreement was significantly increased when using semi-quantitative analysis (97.35% and 98.36%, respectively) or the modified semi-quantitative calculation (98.61% and 98.82%, respectively). The proposed modified semi-quantitative technique showed a higher inter- and intraobserver agreement when compared to other methods, which makes it a useful tool for the analysis of scintigraphic images. The association of the findings from this study with clinical and radiological examinations requires further investigation. © 2018 American College of Veterinary Radiology.
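
    Both the ordinary and the modified semi-quantitative calculations reduce to a ratio of region statistics; in the modified variant only the hottest pixels contribute. A sketch with simulated count data (the pixel count n is an assumption):

    ```python
    import numpy as np

    def uptake_ratio(roi, reference, n=None):
        """Mean ROI counts over mean reference counts; if n is given, use
        only the n hottest pixels of each region (modified calculation)."""
        roi, reference = np.ravel(roi), np.ravel(reference)
        if n is not None:
            roi = np.sort(roi)[-n:]
            reference = np.sort(reference)[-n:]
        return roi.mean() / reference.mean()

    rng = np.random.default_rng(2)
    roi = rng.poisson(120, size=(8, 8))   # processus spinosus ROI (simulated)
    ref = rng.poisson(80, size=(8, 8))    # reference ROI (simulated)
    print(uptake_ratio(roi, ref), uptake_ratio(roi, ref, n=10))
    ```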

  3. Safe Passage Data Analysis: Interim Report

    DOT National Transportation Integrated Search

    1993-04-01

    The purpose of this report is to describe quantitatively the costs and benefits of screener : proficiency evaluation and reporting systems (SPEARS) equipment, particularly computer-based : instruction (CBI) systems, compared to current methods of tra...

  4. Does team training work? Principles for health care.

    PubMed

    Salas, Eduardo; DiazGranados, Deborah; Weaver, Sallie J; King, Heidi

    2008-11-01

    Teamwork is integral to a working environment conducive to patient safety and care. Team training is one methodology designed to equip team members with the competencies necessary for optimizing teamwork. There is evidence of team training's effectiveness in highly complex and dynamic work environments, such as aviation and health care. However, most quantitative evaluations of training do not offer any insight into the actual reasons why, how, and when team training is effective. To address this gap in understanding, and to provide guidance for members of the health care community interested in implementing team training programs, this article presents both quantitative results and a specific qualitative review and content analysis of team training implemented in health care. Based on this review, we offer eight evidence-based principles for effective planning, implementation, and evaluation of team training programs specific to health care.

  5. Qualitative and Quantitative Imaging Evaluation of Renal Cell Carcinoma Subtypes with Grating-based X-ray Phase-contrast CT

    NASA Astrophysics Data System (ADS)

    Braunagel, Margarita; Birnbacher, Lorenz; Willner, Marian; Marschner, Mathias; De Marco, Fabio; Viermetz, Manuel; Notohamiprodjo, Susan; Hellbach, Katharina; Auweter, Sigrid; Link, Vera; Woischke, Christine; Reiser, Maximilian F.; Pfeiffer, Franz; Notohamiprodjo, Mike; Herzen, Julia

    2017-03-01

    Current clinical imaging methods face limitations in the detection and correct characterization of different subtypes of renal cell carcinoma (RCC), while these are important for therapy and prognosis. The present study evaluates the potential of grating-based X-ray phase-contrast computed tomography (gbPC-CT) for visualization and characterization of human RCC subtypes. The imaging results for 23 ex vivo formalin-fixed human kidney specimens obtained with phase-contrast CT were compared to the results of the absorption-based CT (gbCT), clinical CT and a 3T MRI and validated using histology. Regions of interest were placed on each specimen for quantitative evaluation. Qualitative and quantitative gbPC-CT imaging could significantly discriminate between normal kidney cortex (54 ± 4 HUp) and clear cell (42 ± 10), papillary (43 ± 6) and chromophobe RCCs (39 ± 7), p < 0.05 respectively. The sensitivity for detection of tumor areas was 100%, 50% and 40% for gbPC-CT, gbCT and clinical CT, respectively. RCC architecture like fibrous strands, pseudocapsules, necrosis or hyalinization was depicted clearly in gbPC-CT and was not equally well visualized in gbCT, clinical CT and MRI. The results show that gbPC-CT enables improved discrimination of normal kidney parenchyma and tumorous tissues as well as different soft-tissue components of RCCs without the use of contrast media.

  6. Practical Framework for an Electron Beam Induced Current Technique Based on a Numerical Optimization Approach

    NASA Astrophysics Data System (ADS)

    Yamaguchi, Hideshi; Soeda, Takeshi

    2015-03-01

    A practical framework for an electron beam induced current (EBIC) technique has been established for conductive materials based on a numerical optimization approach. Although the conventional EBIC technique is useful for evaluating the distributions of dopants or crystal defects in semiconductor transistors, issues related to the reproducibility and quantitative capability of measurements using this technique persist. For instance, it is difficult to acquire high-quality EBIC images throughout continuous tests due to variation in operator skill or test environment. Recently, by evaluating EBIC equipment performance and numerically optimizing the equipment settings, consistent acquisition of high-contrast images has become possible, improving reproducibility and yield regardless of operator skill or test environment. The technique proposed herein is even more sensitive and quantitative than scanning probe microscopy, an imaging technique that can possibly damage the sample. The new technique is expected to benefit the electrical evaluation of fragile or soft materials along with LSI materials.

  7. Reflectance spectroscopy for evaluating hair follicle cycle

    NASA Astrophysics Data System (ADS)

    Liu, Caihua; Guan, Yue; Wang, Jianru; Zhu, Dan

    2014-02-01

    The hair follicle, a mini-organ perpetually cycling through telogen, anagen and catagen, provides a valuable experimental model for studying hair and organ regeneration. The transition of the hair follicle from telogen to anagen is a significant sign of successful regeneration. So far, discrimination of the hair follicle stage has mostly been based on canonical histological examination and empirical speculation from skin color; hardly any method has been proposed to evaluate the hair follicle stage quantitatively. In this work, a commercial optical fiber spectrometer was applied to monitor the diffuse reflectance of mouse skin over the hair follicle cycle, and the change in reflectance was obtained. Histological examination was used to verify the hair follicle stage. In comparison with the histological examination, the skin diffuse reflectance was relatively high for mice with telogen hair follicles; it decreased once hair follicles transitioned to the anagen stage and increased again at the catagen stage. This study provides a new method to quantitatively evaluate the hair follicle stage, and should be valuable for basic and therapeutic investigations of hair regeneration.

  8. Rational drug therapy education in clinical phase carried out by task-based learning

    PubMed Central

    Bilge, S. Sırrı; Akyüz, Bahar; Ağrı, Arzu Erdal; Özlem, Mıdık

    2017-01-01

    Objectives: Irrational drug use results in drug interactions, treatment noncompliance, and drug resistance. Rational pharmacotherapy education is being implemented in many faculties of medicine. Our aim is to introduce rational pharmacotherapy education delivered by clinicians and to evaluate task-based rational drug therapy education in the clinical context. Methods: Kirkpatrick's evaluation model was used for the evaluation of the program. The participants evaluated the program in terms of its constituents, utilization, and contribution to learning. Voluntary participants responded to the evaluation forms after the educational program. Data were evaluated using both quantitative and qualitative tools. SPSS (version 21) was used to determine mean and standard deviation values for the quantitative data. A descriptive qualitative analysis approach was used for the analysis of open-ended questions. Results: The program and its components were viewed favorably. A total of 95.9% of the students considered the education to be beneficial. Simulated patient practice and personal drug choice/problem-based learning sessions were particularly appreciated by the students. In total, 93.9% of the students stated that all students of medicine should undergo this educational program. Among the five presentations contained in the program, "The Principles of Prescribing" received the highest score (9 ± 1.00) from participating students in the general evaluation of the educational program. Conclusion: This study was carried out to improve task-based rational drug therapy education. According to feedback from the students concerning content, method, resources, assessment, and program design, some important changes, especially in the number of facilitators and indications, were made to the rational pharmacotherapy education in the clinical task-based learning program. PMID:28458432

  9. Cell adhesion monitoring of human induced pluripotent stem cell based on intrinsic molecular charges

    NASA Astrophysics Data System (ADS)

    Sugimoto, Haruyo; Sakata, Toshiya

    2014-01-01

    We have shown a simple way to perform real-time, quantitative, non-invasive, and label-free monitoring of human induced pluripotent stem (iPS) cell adhesion by use of a biologically coupled-gate field effect transistor (bio-FET), which is based on the detection of molecular charges at the cell membrane. The electrical behavior quantitatively revealed the contacts between integrin receptors at the cell membrane and RGDS peptide immobilized on the gate sensing surface, because the binding site is based on the cationic α chain of integrin. The bio-FET platform would provide substantial information for evaluating the cell/material bio-interface and elucidating the binding mechanism of adhesion molecules, which could not be interpreted by microscopic observation.

  10. Model of Values-Based Management Process in Schools: A Mixed Design Study

    ERIC Educational Resources Information Center

    Dogan, Soner

    2016-01-01

    The aim of this paper is to evaluate the school administrators' values-based management behaviours according to the teachers' perceptions and opinions and, accordingly, to build a model of values-based management process in schools. The study was conducted using explanatory design which is inclusive of both quantitative and qualitative methods.…

  11. Developing and Assessing E-Learning Techniques for Teaching Forecasting

    ERIC Educational Resources Information Center

    Gel, Yulia R.; O'Hara Hines, R. Jeanette; Chen, He; Noguchi, Kimihiro; Schoner, Vivian

    2014-01-01

    In the modern business environment, managers are increasingly required to perform decision making and evaluate related risks based on quantitative information in the face of uncertainty, which in turn increases demand for business professionals with sound skills and hands-on experience with statistical data analysis. Computer-based training…

  12. Towards evidence-based practice in medical training: making evaluations more meaningful.

    PubMed

    Drescher, Uta; Warren, Fiona; Norton, Kingsley

    2004-12-01

    The evaluation of training is problematic and the evidence base inconclusive. This situation may arise for 2 main reasons: training is not understood as a complex intervention and, related to this, the evaluation methods applied are often overly simplistic. This paper makes the case for construing training, especially in the field of specialist medical education, as a complex intervention. It also selectively reviews the available literature in order to match evaluative techniques with the demonstrated complexity. Construing training as a complex intervention can provide a framework for selecting the most appropriate methodology to evaluate a given training intervention and to appraise the evidence base for training fairly, choosing from among both quantitative and qualitative approaches and applying measurement at multiple levels of training impact.

  13. Use of mechanistic simulations as a quantitative risk-ranking tool within the quality by design framework.

    PubMed

    Stocker, Elena; Toschkoff, Gregor; Sacher, Stephan; Khinast, Johannes G

    2014-11-20

    The purpose of this study is to evaluate the use of computer simulations for generating quantitative knowledge as a basis for risk ranking and mechanistic process understanding, as required by ICH Q9 on quality risk management systems. In this specific publication, the main focus is the demonstration of a risk assessment workflow, including a computer simulation for the generation of mechanistic understanding of active tablet coating in a pan coater. Process parameter screening studies are statistically planned under consideration of impacts on a potentially critical quality attribute, i.e., coating mass uniformity. Based on computer simulation data the process failure mode and effects analysis of the risk factors is performed. This results in a quantitative criticality assessment of process parameters and the risk priority evaluation of failure modes. The factor for a quantitative reassessment of the criticality and risk priority is the coefficient of variation, which represents the coating mass uniformity. The major conclusion drawn from this work is a successful demonstration of the integration of computer simulation in the risk management workflow leading to an objective and quantitative risk assessment. Copyright © 2014. Published by Elsevier B.V.
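
    The reassessment factor named above, the coefficient of variation of the coating mass, is straightforward to compute from simulated per-tablet outcomes; the distribution below is invented in place of the paper's mechanistic pan-coater output:

    ```python
    import numpy as np

    # Simulated per-tablet coating mass (mg); a real workflow would take
    # these values from the mechanistic coating simulation.
    rng = np.random.default_rng(3)
    coating_mass = rng.normal(loc=5.0, scale=0.35, size=10_000)

    # Coefficient of variation: the quantitative criticality measure here.
    cv = coating_mass.std(ddof=1) / coating_mass.mean()
    print(f"coating mass CV: {cv:.1%}")   # lower CV = better uniformity
    ```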

  14. Sub-band denoising and spline curve fitting method for hemodynamic measurement in perfusion MRI

    NASA Astrophysics Data System (ADS)

    Lin, Hong-Dun; Huang, Hsiao-Ling; Hsu, Yuan-Yu; Chen, Chi-Chen; Chen, Ing-Yi; Wu, Liang-Chi; Liu, Ren-Shyan; Lin, Kang-Ping

    2003-05-01

    In clinical research, non-invasive MR perfusion imaging is capable of investigating brain perfusion phenomena via various hemodynamic measurements, such as cerebral blood volume (CBV), cerebral blood flow (CBF), and mean transit time (MTT). These hemodynamic parameters are useful in diagnosing brain disorders such as stroke, infarction and peri-infarct ischemia by further semi-quantitative analysis. However, the accuracy of quantitative analysis is usually limited by poor signal-to-noise image quality. In this paper, we propose a hemodynamic measurement method based upon sub-band denoising and spline curve fitting to improve image quality and thereby obtain better quantitative hemodynamic results. Ten sets of perfusion MRI data and corresponding PET images were used to validate the performance. For quantitative comparison, we evaluated the gray/white matter CBF ratio. The mean gray-to-white-matter CBF ratio from the semi-quantitative analysis was 2.10 +/- 0.34, and the ratio evaluated from perfusion MRI was comparable to that from PET, with less than 1% difference on average. Furthermore, the method features excellent noise reduction and boundary preservation in image processing, and a short hemodynamic measurement time.
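
    The spline-fitting step smooths a noisy contrast concentration-time curve before hemodynamic parameters are read off it. A minimal sketch using a generic smoothing spline (the bolus shape and smoothing factor are assumptions):

    ```python
    import numpy as np
    from scipy.interpolate import UnivariateSpline

    # Synthetic noisy concentration-time curve from a bolus passage.
    t = np.linspace(0, 60, 61)                        # seconds
    true_curve = 8 * (t / 12) ** 2 * np.exp(-t / 6)   # gamma-variate-like bolus
    noisy = true_curve + np.random.default_rng(4).normal(0, 0.4, t.size)

    # Smoothing factor set near n * sigma^2, a common heuristic (assumed).
    spline = UnivariateSpline(t, noisy, s=t.size * 0.4**2)
    fitted = spline(t)
    print(f"peak: {fitted.max():.2f} at t = {t[fitted.argmax()]:.0f} s")
    ```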

  15. Industry Software Trustworthiness Criterion Research Based on Business Trustworthiness

    NASA Astrophysics Data System (ADS)

    Zhang, Jin; Liu, Jun-fei; Jiao, Hai-xing; Shen, Yi; Liu, Shu-yuan

    To address the trustworthiness problem of industry software, an approach that constructs an industry software trustworthiness criterion around the business is proposed. Based on the triangle model of "trustworthy grade definition - trustworthy evidence model - trustworthy evaluation", business trustworthiness is embodied in each aspect of the triangle model for a specific industry software system, the power producing management system (PPMS). Business trustworthiness is the core of the constructed industry trustworthy software criterion. By fusing international standards with industry rules, the constructed criterion strengthens operability and reliability, and the quantitative evaluating method makes the evaluation results intuitive and comparable.

  16. The role of quantitative safety evaluation in regulatory decision making of drugs.

    PubMed

    Chakravarty, Aloka G; Izem, Rima; Keeton, Stephine; Kim, Clara Y; Levenson, Mark S; Soukup, Mat

    2016-01-01

    Evaluation of safety is a critical component of drug review at the US Food and Drug Administration (FDA). Statisticians are playing an increasingly visible role in quantitative safety evaluation and regulatory decision-making. This article reviews the history and the recent events relating to quantitative drug safety evaluation at the FDA. The article then focuses on five active areas of quantitative drug safety evaluation and the role Division of Biometrics VII (DBVII) plays in these areas, namely meta-analysis for safety evaluation, large safety outcome trials, post-marketing requirements (PMRs), the Sentinel Initiative, and the evaluation of risk from extended/long-acting opioids. This article will focus chiefly on developments related to quantitative drug safety evaluation and not on the many additional developments in drug safety in general.

  17. Quantitative Comparison of PET and Bremsstrahlung SPECT for Imaging the In Vivo Yttrium-90 Microsphere Distribution after Liver Radioembolization

    PubMed Central

    Elschot, Mattijs; Vermolen, Bart J.; Lam, Marnix G. E. H.; de Keizer, Bart; van den Bosch, Maurice A. A. J.; de Jong, Hugo W. A. M.

    2013-01-01

    Background After yttrium-90 (90Y) microsphere radioembolization (RE), evaluation of extrahepatic activity and liver dosimetry is typically performed on 90Y Bremsstrahlung SPECT images. Since these images demonstrate a low quantitative accuracy, 90Y PET has been suggested as an alternative. The aim of this study is to quantitatively compare SPECT and state-of-the-art PET on the ability to detect small accumulations of 90Y and on the accuracy of liver dosimetry. Methodology/Principal Findings SPECT/CT and PET/CT phantom data were acquired using several acquisition and reconstruction protocols, including resolution recovery and Time-Of-Flight (TOF) PET. Image contrast and noise were compared using a torso-shaped phantom containing six hot spheres of various sizes. The ability to detect extra- and intrahepatic accumulations of activity was tested by quantitative evaluation of the visibility and unique detectability of the phantom hot spheres. Image-based dose estimates of the phantom were compared to the true dose. For clinical illustration, the SPECT and PET-based estimated liver dose distributions of five RE patients were compared. At equal noise level, PET showed higher contrast recovery coefficients than SPECT. The highest contrast recovery coefficients were obtained with TOF PET reconstruction including resolution recovery. All six spheres were consistently visible on SPECT and PET images, but PET was able to uniquely detect smaller spheres than SPECT. TOF PET-based estimates of the dose in the phantom spheres were more accurate than SPECT-based dose estimates, with underestimations ranging from 45% (10-mm sphere) to 11% (37-mm sphere) for PET, and 75% to 58% for SPECT, respectively. The differences between TOF PET and SPECT dose-estimates were supported by the patient data. Conclusions/Significance In this study we quantitatively demonstrated that the image quality of state-of-the-art PET is superior over Bremsstrahlung SPECT for the assessment of the 90Y microsphere distribution after radioembolization. PMID:23405207
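
    Phantom hot-sphere performance of this kind is typically summarized with a contrast recovery coefficient (CRC); the standard definition is sketched below with invented counts, as the abstract does not give the raw values:

    ```python
    # Contrast recovery coefficient for a hot sphere in a warm background:
    # CRC = ((S/B)_measured - 1) / ((S/B)_true - 1). This is the common
    # NEMA-style definition, assumed rather than quoted from the paper.
    def crc(measured_sphere, measured_bkg, true_ratio):
        return (measured_sphere / measured_bkg - 1) / (true_ratio - 1)

    # Hypothetical mean counts and a 4:1 true sphere-to-background ratio.
    print(f"CRC = {crc(31_000, 10_000, true_ratio=4.0):.2f}")
    ```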

  18. Quantitative Ultrasound for Nondestructive Characterization of Engineered Tissues and Biomaterials

    PubMed Central

    Dalecki, Diane; Mercado, Karla P.; Hocking, Denise C.

    2015-01-01

    Non-invasive, non-destructive technologies for imaging and quantitatively monitoring the development of artificial tissues are critical for the advancement of tissue engineering. Current standard techniques for evaluating engineered tissues, including histology, biochemical assays and mechanical testing, are destructive approaches. Ultrasound is emerging as a valuable tool for imaging and quantitatively monitoring the properties of engineered tissues and biomaterials longitudinally during fabrication and post-implantation. Ultrasound techniques are rapid, non-invasive, non-destructive and can be easily integrated into sterile environments necessary for tissue engineering. Furthermore, high-frequency quantitative ultrasound techniques can enable volumetric characterization of the structural, biological, and mechanical properties of engineered tissues during fabrication and post-implantation. This review provides an overview of ultrasound imaging, quantitative ultrasound techniques, and elastography, with representative examples of applications of these ultrasound-based techniques to the field of tissue engineering. PMID:26581347

  19. Wavelet-Based Visible and Infrared Image Fusion: A Comparative Study

    PubMed Central

    Sappa, Angel D.; Carvajal, Juan A.; Aguilera, Cristhian A.; Oliveira, Miguel; Romero, Dennis; Vintimilla, Boris X.

    2016-01-01

    This paper evaluates different wavelet-based cross-spectral image fusion strategies adopted to merge visible and infrared images. The objective is to find the best setup independently of the evaluation metric used to measure the performance. Quantitative performance results are obtained with state-of-the-art approaches together with adaptations proposed in the current work. The options evaluated in the current work result from the combination of different setups in the wavelet image decomposition stage together with different fusion strategies for the final merging stage that generates the resulting representation. Most approaches evaluate results according to the application for which they are intended. Sometimes a human observer is selected to judge the quality of the obtained results. In the current work, quantitative values are considered in order to find correlations between setups and the performance of the obtained results; these correlations can be used to define a criterion for selecting the best fusion strategy for a given pair of cross-spectral images. The whole procedure is evaluated with a large set of correctly registered visible and infrared image pairs, including both Near InfraRed (NIR) and Long Wave InfraRed (LWIR). PMID:27294938
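
    One concrete instance of the setups compared here: decompose both images with a discrete wavelet transform, average the approximation coefficients, and keep the max-magnitude detail coefficients. This is a single common strategy, not the paper's best-found setup:

    ```python
    import numpy as np
    import pywt

    def fuse(visible, infrared, wavelet="db2", level=2):
        """Fuse two registered grayscale images in the wavelet domain."""
        cv = pywt.wavedec2(visible, wavelet, level=level)
        ci = pywt.wavedec2(infrared, wavelet, level=level)
        fused = [(cv[0] + ci[0]) / 2.0]                 # approximation: mean
        for dv, di in zip(cv[1:], ci[1:]):              # details: max-abs
            fused.append(tuple(np.where(np.abs(a) >= np.abs(b), a, b)
                               for a, b in zip(dv, di)))
        return pywt.waverec2(fused, wavelet)

    rng = np.random.default_rng(5)
    vis, nir = rng.random((128, 128)), rng.random((128, 128))
    print(fuse(vis, nir).shape)   # (128, 128)
    ```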

  20. Wavelet-Based Visible and Infrared Image Fusion: A Comparative Study.

    PubMed

    Sappa, Angel D; Carvajal, Juan A; Aguilera, Cristhian A; Oliveira, Miguel; Romero, Dennis; Vintimilla, Boris X

    2016-06-10

    This paper evaluates different wavelet-based cross-spectral image fusion strategies adopted to merge visible and infrared images. The objective is to find the best setup independently of the evaluation metric used to measure the performance. Quantitative performance results are obtained with state-of-the-art approaches together with adaptations proposed in the current work. The options evaluated in the current work result from the combination of different setups in the wavelet image decomposition stage together with different fusion strategies for the final merging stage that generates the resulting representation. Most approaches evaluate results according to the application for which they are intended. Sometimes a human observer is selected to judge the quality of the obtained results. In the current work, quantitative values are considered in order to find correlations between setups and the performance of the obtained results; these correlations can be used to define a criterion for selecting the best fusion strategy for a given pair of cross-spectral images. The whole procedure is evaluated with a large set of correctly registered visible and infrared image pairs, including both Near InfraRed (NIR) and Long Wave InfraRed (LWIR).

  1. Towards standardized assessment of endoscope optical performance: geometric distortion

    NASA Astrophysics Data System (ADS)

    Wang, Quanzeng; Desai, Viraj N.; Ngo, Ying Z.; Cheng, Wei-Chung; Pfefer, Joshua

    2013-12-01

    Technological advances in endoscopes, such as capsule, ultrathin and disposable devices, promise significant improvements in safety, clinical effectiveness and patient acceptance. Unfortunately, the industry lacks test methods for preclinical evaluation of key optical performance characteristics (OPCs) of endoscopic devices that are quantitative, objective and well-validated. As a result, it is difficult for researchers and developers to compare image quality and evaluate equivalence to, or improvement upon, prior technologies. While endoscope OPCs include resolution, field of view, and depth of field, among others, our focus in this paper is geometric image distortion. We reviewed specific test methods for distortion and then developed an objective, quantitative test method based on well-defined experimental and data processing steps to evaluate radial distortion in the full field of view of an endoscopic imaging system. Our measurements and analyses showed that a second-degree polynomial equation could well describe the radial distortion curve of a traditional endoscope. The distortion evaluation method was effective for correcting the image and can be used to explain other widely accepted evaluation methods such as picture height distortion. Development of consensus standards based on promising test methods for image quality assessment, such as the method studied here, will facilitate clinical implementation of innovative endoscopic devices.
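
    The paper's central quantitative finding, that a second-degree polynomial describes the radial distortion curve, can be reproduced by fitting observed versus true target radii; the measurements below are synthetic:

    ```python
    import numpy as np

    # Synthetic grid-target measurements: normalized true radius vs. the
    # (barrel-distorted) radius observed through the endoscope.
    r_true = np.linspace(0, 1.0, 11)
    r_obs = r_true * (1 - 0.18 * r_true)   # assumed distortion profile

    # Second-degree polynomial fit of the radial distortion curve.
    a, b, c = np.polyfit(r_true, r_obs, deg=2)
    print(f"r_obs ≈ {a:.3f} r^2 + {b:.3f} r + {c:.3f}")

    # Local radial distortion (percent) at the edge of the field of view.
    edge = 100 * (r_obs[-1] - r_true[-1]) / r_true[-1]
    print(f"radial distortion at full field: {edge:.0f}%")
    ```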

  2. Global scaling for semi-quantitative analysis in FP-CIT SPECT.

    PubMed

    Kupitz, D; Apostolova, I; Lange, C; Ulrich, G; Amthauer, H; Brenner, W; Buchert, R

    2014-01-01

    Semi-quantitative characterization of dopamine transporter availability from single photon emission computed tomography (SPECT) with 123I-ioflupane (FP-CIT) is based on uptake ratios relative to a reference region. The aim of this study was to evaluate the whole brain as reference region for semi-quantitative analysis of FP-CIT SPECT. The rationale was that this might reduce statistical noise associated with the estimation of non-displaceable FP-CIT uptake. 150 FP-CIT SPECTs were categorized as neurodegenerative or non-neurodegenerative by an expert. Semi-quantitative analysis of specific binding ratios (SBR) was performed with a custom-made tool based on the Statistical Parametric Mapping software package using predefined regions of interest (ROIs) in the anatomical space of the Montreal Neurological Institute. The following reference regions were compared: predefined ROIs for frontal and occipital lobe and whole brain (without striata, thalamus and brainstem). Tracer uptake in the reference region was characterized by the mean, median or 75th percentile of its voxel intensities. The area (AUC) under the receiver operating characteristic curve was used as performance measure. The highest AUC of 0.973 was achieved by the SBR of the putamen with the 75th percentile in the whole brain as reference. The lowest AUC for the putamen SBR of 0.937 was obtained with the mean in the frontal lobe as reference. We recommend the 75th percentile in the whole brain as reference for semi-quantitative analysis in FP-CIT SPECT. This combination provided the best agreement of the semi-quantitative analysis with visual evaluation of the SPECT images by an expert and, therefore, is appropriate to support less experienced physicians.
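
    The recommended combination, the putamen SBR computed against the 75th percentile of whole-brain voxel intensities, is a one-line computation once the ROIs are defined; the voxel values below are simulated:

    ```python
    import numpy as np

    def sbr(target_voxels, whole_brain_voxels):
        """Specific binding ratio with the whole brain (75th percentile)
        as the non-displaceable reference."""
        ref = np.percentile(whole_brain_voxels, 75)
        return np.mean(target_voxels) / ref - 1.0

    rng = np.random.default_rng(6)
    putamen = rng.normal(3.2, 0.3, 500)            # simulated voxel intensities
    whole_brain = rng.normal(1.0, 0.25, 200_000)   # striata etc. excluded
    print(f"putamen SBR: {sbr(putamen, whole_brain):.2f}")
    ```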

  3. Quantitative assessment of participant knowledge and evaluation of participant satisfaction in the CARES training program.

    PubMed

    Goodman, Melody S; Si, Xuemei; Stafford, Jewel D; Obasohan, Adesuwa; Mchunguzi, Cheryl

    2012-01-01

    The purpose of the Community Alliance for Research Empowering Social change (CARES) training program was to (1) train community members on evidence-based public health, (2) increase their scientific literacy, and (3) develop the infrastructure for community-based participatory research (CBPR). We assessed participant knowledge and evaluated participant satisfaction with the CARES training program to identify learning needs, obtain valuable feedback about the training, and ensure learning objectives were met through mutually beneficial CBPR approaches. A baseline assessment was administered before the first training session, and a follow-up assessment and evaluation were administered after the final training session. At each training session a pretest was administered before the session, and a posttest and evaluation were administered at the end of the session. After training session six, a mid-training evaluation was administered. We analyzed results from quantitative questions on the assessments, pre- and post-tests, and evaluations. CARES fellows' knowledge increased at follow-up (75% of questions were answered correctly on average) compared with the baseline assessment (38% of questions were answered correctly on average); post-test scores were higher than pre-test scores in 9 out of 11 sessions. Fellows enjoyed the training and rated all sessions well on the evaluations. The CARES fellows training program was successful in achieving participant satisfaction and in increasing community knowledge of public health, CBPR, and research methodology.

  4. Tissue microarrays and quantitative tissue-based image analysis as a tool for oncology biomarker and diagnostic development.

    PubMed

    Dolled-Filhart, Marisa P; Gustavson, Mark D

    2012-11-01

    Translational oncology has been improved by using tissue microarrays (TMAs), which facilitate biomarker analysis of large cohorts on a single slide. This has allowed for rapid analysis and validation of potential biomarkers for prognostic and predictive value, as well as for evaluation of biomarker prevalence. Coupled with quantitative analysis of immunohistochemical (IHC) staining, objective and standardized biomarker data from tumor samples can further advance companion diagnostic approaches for the identification of drug-responsive or resistant patient subpopulations. This review covers the advantages, disadvantages and applications of TMAs for biomarker research. Research literature and reviews of TMAs and quantitative image analysis methodology have been surveyed for this review (with an AQUA® analysis focus). Applications such as multi-marker diagnostic development and pathway-based biomarker subpopulation analyses are described. Tissue microarrays are a useful tool for biomarker analyses including prevalence surveys, disease progression assessment and addressing potential prognostic or predictive value. By combining quantitative image analysis with TMAs, analyses will be more objective and reproducible, allowing for more robust IHC-based diagnostic test development. Quantitative multi-biomarker IHC diagnostic tests that can predict drug response will allow for greater success of clinical trials for targeted therapies and provide more personalized clinical decision making.

  5. Quantitative Validation of the Presto Blue Metabolic Assay for Online Monitoring of Cell Proliferation in a 3D Perfusion Bioreactor System.

    PubMed

    Sonnaert, Maarten; Papantoniou, Ioannis; Luyten, Frank P; Schrooten, Jan Ir

    2015-06-01

    As the fields of tissue engineering and regenerative medicine mature toward clinical applications, the need for online monitoring both for quantitative and qualitative use becomes essential. Resazurin-based metabolic assays are frequently applied for determining cytotoxicity and have shown great potential for monitoring 3D bioreactor-facilitated cell culture. However, no quantitative correlation between the metabolic conversion rate of resazurin and cell number has been defined yet. In this work, we determined conversion rates of Presto Blue, a resazurin-based metabolic assay, for human periosteal cells during 2D and 3D static and 3D perfusion cultures. Our results showed that for the evaluated culture systems there is a quantitative correlation between the Presto Blue conversion rate and the cell number during the expansion phase with no influence of the perfusion-related parameters, that is, flow rate and shear stress. The correlation between the cell number and Presto Blue conversion subsequently enabled the definition of operating windows for optimal signal readouts. In conclusion, our data showed that the conversion of the resazurin-based Presto Blue metabolic assay can be used as a quantitative readout for online monitoring of cell proliferation in a 3D perfusion bioreactor system, although a system-specific validation is required.
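
    Given the reported linear relation between Presto Blue conversion rate and cell number, online cell-count estimation amounts to inverting a calibration line. The calibration values below are invented for illustration:

    ```python
    import numpy as np

    # Hypothetical calibration: seeded cell numbers vs. measured Presto
    # Blue conversion rate (%/h); real values would be system-specific,
    # as the paper notes a system-specific validation is required.
    cells = np.array([0.5e5, 1e5, 2e5, 4e5, 8e5])
    conversion = np.array([0.8, 1.7, 3.2, 6.5, 13.1])

    slope, intercept = np.polyfit(cells, conversion, deg=1)

    # Invert the line to estimate cell number from a new bioreactor reading.
    measured = 4.9   # %/h from a hypothetical perfusion run
    print(f"estimated cells: {(measured - intercept) / slope:.3g}")
    ```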

  6. Quantitative Validation of the Presto Blue™ Metabolic Assay for Online Monitoring of Cell Proliferation in a 3D Perfusion Bioreactor System

    PubMed Central

    Sonnaert, Maarten; Papantoniou, Ioannis; Luyten, Frank P.

    2015-01-01

    As the fields of tissue engineering and regenerative medicine mature toward clinical applications, the need for online monitoring both for quantitative and qualitative use becomes essential. Resazurin-based metabolic assays are frequently applied for determining cytotoxicity and have shown great potential for monitoring 3D bioreactor-facilitated cell culture. However, no quantitative correlation between the metabolic conversion rate of resazurin and cell number has been defined yet. In this work, we determined conversion rates of Presto Blue™, a resazurin-based metabolic assay, for human periosteal cells during 2D and 3D static and 3D perfusion cultures. Our results showed that for the evaluated culture systems there is a quantitative correlation between the Presto Blue conversion rate and the cell number during the expansion phase with no influence of the perfusion-related parameters, that is, flow rate and shear stress. The correlation between the cell number and Presto Blue conversion subsequently enabled the definition of operating windows for optimal signal readouts. In conclusion, our data showed that the conversion of the resazurin-based Presto Blue metabolic assay can be used as a quantitative readout for online monitoring of cell proliferation in a 3D perfusion bioreactor system, although a system-specific validation is required. PMID:25336207

  7. Sieve-based device for MALDI sample preparation. III. Its power for quantitative measurements.

    PubMed

    Molin, Laura; Cristoni, Simone; Seraglia, Roberta; Traldi, Pietro

    2011-02-01

    The solid sample inhomogeneity is a weak point of traditional MALDI deposition techniques that reflects negatively on quantitative analysis. The recently developed sieve-based device (SBD) sample deposition method, based on the electrospraying of matrix/analyte solutions through a grounded sieve, allows the homogeneous deposition of microcrystals with dimensions smaller than that of the laser spot. In each microcrystal the matrix/analyte molar ratio can be considered constant. Then, by irradiating different portions of the microcrystal distribution an identical response is obtained. This result suggests the employment of SBD in the development of quantitative procedures. For this aim, mixtures of different proteins of known molarity were analyzed, showing a good relationship between molarity and intensity ratios. This behaviour was also observed in the case of proteins with quite different ionic yields. The power of the developed method for quantitative evaluation was also tested by the measurement of the abundance of IGPP[Oxi]GPP[Oxi]GLMGPP (m/z 1219) present in the collagen-α-5(IV) chain precursor, differently expressed in urines from healthy subjects and diabetic-nephropathic patients, confirming its overexpression in the presence of nephropathy. The data obtained indicate that SBD is a particularly effective method for quantitative analysis also in biological fluids of interest. Copyright © 2011 John Wiley & Sons, Ltd.

  8. A comparative study of qualitative and quantitative methods for the assessment of adhesive remnant after bracket debonding.

    PubMed

    Cehreli, S Burcak; Polat-Ozsoy, Omur; Sar, Cagla; Cubukcu, H Evren; Cehreli, Zafer C

    2012-04-01

    The amount of the residual adhesive after bracket debonding is frequently assessed in a qualitative manner, utilizing the adhesive remnant index (ARI). This study aimed to investigate whether quantitative assessment of the adhesive remnant yields more precise results compared to qualitative methods utilizing the 4- and 5-point ARI scales. Twenty debonded brackets were selected. Evaluation and scoring of the adhesive remnant on bracket bases were made consecutively using: 1. qualitative assessment (visual scoring) and 2. quantitative measurement (image analysis) on digital photographs. Image analysis was made on scanning electron micrographs (SEM) and high-precision elemental maps of the adhesive remnant as determined by energy dispersed X-ray spectrometry. Evaluations were made in accordance with the original 4-point and the modified 5-point ARI scales. Intra-class correlation coefficients (ICCs) were calculated, and the data were evaluated using Friedman test followed by Wilcoxon signed ranks test with Bonferroni correction. ICC statistics indicated high levels of agreement for qualitative visual scoring among examiners. The 4-point ARI scale was compliant with the SEM assessments but indicated significantly less adhesive remnant compared to the results of quantitative elemental mapping. When the 5-point scale was used, both quantitative techniques yielded similar results with those obtained qualitatively. These results indicate that qualitative visual scoring using the ARI is capable of generating similar results with those assessed by quantitative image analysis techniques. In particular, visual scoring with the 5-point ARI scale can yield similar results with both the SEM analysis and elemental mapping.

  9. Evaluation of health promotion in schools: a realistic evaluation approach using mixed methods

    PubMed Central

    2010-01-01

    Background Schools are key settings for health promotion (HP) but the development of suitable approaches for evaluating HP in schools is still a major topic of discussion. This article presents a research protocol of a program developed to evaluate HP. After reviewing HP evaluation issues, the various possible approaches are analyzed and the importance of a realistic evaluation framework and a mixed methods (MM) design are demonstrated. Methods/Design The design is based on a systemic approach to evaluation, taking into account the mechanisms, context and outcomes, as defined in realistic evaluation, adjusted to our own French context using an MM approach. The characteristics of the design are illustrated through the evaluation of a nationwide HP program in French primary schools designed to enhance children's social, emotional and physical health by improving teachers' HP practices and promoting a healthy school environment. An embedded MM design is used in which a qualitative data set plays a supportive, secondary role in a study based primarily on a different quantitative data set. The way the qualitative and quantitative approaches are combined through the entire evaluation framework is detailed. Discussion This study is a contribution towards the development of suitable approaches for evaluating HP programs in schools. The systemic approach of the evaluation carried out in this research is appropriate since it takes account of the limitations of traditional evaluation approaches and considers suggestions made by the HP research community. PMID:20109202

  10. An analytical approach based on ESI-MS, LC-MS and PCA for the quali-quantitative analysis of cycloartane derivatives in Astragalus spp.

    PubMed

    Napolitano, Assunta; Akay, Seref; Mari, Angela; Bedir, Erdal; Pizza, Cosimo; Piacente, Sonia

    2013-11-01

    Astragalus species are widely used as health foods and dietary supplements, as well as drugs in traditional medicine. To rapidly evaluate metabolite similarities and differences among the EtOH extracts of the roots of eight commercial Astragalus spp., an approach based on direct analyses by ESI-MS followed by PCA of the ESI-MS data was carried out. Subsequently, quali-quantitative analyses of cycloartane derivatives in the eight Astragalus spp. by LC-ESI-MS(n) and PCA of the LC-ESI-MS data were performed. This approach made it possible to promptly highlight metabolite similarities and differences among the various Astragalus spp. The PCA results from the LC-ESI-MS data of the Astragalus samples were in reasonable agreement with both the PCA results from the ESI-MS data and the quantitative results. This study affords an analytical method for the quali-quantitative determination of cycloartane derivatives in herbal preparations used as health and food supplements. Copyright © 2013 Elsevier B.V. All rights reserved.
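
    A sketch of the PCA step, assuming a hypothetical matrix of LC-ESI-MS peak intensities (one row per extract); the data and preprocessing choices below are illustrative, not those of the paper.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        # Hypothetical intensity matrix: 8 Astragalus extracts x 50 m/z features.
        rng = np.random.default_rng(1)
        X = rng.lognormal(mean=3.0, sigma=1.0, size=(8, 50))

        # Autoscale features, then project onto the first two principal
        # components; clustering of samples in score space highlights
        # metabolite similarities and differences among extracts.
        scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
        for i, (pc1, pc2) in enumerate(scores, start=1):
            print(f"sample {i}: PC1={pc1:+.2f}, PC2={pc2:+.2f}")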

  11. The Biomarker-Surrogacy Evaluation Schema: a review of the biomarker-surrogate literature and a proposal for a criterion-based, quantitative, multidimensional hierarchical levels of evidence schema for evaluating the status of biomarkers as surrogate endpoints.

    PubMed

    Lassere, Marissa N

    2008-06-01

    There are clear advantages to using biomarkers and surrogate endpoints, but concerns about clinical and statistical validity and the lack of systematic methods to evaluate these aspects hinder their efficient application. Section 2 is a systematic, historical review of the biomarker-surrogate endpoint literature with special reference to the nomenclature, the systems of classification and the statistical methods developed for their evaluation. In Section 3 an explicit, criterion-based, quantitative, multidimensional hierarchical levels-of-evidence schema - the Biomarker-Surrogacy Evaluation Schema - is proposed to evaluate and coordinate the multiple dimensions (biological, epidemiological, statistical, clinical trial and risk-benefit evidence) of the biomarker-clinical endpoint relationship. The schema systematically evaluates and ranks the surrogacy status of biomarkers and surrogate endpoints using defined levels of evidence. The schema incorporates three independent domains: Study Design, Target Outcome and Statistical Evaluation. Each domain has items ranked from zero to five. An additional category called Penalties incorporates further considerations of biological plausibility, risk-benefit and generalizability. The total score (0-15) determines the level of evidence, with Level 1 the strongest and Level 5 the weakest. The term 'surrogate' is restricted to markers attaining Levels 1 or 2 only. The surrogacy status of markers can then be directly compared within and across different areas of medicine to guide individual, trial-based or drug-development decisions. This schema would facilitate the communication between clinical, research, regulatory, industry and consumer participants necessary for evaluation of the biomarker-surrogate-clinical endpoint relationship in their different settings.
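
    The scoring logic lends itself to a compact illustration. The Python sketch below maps the three domain scores and a penalty to a level of evidence; the band cut-offs are invented for illustration, since the published schema defines its own mapping from the 0-15 total to Levels 1-5.

        def surrogacy_level(study_design: int, target_outcome: int,
                            statistical_evaluation: int, penalties: int = 0) -> int:
            """Map three domain scores (each 0-5) and a penalty to a level of
            evidence (Level 1 strongest); the band cut-offs are illustrative."""
            total = max(0, min(15, study_design + target_outcome
                               + statistical_evaluation - penalties))
            for threshold, level in ((13, 1), (10, 2), (7, 3), (4, 4)):
                if total >= threshold:
                    return level
            return 5

        # E.g., scores 5 + 4 + 5 with one penalty point give a total of 13.
        print(surrogacy_level(5, 4, 5, penalties=1))  # -> 1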

  12. Smartphone-based imaging of the corneal endothelium at sub-cellular resolution

    NASA Astrophysics Data System (ADS)

    Toslak, Devrim; Thapa, Damber; Erol, Muhammet Kazim; Chen, Yanjun; Yao, Xincheng

    2017-07-01

    The aim of this study was to test the feasibility of smartphone-based specular microscopy of the corneal endothelium at a sub-cellular resolution. Quantitative examination of endothelial cells is essential for evaluating corneal disease, including establishing a diagnosis, monitoring progression and assessing treatment. Smartphone-based technology promises a new opportunity to develop affordable devices that foster quantitative examination of endothelial cells in rural and underserved areas. In our study, we combined an iPhone 6 and a slit lamp to demonstrate the feasibility of smartphone-based microscopy of the corneal endothelium at a sub-cellular resolution. The sub-cellular resolution images allowed quantitative calculation of the endothelial cell density. Comparative measurements revealed a normal endothelial cell density of 2978 cells/mm2 in the healthy cornea, and a significantly reduced cell density of 1466 cells/mm2 in the diseased cornea with Fuchs' dystrophy. Our ultimate goal is to develop a smartphone-based telemedicine device for low-cost examination of the corneal endothelium, which can benefit patients in rural areas and underdeveloped countries and reduce health care disparities.
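
    The density figures quoted above follow from a simple count-per-area calculation. A minimal sketch, assuming a hypothetical calibrated frame area:

        def endothelial_cell_density(cell_count: int, frame_area_um2: float) -> float:
            """Convert a manual cell count in a calibrated frame to cells/mm^2.
            1 mm^2 = 1e6 um^2; the frame area comes from imaging calibration."""
            return cell_count * 1e6 / frame_area_um2

        # E.g., 149 cells counted in a 50,000 um^2 frame gives ~2980 cells/mm^2,
        # close to the healthy-cornea value reported above.
        print(endothelial_cell_density(149, 50_000))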

  13. RNA-based determination of ESR1 and HER2 expression and response to neoadjuvant chemotherapy.

    PubMed

    Denkert, C; Loibl, S; Kronenwett, R; Budczies, J; von Törne, C; Nekljudova, V; Darb-Esfahani, S; Solbach, C; Sinn, B V; Petry, C; Müller, B M; Hilfrich, J; Altmann, G; Staebler, A; Roth, C; Ataseven, B; Kirchner, T; Dietel, M; Untch, M; von Minckwitz, G

    2013-03-01

    Hormone and human epidermal growth factor receptor 2 (HER2) receptors are the most important breast cancer biomarkers, and additional objective and quantitative test methods such as messenger RNA (mRNA)-based quantitative analysis are urgently needed. In this study, we investigated the clinical validity of RT-PCR-based evaluation of estrogen receptor (ESR1) and HER2 mRNA expression. A total of 1050 core biopsies from two retrospective (GeparTrio, GeparQuattro) and one prospective (PREDICT) neoadjuvant studies were evaluated by quantitative RT-PCR for ESR1 and HER2. ESR1 mRNA was significantly predictive of reduced response to neoadjuvant chemotherapy in univariate and multivariate analyses in all three cohorts. The pathological complete response (pCR) rate for ESR1+/HER2- tumors was 7.3%, 8.0% and 8.6%; for ESR1-/HER2- tumors it was 34.4%, 33.7% and 37.3% in GeparTrio, GeparQuattro and PREDICT, respectively (P < 0.001 in each cohort). In the Kaplan-Meier analysis in GeparTrio, patients with ESR1+/HER2- tumors had the best prognosis, compared with ESR1-/HER2- and ESR1-/HER2+ tumors [disease-free survival (DFS): P < 0.0005, overall survival (OS): P < 0.0005]. Our results suggest that mRNA levels of ESR1 and HER2 predict response to neoadjuvant chemotherapy and are significantly associated with long-term outcome. As an additional option to standard immunohistochemistry and gene-array-based analysis, quantitative RT-PCR analysis might be useful for determination of the receptor status in breast cancer.

  14. Fluorescence-labeled methylation-sensitive amplified fragment length polymorphism (FL-MS-AFLP) analysis for quantitative determination of DNA methylation and demethylation status.

    PubMed

    Kageyama, Shinji; Shinmura, Kazuya; Yamamoto, Hiroko; Goto, Masanori; Suzuki, Koichi; Tanioka, Fumihiko; Tsuneyoshi, Toshihiro; Sugimura, Haruhiko

    2008-04-01

    The PCR-based DNA fingerprinting method called methylation-sensitive amplified fragment length polymorphism (MS-AFLP) analysis is used for genome-wide scanning of methylation status. In this study, we developed a fluorescence-labeled MS-AFLP (FL-MS-AFLP) analysis by applying a fluorescence-labeled primer and a fluorescence-detecting electrophoresis apparatus to the existing MS-AFLP method. The FL-MS-AFLP analysis enables quantitative evaluation of more than 350 random CpG loci per run. It was shown to allow evaluation of differences in the methylation level of blood DNA from gastric cancer patients, and of hypermethylation and hypomethylation in DNA from gastric cancer tissue in comparison with adjacent non-cancerous tissue.

  15. Physical interpretation and development of ultrasonic nondestructive evaluation techniques applied to the quantitative characterization of textile composite materials

    NASA Technical Reports Server (NTRS)

    Miller, James G.

    1993-01-01

    In this Progress Report, we describe our current research activities concerning the development and implementation of advanced ultrasonic nondestructive evaluation methods applied to the characterization of stitched composite materials and bonded aluminum plate specimens. One purpose of this investigation is to identify and characterize specific features of polar backscatter interrogation which enhance the ability of ultrasound to detect flaws in a stitched composite laminate. Another focus is to explore the feasibility of implementing medical linear array imaging technology as a viable ultrasonic-based nondestructive evaluation method to inspect and characterize bonded aluminum lap joints. As an approach to implementing quantitative ultrasonic inspection methods to both of these materials, we focus on the physics that underlies the detection of flaws in such materials.

  16. Cardiac Rehabilitation Online Pilot: Extending Reach of Cardiac Rehabilitation.

    PubMed

    Higgins, Rosemary O; Rogerson, Michelle; Murphy, Barbara M; Navaratnam, Hema; Butler, Michael V; Barker, Lauren; Turner, Alyna; Lefkovits, Jeffrey; Jackson, Alun C

    While cardiac rehabilitation (CR) is recommended for all patients after an acute cardiac event, its reach is limited. The purpose of the current study was to develop and pilot a flexible online CR program based on self-management principles, "Help Yourself Online." The program was designed as an alternative to group-based CR as well as a complement to traditional CR, and was based on existing self-management resources developed previously by the Heart Research Centre. Twenty-one patients admitted to Cabrini Health for an acute cardiac event were recruited to test the program, which was evaluated using qualitative and quantitative methods. Quantitative results demonstrated that patients believed the program would assist them in their self-management. Qualitative evaluation, using focus group and interview methods with 15 patients, showed that patients perceived the online CR approach to be a useful instrument for self-management. Broader implications of the data include the acceptability of the intervention, the timing of intervention delivery, and patients' desire for additional online community support.

  17. NHS-based Tandem Mass Tagging of Proteins at the Level of Whole Cells: A Critical Evaluation in Comparison to Conventional TMT-Labeling Approaches for Quantitative Proteome Analysis.

    PubMed

    Megger, Dominik A; Pott, Leona L; Rosowski, Kristin; Zülch, Birgit; Tautges, Stephanie; Bracht, Thilo; Sitek, Barbara

    2017-01-01

    Tandem mass tags (TMT) are usually introduced at the level of isolated proteins or peptides. Here, for the first time, we report the labeling of whole cells and a critical evaluation of its performance in comparison to conventional labeling approaches. The results indicated that TMT protein labeling of intact cells is generally possible if it is coupled to subsequent enrichment with an anti-TMT antibody. The quantitative results were similar to those obtained after labeling of isolated proteins, and both were found to be slightly complementary to peptide labeling. Furthermore, when using NHS-based TMT, no specificity towards cell surface proteins was observed in the case of cell labeling. In summary, the study provided initial evidence for the general feasibility of TMT cell labeling and highlighted limitations of NHS-based labeling reagents. Future studies should therefore focus on the synthesis and investigation of membrane-impermeable TMTs to increase specificity towards cell surface proteins.

  18. Study on index system of GPS interference effect evaluation

    NASA Astrophysics Data System (ADS)

    Zhang, Kun; Zeng, Fangling; Zhao, Yuan; Zeng, Ruiqi

    2018-05-01

    Evaluation of satellite navigation interference effects is a key technology for advancing research on navigation countermeasures. To accurately evaluate the degree of interference and the anti-jamming ability of GPS receivers, this paper builds on existing research on navigation interference effect evaluation to construct an index system for GPS receiver effectiveness evaluation at four levels (signal acquisition, tracking, demodulation, and positioning/timing) and establishes a model for each index. These indices can accurately and quantitatively describe the interference effect at each level.

  19. An Evaluation of a Community Health Intervention Programme Aimed at Improving Health and Wellbeing

    ERIC Educational Resources Information Center

    Strachan, G.; Wright, G. D.; Hancock, E.

    2007-01-01

    Objective: The objective of this evaluation was to examine the extent to which participants in the Tailor Made Leisure Package programme experienced any improvement in their health and wellbeing. Design: A quantitative survey. Setting: The Healthy Living Centre initiative is an example of a community-based intervention which was formalized as part…

  20. Making a Game out of It: Using Web-Based Competitive Quizzes for Quantitative Analysis Content Review

    ERIC Educational Resources Information Center

    Grinias, James P.

    2017-01-01

    Online student-response systems provide instructors with an easy-to-use tool to instantly evaluate student comprehension. For comprehensive content review, turning this evaluation into a game in which students compete against each other was found to be helpful and enjoyable for participating students. One specific online resource,…

  1. The GAENE--Generalized Acceptance of EvolutioN Evaluation: Development of a New Measure of Evolution Acceptance

    ERIC Educational Resources Information Center

    Smith, Mike U.; Snyder, Scott W.; Devereaux, Randolph S.

    2016-01-01

    The present study reports the development of a brief, quantitative, web-based, psychometrically sound measure--the Generalized Acceptance of EvolutioN Evaluation (GAENE, pronounced "gene") in a format that is useful in large and small groups, in research, and in classroom settings. The measure was designed to measure only evolution…

  2. A methodology for evaluation of a markup-based specification of clinical guidelines.

    PubMed

    Shalom, Erez; Shahar, Yuval; Taieb-Maimon, Meirav; Lunenfeld, Eitan

    2008-11-06

    We introduce a three-phase, nine-step methodology for specification of clinical guidelines (GLs) by expert physicians, clinical editors, and knowledge engineers, and for quantitative evaluation of the specification's quality. We applied this methodology to a particular framework for incremental GL structuring (mark-up) and to GLs in three clinical domains with encouraging results.

  3. Configuration evaluation and criteria plan. Volume 2: Evaluation critera plan (preliminary). Space Transportation Main Engine (STME) configuration study

    NASA Technical Reports Server (NTRS)

    Bair, E. K.

    1986-01-01

    The unbiased selection of the Space Transportation Main Engine (STME) configuration requires that the candidate engines be evaluated against a predetermined set of criteria which must be properly weighted to emphasize critical requirements defined prior to the actual evaluation. The evaluation and selection process involves the following functions: (1) determining whether a configuration can satisfy basic STME requirements (yes/no); (2) defining the evaluation criteria; (3) selecting the criteria's relative importance or weighting; (4) determining the weighting sensitivities; and (5) establishing a baseline for engine evaluation. The criteria weighting and sensitivities are cost related and are based on mission models and vehicle requirements. The evaluation process is used as a coarse screen to determine the candidate engines for the parametric studies and as a fine screen to determine concept(s) for conceptual design. The criteria used for the coarse- and fine-screen evaluation processes are described. The coarse-screen process involves verifying that the candidate engines can meet the yes/no screening requirements and a semi-subjective quantitative evaluation. The fine-screen engines have to meet all of the yes/no screening gates and are then subjected to a detailed evaluation or assessment using the quantitative cost evaluation processes. The option exists for recycling a concept through the quantitative portion of the screening, which allows for some degree of optimization. The basic vehicle is a two-stage LOX/HC, LOX/LH2 parallel-burn vehicle capable of placing 150,000 lbs in low Earth orbit (LEO).

  4. Testing process predictions of models of risky choice: a quantitative model comparison approach

    PubMed Central

    Pachur, Thorsten; Hertwig, Ralph; Gigerenzer, Gerd; Brandstätter, Eduard

    2013-01-01

    This article presents a quantitative model comparison contrasting the process predictions of two prominent views on risky choice. One view assumes a trade-off between probabilities and outcomes (or non-linear functions thereof) and the separate evaluation of risky options (expectation models). Another view assumes that risky choice is based on comparative evaluation, limited search, aspiration levels, and the forgoing of trade-offs (heuristic models). We derived quantitative process predictions for a generic expectation model and for a specific heuristic model, namely the priority heuristic (Brandstätter et al., 2006), and tested them in two experiments. The focus was on two key features of the cognitive process: acquisition frequencies (i.e., how frequently individual reasons are looked up) and direction of search (i.e., gamble-wise vs. reason-wise). In Experiment 1, the priority heuristic predicted direction of search better than the expectation model (although neither model predicted the acquisition process perfectly); acquisition frequencies, however, were inconsistent with both models. Additional analyses revealed that these frequencies were primarily a function of what Rubinstein (1988) called “similarity.” In Experiment 2, the quantitative model comparison approach showed that people seemed to rely more on the priority heuristic in difficult problems, but to make more trade-offs in easy problems. This finding suggests that risky choice may be based on a mental toolbox of strategies. PMID:24151472
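
    For readers unfamiliar with the priority heuristic, the following Python sketch implements its commonly cited decision steps for gain gambles (compare minimum gains against an aspiration level of one tenth of the maximum gain, then the probabilities of the minimum gains against 0.1, then maximum gains); the example gambles are illustrative, and the sketch is not the authors' experimental code.

        def priority_heuristic(g1, g2):
            """Return 0 to choose g1, 1 to choose g2; gambles are lists of
            (outcome, probability) pairs with non-negative outcomes."""
            min1, min2 = min(o for o, _ in g1), min(o for o, _ in g2)
            max_out = max(o for g in (g1, g2) for o, _ in g)
            # Step 1: minimum gains; aspiration level is 1/10 of the maximum
            # gain (the original's rounding to prominent numbers is omitted).
            if abs(min1 - min2) >= max_out / 10:
                return 0 if min1 > min2 else 1
            # Step 2: probabilities of the minimum gains; aspiration level 0.1.
            p1 = sum(p for o, p in g1 if o == min1)
            p2 = sum(p for o, p in g2 if o == min2)
            if abs(p1 - p2) >= 0.1:
                return 0 if p1 < p2 else 1
            # Step 3: the maximum gain decides.
            return 0 if max(o for o, _ in g1) > max(o for o, _ in g2) else 1

        # A sure gain of 2500 vs. an 85% chance of 3000: step 1 decides.
        print(priority_heuristic([(2500, 1.0)], [(3000, 0.85), (0, 0.15)]))  # -> 0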

  5. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods

    PubMed Central

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-01-01

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest. Results showed that the proposed technique provided accurate ranking of the reconstruction methods for 97.5% of the 50 noise realizations. Further, the technique was robust to the choice of evaluated reconstruction methods. The simulation study pointed to possible violations of the assumptions made in the NGS technique under clinical scenarios. However, numerical experiments indicated that the NGS technique was robust in ranking methods even when there was some degree of such violation. PMID:26982626

  6. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods.

    PubMed

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-04-07

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest. Results showed that the proposed technique provided accurate ranking of the reconstruction methods for 97.5% of the 50 noise realizations. Further, the technique was robust to the choice of evaluated reconstruction methods. The simulation study pointed to possible violations of the assumptions made in the NGS technique under clinical scenarios. However, numerical experiments indicated that the NGS technique was robust in ranking methods even when there was some degree of such violation.
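
    A minimal sketch of the figure of merit described above: assuming the NGS technique has already estimated, for each imaging method, the slope of the linear relation between measured and true values and the noise standard deviation, the noise-to-slope ratio ranks the methods by precision (lower is better). The method names and parameter values here are hypothetical.

        def noise_to_slope_ratio(slope: float, noise_sd: float) -> float:
            """NSR of the assumed linear model: measured = a + b*true + noise."""
            return noise_sd / abs(slope)

        # Hypothetical (slope, noise SD) estimates for two reconstruction methods.
        methods = {"method A": (0.95, 0.12), "method B": (1.02, 0.08)}
        ranking = sorted(methods, key=lambda m: noise_to_slope_ratio(*methods[m]))
        print("most precise first:", ranking)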

  7. Newly Built Undergraduate Schools Should Place Great Emphasis on Connotation Construction and Quality Promotion: An Analysis Based on the Qualification Evaluation Results for 41 Undergraduate Schools

    ERIC Educational Resources Information Center

    Binglin, Zhong

    2016-01-01

    The article presents a quantitative analysis of the evaluation results for 41 newly built undergraduate schools that underwent the qualification evaluation of undergraduate work conducted by the Ministry of Education in 2013. It shows that newly built undergraduate schools should place great emphasis on connotation construction and quality promotion and on…

  8. Fuzzy Performance between Surface Fitting and Energy Distribution in Turbulence Runner

    PubMed Central

    Liang, Zhongwei; Liu, Xiaochu; Ye, Bangyan; Brauwer, Richard Kars

    2012-01-01

    Because the application of surface fitting algorithms exerts a considerable fuzzy influence on the mathematical features of kinetic energy distribution, their relation mechanism under different external conditional parameters must be quantitatively analyzed. After the kinetic energy value at each selected representative position coordinate point is determined by calculating kinetic energy parameters, several typical complicated surface fitting algorithms are applied to construct micro kinetic energy distribution surface models of the objective turbulence runner from the obtained kinetic energy values. On the basis of the newly proposed mathematical features, we construct a fuzzy evaluation data sequence and present a new three-dimensional fuzzy quantitative evaluation method; the value change tendencies of the kinetic energy distribution surface features can then be clearly quantified, and the fuzzy performance mechanism linking the performance of the surface fitting algorithms, the spatial features of the turbulence kinetic energy distribution surface, and their respective environmental parameter conditions can be quantitatively analyzed in detail. This yields final conclusions concerning the inherent turbulence kinetic energy distribution performance mechanism and its mathematical relations, and provides a basis for further quantitative study of turbulence energy. PMID:23213287

  9. Quantitative estimation of minimum offset for multichannel surface-wave survey with actively exciting source

    USGS Publications Warehouse

    Xu, Y.; Xia, J.; Miller, R.D.

    2006-01-01

    Multichannel analysis of surface waves is a developing method widely used in shallow subsurface investigations. The field procedures and related parameters are very important for successful applications. Among these parameters, the source-receiver offset range is seldom discussed in theory and is normally determined by empirical or semi-quantitative methods in current practice. This paper discusses the problem from a theoretical perspective. A formula for quantitative evaluation based on a layered homogeneous elastic model was developed. The analytical results based on simple models and experimental data demonstrate that the formula is correct for surface-wave surveys in near-surface applications. © 2005 Elsevier B.V. All rights reserved.

  10. Accuracy of lung nodule density on HRCT: analysis by PSF-based image simulation.

    PubMed

    Ohno, Ken; Ohkubo, Masaki; Marasinghe, Janaka C; Murao, Kohei; Matsumoto, Toru; Wada, Shinichi

    2012-11-08

    A computed tomography (CT) image simulation technique based on the point spread function (PSF) was applied to analyze the accuracy of CT-based clinical evaluations of lung nodule density. The PSF of the CT system was measured and used to perform the lung nodule image simulation. Then, the simulated image was resampled at intervals equal to the pixel size and the slice interval found in clinical high-resolution CT (HRCT) images. On those images, the nodule density was measured by placing a region of interest (ROI) commonly used for routine clinical practice, and comparing the measured value with the true value (a known density of object function used in the image simulation). It was quantitatively determined that the measured nodule density depended on the nodule diameter and the image reconstruction parameters (kernel and slice thickness). In addition, the measured density fluctuated, depending on the offset between the nodule center and the image voxel center. This fluctuation was reduced by decreasing the slice interval (i.e., with the use of overlapping reconstruction), leading to a stable density evaluation. Our proposed method of PSF-based image simulation accompanied with resampling enables a quantitative analysis of the accuracy of CT-based evaluations of lung nodule density. These results could potentially reveal clinical misreadings in diagnosis, and lead to more accurate and precise density evaluations. They would also be of value for determining the optimum scan and reconstruction parameters, such as image reconstruction kernels and slice thicknesses/intervals.
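
    A simplified sketch of the simulation idea: blur a known-density object with the system PSF (approximated here by a Gaussian, whereas the study used the measured PSF), resample at a clinical pixel size and slice interval, and measure an ROI. All sizes below are illustrative, not the study's parameters.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        true_density = 100.0                        # known object density (HU)
        grid = np.zeros((128, 128, 128))            # fine grid, 0.1 mm voxels
        zz, yy, xx = np.ogrid[:128, :128, :128]
        nodule = (zz - 64)**2 + (yy - 64)**2 + (xx - 64)**2 <= 25**2
        grid[nodule] = true_density                 # 5 mm diameter nodule

        blurred = gaussian_filter(grid, sigma=7.0)  # Gaussian stand-in for PSF
        sampled = blurred[::10, ::5, ::5]           # 1 mm slices, 0.5 mm pixels

        roi = sampled[6, 11:15, 11:15]              # ROI near the nodule centre
        print(f"measured {roi.mean():.1f} HU vs. true {true_density} HU")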

  11. Development and Evaluation of Event-Specific Quantitative PCR Method for Genetically Modified Soybean MON87701.

    PubMed

    Tsukahara, Keita; Takabatake, Reona; Masubuchi, Tomoko; Futo, Satoshi; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Nishimaki-Mogami, Tomoko; Kurashima, Takeyo; Mano, Junichi; Kitta, Kazumi

    2016-01-01

    A real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) soybean event, MON87701. First, a standard plasmid for MON87701 quantification was constructed. The conversion factor (C_f) required to calculate the amount of genetically modified organism (GMO) was experimentally determined for a real-time PCR instrument; the determined C_f for this instrument was 1.24. For the evaluation of the developed method, a blind test was carried out in an inter-laboratory trial. Trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSDr), respectively. The determined biases and RSDr values were less than 30% and 13%, respectively, at all evaluated concentrations. The limit of quantitation of the method was 0.5%, and the developed method is thus applicable for practical analyses for the detection and quantification of MON87701.
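
    The quantification step reduces to a simple ratio calculation. A sketch, where the copy-number inputs are hypothetical but C_f = 1.24 is the instrument-specific value reported above:

        def gmo_percentage(event_copies: float, taxon_copies: float,
                           cf: float = 1.24) -> float:
            """GM content (%) from event-specific and taxon-specific qPCR copy
            numbers, using the experimentally determined conversion factor C_f."""
            return event_copies / taxon_copies / cf * 100.0

        # E.g., 620 event copies against 50,000 taxon copies -> ~1.0% GM.
        print(f"{gmo_percentage(620, 50_000):.2f}%")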

  12. Providing web-based mental health services to at-risk women

    PubMed Central

    2011-01-01

    Background We examined the feasibility of providing web-based mental health services, including synchronous internet video conferencing of an evidence-based support/education group, to at-risk women, specifically poor lone mothers. The objectives of this study were to: (i) adapt a face-to-face support/education group intervention to a web-based format for lone mothers, and (ii) evaluate lone mothers' response to web-based services, including an online video conferencing group intervention program. Methods Participating mothers were recruited through advertisements. To adapt the face-to-face intervention to a web-based format, we evaluated participant motivation through focus group/key informant interviews (n = 7), adapted the intervention training manual for a web-based environment and provided a computer training manual. To evaluate response to web-based services, we provided the intervention to two groups of lone mothers (n = 15). Pre-post quantitative evaluation of mood, self-esteem, social support and parenting was done. Post-intervention follow-up interviews explored responses to the group and to using technology to access a health service. Participants received $20 per occasion of data collection. Interviews were taped, transcribed, and content analysis was used to code and interpret the data. Adherence to the intervention protocol was evaluated. Results Mothers participating in this project experienced multiple difficulties, including financial and mood problems. We adapted the intervention training manual for use in a web-based group environment and ensured adherence to the intervention protocol based on viewing videoconferencing group sessions and discussion with the leaders. Participant responses to the group intervention included decreased isolation, and increased knowledge and confidence in themselves and their parenting; the responses closely matched those of mothers who obtained the same service in face-to-face groups. Pre- and post-group quantitative evaluations did not show significant improvements on measures, although the study was not powered to detect these. Conclusions We demonstrated that an evidence-based group intervention program for lone mothers, developed and evaluated in a face-to-face context, transferred well to an online video conferencing format both in terms of group process and outcomes. PMID:21854563

  13. Providing web-based mental health services to at-risk women.

    PubMed

    Lipman, Ellen L; Kenny, Meghan; Marziali, Elsa

    2011-08-19

    We examined the feasibility of providing web-based mental health services, including synchronous internet video conferencing of an evidence-based support/education group, to at-risk women, specifically poor lone mothers. The objectives of this study were to: (i) adapt a face-to-face support/education group intervention to a web-based format for lone mothers, and (ii) evaluate lone mothers' response to web-based services, including an online video conferencing group intervention program. Participating mothers were recruited through advertisements. To adapt the face-to-face intervention to a web-based format, we evaluated participant motivation through focus group/key informant interviews (n = 7), adapted the intervention training manual for a web-based environment and provided a computer training manual. To evaluate response to web-based services, we provided the intervention to two groups of lone mothers (n = 15). Pre-post quantitative evaluation of mood, self-esteem, social support and parenting was done. Post-intervention follow-up interviews explored responses to the group and to using technology to access a health service. Participants received $20 per occasion of data collection. Interviews were taped, transcribed, and content analysis was used to code and interpret the data. Adherence to the intervention protocol was evaluated. Mothers participating in this project experienced multiple difficulties, including financial and mood problems. We adapted the intervention training manual for use in a web-based group environment and ensured adherence to the intervention protocol based on viewing videoconferencing group sessions and discussion with the leaders. Participant responses to the group intervention included decreased isolation, and increased knowledge and confidence in themselves and their parenting; the responses closely matched those of mothers who obtained the same service in face-to-face groups. Pre- and post-group quantitative evaluations did not show significant improvements on measures, although the study was not powered to detect these. We demonstrated that an evidence-based group intervention program for lone mothers, developed and evaluated in a face-to-face context, transferred well to an online video conferencing format both in terms of group process and outcomes.

  14. Effectiveness of Inquiry-Based Learning in an Undergraduate Exercise Physiology Course

    ERIC Educational Resources Information Center

    Nybo, Lars; May, Michael

    2015-01-01

    The present study was conducted to investigate the effects of changing a laboratory physiology course for undergraduate students from a traditional step-by-step guided structure to an inquiry-based approach. With this aim in mind, quantitative and qualitative evaluations of learning outcomes (individual subject-specific tests and group interviews)…

  15. Slow erosion of a quantitative apple resistance to Venturia inaequalis based on an isolate-specific Quantitative Trait Locus.

    PubMed

    Caffier, Valérie; Le Cam, Bruno; Al Rifaï, Mehdi; Bellanger, Marie-Noëlle; Comby, Morgane; Denancé, Caroline; Didelot, Frédérique; Expert, Pascale; Kerdraon, Tifenn; Lemarquand, Arnaud; Ravon, Elisa; Durel, Charles-Eric

    2016-10-01

    Quantitative plant resistance affects the aggressiveness of pathogens and is usually considered more durable than qualitative resistance. However, the efficiency of a quantitative resistance based on an isolate-specific Quantitative Trait Locus (QTL) is expected to decrease over time due to the selection of isolates with a high level of aggressiveness on resistant plants. To test this hypothesis, we surveyed scab incidence over an eight-year period in an orchard planted with susceptible and quantitatively resistant apple genotypes. We sampled 79 Venturia inaequalis isolates from this orchard at three dates and we tested their level of aggressiveness under controlled conditions. Isolates sampled on resistant genotypes triggered higher lesion density and exhibited a higher sporulation rate on apple carrying the resistance allele of the QTL T1 compared to isolates sampled on susceptible genotypes. Due to this ability to select aggressive isolates, we expected the QTL T1 to be non-durable. However, our results showed that the quantitative resistance based on the QTL T1 remained efficient in the orchard over an eight-year period, with only a slow decrease in efficiency and no detectable increase in the aggressiveness of fungal isolates over time. We conclude that knowledge of the specificity of a QTL is not sufficient to evaluate its durability. Deciphering the molecular mechanisms associated with resistance QTLs, the genetic determinants of aggressiveness and putative trade-offs within pathogen populations is needed to help in understanding the erosion processes. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Variation compensation and analysis on diaphragm curvature analysis for emphysema quantification on whole lung CT scans

    NASA Astrophysics Data System (ADS)

    Keller, Brad M.; Reeves, Anthony P.; Barr, R. Graham; Yankelevitz, David F.; Henschke, Claudia I.

    2010-03-01

    CT scans allow for the quantitative evaluation of the anatomical bases of emphysema. Recently, a non-density-based geometric measurement of lung diaphragm curvature has been proposed as a method for the quantification of emphysema from CT. This work analyzes the variability of diaphragm curvature and evaluates the effectiveness of a compensation methodology for reducing this variability, as compared to the emphysema index. Using a dataset of 43 scan-pairs with less than a 100-day time interval between scans, we find that diaphragm curvature showed a trend towards lower overall variability than the emphysema index (95% CI: -9.7 to +14.7 vs. -15.8 to +12.0), and that the variation of both measures was reduced after compensation. We conclude that the variation of the new measure can be considered comparable to that of the established measure and that the compensation can successfully reduce the apparent variation of quantitative measures.

  17. Polymer on Top: Current Limits and Future Perspectives of Quantitatively Evaluating Surface Grafting.

    PubMed

    Michalek, Lukas; Barner, Leonie; Barner-Kowollik, Christopher

    2018-03-07

    Well-defined polymer strands covalently tethered onto solid substrates determine the properties of the resulting functional interface. Herein, the current approaches to determining quantitative grafting densities are assessed. Based on a brief introduction to the key theories describing polymer brush regimes, a user's guide is provided for estimating maximum chain coverage and, importantly, for examining the most frequently employed approaches for determining grafting densities, i.e., dry thickness measurements, gravimetric assessment, and swelling experiments. An estimation of the reliability of these determination methods is provided by carefully evaluating their assumptions and assessing the stability of the underpinning equations. A practical guide for comparatively and quantitatively evaluating the reliability of a given approach is thus provided, enabling the field to critically judge experimentally determined grafting densities and to avoid reporting grafting densities that fall outside the physically realistic parameter space. The assessment is concluded with a perspective on the development of advanced approaches for the determination of grafting density, in particular single-chain methodologies. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
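
    As an example of the dry-thickness approach named above, the following sketch applies the standard relation sigma = h * rho * N_A / M_n with explicit unit conversions; the layer parameters are illustrative, not values from the paper.

        AVOGADRO = 6.022e23

        def grafting_density(dry_thickness_nm: float, density_g_cm3: float,
                             mn_g_mol: float) -> float:
            """Grafting density (chains/nm^2) from the dry-thickness method:
            sigma = h * rho * N_A / M_n, with consistent unit conversion."""
            h_cm = dry_thickness_nm * 1e-7          # nm -> cm
            sigma_per_cm2 = h_cm * density_g_cm3 * AVOGADRO / mn_g_mol
            return sigma_per_cm2 / 1e14             # chains/cm^2 -> chains/nm^2

        # E.g., a 10 nm dry PMMA layer (rho ~ 1.18 g/cm^3), Mn = 20,000 g/mol:
        print(f"{grafting_density(10, 1.18, 20_000):.2f} chains/nm^2")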

  18. [Effects of a physical training program on quantitative neurological indices in mild stage type 2 spinocerebellar ataxia patients].

    PubMed

    Pérez-Avila, I; Fernández-Vieitez, J A; Martínez-Góngora, E; Ochoa-Mastrapa, R; Velázquez-Manresa, M G

    Type 2 spinocerebellar ataxia (SCA2) is a neurodegenerative disease with a high prevalence and incidence in Holguín province, Cuba. At present, there is no drug to counteract the loss of coordinative motor capacities in these patients; thus, physical training seems to be the only way to attenuate the course of the disease. The aim of this study was to evaluate the effectiveness of a physical training program on quantitative neurological indices in SCA2 patients. A sample of 87 SCA2 patients was studied. All subjects underwent a six-month physical exercise program based on coordination, balance and muscular conditioning exercises. Quantitative tests were applied to all patients both before and after the exercise program, and pretest versus posttest comparisons were made to evaluate the improvement in neurological indices. All neurological indices, both with open eyes and with closed eyes, improved significantly from pretest to posttest. Static balance, evaluated by the Romberg test, also improved with training. The exercise training program significantly improved the neurological indices in SCA2 patients at the mild stage of the disease.

  19. A quantitative risk-assessment system (QR-AS) evaluating operation safety of Organic Rankine Cycle using flammable mixture working fluid.

    PubMed

    Tian, Hua; Wang, Xueying; Shu, Gequn; Wu, Mingqiang; Yan, Nanhua; Ma, Xiaonan

    2017-09-15

    Mixtures of hydrocarbon and carbon dioxide show excellent cycle performance in the Organic Rankine Cycle (ORC) used for engine waste heat recovery, but unavoidable leakage in practical applications is a safety threat due to their flammability. In this work, a quantitative risk-assessment system (QR-AS) is established to provide a general method of risk assessment for flammable working fluid leakage. The QR-AS covers three main aspects: analysis of concentration distribution based on CFD simulations, explosion risk assessment based on the TNT equivalent method, and risk mitigation based on the evaluation results. A typical case of a propane/carbon dioxide mixture leaking from an ORC is investigated to illustrate the application of the QR-AS. Based on the assessment results, a proper ventilation speed, a safe mixture ratio and locations for gas-detecting devices are proposed to guarantee safety in case of leakage. The results show that the presented QR-AS is reliable for practical application and that the evaluation results can provide valuable guidance for the design of mitigation measures to improve the safety performance of ORC systems. Copyright © 2017 Elsevier B.V. All rights reserved.
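
    A sketch of the TNT-equivalent step named above, under common textbook assumptions (the paper's exact coefficients are not reproduced here): the equivalent mass W = eta * m * Hc / H_TNT feeds a Hopkinson-Cranz scaled distance, which is used to read overpressure off standard blast charts.

        H_TNT = 4.5e6      # J/kg, typical literature value for TNT detonation heat

        def tnt_equivalent_kg(fuel_mass_kg: float, heat_of_combustion_j_kg: float,
                              yield_factor: float = 0.04) -> float:
            """TNT-equivalent mass of a vapour cloud: W = eta * m * Hc / H_TNT.
            The empirical yield factor eta (often a few percent) is an assumption."""
            return yield_factor * fuel_mass_kg * heat_of_combustion_j_kg / H_TNT

        def scaled_distance(distance_m: float, w_tnt_kg: float) -> float:
            """Hopkinson-Cranz scaled distance Z = R / W^(1/3)."""
            return distance_m / w_tnt_kg ** (1.0 / 3.0)

        # Hypothetical 2 kg propane leak, Hc ~ 46.4 MJ/kg.
        w = tnt_equivalent_kg(2.0, 46.4e6)
        print(f"W_TNT = {w:.2f} kg, Z at 10 m = {scaled_distance(10, w):.1f} m/kg^(1/3)")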

  20. Practical no-gold-standard evaluation framework for quantitative imaging methods: application to lesion segmentation in positron emission tomography

    PubMed Central

    Jha, Abhinav K.; Mena, Esther; Caffo, Brian; Ashrafinia, Saeed; Rahmim, Arman; Frey, Eric; Subramaniam, Rathan M.

    2017-01-01

    Recently, a class of no-gold-standard (NGS) techniques has been proposed to evaluate quantitative imaging methods using patient data. These techniques provide figures of merit (FoMs) quantifying the precision of the estimated quantitative value without requiring repeated measurements and without requiring a gold standard. However, applying these techniques to patient data presents several practical difficulties, including assessing the underlying assumptions, accounting for patient-sampling-related uncertainty, and assessing the reliability of the estimated FoMs. To address these issues, we propose statistical tests that provide confidence in the underlying assumptions and in the reliability of the estimated FoMs. Furthermore, the NGS technique is integrated within a bootstrap-based methodology to account for patient-sampling-related uncertainty. The developed NGS framework was applied to evaluate four methods for segmenting lesions from 18F-fluoro-2-deoxyglucose positron emission tomography images of patients with head-and-neck cancer on the task of precisely measuring the metabolic tumor volume. The NGS technique consistently predicted the same segmentation method as the most precise method. The proposed framework provided confidence in these results, even when gold-standard data were not available. The bootstrap-based methodology indicated improved performance of the NGS technique with larger numbers of patient studies, as was expected, and yielded consistent results as long as data from more than 80 lesions were available for the analysis. PMID:28331883

  1. A novel quantified bitterness evaluation model for traditional Chinese herbs based on an animal ethology principle.

    PubMed

    Han, Xue; Jiang, Hong; Han, Li; Xiong, Xi; He, Yanan; Fu, Chaomei; Xu, Runchun; Zhang, Dingkun; Lin, Junzhi; Yang, Ming

    2018-03-01

    Traditional Chinese herbs (TCH) are currently gaining attention in disease prevention and health care plans. However, their generally bitter taste hinders their use. Despite the development of a variety of taste evaluation methods, it is still a major challenge to establish a quantitative detection technique that is objective, authentic and sensitive. Based on the two-bottle preference test (TBP), we proposed a novel quantitative strategy using a standardized animal test and a unified quantitative benchmark. To reduce variability in the results, the TBP methodology was optimized and the relationship between quinine concentration and the animals' preference index (PI) was obtained. The PI of each TCH was then measured through the TBP, and the bitterness results were converted into a unified numerical system using the concentration-PI relationship. To verify the authenticity and sensitivity of the quantified results, human sensory testing and electronic tongue testing were applied. The quantified results showed good discrimination ability. For example, the bitterness of Coptidis Rhizoma was equal to 0.0579 mg/mL quinine, and that of Nelumbinis Folium was equal to 0.0001 mg/mL. The validation results proved that the new assessment method for TCH is objective and reliable. In conclusion, this study provides an option for the quantification of bitterness and the evaluation of taste-masking effects.
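
    A sketch of the conversion idea: a measured preference index (PI, the test-bottle intake divided by total intake) is interpolated on a quinine calibration curve to give a quinine-equivalent concentration. The calibration points below are invented for illustration, not the paper's fitted relationship.

        import numpy as np

        quinine_mg_ml = np.array([0.0001, 0.001, 0.01, 0.05, 0.1])
        pi_calibration = np.array([0.48, 0.40, 0.28, 0.15, 0.08])  # PI falls as bitterness rises

        def quinine_equivalent(pi: float) -> float:
            """Interpolate PI on the calibration curve (log-concentration scale).
            np.interp needs increasing x, so the arrays are reversed."""
            log_c = np.interp(pi, pi_calibration[::-1], np.log10(quinine_mg_ml)[::-1])
            return 10.0 ** log_c

        pi_herb = 0.16  # hypothetical PI measured for a herb extract
        print(f"bitterness ~ {quinine_equivalent(pi_herb):.4f} mg/mL quinine")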

  2. Multivariate Analysis, Mass Balance Techniques, and Statistical Tests as Tools in Igneous Petrology: Application to the Sierra de las Cruces Volcanic Range (Mexican Volcanic Belt)

    PubMed Central

    Velasco-Tapia, Fernando

    2014-01-01

    Magmatic processes have usually been identified and evaluated using qualitative or semiquantitative geochemical or isotopic tools based on a restricted number of variables. However, a more complete and quantitative view can be reached by applying multivariate analysis, mass balance techniques, and statistical tests. As an example, in this work a statistical and quantitative scheme is applied to analyze the geochemical features of the Sierra de las Cruces (SC) volcanic range (Mexican Volcanic Belt). In this locality, the volcanic activity (3.7 to 0.5 Ma) was dominantly dacitic, but the presence of spheroidal andesitic enclaves and/or diverse disequilibrium features in the majority of lavas confirms the operation of magma mixing/mingling. New discriminant-function-based multidimensional diagrams were used to discriminate the tectonic setting. Statistical tests of discordancy and significance were applied to evaluate the influence of the subducting Cocos plate, which seems to be rather negligible for the SC magmas with respect to several major and trace elements. A cluster analysis following Ward's linkage rule was carried out to classify the SC volcanic rocks into geochemical groups. Finally, two mass-balance schemes were applied for the quantitative evaluation of the proportions of the end-member components (dacitic and andesitic magmas) in the commingled lavas (binary mixtures). PMID:24737994
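
    The binary mass-balance step can be illustrated with a one-parameter least-squares fit of the mixing proportion; the end-member and mixture concentrations below are invented for illustration, not the SC data.

        import numpy as np

        # Illustrative concentrations for four elements.
        c_dac = np.array([66.0, 15.9, 2.1, 580.0])   # dacitic end-member
        c_and = np.array([58.5, 17.2, 4.8, 720.0])   # andesitic end-member
        c_mix = np.array([63.9, 16.3, 2.9, 620.0])   # commingled lava

        # C_mix - C_and = x * (C_dac - C_and): one-parameter least squares.
        # (In practice the elements would be scaled so that large-magnitude
        # trace elements do not dominate the fit.)
        a = c_dac - c_and
        b = c_mix - c_and
        x = float(a @ b / (a @ a))
        print(f"estimated dacitic fraction: {x:.2f}")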

  3. The mathematics of a successful deconvolution: a quantitative assessment of mixture-based combinatorial libraries screened against two formylpeptide receptors.

    PubMed

    Santos, Radleigh G; Appel, Jon R; Giulianotti, Marc A; Edwards, Bruce S; Sklar, Larry A; Houghten, Richard A; Pinilla, Clemencia

    2013-05-30

    In the past 20 years, synthetic combinatorial methods have fundamentally advanced the ability to synthesize and screen large numbers of compounds for drug discovery and basic research. Mixture-based libraries and positional scanning deconvolution combine two approaches for the rapid identification of specific scaffolds and active ligands. Here we present a quantitative assessment of the screening of 32 positional scanning libraries in the identification of highly specific and selective ligands for two formylpeptide receptors. We also compare and contrast two mixture-based library approaches using a mathematical model to facilitate the selection of active scaffolds and libraries to be pursued for further evaluation. The flexibility demonstrated in the differently formatted mixture-based libraries allows for their screening in a wide range of assays.

  4. Safety evaluation methodology for advanced coal extraction systems

    NASA Technical Reports Server (NTRS)

    Zimmerman, W. F.

    1981-01-01

    Qualitative and quantitative evaluation methods for coal extraction systems were developed. The analysis examines the soundness of the design, whether or not the major hazards have been eliminated or reduced, and how the reduction would be accomplished. The quantitative methodology establishes the approximate impact of hazards on injury levels. The results are weighted by peculiar geological elements, specialized safety training, peculiar mine environmental aspects, and reductions in labor force. The outcome is compared with injury level requirements based on similar, safer industries to get a measure of the new system's success in reducing injuries. This approach provides a more detailed and comprehensive analysis of hazards and their effects than existing safety analyses.

  5. Hydrophobic ionic liquids for quantitative bacterial cell lysis with subsequent DNA quantification.

    PubMed

    Fuchs-Telka, Sabine; Fister, Susanne; Mester, Patrick-Julian; Wagner, Martin; Rossmanith, Peter

    2017-02-01

    DNA is one of the most frequently analyzed molecules in the life sciences. In this article we describe a simple and fast protocol for quantitative DNA isolation from bacteria based on hydrophobic ionic liquid supported cell lysis at elevated temperatures (120-150 °C) for subsequent PCR-based analysis. From a set of five hydrophobic ionic liquids, 1-butyl-1-methylpyrrolidinium bis(trifluoromethylsulfonyl)imide was identified as the most suitable for quantitative cell lysis and DNA extraction because of limited quantitative PCR inhibition by the aqueous eluate as well as no detectable DNA uptake. The newly developed method was able to efficiently lyse Gram-negative bacterial cells, whereas Gram-positive cells were protected by their thick cell wall. The performance of the final protocol resulted in quantitative DNA extraction efficiencies for Gram-negative bacteria similar to those obtained with a commercial kit, whereas the number of handling steps, and especially the time required, was dramatically reduced.

  6. Introduction of a method for quantitative evaluation of spontaneous motor activity development with age in infants.

    PubMed

    Disselhorst-Klug, Catherine; Heinze, Franziska; Breitbach-Faller, Nico; Schmitz-Rode, Thomas; Rau, Günter

    2012-04-01

    Coordination between perception and action is required to interact with the environment successfully. Very young infants already train this coordination by performing spontaneous movements to learn how their body interacts with the environment, and the strategies they use for this purpose change with age. Therefore, very early progress in action control can be investigated by monitoring the development of spontaneous motor activity. In this paper, an objective method is introduced that allows the quantitative evaluation of the development of spontaneous motor activity in newborns. The methodology is based on the acquisition of spontaneous movement trajectories of the feet by 3D movement analysis and the subsequent calculation of specific movement parameters from them. With these movement-based parameters, it was possible to provide an objective description of age-dependent developmental steps in healthy newborns younger than 6 months. Furthermore, it has been shown that pathologies like infantile cerebral palsy significantly influence the development of motor activity. Since the introduced methodology is objective and quantitative, it is suitable for monitoring how newborns train the cognitive processes that will enable them to cope with their environment through motor interaction.

  7. Estimation of total bacteria by real-time PCR in patients with periodontal disease.

    PubMed

    Brajović, Gavrilo; Popović, Branka; Puletić, Miljan; Kostić, Marija; Milasin, Jelena

    2016-01-01

    Periodontal diseases are associated with elevated levels of bacteria within the gingival crevice. The aim of this study was to evaluate the total amount of bacteria in subgingival plaque samples from patients with periodontal disease. A quantitative evaluation of the total bacterial amount using quantitative real-time polymerase chain reaction (qRT-PCR) was performed on 20 samples from patients with ulceronecrotic periodontitis and on 10 samples from healthy subjects. The estimation of the total bacterial amount was based on the 16S rRNA gene copy number, which was determined by comparing Ct values against the standard curve. A statistically significant difference between the average gene copy number of total bacteria in periodontal patients (2.55 x 10⁷) and healthy controls (2.37 x 10⁶) was found (p = 0.01). Also, a trend toward higher gene copy numbers in deeper periodontal lesions (> 7 mm) was indicated by a positive coefficient of correlation (r = 0.073). The quantitative estimation of total bacteria based on gene copy number could be an important additional tool in diagnosing periodontitis.
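
    A sketch of the standard-curve step: a sample Ct is converted to a 16S rRNA gene copy number through the fitted line Ct = slope*log10(copies) + intercept. The slope and intercept below are typical qPCR values, not those fitted in this study.

        def copies_from_ct(ct: float, slope: float = -3.33,
                           intercept: float = 38.0) -> float:
            """Gene copies from a Ct via Ct = slope*log10(copies) + intercept.
            slope = -3.33 corresponds to 100% amplification efficiency."""
            return 10.0 ** ((ct - intercept) / slope)

        # E.g., Ct = 13.3 gives ~2.6e7 copies, the order of magnitude reported
        # above for periodontitis samples.
        print(f"{copies_from_ct(13.3):.2e}")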

  8. Risk Evaluation of Bogie System Based on Extension Theory and Entropy Weight Method

    PubMed Central

    Du, Yanping; Zhang, Yuan; Zhao, Xiaogang; Wang, Xiaohui

    2014-01-01

    A bogie system is the key equipment of railway vehicles, and rigorous practical evaluation of bogies is still a challenge; at present, practice relies heavily on part-specific experiments. In the present work, a risk evaluation index system for a bogie system has been established based on inspection data and experts' evaluations. Then, considering both quantitative and qualitative aspects, the risk state of a bogie system has been evaluated using extension theory and an entropy weight method. Finally, the method has been used to assess the bogie systems of four different samples. Results show that this method can accurately assess the risk state of a bogie system. PMID:25574159

  9. Risk evaluation of bogie system based on extension theory and entropy weight method.

    PubMed

    Du, Yanping; Zhang, Yuan; Zhao, Xiaogang; Wang, Xiaohui

    2014-01-01

    A bogie system is the key equipment of railway vehicles, and rigorous practical evaluation of bogies is still a challenge; at present, practice relies heavily on part-specific experiments. In the present work, a risk evaluation index system for a bogie system has been established based on inspection data and experts' evaluations. Then, considering both quantitative and qualitative aspects, the risk state of a bogie system has been evaluated using extension theory and an entropy weight method. Finally, the method has been used to assess the bogie systems of four different samples. Results show that this method can accurately assess the risk state of a bogie system.
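
    The entropy weight step admits a compact illustration: objective index weights are derived from a normalized decision matrix, with low-entropy (more discriminating) indices receiving higher weight. The matrix below is hypothetical, not the bogie inspection data.

        import numpy as np

        # Decision matrix: 4 alternatives x 3 indices (positive scores).
        X = np.array([[0.8, 0.6, 0.9],
                      [0.7, 0.9, 0.4],
                      [0.9, 0.5, 0.7],
                      [0.6, 0.8, 0.8]])

        P = X / X.sum(axis=0)                          # normalize each column
        n = X.shape[0]
        E = -(P * np.log(P)).sum(axis=0) / np.log(n)   # entropy of each index
        w = (1 - E) / (1 - E).sum()                    # low entropy -> high weight
        print("entropy weights:", np.round(w, 3))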

  10. Dynamic inundation mapping of Hurricane Harvey flooding in the Houston metro area using hyper-resolution modeling and quantitative image reanalysis

    NASA Astrophysics Data System (ADS)

    Noh, S. J.; Lee, J. H.; Lee, S.; Zhang, Y.; Seo, D. J.

    2017-12-01

    Hurricane Harvey was one of the most extreme weather events in Texas history and caused significant damage in Houston and the adjoining coastal areas. To better understand the relative contributions to urban flooding of the extreme amount and spatial extent of rainfall, the unique geography, land use, and storm surge, high-resolution water modeling is necessary so that natural and man-made components are fully resolved. In this presentation, we reconstruct the spatiotemporal evolution of inundation during Hurricane Harvey using hyper-resolution modeling and quantitative image reanalysis. The two-dimensional urban flood model used is based on the dynamic wave approximation and 10 m-resolution terrain data, and is forced by radar-based multisensor quantitative precipitation estimates. The model domain includes Buffalo, Brays, Greens and White Oak Bayous in Houston, and the model is run using hybrid parallel computing. To evaluate the dynamic inundation mapping, we combine various qualitative crowdsourced images and video footage with LiDAR-based terrain data.

  11. Linking agent-based models and stochastic models of financial markets

    PubMed Central

    Feng, Ling; Li, Baowen; Podobnik, Boris; Preis, Tobias; Stanley, H. Eugene

    2012-01-01

    It is well-known that financial asset returns exhibit fat-tailed distributions and long-term memory. These empirical features are the main objectives of modeling efforts using (i) stochastic processes to quantitatively reproduce these features and (ii) agent-based simulations to understand the underlying microscopic interactions. After reviewing selected empirical and theoretical evidence documenting the behavior of traders, we construct an agent-based model to quantitatively demonstrate that “fat” tails in return distributions arise when traders share similar technical trading strategies and decisions. Extending our behavioral model to a stochastic model, we derive and explain a set of quantitative scaling relations of long-term memory from the empirical behavior of individual market participants. Our analysis provides a behavioral interpretation of the long-term memory of absolute and squared price returns: They are directly linked to the way investors evaluate their investments by applying technical strategies at different investment horizons, and this quantitative relationship is in agreement with empirical findings. Our approach provides a possible behavioral explanation for stochastic models for financial systems in general and provides a method to parameterize such models from market data rather than from statistical fitting. PMID:22586086

  12. Breast cancer diagnosis using spatial light interference microscopy

    NASA Astrophysics Data System (ADS)

    Majeed, Hassaan; Kandel, Mikhail E.; Han, Kevin; Luo, Zelun; Macias, Virgilia; Tangella, Krishnarao; Balla, Andre; Popescu, Gabriel

    2015-11-01

    The standard practice in the histopathology of breast cancers is to examine a hematoxylin and eosin (H&E) stained tissue biopsy under a microscope to diagnose whether a lesion is benign or malignant. This determination is made by manual, qualitative inspection, making it subject to investigator bias and resulting in low throughput. Hence, a quantitative, label-free, and high-throughput diagnosis method is highly desirable. We present here preliminary results showing the potential of quantitative phase imaging for breast cancer screening and for aiding differential diagnosis. We generated phase maps of unstained breast tissue biopsies using spatial light interference microscopy (SLIM). As a first step toward quantitative diagnosis based on SLIM, we carried out a qualitative evaluation of our label-free images. These images were shown to two pathologists, who classified each case as either benign or malignant. This diagnosis was then compared against the diagnosis of the two pathologists on corresponding H&E stained tissue images, and the number of agreements was counted. The agreement between SLIM- and H&E-based diagnosis was 88% for the first pathologist and 87% for the second. Our results demonstrate the potential and promise of SLIM for quantitative, label-free, and high-throughput diagnosis.

  13. Linking agent-based models and stochastic models of financial markets.

    PubMed

    Feng, Ling; Li, Baowen; Podobnik, Boris; Preis, Tobias; Stanley, H Eugene

    2012-05-29

    It is well-known that financial asset returns exhibit fat-tailed distributions and long-term memory. These empirical features are the main objectives of modeling efforts using (i) stochastic processes to quantitatively reproduce these features and (ii) agent-based simulations to understand the underlying microscopic interactions. After reviewing selected empirical and theoretical evidence documenting the behavior of traders, we construct an agent-based model to quantitatively demonstrate that "fat" tails in return distributions arise when traders share similar technical trading strategies and decisions. Extending our behavioral model to a stochastic model, we derive and explain a set of quantitative scaling relations of long-term memory from the empirical behavior of individual market participants. Our analysis provides a behavioral interpretation of the long-term memory of absolute and squared price returns: They are directly linked to the way investors evaluate their investments by applying technical strategies at different investment horizons, and this quantitative relationship is in agreement with empirical findings. Our approach provides a possible behavioral explanation for stochastic models for financial systems in general and provides a method to parameterize such models from market data rather than from statistical fitting.

  14. A Stereological Method for the Quantitative Evaluation of Cartilage Repair Tissue

    PubMed Central

    Nyengaard, Jens Randel; Lind, Martin; Spector, Myron

    2015-01-01

    Objective: To implement stereological principles to develop an easily applicable algorithm for the unbiased and quantitative evaluation of cartilage repair. Design: Design-unbiased sampling was performed by systematically sectioning the defect perpendicular to the joint surface in parallel planes, providing 7 to 10 hematoxylin–eosin stained histological sections. Counting windows were systematically selected and converted into image files (40-50 per defect). The quantification was performed by two-step point counting: (1) calculation of defect volume and (2) quantitative analysis of tissue composition. Step 2 was performed by assigning each point to one of the following categories based on validated and easily distinguishable morphological characteristics: (1) hyaline cartilage (rounded cells in lacunae in hyaline matrix), (2) fibrocartilage (rounded cells in lacunae in fibrous matrix), (3) fibrous tissue (elongated cells in fibrous tissue), (4) bone, (5) scaffold material, and (6) others. The ability to discriminate between the tissue types was determined using conventional or polarized light microscopy, and the interobserver variability was evaluated. Results: We describe the application of the stereological method. In the example, we assessed the defect repair tissue volume to be 4.4 mm³ (CE = 0.01). The tissue fractions were subsequently evaluated. Polarized light illumination of the slides improved discrimination between hyaline cartilage and fibrocartilage and increased the interobserver agreement compared with conventional transmitted light. Conclusion: We have applied a design-unbiased method for the quantitative evaluation of cartilage repair, and we propose this algorithm as a natural supplement to existing descriptive semiquantitative scoring systems. We also propose that polarized light is effective for discrimination between hyaline cartilage and fibrocartilage. PMID:26069715

  15. A Stereological Method for the Quantitative Evaluation of Cartilage Repair Tissue.

    PubMed

    Foldager, Casper Bindzus; Nyengaard, Jens Randel; Lind, Martin; Spector, Myron

    2015-04-01

    To implement stereological principles to develop an easily applicable algorithm for the unbiased and quantitative evaluation of cartilage repair. Design-unbiased sampling was performed by systematically sectioning the defect perpendicular to the joint surface in parallel planes, providing 7 to 10 hematoxylin-eosin stained histological sections. Counting windows were systematically selected and converted into image files (40-50 per defect). The quantification was performed by two-step point counting: (1) calculation of defect volume and (2) quantitative analysis of tissue composition. Step 2 was performed by assigning each point to one of the following categories based on validated and easily distinguishable morphological characteristics: (1) hyaline cartilage (rounded cells in lacunae in hyaline matrix), (2) fibrocartilage (rounded cells in lacunae in fibrous matrix), (3) fibrous tissue (elongated cells in fibrous tissue), (4) bone, (5) scaffold material, and (6) others. The ability to discriminate between the tissue types was determined using conventional or polarized light microscopy, and the interobserver variability was evaluated. We describe the application of the stereological method. In the example, we assessed the defect repair tissue volume to be 4.4 mm³ (CE = 0.01). The tissue fractions were subsequently evaluated. Polarized light illumination of the slides improved discrimination between hyaline cartilage and fibrocartilage and increased the interobserver agreement compared with conventional transmitted light. We have applied a design-unbiased method for the quantitative evaluation of cartilage repair, and we propose this algorithm as a natural supplement to existing descriptive semiquantitative scoring systems. We also propose that polarized light is effective for discrimination between hyaline cartilage and fibrocartilage.
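    The two-step point counting described in both versions of this record reduces to simple arithmetic: the Cavalieri estimator gives volume as section spacing times area per grid point times the total point count, and tissue composition comes from per-category point shares. A minimal sketch follows; all counts and dimensions are hypothetical, not the study's data.

```python
import numpy as np

# Hypothetical point counts from 8 systematically sampled sections of one defect.
points_per_section = np.array([52, 61, 70, 74, 69, 63, 55, 48])  # points on repair tissue
area_per_point_mm2 = 0.01   # a(p): area associated with each grid point
section_spacing_mm = 0.35   # t: distance between sampled parallel sections

# Cavalieri estimator: V = t * a(p) * (sum of points over all sections).
volume_mm3 = section_spacing_mm * area_per_point_mm2 * points_per_section.sum()

# Tissue composition: fraction of points assigned to each morphological category.
category_points = {"hyaline cartilage": 180, "fibrocartilage": 210,
                   "fibrous tissue": 70, "bone": 20, "scaffold": 12}
total = sum(category_points.values())
fractions = {k: v / total for k, v in category_points.items()}

print(f"defect repair tissue volume ~ {volume_mm3:.2f} mm^3")
for name, frac in fractions.items():
    print(f"{name}: {frac:.1%}")
```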

  16. Quantitative Analysis of Science and Chemistry Textbooks for Indicators of Reform: A complementary perspective

    NASA Astrophysics Data System (ADS)

    Kahveci, Ajda

    2010-07-01

    In this study, multiple thematically based and quantitative analysis procedures were utilized to explore the effectiveness of Turkish chemistry and science textbooks in terms of their reflection of reform. The themes of gender equity, questioning level, science vocabulary load, and readability level provided the conceptual framework for the analyses. An unobtrusive research method, content analysis, was used by coding the manifest content and counting the frequency of words, photographs, drawings, and questions by cognitive level. The context was an undergraduate chemistry teacher preparation program at a large public university in a metropolitan area in northwestern Turkey. Forty preservice chemistry teachers were guided to analyze 10 middle school science and 10 high school chemistry textbooks. Overall, the textbooks included unfair gender representations, a considerably higher number of input- and processing-level than output-level questions, and a high load of science terminology. The textbooks failed to provide sufficient empirical evidence to be considered gender-equitable and inquiry-based. The quantitative approach employed for evaluation contrasts with a more interpretive approach and has the potential to depict textbook profiles more reliably, complementing the commonly employed qualitative procedures. Further work in this line is needed to calibrate the analysis procedures with science textbooks used in different international settings, and the procedures could be modified and improved to meet specific evaluation needs. In the Turkish context, a next step may be the analysis of science textbooks being rewritten for the reform-based curricula, to make cross-comparisons and evaluate possible progression.

  17. An evaluation of performance-arts based HIV-prevention events in London with 13- to 16-year-olds.

    PubMed

    Campbell, Tomás; Bath, Michael; Bradbear, Rachel; Cottle, Justine; Parrett, Neil

    2009-09-01

    The London borough of Newham is ethnically diverse and is one of the poorest regions in the UK. Rates of teenage pregnancy, sexually transmitted infections (STIs) and HIV are high compared with the rest of the country. One strand of the local school-based HIV-prevention programme for young people utilizes performance arts as a tool for HIV education and prevention. This study evaluated HIV knowledge, confidence and intention to use a condom in two groups of 13- to 16-year-olds who had participated in performance-based events. Group 1 (n = 14) participated in a six-week programme of performance arts-based HIV education and prevention workshops, which culminated in a theatre-based performance. Group 2 (n = 65) were audience members who attended the performance. Participants completed a short questionnaire containing both qualitative and quantitative items. Qualitative data suggested that the participants had learned about condoms and their efficacy in preventing the acquisition of HIV and sexually transmitted diseases. Quantitative results indicated that after participation in the events, respondents had more information about HIV and condom use, were more confident that they could insist on condom use with partners, and planned to use condoms in the future. There was a statistically significant difference between Groups 1 and 2, but because of the small numbers in Group 1 this result should be interpreted cautiously. Performance-based HIV-prevention activities may be a useful way to deliver HIV-prevention messages to young people. This evaluation will form the basis of a more systematic and robust evaluation of future events.

  18. Application of a SERS-based lateral flow immunoassay strip for the rapid and sensitive detection of staphylococcal enterotoxin B

    NASA Astrophysics Data System (ADS)

    Hwang, Joonki; Lee, Sangyeop; Choo, Jaebum

    2016-06-01

    A novel surface-enhanced Raman scattering (SERS)-based lateral flow immunoassay (LFA) biosensor was developed to resolve problems associated with conventional LFA strips (e.g., limits in quantitative analysis and low sensitivity). In our SERS-based biosensor, Raman reporter-labeled hollow gold nanospheres (HGNs) were used as SERS detection probes instead of gold nanoparticles. With the proposed SERS-based LFA strip, the presence of a target antigen can be identified through a colour change in the test zone. Furthermore, highly sensitive quantitative evaluation is possible by measuring SERS signals from the test zone. To verify the feasibility of the SERS-based LFA strip platform, an immunoassay of staphylococcal enterotoxin B (SEB) was performed as a model reaction. The limit of detection (LOD) for SEB, as determined with the SERS-based LFA strip, was estimated to be 0.001 ng mL⁻¹. This value is approximately three orders of magnitude more sensitive than that achieved with the corresponding ELISA-based method. The proposed SERS-based LFA strip sensor shows significant potential for the rapid and sensitive detection of target markers in a simplified manner.

  19. Dynamic and quantitative evaluation of degenerative mitral valve disease: a dedicated framework based on cardiac magnetic resonance imaging.

    PubMed

    Sturla, Francesco; Onorati, Francesco; Puppini, Giovanni; Pappalardo, Omar A; Selmi, Matteo; Votta, Emiliano; Faggian, Giuseppe; Redaelli, Alberto

    2017-04-01

    Accurate quantification of mitral valve (MV) morphology and dynamic behavior over the cardiac cycle is crucial to understanding the mechanisms of degenerative MV dysfunction and to guiding surgical intervention. Cardiac magnetic resonance (CMR) imaging has progressively been adopted to evaluate MV pathophysiology, although a dedicated framework is required to perform a quantitative assessment of functional MV anatomy. We investigated MV dynamic behavior in subjects with normal MV anatomy (n=10) and in patients referred to surgery for degenerative MV prolapse, classified as fibro-elastic deficiency (FED, n=9) or Barlow's disease (BD, n=10). A CMR-dedicated framework was adopted to evaluate prolapse height and volume and to quantitatively assess valvular morphology and papillary muscle (PAPs) function over the cardiac cycle. Multiple comparisons were used to investigate the hallmarks associated with MV degenerative prolapse and to evaluate the feasibility of anatomical and functional distinction between the FED and BD phenotypes. On average, annular dimensions were significantly (P<0.05) larger in BD than in FED and normal subjects, while no significant differences were observed between FED and normal subjects. MV eccentricity progressively decreased passing from normal to FED and BD, with the latter exhibiting a rounder annulus shape. Over the cardiac cycle, we noticed significant differences for BD during systole, with an abnormal annular enlargement between mid and late systole (LS) (P<0.001 vs. normal); PAPs dynamics remained comparable in the three groups. Prolapse height and volume highlighted significant differences among normal, FED and BD valves. Our CMR-dedicated framework allows for the quantitative and dynamic evaluation of the MV apparatus, with quantifiable annular alterations representing the primary hallmark of severe MV degeneration. This may aid surgeons in evaluating the severity of MV dysfunction and selecting the appropriate MV treatment.

  20. Comparative Evaluation of Quantitative Test Methods for Gases on a Hard Surface

    DTIC Science & Technology

    2017-02-01

    COMPARATIVE EVALUATION OF QUANTITATIVE TEST METHODS FOR GASES ON A HARD SURFACE, ECBC-TR-1426 (Vipin Rastogi et al.). [Only fragments of this report survived extraction: a truncated introduction ("Members of the U.S. Environmental...") and part of the experimental design, which states that each quantitative method was performed three times on three consecutive days, with three runs for the CD tests.]

  1. Standardized protocols for quality control of MRM-based plasma proteomic workflows.

    PubMed

    Percy, Andrew J; Chambers, Andrew G; Smith, Derek S; Borchers, Christoph H

    2013-01-04

    Mass spectrometry (MS)-based proteomics is rapidly emerging as a viable technology for the identification and quantitation of biological samples, such as human plasma, the most complex yet commonly employed biofluid in clinical analyses. A transition from a qualitative to a quantitative science is required if proteomics is to become a clinically useful technique; MS, however, has been criticized for a lack of reproducibility and interlaboratory transferability. Currently, the MS and plasma proteomics communities lack standardized protocols and reagents to ensure that high-quality quantitative data can be accurately and precisely reproduced by laboratories across the world using different MS technologies. Toward addressing this issue, we have developed standard protocols for multiple reaction monitoring (MRM)-based assays with customized isotopically labeled internal standards for quality control of the sample preparation workflow and the MS platform in quantitative plasma proteomic analyses. The development of reference standards and their application to a single MS platform is discussed herein, along with the results from intralaboratory tests. The tests highlighted the importance of the reference standards in assessing the efficiency and reproducibility of the entire bottom-up proteomic workflow and revealed errors related to sample preparation and to the performance quality and deficits of the MS and LC systems. Such evaluations are necessary if MRM-based quantitative plasma proteomics is to be used in verifying and validating putative disease biomarkers across different research laboratories and, eventually, in clinical laboratories.
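    A common way to operationalize this kind of quality control is to track the run-to-run coefficient of variation (CV) of internal-standard measurements. The sketch below is illustrative only: the peptide names, peak-area ratios, and the roughly 20% acceptance threshold are assumptions of convenience, not the authors' protocol.

```python
import numpy as np

# Hypothetical peak-area ratios (light/heavy internal standard) for three QC
# peptides measured across five replicate MRM runs.
ratios = {
    "peptide_A": np.array([0.98, 1.02, 1.00, 0.97, 1.04]),
    "peptide_B": np.array([0.55, 0.61, 0.58, 0.54, 0.60]),
    "peptide_C": np.array([2.10, 2.32, 2.05, 2.40, 2.18]),
}

# Coefficient of variation per peptide; CV below ~20% is a commonly used
# acceptance criterion for quantitative workflows (assumed here, not sourced).
for name, r in ratios.items():
    cv = r.std(ddof=1) / r.mean() * 100.0
    flag = "OK" if cv < 20.0 else "CHECK"
    print(f"{name}: CV = {cv:.1f}% [{flag}]")
```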

  2. [Modality of combined methods of quantitative and qualitative research in evaluation of therapeutic effects of Chinese medicine].

    PubMed

    Liu, Jian-ping

    2011-05-01

    The core of evidence-based medicine lies in implementing the current best available evidence from clinical research to guide decision making in clinical practice, incorporating individual clinical experience and the values and preferences of patients. However, the current method of evaluating clinical therapeutic effects, the randomized controlled trial, cannot reflect the humanistic, holistic, and individually tailored treatment of Chinese medicine (CM). This essay addresses the complex intervention of highly individualized CM treatment and its societal characteristics, and proposes a model for the evaluation of the therapeutic effects of CM in which quantitative and qualitative methods are combined, embodying the characteristics of both the social and natural sciences in CM. The model can capture the dynamic process of CM diagnosis and treatment from a whole-system perspective and can be used for the evaluation of complex CM interventions. We hope to offer a way of thinking about, and a method for, therapeutic effect evaluation that differs from that of new drug development.

  3. [Urban ecological land in Changsha City: its quantitative analysis and optimization].

    PubMed

    Li, Xiao-Li; Zeng, Guang-Ming; Shi, Lin; Liang, Jie; Cai, Qing

    2010-02-01

    In this paper, a hierarchy index system suited to the catastrophe progression method was constructed to comprehensively analyze and evaluate the status of ecological land construction in Changsha City in 2007. Based on the evaluation results, the shortcomings of the distribution pattern of Changsha's urban ecological land were discussed. With the support of a geographic information system (GIS), ecological corridors for the urban ecological land were constructed using 'least-cost' modeling, and, combined with conflict analysis, an optimized plan for the urban ecological land was put forward, forming an integrated evaluation system. The results indicated that the ecological efficiency of urban ecological land in Changsha in 2007 was at a medium level, with an evaluation value of 0.9416; the quantitative index was relatively high but the coordination index relatively low. Analysis and verification with the software Fragstats showed that the ecological efficiency of the urban ecological land after optimization was higher, with an evaluation value of 0.9618, and the SHDI, CONTAG, and other indices were also improved.
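    The 'least-cost' corridor construction mentioned above is, at its core, a shortest-path computation over a raster resistance surface. A minimal sketch follows, using Dijkstra's algorithm on a 4-connected grid; the resistance values and grid size are hypothetical and stand in for whatever GIS cost surface the study actually used.

```python
import heapq
import numpy as np

def least_cost_path(cost, start, goal):
    """Dijkstra over a raster cost surface (4-connected), as used in
    least-cost corridor modelling; returns the total accumulated cost."""
    rows, cols = cost.shape
    dist = np.full(cost.shape, np.inf)
    dist[start] = cost[start]
    heap = [(cost[start], start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == goal:
            return d
        if d > dist[r, c]:
            continue  # stale heap entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr, nc]
                if nd < dist[nr, nc]:
                    dist[nr, nc] = nd
                    heapq.heappush(heap, (nd, (nr, nc)))
    return np.inf

# Hypothetical resistance raster: low values = ecologically permeable land.
rng = np.random.default_rng(3)
resistance = rng.uniform(1.0, 5.0, size=(20, 20))
print("corridor cost:", round(least_cost_path(resistance, (0, 0), (19, 19)), 1))
```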

  4. JPRS Report, Science & Technology, Japan, 4th Intelligent Robots Symposium, Volume 2

    DTIC Science & Technology

    1989-03-16

    [Only extraction fragments of this symposium volume were recovered: text referring to accidents caused by strikes by robots, a quantitative model for safety evaluation, and evaluations of actual systems; table-of-contents entries including "Mobile Robot Position Referencing Using Map-Based Vision Systems" (p. 160) and "Safety Evaluation of Man-Robot System" (p. 171); and a fragment noting that camera measurements are made after the robot stops, to prevent damage from obstacle interference.]

  5. Marketable Job Skills for High School Students: What We Learned from an Evaluation of after School Matters

    ERIC Educational Resources Information Center

    Alexander, Kendra P.; Hirsch, Barton J.

    2012-01-01

    This article summarizes findings from an experimental evaluation of After School Matters (ASM), a paid, apprenticeship-based, after-school program in Chicago for high school students. Analysis of quantitative data from a mock job interview revealed that ASM participants did not demonstrate more marketable job skills than youth in the control…

  6. Preliminary ride-quality evaluation of the HM.2 Hoverferry

    NASA Technical Reports Server (NTRS)

    Mcclurken, E. W., Jr.; Jacobson, I. D.; Kuhlthau, A. R.

    1974-01-01

    The results of a forty-minute ride-quality exposure aboard the HM.2 Hoverferry are presented. Quantitative evaluations were made from the aft seats on the starboard side for a sea state considered calm and visually estimated at one-half to one foot. Since this type of craft is sensitive to sea state, the conclusions are based on ideal conditions. Some drawings are included.

  7. Preliminary results of real-time in-vitro electronic speckle pattern interferometry (ESPI) measurements in otolaryngology

    NASA Astrophysics Data System (ADS)

    Conerty, Michelle D.; Castracane, James; Cacace, Anthony T.; Parnes, Steven M.; Gardner, Glendon M.; Miller, Mitchell B.

    1995-05-01

    Electronic Speckle Pattern Interferometry (ESPI) is a nondestructive optical evaluation technique capable of determining surface and subsurface integrity through the quantitative evaluation of static or vibratory motion. By utilizing state-of-the-art developments in lasers, fiber optics, and solid-state detector technology, this technique has become applicable in medical research and diagnostics. With initial support from NIDCD and continued support from InterScience, Inc., we have been developing a range of ESPI-based instruments for improved diagnostic evaluation in otolaryngological applications. These compact fiber optic instruments are capable of making real-time interferometric measurements of the target tissue. Image post-processing software under ongoing development can extract the desired quantitative results from the acquired interferometric images. The goal of the research is to develop a fully automated system in which the image processing and quantification are performed in hardware in near real time. Subsurface details of both tympanic membrane and vocal cord dynamics could speed the diagnosis of otosclerosis and laryngeal tumors and aid in the evaluation of surgical procedures.

  8. Histomorphological evaluation of Compound bone of Granulated Ricinus in bone regeneration in rabbits

    NASA Astrophysics Data System (ADS)

    Pavan Mateus, Christiano; Orivaldo Chierice, Gilberto; Okamoto, Tetuo

    2011-09-01

    Histological evaluation is an effective method for describing the qualitative and quantitative behavior of implanted materials. This research validated the performance of Compound bone of Granulated Ricinus in bone regeneration using histomorphological analysis. Thirty female rabbits were selected and divided into three groups of 10 animals (G1, G2, G3), with postoperative periods of 45, 70, and 120 days, respectively. Each animal underwent two bone lesions in the ilium, one implanted with the material (Compound bone of Granulated Ricinus) and the other serving as a control. After euthanasia, the iliac bone was removed, identified, and subjected to histological processing. The histomorphological results were interpreted and described by quantitative and qualitative analysis of the findings in the three experimental groups, evaluating the rate of resorption of the material during tissue regeneration on the basis of new bone formation. The histomorphological results classified the material as biocompatible and biologically active. Resorption during regeneration occurs slowly and gradually. Knowing the time and rate of resorption and new bone formation of the biomaterial allows its applicability to be determined for the bone segment in clinical surgical practice.

  9. Hey girlfriend: an evaluation of AIDS prevention among women in the sex industry.

    PubMed

    Dorfman, L E; Derish, P A; Cohen, J B

    1992-01-01

    Increasingly, acquired immunodeficiency syndrome (AIDS) prevention programs have been developed to reach and influence street-based populations. Standard methods of evaluation do not fit the conditions of such programs. This article describes a process and outcome evaluation of an AIDS prevention program for sex workers in which qualitative and quantitative methods were combined in order to mediate research problems endemic to street-based populations. Methods included epidemiological questionnaires, open-ended interviews with participants, and ethnographic field notes. Process evaluation findings show that field staff who were indigenous to the neighborhood and population readily gained access to the community of sex workers and simultaneously became role models for positive behavior change. Outcome findings show that sex workers do feel at risk for AIDS, but usually from clients rather than from husbands or boyfriends. Accordingly, they use condoms more frequently with clients than with steady partners. Increasing condom use among sex workers with their steady partners remains an important challenge for AIDS prevention. Combining qualitative and quantitative research data provided a more comprehensive assessment of how to reach sex workers with effective AIDS risk reduction messages than either method could have provided alone.

  10. Development of the local magnification method for quantitative evaluation of endoscope geometric distortion

    NASA Astrophysics Data System (ADS)

    Wang, Quanzeng; Cheng, Wei-Chung; Suresh, Nitin; Hua, Hong

    2016-05-01

    With improved diagnostic capabilities and complex optical designs, endoscopic technologies are advancing. As one of several important optical performance characteristics, geometric distortion can negatively affect size estimation and feature-identification-related diagnosis. Therefore, a quantitative and simple distortion evaluation method is imperative for both the endoscopic industry and the medical device regulatory agent, yet no such method is available. While image correction techniques are rather mature, they depend heavily on computational power to process multidimensional image data based on complex mathematical models, making them difficult to interpret. Some commonly used distortion evaluation methods, such as picture height distortion (DPH) or radial distortion (DRAD), are either too simple to describe the distortion accurately or subject to errors from deriving a reference image. We developed the basic local magnification (ML) method to evaluate endoscope distortion and, based on it, also developed ways to calculate DPH and DRAD. The method overcomes the aforementioned limitations, has a clear physical meaning over the whole field of view, and can facilitate lesion size estimation during diagnosis. Most importantly, it can help bring endoscopic technology to market and could potentially be adopted in an international endoscope standard.
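    To make the quantities named above concrete, the sketch below computes percent radial distortion against a paraxial reference and a finite-difference local magnification from measurements of a grid target. The geometry and all numbers are hypothetical; this is a generic illustration of the quantities, not the authors' implementation.

```python
import numpy as np

# Hypothetical grid-target measurement: object-side radial positions of grid
# crossings and their imaged radial positions (mm), with a barrel-like mapping.
r_object = np.linspace(1.0, 10.0, 10)
r_image = r_object * (1.0 - 0.002 * r_object**2)

# Radial distortion (percent) relative to the paraxial (ideal, linear) mapping,
# whose scale is taken from the innermost grid point.
scale = r_image[0] / r_object[0]
d_rad = 100.0 * (r_image - scale * r_object) / (scale * r_object)

# Local magnification: ratio of small image-side displacements to the
# corresponding object-side displacements between neighbouring grid points.
ml = np.diff(r_image) / np.diff(r_object)

print("radial distortion (%):", np.round(d_rad, 2))
print("local magnification:", np.round(ml, 3))
```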

  11. Applying national survey results for strategic planning and program improvement: the National Diabetes Education Program.

    PubMed

    Griffey, Susan; Piccinino, Linda; Gallivan, Joanne; Lotenberg, Lynne Doner; Tuncer, Diane

    2015-02-01

    Since the 1970s, the federal government has spearheaded major national education programs to reduce the burden of chronic diseases in the United States. These prevention and disease management programs communicate critical information to the public, those affected by the disease, and health care providers. The National Diabetes Education Program (NDEP), the leading federal program on diabetes sponsored by the National Institutes of Health (NIH) and the Centers for Disease Control and Prevention (CDC), uses primary and secondary quantitative data and qualitative audience research to guide program planning and evaluation. Since 2006, the NDEP has filled the gaps in existing quantitative data sources by conducting its own population-based survey, the NDEP National Diabetes Survey (NNDS). The NNDS is conducted every 2–3 years and tracks changes in knowledge, attitudes and practice indicators in key target audiences. This article describes how the NDEP has used the NNDS as a key component of its evaluation framework and how it applies the survey results for strategic planning and program improvement. The NDEP's use of the NNDS illustrates how a program evaluation framework that includes periodic population-based surveys can serve as an evaluation model for similar national health education programs.

  12. A peptide-retrieval strategy enables significant improvement of quantitative performance without compromising confidence of identification.

    PubMed

    Tu, Chengjian; Shen, Shichen; Sheng, Quanhu; Shyr, Yu; Qu, Jun

    2017-01-30

    Reliable quantification of low-abundance proteins in complex proteomes is challenging, largely owing to the limited number of spectra/peptides identified. In this study we developed a straightforward method to improve the quantitative accuracy and precision of proteins by strategically retrieving the less confident peptides that were previously filtered out using the standard target-decoy search strategy. The filtered-out MS/MS spectra matched to confidently identified proteins were recovered, and the peptide-spectrum-match FDR was recalculated and controlled at a confident level of FDR≤1%, while protein FDR was maintained at ~1%. We evaluated the performance of this strategy in both spectral count- and ion current-based methods. Increases of >60% in total quantified spectra/peptides were achieved for a spike-in sample set and a public dataset from CPTAC, respectively. Incorporating the peptide retrieval strategy significantly improved the quantitative accuracy and precision, especially for low-abundance proteins (e.g., one-hit proteins). Moreover, the capacity to confidently discover significantly altered proteins was also enhanced substantially, as demonstrated with two spike-in datasets. In summary, improved quantitative performance was achieved by this peptide recovery strategy without compromising confidence of protein identification; the strategy can be readily implemented in a broad range of quantitative proteomics techniques, including label-free and labeling approaches. We hypothesized that more quantifiable spectra and peptides per protein, even including less confident peptides, could help reduce variation and improve protein quantification. Hence the peptide retrieval strategy was developed and evaluated in two spike-in sample sets with different LC-MS/MS variations using both MS1- and MS2-based quantitative approaches. The list of confidently identified proteins obtained with the standard target-decoy search strategy was fixed, and additional, less confident spectra/peptides matched to those proteins were retrieved; the total peptide-spectrum-match false discovery rate (PSM FDR) after retrieval was still controlled at a confident level of FDR≤1%. As expected, the penalty for occasionally incorporating incorrect peptide identifications is negligible in comparison with the improvements in quantitative performance. More quantifiable peptides, a lower missing-value rate, and better quantitative accuracy and precision were achieved for the same protein identifications by this simple strategy. The strategy is theoretically applicable to any quantitative approach in proteomics and thereby provides more quantitative information, especially on low-abundance proteins. Published by Elsevier B.V.
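    The target-decoy FDR control that the retrieval strategy keeps at ≤1% rests on a simple counting rule: at any score threshold, the estimated PSM FDR is the number of accepted decoy hits divided by the number of accepted target hits. A minimal sketch with simulated scores follows; it illustrates the general target-decoy calculation, not the authors' exact pipeline (which would, for example, report monotonized q-values).

```python
import numpy as np

def psm_fdr_threshold(scores, is_decoy, fdr_target=0.01):
    """Return the loosest score threshold at which estimated PSM FDR <= target.

    FDR at a threshold is estimated as (#decoy PSMs) / (#target PSMs) among
    PSMs scoring at or above the threshold (standard target-decoy strategy).
    """
    order = np.argsort(scores)[::-1]          # best score first
    decoys = np.cumsum(is_decoy[order])       # decoys accepted so far
    targets = np.cumsum(~is_decoy[order])     # targets accepted so far
    fdr = decoys / np.maximum(targets, 1)
    passing = np.where(fdr <= fdr_target)[0]
    if len(passing) == 0:
        return None
    return scores[order[passing[-1]]]         # largest passing acceptance set

# Simulated PSM scores; decoy hits tend to score lower than target hits.
rng = np.random.default_rng(0)
scores = np.concatenate([rng.normal(3.0, 1.0, 900), rng.normal(0.5, 1.0, 100)])
is_decoy = np.concatenate([np.zeros(900, bool), np.ones(100, bool)])
print("score threshold at 1% FDR:", psm_fdr_threshold(scores, is_decoy))
```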

  13. Ultrasound arthroscopy of human knee cartilage and subchondral bone in vivo.

    PubMed

    Liukkonen, Jukka; Lehenkari, Petri; Hirvasniemi, Jukka; Joukainen, Antti; Virén, Tuomas; Saarakkala, Simo; Nieminen, Miika T; Jurvelin, Jukka S; Töyräs, Juha

    2014-09-01

    Arthroscopic ultrasound imaging enables quantitative evaluation of articular cartilage. However, the potential of this technique for evaluation of subchondral bone has not been investigated in vivo. In this study, we address this issue in clinical arthroscopy of the human knee (n = 11) by determining quantitative ultrasound (9 MHz) reflection and backscattering parameters for cartilage and subchondral bone. Furthermore, in each knee, seven anatomical sites were graded using the International Cartilage Repair Society (ICRS) system based on (i) conventional arthroscopy and (ii) ultrasound images acquired in arthroscopy with a miniature transducer. Ultrasound enabled visualization of articular cartilage and subchondral bone. ICRS grades based on ultrasound images were higher (p < 0.05) than those based on conventional arthroscopy. The higher ultrasound-based ICRS grades were expected, as ultrasound reveals additional information on, for example, the relative depth of the lesion. In line with previous literature, ultrasound reflection and scattering in cartilage varied significantly (p < 0.05) along the ICRS scale. However, no significant correlation between ultrasound parameters and the structure or density of subchondral bone could be demonstrated. To conclude, arthroscopic ultrasound imaging had a significant effect on the clinical grading of cartilage and was found to provide quantitative information on cartilage. The lack of correlation between the ultrasound parameters and bone properties may be related to smaller changes in bone, excessive attenuation in the overlying cartilage, or insufficient power of the applied miniature transducer. Copyright © 2014 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.

  14. Performance of human fecal anaerobe-associated PCR-based assays in a multi-laboratory method evaluation study

    USGS Publications Warehouse

    Layton, Blythe A.; Cao, Yiping; Ebentier, Darcy L.; Hanley, Kaitlyn; Ballesté, Elisenda; Brandão, João; Byappanahalli, Muruleedhara N.; Converse, Reagan; Farnleitner, Andreas H.; Gentry-Shields, Jennifer; Gourmelon, Michèle; Lee, Chang Soo; Lee, Jiyoung; Lozach, Solen; Madi, Tania; Meijer, Wim G.; Noble, Rachel; Peed, Lindsay; Reischer, Georg H.; Rodrigues, Raquel; Rose, Joan B.; Schriewer, Alexander; Sinigalliano, Chris; Srinivasan, Sangeetha; Stewart, Jill; Laurie, C.; Wang, Dan; Whitman, Richard; Wuertz, Stefan; Jay, Jenny; Holden, Patricia A.; Boehm, Alexandria B.; Shanks, Orin; Griffith, John F.

    2013-01-01

    A number of PCR-based methods for detecting human fecal material in environmental waters have been developed over the past decade, but these methods have rarely received independent comparative testing in large multi-laboratory studies. Here, we evaluated ten of these methods (BacH, BacHum-UCD, Bacteroides thetaiotaomicron (BtH), BsteriF1, gyrB, HF183 endpoint, HF183 SYBR, HF183 Taqman®, HumM2, and Methanobrevibacter smithii nifH (Mnif)) using 64 blind samples prepared in one laboratory. The blind samples contained either one or two fecal sources from human, wastewater or non-human sources. The assay results were assessed for presence/absence of the human markers and also quantitatively while varying the following: 1) classification of samples that were detected but not quantifiable (DNQ) as positive or negative; 2) reference fecal sample concentration unit of measure (such as culturable indicator bacteria, wet mass, total DNA, etc); and 3) human fecal source type (stool, sewage or septage). Assay performance using presence/absence metrics was found to depend on the classification of DNQ samples. The assays that performed best quantitatively varied based on the fecal concentration unit of measure and laboratory protocol. All methods were consistently more sensitive to human stools compared to sewage or septage in both the presence/absence and quantitative analysis. Overall, HF183 Taqman® was found to be the most effective marker of human fecal contamination in this California-based study.

  15. A diagnostic system for articular cartilage using non-destructive pulsed laser irradiation.

    PubMed

    Sato, Masato; Ishihara, Miya; Kikuchi, Makoto; Mochida, Joji

    2011-07-01

    Osteoarthritis involves dysfunction caused by cartilage degeneration, but objective evaluation methodologies based on the original function of the articular cartilage remain unavailable. Evaluations for osteoarthritis are mostly based simply on patient symptoms or on the degree of joint space narrowing on X-ray images. Accurate measurement and quantitative evaluation of the mechanical characteristics of the cartilage are important, and the tissue properties of the original articular cartilage must be clarified to understand the pathological condition in detail and to judge the efficacy of treatment correctly. We have developed new methods to measure essential properties of cartilage: a photoacoustic measurement method and time-resolved fluorescence spectroscopy. A nanosecond-pulsed laser, which is completely non-destructive, is focused onto the target cartilage and induces a photoacoustic wave that propagates with attenuation and is affected by the viscoelasticity of the surrounding cartilage. We also investigated whether pulsed laser irradiation and the measurement of excited autofluorescence allow real-time, non-invasive evaluation of tissue characteristics. The decay time, during which the amplitude of the photoacoustic wave is reduced by a factor of 1/e, is the key numerical value used to characterize and evaluate the viscoelasticity and rheological behavior of the cartilage. Our findings show that time-resolved laser-induced autofluorescence spectroscopy (TR-LIFS) is useful for evaluating tissue-engineered cartilage. Photoacoustic measurement and TR-LIFS, predicated on the interactions between light and living tissue, constitute a suitable methodology for diagnosis during arthroscopy, allowing quantitative and multidirectional evaluation of the original function of the cartilage based on a variety of parameters. Copyright © 2011 Wiley-Liss, Inc.
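    The 1/e decay time referenced above is obtained by fitting the measured photoacoustic amplitude envelope to an exponential decay A(t) = A0·exp(-t/tau); tau is then the time over which the amplitude falls by a factor of 1/e. A minimal sketch with synthetic data follows; the sampling, noise level, and tau value are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical photoacoustic amplitude envelope sampled over time (microseconds).
t = np.linspace(0, 5, 200)
rng = np.random.default_rng(1)
signal = 1.8 * np.exp(-t / 1.2) + rng.normal(0, 0.02, t.size)  # true tau = 1.2 us

def decay(t, a0, tau):
    """Exponential decay model: amplitude falls to 1/e of a0 at t = tau."""
    return a0 * np.exp(-t / tau)

(a0, tau), _ = curve_fit(decay, t, signal, p0=(1.0, 1.0))
print(f"decay time (amplitude falls to 1/e): tau ~ {tau:.2f} us")
```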

  16. Radar-derived quantitative precipitation estimation in complex terrain over the eastern Tibetan Plateau

    NASA Astrophysics Data System (ADS)

    Gou, Yabin; Ma, Yingzhao; Chen, Haonan; Wen, Yixin

    2018-05-01

    Quantitative precipitation estimation (QPE) is one of the important applications of weather radar. However, in complex terrain such as the Tibetan Plateau, obtaining an optimal Z-R relation is a challenging task owing to the complex spatial and temporal variability of precipitation microphysics. This paper develops two radar QPE schemes, based respectively on the Reflectivity Threshold (RT) and Storm Cell Identification and Tracking (SCIT) algorithms, using observations from 11 Doppler weather radars and 3264 rain gauges over the Eastern Tibetan Plateau (ETP). These two QPE methodologies are evaluated extensively using four precipitation events characterized by different meteorological features. Precipitation characteristics of independent storm cells associated with these four events, as well as the storm-scale differences, are investigated using short-term vertical profile of reflectivity (VPR) clusters. Evaluation results show that the SCIT-based rainfall approach performs better than the simple RT-based method for all precipitation events in terms of skill scores computed against validation gauge measurements. It is also found that the SCIT-based approach can effectively mitigate local radar QPE errors and represent the spatiotemporal variability of precipitation better than the RT-based scheme.
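    The Z-R relation at the heart of this problem has the power-law form Z = a·R^b, inverted to estimate rain rate R from measured reflectivity. The sketch below uses the classic Marshall-Palmer coefficients (a = 200, b = 1.6) purely as a placeholder; the study's point is precisely that a single fixed Z-R relation is inadequate over complex terrain.

```python
import numpy as np

def rain_rate_from_dbz(dbz, a=200.0, b=1.6):
    """Convert radar reflectivity (dBZ) to rain rate R (mm/h) via Z = a * R**b.

    a=200, b=1.6 are the classic Marshall-Palmer coefficients, used here only
    as an illustrative default, not as the coefficients fitted in the study.
    """
    z = 10.0 ** (dbz / 10.0)   # dBZ -> linear reflectivity factor Z (mm^6 m^-3)
    return (z / a) ** (1.0 / b)

for dbz in (20, 35, 50):
    print(f"{dbz} dBZ -> {rain_rate_from_dbz(dbz):.1f} mm/h")
```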

  17. Comparison of Pre-Service Physics Teachers' Conceptual Understanding of Dynamics in Model-Based Scientific Inquiry and Scientific Inquiry Environments

    ERIC Educational Resources Information Center

    Arslan Buyruk, Arzu; Ogan Bekiroglu, Feral

    2018-01-01

    The focus of this study was to evaluate the impact of model-based inquiry on pre-service physics teachers' conceptual understanding of dynamics. Theoretical framework of this research was based on models-of-data theory. True-experimental design using quantitative and qualitative research methods was carried out for this research. Participants of…

  18. Multi-frequency local wavenumber analysis and ply correlation of delamination damage.

    PubMed

    Juarez, Peter D; Leckey, Cara A C

    2015-09-01

    Wavenumber domain analysis through use of scanning laser Doppler vibrometry has been shown to be effective for non-contact inspection of damage in composites. Qualitative and semi-quantitative local wavenumber analysis of realistic delamination damage and quantitative analysis of idealized damage scenarios (Teflon inserts) have been performed previously in the literature. This paper presents a new methodology based on multi-frequency local wavenumber analysis for quantitative assessment of multi-ply delamination damage in carbon fiber reinforced polymer (CFRP) composite specimens. The methodology is presented and applied to a real world damage scenario (impact damage in an aerospace CFRP composite). The methodology yields delamination size and also correlates local wavenumber results from multiple excitation frequencies to theoretical dispersion curves in order to robustly determine the delamination ply depth. Results from the wavenumber based technique are validated against a traditional nondestructive evaluation method. Published by Elsevier B.V.
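    The ply-depth determination step described above amounts to matching measured local wavenumbers at several excitation frequencies against theoretical dispersion curves computed for each candidate remaining thickness, then choosing the thickness with the smallest misfit. The sketch below illustrates that lookup with a hypothetical dispersion table; real dispersion curves would come from a solver for the specific CFRP layup, and all numbers here are invented.

```python
import numpy as np

# Hypothetical dispersion lookup: A0-mode wavenumber (rad/mm) for each candidate
# remaining thickness above the delamination, at each excitation frequency (kHz).
frequencies = np.array([100.0, 150.0, 200.0])   # kHz
depths_mm = np.array([0.5, 1.0, 1.5, 2.0])      # candidate remaining thicknesses
dispersion = np.array([                         # k[depth, frequency]
    [2.9, 3.6, 4.2],
    [2.1, 2.6, 3.0],
    [1.7, 2.1, 2.5],
    [1.5, 1.8, 2.1],
])

# Local wavenumbers measured over the damage region at the same frequencies.
k_measured = np.array([2.0, 2.7, 3.1])

# Pick the depth whose dispersion curve best matches across all frequencies.
errors = ((dispersion - k_measured) ** 2).sum(axis=1)
best = depths_mm[np.argmin(errors)]
print(f"estimated remaining thickness above delamination: {best} mm")
```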

  19. Communication among neurons.

    PubMed

    Marner, Lisbeth

    2012-04-01

    The communication among neurons is the prerequisite for the working brain. To understand the cellular, neurochemical, and structural basis of this communication, and the impacts of aging and disease on brain function, quantitative measures are necessary. This thesis evaluates several quantitative neurobiological methods with respect to possible bias and methodological issues. Stereological methods are suited for the unbiased estimation of the number, length, and volume of components of the nervous system. Stereological estimates of the total length of myelinated nerve fibers were made in the white matter of post mortem brains, and the impacts of aging and of diseases such as schizophrenia and Alzheimer's disease were evaluated. Although stereological methods are in principle unbiased, shrinkage artifacts are difficult to account for. Positron emission tomography (PET) recordings, in conjunction with kinetic modeling, permit the quantitation of radioligand binding in the brain. The novel serotonin 5-HT4 antagonist [11C]SB207145 was used as an example of the validation process for quantitative PET receptor imaging. Methods based on reference tissue as well as methods based on an arterial plasma input function were evaluated with respect to precision and accuracy. It was shown that [11C]SB207145 binding had high sensitivity to occupancy by unlabeled ligand, necessitating high specific activity in the radiosynthesis to avoid bias. The established serotonin 5-HT2A ligand [18F]altanserin was evaluated in a two-year follow-up study in elderly subjects. Application of partial volume correction to the PET data diminished the reliability of the measures, but allowed for the correct distinction between changes due to brain atrophy and changes in receptor availability. Furthermore, a PET study of patients with Alzheimer's disease using the serotonin transporter ligand [11C]DASB showed relatively preserved serotonergic projections, despite a marked decrease in 5-HT2A receptor binding. Possible confounders are considered and the relation to the prevailing beta-amyloid hypothesis is discussed.

  20. Comparison of blood flow models and acquisitions for quantitative myocardial perfusion estimation from dynamic CT

    NASA Astrophysics Data System (ADS)

    Bindschadler, Michael; Modgil, Dimple; Branch, Kelley R.; La Riviere, Patrick J.; Alessio, Adam M.

    2014-04-01

    Myocardial blood flow (MBF) can be estimated from dynamic contrast enhanced (DCE) cardiac CT acquisitions, leading to quantitative assessment of regional perfusion. The need for low radiation dose and the lack of consensus on MBF estimation methods motivate this study to refine the selection of acquisition protocols and models for CT-derived MBF. DCE cardiac CT acquisitions were simulated for a range of flow states (MBF = 0.5, 1, 2, 3 mL min⁻¹ g⁻¹; cardiac output = 3, 5, 8 L min⁻¹). Patient kinetics were generated by a mathematical model of iodine exchange incorporating numerous physiological features, including heterogeneous microvascular flow, permeability, and capillary contrast gradients. CT acquisitions were simulated for multiple realizations of realistic x-ray flux levels. Acquisitions that reduce radiation exposure were implemented by varying both the temporal sampling (1, 2, and 3 s sampling intervals) and the tube current-time product (140, 70, and 25 mAs). For all acquisitions, we compared three quantitative MBF estimation methods (a two-compartment model, an axially distributed model, and the adiabatic approximation to the tissue homogeneous model) and a qualitative slope-based method. In total, over 11 000 time-attenuation curves were used to evaluate MBF estimation in multiple patient and imaging scenarios. After iodine-based beam hardening correction, the slope method consistently underestimated flow by on average 47.5%, while the quantitative models provided estimates with less than 6.5% average bias and variance that increased with increasing dose reduction. The three quantitative models performed equally well, offering estimates with essentially identical root mean squared error (RMSE) for matched acquisitions. MBF estimates from the qualitative slope method were inferior in terms of bias and RMSE compared with the quantitative methods. MBF estimation error was equal at matched dose reductions for all quantitative methods across the range of techniques evaluated, suggesting that there is no particular advantage among the quantitative estimation methods, nor between dose reduction via tube current reduction and via reduced temporal sampling. These data are important for optimizing the implementation of cardiac dynamic CT in clinical practice and in prospective CT MBF trials.
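    The qualitative slope-based method compared above is commonly implemented (e.g., following the Mullani-Gould formulation) as the maximum upslope of the tissue time-attenuation curve divided by the peak arterial enhancement. A minimal sketch with synthetic curves follows; the arterial input shape, the no-washout tissue model, and all numbers are hypothetical, and this is not the paper's simulation framework.

```python
import numpy as np

# Hypothetical dynamic-CT curves at 1 s sampling (enhancement above baseline, HU).
t = np.arange(0.0, 30.0, 1.0)
aif = 300.0 * np.exp(-((t - 12.0) ** 2) / 18.0)   # arterial input function (toy)

# In the no-washout limit, tissue enhancement is MBF times the integral of the
# AIF; generate tissue data with a known flow of 2 mL/min/g (per-second units).
mbf_true = 2.0 / 60.0
tissue = mbf_true * np.cumsum(aif)                # cumsum ~ integral with dt = 1 s

# Slope method: MBF ~ max tissue upslope divided by peak arterial enhancement.
mbf_est = np.gradient(tissue, t).max() / aif.max() * 60.0
print(f"slope-method estimate ~ {mbf_est:.2f} mL/min/g (true 2.00)")
```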

  1. QPROT: Statistical method for testing differential expression using protein-level intensity data in label-free quantitative proteomics.

    PubMed

    Choi, Hyungwon; Kim, Sinae; Fermin, Damian; Tsou, Chih-Chiang; Nesvizhskii, Alexey I

    2015-11-03

    We introduce QPROT, a statistical framework and computational tool for differential protein expression analysis using protein intensity data. QPROT is an extension of the QSPEC suite, originally developed for spectral count data, adapted for analysis using continuously measured protein-level intensity data. QPROT offers a new intensity normalization procedure and model-based differential expression analysis, both of which account for missing data. Determination of differential expression for each protein is based on a standardized Z-statistic derived from the posterior distribution of the log-fold-change parameter, guided by the false discovery rate estimated by a well-known empirical Bayes method. We evaluated the classification performance of QPROT using the quantification calibration data from the Clinical Proteomic Technology Assessment for Cancer (CPTAC) study and a recently published Escherichia coli benchmark dataset, with evaluation of FDR accuracy in the latter. QPROT is a statistical framework with an accompanying software tool for comparative quantitative proteomics analysis. It features various extensions of the QSPEC method, originally built for spectral count data analysis, including probabilistic treatment of missing values in protein intensity data. With the increasing popularity of label-free quantitative proteomics data, the proposed method and accompanying software suite will be immediately useful for many proteomics laboratories. This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015 Elsevier B.V. All rights reserved.

  2. The challenges of quantitative evaluation of a multi-setting, multi-strategy community-based childhood obesity prevention programme: lessons learnt from the eat well be active Community Programs in South Australia.

    PubMed

    Wilson, Annabelle M; Magarey, Anthea M; Dollman, James; Jones, Michelle; Mastersson, Nadia

    2010-08-01

    To describe the rationale, development and implementation of the quantitative component of the evaluation of a multi-setting, multi-strategy, community-based childhood obesity prevention project (the eat well be active (ewba) Community Programs), the challenges associated with this process, and some potential solutions. ewba has a quasi-experimental design with intervention and comparison communities. Baseline data were collected in 2006 and post-intervention measures will be taken from a non-matched cohort in 2009. Schoolchildren aged 10-12 years were chosen as one litmus group for evaluation purposes. Thirty-nine primary schools in two metropolitan and two rural communities in South Australia participated. A total of 1732 10-12-year-old school students completed a nutrition and/or a physical activity questionnaire and 1637 had anthropometric measures taken; 983 parents, 286 teachers, thirty-six principals, twenty-six canteen workers and thirteen out-of-school-hours care (OSHC) workers completed Program-specific questionnaires developed for each of these target groups. The overall child response rate for the study was 49%. Sixty-five per cent, 43%, 90%, 90% and 68% of parents, teachers, principals, canteen and OSHC workers, respectively, completed and returned questionnaires. A number of practical, logistical and methodological challenges were experienced when undertaking this data collection. Lessons from the quantitative baseline data collection for the ewba Community Programs can provide insights for other researchers planning similar studies with similar methods, particularly those evaluating multi-strategy programmes across multiple settings.

  3. Using multiple methods to assess learning and outcomes in an online degree-granting dental hygiene program.

    PubMed

    Springfield, Emily; Gwozdek, Anne E; Peet, Melissa; Kerschbaum, Wendy E

    2012-04-01

    Program evaluation is a necessary component of curricular change and innovation. It ascertains whether an innovation has met benchmarks and contributes to the body of knowledge about educational methodologies and supports the use of evidence-based practice in teaching. Education researchers argue that rigorous program evaluation should utilize a mixed-method approach, triangulating both qualitative and quantitative methods to understand program effectiveness. This approach was used to evaluate the University of Michigan Dental Hygiene Degree Completion E-Learning (online) Program. Quantitative data included time spent on coursework, grades, publications, course evaluation results, and survey responses. Qualitative data included student and faculty responses in focus groups and on surveys as well as students' portfolio reflections. The results showed the program was academically rigorous, fostering students' ability to connect theory with practice and apply evidence-based practice principles. These results also demonstrated that the students had learned to critically reflect on their practice and develop expanded professional identities; going beyond the role of clinician, they began to see themselves as educators, advocates, and researchers. This evaluation model is easily adaptable and is applicable to any health science or other professional degree program. This study also raised important questions regarding the effect of meta-reflection on student confidence and professional behavior.

  4. Classroom versus Computer-Based CPR Training: A Comparison of the Effectiveness of Two Instructional Methods

    ERIC Educational Resources Information Center

    Rehberg, Robb S.; Gazzillo Diaz, Linda; Middlemas, David A.

    2009-01-01

    Objective: The objective of this study was to determine whether computer-based CPR training is comparable to traditional classroom training. Design and Setting: This study was quantitative in design. Data were gathered from a standardized examination and a skill performance evaluation, which yielded numerical scores. Subjects: The subjects were 64…

  5. User and System-Based Quality Criteria for Evaluating Information Resources and Services Available from Federal Websites: Final Report.

    ERIC Educational Resources Information Center

    Wyman, Steven K.; And Others

    This exploratory study establishes analytical tools (based on both technical criteria and user feedback) by which federal Web site administrators may assess the quality of their websites. The study combined qualitative and quantitative data collection techniques to achieve the following objectives: (1) identify and define key issues regarding…

  6. A comparison of region-based and pixel-based CEUS kinetics parameters in the assessment of arthritis

    NASA Astrophysics Data System (ADS)

    Grisan, E.; Raffeiner, B.; Coran, A.; Rizzo, G.; Ciprian, L.; Stramare, R.

    2014-03-01

    Inflammatory rheumatic diseases are leading causes of disability and constitute a frequent medical disorder, leading to inability to work, high comorbidity and increased mortality. The gold standard for diagnosing and differentiating arthritis is based on patient conditions and radiographic findings, such as joint erosions or decalcification. However, early signs of arthritis are joint effusion, hypervascularization and synovial hypertrophy. In particular, vascularization has been shown to correlate with arthritis' destructive behavior better than clinical assessment does. Contrast Enhanced Ultrasound (CEUS) examination of the small joints is emerging as a sensitive tool for assessing vascularization and disease activity. The evaluation of perfusion patterns relies on subjective semi-quantitative scales that capture the macroscopic degree of vascularization but are unable to detect the subtler differences in perfusion kinetics parameters that might lead to a deeper understanding of disease progression and better management of patients. Quantitative assessment is mostly performed by means of the Qontrast software package, which requires the user to define a region of interest whose mean intensity curve is fitted with an exponential function. We show that by using a more physiologically motivated perfusion curve, and by estimating the kinetics parameters separately pixel by pixel, the quantitative information gathered differentiates perfusion patterns more effectively. In particular, we show that a pixel-based analysis provides significant markers differentiating rheumatoid arthritis from simil-rheumatoid psoriatic arthritis, which show non-significant differences in clinical evaluation (DAS28), serological markers and region-based parameters.
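
    A note for readers implementing this kind of analysis: pixel-based kinetics estimation reduces, in its simplest form, to an independent nonlinear least-squares fit of a perfusion model to each pixel's intensity-time curve. The sketch below assumes a simple mono-exponential wash-in model and illustrative array names; it is not the authors' physiologically motivated model, nor the Qontrast implementation.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def wash_in(t, a, k):
        # Mono-exponential wash-in model: C(t) = A * (1 - exp(-k * t))
        return a * (1.0 - np.exp(-k * t))

    def fit_pixel_kinetics(frames, t):
        """Fit the wash-in model independently for every pixel.

        frames: (T, H, W) array of linearized CEUS intensities over time
        t:      (T,) acquisition times in seconds
        Returns (H, W) maps of the amplitude A and rate constant k.
        """
        _, h, w = frames.shape
        a_map = np.full((h, w), np.nan)
        k_map = np.full((h, w), np.nan)
        for i in range(h):
            for j in range(w):
                y = frames[:, i, j]
                try:
                    (a, k), _ = curve_fit(wash_in, t, y,
                                          p0=(max(float(y.max()), 1e-6), 0.1),
                                          maxfev=2000)
                    a_map[i, j], k_map[i, j] = a, k
                except RuntimeError:
                    pass  # leave NaN where the fit does not converge
        return a_map, k_map
    ```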

  7. The use of digital PCR to improve the application of quantitative molecular diagnostic methods for tuberculosis.

    PubMed

    Devonshire, Alison S; O'Sullivan, Denise M; Honeyborne, Isobella; Jones, Gerwyn; Karczmarczyk, Maria; Pavšič, Jernej; Gutteridge, Alice; Milavec, Mojca; Mendoza, Pablo; Schimmel, Heinz; Van Heuverswyn, Fran; Gorton, Rebecca; Cirillo, Daniela Maria; Borroni, Emanuele; Harris, Kathryn; Barnard, Marinus; Heydenrych, Anthenette; Ndusilo, Norah; Wallis, Carole L; Pillay, Keshree; Barry, Thomas; Reddington, Kate; Richter, Elvira; Mozioğlu, Erkan; Akyürek, Sema; Yalçınkaya, Burhanettin; Akgoz, Muslum; Žel, Jana; Foy, Carole A; McHugh, Timothy D; Huggett, Jim F

    2016-08-03

    Real-time PCR (qPCR)-based methods, such as the Xpert MTB/RIF, are increasingly being used to diagnose tuberculosis (TB). While qualitative methods are adequate for diagnosis, the therapeutic monitoring of TB patients requires quantitative methods currently performed using smear microscopy. The potential use of quantitative molecular measurements for therapeutic monitoring has been investigated, but findings have been variable and inconclusive. The lack of an adequate reference method and reference materials is a barrier to understanding the source of such disagreement. Digital PCR (dPCR) offers the potential for an accurate method for quantification of specific DNA sequences in reference materials, which can be used to evaluate quantitative molecular methods for TB treatment monitoring. To assess a novel approach for the development of quality assurance materials, we used dPCR to quantify specific DNA sequences in a range of prototype reference materials and evaluated accuracy between different laboratories and instruments. The materials were then also used to evaluate the quantitative performance of qPCR and Xpert MTB/RIF in eight clinical testing laboratories. dPCR was found to provide results in good agreement with the other methods tested and to be highly reproducible between laboratories without calibration, even when using different instruments. When the reference materials were analysed with qPCR and Xpert MTB/RIF by clinical laboratories, all laboratories were able to correctly rank the reference materials according to concentration; however, there was a marked difference in the measured magnitude. TB is a disease where quantification of the pathogen could lead to better patient management, and qPCR methods offer the potential to rapidly perform such analysis. However, our findings suggest that when precisely characterised materials are used to evaluate qPCR methods, the variation in measurement results is too high to determine whether molecular quantification of Mycobacterium tuberculosis would provide a clinically useful readout. The methods described in this study provide a means by which the technical performance of quantitative molecular methods can be evaluated independently of clinical variability to improve the accuracy of measurement results. These will assist in ultimately increasing the likelihood that such approaches could be used to improve patient management of TB.
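
    The partition counting that underlies dPCR quantification can be made concrete with the standard Poisson correction. The sketch below is a minimal illustration with hypothetical partition counts and volume; it is not the study's analysis pipeline.

    ```python
    import math

    def dpcr_concentration(positive, total, partition_volume_ul):
        """Estimate target concentration (copies/µL) from dPCR counts.

        With a fraction p = positive/total of positive partitions, the mean
        number of copies per partition is lambda = -ln(1 - p) (Poisson), so
        concentration = lambda / partition volume. Assumes at least one
        negative partition (p < 1).
        """
        p = positive / total
        lam = -math.log(1.0 - p)
        return lam / partition_volume_ul

    # Hypothetical run: 6,000 positives out of 20,000 partitions of 0.85 nL
    print(f"{dpcr_concentration(6000, 20000, 0.00085):.0f} copies/µL")  # ~420
    ```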

  8. The other half of the story: effect size analysis in quantitative research.

    PubMed

    Maher, Jessica Middlemis; Markey, Jonathan C; Ebert-May, Diane

    2013-01-01

    Statistical significance testing is the cornerstone of quantitative research, but studies that fail to report measures of effect size are potentially missing a robust part of the analysis. We provide a rationale for why effect size measures should be included in quantitative discipline-based education research. Examples from both biological and educational research demonstrate the utility of effect size for evaluating practical significance. We also provide details about some effect size indices that are paired with common statistical significance tests used in educational research and offer general suggestions for interpreting effect size measures. Finally, we discuss some inherent limitations of effect size measures and provide further recommendations about reporting confidence intervals.
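
    For a concrete instance of pairing a significance test with an effect size, the sketch below computes Cohen's d with a pooled standard deviation alongside a two-sample t-test; the scores are hypothetical.

    ```python
    import numpy as np
    from scipy import stats

    def cohens_d(x, y):
        """Cohen's d for two independent groups, using the pooled SD."""
        nx, ny = len(x), len(y)
        pooled_var = ((nx - 1) * np.var(x, ddof=1) +
                      (ny - 1) * np.var(y, ddof=1)) / (nx + ny - 2)
        return (np.mean(x) - np.mean(y)) / np.sqrt(pooled_var)

    # Hypothetical exam scores for two instructional conditions
    treatment = np.array([78, 85, 91, 74, 88, 80, 86])
    control = np.array([70, 72, 81, 68, 77, 75, 79])
    t, p = stats.ttest_ind(treatment, control)
    d = cohens_d(treatment, control)
    print(f"t = {t:.2f}, p = {p:.3f}, d = {d:.2f}")  # report d alongside p
    ```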

  9. Performance evaluation methodology for historical document image binarization.

    PubMed

    Ntirogiannis, Konstantinos; Gatos, Basilis; Pratikakis, Ioannis

    2013-02-01

    Document image binarization is of great importance in the document image analysis and recognition pipeline since it affects further stages of the recognition process. The evaluation of a binarization method aids in studying its algorithmic behavior, as well as verifying its effectiveness, by providing qualitative and quantitative indication of its performance. This paper addresses a pixel-based binarization evaluation methodology for historical handwritten/machine-printed document images. In the proposed evaluation scheme, the recall and precision evaluation measures are properly modified using a weighting scheme that diminishes any potential evaluation bias. Additional performance metrics of the proposed evaluation scheme consist of the percentage rates of broken and missed text, false alarms, background noise, character enlargement, and merging. Several experiments conducted in comparison with other pixel-based evaluation measures demonstrate the validity of the proposed evaluation scheme.
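
    The unweighted core of pixel-based recall and precision can be written in a few lines; the sketch below omits the paper's bias-reducing weighting scheme and assumes boolean foreground (text) masks.

    ```python
    import numpy as np

    def binarization_scores(result, ground_truth):
        """Pixel-based recall, precision and F-measure for a binarized image.

        Both inputs are boolean arrays where True marks foreground pixels.
        This is the plain, unweighted form of the measures.
        """
        tp = np.logical_and(result, ground_truth).sum()
        fp = np.logical_and(result, ~ground_truth).sum()
        fn = np.logical_and(~result, ground_truth).sum()
        recall = tp / (tp + fn)
        precision = tp / (tp + fp)
        f_measure = 2 * recall * precision / (recall + precision)
        return recall, precision, f_measure
    ```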

  10. Biomarkers and Surrogate Endpoints in Uveitis: The Impact of Quantitative Imaging.

    PubMed

    Denniston, Alastair K; Keane, Pearse A; Srivastava, Sunil K

    2017-05-01

    Uveitis is a major cause of sight loss across the world. The reliable assessment of intraocular inflammation in uveitis ('disease activity') is essential in order to score disease severity and response to treatment. In this review, we describe how 'quantitative imaging', the approach of using automated analysis and measurement algorithms across both standard and emerging imaging modalities, can be used to develop objective, instrument-based measures of disease activity. This is a narrative review based on searches of the current world literature using terms related to quantitative imaging techniques in uveitis, supplemented by clinical trial registry data and expert knowledge of surrogate endpoints and outcome measures in ophthalmology. Current measures of disease activity are largely based on subjective clinical estimation, and are relatively insensitive, with poor discrimination and reliability. The development of quantitative imaging in uveitis is most established in the use of optical coherence tomographic (OCT) measurement of central macular thickness (CMT) to measure the severity of macular edema (ME). The transformative effect of CMT in the clinical assessment of patients with ME provides a paradigm for the development and impact of other forms of quantitative imaging. Quantitative imaging approaches are now being developed and validated for other key inflammatory parameters such as anterior chamber cells, vitreous haze, retinovascular leakage, and chorioretinal infiltrates. As new forms of quantitative imaging in uveitis are proposed, the uveitis community will need to evaluate these tools against the current subjective clinical estimates and reach a new consensus for how disease activity in uveitis should be measured. The development, validation, and adoption of sensitive and discriminatory measures of disease activity is an unmet need that has the potential to transform both drug development and routine clinical care for the patient with uveitis.

  11. Detection and differentiation of early acute and following age stages of myocardial infarction with quantitative post-mortem cardiac 1.5T MR.

    PubMed

    Schwendener, Nicole; Jackowski, Christian; Persson, Anders; Warntjes, Marcel J; Schuster, Frederick; Riva, Fabiano; Zech, Wolf-Dieter

    2017-01-01

    Recently, quantitative MR sequences have started being used in post-mortem imaging. The goal of the present study was to evaluate whether early acute and subsequent age stages of myocardial infarction can be detected and discerned by quantitative 1.5T post-mortem cardiac magnetic resonance (PMCMR) based on quantitative T1, T2 and PD values. In 80 deceased individuals (25 female, 55 male), a cardiac MR quantification sequence was performed prior to cardiac dissection at autopsy in a prospective study. Focal myocardial signal alterations detected in synthetically generated MR images were MR-quantified for their T1, T2 and PD values. The locations of the signal alteration measurements in PMCMR were targeted at autopsy heart dissection, and cardiac tissue specimens were taken for histologic examination. Quantified signal alterations in PMCMR were correlated with their corresponding histologic age stage of myocardial infarction. In PMCMR, seventy-three focal myocardial signal alterations were detected in 49 of the 80 investigated hearts. These signal alterations were diagnosed histologically as early acute (n=39), acute (n=14), subacute (n=10) and chronic (n=10) age stages of myocardial infarction. Statistical analysis revealed that, based on their quantitative T1, T2 and PD values, a significant difference between all defined age groups of myocardial infarction can be determined. It can be concluded that quantitative 1.5T PMCMR based on T1, T2 and PD values is feasible for the characterization and differentiation of early acute and subsequent age stages of myocardial infarction. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  12. Quantitative analysis of fatty-acid-based biofuels produced by wild-type and genetically engineered cyanobacteria by gas chromatography-mass spectrometry.

    PubMed

    Guan, Wenna; Zhao, Hui; Lu, Xuefeng; Wang, Cong; Yang, Menglong; Bai, Fali

    2011-11-11

    Simple and rapid quantitative determination of fatty-acid-based biofuels is of great importance for studying the progress of genetic engineering for biofuel production by microalgae. Ideal biofuels produced from biological systems should be chemically similar to petroleum, i.e. fatty-acid-based molecules including free fatty acids, fatty acid methyl esters, fatty acid ethyl esters, fatty alcohols and fatty alkanes. This study established a gas chromatography-mass spectrometry (GC-MS) method for the simultaneous quantification of seven free fatty acids, nine fatty acid methyl esters, five fatty acid ethyl esters, five fatty alcohols and three fatty alkanes produced by wild-type Synechocystis PCC 6803 and its genetically engineered strain. Data obtained from GC-MS analyses were quantified using internal standard peak area comparisons. The linearity, limit of detection (LOD) and precision (RSD) of the method were evaluated. The results demonstrated that fatty-acid-based biofuels can be directly determined by GC-MS without derivatization. Therefore, rapid and reliable quantitative analysis of fatty-acid-based biofuels produced by wild-type and genetically engineered cyanobacteria can be achieved using the GC-MS method established in this work. Copyright © 2011 Elsevier B.V. All rights reserved.
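
    Quantification by internal standard peak-area comparison reduces to a simple ratio calculation; in the sketch below the peak areas and the relative response factor (RRF) are hypothetical values standing in for prior calibration.

    ```python
    def analyte_concentration(area_analyte, area_istd, conc_istd, rrf):
        """Single-point internal-standard quantification.

        conc = (A_analyte / A_IS) * C_IS / RRF, with the relative response
        factor RRF determined beforehand from a calibration standard.
        """
        return (area_analyte / area_istd) * conc_istd / rrf

    # Hypothetical GC-MS peak areas; internal standard at 10.0 µg/mL
    c = analyte_concentration(2.4e6, 1.1e6, conc_istd=10.0, rrf=0.92)
    print(f"{c:.1f} µg/mL")
    ```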

  13. Quality Assessments of Long-Term Quantitative Proteomic Analysis of Breast Cancer Xenograft Tissues

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Jian-Ying; Chen, Lijun; Zhang, Bai

    The identification of protein biomarkers requires large-scale analysis of human specimens to achieve statistical significance. In this study, we evaluated the long-term reproducibility of an iTRAQ (isobaric tags for relative and absolute quantification)-based quantitative proteomics strategy using one channel for universal normalization across all samples. A total of 307 liquid chromatography tandem mass spectrometric (LC-MS/MS) analyses were completed, generating 107 one-dimensional (1D) LC-MS/MS datasets and 8 offline two-dimensional (2D) LC-MS/MS datasets (25 fractions for each set) for human-in-mouse breast cancer xenograft tissues representative of basal and luminal subtypes. Such large-scale studies require the implementation of robust metrics to assess the contributions of technical and biological variability in the qualitative and quantitative data. Accordingly, we developed a quantification confidence score based on the quality of each peptide-spectrum match (PSM) to remove quantification outliers from each analysis. After combining confidence score filtering and statistical analysis, reproducible protein identification and quantitative results were achieved from LC-MS/MS datasets collected over a 16-month period.

  14. The Mathematics of a Successful Deconvolution: A Quantitative Assessment of Mixture-Based Combinatorial Libraries Screened Against Two Formylpeptide Receptors

    PubMed Central

    Santos, Radleigh G.; Appel, Jon R.; Giulianotti, Marc A.; Edwards, Bruce S.; Sklar, Larry A.; Houghten, Richard A.; Pinilla, Clemencia

    2014-01-01

    In the past 20 years, synthetic combinatorial methods have fundamentally advanced the ability to synthesize and screen large numbers of compounds for drug discovery and basic research. Mixture-based libraries and positional scanning deconvolution combine two approaches for the rapid identification of specific scaffolds and active ligands. Here we present a quantitative assessment of the screening of 32 positional scanning libraries in the identification of highly specific and selective ligands for two formylpeptide receptors. We also compare and contrast two mixture-based library approaches using a mathematical model to facilitate the selection of active scaffolds and libraries to be pursued for further evaluation. The flexibility demonstrated in the differently formatted mixture-based libraries allows for their screening in a wide range of assays. PMID:23722730

  15. Methodological reporting in qualitative, quantitative, and mixed methods health services research articles.

    PubMed

    Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A

    2012-04-01

    Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. All empirical articles (n = 1,651) published between 2003 and 2007 from four top-ranked health services journals. All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed method reports). We used chi-square tests to evaluate differences between article types for each component. Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ²(1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ²(1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Few published health services research articles use mixed methods. The frequency of key methodological components is variable. Suggestions are provided to increase the transparency of mixed methods studies and the presence of key methodological components in published reports. © Health Research and Educational Trust.
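
    The χ² comparison of component proportions between article types can be reproduced in outline as follows; the 2×2 counts are hypothetical, not the study's data.

    ```python
    import numpy as np
    from scipy.stats import chi2_contingency

    # Hypothetical 2x2 table: rows = article type (mixed methods, quantitative),
    # columns = methodological component (present, absent)
    table = np.array([[110, 390],
                      [235, 265]])
    chi2, p, dof, expected = chi2_contingency(table, correction=False)
    n = table.sum()
    cramers_v = np.sqrt(chi2 / (n * (min(table.shape) - 1)))
    print(f"chi2({dof}) = {chi2:.2f}, p = {p:.4g}, Cramer's V = {cramers_v:.2f}")
    ```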

  16. Methodological Reporting in Qualitative, Quantitative, and Mixed Methods Health Services Research Articles

    PubMed Central

    Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A

    2012-01-01

    Objectives Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. Data Sources All empirical articles (n = 1,651) published between 2003 and 2007 from four top-ranked health services journals. Study Design All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed method reports). We used chi-square tests to evaluate differences between article types for each component. Principal Findings Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ²(1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ²(1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Conclusion Few published health services research articles use mixed methods. The frequency of key methodological components is variable. Suggestions are provided to increase the transparency of mixed methods studies and the presence of key methodological components in published reports. PMID:22092040

  17. Quantitative analysis of 18F-NaF dynamic PET/CT cannot differentiate malignant from benign lesions in multiple myeloma

    PubMed Central

    Sachpekidis, Christos; Hillengass, Jens; Goldschmidt, Hartmut; Anwar, Hoda; Haberkorn, Uwe; Dimitrakopoulou-Strauss, Antonia

    2017-01-01

    Renewed interest has recently developed in the highly sensitive bone-seeking radiopharmaceutical 18F-NaF. The aim of the present study is to evaluate the potential utility of quantitative analysis of 18F-NaF dynamic PET/CT data in differentiating malignant from benign degenerative lesions in multiple myeloma (MM). 80 MM patients underwent whole-body PET/CT and dynamic PET/CT scanning of the pelvis with 18F-NaF. PET/CT data evaluation was based on visual (qualitative) assessment, semi-quantitative (SUV) calculations, and absolute quantitative estimations after application of a 2-tissue compartment model and a non-compartmental approach leading to the extraction of fractal dimension (FD). In total, 263 MM lesions were demonstrated on 18F-NaF PET/CT. Semi-quantitative and quantitative evaluations were performed for 25 MM lesions as well as for 25 benign, degenerative and traumatic lesions. Mean SUVaverage for MM lesions was 11.9 and mean SUVmax was 23.2. The respective SUVaverage and SUVmax for degenerative lesions were 13.5 and 20.2. Kinetic analysis of 18F-NaF revealed the following mean values for MM lesions: K1 = 0.248 (1/min), k3 = 0.359 (1/min), influx (Ki) = 0.107 (1/min), FD = 1.382, while the respective values for degenerative lesions were: K1 = 0.169 (1/min), k3 = 0.422 (1/min), influx (Ki) = 0.095 (1/min), FD = 1.411. No statistically significant differences between MM and benign degenerative disease regarding SUVaverage, SUVmax, K1, k3 and influx (Ki) were demonstrated. FD was significantly higher in degenerative than in malignant lesions. The present findings show that quantitative analysis of 18F-NaF PET data cannot differentiate malignant from benign degenerative lesions in MM patients, supporting previously published results, which reflect the limited role of 18F-NaF PET/CT in the diagnostic workup of MM. PMID:28913153

  18. Evaluating 'good governance': The development of a quantitative tool in the Greater Serengeti Ecosystem.

    PubMed

    Kisingo, Alex; Rollins, Rick; Murray, Grant; Dearden, Phil; Clarke, Marlea

    2016-10-01

    Protected areas (PAs) can provide important benefits to conservation and to communities. A key factor in the effective delivery of these benefits is the role of governance. There has been a growth in research developing frameworks to evaluate 'good' PA governance, usually drawing on a set of principles that are associated with groups of indicators. In contrast to dominant qualitative approaches, this paper describes the development of a quantitative method for measuring effectiveness of protected area governance, as perceived by stakeholders in the Greater Serengeti Ecosystem in Tanzania. The research developed a quantitative method for developing effectiveness measures of PA governance, using a set of 65 statements related to governance principles developed from a literature review. The instrument was administered to 389 individuals from communities located near PAs in the Greater Serengeti Ecosystem. The results of a factor analysis suggest that statements load onto 10 factors that demonstrate high psychometric validity as measured by factor loadings, explained variance, and Cronbach's alpha reliability. The ten common factors that were extracted were: 1) legitimacy, 2) transparency and accountability, 3) responsiveness, 4) fairness, 5) participation, 6) ecosystem based management (EBM) and connectivity, 7) resilience, 8) achievements, 9) consensus orientation, and 10) power. The paper concludes that quantitative surveys can be used to evaluate governance of protected areas from a community-level perspective. Copyright © 2016 Elsevier Ltd. All rights reserved.
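
    Internal-consistency reliability of extracted factors of this kind is conventionally summarized with Cronbach's alpha; a minimal sketch, assuming a small hypothetical matrix of Likert responses for one four-item factor:

    ```python
    import numpy as np

    def cronbachs_alpha(items):
        """Cronbach's alpha for an (n_respondents, n_items) score matrix.

        alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))
        """
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_vars / total_var)

    # Hypothetical 5-point Likert responses (5 respondents x 4 items)
    responses = np.array([[4, 5, 4, 4],
                          [2, 2, 3, 2],
                          [5, 4, 5, 5],
                          [3, 3, 2, 3],
                          [4, 4, 4, 5]])
    print(f"alpha = {cronbachs_alpha(responses):.2f}")  # ~0.94 here
    ```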

  19. Toward standardized quantitative image quality (IQ) assessment in computed tomography (CT): A comprehensive framework for automated and comparative IQ analysis based on ICRU Report 87.

    PubMed

    Pahn, Gregor; Skornitzke, Stephan; Schlemmer, Hans-Peter; Kauczor, Hans-Ulrich; Stiller, Wolfram

    2016-01-01

    Based on the guidelines from "Report 87: Radiation Dose and Image-quality Assessment in Computed Tomography" of the International Commission on Radiation Units and Measurements (ICRU), a software framework for automated quantitative image quality analysis was developed and its usability for a variety of scientific questions demonstrated. The extendable framework currently implements the calculation of the recommended Fourier image quality (IQ) metrics modulation transfer function (MTF) and noise-power spectrum (NPS), and additional IQ quantities such as noise magnitude, CT number accuracy, uniformity across the field-of-view, contrast-to-noise ratio (CNR) and signal-to-noise ratio (SNR) of simulated lesions for a commercially available cone-beam phantom. Sample image data were acquired with different scan and reconstruction settings on CT systems from different manufacturers. Spatial resolution is analyzed in terms of edge-spread function, line-spread-function, and MTF. 3D NPS is calculated according to ICRU Report 87, and condensed to 2D and radially averaged 1D representations. Noise magnitude, CT numbers, and uniformity of these quantities are assessed on large samples of ROIs. Low-contrast resolution (CNR, SNR) is quantitatively evaluated as a function of lesion contrast and diameter. Simultaneous automated processing of several image datasets allows for straightforward comparative assessment. The presented framework enables systematic, reproducible, automated and time-efficient quantitative IQ analysis. Consistent application of the ICRU guidelines facilitates standardization of quantitative assessment not only for routine quality assurance, but for a number of research questions, e.g. the comparison of different scanner models or acquisition protocols, and the evaluation of new technology or reconstruction methods. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
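
    A minimal sketch of the NPS part of such a framework, following the general ICRU Report 87 recipe (detrended ROIs from uniform images, ensemble-averaged 2D periodogram, radial condensation to 1D); the function names and binning are illustrative, not the authors' implementation.

    ```python
    import numpy as np

    def nps_2d(rois, pixel_size):
        """2D noise-power spectrum from uniform-region ROIs (ICRU 87 style).

        rois: (N, ny, nx) stack of ROIs from uniform phantom images
        NPS(u, v) = (dx * dy) / (nx * ny) * <|FFT2(ROI - mean)|^2>
        """
        _, ny, nx = rois.shape
        detrended = rois - rois.mean(axis=(1, 2), keepdims=True)
        spectra = np.abs(np.fft.fft2(detrended)) ** 2
        return pixel_size ** 2 / (nx * ny) * spectra.mean(axis=0)

    def radial_average(nps, pixel_size, n_bins=50):
        """Condense a 2D NPS to a radially averaged 1D profile.

        Empty frequency bins yield NaN and can be masked downstream.
        """
        ny, nx = nps.shape
        u = np.fft.fftfreq(nx, d=pixel_size)
        v = np.fft.fftfreq(ny, d=pixel_size)
        rr = np.sqrt(u[None, :] ** 2 + v[:, None] ** 2)
        bins = np.linspace(0, rr.max(), n_bins)
        idx = np.digitize(rr.ravel(), bins)
        profile = np.array([nps.ravel()[idx == i].mean()
                            for i in range(1, n_bins + 1)])
        return bins, profile
    ```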

  20. An online credit evaluation method based on AHP and SPA

    NASA Astrophysics Data System (ADS)

    Xu, Yingtao; Zhang, Ying

    2009-07-01

    Online credit evaluation is the foundation for the establishment of trust and for the management of risk between buyers and sellers in e-commerce. In this paper, a new credit evaluation method based on the analytic hierarchy process (AHP) and set pair analysis (SPA) is presented to determine the credibility of electronic commerce participants. It solves some of the drawbacks found in classical credit evaluation methods and broadens the scope of current approaches. Both qualitative and quantitative indicators are considered in the proposed method, and an overall credit score is then derived from the optimal perspective. Finally, a case analysis of China Garment Network is provided for illustrative purposes.
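
    The AHP step of such a method typically derives indicator weights from the principal eigenvector of a pairwise-comparison matrix and checks judgment consistency; a minimal sketch with a hypothetical three-criterion matrix:

    ```python
    import numpy as np

    # Saaty's random index, truncated to small matrix sizes
    RANDOM_INDEX = {3: 0.58, 4: 0.90, 5: 1.12}

    def ahp_weights(pairwise):
        """Priority weights and consistency ratio (CR) from an AHP matrix.

        Weights are the normalized principal eigenvector; CR < 0.1 is the
        usual threshold for acceptable consistency.
        """
        n = pairwise.shape[0]
        eigvals, eigvecs = np.linalg.eig(pairwise)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()
        ci = (eigvals[k].real - n) / (n - 1)  # consistency index
        return w, ci / RANDOM_INDEX[n]

    # Hypothetical comparison of three credit indicators
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])
    w, cr = ahp_weights(A)
    print(w.round(3), f"CR = {cr:.3f}")
    ```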

  1. Allocation of Load-Loss Cost Caused by Voltage Sag

    NASA Astrophysics Data System (ADS)

    Gao, X.

    2017-10-01

    This paper focuses on the allocation of load-loss cost caused by voltage sag in the environment of the electricity market. To compensate for the losses incurred by loads due to voltage sags, the load-loss cost is allocated to both sources and power consumers. On the basis of Load Drop Cost (LDC), a quantitative evaluation index of the load-loss cost caused by voltage sag is identified. The load-loss cost to be allocated to power consumers themselves is calculated according to load classification. Based on power-component theory, the quantitative relation between sources and loads is established; from this, a calculation method for the load-loss cost allocated to each source is deduced, and quantitative compensation from individual sources to loads is proposed. A simple five-bus system illustrates the main features of the proposed method.

  2. Evaluation of airway protection: Quantitative timing measures versus penetration/aspiration score.

    PubMed

    Kendall, Katherine A

    2017-10-01

    Quantitative measures of swallowing function may improve the reliability and accuracy of modified barium swallow (MBS) study interpretation. Quantitative study analysis has not been widely instituted, however, secondary to concerns about the time required to make measures and a lack of research demonstrating impact on MBS interpretation. This study compares the accuracy of the penetration/aspiration (PEN/ASP) scale (an observational visual-perceptual assessment tool) to quantitative measures of airway closure timing relative to the arrival of the bolus at the upper esophageal sphincter in identifying a failure of airway protection during deglutition. Retrospective review of clinical swallowing data from a university-based outpatient clinic. Swallowing data from 426 patients were reviewed. Patients with normal PEN/ASP scores were identified, and the results of quantitative airway closure timing measures for three liquid bolus sizes were evaluated. The incidence of significant airway closure delay with and without a normal PEN/ASP score was determined. Inter-rater reliability for the quantitative measures was calculated. In patients with a normal PEN/ASP score, 33% demonstrated a delay in airway closure on at least one swallow during the MBS study. There was no correlation between PEN/ASP score and airway closure delay. Inter-rater reliability for the quantitative measure of airway closure timing was nearly perfect (intraclass correlation coefficient = 0.973). The use of quantitative measures of swallowing function, in conjunction with traditional visual perceptual methods of MBS study interpretation, improves the identification of airway closure delay, and hence, potential aspiration risk, even when no penetration or aspiration is apparent on the MBS study. Level of evidence: 4. Laryngoscope, 127:2314-2318, 2017. © 2017 The American Laryngological, Rhinological and Otological Society, Inc.

  3. Contrast-enhanced magnetic resonance imaging of pulmonary lesions: description of a technique aiming clinical practice.

    PubMed

    Koenigkam-Santos, Marcel; Optazaite, Elzbieta; Sommer, Gregor; Safi, Seyer; Heussel, Claus Peter; Kauczor, Hans-Ulrich; Puderbach, Michael

    2015-01-01

    To propose a technique for the evaluation of pulmonary lesions using contrast-enhanced MRI, to assess morphological patterns of enhancement, and to correlate quantitative analysis with histopathology. Thirty-six patients were prospectively studied. Volumetric-interpolated T1W images were obtained during consecutive breath holds after bolus-triggered contrast injection. Volume coverage of the first three acquisitions was limited (for higher temporal resolution) and the last acquisition was obtained at the 4th minute. Two radiologists individually evaluated the patterns of enhancement. Region-of-interest-based signal intensity (SI)-time curves were created to assess quantitative parameters. Readers agreed moderately to substantially concerning the lesions' enhancement pattern. SI-time curves could be created for all lesions. In comparison to benign lesions, malignant lesions showed higher values of maximum enhancement, early peak, slope and 4th-minute enhancement. An early peak >15% showed 100% sensitivity for detecting malignancy; maximum enhancement >40% showed 100% specificity. The proposed technique is robust, simple to perform and can be applied in a clinical setting. It allows visual evaluation of enhancement pattern/progression together with the creation of SI-time curves and assessment of derived quantitative parameters. Perfusion analysis was highly sensitive for detecting malignancy, in accordance with what is recommended by most recent guidelines on the imaging evaluation of pulmonary lesions. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  4. Living specimen tomography by digital holographic microscopy: morphometry of testate amoeba

    NASA Astrophysics Data System (ADS)

    Charrière, Florian; Pavillon, Nicolas; Colomb, Tristan; Depeursinge, Christian; Heger, Thierry J.; Mitchell, Edward A. D.; Marquet, Pierre; Rappaz, Benjamin

    2006-08-01

    This paper presents an optical diffraction tomography technique based on digital holographic microscopy. Quantitative 2-dimensional phase images are acquired for regularly spaced angular positions of the specimen covering a total angle of π, allowing 3-dimensional quantitative refractive index distributions to be built by an inverse Radon transform. A 20x magnification allows a resolution better than 3 μm in all three dimensions, with an accuracy better than 0.01 for the refractive index measurements. To our knowledge, this technique is applied for the first time to living specimens (testate amoebae, Protista). Morphometric measurements are extracted from the tomographic reconstructions, showing that the commonly used method for testate amoeba biovolume evaluation leads to systematic underestimation by about 50%.
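
    The reconstruction step described here, an inverse Radon transform over projections spanning π, can be sketched with standard tooling; the input file and angular sampling below are placeholders, not the authors' setup.

    ```python
    import numpy as np
    from skimage.transform import iradon

    # Hypothetical sinogram: one detector row of each quantitative phase
    # image per projection, shape (n_detector_pixels, n_angles)
    sinogram = np.load("phase_sinogram.npy")  # placeholder input
    angles_deg = np.linspace(0.0, 180.0, sinogram.shape[1], endpoint=False)

    # Filtered back-projection recovers one slice of the refractive-index-
    # related quantity; repeating this per detector row builds the volume.
    slice_reconstruction = iradon(sinogram, theta=angles_deg, filter_name="ramp")
    ```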

  5. Polarization variations in installed fibers and their influence on quantum key distribution systems.

    PubMed

    Ding, Yu-Yang; Chen, Hua; Wang, Shuang; He, De-Yong; Yin, Zhen-Qiang; Chen, Wei; Zhou, Zheng; Guo, Guang-Can; Han, Zheng-Fu

    2017-10-30

    Polarization variations in installed fibers are complex and volatile, and can severely affect the performance of polarization-sensitive quantum key distribution (QKD) systems. Based on recorded data about polarization variations of different installed fibers, we establish an analytical methodology to quantitatively evaluate the influence of polarization variations on polarization-sensitive QKD systems. Using the increased quantum bit error rate induced by polarization variations as a key criterion, we propose two parameters - polarization drift time and required tracking speed - to characterize polarization variations. For field-buried and aerial fibers of different lengths, we quantitatively evaluate the influence of polarization variations, and also provide requirements and suggestions for the polarization basis alignment modules of QKD systems deployed in different kinds of fibers.

  6. A benchmark for comparison of dental radiography analysis algorithms.

    PubMed

    Wang, Ching-Wei; Huang, Cheng-Ta; Lee, Jia-Hong; Li, Chung-Hsing; Chang, Sheng-Wei; Siao, Ming-Jhih; Lai, Tat-Ming; Ibragimov, Bulat; Vrtovec, Tomaž; Ronneberger, Olaf; Fischer, Philipp; Cootes, Tim F; Lindner, Claudia

    2016-07-01

    Dental radiography plays an important role in clinical diagnosis, treatment and surgery. In recent years, efforts have been made to develop computerized dental X-ray image analysis systems for clinical use. A novel framework for the objective evaluation of automatic dental radiography analysis algorithms has been established under the auspices of the IEEE International Symposium on Biomedical Imaging 2015 Bitewing Radiography Caries Detection Challenge and Cephalometric X-ray Image Analysis Challenge. In this article, we present the datasets, methods and results of the challenge and lay down the principles for future uses of this benchmark. The main contributions of the challenge include the creation of the dental anatomy data repository of bitewing radiographs, the creation of the anatomical abnormality classification data repository of cephalometric radiographs, and the definition of objective quantitative evaluation for comparison and ranking of the algorithms. With this benchmark, seven automatic methods for analysing cephalometric X-ray images and two automatic methods for detecting bitewing radiography caries have been compared, and detailed quantitative evaluation results are presented in this paper. Based on the quantitative evaluation results, we believe automatic dental radiography analysis is still a challenging and unsolved problem. The datasets and the evaluation software will be made available to the research community, further encouraging future developments in this field. (http://www-o.ntust.edu.tw/~cweiwang/ISBI2015/). Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  7. Quantitative evaluation of hidden defects in cast iron components using ultrasound activated lock-in vibrothermography.

    PubMed

    Montanini, R; Freni, F; Rossi, G L

    2012-09-01

    This paper reports one of the first experimental results on the application of ultrasound activated lock-in vibrothermography for the quantitative assessment of buried flaws in complex cast parts. The use of amplitude-modulated ultrasonic heat generation allowed a selective response of defective areas within the part, as the defect itself is turned into a local thermal wave emitter. Quantitative evaluation of hidden damage was accomplished by independently estimating both the area and the depth extension of the buried flaws, while x-ray 3D computed tomography was used as the reference for sizing accuracy assessment. To retrieve the flaw area, a simple yet effective histogram-based phase image segmentation algorithm with automatic pixel classification was developed. A clear correlation was found between the thermal (phase) signature measured by the infrared camera on the target surface and the actual mean cross-section area of the flaw. Due to the very fast cycle time (<30 s/part), the method could potentially be applied for 100% quality control of casting components.
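
    A histogram-based phase-image segmentation of the kind described can be sketched as an Otsu threshold followed by connected-component measurement; this is a stand-in under assumed conventions, not the authors' exact classification algorithm.

    ```python
    from skimage.filters import threshold_otsu
    from skimage.measure import label, regionprops

    def flaw_areas_from_phase(phase_image, pixel_area_mm2):
        """Segment a lock-in phase image and estimate flaw areas.

        Pixels are split into 'defect' and 'background' classes by an Otsu
        threshold on the phase histogram; connected defect regions are then
        measured to estimate each flaw's area in mm^2.
        """
        defect_mask = phase_image > threshold_otsu(phase_image)
        labels = label(defect_mask)
        return defect_mask, [r.area * pixel_area_mm2 for r in regionprops(labels)]
    ```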

  8. A multi-center study benchmarks software tools for label-free proteome quantification

    PubMed Central

    Gillet, Ludovic C; Bernhardt, Oliver M.; MacLean, Brendan; Röst, Hannes L.; Tate, Stephen A.; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I.; Aebersold, Ruedi; Tenzer, Stefan

    2016-01-01

    The consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from SWATH-MS (sequential window acquisition of all theoretical fragment ion spectra), a method that uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test datasets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation windows setups. For consistent evaluation we developed LFQbench, an R-package to calculate metrics of precision and accuracy in label-free quantitative MS, and report the identification performance, robustness and specificity of each software tool. Our reference datasets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics. PMID:27701404

  9. A multicenter study benchmarks software tools for label-free proteome quantification.

    PubMed

    Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan

    2016-11-01

    Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.

  10. Medical privacy protection based on granular computing.

    PubMed

    Wang, Da-Wei; Liau, Churn-Jung; Hsu, Tsan-Sheng

    2004-10-01

    Based on granular computing methodology, we propose two criteria to quantitatively measure privacy invasion. The total cost criterion measures the effort needed for a data recipient to find private information. The average benefit criterion measures the benefit a data recipient obtains when receiving the released data. These two criteria remedy the inadequacy of the deterministic privacy formulation proposed previously (Proceedings of the Asia Pacific Medical Informatics Conference, 2000; Int J Med Inform 2003;71:17-23). Granular computing methodology provides a unified framework for these quantitative measurements and for the previous bin-size and logical approaches. The two new criteria are implemented in a prototype system, Cellsecu 2.0. A preliminary system performance evaluation is conducted and reviewed.

  11. Microelectrode-based technology for the detection of low levels of bacteria

    NASA Technical Reports Server (NTRS)

    Rogers, Tom D.; Hitchens, G. D.; Mishra, S. K.; Pierson, D. L.

    1992-01-01

    A microelectrode-based electrochemical detection method was used for the quantitation of bacteria in water samples. The redox mediator benzoquinone was used to accept electrons from the bacterial metabolic pathway; reduction of the mediator by metabolizing cells creates a measurable flow of electrons. Electrochemical monitoring electrodes detected the reduced mediator as it diffused out of the cells and produced a small electrical current. By using a combination of microelectrodes and monitoring instrumentation, the cumulative current generated by a particular bacterial population could be monitored. Using commercially available components, an electrochemical detection system was assembled and tested to evaluate its potential as an emerging technology for the rapid detection and quantitation of bacteria in water samples.

  12. Evaluative judgments are based on evaluative information: Evidence against meaning change in evaluative context effects.

    PubMed

    Kaplan, M F

    1975-07-01

    Trait adjectives commonly employed in person perception studies have both evaluative and denotative meanings. Evaluative ratings of single traits shift with variations in the context of other traits ascribed to the stimulus person; the extent to which denotative changes underlie these evaluative context effects has been a theoretical controversy. In the first experiment, it was shown that context effects on quantitative ratings of denotation can be largely accounted for by evaluative halo effects. In the second experiment, increasing the denotative relatedness of context traits to the test trait did not increase the effect of the context. Only the evaluative meaning of the context affected evaluation of the rated test trait. These studies suggest that the denotative relationship between a test adjective and its context has little influence on context effects in person perception, and that denotative meaning changes do not mediate context effects. Instead, evaluative judgments appear to be based on evaluative meaning.

  13. Use of Nucleic Acid-Based Tools for Monitoring Biostimulation and Bioaugmentation

    DTIC Science & Technology

    2011-01-01

    … dechlorination is a promising process for biodegradation of chlorinated solvents. The successful field evaluation and implementation of the reductive … These specialized bacteria use the chlorinated ethenes as electron acceptors and gain energy for growth from the reductive dechlorination reactions … protocol addresses the use of MBTs to quantitatively assess the Dhc population at chlorinated ethene sites and aims at providing guidance to evaluate …

  14. Monte Carlo evaluation of accuracy and noise properties of two scatter correction methods for ²⁰¹Tl cardiac SPECT

    NASA Astrophysics Data System (ADS)

    Narita, Y.; Iida, H.; Ebert, S.; Nakamura, T.

    1997-12-01

    Two independent scatter correction techniques, transmission-dependent convolution subtraction (TDCS) and the triple-energy window (TEW) method, were evaluated in terms of quantitative accuracy and noise properties using Monte Carlo simulation (EGS4). Emission projections (primary, scatter and scatter plus primary) were simulated for three numerical phantoms for ²⁰¹Tl. Data were reconstructed with an ordered-subset EM algorithm, including attenuation correction based on noise-free transmission data. The accuracy of the TDCS and TEW scatter corrections was assessed by comparison with the simulated true primary data. The uniform cylindrical phantom simulation demonstrated better quantitative accuracy with TDCS than with TEW (-2.0% vs. 16.7%) and better S/N (6.48 vs. 5.05). A uniform ring myocardial phantom simulation demonstrated better homogeneity in the myocardium with TDCS than with TEW; i.e., anterior-to-posterior wall count ratios were 0.99 and 0.76 with TDCS and TEW, respectively. For the MCAT phantom, TDCS provided good visual and quantitative agreement with the simulated true primary image without noticeably increasing the noise after scatter correction. Overall, TDCS proved to be more accurate and less noisy than TEW, facilitating quantitative assessment of physiological functions with SPECT.
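
    For reference, the TEW scatter estimate evaluated here follows a closed-form window formula applied pixel by pixel; the counts and window widths below are illustrative.

    ```python
    def tew_scatter_estimate(c_lower, c_upper, w_lower, w_upper, w_peak):
        """Triple-energy-window scatter estimate for one projection pixel.

        S = (C_lower / W_lower + C_upper / W_upper) * W_peak / 2,
        where C are counts in the narrow windows flanking the photopeak
        and W are the window widths in keV.
        """
        return (c_lower / w_lower + c_upper / w_upper) * w_peak / 2.0

    # Example: 3 keV sub-windows around a 14 keV photopeak window
    primary = 1500 - tew_scatter_estimate(90, 30, 3.0, 3.0, 14.0)  # -> 1220
    ```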

  15. A method for evaluating the murine pulmonary vasculature using micro-computed tomography.

    PubMed

    Phillips, Michael R; Moore, Scott M; Shah, Mansi; Lee, Clara; Lee, Yueh Z; Faber, James E; McLean, Sean E

    2017-01-01

    Significant mortality and morbidity are associated with alterations in the pulmonary vasculature. While techniques have been described for quantitative morphometry of whole-lung arterial trees in larger animals, no methods have been described in mice. We report a method for the quantitative assessment of murine pulmonary arterial vasculature using high-resolution computed tomography scanning. Mice were harvested at 2 weeks, 4 weeks, and 3 months of age. The pulmonary artery vascular tree was pressure perfused to maximal dilation with a radio-opaque casting material with viscosity and pressure set to prevent capillary transit and venous filling. The lungs were fixed and scanned on a specimen computed tomography scanner at 8-μm resolution, and the vessels were segmented. Vessels were grouped into categories based on lumen diameter and branch generation. Robust high-resolution segmentation was achieved, permitting detailed quantitation of pulmonary vascular morphometrics. As expected, postnatal lung development was associated with progressive increase in small-vessel number and arterial branching complexity. These methods for quantitative analysis of the pulmonary vasculature in postnatal and adult mice provide a useful tool for the evaluation of mouse models of disease that affect the pulmonary vasculature. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Quantification of EEG reactivity in comatose patients

    PubMed Central

    Hermans, Mathilde C.; Westover, M. Brandon; van Putten, Michel J.A.M.; Hirsch, Lawrence J.; Gaspard, Nicolas

    2016-01-01

    Objective EEG reactivity is an important predictor of outcome in comatose patients. However, visual analysis of reactivity is prone to subjectivity and may benefit from quantitative approaches. Methods In EEG segments recorded during reactivity testing in 59 comatose patients, 13 quantitative EEG parameters were used to compare the spectral characteristics of 1-minute segments before and after the onset of stimulation (spectral temporal symmetry). Reactivity was quantified with probability values estimated using combinations of these parameters. The accuracy of probability values as a reactivity classifier was evaluated against the consensus assessment of three expert clinical electroencephalographers using visual analysis. Results The binary classifier assessing spectral temporal symmetry in four frequency bands (delta, theta, alpha and beta) showed best accuracy (Median AUC: 0.95) and was accompanied by substantial agreement with the individual opinion of experts (Gwet’s AC1: 65–70%), at least as good as inter-expert agreement (AC1: 55%). Probability values also reflected the degree of reactivity, as measured by the inter-experts’ agreement regarding reactivity for each individual case. Conclusion Automated quantitative EEG approaches based on probabilistic description of spectral temporal symmetry reliably quantify EEG reactivity. Significance Quantitative EEG may be useful for evaluating reactivity in comatose patients, offering increased objectivity. PMID:26183757
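
    The spectral temporal symmetry comparison at the heart of this approach can be sketched as band-power log-ratios between the pre- and post-stimulation segments; the band edges and Welch settings below are conventional assumptions, not the authors' full probabilistic model.

    ```python
    import numpy as np
    from scipy.signal import welch

    BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

    def band_powers(segment, fs):
        """Band power (arbitrary units) in the four classical EEG bands."""
        f, pxx = welch(segment, fs=fs, nperseg=int(4 * fs))
        return {name: pxx[(f >= lo) & (f < hi)].sum()
                for name, (lo, hi) in BANDS.items()}

    def spectral_symmetry(pre, post, fs):
        """Log-ratio of band power after vs. before stimulation onset.

        Values near zero indicate spectral temporal symmetry, i.e. little
        or no reactivity in that band.
        """
        p_pre, p_post = band_powers(pre, fs), band_powers(post, fs)
        return {b: float(np.log(p_post[b] / p_pre[b])) for b in BANDS}
    ```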

  17. Dynamic phase differences based on quantitative phase imaging for the objective evaluation of cell behavior.

    PubMed

    Krizova, Aneta; Collakova, Jana; Dostal, Zbynek; Kvasnica, Lukas; Uhlirova, Hana; Zikmund, Tomas; Vesely, Pavel; Chmelik, Radim

    2015-01-01

    Quantitative phase imaging (QPI) brought innovation to noninvasive observation of live cell dynamics seen as cell behavior. Unlike the Zernike phase contrast or differential interference contrast, QPI provides quantitative information about cell dry mass distribution. We used such data for objective evaluation of live cell behavioral dynamics by the advanced method of dynamic phase differences (DPDs). The DPDs method is considered a rational instrument offered by QPI. By subtracting the antecedent from the subsequent image in a time-lapse series, only the changes in mass distribution in the cell are detected. The result is either visualized as a two-dimensional color-coded projection of these two states of the cell or as a time dependence of changes quantified in picograms. Then in a series of time-lapse recordings, the chain of cell mass distribution changes that would otherwise escape attention is revealed. Consequently, new salient features of live cell behavior should emerge. Construction of the DPDs method and results exhibiting the approach are presented. The advantage of the DPDs application is demonstrated on cells exposed to an osmotic challenge. For time-lapse acquisition of quantitative phase images, the recently developed coherence-controlled holographic microscope was employed.
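
    The DPD computation itself is a frame-to-frame subtraction, with phase converted to dry mass through the specific refraction increment; the sketch below assumes the commonly used value α ≈ 0.18 µm³/pg and illustrative units.

    ```python
    import numpy as np

    ALPHA = 0.18  # specific refraction increment, µm^3/pg (typical assumed value)

    def dynamic_phase_difference(phase_prev, phase_next, wavelength_um,
                                 pixel_area_um2):
        """Dynamic phase difference between consecutive QPI frames.

        Phase (rad) maps to dry-mass surface density via
        sigma = phase * lambda / (2 * pi * alpha) in pg/µm^2, so the
        frame-to-frame difference quantifies mass redistribution in pg.
        """
        dpd = phase_next - phase_prev  # rad, per pixel
        sigma_change = dpd * wavelength_um / (2 * np.pi * ALPHA)  # pg/µm^2
        mass_change_pg = sigma_change * pixel_area_um2  # pg per pixel
        return dpd, mass_change_pg
    ```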

  18. Dynamic phase differences based on quantitative phase imaging for the objective evaluation of cell behavior

    NASA Astrophysics Data System (ADS)

    Krizova, Aneta; Collakova, Jana; Dostal, Zbynek; Kvasnica, Lukas; Uhlirova, Hana; Zikmund, Tomas; Vesely, Pavel; Chmelik, Radim

    2015-11-01

    Quantitative phase imaging (QPI) brought innovation to noninvasive observation of live cell dynamics seen as cell behavior. Unlike the Zernike phase contrast or differential interference contrast, QPI provides quantitative information about cell dry mass distribution. We used such data for objective evaluation of live cell behavioral dynamics by the advanced method of dynamic phase differences (DPDs). The DPDs method is considered a rational instrument offered by QPI. By subtracting the antecedent from the subsequent image in a time-lapse series, only the changes in mass distribution in the cell are detected. The result is either visualized as a two-dimensional color-coded projection of these two states of the cell or as a time dependence of changes quantified in picograms. Then in a series of time-lapse recordings, the chain of cell mass distribution changes that would otherwise escape attention is revealed. Consequently, new salient features of live cell behavior should emerge. Construction of the DPDs method and results exhibiting the approach are presented. Advantage of the DPDs application is demonstrated on cells exposed to an osmotic challenge. For time-lapse acquisition of quantitative phase images, the recently developed coherence-controlled holographic microscope was employed.

  19. Evaluation of viral removal by nanofiltration using real-time quantitative polymerase chain reaction.

    PubMed

    Zhao, Xiaowen; Bailey, Mark R; Emery, Warren R; Lambooy, Peter K; Chen, Dayue

    2007-06-01

    Nanofiltration is commonly introduced into purification processes of biologics produced in mammalian cells to serve as a designated step for the removal of potential exogenous viral contaminants and endogenous retrovirus-like particles. The LRV (log reduction value) achieved by nanofiltration is often determined by cell-based infectivity assay, which is time-consuming and labour-intensive. We have explored the possibility of employing QPCR (quantitative PCR) to evaluate the LRV achieved by nanofiltration in scaled-down studies using two model viruses, namely xenotropic murine leukemia virus and murine minute virus. We report here the successful development of a QPCR-based method suitable for the quantification of virus removal by nanofiltration. The method includes a nuclease treatment step to remove free viral nucleic acids, while viral genome associated with intact virus particles is shielded from the nuclease. In addition, HIV Armored RNA was included as an internal control to ensure the accuracy and reliability of the method. The QPCR-based method described here provides several advantages over traditional cell-based infectivity assays, such as better sensitivity, faster turnaround time, reduced cost and higher throughput.
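
    An LRV computed from qPCR readouts reduces to standard-curve interpolation and a log-ratio; the slope and intercept below are illustrative standard-curve values, not the assay's actual calibration.

    ```python
    import math

    def copies_from_cq(cq, slope, intercept):
        """Copies from a qPCR standard curve: Cq = slope*log10(copies) + intercept."""
        return 10 ** ((cq - intercept) / slope)

    def log_reduction_value(cq_load, cq_filtrate, slope=-3.32, intercept=38.0):
        """LRV = log10(virus in load / virus in filtrate), per unit volume.

        A slope of -3.32 corresponds to 100% amplification efficiency;
        both defaults are illustrative.
        """
        load = copies_from_cq(cq_load, slope, intercept)
        filtrate = copies_from_cq(cq_filtrate, slope, intercept)
        return math.log10(load / filtrate)

    print(f"LRV = {log_reduction_value(18.5, 33.4):.1f}")  # ~4.5 log10 removal
    ```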

  20. Estimation of 3D reconstruction errors in a stereo-vision system

    NASA Astrophysics Data System (ADS)

    Belhaoua, A.; Kohler, S.; Hirsch, E.

    2009-06-01

    The paper presents an approach for error estimation for the various steps of an automated 3D vision-based reconstruction procedure of manufactured workpieces. The process is based on a priori planning of the task and is built around a cognitive intelligent sensory system using so-called Situation Graph Trees (SGT) as a planning tool. Such an automated quality control system requires the coordination of a set of complex processes performing sequentially data acquisition, its quantitative evaluation and the comparison with a reference model (e.g., a CAD object model) in order to evaluate the object quantitatively. To ensure efficient quality control, the aim is to be able to state whether reconstruction results fulfill tolerance rules. Thus, the goal is to evaluate the error independently for each step of the stereo-vision-based 3D reconstruction (e.g., calibration, contour segmentation, matching and reconstruction) and then to estimate the error for the whole system. In this contribution, we particularly analyze the segmentation error due to localization errors for extracted edge points supposed to belong to the lines and curves composing the outline of the workpiece under evaluation. The fitting parameters describing these geometric features are used as a quality measure to determine confidence intervals and finally to estimate the segmentation errors. These errors are then propagated through the whole reconstruction procedure, enabling evaluation of their effect on the final 3D reconstruction result, specifically on position uncertainties. Lastly, analysis of these error estimates enables evaluation of the quality of the 3D reconstruction, as illustrated by the experimental results shown.

  1. EVALUATING DISCONTINUITIES IN COMPLEX SYSTEMS: TOWARD QUANTITATIVE MEASURE OF RESILIENCE

    EPA Science Inventory

    The textural discontinuity hypothesis (TDH) is based on the observation that animal body mass distributions exhibit discontinuities that may reflect the texture of the landscape available for exploitation. This idea has been extended to other complex systems, hinting that the ide...

  2. [Advantages and disadvantages of incorporating qualitative methodology in the evaluation of health services. A practical case: evaluation of a high-resolution clinic].

    PubMed

    Alvarez Del Arco, D; Rodríguez Rieiro, C; Sanchidrián De Blás, C; Alejos, B; Plá Mestre, R

    2012-01-01

    We examined the usefulness of incorporating a qualitative phase in the evaluation of the quality of care in a high-resolution medical service carried out with quantitative methods. Quantitative research was performed using a structured questionnaire, with interviewees selected by systematic randomized sampling (n=320). In addition, qualitative research was carried out through semi-structured interviews with patients selected by convenience criteria (n=11), observations in the care assistance circuit, and a group interview with health professionals working in the service. A multidisciplinary research team conducted an individual analysis of the information collected in both the quantitative and qualitative phases. Subsequently, three meetings based on group brainstorming techniques were held to identify the contributions of each of the methodologies employed, using affinity graphs to analyse the different results obtained in both phases and to evaluate possible bias arising from the use of qualitative methods. Qualitative research allowed closer examination of specific aspects of the health care service captured in the quantitative phase, harmonizing the results obtained in the previous phase, giving in-depth data on the reasons for patient dissatisfaction with specific aspects, such as waiting times and available infrastructure, and identifying emerging issues of the service which had not been previously assessed. Overall, the qualitative phase enriched the results of the research. It is appropriate and advisable to incorporate this methodological approach in research aimed at evaluating the quality of the service in specific health care settings, since it provides first-hand information in the voice of the customer. Copyright © 2011 SECA. Published by Elsevier Espana. All rights reserved.

  3. Accuracy and precision of pseudo-continuous arterial spin labeling perfusion during baseline and hypercapnia: a head-to-head comparison with ¹⁵O H₂O positron emission tomography.

    PubMed

    Heijtel, D F R; Mutsaerts, H J M M; Bakker, E; Schober, P; Stevens, M F; Petersen, E T; van Berckel, B N M; Majoie, C B L M; Booij, J; van Osch, M J P; Vanbavel, E; Boellaard, R; Lammertsma, A A; Nederveen, A J

    2014-05-15

    Measurements of the cerebral blood flow (CBF) and cerebrovascular reactivity (CVR) provide useful information about cerebrovascular condition and regional metabolism. Pseudo-continuous arterial spin labeling (pCASL) is a promising non-invasive MRI technique to quantitatively measure the CBF, whereas additional hypercapnic pCASL measurements are currently showing great promise to quantitatively assess the CVR. However, the introduction of pCASL at a larger scale awaits further evaluation of the exact accuracy and precision compared to the gold standard. ¹⁵O H₂O positron emission tomography (PET) is currently regarded as the most accurate and precise method to quantitatively measure both CBF and CVR, though it is one of the more invasive methods as well. In this study we therefore assessed the accuracy and precision of quantitative pCASL-based CBF and CVR measurements by performing a head-to-head comparison with ¹⁵O H₂O PET, based on quantitative CBF measurements during baseline and hypercapnia. We demonstrate that pCASL CBF imaging is accurate during both baseline and hypercapnia with respect to ¹⁵O H₂O PET, with comparable precision. These results pave the way for quantitative usage of pCASL MRI in both clinical and research settings. Copyright © 2014 Elsevier Inc. All rights reserved.

  4. Measuring Down: Evaluating Digital Storytelling as a Process for Narrative Health Promotion.

    PubMed

    Gubrium, Aline C; Fiddian-Green, Alice; Lowe, Sarah; DiFulvio, Gloria; Del Toro-Mejías, Lizbeth

    2016-05-15

    Digital storytelling (DST) engages participants in a group-based process to create and share narrative accounts of life events. We present key evaluation findings of a 2-year, mixed-methods study that focused on the effects of participating in the DST process on young Puerto Rican Latinas' self-esteem, social support, empowerment, and sexual attitudes and behaviors. Quantitative results did not show significant changes in the expected outcomes. However, in our qualitative findings we identified several ways in which the DST process had positive, health-promoting effects. We argue for the importance of "measuring down" to reflect the locally grounded, felt experiences of participants who engage in the process, as current quantitative scales do not "measure up" to accurately capture these effects. We end by suggesting the need to develop mixed-methods, culturally relevant, and sensitive evaluation tools that prioritize process effects as they inform intervention and health promotion. © The Author(s) 2016.

  5. A novel evaluation method for building construction project based on integrated information entropy with reliability theory.

    PubMed

    Bai, Xiao-ping; Zhang, Xi-wei

    2013-01-01

    Selecting construction schemes for a building engineering project is a complex multiobjective optimization decision process, in which many indexes must be weighed to find the optimum scheme. Addressing this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes, uses a quantitative method for the cost index, uses integrated qualitative and quantitative methodologies for the progress, quality, and safety indexes, and integrates engineering economics, reliability theory, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, this paper also presents the detailed computing process and steps, including selecting all order indexes, establishing the index matrix, computing score values of all order indexes, computing the synthesis score, sorting all candidate schemes, and making the analysis and decision. The presented method can offer valuable references for risk computation in building construction projects.
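
    The abstract outlines an index matrix, entropy-based scoring, and scheme ranking. Below is a minimal sketch of the standard information-entropy weighting method commonly used for such multi-index evaluations; it illustrates the general technique under the assumption of benefit-type indexes, not the paper's exact procedure.

    import numpy as np

    def entropy_weight_scores(X):
        """Rank alternatives (rows) over benefit-type indexes (columns).

        X: (n_schemes, n_indexes) matrix of raw values (larger = better).
        Returns (index weights, synthesis scores).
        """
        n = X.shape[0]
        P = X / X.sum(axis=0)                       # column-normalized proportions
        with np.errstate(divide="ignore"):
            logs = np.where(P > 0, np.log(P), 0.0)  # convention: 0 * log 0 = 0
        e = -(P * logs).sum(axis=0) / np.log(n)     # entropy of each index
        d = 1.0 - e                                 # degree of divergence
        w = d / d.sum()                             # entropy weights
        scores = (X / X.max(axis=0)) @ w            # weighted synthesis score
        return w, scores

    # Example: four schemes scored on cost, progress, quality and safety indexes
    # (already converted so that larger is better).
    X = np.array([[0.8, 0.7, 0.9, 0.6],
                  [0.6, 0.9, 0.7, 0.8],
                  [0.9, 0.6, 0.8, 0.7],
                  [0.7, 0.8, 0.6, 0.9]])
    weights, scores = entropy_weight_scores(X)
    print(np.argsort(scores)[::-1])  # schemes sorted best-first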

  6. Evaluation of a 3D local multiresolution algorithm for the correction of partial volume effects in positron emission tomography.

    PubMed

    Le Pogam, Adrien; Hatt, Mathieu; Descourt, Patrice; Boussion, Nicolas; Tsoumpas, Charalampos; Turkheimer, Federico E; Prunier-Aesch, Caroline; Baulieu, Jean-Louis; Guilloteau, Denis; Visvikis, Dimitris

    2011-09-01

    Partial volume effects (PVEs) are consequences of the limited spatial resolution in emission tomography leading to underestimation of uptake in tissues of size similar to the point spread function (PSF) of the scanner as well as activity spillover between adjacent structures. Among PVE correction methodologies, a voxel-wise mutual multiresolution analysis (MMA) was recently introduced. MMA is based on the extraction and transformation of high resolution details from an anatomical image (MR/CT) and their subsequent incorporation into a low-resolution PET image using wavelet decompositions. Although this method allows creating PVE corrected images, it is based on a 2D global correlation model, which may introduce artifacts in regions where no significant correlation exists between anatomical and functional details. A new model was designed to overcome these two issues (2D only and global correlation) using a 3D wavelet decomposition process combined with a local analysis. The algorithm was evaluated on synthetic, simulated and patient images, and its performance was compared to the original approach as well as the geometric transfer matrix (GTM) method. Quantitative performance was similar to the 2D global model and GTM in correlated cases. In cases where mismatches between anatomical and functional information were present, the new model outperformed the 2D global approach, avoiding artifacts and significantly improving quality of the corrected images and their quantitative accuracy. A new 3D local model was proposed for a voxel-wise PVE correction based on the original mutual multiresolution analysis approach. Its evaluation demonstrated an improved and more robust qualitative and quantitative accuracy compared to the original MMA methodology, particularly in the absence of full correlation between anatomical and functional information.

  7. Evaluation of a 3D local multiresolution algorithm for the correction of partial volume effects in positron emission tomography

    PubMed Central

    Le Pogam, Adrien; Hatt, Mathieu; Descourt, Patrice; Boussion, Nicolas; Tsoumpas, Charalampos; Turkheimer, Federico E.; Prunier-Aesch, Caroline; Baulieu, Jean-Louis; Guilloteau, Denis; Visvikis, Dimitris

    2011-01-01

    Purpose Partial volume effects (PVE) are consequences of the limited spatial resolution in emission tomography leading to under-estimation of uptake in tissues of size similar to the point spread function (PSF) of the scanner as well as activity spillover between adjacent structures. Among PVE correction methodologies, a voxel-wise mutual multi-resolution analysis (MMA) was recently introduced. MMA is based on the extraction and transformation of high resolution details from an anatomical image (MR/CT) and their subsequent incorporation into a low resolution PET image using wavelet decompositions. Although this method allows creating PVE corrected images, it is based on a 2D global correlation model which may introduce artefacts in regions where no significant correlation exists between anatomical and functional details. Methods A new model was designed to overcome these two issues (2D only and global correlation) using a 3D wavelet decomposition process combined with a local analysis. The algorithm was evaluated on synthetic, simulated and patient images, and its performance was compared to the original approach as well as the geometric transfer matrix (GTM) method. Results Quantitative performance was similar to the 2D global model and GTM in correlated cases. In cases where mismatches between anatomical and functional information were present the new model outperformed the 2D global approach, avoiding artefacts and significantly improving quality of the corrected images and their quantitative accuracy. Conclusions A new 3D local model was proposed for a voxel-wise PVE correction based on the original mutual multi-resolution analysis approach. Its evaluation demonstrated an improved and more robust qualitative and quantitative accuracy compared to the original MMA methodology, particularly in the absence of full correlation between anatomical and functional information. PMID:21978037

  8. The Effects of Activity-Based Elementary Science Programs on Student Outcomes and Classroom Practices: A Meta Analysis of Controlled Studies.

    ERIC Educational Resources Information Center

    Bredderman, Ted

    A quantitative synthesis of research findings on the effects of three major activity-based elementary science programs developed with National Science Foundation support was conducted. Controlled evaluation studies of the Elementary Science Study (ESS), Science-A Process Approach (SAPA), or The Science Curriculum Improvement Study (SCIS) were used…

  9. Integrated and Contextual Basic Science Instruction in Preclinical Education: Problem-Based Learning Experience Enriched with Brain/Mind Learning Principles

    ERIC Educational Resources Information Center

    Gülpinar, Mehmet Ali; Isoglu-Alkaç, Ümmühan; Yegen, Berrak Çaglayan

    2015-01-01

    Recently, integrated and contextual learning models such as problem-based learning (PBL) and brain/mind learning (BML) have become prominent. The present study aimed to develop and evaluate a PBL program enriched with BML principles. In this study, participants were 295 first-year medical students. The study used both quantitative and qualitative…

  10. A Meta-Analysis of School-Based Interventions Aimed to Prevent or Reduce Violence in Teen Dating Relationships

    ERIC Educational Resources Information Center

    De La Rue, Lisa; Polanin, Joshua R.; Espelage, Dorothy L.; Pigott, Terri D.

    2017-01-01

    The incidence of violence in dating relationships has a significant impact on young people, including decreased mental and physical health. This review is the first to provide a quantitative synthesis of empirical evaluations of school-based programs implemented in middle and high schools that sought to prevent or reduce incidents of dating…

  11. Quantitative assessment of building fire risk to life safety.

    PubMed

    Guanquan, Chu; Jinhua, Sun

    2008-06-01

    This article presents a quantitative risk assessment framework for evaluating fire risk to life safety. Fire risk is divided into two parts: the probability and the corresponding consequence of every fire scenario. The time-dependent event tree technique is used to analyze probable fire scenarios based on the effect of fire protection systems on fire spread and smoke movement. To obtain the variation of occurrence probability with time, a Markov chain is combined with the time-dependent event tree for stochastic analysis of the occurrence probability of fire scenarios. To obtain the consequences of every fire scenario, several uncertainties are considered in the risk analysis process. When calculating the onset time to untenable conditions, a range of fires is designed based on different fire growth rates, after which the uncertainty of the onset time to untenable conditions can be characterized by a probability distribution. When calculating occupant evacuation time, occupant pre-movement time is treated as a probability distribution. The consequences of a fire scenario can then be evaluated from the probability distributions of evacuation time and onset time of untenable conditions, and fire risk to life safety can be evaluated based on the occurrence probability and consequences of every fire scenario. To illustrate the assessment method in detail, a commercial building is presented as a case study, and the assessment result is compared with fire statistics.
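
    A minimal Monte Carlo sketch of the consequence step described above: the onset time to untenable conditions and the occupant evacuation time (pre-movement plus movement) are both treated as random variables, and the consequence of a scenario is estimated as the probability that evacuation is not complete before conditions become untenable. The distributions and parameters are illustrative assumptions, not the paper's inputs.

    import numpy as np

    rng = np.random.default_rng(0)

    def p_failed_evacuation(n=100_000):
        onset = rng.lognormal(mean=np.log(300), sigma=0.3, size=n)    # seconds
        premove = rng.lognormal(mean=np.log(60), sigma=0.5, size=n)   # seconds
        movement = rng.normal(loc=120, scale=20, size=n)              # seconds
        evacuation = premove + movement
        return float(np.mean(evacuation > onset))

    # Scenario risk = occurrence probability (from the event tree / Markov
    # analysis) times the consequence estimated above.
    p_scenario = 1e-3
    print(p_scenario * p_failed_evacuation())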

  12. Identify Structural Flaw Location and Type with an Inverse Algorithm of Resonance Inspection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Wei; Lai, Canhai; Sun, Xin

    To evaluate the fitness-for-service of a structural component and to quantify its remaining useful life, aging and service-induced structural flaws must be quantitatively determined in service or during scheduled maintenance shutdowns. Resonance inspection (RI), a non-destructive evaluation (NDE) technique, distinguishes the anomalous parts from the good parts based on changes in the natural frequency spectra. Known for its numerous advantages, i.e., low inspection cost, high testing speed, and broad applicability to complex structures, RI has been widely used in the automobile industry for quality inspection. However, compared to other contemporary direct visualization-based NDE methods, a more widespread application of RI faces a fundamental challenge because such technology is unable to quantify the flaw details, e.g. location, dimensions, and types. In this study, the applicability of a maximum correlation-based inverse RI algorithm developed by the authors is further studied for various flaw cases. It is demonstrated that a variety of common structural flaws, i.e. stiffness degradation, voids, and cracks, can be accurately retrieved by this algorithm even when multiple different types of flaws coexist. The quantitative relations between the damage identification results and the flaw characteristics are also developed to assist the evaluation of the actual state of health of the engineering structures.

  13. Improved quantitative analysis of spectra using a new method of obtaining derivative spectra based on a singular perturbation technique.

    PubMed

    Li, Zhigang; Wang, Qiaoyun; Lv, Jiangtao; Ma, Zhenhe; Yang, Linjuan

    2015-06-01

    Spectroscopy is often applied when a rapid quantitative analysis is required, but one challenge is the translation of raw spectra into a final analysis. Derivative spectra are often used as a preliminary preprocessing step to resolve overlapping signals, enhance signal properties, and suppress unwanted spectral features that arise due to non-ideal instrument and sample properties. In this study, to improve the quantitative analysis of near-infrared spectra, the derivatives of noisy raw spectral data must be estimated with high accuracy. A new spectral estimator based on the singular perturbation technique, called the singular perturbation spectra estimator (SPSE), is presented, and a stability analysis of the estimator is given. Theoretical analysis and simulation results confirm that the derivatives can be estimated with high accuracy using this estimator. Furthermore, the effectiveness of the estimator for processing noisy infrared spectra is evaluated through the analysis of beer spectra. The derivative spectra of the beer and marzipan data sets are used to build calibration models using partial least squares (PLS) modeling. The results show that PLS based on the new estimator can achieve better performance compared with the Savitzky-Golay algorithm and can serve as an alternative choice for quantitative analytical applications.
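
    The SPSE estimator itself is not publicly available, so as a point of reference, here is the Savitzky-Golay derivative baseline the abstract compares against, applied to a noisy synthetic spectrum. The window length and polynomial order are illustrative choices.

    import numpy as np
    from scipy.signal import savgol_filter

    x = np.linspace(0, 10, 500)
    rng = np.random.default_rng(1)
    spectrum = np.exp(-(x - 5) ** 2) + 0.01 * rng.normal(size=x.size)

    # First derivative with respect to x via a local smoothing polynomial fit.
    d1 = savgol_filter(spectrum, window_length=21, polyorder=3,
                       deriv=1, delta=x[1] - x[0])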

  14. [Quantitative assessment of urban ecosystem services flow based on entropy theory: A case study of Beijing, China].

    PubMed

    Li, Jing Xin; Yang, Li; Yang, Lei; Zhang, Chao; Huo, Zhao Min; Chen, Min Hao; Luan, Xiao Feng

    2018-03-01

    Quantitative evaluation of ecosystem services is a primary premise for rational resource exploitation and sustainable development, and examining ecosystem services flow provides a scientific method for such quantification. We built an assessment indicator system based on land cover/land use under the framework of four types of ecosystem services, and reclassified the types of ecosystem services flow. Using entropy theory, the degree of disorder and the developing trend of the indicators and of the urban ecosystem were quantitatively assessed. Beijing was chosen as the study area, and twenty-four indicators were selected for evaluation. The results showed that the entropy value of the Beijing urban ecosystem during 2004 to 2015 was 0.794 and the entropy flow was -0.024, suggesting a high degree of disorder, with the system verging on an unhealthy state. The system reached maximum entropy values three times, while the mean annual variation of the system entropy value increased gradually over three periods, indicating that human activities had negative effects on the urban ecosystem. Entropy flow reached its minimum value in 2007, implying that environmental quality was best in 2007. The determination coefficient for the fitting function of the total permanent population of Beijing and the urban ecosystem entropy flow was 0.921, indicating that urban ecosystem health was highly correlated with total permanent population.

  15. Development of cell-based quantitative evaluation method for cell cycle-arrest type cancer drugs for apoptosis by high precision surface plasmon resonance sensor

    NASA Astrophysics Data System (ADS)

    Ona, Toshihiro; Nishijima, Hiroshi; Kosaihira, Atsushi; Shibata, Junko

    2008-04-01

    In vitro rapid and quantitative cell-based assays are in demand to verify the predicted efficacy of cancer drugs, since a cancer patient may have unconventional aspects of tumor development. Here, we show a rapid, label-free quantitative verification method and instrumentation of apoptosis for cell cycle-arrest type cancer drugs (Roscovitine and D-allose) by reaction analysis of living liver cancer cells cultured on a sensor chip with a newly developed high-precision (50 ndeg s⁻¹ average fluctuation) surface plasmon resonance (SPR) sensor. The time-course cell reaction, measured as the SPR angle change rate over 10 min after 30 min of cell culture with a drug, was significantly related to cell viability. By simultaneous detection of the differential SPR angle change and fluorescence from specific probes using the new instrument, the SPR angle was related to the nano-order decrease in inner mitochondrial membrane potential. The results obtained are universally valid for cell cycle-arrest type cancer drugs, which mediate apoptosis through different cell-signaling pathways, in the liver cancer cell line Hep G2 (P<0.001). This system points toward the evaluation of personalized therapeutic potentials of drugs in clinical use, employing cancer cells from patients.

  16. Digital evaluation of absolute marginal discrepancy: A comparison of ceramic crowns fabricated with conventional and digital techniques.

    PubMed

    Liang, Shanshan; Yuan, Fusong; Luo, Xu; Yu, Zhuoren; Tang, Zhihui

    2018-04-05

    Marginal discrepancy is key to evaluating the accuracy of fixed dental prostheses, and an improved method of evaluating marginal discrepancy is needed. The purpose of this in vitro study was to evaluate the absolute marginal discrepancy of ceramic crowns fabricated using conventional and digital methods with a digital method for the quantitative evaluation of absolute marginal discrepancy. The novel method was based on 3-dimensional scanning, iterative closest point registration techniques, and reverse engineering theory. Six standard tooth preparations for the right maxillary central incisor, right maxillary second premolar, right maxillary second molar, left mandibular lateral incisor, left mandibular first premolar, and left mandibular first molar were selected. Ten conventional ceramic crowns and 10 CEREC crowns were fabricated for each tooth preparation. A dental cast scanner was used to obtain 3-dimensional data of the preparations and ceramic crowns, and the data were registered with the "virtual seating" iterative closest point technique. Reverse engineering software used edge sharpening and other functional modules to extract the margins of the preparations and crowns. Finally, quantitative evaluation of the absolute marginal discrepancy of the ceramic crowns was obtained from the 2-dimensional cross-sectional straight-line distance between points on the margin of the ceramic crowns and the standard preparations, based on the circumferential function module along the long axis. The absolute marginal discrepancy of the ceramic crowns fabricated using conventional methods was 115 ±15.2 μm, and that of the crowns fabricated using the digital technique was 110 ±14.3 μm. ANOVA showed no statistical difference between the 2 methods or among ceramic crowns for different teeth (P>.05). A digital quantitative evaluation method for the absolute marginal discrepancy of ceramic crowns was established, and the evaluations determined that the absolute marginal discrepancies were within a clinically acceptable range. This method is acceptable for the digital evaluation of the accuracy of complete crowns. Copyright © 2017 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
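
    A hedged sketch of the distance step in the digital evaluation described above: given extracted margin points of the crown and the preparation, already registered by ICP, the absolute marginal discrepancy can be summarized as the mean nearest-point distance between the two point sets. The use of scipy's KD-tree and the synthetic points are illustrative assumptions; the paper used dedicated reverse-engineering software.

    import numpy as np
    from scipy.spatial import cKDTree

    def mean_marginal_discrepancy(crown_margin_pts, prep_margin_pts):
        """Mean nearest-neighbour distance between two registered 3D point sets."""
        tree = cKDTree(prep_margin_pts)
        distances, _ = tree.query(crown_margin_pts)
        return float(distances.mean())

    rng = np.random.default_rng(3)
    prep = rng.random((200, 3)) * 10.0                      # margin points (mm)
    crown = prep + rng.normal(scale=0.11, size=prep.shape)  # ~110 um offset
    print(mean_marginal_discrepancy(crown, prep) * 1000, "um")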

  17. Assessing the detail needed to capture rainfall-runoff dynamics with physics-based hydrologic response simulation

    USGS Publications Warehouse

    Mirus, B.B.; Ebel, B.A.; Heppner, C.S.; Loague, K.

    2011-01-01

    Concept development simulation with distributed, physics-based models provides a quantitative approach for investigating runoff generation processes across environmental conditions. Disparities within data sets employed to design and parameterize boundary value problems used in heuristic simulation inevitably introduce various levels of bias. The objective was to evaluate the impact of boundary value problem complexity on process representation for different runoff generation mechanisms. The comprehensive physics-based hydrologic response model InHM has been employed to generate base case simulations for four well-characterized catchments. The C3 and CB catchments are located within steep, forested environments dominated by subsurface stormflow; the TW and R5 catchments are located in gently sloping rangeland environments dominated by Dunne and Horton overland flows. Observational details are well captured within all four of the base case simulations, but the characterization of soil depth, permeability, rainfall intensity, and evapotranspiration differs for each. These differences are investigated through the conversion of each base case into a reduced case scenario, all sharing the same level of complexity. Evaluation of how individual boundary value problem characteristics impact simulated runoff generation processes is facilitated by quantitative analysis of integrated and distributed responses at high spatial and temporal resolution. Generally, the base case reduction causes moderate changes in discharge and runoff patterns, with the dominant process remaining unchanged. Moderate differences between the base and reduced cases highlight the importance of detailed field observations for parameterizing and evaluating physics-based models. Overall, similarities between the base and reduced cases indicate that the simpler boundary value problems may be useful for concept development simulation to investigate fundamental controls on the spectrum of runoff generation mechanisms. Copyright 2011 by the American Geophysical Union.

  18. Evaluating Academic Scientists Collaborating in Team-Based Research: A Proposed Framework

    PubMed Central

    Mazumdar, Madhu; Messinger, Shari; Finkelstein, Dianne M.; Goldberg, Judith D.; Lindsell, Christopher J.; Morton, Sally C.; Pollock, Brad H.; Rahbar, Mohammad H.; Welty, Leah J.; Parker, Robert A.

    2015-01-01

    Criteria for evaluating faculty are traditionally based on a triad of scholarship, teaching, and service. Research scholarship is often measured by first or senior authorship on peer-reviewed scientific publications and being principal investigator on extramural grants. Yet scientific innovation increasingly requires collective rather than individual creativity, which traditional measures of achievement were not designed to capture and, thus, devalue. The authors propose a simple, flexible framework for evaluating team scientists that includes both quantitative and qualitative assessments. An approach for documenting contributions of team scientists in team-based scholarship, non-traditional education, and specialized service activities is also outlined. While biostatisticians are used for illustration, the approach is generalizable to team scientists in other disciplines. PMID:25993282

  19. Detection of blur artifacts in histopathological whole-slide images of endomyocardial biopsies.

    PubMed

    Hang Wu; Phan, John H; Bhatia, Ajay K; Cundiff, Caitlin A; Shehata, Bahig M; Wang, May D

    2015-01-01

    Histopathological whole-slide images (WSIs) have emerged as an objective and quantitative means for image-based disease diagnosis. However, WSIs may contain acquisition artifacts that affect downstream image feature extraction and quantitative disease diagnosis. We develop a method for detecting blur artifacts in WSIs using distributions of local blur metrics. As features, these distributions enable accurate classification of WSI regions as sharp or blurry. We evaluate our method using over 1000 portions of an endomyocardial biopsy (EMB) WSI. Results indicate that local blur metrics accurately detect blurry image regions.
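
    A minimal sketch of the approach described above: compute a local blur metric on image tiles (here the variance of the Laplacian, a common sharpness measure) and use the distribution of tile values as features for a sharp-versus-blurry classifier. The tile size, metric, and feature encoding are illustrative stand-ins for the paper's specific local blur metrics.

    import numpy as np
    from scipy.ndimage import laplace

    def tile_blur_features(gray, tile=128, bins=16):
        """Normalized histogram of per-tile variance-of-Laplacian values."""
        h, w = gray.shape
        vals = []
        for i in range(0, h - tile + 1, tile):
            for j in range(0, w - tile + 1, tile):
                patch = gray[i:i + tile, j:j + tile].astype(float)
                vals.append(laplace(patch).var())
        hist, _ = np.histogram(vals, bins=bins, range=(0.0, max(vals) + 1e-9))
        return hist / len(vals)

    # Feature vectors from labelled sharp/blurry regions can then train any
    # standard classifier (e.g., logistic regression) to flag blurry regions.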

  20. Collaborating to improve the use of free-energy and other quantitative methods in drug discovery

    NASA Astrophysics Data System (ADS)

    Sherborne, Bradley; Shanmugasundaram, Veerabahu; Cheng, Alan C.; Christ, Clara D.; DesJarlais, Renee L.; Duca, Jose S.; Lewis, Richard A.; Loughney, Deborah A.; Manas, Eric S.; McGaughey, Georgia B.; Peishoff, Catherine E.; van Vlijmen, Herman

    2016-12-01

    In May and August, 2016, several pharmaceutical companies convened to discuss and compare experiences with Free Energy Perturbation (FEP). This unusual synchronization of interest was prompted by Schrödinger's FEP+ implementation and offered the opportunity to share fresh studies with FEP and enable broader discussions on the topic. This article summarizes key conclusions of the meetings, including a path forward of actions for this group to aid the accelerated evaluation, application and development of free energy and related quantitative, structure-based design methods.

  1. Quantitative Evaluation of Third Year Medical Students' Perception and Satisfaction from Problem Based Learning in Anatomy: A Pilot Study of the Introduction of Problem Based Learning into the Traditional Didactic Medical Curriculum in Nigeria

    ERIC Educational Resources Information Center

    Saalu, L. C.; Abraham A. A.; Aina, W. O.

    2010-01-01

    Problem-based learning (PBL) is a method of teaching that uses hypothetical clinical cases, individual investigation and group process. In recent years, in medical education, problem-based learning (PBL) has increasingly been adopted as the preferred pedagogy in many countries around the world. Controversy, however, still exists as the potential…

  2. Characterization and quantitation of polyolefin microplastics in personal-care products using high-temperature gel-permeation chromatography.

    PubMed

    Hintersteiner, Ingrid; Himmelsbach, Markus; Buchberger, Wolfgang W

    2015-02-01

    In recent years, the development of reliable methods for the quantitation of microplastics in different samples, including for evaluating the particles' adverse effects in the marine environment, has become a major concern. Because polyolefins are the most prevalent type of polymer in personal-care products containing microplastics, this study presents a novel approach for their quantitation. The method is suitable for aqueous and hydrocarbon-based products, and includes a rapid sample clean-up involving twofold density separation and subsequent quantitation with high-temperature gel-permeation chromatography. In contrast with previous procedures, both errors caused by weighing after insufficient separation of plastics and matrix and time-consuming visual sorting are avoided. In addition to reliable quantitative results, this investigation provides a comprehensive characterization of the polymer particles isolated from the product matrix, covering size, shape, molecular weight distribution and stabilization. Results for seven different personal-care products are presented. Recoveries of this method were in the range of 92-96 %.

  3. Dynamic calibration approach for determining catechins and gallic acid in green tea using LC-ESI/MS.

    PubMed

    Bedner, Mary; Duewer, David L

    2011-08-15

    Catechins and gallic acid are antioxidant constituents of Camellia sinensis, or green tea. Liquid chromatography with both ultraviolet (UV) absorbance and electrospray ionization mass spectrometric (ESI/MS) detection was used to determine catechins and gallic acid in three green tea matrix materials that are commonly used as dietary supplements. The results from both detection modes were evaluated with 14 quantitation models, all of which were based on the analyte response relative to an internal standard. Half of the models were static, where quantitation was achieved with calibration factors that were constant over an analysis set. The other half were dynamic, with calibration factors calculated from interpolated response factor data at each time a sample was injected to correct for potential variations in analyte response over time. For all analytes, the relatively nonselective UV responses were found to be very stable over time and independent of the calibrant concentration; comparable results with low variability were obtained regardless of the quantitation model used. Conversely, the highly selective MS responses were found to vary both with time and as a function of the calibrant concentration. A dynamic quantitation model based on polynomial data-fitting was used to reduce the variability in the quantitative results using the MS data.
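
    A minimal sketch of the dynamic-calibration idea described above: the internal-standard-relative response factor of the calibrant is measured repeatedly over the analysis set, fitted as a function of time (polynomial fitting, as in the paper), and evaluated at each sample's injection time to give the calibration factor at that moment. All data and names here are illustrative assumptions.

    import numpy as np

    # Calibrant injections: (injection time in minutes, relative response factor).
    cal_times = np.array([0.0, 60.0, 120.0, 180.0, 240.0])
    cal_rrf = np.array([1.00, 0.97, 0.93, 0.90, 0.88])

    # Fit a low-order polynomial to the drifting response factor.
    coeffs = np.polyfit(cal_times, cal_rrf, deg=2)

    def analyte_conc(sample_time, analyte_area, istd_area, istd_conc):
        """Quantitate using the response factor interpolated at injection time."""
        rrf_t = np.polyval(coeffs, sample_time)
        return (analyte_area / istd_area) / rrf_t * istd_conc

    print(analyte_conc(sample_time=90.0, analyte_area=5.2e5,
                       istd_area=4.8e5, istd_conc=10.0))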

  4. Epitope mapping and targeted quantitation of the cardiac biomarker troponin by SID-MRM mass spectrometry.

    PubMed

    Zhao, Cheng; Trudeau, Beth; Xie, Helen; Prostko, John; Fishpaugh, Jeffrey; Ramsay, Carol

    2014-06-01

    The absolute quantitation of targeted proteins using MS provides a promising method to evaluate and verify biomarkers used in clinical diagnostics. In this study, the cardiac biomarker troponin I (TnI) was used as a model protein for method development. The epitope peptide of TnI was characterized by epitope excision followed by LC/MS/MS and acted as the surrogate peptide for targeted protein quantitation. The MRM-based MS assay used a stable isotope-labeled internal standard, which improved the selectivity, specificity, and sensitivity of the protein quantitation. In addition, plasma albumin depletion and affinity enrichment of TnI by anti-TnI mAb-coated microparticles reduced sample complexity, enhanced the dynamic range, and further improved the detection sensitivity for the targeted protein in the biological matrix. Quantitation of TnI, a low-abundance protein in human plasma, has therefore demonstrated the applicability of the targeted protein quantitation strategy through its epitope peptide determined by epitope mapping. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. RELATIVE MOLDINESS INDEX© AS PREDICTOR OF CHILDHOOD RESPIRATORY ILLNESS

    EPA Science Inventory

    The results of a traditional visual mold inspection were compared to a mold evaluation based on the Relative Moldiness Index (RMI). The RMI is calculated from mold-specific quantitative PCR (MSQPCR) measurements of the concentration of 36 species of molds in floor dust samples. ...

  6. High content screening of ToxCast compounds using Vala Sciences’ complex cell culturing systems (SOT)

    EPA Science Inventory

    US EPA’s ToxCast research program evaluates bioactivity for thousands of chemicals utilizing high-throughput screening assays to inform chemical testing decisions. Vala Sciences provides high content, multiplexed assays that utilize quantitative cell-based digital image analysis....

  7. Assessing genomic selection prediction accuracy in a dynamic barley breeding

    USDA-ARS?s Scientific Manuscript database

    Genomic selection is a method to improve quantitative traits in crops and livestock by estimating breeding values of selection candidates using phenotype and genome-wide marker data sets. Prediction accuracy has been evaluated through simulation and cross-validation, however validation based on prog...

  8. A Mixed Methods Approach to Evaluate Partnerships and Implementation of the Massachusetts Prevention and Wellness Trust Fund.

    PubMed

    Lee, Rebekka M; Ramanadhan, Shoba; Kruse, Gina R; Deutsch, Charles

    2018-01-01

    Background: Strong partnerships are critical to integrate evidence-based prevention interventions within clinical and community-based settings, offering multilevel and sustainable solutions to complex health issues. As part of Massachusetts' 2012 health reform, The Prevention and Wellness Trust Fund (PWTF) funded nine local partnerships throughout the state to address hypertension, pediatric asthma, falls among older adults, and tobacco use. The initiative was designed to improve health outcomes through prevention and disease management strategies and reduce healthcare costs. Purpose: Describe the mixed-methods study design for investigating PWTF implementation. Methods: The Consolidated Framework for Implementation Research guided the development of this evaluation. First, the study team conducted semi-structured qualitative interviews with leaders from each of nine partnerships to document partnership development and function, intervention adaptation and delivery, and the influence of contextual factors on implementation. The interview findings were used to develop a quantitative survey to assess the implementation experiences of 172 staff from clinical and community-based settings and a social network analysis to assess changes in the relationships among 72 PWTF partner organizations. The quantitative survey data on ratings of perceived implementation success were used to purposively select 24 staff for interviews to explore the most successful experiences of implementing evidence-based interventions for each of the four conditions. Conclusions: This mixed-methods approach for evaluation of implementation of evidence-based prevention interventions by PWTF partnerships can help decision-makers set future priorities for implementing and assessing clinical-community partnerships focused on prevention.

  9. Quantitative probe of the transition metal redox in battery electrodes through soft x-ray absorption spectroscopy

    NASA Astrophysics Data System (ADS)

    Li, Qinghao; Qiao, Ruimin; Wray, L. Andrew; Chen, Jun; Zhuo, Zengqing; Chen, Yanxue; Yan, Shishen; Pan, Feng; Hussain, Zahid; Yang, Wanli

    2016-10-01

    Most battery positive electrodes operate with a 3d transition-metal (TM) reaction centre. A direct and quantitative probe of the TM states upon electrochemical cycling is valuable for understanding the detailed cycling mechanism and charge diffusion in the electrodes, which are related to many practical parameters of a battery. This review includes a comprehensive summary of our recent demonstrations of five different types of quantitative analysis of TM states in battery electrodes based on soft x-ray absorption spectroscopy and multiplet calculations. In LiFePO4, a system of the well-known two-phase transformation type, the TM redox could be strictly determined through a simple linear combination of the two end-members. In Mn-based compounds, the Mn states could also be quantitatively evaluated, but a set of reference spectra with all three possible Mn valences needs to be deliberately selected and considered in the fitting. Although the fluorescence signals suffer from self-absorption distortion, the multiplet calculations could account for the distortion effect, which allows a quantitative determination of the overall Ni oxidation state in the bulk. With the aid of multiplet calculations, one could also achieve a quasi-quantitative analysis of the Co redox evolution in LiCoO2 based on the energy position of the spectroscopic peak. The benefit of multiplet calculations is more important for studying electrode materials with TMs of mixed spin states, as exemplified by the quantitative analysis of the mixed-spin Na2-xFe2(CN)6 system. Finally, we showcase that such quantitative analysis could provide valuable information for optimizing the electrochemical performance of Na0.44MnO2 electrodes for Na-ion batteries. The methodology summarized in this review could be extended to other energy application systems with a TM redox centre, for example, fuel cell and catalytic materials.
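
    A minimal sketch of the two-end-member linear-combination analysis described for LiFePO4: fit a measured spectrum as x times one end-member reference plus (1 - x) times the other by least squares. The spectra below are synthetic placeholders; a real analysis would also normalize the spectra and align energy grids.

    import numpy as np

    def end_member_fraction(measured, ref_a, ref_b):
        """Least-squares x such that measured ~ x * ref_a + (1 - x) * ref_b."""
        d = measured - ref_b
        basis = ref_a - ref_b
        x = np.dot(basis, d) / np.dot(basis, basis)
        return float(np.clip(x, 0.0, 1.0))

    e = np.linspace(700, 730, 300)             # energy grid (eV)
    ref_fe2 = np.exp(-(e - 708) ** 2 / 2)      # toy Fe2+ reference spectrum
    ref_fe3 = np.exp(-(e - 710) ** 2 / 2)      # toy Fe3+ reference spectrum
    measured = 0.35 * ref_fe2 + 0.65 * ref_fe3
    print(end_member_fraction(measured, ref_fe2, ref_fe3))  # ~0.35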

  10. A Caco-2 cell-based quantitative antioxidant activity assay for antioxidants.

    PubMed

    Wan, Hongxia; Liu, Dong; Yu, Xiangying; Sun, Haiyan; Li, Yan

    2015-05-15

    A Caco-2 cell-based antioxidant activity (CAA) assay for the quantitative evaluation of antioxidants was developed by optimizing the seeding density and culture time of Caco-2 cells, the incubation time and concentration of the fluorescent probe (2',7'-dichlorofluorescin diacetate, DCFH-DA), the mode and duration of incubation of antioxidants (pure phytochemicals) and DCFH-DA with cells, and the detection time of fluorescence. Results showed that the CAA assay had good reproducibility and could be used to evaluate the antioxidant activity of antioxidants under the following conditions: seeding density of 5 × 10(4)/well, cell culture time of 24 h, co-incubation of 60 μM DCFH-DA and pure phytochemicals with Caco-2 cells for 20 min, and fluorescence recorded for 90 min. Additionally, a significant correlation was observed between CAA values and rat plasma ORAC values following the intake of antioxidants for selected pure phytochemicals (R(2) = 0.815, p < 0.01), demonstrating the good biological relevance of the CAA assay. Copyright © 2014 Elsevier Ltd. All rights reserved.
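
    A hedged sketch of a common CAA quantitation scheme (after Wolfe and Liu's cellular antioxidant activity formulation): the CAA unit is 100 minus 100 times the ratio of the integrated fluorescence-time curve of antioxidant-treated cells to that of the oxidant-only control. The 90-min read window matches the abstract; the kinetic curves are placeholders, and the exact formula used in this paper is an assumption.

    import numpy as np

    def caa_value(t_min, f_sample, f_control):
        """CAA unit from fluorescence kinetics of sample and control wells."""
        sa = np.trapz(f_sample, t_min)   # area under sample curve
        ca = np.trapz(f_control, t_min)  # area under control curve
        return 100.0 - 100.0 * sa / ca

    t = np.linspace(0, 90, 19)   # fluorescence read times (min)
    control = 10 + 5 * t         # oxidant-only well (toy kinetics)
    treated = 10 + 2 * t         # antioxidant-treated well (toy kinetics)
    print(caa_value(t, treated, control))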

  11. A 96-well-plate-based optical method for the quantitative and qualitative evaluation of Pseudomonas aeruginosa biofilm formation and its application to susceptibility testing.

    PubMed

    Müsken, Mathias; Di Fiore, Stefano; Römling, Ute; Häussler, Susanne

    2010-08-01

    A major reason for bacterial persistence during chronic infections is the survival of bacteria within biofilm structures, which protect cells from environmental stresses, host immune responses and antimicrobial therapy. Thus, there is concern that laboratory methods developed to measure the antibiotic susceptibility of planktonic bacteria may not be relevant to chronic biofilm infections, and it has been suggested that alternative methods should test antibiotic susceptibility within a biofilm. In this paper, we describe a fast and reliable protocol for using 96-well microtiter plates for the formation of Pseudomonas aeruginosa biofilms; the method is easily adaptable for antimicrobial susceptibility testing. This method is based on bacterial viability staining in combination with automated confocal laser scanning microscopy. The procedure simplifies qualitative and quantitative evaluation of biofilms and has proven to be effective for standardized determination of antibiotic efficiency on P. aeruginosa biofilms. The protocol can be performed within approximately 60 h.

  12. Two-phase designs for joint quantitative-trait-dependent and genotype-dependent sampling in post-GWAS regional sequencing.

    PubMed

    Espin-Garcia, Osvaldo; Craiu, Radu V; Bull, Shelley B

    2018-02-01

    We evaluate two-phase designs to follow up findings from a genome-wide association study (GWAS) when the cost of regional sequencing in the entire cohort is prohibitive. We develop novel expectation-maximization-based inference under a semiparametric maximum likelihood formulation tailored for post-GWAS inference. A GWAS-SNP (where SNP is single nucleotide polymorphism) serves as a surrogate covariate in inferring association between a sequence variant and a normally distributed quantitative trait (QT). We assess test validity and quantify the efficiency and power of joint QT-SNP-dependent sampling and analysis under alternative sample allocations by simulation. Joint allocation balanced on SNP genotype and extreme-QT strata yields significant power improvements compared to marginal QT- or SNP-based allocations. We illustrate the proposed method and evaluate the sensitivity of sample allocation to sampling variation using data from a sequencing study of systolic blood pressure. © 2017 The Authors. Genetic Epidemiology Published by Wiley Periodicals, Inc.

  13. Simultaneous fingerprint, quantitative analysis and anti-oxidative based screening of components in Rhizoma Smilacis Glabrae using liquid chromatography coupled with Charged Aerosol and Coulometric array Detection.

    PubMed

    Yang, Guang; Zhao, Xin; Wen, Jun; Zhou, Tingting; Fan, Guorong

    2017-04-01

    An analytical approach including fingerprinting, quantitative analysis and rapid screening of anti-oxidative components was established and successfully applied to the comprehensive quality control of Rhizoma Smilacis Glabrae (RSG), a well-known Traditional Chinese Medicine with the homology of medicine and food. Thirteen components were tentatively identified based on their retention behavior, UV absorption and MS fragmentation patterns. Chemometric analysis based on coulometric array data was performed to evaluate the similarity and variation between fifteen batches. Eight discriminating components were quantified using single-compound calibration. The unit responses of those components in coulometric array detection were calculated and compared with those of several compounds reported to possess antioxidant activity, and four of them were tentatively identified as main contributors to the total anti-oxidative activity. The main advantage of the proposed approach is that it realizes simultaneous fingerprinting, quantitative analysis and screening of anti-oxidative components, providing comprehensive information for the quality assessment of RSG. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Evaluation of three techniques for classifying urban land cover patterns using LANDSAT MSS data. [New Orleans, Louisiana

    NASA Technical Reports Server (NTRS)

    Baumann, P. R. (Principal Investigator)

    1979-01-01

    Three computer quantitative techniques for determining urban land cover patterns are evaluated. The techniques examined deal with the selection of training samples by an automated process, the overlaying of two scenes from different seasons of the year, and the use of individual pixels as training points. Evaluation is based on the number and type of land cover classes generated and the marks obtained from an accuracy test. New Orleans, Louisiana and its environs form the study area.

  15. [Simultaneous quantitative analysis of five alkaloids in Sophora flavescens by multi-components assay by single marker].

    PubMed

    Chen, Jing; Wang, Shu-Mei; Meng, Jiang; Sun, Fei; Liang, Sheng-Wang

    2013-05-01

    To establish a new method for quality evaluation and validate its feasibility, a simultaneous quantitative assay of five alkaloids in Sophora flavescens was performed. The new quality evaluation method, quantitative analysis of multi-components by single marker (QAMS), was established and validated with S. flavescens. Five main alkaloids, oxymatrine, sophocarpine, matrine, oxysophocarpine and sophoridine, were selected as analytes to evaluate the quality of the rhizome of S. flavescens, and the relative correction factors showed good repeatability. Their contents in 21 batches of samples, collected from different areas, were determined by both the external standard method and QAMS. The method was evaluated by comparing the quantitative results of the external standard method and QAMS. No significant differences were found in the quantitative results of the five alkaloids in the 21 batches of S. flavescens determined by the two methods. It is feasible and suitable to evaluate the quality of the rhizome of S. flavescens by QAMS.
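
    A minimal sketch of QAMS-style quantitation as generally practiced: during method development, a relative correction factor (RCF) for each analyte is established against the single marker; in routine analysis only the marker standard is injected, and the other analytes are quantified from their peak areas via their RCFs. The factor definition and all values below are illustrative assumptions, not the paper's data.

    def relative_correction_factor(c_analyte, a_analyte, c_marker, a_marker):
        """RCF such that c_analyte = rcf * c_marker * a_analyte / a_marker."""
        return (c_analyte * a_marker) / (c_marker * a_analyte)

    def qams_concentration(a_analyte, rcf, c_marker_std, a_marker_std):
        """Concentration of an analyte from its area and the marker standard."""
        return rcf * c_marker_std * a_analyte / a_marker_std

    # Calibration (standards of both compounds run once during development):
    rcf_oxymatrine = relative_correction_factor(
        c_analyte=50.0, a_analyte=1.2e6, c_marker=40.0, a_marker=1.0e6)

    # Routine run (only the marker standard injected):
    print(qams_concentration(a_analyte=0.9e6, rcf=rcf_oxymatrine,
                             c_marker_std=40.0, a_marker_std=1.05e6))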

  16. Quantitative Assessment of In-solution Digestion Efficiency Identifies Optimal Protocols for Unbiased Protein Analysis*

    PubMed Central

    León, Ileana R.; Schwämmle, Veit; Jensen, Ole N.; Sprenger, Richard R.

    2013-01-01

    The majority of mass spectrometry-based protein quantification studies uses peptide-centric analytical methods and thus strongly relies on efficient and unbiased protein digestion protocols for sample preparation. We present a novel objective approach to assess protein digestion efficiency using a combination of qualitative and quantitative liquid chromatography-tandem MS methods and statistical data analysis. In contrast to previous studies we employed both standard qualitative as well as data-independent quantitative workflows to systematically assess trypsin digestion efficiency and bias using mitochondrial protein fractions. We evaluated nine trypsin-based digestion protocols, based on standard in-solution or on spin filter-aided digestion, including new optimized protocols. We investigated various reagents for protein solubilization and denaturation (dodecyl sulfate, deoxycholate, urea), several trypsin digestion conditions (buffer, RapiGest, deoxycholate, urea), and two methods for removal of detergents before analysis of peptides (acid precipitation or phase separation with ethyl acetate). Our data-independent quantitative liquid chromatography-tandem MS workflow quantified over 3700 distinct peptides with 96% completeness between all protocols and replicates, with an average 40% protein sequence coverage and an average of 11 peptides identified per protein. Systematic quantitative and statistical analysis of physicochemical parameters demonstrated that deoxycholate-assisted in-solution digestion combined with phase transfer allows for efficient, unbiased generation and recovery of peptides from all protein classes, including membrane proteins. This deoxycholate-assisted protocol was also optimal for spin filter-aided digestions as compared with existing methods. PMID:23792921

  17. Quantitative multi-modality imaging analysis of a bioabsorbable poly-L-lactic acid stent design in the acute phase: a comparison between 2- and 3D-QCA, QCU and QMSCT-CA.

    PubMed

    Bruining, Nico; Tanimoto, Shuzou; Otsuka, Masato; Weustink, Annick; Ligthart, Jurgen; de Winter, Sebastiaan; van Mieghem, Carlos; Nieman, Koen; de Feyter, Pim J; van Domburg, Ron T; Serruys, Patrick W

    2008-08-01

    To investigate whether three-dimensional (3D) based quantitative techniques are comparable to each other, to explore possible differences with respect to the reference method of 2D-QCA in the acute phase, and to study whether non-invasive MSCT could potentially be applied to quantify the luminal dimensions of a coronary segment stented with a novel bioabsorbable drug-eluting stent made of poly-L-lactic acid (PLLA). Quantitative imaging data from 16 patients enrolled at our institution in a first-in-man trial (ABSORB), who received a biodegradable stent and were imaged with standard coronary angiography and intravascular ultrasound, were compared. Shortly after stenting, the patients also underwent an MSCT procedure. Standard 2D-QCA showed significantly smaller stent lengths (p < 0.01). Although the absolute stent diameters and areas measured by 2D-QCA tended to be smaller, the differences were not statistically significant when compared to the 3D-based quantitative modalities. Measurements of implanted PLLA stents made by non-invasive QMSCT-CA appeared to be comparable to the other 3D modalities, without significant differences. Three-dimensional quantitative analyses showed results similar to 2D-QCA in quantifying luminal dimensions during an evaluation of a new bioabsorbable coronary stent design in the acute phase. Furthermore, in biodegradable stents made of PLLA, non-invasive QMSCT-CA can be used to quantify luminal dimensions.

  18. Extraction efficiency and implications for absolute quantitation of propranolol in mouse brain, liver and kidney thin tissue sections using droplet-based liquid microjunction surface sampling-HPLC ESI-MS/MS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kertesz, Vilmos; Weiskittel, Taylor M.; Vavek, Marissa

    Currently, absolute quantitation aspects of droplet-based surface sampling for thin tissue analysis using a fully automated autosampler/HPLC-ESI-MS/MS system are not fully evaluated. Knowledge of extraction efficiency and its reproducibility is required to judge the potential of the method for absolute quantitation of analytes from thin tissue sections. Methods: Adjacent thin tissue sections of propranolol-dosed mouse brain (10-μm-thick), kidney (10-μm-thick) and liver (8-, 10-, 16- and 24-μm-thick) were obtained. The absolute concentration of propranolol was determined in tissue punches from serial sections using standard bulk tissue extraction protocols and subsequent HPLC separations and tandem mass spectrometric analysis. These values were used to determine the propranolol extraction efficiency of the droplet-based surface sampling approach. Results: The extraction efficiency of propranolol from 10-μm-thick brain, kidney and liver thin tissues using droplet-based surface sampling varied between ~45-63%. Extraction efficiency decreased from ~65% to ~36% as liver thickness increased from 8 μm to 24 μm. Randomly selecting half of the samples as standards, the precision and accuracy of the propranolol concentrations obtained for the other half of the samples, used as quality control metrics, were determined. The resulting precision (±15%) and accuracy (±3%) values were within acceptable limits. In conclusion, comparative quantitation of adjacent mouse thin tissue sections of different organs and of various thicknesses by droplet-based surface sampling and by bulk extraction of tissue punches showed that extraction efficiency was incomplete for the former method, and that it depended on the organ and tissue thickness. However, once the extraction efficiency was determined and applied, the droplet-based approach provided the quantitation accuracy and precision required for assay validations. This means that once the extraction efficiency is calibrated for a given tissue type and drug, the droplet-based approach provides a non-labor-intensive and high-throughput means to acquire spatially resolved quantitative analysis of multiple samples of the same type.

  19. Extraction efficiency and implications for absolute quantitation of propranolol in mouse brain, liver and kidney thin tissue sections using droplet-based liquid microjunction surface sampling-HPLC ESI-MS/MS

    DOE PAGES

    Kertesz, Vilmos; Weiskittel, Taylor M.; Vavek, Marissa; ...

    2016-06-22

    Currently, absolute quantitation aspects of droplet-based surface sampling for thin tissue analysis using a fully automated autosampler/HPLC-ESI-MS/MS system are not fully evaluated. Knowledge of extraction efficiency and its reproducibility is required to judge the potential of the method for absolute quantitation of analytes from thin tissue sections. Methods: Adjacent thin tissue sections of propranolol-dosed mouse brain (10-μm-thick), kidney (10-μm-thick) and liver (8-, 10-, 16- and 24-μm-thick) were obtained. The absolute concentration of propranolol was determined in tissue punches from serial sections using standard bulk tissue extraction protocols and subsequent HPLC separations and tandem mass spectrometric analysis. These values were used to determine the propranolol extraction efficiency of the droplet-based surface sampling approach. Results: The extraction efficiency of propranolol from 10-μm-thick brain, kidney and liver thin tissues using droplet-based surface sampling varied between ~45-63%. Extraction efficiency decreased from ~65% to ~36% as liver thickness increased from 8 μm to 24 μm. Randomly selecting half of the samples as standards, the precision and accuracy of the propranolol concentrations obtained for the other half of the samples, used as quality control metrics, were determined. The resulting precision (±15%) and accuracy (±3%) values were within acceptable limits. In conclusion, comparative quantitation of adjacent mouse thin tissue sections of different organs and of various thicknesses by droplet-based surface sampling and by bulk extraction of tissue punches showed that extraction efficiency was incomplete for the former method, and that it depended on the organ and tissue thickness. However, once the extraction efficiency was determined and applied, the droplet-based approach provided the quantitation accuracy and precision required for assay validations. This means that once the extraction efficiency is calibrated for a given tissue type and drug, the droplet-based approach provides a non-labor-intensive and high-throughput means to acquire spatially resolved quantitative analysis of multiple samples of the same type.
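
    A minimal sketch of the central calculation in this study: extraction efficiency is the analyte amount recovered by droplet-based surface sampling divided by the amount determined in an adjacent-section tissue punch by exhaustive bulk extraction, and the calibrated efficiency then corrects droplet-based measurements. The numbers are illustrative.

    def extraction_efficiency(droplet_amount, bulk_amount):
        """Fraction of analyte the droplet extraction recovers (0-1)."""
        return droplet_amount / bulk_amount

    def corrected_amount(droplet_measured, efficiency):
        """Apply the calibrated efficiency to a droplet-based measurement."""
        return droplet_measured / efficiency

    eff = extraction_efficiency(droplet_amount=4.5, bulk_amount=10.0)  # 0.45
    print(corrected_amount(droplet_measured=2.1, efficiency=eff))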

  20. Industrial ecology: Quantitative methods for exploring a lower carbon future

    NASA Astrophysics Data System (ADS)

    Thomas, Valerie M.

    2015-03-01

    Quantitative methods for environmental and cost analyses of energy, industrial, and infrastructure systems are briefly introduced and surveyed, with the aim of encouraging broader utilization and development of quantitative methods in sustainable energy research. Material and energy flow analyses can provide an overall system overview. The methods of engineering economics and cost benefit analysis, such as net present values, are the most straightforward approach for evaluating investment options, with the levelized cost of energy being a widely used metric in electricity analyses. Environmental lifecycle assessment has been extensively developed, with both detailed process-based and comprehensive input-output approaches available. Optimization methods provide an opportunity to go beyond engineering economics to develop detailed least-cost or least-impact combinations of many different choices.
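
    A minimal sketch of the levelized cost of energy (LCOE) metric mentioned above: discounted lifetime costs divided by discounted lifetime energy output. The inputs are illustrative placeholders.

    def lcoe(capital, annual_cost, annual_energy_kwh, rate, years):
        """LCOE in currency units per kWh, with constant annual cost and output."""
        disc_costs = capital + sum(annual_cost / (1 + rate) ** t
                                   for t in range(1, years + 1))
        disc_energy = sum(annual_energy_kwh / (1 + rate) ** t
                          for t in range(1, years + 1))
        return disc_costs / disc_energy

    # Example: a 1.2 M capital project producing 2 GWh/yr for 25 years at 6%.
    print(lcoe(capital=1_200_000, annual_cost=25_000,
               annual_energy_kwh=2_000_000, rate=0.06, years=25))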

  1. New approaches for the analysis of confluent cell layers with quantitative phase digital holographic microscopy

    NASA Astrophysics Data System (ADS)

    Pohl, L.; Kaiser, M.; Ketelhut, S.; Pereira, S.; Goycoolea, F.; Kemper, Björn

    2016-03-01

    Digital holographic microscopy (DHM) enables high-resolution, non-destructive inspection of technical surfaces and minimally invasive, label-free live cell imaging. However, the analysis of confluent cell layers represents a challenge, as quantitative DHM phase images in this case do not provide sufficient information for image segmentation, determination of the cellular dry mass, or calculation of the cell thickness. We present novel strategies for the analysis of confluent cell layers with quantitative DHM phase contrast utilizing a histogram-based evaluation procedure. The applicability of our approach is illustrated by the quantification of drug-induced cell morphology changes, and it is shown that the method is capable of reliably quantifying global morphology changes of confluent cell layers.

  2. Spatiotemporal Characterization of a Fibrin Clot Using Quantitative Phase Imaging

    PubMed Central

    Gannavarpu, Rajshekhar; Bhaduri, Basanta; Tangella, Krishnarao; Popescu, Gabriel

    2014-01-01

    Studying the dynamics of fibrin clot formation and its morphology is an important problem in biology and has significant impact on several scientific and clinical applications. We present a label-free technique based on quantitative phase imaging to address this problem. Using quantitative phase information, we characterized fibrin polymerization in real time and present a mathematical model describing the transition from liquid to gel state. By exploiting the inherent optical sectioning capability of our instrument, we measured the three-dimensional structure of the fibrin clot. From these data, we evaluated the fractal nature of the fibrin network and extracted the fractal dimension. Our non-invasive and speckle-free approach analyzes the clotting process without the need for external contrast agents. PMID:25386701
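
    The abstract extracts a fractal dimension from the measured clot structure without stating the estimator; box counting is one standard choice, sketched here in Python on a synthetic binary mask (the 2D stand-in data and the box-size ladder are assumptions):

    ```python
    import numpy as np

    def box_counting_dimension(mask, sizes=(2, 4, 8, 16, 32)):
        """Estimate a fractal dimension of a binary 2D structure by box counting:
        slope of log N(s) vs log(1/s), where N(s) counts occupied s-by-s boxes."""
        counts = []
        for s in sizes:
            h = mask.shape[0] // s * s
            w = mask.shape[1] // s * s
            blocks = mask[:h, :w].reshape(h // s, s, w // s, s)
            counts.append(blocks.any(axis=(1, 3)).sum())
        slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
        return slope

    rng = np.random.default_rng(0)
    mask = rng.random((256, 256)) > 0.7   # stand-in for a segmented fibrin network
    print(f"estimated dimension ~ {box_counting_dimension(mask):.2f}")
    ```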

  3. Development of iPad application "Postima" for quantitative analysis of the effects of manual therapy

    NASA Astrophysics Data System (ADS)

    Sugiyama, Naruhisa; Shirakawa, Tomohiro

    2017-07-01

    The technical difficulty of diagnosing joint misalignment and/or dysfunction by quantitative evaluation is commonly acknowledged among manual therapists. Usually, manual therapists make a diagnosis based on a combination of observing patient symptoms and performing physical examinations, both of which rely on subjective criteria and thus contain some uncertainty. We thus sought to investigate the correlations among posture, skeletal misalignment, and pain severity over the course of manual therapy treatment, and to explore the possibility of establishing objective criteria for diagnosis. For this purpose, we developed an iPad application that realizes the measurement of patients' postures and analyzes them quantitatively. We also discuss the results and effectiveness of the measurement and analysis.

  4. A Hospice Rotation for Military Medical Residents: A Mixed Methods, Multi-Perspective Program Evaluation

    PubMed Central

    Boyden, Jackelyn Y.; Kalish, Virginia B.; Muir, J. Cameron; Richardson, Suzanne; Connor, Stephen R.

    2016-01-01

    Abstract Background: An estimated 6,000 to 18,000 additional hospice and palliative medicine (HPM) physicians are needed in the United States. A source could be the military graduate medical education system, where 15% of U.S. medical residents are trained. A community-based hospice and palliative care organization created a one-week rotation for military residents including participation in interdisciplinary group visits at patients' homes, facilities, and an inpatient hospice unit. Objective: Our goal was to evaluate the effectiveness of a one-week community HPM rotation for military medical residents. Methods: A mixed-methods, multi-stakeholder perspective program evaluation model was used for program years 2011 to 2013. Data were managed and analyzed using Microsoft Excel and Atlas.ti. Participants in the rotation were residents training at two local military hospitals. Program evaluation data were collected from residents, military program liaisons, and hospice clinical preceptors. Quantitative data included pre- and post-tests based on Accreditation Council for Graduate Medical Education competencies completed by residents. Qualitative data included resident essays and semi-structured interviews with hospice preceptors and military program liaisons. Results: Quantitative and qualitative data suggested that the rotation increased military residents' knowledge, attitudes, and comfort level with HPM. Quantitative analysis of test scores indicated improvements from pre- to post-tests in each of five areas of learning. Qualitative data indicated the rotation created a greater appreciation for the overall importance of HPM and increased understanding of eligibility and methods for pain and symptom management. Conclusions: A one-week community hospice rotation for military medical residents impacts participants' knowledge of and attitudes toward HPM. PMID:27139524

  5. Emergy evaluation of water utilization benefits in water-ecological-economic system based on water cycle process

    NASA Astrophysics Data System (ADS)

    Guo, X.; Wu, Z.; Lv, C.

    2017-12-01

    The water utilization benefits are formed by the material flow, energy flow, information flow and value stream in the whole water cycle process, and are reflected in the material circulation within the system. However, most traditional evaluations of water utilization benefits operate at the macro level: they consider only the overall material input and output and energy conversion relations, and lack a characterization of how water utilization benefits form along the water cycle process. In addition, most studies take the perspective of economics, attending only to overall economic output and the economic investment in sewage treatment while neglecting the ecological function benefits of the water cycle. Therefore, from the perspective of internal material circulation in the whole system, taking the water cycle process as a process of material circulation and energy flow, the circulation and flow of water and other ecological-environmental and socio-economic elements were described, the composition of positive and negative water utilization benefits in the water-ecological-economic system was explored, and the performance of each benefit was analyzed. On this basis, an emergy calculation method for each benefit was proposed using emergy quantitative analysis, enabling the unified measurement and evaluation of water utilization benefits in the water-ecological-economic system. Taking Zhengzhou city as an example, the benefits corresponding to different water cycle links were calculated quantitatively by the emergy method. The results showed that the emergy evaluation method for water utilization benefits can unify the ecosystem and the economic system, achieve uniform quantitative analysis, and comprehensively measure the true value of natural resources and human economic activities.

  6. The dynamic micro computed tomography at SSRF

    NASA Astrophysics Data System (ADS)

    Chen, R.; Xu, L.; Du, G.; Deng, B.; Xie, H.; Xiao, T.

    2018-05-01

    Synchrotron radiation micro-computed tomography (SR-μCT) is a critical technique for quantitatively characterizing the 3D internal structure of samples; recently, dynamic SR-μCT has attracted considerable attention because it can follow the evolution of a sample's three-dimensional structure. A dynamic μCT method based on a monochromatic beam was developed at the X-ray Imaging and Biomedical Application Beamline at the Shanghai Synchrotron Radiation Facility by combining a compressed-sensing-based CT reconstruction algorithm with hardware upgrades. The monochromatic-beam-based method can provide quantitative information at a lower dose than the white-beam-based method, in which the lower-energy portion of the beam is absorbed by the sample rather than contributing to the final imaging signal. The developed method was successfully used to investigate the compression of the air sac during respiration in a bell cricket, providing new knowledge for further research on the insect respiratory system.

  7. A quantitative swab is a good non-invasive alternative to a quantitative biopsy for quantifying bacterial load in wounds healing by second intention in horses.

    PubMed

    Van Hecke, L L; Hermans, K; Haspeslagh, M; Chiers, K; Pint, E; Boyen, F; Martens, A M

    2017-07-01

    The aim of this study was to evaluate different techniques for diagnosing wound infection in wounds healing by second intention in horses and to assess the effect of a vortex and sonication protocol on quantitative bacteriology in specimens with a histologically confirmed biofilm. In 50 wounds healing by second intention, a clinical assessment, a quantitative swab, a semi-quantitative swab, and a swab for cytology were compared to a quantitative tissue biopsy (reference standard). Part of the biopsy specimen was examined histologically for evidence of a biofilm. There was a significant, high correlation (P<0.001; r=0.747) between the outcome of the quantitative swabs and the quantitative biopsies. The semi-quantitative swabs showed a significant, moderate correlation with the quantitative biopsies (P<0.001; ρ=0.524). Higher white blood cell counts for cytology were significantly associated with lower log10 colony-forming units (CFU) in the wounds (P=0.02). Wounds with black granulation tissue showed significantly higher log10 CFU (P=0.003). Specimens with biofilms did not yield higher bacteriological counts after a vortex and sonication protocol was performed to release bacteria from the biofilm. Based on these findings, a quantitative swab is an acceptable non-invasive alternative to a quantitative biopsy for quantifying bacterial load in equine wounds healing by second intention. Copyright © 2017 Elsevier Ltd. All rights reserved.
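
    The reported agreement statistics are plain correlation coefficients on paired bacterial counts; a minimal sketch, with made-up log10 CFU values, is:

    ```python
    # Sketch of the agreement analysis: correlate log10 CFU from swabs against
    # quantitative biopsies (the reference standard). Data are invented.
    import numpy as np
    from scipy.stats import pearsonr, spearmanr

    log_cfu_biopsy = np.array([3.1, 4.5, 5.2, 2.8, 6.0, 4.9, 3.7])
    log_cfu_swab   = np.array([3.4, 4.2, 5.5, 3.0, 5.8, 4.6, 3.9])

    r, p = pearsonr(log_cfu_biopsy, log_cfu_swab)       # quantitative swab vs biopsy
    rho, p_s = spearmanr(log_cfu_biopsy, log_cfu_swab)  # rank-based, as for the semi-quantitative swab
    print(f"Pearson r = {r:.3f} (P = {p:.3g}); Spearman rho = {rho:.3f}")
    ```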

  8. Effectiveness of inquiry-based learning in an undergraduate exercise physiology course.

    PubMed

    Nybo, Lars; May, Michael

    2015-06-01

    The present study was conducted to investigate the effects of changing a laboratory physiology course for undergraduate students from a traditional step-by-step guided structure to an inquiry-based approach. With this aim in mind, quantitative and qualitative evaluations of learning outcomes (individual subject-specific tests and group interviews) were performed for a laboratory course in cardiorespiratory exercise physiology that was conducted in one year with a traditional step-by-step guided manual (traditional course) and in the following year with an inquiry-based structure (I-based course). The I-based course was a guided inquiry course in which students had to design the experimental protocol and conduct their own study on the basis of certain predefined criteria (i.e., they should evaluate respiratory responses to submaximal and maximal exercise and provide indirect and direct measures of aerobic exercise capacity). The results indicated that the overall time spent on the experimental course as well as self-evaluated learning outcomes were similar across groups. However, students in the I-based course used more time in preparation (102 ± 5 min) than students in the traditional course (42 ± 3 min, P < 0.05), and 65 ± 5% of students in the I-based course searched for additional literature before experimentation compared with only 2 ± 1% of students in the traditional course. Furthermore, students in the I-based course achieved a higher (P < 0.05) average score on the quantitative test (45 ± 3%) compared with students in the traditional course (31 ± 4%). Although students were unfamiliar with cardiorespiratory exercise physiology and the experimental methods before the course, it appears that an inquiry-based approach rather than one that provides students with step-by-step instructions may benefit learning outcomes in a laboratory physiology course. Copyright © 2015 The American Physiological Society.

  9. Quantitative assessment of tumour extraction from dermoscopy images and evaluation of computer-based extraction methods for an automatic melanoma diagnostic system.

    PubMed

    Iyatomi, Hitoshi; Oka, Hiroshi; Saito, Masataka; Miyake, Ayako; Kimoto, Masayuki; Yamagami, Jun; Kobayashi, Seiichiro; Tanikawa, Akiko; Hagiwara, Masafumi; Ogawa, Koichi; Argenziano, Giuseppe; Soyer, H Peter; Tanaka, Masaru

    2006-04-01

    The aims of this study were to provide a quantitative assessment of the tumour area extracted by dermatologists and to evaluate computer-based methods from dermoscopy images for refining a computer-based melanoma diagnostic system. Dermoscopic images of 188 Clark naevi, 56 Reed naevi and 75 melanomas were examined. Five dermatologists manually drew the border of each lesion with a tablet computer. The inter-observer variability was evaluated and the standard tumour area (STA) for each dermoscopy image was defined. Manual extractions by 10 non-medical individuals and by two computer-based methods were evaluated with STA-based assessment criteria: precision and recall. Our new computer-based method introduced a region-growing approach in order to yield results close to those obtained by dermatologists. The effectiveness of our extraction method with regard to diagnostic accuracy was evaluated. Two linear classifiers were built using the results of the conventional and new computer-based tumour area extraction methods. The final diagnostic accuracy was evaluated by drawing the receiver operating characteristic (ROC) curve of each classifier and evaluating the area under each curve. The standard deviations of the tumour area extracted by the five dermatologists and 10 non-medical individuals were 8.9% and 10.7%, respectively. After assessment of the extraction results by dermatologists, the STA was defined as the area selected by more than two dermatologists. Dermatologists selected the melanoma area with statistically smaller divergence than that of Clark naevus or Reed naevus (P = 0.05). By contrast, non-medical individuals did not show this difference. Our new computer-based extraction algorithm showed superior performance (precision, 94.1%; recall, 95.3%) to the conventional thresholding method (precision, 99.5%; recall, 87.6%). These results indicate that our new algorithm extracted a tumour area close to that obtained by dermatologists and, in particular, that the border of the tumour was adequately extracted. With this refinement, the area under the ROC curve increased from 0.795 to 0.875 and the diagnostic accuracy showed an increase of approximately 20% in specificity at 80% sensitivity. It can be concluded that our computer-based tumour extraction algorithm extracted almost the same area as that obtained by dermatologists and provided improved computer-based diagnostic accuracy.
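
    The precision and recall criteria used against the STA reduce to overlap counts between binary masks; a minimal sketch with synthetic masks:

    ```python
    # Sketch of the STA-based criteria: precision and recall of an extracted
    # tumour mask against the standard tumour area. Masks are made up.
    import numpy as np

    def precision_recall(extracted, sta):
        tp = np.logical_and(extracted, sta).sum()
        precision = tp / extracted.sum()   # fraction of extracted pixels inside the STA
        recall = tp / sta.sum()            # fraction of the STA that was extracted
        return precision, recall

    sta = np.zeros((100, 100), bool); sta[30:70, 30:70] = True
    extracted = np.zeros((100, 100), bool); extracted[32:72, 28:68] = True
    p, r = precision_recall(extracted, sta)
    print(f"precision = {p:.3f}, recall = {r:.3f}")
    ```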

  10. Effect of biofilm formation, and biocorrosion on denture base fractures.

    PubMed

    Sahin, Cem; Ergin, Alper; Ayyildiz, Simel; Cosgun, Erdal; Uzun, Gulay

    2013-05-01

    The aim of this study was to investigate the destructive effects of biofilm formation and/or the biocorrosive activity of 6 different oral microorganisms. Three different heat-polymerized acrylic resins (Ivocap Plus, Lucitone 550, QC 20) were used to prepare three different types of samples. Type "A" samples with a "V"-type notch were used to measure fracture strength, type "B" to evaluate the surfaces with scanning electron microscopy, and type "C" for the quantitative biofilm assay. Development and calculation of biofilm-covered surfaces on the denture base materials were accomplished by SEM and a quantitative biofilm assay. According to normality assumptions, ANOVA or Kruskal-Wallis was selected for statistical analysis (α=0.05). Significant differences were obtained among the adhesion potentials of the 6 different microorganisms, and there were significant differences among their adhesion onto the 3 different denture base materials. Compared to the control groups, after contamination with the microorganisms the three-point bending test values of the denture base materials decreased significantly (P<.05); microorganisms diffused over at least 52% of the denture base surface. The highest median quantitative biofilm value among all the denture base materials was obtained with P. aeruginosa on Lucitone 550. The type of denture base material did not alter the diffusion potential of the microorganisms significantly (P>.05). All the tested microorganisms had a destructive effect on the structure and composition of the denture base materials.

  11. Effect of biofilm formation, and biocorrosion on denture base fractures

    PubMed Central

    Ergin, Alper; Ayyildiz, Simel; Cosgun, Erdal; Uzun, Gulay

    2013-01-01

    PURPOSE The aim of this study was to investigate the destructive effects of biofilm formation and/or the biocorrosive activity of 6 different oral microorganisms. MATERIALS AND METHODS Three different heat-polymerized acrylic resins (Ivocap Plus, Lucitone 550, QC 20) were used to prepare three different types of samples. Type "A" samples with a "V"-type notch were used to measure fracture strength, type "B" to evaluate the surfaces with scanning electron microscopy, and type "C" for the quantitative biofilm assay. Development and calculation of biofilm-covered surfaces on the denture base materials were accomplished by SEM and a quantitative biofilm assay. According to normality assumptions, ANOVA or Kruskal-Wallis was selected for statistical analysis (α=0.05). RESULTS Significant differences were obtained among the adhesion potentials of the 6 different microorganisms, and there were significant differences among their adhesion onto the 3 different denture base materials. Compared to the control groups, after contamination with the microorganisms the three-point bending test values of the denture base materials decreased significantly (P<.05); microorganisms diffused over at least 52% of the denture base surface. The highest median quantitative biofilm value among all the denture base materials was obtained with P. aeruginosa on Lucitone 550. The type of denture base material did not alter the diffusion potential of the microorganisms significantly (P>.05). CONCLUSION All the tested microorganisms had a destructive effect on the structure and composition of the denture base materials. PMID:23755339

  12. Comparison of methods for quantitative evaluation of endoscopic distortion

    NASA Astrophysics Data System (ADS)

    Wang, Quanzeng; Castro, Kurt; Desai, Viraj N.; Cheng, Wei-Chung; Pfefer, Joshua

    2015-03-01

    Endoscopy is a well-established paradigm in medical imaging, and emerging endoscopic technologies such as high-resolution, capsule and disposable endoscopes promise significant improvements in effectiveness, as well as in patient safety and acceptance of endoscopy. However, the field lacks practical standardized test methods to evaluate key optical performance characteristics (OPCs), in particular the geometric distortion caused by fisheye lens effects in clinical endoscopic systems. As a result, it has been difficult to evaluate an endoscope's image quality or assess its changes over time. The goal of this work was to identify optimal techniques for objective, quantitative characterization of distortion that are effective and not burdensome. Specifically, distortion measurements from a commercially available distortion evaluation/correction software package were compared with a custom algorithm based on a local magnification (ML) approach. Measurements were performed using a clinical gastroscope to image square grid targets. Recorded images were analyzed with the ML approach and with the commercial software, and the results were used to obtain corrected images. Corrected images based on the ML approach and on the software were compared. The study showed that the ML method could assess distortion patterns more accurately than the commercial software. Overall, the development of standardized test methods for characterizing distortion and other OPCs will facilitate development, clinical translation, manufacturing quality, and assurance of performance during clinical use of endoscopic technologies.
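
    A local magnification measure of this kind can be sketched as the spacing between neighbouring grid-target points normalized to the spacing at the image centre; the synthetic barrel-distorted grid and the normalization choice below are illustrative assumptions, not the authors' exact algorithm:

    ```python
    import numpy as np

    def local_magnification(points):
        """points: (rows, cols, 2) array of detected grid-corner image coordinates.
        Returns horizontal local magnification normalized to the central cell
        (1.0 = no distortion)."""
        dx = np.linalg.norm(np.diff(points, axis=1), axis=-1)  # spacing between adjacent columns
        centre = dx[dx.shape[0] // 2, dx.shape[1] // 2]
        return dx / centre

    # Synthetic barrel-distorted grid: spacing shrinks away from the centre
    rows, cols = 9, 9
    yy, xx = np.mgrid[0:rows, 0:cols].astype(float)
    xc, yc = xx - cols // 2, yy - rows // 2
    factor = 1 - 0.004 * (xc ** 2 + yc ** 2)
    pts = np.stack([xc * factor, yc * factor], axis=-1)
    ml = local_magnification(pts)
    print(f"ML range: {ml.min():.2f} to {ml.max():.2f}")
    ```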

  13. WE-AB-207A-12: HLCC Based Quantitative Evaluation Method of Image Artifact in Dental CBCT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Y; Wu, S; Qi, H

    Purpose: Image artifacts are usually evaluated qualitatively via visual observation of the reconstructed images, which is susceptible to subjective factors due to the lack of an objective evaluation criterion. In this work, we propose a Helgason-Ludwig consistency condition (HLCC) based evaluation method to quantify the severity of different image artifacts in dental CBCT. Methods: Our evaluation method consists of four steps: 1) acquire cone-beam CT (CBCT) projections; 2) convert the 3D CBCT projection to a fan-beam projection by extracting its central-plane projection; 3) convert the fan-beam projection to a parallel-beam projection using a sinogram-based or detail-based rebinning algorithm; 4) obtain the HLCC profile by integrating the parallel-beam projection per view, and calculate the wave percentage and variance of the HLCC profile, which can be used to describe the severity of image artifacts. Results: Several sets of dental CBCT projections, each containing only one type of artifact (i.e., geometry, scatter, beam hardening, lag or noise), were simulated using gDRR, a GPU tool developed for efficient, accurate, and realistic simulation of CBCT projections. These simulated projections were used to test the proposed method. The HLCC profile wave percentage and variance induced by geometry distortion are about 3-21 times and 16-393 times those of the artifact-free projection, respectively. The corresponding increase factors are 6 and 133 times for beam hardening, 19 and 1184 times for scatter, and 4 and 16 times for lag artifacts. In contrast, for the noisy projection the wave percentage, variance and inconsistency level are almost the same as those of the noise-free one. Conclusion: We have proposed a quantitative evaluation method for image artifacts based on HLCC theory. According to our simulation results, the severity of the different artifact types in dental CBCT follows the order: scatter > geometry > beam hardening > lag > noise > artifact-free.
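
    The zeroth-order consistency condition behind this method says that the per-view integral of an ideal parallel-beam projection is constant, so fluctuations of that profile measure inconsistency. The sketch below implements plausible readings of the wave-percentage and variance metrics (the record does not give the authors' exact formulas):

    ```python
    # Zeroth-order HLCC check: for consistent parallel-beam data the per-view
    # integral is constant; its fluctuation quantifies artifact severity.
    # Metric definitions here are assumptions, not the paper's exact ones.
    import numpy as np

    def hlcc_metrics(parallel_sinogram):
        """parallel_sinogram: (n_views, n_detectors) array of line integrals."""
        profile = parallel_sinogram.sum(axis=1)          # integral per view
        wave_percentage = (profile.max() - profile.min()) / profile.mean() * 100.0
        return wave_percentage, profile.var()

    rng = np.random.default_rng(1)
    ideal = np.full((360, 512), 1.0)                     # consistent projections
    corrupted = ideal * (1 + 0.05 * np.sin(np.linspace(0, 2 * np.pi, 360)))[:, None]
    print(hlcc_metrics(ideal + rng.normal(0, 1e-3, ideal.shape)))
    print(hlcc_metrics(corrupted))
    ```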

  14. 3D-quantitative structure-activity relationship study for the design of novel enterovirus A71 3C protease inhibitors.

    PubMed

    Nie, Quandeng; Xu, Xiaoyi; Zhang, Qi; Ma, Yuying; Yin, Zheng; Shang, Luqing

    2018-06-07

    A three-dimensional quantitative structure-activity relationship model of enterovirus A71 3C protease inhibitors was constructed in this study. The protein-ligand interaction fingerprint was analyzed to generate a pharmacophore model. A predictive and reliable three-dimensional quantitative structure-activity relationship model was built based on the Flexible Alignment of AutoGPA. Moreover, three novel compounds (I-III) were designed and evaluated for their biochemical activity against 3C protease and anti-enterovirus A71 activity in vitro. III exhibited excellent inhibitory activity (IC50 = 0.031 ± 0.005 μM, EC50 = 0.036 ± 0.007 μM). Thus, this study provides a useful quantitative structure-activity relationship model to develop potent inhibitors for enterovirus A71 3C protease. This article is protected by copyright. All rights reserved.

  15. Modeling and Development of INS-Aided PLLs in a GNSS/INS Deeply-Coupled Hardware Prototype for Dynamic Applications

    PubMed Central

    Zhang, Tisheng; Niu, Xiaoji; Ban, Yalong; Zhang, Hongping; Shi, Chuang; Liu, Jingnan

    2015-01-01

    A GNSS/INS deeply-coupled system can improve satellite signal tracking performance under dynamics by using the INS to aid the tracking loops. However, no literature was available on complete modeling of the INS branch in the INS-aided tracking loop, leaving no theoretical tool to guide the selection of inertial sensors, parameter optimization, or quantitative analysis of INS-aided PLLs. This paper addresses the modeling of the INS branch and the parameter optimization of phase-locked loops (PLLs) in a scalar-based GNSS/INS deeply-coupled system. It establishes the transfer function between all known error sources and the PLL tracking error, which can be used to quantitatively evaluate how a candidate inertial measurement unit (IMU) affects the carrier-phase tracking error. On this basis, a steady-state error model is proposed for designing INS-aided PLLs and analyzing their tracking performance. Building on the modeling and error analysis, an integrated deeply-coupled hardware prototype is developed with optimized aiding information. Finally, the performance of the INS-aided PLLs designed with the proposed steady-state error model is evaluated through simulations and road tests of the hardware prototype. PMID:25569751

  16. Radar-derived Quantitative Precipitation Estimation in Complex Terrain over the Eastern Tibetan Plateau

    NASA Astrophysics Data System (ADS)

    Gou, Y.

    2017-12-01

    Quantitative Precipitation Estimation (QPE) is one of the important applications of weather radars. However, in complex terrain such as the Tibetan Plateau, obtaining an optimal Z-R relation is a challenging task due to the complex space-time variability of precipitation microphysics. This paper develops two radar QPE schemes, based respectively on a Reflectivity Threshold (RT) and on the Storm Cell Identification and Tracking (SCIT) algorithm, using observations from 11 Doppler weather radars and 3294 rain gauges over the Eastern Tibetan Plateau (ETP). The two QPE methodologies are evaluated extensively on four precipitation events characterized by different meteorological features. Precipitation characteristics of the independent storm cells associated with these four events, as well as storm-scale differences, are investigated using short-term vertical profiles of reflectivity clusters. The evaluation, using validation gauge measurements as references, shows that the SCIT-based rainfall approach performs better than the simple RT-based method in all precipitation events, with higher correlation in more than 75.74%, lower mean absolute error in more than 82.38%, and lower root-mean-square error in more than 89.04% of all comparative frames. It is also found that the SCIT-based approach can effectively mitigate local radar QPE error and represent the spatiotemporal variability of precipitation better than the RT-based scheme.
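
    Any such radar QPE scheme rests on a Z-R power law, Z = aR^b, inverted to obtain rain rate from reflectivity. A minimal sketch with the textbook Marshall-Palmer coefficients follows; a SCIT-style scheme would substitute storm-cell-specific (a, b) pairs:

    ```python
    # Z-R power law underlying radar QPE: Z = a * R**b, so R = (Z / a)**(1 / b).
    # Marshall-Palmer coefficients (a=200, b=1.6) are a textbook default, not
    # the values fitted in the study.
    def rain_rate_from_dbz(dbz, a=200.0, b=1.6):
        """Convert reflectivity in dBZ to rain rate in mm/h via Z = a R^b."""
        z_linear = 10.0 ** (dbz / 10.0)   # dBZ -> mm^6 m^-3
        return (z_linear / a) ** (1.0 / b)

    for dbz in (20, 35, 50):
        print(f"{dbz} dBZ -> {rain_rate_from_dbz(dbz):.1f} mm/h")
    ```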

  17. Quantitative description of solid breast nodules by ultrasound imaging

    NASA Astrophysics Data System (ADS)

    Sehgal, Chandra M.; Kangas, Sarah A.; Cary, Ted W.; Weinstein, Susan P.; Schultz, Susan M.; Arger, Peter H.; Conant, Emily F.

    2004-04-01

    Various features based on qualitative description of shape, contour, margin and echogenicity of solid breast nodules are used clinically to classify them as benign or malignant. However, there continues to be considerable overlap in the sonographic findings for the two types of lesions. This is related to the lack of precise definition of the various features as well as to the lack of agreement among observers, among other factors. The goal of this investigation is to define clinical features quantitatively and evaluate if they differ significantly in malignant and benign cases. Features based on margin sharpness and continuity, shadowing, and attenuation were defined and calculated from the images. These features were tested on digital phantoms. Following the evaluation, the features were measured on 116 breast sonograms of 58 biopsy-proven masses. Biopsy had been recommended for all of these breast lesions based on physical exams and conventional diagnostic imaging of ultrasound and/or mammography. Of the 58 masses, 20 were identified as malignant and 38 as benign histologically. Margin sharpness, margin echogenicity, and angular margin variation were significantly different for the two groups (p<0.03, two-tailed student t-test). Shadowing and attenuation of ultrasound did not show significant difference. The results of this preliminary study show that quantitative margin characteristics measured for the malignant and benign masses from the ultrasound images are different and could potentially be useful in identifying a subgroup of solid breast nodules that have low risk of being malignant.

  18. Quantitative evaluation research of glare from automotive headlamps

    NASA Astrophysics Data System (ADS)

    Wang, Tiecheng; Qian, Rui; Cao, Ye; Gao, Mingqiu

    2018-01-01

    This study concerns the quantitative evaluation of glare from automotive headlamps. Under current regulations, only one point on the test screen is used to judge whether a driver can bear the light from the headlamps of an oncoming vehicle. To evaluate the practical effect of glare, we adopt a glare zone incorporating the probability distribution of the oncoming driver's eye position. Within this focus area, the glare level of a headlamp is represented by a weighted luminous flux. To determine the illuminance value most comfortable to human eyes at 50 m, we used test point B50L as the observation position and collected 1,000 subjective evaluation data points from 20 test personnel of different ages over two months. Based on the assessment results, we calculated 0.60 lx as the recommended value for a standardized testing procedure at 25 m. We then derived 0.38 lm as the optimum value and 0.25/1.20 lm as the limiting values according to the regulations. We tested 40 sample vehicles of different classes to verify the sectional nonlinear quantitative evaluation method we designed, and analyzed the typical test results.

  19. Analysis and Evaluation of Processes and Equipment in Tasks 2 and 4 of the Low-cost Solar Array Project

    NASA Technical Reports Server (NTRS)

    Wolf, M.

    1979-01-01

    To facilitate the task of objectively comparing competing process options, a methodology was needed for the quantitative evaluation of their relative cost effectiveness. Such a methodology was developed and is described, together with three examples for its application. The criterion for the evaluation is the cost of the energy produced by the system. The method permits the evaluation of competing design options for subsystems, based on the differences in cost and efficiency of the subsystems, assuming comparable reliability and service life, or of competing manufacturing process options for such subsystems, which include solar cells or modules. This process option analysis is based on differences in cost, yield, and conversion efficiency contribution of the process steps considered.

  20. An Evaluation of Automotive Interior Packages Based on Human Ocular and Joint Motor Properties

    NASA Astrophysics Data System (ADS)

    Tanaka, Yoshiyuki; Rakumatsu, Takeshi; Horiue, Masayoshi; Miyazaki, Tooru; Nishikawa, Kazuo; Nouzawa, Takahide; Tsuji, Toshio

    This paper proposes a new method for evaluating an automotive interior package based on human oculomotor and joint-motor properties. Assuming a long-term driving situation on an expressway, three evaluation indices were designed: i) the ratio of head motion when gazing at driving-related items; ii) the load torque for maintaining the standard driving posture; and iii) the human force manipulability at the end-points of the extremities. Experiments were carried out on two different interior packages with four subjects having specialist knowledge of automobile development. The evaluation results demonstrate that the proposed method can quantitatively analyze the driving interior in good agreement with the generally accepted subjective opinion in the automobile industry.

  1. Correction for FDG PET dose extravasations: Monte Carlo validation and quantitative evaluation of patient studies.

    PubMed

    Silva-Rodríguez, Jesús; Aguiar, Pablo; Sánchez, Manuel; Mosquera, Javier; Luna-Vega, Víctor; Cortés, Julia; Garrido, Miguel; Pombar, Miguel; Ruibal, Alvaro

    2014-05-01

    Current procedure guidelines for whole body [18F]fluoro-2-deoxy-D-glucose (FDG)-positron emission tomography (PET) state that studies with visible dose extravasations should be rejected for quantification protocols. Our work is focused on the development and validation of methods for estimating extravasated doses in order to correct standardized uptake values (SUVs) for this effect in clinical routine. One thousand three hundred sixty-seven consecutive whole body FDG-PET studies were visually inspected looking for extravasation cases. Two methods for estimating the extravasated dose were proposed and validated in different scenarios using Monte Carlo simulations. All visible extravasations were retrospectively evaluated using a manual ROI based method. In addition, the 50 patients with the highest extravasated doses were also evaluated using a threshold-based method. Simulation studies showed that the proposed methods for estimating extravasated doses allow us to compensate the impact of extravasations on SUV values with an error below 5%. The quantitative evaluation of patient studies revealed that paravenous injection is a relatively frequent effect (18%), with a small fraction of patients presenting considerable extravasations ranging from 1% to a maximum of 22% of the injected dose. A criterion based on the extravasated volume and maximum concentration was established in order to identify the fraction of patients that might be corrected for the paravenous injection effect. The authors propose the use of a manual ROI based method for estimating the effectively administered FDG dose and then correcting SUV quantification in those patients fulfilling the proposed criterion.
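
    The correction itself is simple bookkeeping once the extravasated activity has been estimated from a ROI: subtract it from the injected dose and recompute the SUV with the effectively administered activity. A hedged sketch (unit conventions and numbers are illustrative):

    ```python
    # Sketch of the SUV correction idea: SUV = tissue concentration divided by
    # (administered activity / body weight), with the extravasated dose removed
    # from the injected dose. Assumes tissue density ~1 g/mL; inputs invented.
    def corrected_suv(tissue_conc_kbq_ml, injected_dose_mbq,
                      extravasated_dose_mbq, weight_kg):
        effective_dose_mbq = injected_dose_mbq - extravasated_dose_mbq
        dose_kbq = effective_dose_mbq * 1000.0      # MBq -> kBq
        weight_g = weight_kg * 1000.0               # kg -> g (~mL)
        return tissue_conc_kbq_ml / (dose_kbq / weight_g)

    suv_naive = corrected_suv(5.0, 370.0, 0.0, 70.0)
    suv_fixed = corrected_suv(5.0, 370.0, 0.22 * 370.0, 70.0)  # worst case: 22% extravasated
    print(f"SUV uncorrected = {suv_naive:.2f}, corrected = {suv_fixed:.2f}")
    ```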

  2. Correction for FDG PET dose extravasations: Monte Carlo validation and quantitative evaluation of patient studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Silva-Rodríguez, Jesús, E-mail: jesus.silva.rodriguez@sergas.es; Aguiar, Pablo, E-mail: pablo.aguiar.fernandez@sergas.es; Servicio de Medicina Nuclear, Complexo Hospitalario Universidade de Santiago de Compostela

    Purpose: Current procedure guidelines for whole body [18F]fluoro-2-deoxy-D-glucose (FDG)-positron emission tomography (PET) state that studies with visible dose extravasations should be rejected for quantification protocols. Our work is focused on the development and validation of methods for estimating extravasated doses in order to correct standardized uptake values (SUVs) for this effect in clinical routine. Methods: One thousand three hundred sixty-seven consecutive whole body FDG-PET studies were visually inspected looking for extravasation cases. Two methods for estimating the extravasated dose were proposed and validated in different scenarios using Monte Carlo simulations. All visible extravasations were retrospectively evaluated using a manual ROI based method. In addition, the 50 patients with the highest extravasated doses were also evaluated using a threshold-based method. Results: Simulation studies showed that the proposed methods for estimating extravasated doses allow us to compensate the impact of extravasations on SUV values with an error below 5%. The quantitative evaluation of patient studies revealed that paravenous injection is a relatively frequent effect (18%), with a small fraction of patients presenting considerable extravasations ranging from 1% to a maximum of 22% of the injected dose. A criterion based on the extravasated volume and maximum concentration was established in order to identify the fraction of patients that might be corrected for the paravenous injection effect. Conclusions: The authors propose the use of a manual ROI based method for estimating the effectively administered FDG dose and then correcting SUV quantification in those patients fulfilling the proposed criterion.

  3. Optimization of a shorter variable-acquisition time for legs to achieve true whole-body PET/CT images.

    PubMed

    Umeda, Takuro; Miwa, Kenta; Murata, Taisuke; Miyaji, Noriaki; Wagatsuma, Kei; Motegi, Kazuki; Terauchi, Takashi; Koizumi, Mitsuru

    2017-12-01

    The present study aimed to qualitatively and quantitatively evaluate PET images as a function of acquisition time for various leg sizes, and to optimize a shorter variable-acquisition time protocol for legs to achieve better qualitative and quantitative accuracy of true whole-body PET/CT images. The diameters of the legs to be modeled as phantoms were defined based on data derived from 53 patients. This study analyzed PET images of a NEMA phantom and three plastic bottle phantoms (diameter, 5.68, 8.54 and 10.7 cm) that simulated the human body and legs, respectively. The phantoms comprised two spheres (diameters, 10 and 17 mm) containing fluorine-18 fluorodeoxyglucose solution with sphere-to-background ratios of 4 at a background radioactivity level of 2.65 kBq/mL. All PET data were reconstructed with acquisition times ranging from 10 to 180 s, and 1200 s. We visually evaluated image quality and determined the coefficient of variance (CV) of the background, the contrast and the quantitative %error of the hot spheres, and then determined two shorter variable-acquisition protocols for legs. Lesion detectability and quantitative accuracy based on maximum standardized uptake values (SUVmax) in PET images of a patient using the proposed protocols were also evaluated. A larger phantom and a shorter acquisition time resulted in increased background noise on images and decreased contrast in the hot spheres. A visual score of ≥ 1.5 was obtained when the acquisition time was ≥ 30 s for the three leg phantoms, and ≥ 120 s for the NEMA phantom. The quantitative %errors of the 10- and 17-mm spheres in the leg phantoms were ±15% and ±10%, respectively, in PET images with a high CV (scan < 30 s). The mean SUVmax of three lesions using the current fixed-acquisition and the two proposed variable-acquisition time protocols in the clinical study were 3.1, 3.1 and 3.2, respectively, which did not significantly differ. A leg acquisition time per bed position of even 30-90 s allows axial equalization, uniform image noise and a maximum ±15% quantitative accuracy for the smallest lesion. The overall acquisition time was reduced by 23-42% using the proposed shorter variable-acquisition time rather than the current fixed-acquisition time for imaging legs, indicating that this is a useful and practical protocol for routine qualitative and quantitative PET/CT assessment in the clinical setting.
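
    The two image-quality numbers driving the protocol choice, background CV and sphere %error, are one-liners; a sketch with stand-in ROI values:

    ```python
    # Sketch of the image-quality metrics named above: background coefficient of
    # variance (CV) and quantitative %error of a hot sphere against its known
    # activity. The ROI values below are synthetic stand-ins.
    import numpy as np

    def background_cv(bg_voxels):
        return bg_voxels.std() / bg_voxels.mean() * 100.0

    def percent_error(measured_conc, true_conc):
        return (measured_conc - true_conc) / true_conc * 100.0

    rng = np.random.default_rng(2)
    bg = rng.normal(2.65, 0.3, 1000)    # background ~2.65 kBq/mL, as in the phantoms
    print(f"CV = {background_cv(bg):.1f}%")
    print(f"%error = {percent_error(measured_conc=9.5, true_conc=10.6):+.1f}%")  # 4:1 sphere
    ```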

  4. Study of morphological changes in breast cancer cells MCF-7 under the action of pro-apoptotic agents with laser modulation interference microscope MIM-340

    NASA Astrophysics Data System (ADS)

    Nebogatikov, V.; Nikitiuk, A.; Konysheva, A.; Ignatyev, P.; Grishko, V.; Naimark, O.

    2017-09-01

    Quantitative phase microscopy is a relatively new method for measuring and evaluating microlevel processes; it is characterized by high resolution and offers ample opportunity for quantitative analysis of various parameters, including those of biological specimens. In this study, a laser interference microscope was used to evaluate the state of cancer cells (living and apoptotic). Apoptotic cancer cells were obtained by treating MCF-7 cells with a betulin-based α-bromomethyl ketone (BMK) derivative. Using the microscope, the main differences in the morphometric parameters of living and apoptotic cells, such as height, diameter, perimeter, area and volume, were appraised. Criteria that can be used as markers of apoptosis activation were identified.

  5. Neutron multiplicity counting: Confidence intervals for reconstruction parameters

    DOE PAGES

    Verbeke, Jerome M.

    2016-03-09

    From nuclear materials accountability to homeland security, the need for improved nuclear material detection, assay, and authentication has grown over the past decades. Starting in the 1940s, neutron multiplicity counting techniques have enabled quantitative evaluation of the masses and multiplications of fissile materials. In this paper, we propose a new method to compute uncertainties on these parameters using a model-based sequential Bayesian processor, resulting in credible regions in the fissile material mass and multiplication space. These uncertainties will enable us to quantitatively evaluate proposed improvements to the theoretical fission chain model. Additionally, because the processor can calculate uncertainties in real time, it is a useful tool in applications such as portal monitoring: monitoring can stop as soon as a preset confidence of non-threat is reached.

  6. Feedback Effects of Teaching Quality Assessment: Macro and Micro Evidence

    ERIC Educational Resources Information Center

    Bianchini, Stefano

    2014-01-01

    This study investigates the feedback effects of teaching quality assessment. Previous literature looked separately at the evolution of individual and aggregate scores to understand whether instructor and university performance depends on past evaluations. I propose a new quantitative-based methodology, combining statistical distributions and…

  7. Higher Education Students' Attitudes towards Experiential Learning in International Business

    ERIC Educational Resources Information Center

    Chavan, Meena

    2011-01-01

    Using qualitative and quantitative analysis this paper presents a teaching model based on experiential learning in a large "International Business" unit. Preliminary analysis of 92 student evaluations determined the effectiveness of experiential learning to allow students to explore the association between theory and practice. The…

  8. Development of an epiphyte indicator of nutrient enrichment: Threshold values for seagrass epiphyte load

    EPA Science Inventory

    Metrics of epiphyte load on macrophytes were evaluated for use as quantitative biological indicators for nutrient impacts in estuarine waters, based on review and analysis of the literature on epiphytes and macrophytes, primarily seagrasses, but including some brackish and freshw...

  9. Automatic identification of the reference system based on the fourth ventricular landmarks in T1-weighted MR images.

    PubMed

    Fu, Yili; Gao, Wenpeng; Chen, Xiaoguang; Zhu, Minwei; Shen, Weigao; Wang, Shuguo

    2010-01-01

    The reference system based on the fourth ventricular landmarks (including the fastigial point and ventricular floor plane) is used in medical image analysis of the brain stem. The objective of this study was to develop a rapid, robust, and accurate method for the automatic identification of this reference system on T1-weighted magnetic resonance images. The fully automated method developed in this study consisted of four stages: preprocessing of the data set, expectation-maximization algorithm-based extraction of the fourth ventricle in the region of interest, a coarse-to-fine strategy for identifying the fastigial point, and localization of the base point. The method was evaluated on 27 Brain Web data sets qualitatively and 18 Internet Brain Segmentation Repository data sets and 30 clinical scans quantitatively. The results of qualitative evaluation indicated that the method was robust to rotation, landmark variation, noise, and inhomogeneity. The results of quantitative evaluation indicated that the method was able to identify the reference system with an accuracy of 0.7 ± 0.2 mm for the fastigial point and 1.1 ± 0.3 mm for the base point. It took <6 seconds for the method to identify the related landmarks on a personal computer with an Intel Core 2 6300 processor and 2 GB of random-access memory. The proposed method for the automatic identification of the reference system based on the fourth ventricular landmarks was shown to be rapid, robust, and accurate. The method has potential utility in image registration and computer-aided surgery.

  10. ICan: An Optimized Ion-Current-Based Quantification Procedure with Enhanced Quantitative Accuracy and Sensitivity in Biomarker Discovery

    PubMed Central

    2015-01-01

    The rapidly expanding availability of high-resolution mass spectrometry has substantially enhanced ion-current-based relative quantification techniques. Despite the increasing interest in ion-current-based methods, quantitative sensitivity, accuracy, and false discovery rate remain the major concerns; consequently, comprehensive evaluation and development in these regards are urgently needed. Here we describe a new, integrated procedure for data normalization and protein ratio estimation, termed ICan, for improved ion-current-based analysis of data generated by high-resolution mass spectrometry (MS). ICan achieved significantly better accuracy and precision, and a lower false-positive rate for discovering altered proteins, than current popular pipelines. A spiked-in experiment was used to evaluate the performance of ICan in detecting small changes. In this study, E. coli extracts were spiked with moderate-abundance proteins from human plasma (MAP, enriched by the IgY14-SuperMix procedure) at two different levels to set a small change of 1.5-fold. Forty-five (92%, with an average ratio of 1.71 ± 0.13) of 49 identified MAP proteins (i.e., the true positives) and none of the reference proteins (1.0-fold) were determined as significantly altered proteins, with cutoff thresholds of ≥1.3-fold change and p ≤ 0.05. This is the first study to evaluate and prove competitive performance of the ion-current-based approach for assigning significance to proteins with small changes. By comparison, other methods showed remarkably inferior performance. ICan can be broadly applicable to reliable and sensitive proteomic surveys of multiple biological samples with the use of high-resolution MS. Moreover, many key features evaluated and optimized here, such as normalization, protein ratio determination, and statistical analyses, are also valuable for data analysis by isotope-labeling methods. PMID:25285707
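
    The altered-protein call in the spike-in test combines a fold-change cutoff with a two-sample t-test; a minimal sketch with invented triplicate intensities:

    ```python
    # Sketch of the altered-protein decision rule: flag a protein when
    # |fold change| >= 1.3 and a two-sample t-test gives p <= 0.05.
    # Intensities are illustrative, not from the study.
    import numpy as np
    from scipy.stats import ttest_ind

    def is_altered(group_a, group_b, fold_cutoff=1.3, p_cutoff=0.05):
        ratio = np.mean(group_b) / np.mean(group_a)
        _, p = ttest_ind(group_a, group_b)
        fold = max(ratio, 1.0 / ratio)
        return fold >= fold_cutoff and p <= p_cutoff, ratio, p

    spiked   = is_altered([1.0, 1.1, 0.9], [1.6, 1.5, 1.7])    # true 1.5-fold spike
    unspiked = is_altered([1.0, 1.1, 0.9], [1.02, 0.95, 1.05]) # reference protein
    print(spiked, unspiked)
    ```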

  11. Corneal topography with high-speed swept source OCT in clinical examination

    PubMed Central

    Karnowski, Karol; Kaluzny, Bartlomiej J.; Szkulmowski, Maciej; Gora, Michalina; Wojtkowski, Maciej

    2011-01-01

    We present the applicability of high-speed swept source (SS) optical coherence tomography (OCT) for quantitative evaluation of the corneal topography. A high-speed OCT device acquiring 108,000 lines/s permits dense 3D imaging of the anterior segment in less than a quarter of a second, minimizing the influence of motion artifacts on the final images and topographic analysis. The swept laser performance was specially adapted to meet the imaging depth requirements. For the first time, to our knowledge, we present the results of quantitative corneal analysis based on SS OCT for clinical pathologies: keratoconus, a cornea with a superficial postinfectious scar, and a cornea 5 months after penetrating keratoplasty. Additionally, a comparison with widely used commercial systems, a Placido-based topographer and a Scheimpflug imaging-based topographer, is demonstrated. PMID:21991558

  12. Objective Quantification of Pre-and Postphonosurgery Vocal Fold Vibratory Characteristics Using High-Speed Videoendoscopy and a Harmonic Waveform Model

    ERIC Educational Resources Information Center

    Ikuma, Takeshi; Kunduk, Melda; McWhorter, Andrew J.

    2014-01-01

    Purpose: The model-based quantitative analysis of high-speed videoendoscopy (HSV) data at a low frame rate of 2,000 frames per second was assessed for its clinical adequacy. Stepwise regression was employed to evaluate the HSV parameters using harmonic models and their relationships to the Voice Handicap Index (VHI). Also, the model-based HSV…

  13. Development of a Moodle Course for Schoolchildren's Table Tennis Learning Based on Competence Motivation Theory: Its Effectiveness in Comparison to Traditional Training Method

    ERIC Educational Resources Information Center

    Zou, Junhua; Liu, Qingtang; Yang, Zongkai

    2012-01-01

    Based on Competence Motivation Theory (CMT), a Moodle course for schoolchildren's table tennis learning was developed (The URL is http://www.bssepp.com, and this course allows guest access). The effects of the course on students' knowledge, perceived competence and interest were evaluated through quantitative methods. The sample of the study…

  14. Is there a core neural network in empathy? An fMRI based quantitative meta-analysis.

    PubMed

    Fan, Yan; Duncan, Niall W; de Greck, Moritz; Northoff, Georg

    2011-01-01

    Whilst recent neuroimaging studies have identified a series of different brain regions as being involved in empathy, the consistency of activation of these brain regions and their specific functional roles remain unclear. Using multilevel kernel density analysis (MKDA), a whole-brain based quantitative meta-analysis of recent fMRI studies of empathy was performed. This analysis identified the dACC-aMCC-SMA and bilateral anterior insula as being consistently activated in empathy. Hypothesizing that what are here termed affective-perceptual and cognitive-evaluative forms of empathy might be characterized by different activity patterns, the neural activations in these forms of empathy were compared. The dorsal aMCC was demonstrated to be recruited more frequently in the cognitive-evaluative form of empathy, whilst the right anterior insula was found to be involved in the affective-perceptual form only. The left anterior insula was active in both forms of empathy. It was concluded that the dACC-aMCC-SMA and bilateral insula can be considered as forming a core network in empathy, and that cognitive-evaluative and affective-perceptual empathy can be distinguished at the level of regional activation. Copyright © 2010 Elsevier Ltd. All rights reserved.

  15. Development of in vitro and in vivo neutralization assays based on the pseudotyped H7N9 virus.

    PubMed

    Tian, Yabin; Zhao, Hui; Liu, Qiang; Zhang, Chuntao; Nie, Jianhui; Huang, Weijing; Li, Changgui; Li, Xuguang; Wang, Youchun

    2018-05-31

    H7N9 viral infections pose a great threat to both animal and human health. This avian virus cannot be handled in level 2 biocontainment laboratories, substantially hindering evaluation of prophylactic vaccines and therapeutic agents. Here, we report a high-titer pseudoviral system with a bioluminescent reporter gene, enabling us to visually and quantitatively conduct analyses of virus replications in both tissue cultures and animals. For evaluation of immunogenicity of H7N9 vaccines, we developed an in vitro assay for neutralizing antibody measurement based on the pseudoviral system; results generated by the in vitro assay were found to be strongly correlated with those by either hemagglutination inhibition (HI) or micro-neutralization (MN) assay. Furthermore, we injected the viruses into Balb/c mice and observed dynamic distributions of the viruses in the animals, which provides an ideal imaging model for quantitative analyses of prophylactic and therapeutic monoclonal antibodies. Taken together, the pseudoviral systems reported here could be of great value for both in vitro and in vivo evaluations of vaccines and antiviral agents without the need of wild type H7N9 virus.

  16. In vivo quantitative evaluation of vascular parameters for angiogenesis based on sparse principal component analysis and aggregated boosted trees

    NASA Astrophysics Data System (ADS)

    Zhao, Fengjun; Liu, Junting; Qu, Xiaochao; Xu, Xianhui; Chen, Xueli; Yang, Xiang; Cao, Feng; Liang, Jimin; Tian, Jie

    2014-12-01

    To address the multicollinearity issue and the unequal contributions of vascular parameters in the quantification of angiogenesis, we developed a quantitative evaluation method of vascular parameters for angiogenesis based on in vivo micro-CT imaging of hindlimb ischemia model mice. Taking vascular volume as the ground-truth parameter, nine vascular parameters were first assembled into sparse principal components (PCs) to reduce the multicollinearity issue. Aggregated boosted trees (ABTs) were then employed to analyze the importance of the vascular parameters for the quantification of angiogenesis via the loadings of the sparse PCs. The results demonstrated that vascular volume was mainly characterized by vascular area, vascular junctions, connectivity density, segment number and vascular length, indicating that these are the key vascular parameters for the quantification of angiogenesis. The proposed evaluation method was compared with both ABTs applied directly to the nine vascular parameters and Pearson correlation, and the results were consistent. In contrast to ABTs applied directly to the vascular parameters, the proposed method can select all the key vascular parameters simultaneously, because all the key vascular parameters were assembled into the sparse PCs with the highest relative importance.
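
    A pipeline of this shape can be sketched with off-the-shelf tools; here sklearn's SparsePCA and GradientBoostingRegressor stand in for the paper's sparse PCs and aggregated boosted trees, on synthetic data:

    ```python
    # Hedged sketch of the pipeline: assemble correlated vascular parameters into
    # sparse PCs, then rank the PCs with a boosted-tree regressor's feature
    # importances against vascular volume. GradientBoostingRegressor approximates
    # the aggregated boosted trees (ABTs) used in the paper; data are synthetic.
    import numpy as np
    from sklearn.decomposition import SparsePCA
    from sklearn.ensemble import GradientBoostingRegressor

    rng = np.random.default_rng(3)
    X = rng.normal(size=(60, 9))                           # 9 vascular parameters, 60 limbs
    y = X[:, :2].sum(axis=1) + 0.1 * rng.normal(size=60)   # "vascular volume" (synthetic)

    spca = SparsePCA(n_components=4, alpha=1.0, random_state=0)
    pcs = spca.fit_transform(X)

    gbr = GradientBoostingRegressor(random_state=0).fit(pcs, y)
    for i, (imp, loading) in enumerate(zip(gbr.feature_importances_, spca.components_)):
        used = np.flatnonzero(loading)    # parameters entering this sparse PC
        print(f"PC{i}: importance={imp:.2f}, parameters={used.tolist()}")
    ```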

  17. An Automatic Assessment System of Diabetic Foot Ulcers Based on Wound Area Determination, Color Segmentation, and Healing Score Evaluation.

    PubMed

    Wang, Lei; Pedersen, Peder C; Strong, Diane M; Tulu, Bengisu; Agu, Emmanuel; Ignotz, Ron; He, Qian

    2015-08-07

    For individuals with type 2 diabetes, foot ulcers represent a significant health issue. The aim of this study is to design and evaluate a wound assessment system to help wound clinics assess patients with foot ulcers in a way that complements their current visual examination and manual measurements of their foot ulcers. The physical components of the system consist of an image capture box, a smartphone for wound image capture and a laptop for analyzing the wound image. The wound image assessment algorithms calculate the overall wound area, color segmented wound areas, and a healing score, to provide a quantitative assessment of the wound healing status both for a single wound image and comparisons of subsequent images to an initial wound image. The system was evaluated by assessing foot ulcers for 12 patients in the Wound Clinic at University of Massachusetts Medical School. As performance measures, the Matthews correlation coefficient (MCC) value for the wound area determination algorithm tested on 32 foot ulcer images was .68. The clinical validity of our healing score algorithm relative to the experienced clinicians was measured by Krippendorff's alpha coefficient (KAC) and ranged from .42 to .81. Our system provides a promising real-time method for wound assessment based on image analysis. Clinical comparisons indicate that the optimized mean-shift-based algorithm is well suited for wound area determination. Clinical evaluation of our healing score algorithm shows its potential to provide clinicians with a quantitative method for evaluating wound healing status. © 2015 Diabetes Technology Society.
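
    The MCC reported for the wound-area algorithm is computed from pixel-wise confusion counts between the predicted and ground-truth masks; a minimal sketch:

    ```python
    # Matthews correlation coefficient (MCC) from pixel-wise confusion counts
    # between a predicted wound mask and ground truth. Masks are synthetic.
    import numpy as np

    def mcc(pred, truth):
        tp = np.sum(pred & truth);  tn = np.sum(~pred & ~truth)
        fp = np.sum(pred & ~truth); fn = np.sum(~pred & truth)
        denom = np.sqrt(float((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)))
        return (tp * tn - fp * fn) / denom if denom else 0.0

    truth = np.zeros((64, 64), bool); truth[10:40, 10:40] = True
    pred  = np.zeros((64, 64), bool); pred[12:42, 8:38] = True
    print(f"MCC = {mcc(pred, truth):.2f}")
    ```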

  18. Averaged ratio between complementary profiles for evaluating shape distortions of map projections and spherical hierarchical tessellations

    NASA Astrophysics Data System (ADS)

    Yan, Jin; Song, Xiao; Gong, Guanghong

    2016-02-01

    We describe a metric named averaged ratio between complementary profiles to represent the distortion of map projections, and the shape regularity of spherical cells derived from map projections or non-map-projection methods. The properties and statistical characteristics of our metric are investigated. Our metric (1) is a variable of numerical equivalence to both scale component and angular deformation component of Tissot indicatrix, and avoids the invalidation when using Tissot indicatrix and derived differential calculus for evaluating non-map-projection based tessellations where mathematical formulae do not exist (e.g., direct spherical subdivisions), (2) exhibits simplicity (neither differential nor integral calculus) and uniformity in the form of calculations, (3) requires low computational cost, while maintaining high correlation with the results of differential calculus, (4) is a quasi-invariant under rotations, and (5) reflects the distortions of map projections, distortion of spherical cells, and the associated distortions of texels. As an indicator of quantitative evaluation, we investigated typical spherical tessellation methods, some variants of tessellation methods, and map projections. The tessellation methods we evaluated are based on map projections or direct spherical subdivisions. The evaluation involves commonly used Platonic polyhedrons, Catalan polyhedrons, etc. Quantitative analyses based on our metric of shape regularity and an essential metric of area uniformity implied that (1) Uniform Spherical Grids and its variant show good qualities in both area uniformity and shape regularity, and (2) Crusta, Unicube map, and a variant of Unicube map exhibit fairly acceptable degrees of area uniformity and shape regularity.

  19. Quantitative 4D Transcatheter Intraarterial Perfusion MR Imaging as a Method to Standardize Angiographic Chemoembolization Endpoints

    PubMed Central

    Jin, Brian; Wang, Dingxin; Lewandowski, Robert J.; Ryu, Robert K.; Sato, Kent T.; Larson, Andrew C.; Salem, Riad; Omary, Reed A.

    2011-01-01

    PURPOSE We aimed to test the hypothesis that subjective angiographic endpoints during transarterial chemoembolization (TACE) of hepatocellular carcinoma (HCC) exhibit consistency and correlate with objective intraprocedural reductions in tumor perfusion as determined by quantitative four-dimensional (4D) transcatheter intraarterial perfusion (TRIP) magnetic resonance (MR) imaging. MATERIALS AND METHODS This prospective study was approved by the institutional review board. Eighteen consecutive patients underwent TACE in a combined MR/interventional radiology (MR-IR) suite. Three board-certified interventional radiologists independently graded the angiographic endpoint of each procedure based on a previously described subjective angiographic chemoembolization endpoint (SACE) scale. A consensus SACE rating was established for each patient. Patients underwent quantitative 4D TRIP-MR imaging immediately before and after TACE, from which mean whole tumor perfusion (Fρ) was calculated. Consistency of SACE ratings between observers was evaluated using the intraclass correlation coefficient (ICC). The relationship between SACE ratings and intraprocedural TRIP-MR imaging perfusion changes was evaluated using Spearman's rank correlation coefficient. RESULTS The SACE rating scale demonstrated very good consistency among all observers (ICC = 0.80). The consensus SACE rating was significantly correlated with both absolute (r = 0.54, P = 0.022) and percent (r = 0.85, P < 0.001) intraprocedural perfusion reduction. CONCLUSION The SACE rating scale demonstrates very good consistency between raters, and significantly correlates with objectively measured intraprocedural perfusion reductions during TACE. These results support the use of the SACE scale as a standardized alternative method to quantitative 4D TRIP-MR imaging to classify patients based on embolic endpoints of TACE. PMID:22021520
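
    Both statistics reported above are straightforward to reproduce; a sketch assuming a ratings matrix of shape (patients, raters) and paired perfusion reductions (variable names illustrative):

      import numpy as np
      from scipy.stats import spearmanr

      def icc_2_1(ratings):
          """Shrout-Fleiss ICC(2,1): two-way random effects, absolute agreement."""
          n, k = ratings.shape
          grand = ratings.mean()
          msr = k * np.sum((ratings.mean(axis=1) - grand) ** 2) / (n - 1)
          msc = n * np.sum((ratings.mean(axis=0) - grand) ** 2) / (k - 1)
          resid = (ratings - ratings.mean(axis=1, keepdims=True)
                   - ratings.mean(axis=0) + grand)
          mse = np.sum(resid ** 2) / ((n - 1) * (k - 1))
          return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

      # rho, p = spearmanr(consensus_sace, percent_perfusion_reduction)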

  20. Capillary nano-immunoassays: advancing quantitative proteomics analysis, biomarker assessment, and molecular diagnostics.

    PubMed

    Chen, Jin-Qiu; Wakefield, Lalage M; Goldstein, David J

    2015-06-06

    There is an emerging demand for the use of molecular profiling to facilitate biomarker identification and development, and to stratify patients for more efficient treatment decisions with reduced adverse effects. In the past decade, great strides have been made to advance genomic, transcriptomic and proteomic approaches to address these demands. While there has been much progress with these large-scale approaches, profiling at the protein level still faces challenges due to limitations in clinical sample size, poor reproducibility, unreliable quantitation, and lack of assay robustness. A novel automated capillary nano-immunoassay (CNIA) technology has been developed. This technology offers precise and accurate measurement of proteins and their post-translational modifications using either charge-based or size-based separation formats. The system not only uses ultralow nanogram levels of protein but also allows multi-analyte analysis using a parallel single-analyte format for increased sensitivity and specificity. The high sensitivity and excellent reproducibility of this technology make it particularly powerful for analysis of clinical samples. Furthermore, the system can distinguish and detect specific protein post-translational modifications that conventional Western blot and other immunoassays cannot easily capture. This review will summarize and evaluate the latest progress to optimize the CNIA system for comprehensive, quantitative protein and signaling event characterization. It will also discuss how the technology has been successfully applied in both discovery research and clinical studies, for signaling pathway dissection, proteomic biomarker assessment, targeted treatment evaluation and quantitative proteomic analysis. Lastly, a comparison of this novel system with other conventional immunoassay platforms is performed.

  1. Dynamic and quantitative evaluation of degenerative mitral valve disease: a dedicated framework based on cardiac magnetic resonance imaging

    PubMed Central

    Onorati, Francesco; Puppini, Giovanni; Pappalardo, Omar A.; Selmi, Matteo; Votta, Emiliano; Faggian, Giuseppe; Redaelli, Alberto

    2017-01-01

    Background Accurate quantification of mitral valve (MV) morphology and dynamic behavior over the cardiac cycle is crucial to understand the mechanisms of degenerative MV dysfunction and to guide surgical intervention. Cardiac magnetic resonance (CMR) imaging has progressively been adopted to evaluate MV pathophysiology, although a dedicated framework is required to perform a quantitative assessment of the functional MV anatomy. Methods We investigated MV dynamic behavior in subjects with normal MV anatomy (n=10) and patients referred to surgery due to degenerative MV prolapse, classified as fibro-elastic deficiency (FED, n=9) and Barlow's disease (BD, n=10). A CMR-dedicated framework was adopted to evaluate prolapse height and volume and quantitatively assess valvular morphology and papillary muscles (PAPs) function over the cardiac cycle. Multiple comparisons were used to investigate the hallmarks associated with MV degenerative prolapse and evaluate the feasibility of anatomical and functional distinction between FED and BD phenotypes. Results On average, annular dimensions were significantly (P<0.05) larger in BD than in FED and normal subjects, while no significant differences were noticed between FED and normal. MV eccentricity progressively decreased passing from normal to FED and BD, with the latter exhibiting a rounder annulus shape. Over the cardiac cycle, we noticed significant differences for BD during systole, with an abnormal annular enlargement between mid and late systole (LS) (P<0.001 vs. normal); the PAPs dynamics remained comparable in the three groups. Prolapse height and volume highlighted significant differences among normal, FED and BD valves. Conclusions Our CMR-dedicated framework allows for the quantitative and dynamic evaluation of the MV apparatus, with quantifiable annular alterations representing the primary hallmark of severe MV degeneration. This may aid surgeons in the evaluation of the severity of MV dysfunction and the selection of the appropriate MV treatment. PMID:28540065

  2. Virus replication as a phenotypic version of polynucleotide evolution.

    PubMed

    Antoneli, Fernando; Bosco, Francisco; Castro, Diogo; Janini, Luiz Mario

    2013-04-01

    In this paper, we revisit and adapt to viral evolution an approach based on the theory of branching processes advanced by Demetrius et al. (Bull. Math. Biol. 46:239-262, 1985) in their study of polynucleotide evolution. By taking into account beneficial effects, we obtain a non-trivial multivariate generalization of their single-type branching process model. Perturbative techniques allow us to obtain analytical asymptotic expressions for the main global parameters of the model, which lead to the following rigorous results: (i) a new criterion for "no sure extinction", (ii) a generalization and proof, for this particular class of models, of the lethal mutagenesis criterion proposed by Bull et al. (J. Virol. 18:2930-2939, 2007), (iii) a new proposal for the notion of relaxation time with a quantitative prescription for its evaluation, (iv) a quantitative description of the evolution of the expected values in four distinct "stages": extinction threshold, lethal mutagenesis, stationary "equilibrium", and transient. Finally, based on these quantitative results, we are able to draw some qualitative conclusions.
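
    For a "no sure extinction" criterion, the extinction probability of a multitype branching process is the minimal fixed point of the offspring probability generating function (PGF); a toy sketch, where the two-type PGF below is invented for illustration and is not the paper's model:

      import numpy as np

      def extinction_prob(pgf, n_types, tol=1e-12, max_iter=100000):
          """Iterate q <- f(q) from 0 to the minimal fixed point in [0,1]^n;
          extinction is not certain iff some component stays below 1."""
          q = np.zeros(n_types)
          for _ in range(max_iter):
              q_next = pgf(q)
              if np.max(np.abs(q_next - q)) < tol:
                  break
              q = q_next
          return q

      # hypothetical PGF: f0 = 0.3 + 0.5*s0^2 + 0.2*s1, f1 = 0.4 + 0.6*s1^2
      pgf = lambda s: np.array([0.3 + 0.5 * s[0] ** 2 + 0.2 * s[1],
                                0.4 + 0.6 * s[1] ** 2])
      print(extinction_prob(pgf, 2))  # ~[0.635, 0.667] < 1: no sure extinction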

  3. Shot noise-limited Cramér-Rao bound and algorithmic sensitivity for wavelength shifting interferometry

    NASA Astrophysics Data System (ADS)

    Chen, Shichao; Zhu, Yizheng

    2017-02-01

    Sensitivity is a critical index for measuring the temporal fluctuation of the retrieved optical pathlength in a quantitative phase imaging system. However, an accurate and comprehensive analysis for sensitivity evaluation is still lacking in the current literature. In particular, previous theoretical studies of fundamental sensitivity based on Gaussian noise models are not applicable to modern cameras and detectors, which are dominated by shot noise. In this paper, we derive two shot noise-limited theoretical sensitivities, the Cramér-Rao bound and the algorithmic sensitivity, for wavelength shifting interferometry, which is a major category of on-axis interferometry techniques in quantitative phase imaging. Based on the derivations, we show that the shot noise-limited model permits accurate estimation of theoretical sensitivities directly from measured data. These results can provide important insights into fundamental constraints on system performance and can be used to guide system design and optimization. The same concepts can be generalized to other quantitative phase imaging techniques as well.
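
    For Poisson (shot-noise) statistics the Cramér-Rao bound follows from the Fisher information of the fringe signal; a numerical sketch for frame intensities mu_k = A + B*cos(phi + delta_k), with illustrative photon numbers (not the paper's derivation in closed form):

      import numpy as np

      def phase_crb(A, B, deltas, phi):
          """Shot-noise-limited CRB on the estimated phase (radians) for
          Poisson-distributed frame intensities mu_k = A + B*cos(phi + delta_k)."""
          deltas = np.asarray(deltas)
          mu = A + B * np.cos(phi + deltas)
          dmu = -B * np.sin(phi + deltas)
          fisher = np.sum(dmu ** 2 / mu)   # Fisher information for Poisson data
          return 1.0 / np.sqrt(fisher)

      # e.g., four wavelength shifts, 1e6 photons/frame, fringe contrast 0.9
      print(phase_crb(1e6, 0.9e6, np.arange(4) * np.pi / 2, phi=0.3))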

  4. Effects of normalization on quantitative traits in association test

    PubMed Central

    2009-01-01

    Background Quantitative trait loci analysis assumes that the trait is normally distributed. In reality, this is often not observed and one strategy is to transform the trait. However, it is not clear how much normality is required and which transformation works best in association studies. Results We performed simulations on four types of common quantitative traits to evaluate the effects of normalization using the logarithm, Box-Cox, and rank-based transformations. The impact of sample size and genetic effects on normalization is also investigated. Our results show that rank-based transformation generally gives the best and most consistent performance in identifying the causal polymorphism and ranking it highly in association tests, with a slight increase in false positive rate. Conclusion For small sample size or genetic effects, the improvement in sensitivity for rank transformation outweighs the slight increase in false positive rate. However, for large sample size and genetic effects, normalization may not be necessary since the increase in sensitivity is relatively modest. PMID:20003414
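
    A sketch of the three transformations compared above; the Blom offset of 3/8 is one common choice for the rank-based inverse normal transform (the simulated trait is illustrative):

      import numpy as np
      from scipy.stats import boxcox, norm, rankdata

      def rank_inverse_normal(x, c=3.0 / 8):
          """Rank-based inverse normal transform with Blom offset."""
          r = rankdata(x)
          return norm.ppf((r - c) / (len(x) - 2 * c + 1))

      x = np.random.default_rng(0).lognormal(size=500)  # a skewed trait
      x_log = np.log(x)               # logarithm transform
      x_bc, lam = boxcox(x)           # Box-Cox, lambda fit by maximum likelihood
      x_rin = rank_inverse_normal(x)  # ~N(0,1) whatever the original shape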

  5. Quantitative analysis of multiple sclerosis: a feasibility study

    NASA Astrophysics Data System (ADS)

    Li, Lihong; Li, Xiang; Wei, Xinzhou; Sturm, Deborah; Lu, Hongbing; Liang, Zhengrong

    2006-03-01

    Multiple Sclerosis (MS) is an inflammatory and demyelinating disorder of the central nervous system with a presumed immune-mediated etiology. For treatment of MS, the measurements of white matter (WM), gray matter (GM), and cerebrospinal fluid (CSF) are often used in conjunction with clinical evaluation to provide a more objective measure of MS burden. In this paper, we apply a new unifying automatic mixture-based algorithm for segmentation of brain tissues to quantitatively analyze MS. The method takes into account the following effects that commonly appear in MR imaging: 1) The MR data is modeled as a stochastic process with an inherent inhomogeneity effect of smoothly varying intensity; 2) A new partial volume (PV) model is built in establishing the maximum a posteriori (MAP) segmentation scheme; 3) Noise artifacts are minimized by an a priori Markov random field (MRF) penalty indicating neighborhood correlation from tissue mixture. The volumes of brain tissues (WM, GM) and CSF are extracted from the mixture-based segmentation. Experimental results of feasibility studies on quantitative analysis of MS are presented.
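
    A deliberately simplified stand-in for the mixture step (a plain Gaussian mixture on voxel intensities, without the paper's inhomogeneity correction, PV model, or MRF penalty):

      import numpy as np
      from sklearn.mixture import GaussianMixture

      def segment_tissues(intensities, n_classes=3, seed=0):
          """Cluster voxel intensities into WM/GM/CSF-like classes; tissue
          volumes follow from label counts times the voxel volume."""
          gmm = GaussianMixture(n_components=n_classes, random_state=seed)
          labels = gmm.fit_predict(intensities.reshape(-1, 1))
          return labels, gmm.means_.ravel()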

  6. Towards Measurement of Confidence in Safety Cases

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Pai, Ganesh J.; Habli, Ibrahim

    2011-01-01

    Arguments in safety cases are predominantly qualitative. This is partly attributed to the lack of sufficient design and operational data necessary to measure the achievement of high-dependability targets, particularly for safety-critical functions implemented in software. The subjective nature of many forms of evidence, such as expert judgment and process maturity, also contributes to the overwhelming dependence on qualitative arguments. However, where data for quantitative measurements is systematically collected, quantitative arguments offer far greater benefits than qualitative arguments in assessing confidence in the safety case. In this paper, we propose a basis for developing and evaluating integrated qualitative and quantitative safety arguments based on the Goal Structuring Notation (GSN) and Bayesian Networks (BN). The approach we propose identifies structures within GSN-based arguments where uncertainties can be quantified. BN are then used to provide a means to reason about confidence in a probabilistic way. We illustrate our approach using a fragment of a safety case for an unmanned aerial system and conclude with some preliminary observations.

  7. A CT-based software tool for evaluating compensator quality in passively scattered proton therapy

    NASA Astrophysics Data System (ADS)

    Li, Heng; Zhang, Lifei; Dong, Lei; Sahoo, Narayan; Gillin, Michael T.; Zhu, X. Ronald

    2010-11-01

    We have developed a quantitative computed tomography (CT)-based quality assurance (QA) tool for evaluating the accuracy of manufactured compensators used in passively scattered proton therapy. The thickness of a manufactured compensator was measured from its CT images and compared with the planned thickness defined by the treatment planning system. The difference between the measured and planned thicknesses was calculated with use of the Euclidean distance transformation and the kd-tree search method. Compensator accuracy was evaluated by examining several parameters including mean distance, maximum distance, global thickness error and central axis shifts. Two rectangular phantoms were used to validate the performance of the QA tool. Nine patients and 20 compensators were included in this study. We found that mean distances, global thickness errors and central axis shifts were all within 1 mm for all compensators studied, with maximum distances ranging from 1.1 to 3.8 mm. Although all compensators passed manual verification at selected points, about 5% of the pixels still had maximum distances of >2 mm, most of which correlated with large depth gradients. The correlation between the mean depth gradient of the compensator and the percentage of pixels with mean distance <1 mm is -0.93 with p < 0.001, which suggests that the mean depth gradient is a good indicator of compensator complexity. These results demonstrate that the CT-based compensator QA tool can be used to quantitatively evaluate manufactured compensators.
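
    A sketch of the kd-tree distance step, assuming the planned and CT-measured compensator surfaces are point clouds in millimetres (function and variable names illustrative):

      import numpy as np
      from scipy.spatial import cKDTree

      def surface_distances(planned_pts, measured_pts):
          """Nearest-neighbour distance from each measured surface point to
          the planned surface; summarized as mean, max, and the fraction of
          points within a 1 mm tolerance."""
          d, _ = cKDTree(planned_pts).query(measured_pts)
          return d.mean(), d.max(), np.mean(d < 1.0)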

  8. Toward Quantitative Small Animal Pinhole SPECT: Assessment of Quantitation Accuracy Prior to Image Compensations

    PubMed Central

    Chen, Chia-Lin; Wang, Yuchuan; Lee, Jason J. S.; Tsui, Benjamin M. W.

    2011-01-01

    Purpose We assessed the quantitation accuracy of small animal pinhole single photon emission computed tomography (SPECT) under the current preclinical settings, where image compensations are not routinely applied. Procedures The effects of several common image-degrading factors and imaging parameters on quantitation accuracy were evaluated using Monte-Carlo simulation methods. Typical preclinical imaging configurations were modeled, and quantitative analyses were performed based on image reconstructions without compensating for attenuation, scatter, and limited system resolution. Results Using mouse-sized phantom studies as examples, attenuation effects alone degraded quantitation accuracy by up to −18% (Tc-99m or In-111) or −41% (I-125). The inclusion of scatter effects changed the above numbers to −12% (Tc-99m or In-111) and −21% (I-125), respectively, indicating the significance of scatter in quantitative I-125 imaging. Region-of-interest (ROI) definitions have greater impacts on regional quantitation accuracy for small sphere sources as compared to attenuation and scatter effects. For the same ROI, SPECT acquisitions using pinhole apertures of different sizes could significantly affect the outcome, whereas the use of different radii-of-rotation yielded negligible differences in quantitation accuracy for the imaging configurations simulated. Conclusions We have systematically quantified the influence of several factors affecting the quantitation accuracy of small animal pinhole SPECT. In order to consistently achieve accurate quantitation within 5% of the truth, comprehensive image compensation methods are needed. PMID:19048346

  9. Mapping Quantitative Traits in Unselected Families: Algorithms and Examples

    PubMed Central

    Dupuis, Josée; Shi, Jianxin; Manning, Alisa K.; Benjamin, Emelia J.; Meigs, James B.; Cupples, L. Adrienne; Siegmund, David

    2009-01-01

    Linkage analysis has been widely used to identify, from family data, genetic variants influencing quantitative traits. Common approaches have both strengths and limitations. Likelihood ratio tests typically computed in variance component analysis can accommodate large families but are highly sensitive to departure from normality assumptions. Regression-based approaches are more robust but their use has primarily been restricted to nuclear families. In this paper, we develop methods for mapping quantitative traits in moderately large pedigrees. Our methods are based on the score statistic which, in contrast to the likelihood ratio statistic, can use nonparametric estimators of variability to achieve robustness of the false positive rate against departures from the hypothesized phenotypic model. Because the score statistic is easier to calculate than the likelihood ratio statistic, our basic mapping methods utilize relatively simple computer code that performs statistical analysis on output from any program that computes estimates of identity-by-descent. This simplicity also permits development and evaluation of methods to deal with multivariate and ordinal phenotypes, and with gene-gene and gene-environment interaction. We demonstrate our methods on simulated data and on fasting insulin, a quantitative trait measured in the Framingham Heart Study. PMID:19278016

  10. Quantitative analysis of amygdalin and prunasin in Prunus serotina Ehrh. using 1H-NMR spectroscopy.

    PubMed

    Santos Pimenta, Lúcia P; Schilthuizen, Menno; Verpoorte, Robert; Choi, Young Hae

    2014-01-01

    Prunus serotina is native to North America but has been invasively introduced in Europe since the seventeenth century. This plant contains cyanogenic glycosides that are believed to be related to its success as an invasive plant. For these compounds, chromatographic- or spectrometric-based (targeting HCN hydrolysis) methods of analysis have been employed so far. However, the conventional methods require tedious preparation steps and a long measuring time. To develop a fast and simple method to quantify the cyanogenic glycosides, amygdalin and prunasin, in dried Prunus serotina leaves without any pre-purification steps using 1H-NMR spectroscopy. Extracts of Prunus serotina leaves using CH3OH-d4 and KH2PO4 buffer in D2O (1:1) were quantitatively analysed for amygdalin and prunasin using 1H-NMR spectroscopy. Different internal standards were evaluated for accuracy and stability. The purity of quantitated 1H-NMR signals was evaluated using several two-dimensional NMR experiments. Trimethylsilylpropionic acid sodium salt-d4 proved most suitable as the internal standard for quantitative 1H-NMR analysis. Two-dimensional J-resolved NMR was shown to be a useful tool to confirm the structures and to check for possible signal overlapping with the target signals for the quantitation. Twenty-two samples of P. serotina were subsequently quantitatively analysed for the cyanogenic glycosides prunasin and amygdalin. The NMR method offers a fast, high-throughput analysis of cyanogenic glycosides in dried leaves permitting simultaneous quantification and identification of prunasin and amygdalin in Prunus serotina. Copyright © 2013 John Wiley & Sons, Ltd.
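
    Quantitation against an internal standard follows the standard qHNMR purity relation; a minimal sketch (symbols: signal integral I, number of protons N, molar mass M, weighed mass m, purity P; all values are inputs, none are from the paper):

      def qhnmr_purity(I_a, N_a, M_a, I_std, N_std, M_std,
                       m_std, m_sample, P_std):
          """Purity (w/w) of analyte 'a' against internal standard 'std':
          P_a = (I_a/I_std)*(N_std/N_a)*(M_a/M_std)*(m_std/m_sample)*P_std."""
          return (I_a / I_std) * (N_std / N_a) * (M_a / M_std) \
                 * (m_std / m_sample) * P_std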

  11. Evolution of Quantitative Measures in NMR: Quantum Mechanical qHNMR Advances Chemical Standardization of a Red Clover (Trifolium pratense) Extract

    PubMed Central

    2017-01-01

    Chemical standardization, along with morphological and DNA analysis ensures the authenticity and advances the integrity evaluation of botanical preparations. Achievement of a more comprehensive, metabolomic standardization requires simultaneous quantitation of multiple marker compounds. Employing quantitative 1H NMR (qHNMR), this study determined the total isoflavone content (TIfCo; 34.5–36.5% w/w) via multimarker standardization and assessed the stability of a 10-year-old isoflavone-enriched red clover extract (RCE). Eleven markers (nine isoflavones, two flavonols) were targeted simultaneously, and outcomes were compared with LC-based standardization. Two advanced quantitative measures in qHNMR were applied to derive quantities from complex and/or overlapping resonances: a quantum mechanical (QM) method (QM-qHNMR) that employs 1H iterative full spin analysis, and a non-QM method that uses linear peak fitting algorithms (PF-qHNMR). A 10 min UHPLC-UV method provided auxiliary orthogonal quantitation. This is the first systematic evaluation of QM and non-QM deconvolution as qHNMR quantitation measures. It demonstrates that QM-qHNMR can account successfully for the complexity of 1H NMR spectra of individual analytes and how QM-qHNMR can be built for mixtures such as botanical extracts. The contents of the main bioactive markers were in good agreement with earlier HPLC-UV results, demonstrating the chemical stability of the RCE. QM-qHNMR advances chemical standardization by its inherent QM accuracy and the use of universal calibrants, avoiding the impractical need for identical reference materials. PMID:28067513

  12. The Evaluator's Perspective: Evaluating the State Capacity Building Program.

    ERIC Educational Resources Information Center

    Madey, Doren L.

    A historical antagonism between the advocates of quantitative evaluation methods and the proponents of qualitative evaluation methods has stymied the recognition of the value to be gained by utilizing both methodologies in the same study. The integration of quantitative and qualitative methods within a single evaluation has synergistic effects in…

  13. Computer-aided diagnosis of prostate cancer in the peripheral zone using multiparametric MRI

    NASA Astrophysics Data System (ADS)

    Niaf, Emilie; Rouvière, Olivier; Mège-Lechevallier, Florence; Bratan, Flavie; Lartizien, Carole

    2012-06-01

    This study evaluated a computer-assisted diagnosis (CADx) system for determining a likelihood measure of prostate cancer presence in the peripheral zone (PZ) based on multiparametric magnetic resonance (MR) imaging, including T2-weighted, diffusion-weighted and dynamic contrast-enhanced MRI at 1.5 T. Based on a feature set derived from grey-level images, including first-order statistics, Haralick features, gradient features, semi-quantitative and quantitative (pharmacokinetic modelling) dynamic parameters, four kinds of classifiers were trained and compared: nonlinear support vector machine (SVM), linear discriminant analysis, k-nearest neighbours and naïve Bayes classifiers. A set of feature selection methods based on t-test, mutual information and minimum-redundancy-maximum-relevancy criteria were also compared. The aim was to discriminate between the relevant features as well as to create an efficient classifier using these features. The diagnostic performances of these different CADx schemes were evaluated based on a receiver operating characteristic (ROC) curve analysis. The evaluation database consisted of 30 sets of multiparametric MR images acquired from radical prostatectomy patients. Using histologic sections as the gold standard, both cancer and nonmalignant (but suspicious) tissues were annotated in consensus on all MR images by two radiologists, a histopathologist and a researcher. Benign tissue regions of interest (ROIs) were also delineated in the remaining prostate PZ. This resulted in a series of 42 cancer ROIs, 49 benign but suspicious ROIs and 124 nonsuspicious benign ROIs. From the outputs of all evaluated feature selection methods on the test bench, a restricted set of about 15 highly informative features coming from all MR sequences was discriminated, thus confirming the validity of the multiparametric approach. Quantitative evaluation of the diagnostic performance yielded a maximal area under the ROC curve (AUC) of 0.89 (0.81-0.94) for the discrimination of the malignant versus nonmalignant tissues and 0.82 (0.73-0.90) for the discrimination of the malignant versus suspicious tissues when combining the t-test feature selection approach with a SVM classifier. A preliminary comparison showed that the optimal CADx scheme mimicked, in terms of AUC, the human experts in differentiating malignant from suspicious tissues, thus demonstrating its potential for assisting cancer identification in the PZ.
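
    A compact sketch of the best-performing pipeline reported above (a univariate ANOVA F filter, equivalent to a t-test filter for two classes, feeding an SVM), assuming an ROI-by-feature matrix; this is a generic stand-in, not the authors' code:

      from sklearn.feature_selection import SelectKBest, f_classif
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      def cadx_auc(X, y, k=15):
          """Cross-validated ROC AUC; X: ROIs x multiparametric features,
          y: 1 = malignant, 0 = benign/suspicious."""
          clf = make_pipeline(StandardScaler(),
                              SelectKBest(f_classif, k=k),
                              SVC(kernel="rbf"))
          return cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()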

  14. Post Advanced Technology Implementation Effects on School Psychologist Job Performance

    ERIC Educational Resources Information Center

    Hobson, Rana Dirice

    2017-01-01

    The technology acceptance model (TAM) has been widely used to assess technology adoption in business, education, and health care. The New York City Department of Education (NYCDOE) launched a web-based Individualized Educational Program (IEP) system for school psychologists to use in conducting evaluations and reviews. This quantitative study…

  15. High-fidelity detection of crop biomass quantitative trait loci from low-cost imaging in the field

    USDA-ARS?s Scientific Manuscript database

    Field-based, rapid, and non-destructive techniques for assessing plant productivity can accelerate the discovery of genotype-to-phenotype relationships needed to improve next-generation biomass grass crops. The use of hemispherical imaging and light attenuation modeling was evaluated against destruc...

  16. The Intercultural Sensitivity of Chilean Teachers Serving an Immigrant Population in Schools

    ERIC Educational Resources Information Center

    Morales Mendoza, Karla; Sanhueza Henríquez, Susan; Friz Carrillo, Miguel; Riquelme Bravo, Paula

    2017-01-01

    The objective of this article is to evaluate the intercultural sensitivity of teachers working in culturally diverse classrooms, and to analyse differences in intercultural sensitivity based on the gender, age, training (advanced training courses), and intercultural experience of the teachers. A quantitative approach with a comparative descriptive…

  17. Developing a Survey of Transformative Learning Outcomes and Processes Based on Theoretical Principles

    ERIC Educational Resources Information Center

    Stuckey, Heather L.; Taylor, Edward W.; Cranton, Patricia

    2013-01-01

    The purpose of this research was to develop an inclusive evaluation of "transformative learning theory" that encompassed varied perspectives of transformative learning. We constructed a validated quantitative survey to assess the potential outcomes and processes of how transformative learning may be experienced by college-educated…

  18. Peer Instruction: An Evaluation of Its Theory, Application, and Contribution

    ERIC Educational Resources Information Center

    Gok, Tolga; Gok, Ozge

    2017-01-01

    The present study reviewed many qualitative and quantitative studies of peer instruction, an interactive engagement method used in many different disciplines and courses. The researchers examined the effects of peer instruction on students' cognitive skills (conceptual learning, problem solving, reasoning ability, etc.) and…

  19. Rapid and Specific Method for Evaluating Streptomyces Competitive Dynamics in Complex Soil Communities

    USDA-ARS?s Scientific Manuscript database

    Quantifying target microbial populations in complex communities remains a barrier to studying species interactions in soil environments. Quantitative real-time PCR (qPCR) offers a rapid and specific means to assess populations of target microorganisms. SYBR Green and TaqMan-based qPCR assays were de...

  20. Evaluation of Instructional Design Capabilities of Asynchronous and Synchronous Instruction

    ERIC Educational Resources Information Center

    Garrett, Kristi N.; Benson, Angela D.

    2017-01-01

    From a quantitative perspective, this study examined the instructional design knowledge of higher education instructors and others within the instructional design/technology arena who are members of a global education-based Internet forum. Results showed a significant difference in opinions between genders, where males were more inclined to…

  1. Developing International Managers: The Contribution of Cultural Experience to Learning

    ERIC Educational Resources Information Center

    Townsend, Peter; Regan, Padraic; Li, Liang Liang

    2015-01-01

    Purpose: The purpose of this paper is to evaluate cultural experience as a learning strategy for developing international managers. Design/methodology/approach: Using an integrated framework, two quantitative studies, based on empirical methodology, are conducted. Study 1, with an undergraduate sample situated in the Asia Pacific, aimed to examine…

  2. An Investigation of Basic Design Capacity Performance in Different Background Students

    ERIC Educational Resources Information Center

    Cheng, Chu-Yu; Ou, Yang-Kun

    2017-01-01

    The technological and vocational higher education system in Taiwan is offering an undergraduate degree for design-based vocational high school students and general high school students whose qualitative and quantitative abilities are evaluated through a student selection examination. This study focused on the conceptual understandings of 64…

  3. Improving Intelligibility: Guided Reflective Journals in Action

    ERIC Educational Resources Information Center

    Lear, Emmaline L.

    2014-01-01

    This study explores the effectiveness of guided reflective journals to improve intelligibility in a Japanese higher educational context. Based on qualitative and quantitative methods, the paper evaluates changes in speech over the duration of one semester. In particular, this study focuses on changes in prosodic features such as stress, intonation…

  4. Perceived School Effectiveness: Case Study of a Liverpool College

    ERIC Educational Resources Information Center

    Samy, M.; Cook, K.

    2009-01-01

    Purpose: A quantitative effectiveness measurement based on the perceptions of the local community has been established as an effective mode of evaluating the level of satisfaction or perceived effectiveness of a school. In order to measure the level of effectiveness as perceived by their communities, educational institutions could use this…

  5. 77 FR 5797 - Draft Toxicological Review of Vanadium Pentoxide: In Support of Summary Information on the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-06

    ... Office of Research and Development. EPA is releasing this draft assessment for the purposes of public... health assessment program that evaluates quantitative and qualitative risk information on effects that..., EPA provides the highest quality science- based human health assessments to support the Agency's...

  6. Evaluating Computer-Related Incidents on Campus

    ERIC Educational Resources Information Center

    Rothschild, Daniel; Rezmierski, Virginia

    2004-01-01

    The Computer Incident Factor Analysis and Categorization (CIFAC) Project at the University of Michigan began in September 2003 with grants from EDUCAUSE and the National Science Foundation (NSF). The project's primary goal is to create a best-practices security framework for colleges and universities based on rigorous quantitative analysis of…

  7. 2D Hydrodynamic Based Logic Modeling Tool for River Restoration Decision Analysis: A Quantitative Approach to Project Prioritization

    NASA Astrophysics Data System (ADS)

    Bandrowski, D.; Lai, Y.; Bradley, N.; Gaeuman, D. A.; Murauskas, J.; Som, N. A.; Martin, A.; Goodman, D.; Alvarez, J.

    2014-12-01

    In the field of river restoration sciences there is a growing need for analytical modeling tools and quantitative processes to help identify and prioritize project sites. 2D hydraulic models have become more common in recent years and with the availability of robust data sets and computing technology, it is now possible to evaluate large river systems at the reach scale. The Trinity River Restoration Program is now analyzing a 40 mile segment of the Trinity River to determine priority and implementation sequencing for its Phase II rehabilitation projects. A comprehensive approach and quantitative tool has recently been developed to analyze this complex river system, referred to as 2D-Hydrodynamic Based Logic Modeling (2D-HBLM). This tool utilizes various hydraulic output parameters combined with biological, ecological, and physical metrics at user-defined spatial scales. These metrics and their associated algorithms are the underpinnings of the 2D-HBLM habitat module used to evaluate geomorphic characteristics, riverine processes, and habitat complexity. The habitat metrics are further integrated into a comprehensive Logic Model framework to perform statistical analyses to assess project prioritization. The Logic Model will analyze various potential project sites by evaluating connectivity using principal component methods. The 2D-HBLM tool will help inform management and decision makers by using a quantitative process to optimize desired response variables while balancing important limiting factors when determining the highest-priority locations within the river corridor for implementing restoration projects. Effective river restoration prioritization starts with well-crafted goals that identify the biological objectives, address underlying causes of habitat change, and recognize that social, economic, and land-use limiting factors may constrain restoration options (Beechie et al. 2008). Applying natural resources management actions, like restoration prioritization, is essential for successful project implementation (Conroy and Peterson, 2013). Evaluating tradeoffs and examining alternatives to improve fish habitat through optimization modeling is not just a trend but rather a scientific strategy that management must embrace and apply in its decision framework.

  8. Evaluation of a reduced section modulus model for determining effects of incising on bending strength and stiffness of structural lumber

    Treesearch

    Roland Hernandez; Jerrold E. Winandy

    2005-01-01

    A quantitative model is presented for evaluating the effects of incising on the bending strength and stiffness of structural dimension lumber. This model is based on the premise that bending strength and stiffness are reduced when lumber is incised, and the extent of this reduction is related to the reduction in moment of inertia of the bending members. Measurements of...

  9. Evaluation Criteria for Nursing Student Application of Evidence-Based Practice: A Delphi Study.

    PubMed

    Bostwick, Lina; Linden, Lois

    2016-06-01

    Core clinical evaluation criteria do not exist for measuring prelicensure baccalaureate nursing students' application of evidence-based practice (EBP) during direct care assignments. The study objective was to achieve consensus among EBP nursing experts to create clinical criteria for faculty to use in evaluating students' application of EBP principles. A three-round Delphi method was used. Experts were invited to participate in Web-based surveys. Data were analyzed using qualitative coding and categorizing. Quantitative analyses were descriptive calculations for rating and ranking. Expert consensus occurred in the Delphi rounds. The study provides a set of 10 core clinical evaluation criteria for faculty evaluating students' progression toward competency in their application of EBP. A baccalaureate program curriculum requiring the use of Bostwick's EBP Core Clinical Evaluation Criteria will provide a clear definition for understanding basic core EBP competence as expected for the assessment of student learning. [J Nurs Educ. 2016;55(5):336-341.]. Copyright 2016, SLACK Incorporated.

  10. On Trust Evaluation in Mobile Ad Hoc Networks

    NASA Astrophysics Data System (ADS)

    Nguyen, Dang Quan; Lamont, Louise; Mason, Peter C.

    Trust has been considered as a social relationship between two individuals in human society. But, as computer science and networking have succeeded in using computers to automate many tasks, the concept of trust can be generalized to cover the reliability and relationships of non-human interaction, such as, for example, information gathering and data routing. This paper investigates the evaluation of trust in the context of ad hoc networks. Nodes evaluate each other’s behaviour based on observables. A node then decides whether to trust another node to have certain innate abilities. We show how accurate such an evaluation could be. We also provide the minimum number of observations required to obtain an accurate evaluation, a result that indicates that observation-based trust in ad hoc networks will remain a challenging problem. The impact of making networking decisions using trust evaluation on the network connectivity is also examined. In this manner, quantitative decisions can be made concerning trust-based routing with the knowledge of the potential impact on connectivity.
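
    One way to make the "minimum number of observations" concrete is a Hoeffding-style bound on estimating a node's good-behaviour rate; this is an illustrative choice of accuracy criterion, and the paper's own bound may differ:

      import math

      def min_observations(eps, delta):
          """Observations needed so the empirical behaviour rate is within
          +/- eps of the true rate with confidence 1 - delta (Hoeffding)."""
          return math.ceil(math.log(2 / delta) / (2 * eps ** 2))

      print(min_observations(0.1, 0.05))  # -> 185 observations per neighbour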

  11. EyeTribe Tracker Data Accuracy Evaluation and Its Interconnection with Hypothesis Software for Cartographic Purposes.

    PubMed

    Popelka, Stanislav; Stachoň, Zdeněk; Šašinka, Čeněk; Doležalová, Jitka

    2016-01-01

    The mixed research design is a progressive methodological discourse that combines the advantages of quantitative and qualitative methods. Its possibilities of application are, however, dependent on the efficiency with which the particular research techniques are used and combined. The aim of the paper is to introduce the possible combination of Hypothesis with EyeTribe tracker. The Hypothesis is intended for quantitative data acquisition and the EyeTribe is intended for qualitative (eye-tracking) data recording. In the first part of the paper, Hypothesis software is described. The Hypothesis platform provides an environment for web-based computerized experiment design and mass data collection. Then, evaluation of the accuracy of data recorded by EyeTribe tracker was performed with the use of concurrent recording together with the SMI RED 250 eye-tracker. Both qualitative and quantitative results showed that data accuracy is sufficient for cartographic research. In the third part of the paper, a system for connecting EyeTribe tracker and Hypothesis software is presented. The interconnection was performed with the help of developed web application HypOgama. The created system uses open-source software OGAMA for recording the eye-movements of participants together with quantitative data from Hypothesis. The final part of the paper describes the integrated research system combining Hypothesis and EyeTribe.

  13. Quantification of EEG reactivity in comatose patients.

    PubMed

    Hermans, Mathilde C; Westover, M Brandon; van Putten, Michel J A M; Hirsch, Lawrence J; Gaspard, Nicolas

    2016-01-01

    EEG reactivity is an important predictor of outcome in comatose patients. However, visual analysis of reactivity is prone to subjectivity and may benefit from quantitative approaches. In EEG segments recorded during reactivity testing in 59 comatose patients, 13 quantitative EEG parameters were used to compare the spectral characteristics of 1-minute segments before and after the onset of stimulation (spectral temporal symmetry). Reactivity was quantified with probability values estimated using combinations of these parameters. The accuracy of probability values as a reactivity classifier was evaluated against the consensus assessment of three expert clinical electroencephalographers using visual analysis. The binary classifier assessing spectral temporal symmetry in four frequency bands (delta, theta, alpha and beta) showed the best accuracy (median AUC: 0.95) and was accompanied by substantial agreement with the individual opinion of experts (Gwet's AC1: 65-70%), at least as good as inter-expert agreement (AC1: 55%). Probability values also reflected the degree of reactivity, as measured by the inter-expert agreement regarding reactivity for each individual case. Automated quantitative EEG approaches based on a probabilistic description of spectral temporal symmetry reliably quantify EEG reactivity. Quantitative EEG may be useful for evaluating reactivity in comatose patients, offering increased objectivity. Copyright © 2015 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
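
    A sketch of the spectral temporal symmetry idea: band powers of the one-minute segments before and after stimulation, compared as log-ratios (the paper's probabilistic combination of 13 parameters is not reproduced here; band edges and names are illustrative):

      import numpy as np
      from scipy.signal import welch

      BANDS = {"delta": (1, 4), "theta": (4, 8),
               "alpha": (8, 13), "beta": (13, 30)}

      def band_powers(segment, fs):
          f, pxx = welch(segment, fs=fs, nperseg=int(4 * fs))
          return np.array([pxx[(f >= lo) & (f < hi)].sum()
                           for lo, hi in BANDS.values()])

      def reactivity_log_ratios(pre, post, fs):
          """Per-band log power ratio; values near zero in every band
          suggest an unreactive EEG (illustrative statistic only)."""
          return np.log(band_powers(post, fs) / band_powers(pre, fs))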

  14. Comparison of two laboratory-based systems for evaluation of halos in intraocular lenses

    PubMed Central

    Alexander, Elsinore; Wei, Xin; Lee, Shinwook

    2018-01-01

    Purpose Multifocal intraocular lenses (IOLs) can be associated with unwanted visual phenomena, including halos. Predicting potential for halos is desirable when designing new multifocal IOLs. Halo images from 6 IOL models were compared using the Optikos modulation transfer function bench system and a new high dynamic range (HDR) system. Materials and methods One monofocal, 1 extended depth of focus, and 4 multifocal IOLs were evaluated. An off-the-shelf optical bench was used to simulate a distant (>50 m) car headlight and record images. A custom HDR system was constructed using an imaging photometer to simulate headlight images and to measure quantitative halo luminance data. A metric was developed to characterize halo luminance properties. Clinical relevance was investigated by correlating halo measurements to visual outcomes questionnaire data. Results The Optikos system produced halo images useful for visual comparisons; however, measurements were relative and not quantitative. The HDR halo system provided objective and quantitative measurements used to create a metric from the area under the curve (AUC) of the logarithmic normalized halo profile. This proposed metric differentiated between IOL models, and linear regression analysis found strong correlations between AUC and subjective clinical ratings of halos. Conclusion The HDR system produced quantitative, preclinical metrics that correlated to patients’ subjective perception of halos. PMID:29503526
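
    The AUC metric can be sketched directly from a radial halo luminance profile measured by the photometer; this assumes the metric is the area under the log of the peak-normalized profile, and the paper's exact normalization may differ:

      import numpy as np

      def halo_auc(radius_deg, luminance):
          """Area under the log10 of the peak-normalized halo profile."""
          profile = np.clip(luminance / luminance.max(), 1e-6, None)
          return np.trapz(np.log10(profile), radius_deg)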

  15. Comparison of 18F-FDG PET/CT and PET/MRI in patients with multiple myeloma

    PubMed Central

    Sachpekidis, Christos; Hillengass, Jens; Goldschmidt, Hartmut; Mosebach, Jennifer; Pan, Leyun; Schlemmer, Heinz-Peter; Haberkorn, Uwe; Dimitrakopoulou-Strauss, Antonia

    2015-01-01

    PET/MRI represents a promising hybrid imaging modality with several potential clinical applications. Although PET/MRI seems highly attractive in the diagnostic approach of multiple myeloma (MM), its role has not yet been evaluated. The aims of this prospective study are to evaluate the feasibility of 18F-FDG PET/MRI in detection of MM lesions, and to investigate the reproducibility of bone marrow lesions detection and quantitative data of 18F-FDG uptake between the functional (PET) component of PET/CT and PET/MRI in MM patients. The study includes 30 MM patients. All patients initially underwent 18F-FDG PET/CT (60 min p.i.), followed by PET/MRI (120 min p.i.). PET/CT and PET/MRI data were assessed and compared based on qualitative (lesion detection) and quantitative (SUV) evaluation. The hybrid PET/MRI system provided good image quality in all cases without artefacts. PET/MRI identified 65 of the 69 lesions, which were detectable with PET/CT (94.2%). Quantitative PET evaluations showed the following mean values in MM lesions: SUVaverage=5.5 and SUVmax=7.9 for PET/CT; SUVaverage=3.9 and SUVmax=5.8 for PET/MRI. Both SUVaverage and SUVmax were significantly higher on PET/CT than on PET/MRI. Spearman correlation analysis demonstrated a strong correlation between both lesional SUVaverage (r=0.744) and lesional SUVmax (r=0.855) values derived from PET/CT and PET/MRI. Regarding detection of myeloma skeletal lesions, PET/MRI exhibited equivalent performance to PET/CT. In terms of tracer uptake quantitation, a significant correlation between the two techniques was demonstrated, despite the statistically significant differences in lesional SUVs between PET/CT and PET/MRI. PMID:26550538

  16. Variations in optical coherence tomography resolution and uniformity: a multi-system performance comparison

    PubMed Central

    Fouad, Anthony; Pfefer, T. Joshua; Chen, Chao-Wei; Gong, Wei; Agrawal, Anant; Tomlins, Peter H.; Woolliams, Peter D.; Drezek, Rebekah A.; Chen, Yu

    2014-01-01

    Point spread function (PSF) phantoms based on unstructured distributions of sub-resolution particles in a transparent matrix have been demonstrated as a useful tool for evaluating resolution and its spatial variation across image volumes in optical coherence tomography (OCT) systems. Measurements based on PSF phantoms have the potential to become a standard test method for consistent, objective and quantitative inter-comparison of OCT system performance. Towards this end, we have evaluated three PSF phantoms and investigated their ability to compare the performance of four OCT systems. The phantoms are based on 260-nm-diameter gold nanoshells, 400-nm-diameter iron oxide particles and 1.5-micron-diameter silica particles. The OCT systems included spectral-domain and swept source systems in free-beam geometries as well as a time-domain system in both free-beam and fiberoptic probe geometries. Results indicated that iron oxide particles and gold nanoshells were most effective for measuring spatial variations in the magnitude and shape of PSFs across the image volume. The intensity of individual particles was also used to evaluate spatial variations in signal intensity uniformity. Significant system-to-system differences in resolution and signal intensity and their spatial variation were readily quantified. The phantoms proved useful for identification and characterization of irregularities such as astigmatism. Our multi-system results provide evidence of the practical utility of PSF-phantom-based test methods for quantitative inter-comparison of OCT system resolution and signal uniformity. PMID:25071949

  17. Proteus mirabilis biofilm - qualitative and quantitative colorimetric methods-based evaluation.

    PubMed

    Kwiecinska-Piróg, Joanna; Bogiel, Tomasz; Skowron, Krzysztof; Wieckowska, Ewa; Gospodarek, Eugenia

    2014-01-01

    The ability of Proteus mirabilis strains to form biofilm is currently the subject of numerous studies worldwide. In this study the biofilm formation of P. mirabilis strains derived from urine of catheterized and non-catheterized patients has been investigated. A total of 39 P. mirabilis strains isolated from urine samples of patients of the dr Antoni Jurasz University Hospital No. 1 in Bydgoszcz clinics between 2011 and 2012 were used. Biofilm formation was evaluated using two independent quantitative and qualitative methods based on TTC (2,3,5-triphenyl-tetrazolium chloride) and CV (crystal violet). The obtained results confirmed biofilm formation by all the examined strains, except in the quantitative TTC method, in which 7.7% of the strains did not show this ability. It was shown that P. mirabilis rods have the ability to form biofilm on the surfaces of both biomaterials applied, polystyrene and polyvinyl chloride (Nelaton catheters). The differences in ability to form biofilm observed between P. mirabilis strains derived from the urine of catheterized and non-catheterized patients were not statistically significant.

  18. Designing automation for human use: empirical studies and quantitative models.

    PubMed

    Parasuraman, R

    2000-07-01

    An emerging knowledge base of human performance research can provide guidelines for designing automation that can be used effectively by human operators of complex systems. Which functions should be automated and to what extent in a given system? A model for types and levels of automation that provides a framework and an objective basis for making such choices is described. The human performance consequences of particular types and levels of automation constitute primary evaluative criteria for automation design when using the model. Four human performance areas are considered--mental workload, situation awareness, complacency and skill degradation. Secondary evaluative criteria include such factors as automation reliability, the risks of decision/action consequences and the ease of systems integration. In addition to this qualitative approach, quantitative models can inform design. Several computational and formal models of human interaction with automation that have been proposed by various researchers are reviewed. An important future research need is the integration of qualitative and quantitative approaches. Application of these models provides an objective basis for designing automation for effective human use.

  19. A Systematic Approach for Quantitative Analysis of Multidisciplinary Design Optimization Framework

    NASA Astrophysics Data System (ADS)

    Kim, Sangho; Park, Jungkeun; Lee, Jeong-Oog; Lee, Jae-Woo

    An efficient Multidisciplinary Design and Optimization (MDO) framework for an aerospace engineering system should use and integrate distributed resources such as various analysis codes, optimization codes, Computer Aided Design (CAD) tools, Data Base Management Systems (DBMS), etc. in a heterogeneous environment, and needs to provide user-friendly graphical user interfaces. In this paper, we propose a systematic approach for determining a reference MDO framework and for evaluating MDO frameworks. The proposed approach incorporates two well-known methods, the Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD), in order to provide a quantitative analysis of the qualitative criteria of MDO frameworks. Identification and hierarchy of the framework requirements and the corresponding solutions for the reference MDO frameworks, a general one and an aircraft-oriented one, were carefully investigated. The reference frameworks were also quantitatively identified using AHP and QFD. An assessment of three in-house frameworks was then performed. The results produced clear and useful guidelines for improvement of the in-house MDO frameworks and showed the feasibility of the proposed approach for evaluating an MDO framework without human intervention.
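
    The AHP step reduces to the principal eigenvector of a reciprocal pairwise comparison matrix plus a consistency check; a minimal sketch (the random-index table follows Saaty's published values):

      import numpy as np

      SAATY_RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}

      def ahp_priorities(A):
          """Priority weights and consistency ratio for a reciprocal
          pairwise comparison matrix A."""
          vals, vecs = np.linalg.eig(A)
          k = np.argmax(vals.real)
          w = np.abs(vecs[:, k].real)
          w /= w.sum()
          n = A.shape[0]
          ci = (vals.real[k] - n) / (n - 1)   # consistency index
          return w, ci / SAATY_RI[n]          # CR < 0.1 is usually acceptable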

  20. Quantitative evaluation of Alzheimer's disease

    NASA Astrophysics Data System (ADS)

    Duchesne, S.; Frisoni, G. B.

    2009-02-01

    We propose a single, quantitative metric called the disease evaluation factor (DEF) and assess its efficiency at estimating disease burden in normal control subjects (CTRL) and probable Alzheimer's disease (AD) patients. The study group consisted of 75 patients with a diagnosis of probable AD and 75 age-matched normal CTRL without neurological or neuropsychological deficit. We calculated a reference eigenspace of MRI appearance from reference data, onto which our CTRL and probable AD subjects were projected. We then calculated the multi-dimensional hyperplane separating the CTRL and probable AD groups. The DEF was estimated via a multidimensional weighted distance of eigencoordinates for a given subject from the CTRL group mean, along the salient principal components forming the separating hyperplane. We used quantile plots, Kolmogorov-Smirnov and χ2 tests to compare the DEF values and test that their distribution was normal. We used a linear discriminant test to separate CTRL from probable AD based on the DEF factor, and reached an accuracy of 87%. A quantitative biomarker in AD would act as an important surrogate marker of disease status and progression.
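
    A loose sketch of the DEF idea: project subjects into a reference eigenspace, fit the separating hyperplane, and read off a weighted distance from the CTRL mean along its salient direction (sklearn stands in for the paper's exact construction; names illustrative):

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      def disease_evaluation_factor(X, y, n_components=20):
          """X: subjects x appearance features; y: 0 = CTRL, 1 = probable AD."""
          Z = PCA(n_components=n_components).fit_transform(X)  # eigencoordinates
          lda = LinearDiscriminantAnalysis().fit(Z, y)  # separating hyperplane
          w = lda.coef_.ravel()
          w /= np.linalg.norm(w)
          return (Z - Z[y == 0].mean(axis=0)) @ w       # one DEF per subject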

  1. A quantitative model to assess Social Responsibility in Environmental Science and Technology.

    PubMed

    Valcárcel, M; Lucena, R

    2014-01-01

    The awareness of the impact of human activities on society and the environment is known as "Social Responsibility" (SR). It has been a topic of growing interest in many enterprises since the 1950s, and its implementation/assessment is nowadays supported by international standards. There is a tendency to extend its scope of application to other areas of human activity, such as Research, Development and Innovation (R + D + I). In this paper, a model for the quantitative assessment of Social Responsibility in Environmental Science and Technology (SR EST) is described in detail. This model is based on well-established written standards such as the EFQM Excellence model and the ISO 26000:2010 Guidance on SR. The definition of five hierarchies of indicators, the transformation of qualitative information into quantitative data and the dual procedure of self-evaluation and external evaluation are the milestones of the proposed model, which can be applied to Environmental Research Centres and institutions. In addition, a simplified model that facilitates its implementation is presented in the article. © 2013 Elsevier B.V. All rights reserved.

  2. Quantitative evaluation of 3D dosimetry for stereotactic volumetric‐modulated arc delivery using COMPASS

    PubMed Central

    Manigandan, Durai; Karrthick, Karukkupalayam Palaniappan; Sambasivaselli, Raju; Senniandavar, Vellaingiri; Ramu, Mahendran; Rajesh, Thiyagarajan; Lutz, Muller; Muthukumaran, Manavalan; Karthikeyan, Nithyanantham; Tejinder, Kataria

    2014-01-01

    The purpose of this study was to evaluate quantitatively the patient-specific 3D dosimetry tool COMPASS with the 2D array MatriXX detector for stereotactic volumetric-modulated arc delivery. Twenty-five patients' CT images and RT structures from different sites (brain, head & neck, thorax, abdomen, and spine) were taken from the CyberKnife Multiplan planning system for this study. All these patients underwent radical stereotactic treatment in CyberKnife. For each patient, linac-based volumetric-modulated arc therapy (VMAT) stereotactic plans were generated in Monaco TPS v3.1 using the Elekta Beam Modulator MLC. Dose prescription was in the range of 5–20 Gy per fraction. Target prescriptions and critical organ constraints were matched as closely as possible to the delivered treatment plans. Each plan's quality was analyzed using the conformity index (CI), conformity number (CN), gradient index (GI), target coverage (TC), and dose to 95% of volume (D95). Monaco Monte Carlo (MC)-calculated treatment plan delivery accuracy was quantitatively evaluated against the COMPASS-calculated (CCA) dose and the COMPASS indirectly measured (CME) dose based on dose-volume histogram metrics. In order to ascertain the potential of COMPASS 3D dosimetry for stereotactic plan delivery, 2D fluence verification was performed with MatriXX using the MultiCube phantom. Routine quality assurance of absolute point dose verification was performed to check the overall delivery accuracy. Quantitative analyses of dose delivery verification were compared against pass/fail criteria of 3 mm distance-to-agreement and 3% dose difference. The gamma passing rate was compared with 2D fluence verification from MatriXX with MultiCube. Comparison of the COMPASS dose reconstructed from measured fluence and the COMPASS computed dose showed very good agreement with the TPS-calculated dose. Each plan was evaluated based on dose-volume parameters for target volumes, such as dose at 95% of volume (D95) and average dose. For critical organs, dose at 20% of volume (D20), dose at 50% of volume (D50), and maximum point doses were evaluated. Comparison was carried out using gamma analysis with passing criteria of 3 mm and 3%. A mean deviation of 1.9%±1% was observed for dose at 95% of volume (D95) of target volumes, whereas much smaller differences were noticed for critical organs. However, significant dose differences were noticed in two cases due to the smaller tumor size. This study revealed that COMPASS 3D dosimetry is efficient and easy to use for patient-specific QA of VMAT stereotactic delivery. 3D dosimetric QA with COMPASS provides additional degrees of freedom to check high-dose modulated stereotactic delivery with very high precision on patient CT images. PACS numbers: 87.55.Qr, 87.56.Fc PMID:25679152
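
    The 3%/3 mm comparison is a gamma analysis; a brute-force 1D sketch with global dose-difference normalization (real QA is volumetric, but the criterion is the same):

      import numpy as np

      def gamma_1d(x_mm, dose_ref, dose_eval, dta=3.0, dd=0.03):
          """Gamma index per reference point; a point passes when gamma <= 1."""
          norm = dd * dose_ref.max()    # global 3% normalization
          gam = np.empty_like(dose_ref)
          for i, (xi, di) in enumerate(zip(x_mm, dose_ref)):
              gam[i] = np.sqrt(((x_mm - xi) / dta) ** 2
                               + ((dose_eval - di) / norm) ** 2).min()
          return gam

      # pass rate: np.mean(gamma_1d(x, ref, ev) <= 1.0)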

  3. 77 FR 1761 - Self-Regulatory Organizations; NYSE Arca, Inc.; Order Granting Approval of Proposed Rule Change...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-11

    ... quantitative research and evaluation process that forecasts economic excess sector returns (over/under the... proprietary SectorSAM quantitative research and evaluation process. \\8\\ The following convictions constitute... Allocation Methodology'' (``SectorSAM''), which is a proprietary quantitative analysis, to forecast each...

  4. Quantitative analysis of 18F-NaF dynamic PET/CT cannot differentiate malignant from benign lesions in multiple myeloma.

    PubMed

    Sachpekidis, Christos; Hillengass, Jens; Goldschmidt, Hartmut; Anwar, Hoda; Haberkorn, Uwe; Dimitrakopoulou-Strauss, Antonia

    2017-01-01

    A renewed interest has recently developed in the highly sensitive bone-seeking radiopharmaceutical 18F-NaF. The aim of the present study was to evaluate the potential utility of quantitative analysis of 18F-NaF dynamic PET/CT data in differentiating malignant from benign degenerative lesions in multiple myeloma (MM). 80 MM patients underwent whole-body PET/CT and dynamic PET/CT scanning of the pelvis with 18F-NaF. PET/CT data evaluation was based on visual (qualitative) assessment, semi-quantitative (SUV) calculations, and absolute quantitative estimations after application of a two-tissue compartment model and a non-compartmental approach leading to the extraction of the fractal dimension (FD). In total, 263 MM lesions were demonstrated on 18F-NaF PET/CT. Semi-quantitative and quantitative evaluations were performed for 25 MM lesions as well as for 25 benign, degenerative, and traumatic lesions. Mean SUVaverage for MM lesions was 11.9 and mean SUVmax was 23.2; respectively, SUVaverage and SUVmax for degenerative lesions were 13.5 and 20.2. Kinetic analysis of 18F-NaF revealed the following mean values for MM lesions: K1 = 0.248 (1/min), k3 = 0.359 (1/min), influx (Ki) = 0.107 (1/min), FD = 1.382, while the respective values for degenerative lesions were: K1 = 0.169 (1/min), k3 = 0.422 (1/min), influx (Ki) = 0.095 (1/min), FD = 1.411. No statistically significant differences between MM and benign degenerative disease were demonstrated for SUVaverage, SUVmax, K1, k3, or influx (Ki); FD was significantly higher in degenerative than in malignant lesions. The present findings show that quantitative analysis of 18F-NaF PET data cannot differentiate malignant from benign degenerative lesions in MM patients, supporting previously published results that reflect the limited role of 18F-NaF PET/CT in the diagnostic workup of MM.
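
    The kinetic quantities quoted above are linked, under the standard irreversible two-tissue compartment model, by Ki = K1·k3/(k2 + k3). Assuming that relation holds here (the study's exact model configuration is not restated), the k2 implied by the reported means can be back-calculated, as in this minimal sketch.

    ```python
    def influx_ki(K1, k2, k3):
        """Influx constant of an irreversible two-tissue compartment model."""
        return K1 * k3 / (k2 + k3)

    def implied_k2(K1, k3, Ki):
        """k2 implied by reported K1, k3 and Ki under the same model."""
        return K1 * k3 / Ki - k3

    # Reported MM-lesion means: K1 = 0.248, k3 = 0.359, Ki = 0.107 (1/min)
    k2_mm = implied_k2(0.248, 0.359, 0.107)          # ~0.47 1/min
    print(round(influx_ki(0.248, k2_mm, 0.359), 3))  # recovers Ki = 0.107
    ```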

  5. Ultrasonic test of resistance spot welds based on wavelet package analysis.

    PubMed

    Liu, Jing; Xu, Guocheng; Gu, Xiaopeng; Zhou, Guanghao

    2015-02-01

    In this paper, ultrasonic testing of spot welds in stainless steel sheets is studied. Traditional ultrasonic signal analysis in either the time domain or the frequency domain proves inadequate for evaluating the nugget diameter of spot welds. A method based on wavelet packet analysis in the time-frequency domain, however, can readily distinguish the nugget from the corona bond by extracting high-frequency signals at different positions across a spot weld, thereby evaluating the nugget diameter quantitatively. The ultrasonic test results fit the actual measured values well: the mean of the normal distribution of the error statistics is 0.00187 and the standard deviation is 0.1392. Furthermore, the quality of the spot welds was evaluated, and it was shown that ultrasonic nondestructive testing based on wavelet packet analysis can be used to evaluate spot-weld quality and is more reliable than a single destructive tensile test. Copyright © 2014 Elsevier B.V. All rights reserved.
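
    To make the time-frequency step concrete, the sketch below decomposes an ultrasonic A-scan into frequency bands with a wavelet packet transform and returns the per-band energies, using the PyWavelets package. The wavelet family, decomposition level, and the idea of comparing band energies across weld positions are assumptions for illustration, not the authors' exact settings.

    ```python
    import numpy as np
    import pywt  # PyWavelets

    def band_energies(signal, wavelet='db4', level=4):
        """Decompose a 1D ultrasonic signal into 2**level frequency bands
        using a wavelet packet transform and return the energy per band."""
        wp = pywt.WaveletPacket(data=signal, wavelet=wavelet,
                                mode='symmetric', maxlevel=level)
        leaves = wp.get_level(level, order='freq')   # low to high frequency
        return np.array([np.sum(leaf.data ** 2) for leaf in leaves])

    # Comparing the high-frequency band energies of echoes recorded at
    # different positions across a weld is the kind of contrast that can
    # separate the nugget from the corona bond.
    ```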

  6. Risk analysis for veterinary biologicals released into the environment.

    PubMed

    Silva, S V; Samagh, B S; Morley, R S

    1995-12-01

    All veterinary biologicals licensed in Canada must be shown to be pure, potent, safe and effective. A risk-based approach is used to evaluate the safety of all biologicals, whether produced by conventional methods or by molecular biological techniques. Traditionally, qualitative risk assessment methods have been used for this purpose. More recently, quantitative risk assessment has become available for complex issues. The quantitative risk assessment method uses "scenario tree analysis" to predict the likelihood of various outcomes and their respective impacts. The authors describe the quantitative risk assessment approach, which is used within the broader context of risk analysis (i.e. risk assessment, risk management and risk communication), to develop recommendations for the field release of veterinary biologicals. The general regulatory framework for the licensing of veterinary biologicals in Canada is also presented.
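
    The scenario-tree idea reduces to multiplying branch probabilities along each path from root to outcome; the sketch below is a generic illustration with a hypothetical release scenario, not an actual tree used by the authors.

    ```python
    def outcome_probabilities(tree, p=1.0):
        """Walk a scenario tree given as [(probability, subtree_or_label), ...]
        and return {outcome label: total path probability}."""
        out = {}
        for prob, node in tree:
            if isinstance(node, list):                   # intermediate event
                for k, v in outcome_probabilities(node, p * prob).items():
                    out[k] = out.get(k, 0.0) + v
            else:                                        # terminal outcome
                out[node] = out.get(node, 0.0) + p * prob
        return out

    # Hypothetical: organism survives release, then does or does not establish.
    tree = [(0.01, [(0.2, 'establishment'), (0.8, 'die-out')]),
            (0.99, 'no survival')]
    print(outcome_probabilities(tree))   # {'establishment': 0.002, ...}
    ```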

  7. A High-Throughput Method for Direct Detection of Therapeutic Oligonucleotide-Induced Gene Silencing In Vivo

    PubMed Central

    Coles, Andrew H.; Osborn, Maire F.; Alterman, Julia F.; Turanov, Anton A.; Godinho, Bruno M.D.C.; Kennington, Lori; Chase, Kathryn; Aronin, Neil

    2016-01-01

    Preclinical development of RNA interference (RNAi)-based therapeutics requires a rapid, accurate, and robust method of simultaneously quantifying mRNA knockdown in hundreds of samples. The most well-established method for this is quantitative real-time polymerase chain reaction (qRT-PCR), a labor-intensive methodology that requires sample purification, which increases the potential to introduce additional bias. Here, we show that the QuantiGene® branched DNA (bDNA) assay, linked to a 96-well Qiagen TissueLyser II, is a quick and reproducible alternative to qRT-PCR for quantitative analysis of mRNA expression in vivo directly from tissue biopsies. The bDNA assay is a high-throughput, plate-based luminescence technique capable of directly measuring mRNA levels in tissue lysates derived from various biological samples. We have performed a systematic evaluation of this technique for in vivo detection of RNAi-based silencing, and show that data of similar quality are obtained from purified RNA and from tissue lysates. In general, we observe low intra- and inter-animal variability (around 10% for control samples) and high intermediate precision, which allows sample sizes for evaluating oligonucleotide efficacy in vivo to be minimized. PMID:26595721

  8. SMART: A Propositional Logic-Based Trade Analysis and Risk Assessment Tool for a Complex Mission

    NASA Technical Reports Server (NTRS)

    Ono, Masahiro; Nicholas, Austin; Alibay, Farah; Parrish, Joseph

    2015-01-01

    This paper introduces a new trade analysis software tool called the Space Mission Architecture and Risk Analysis Tool (SMART). The tool supports high-level system trade studies of complex missions, such as a potential Mars Sample Return (MSR) mission, in an intuitive and quantitative manner. In a complex mission, a common approach to increasing the probability of success is to add redundancy and prepare backups. Quantitatively evaluating the utility of adding redundancy to a system is important but not straightforward, particularly when the failures of parallel subsystems are correlated.
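
    The correlation point can be made concrete with a simple beta-factor common-cause model, in which a fraction beta of unit failures is shared by all redundant units at once. This is only an illustrative stand-in; SMART itself reasons over propositional-logic descriptions of the architecture.

    ```python
    def parallel_failure(p, n=2, beta=0.0):
        """Failure probability of n identical redundant units when a fraction
        beta of failures is a common-cause event that takes out all units."""
        independent = ((1 - beta) * p) ** n   # every unit fails on its own
        common = beta * p                     # one shared event fails them all
        return common + independent

    print(parallel_failure(0.05, n=2, beta=0.0))   # 0.0025 when independent
    print(parallel_failure(0.05, n=2, beta=0.1))   # ~0.0070: correlation erodes
                                                   # most of the redundancy gain
    ```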

  9. Quantitative evaluations of ankle spasticity and stiffness in neurological disorders using manual spasticity evaluator.

    PubMed

    Peng, Qiyu; Park, Hyung-Soon; Shah, Parag; Wilson, Nicole; Ren, Yupeng; Wu, Yi-Ning; Liu, Jie; Gaebler-Spira, Deborah J; Zhang, Li-Qun

    2011-01-01

    Spasticity and contracture are major sources of disability in people with neurological impairments and have been evaluated using various instruments: the Modified Ashworth Scale, tendon reflex scale, pendulum test, mechanical perturbations, and passive joint range of motion (ROM). These measures are generally either convenient to use in clinics but not quantitative, or quantitative but difficult to use conveniently in clinics. We have developed a manual spasticity evaluator (MSE) to evaluate spasticity/contracture quantitatively and conveniently, with ankle ROM and stiffness measured at a controlled low velocity and joint resistance and Tardieu catch angle measured at several higher velocities. We found that the Tardieu catch angle was linearly related to velocity, indicating that increased resistance at higher velocities was felt at further, stiffer positions; thus, the velocity dependence of spasticity may also be position-dependent. This finding indicates the need to control velocity in spasticity evaluation, which is achieved with the MSE. Quantitative measurement of spasticity, stiffness, and ROM can lead to more accurate characterization of pathological conditions and outcome evaluation of interventions, potentially contributing to better healthcare services for patients with neurological disorders such as cerebral palsy, spinal cord injury, traumatic brain injury, and stroke.

  10. A systematic review of studies evaluating Australian indigenous community development projects: the extent of community participation, their methodological quality and their outcomes.

    PubMed

    Snijder, Mieke; Shakeshaft, Anthony; Wagemakers, Annemarie; Stephens, Anne; Calabria, Bianca

    2015-11-21

    Community development is a health promotion approach identified as having great potential to improve Indigenous health because of its potential for extensive community participation. There has been no systematic examination of the extent of community participation in community development projects and little analysis of their effectiveness. This systematic review aims to identify the extent of community participation in community development projects implemented in Australian Indigenous communities, critically appraise the qualitative and quantitative methods used in their evaluation, and summarise their outcomes. Ten electronic peer-reviewed databases and two electronic grey literature databases were searched for relevant studies published between 1990 and 2015. The level of community participation and the methodological quality of the qualitative and quantitative components of the studies were assessed against standardised criteria. Thirty-one evaluation studies of community development projects were identified. Community participation varied between different phases of project development: it was generally high during project implementation but low during the evaluation phase. For the majority of studies, methodological quality was low and the methods were poorly described. Although positive qualitative or quantitative outcomes were reported in all studies, only two studies reported statistically significant outcomes. Partnerships between researchers, community members and service providers have great potential to improve methodological quality and community participation when research skills and community knowledge are integrated to design, implement and evaluate community development projects. The methodological quality of studies evaluating Australian Indigenous community development projects is currently too weak to confidently determine the cost-effectiveness of community development projects in improving the health and wellbeing of Indigenous Australians. Higher quality studies evaluating community development projects would strengthen the evidence base.

  11. Quantitative multi-pinhole small-animal SPECT: uniform versus non-uniform Chang attenuation correction.

    PubMed

    Wu, C; de Jong, J R; Gratama van Andel, H A; van der Have, F; Vastenhouw, B; Laverman, P; Boerman, O C; Dierckx, R A J O; Beekman, F J

    2011-09-21

    Attenuation of photon flux along trajectories between the source and the pinhole apertures affects the quantitative accuracy of reconstructed single-photon emission computed tomography (SPECT) images. We propose a Chang-based non-uniform attenuation correction (NUA-CT) for small-animal SPECT/CT with focusing pinhole collimation, and compare its quantitative accuracy with uniform Chang correction based on (i) body outlines extracted from x-ray CT (UA-CT) and (ii) body contours hand-drawn on the images obtained with three integrated optical cameras (UA-BC). Measurements in phantoms and rats containing known activities of isotopes were conducted for evaluation. In (125)I, (201)Tl, (99m)Tc and (111)In phantom experiments, average relative errors compared to the gold standard measured in a dose calibrator were reduced to 5.5%, 6.8%, 4.9% and 2.8%, respectively, with NUA-CT. In animal studies, these errors were 2.1%, 3.3%, 2.0% and 2.0%, respectively. Differences in accuracy on average between results of NUA-CT, UA-CT and UA-BC were less than 2.3% in phantom studies and 3.1% in animal studies, except for (125)I (3.6% and 5.1%, respectively). All methods tested provide reasonable attenuation correction and result in high quantitative accuracy; NUA-CT shows superior accuracy except for (125)I, where other factors may have more impact on quantitative accuracy than the choice of attenuation correction.
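
    For orientation, a minimal sketch of first-order Chang correction on a 2D slice, assuming nearest-neighbour ray marching: each voxel's photon survival fraction is averaged over projection angles and then inverted. With a constant mu map inside a body mask this reduces to the uniform variant; the published focusing-pinhole geometry is not reproduced here.

    ```python
    import numpy as np

    def chang_correction(mu_map, voxel_mm=1.0, n_angles=32):
        """First-order Chang correction factors: for every voxel, average
        exp(-line integral of mu to the edge) over angles, then invert.
        mu_map is in 1/mm; non-uniform maps are allowed."""
        ny, nx = mu_map.shape
        ys, xs = np.mgrid[0:ny, 0:nx].astype(float)
        survival = np.zeros_like(mu_map, dtype=float)
        n_steps = int(np.hypot(ny, nx)) + 1
        for theta in np.linspace(0.0, 2 * np.pi, n_angles, endpoint=False):
            dy, dx = np.sin(theta), np.cos(theta)
            path = np.zeros_like(mu_map, dtype=float)
            for s in range(n_steps):          # march each ray toward the edge
                py, px = ys + s * dy, xs + s * dx
                inb = (py >= 0) & (py < ny) & (px >= 0) & (px < nx)
                iy = np.clip(np.round(py).astype(int), 0, ny - 1)
                ix = np.clip(np.round(px).astype(int), 0, nx - 1)
                path += np.where(inb, mu_map[iy, ix], 0.0) * voxel_mm
            survival += np.exp(-path)
        return n_angles / survival            # multiply into the image
    ```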

  12. Conventional liquid chromatography/triple quadrupole mass spectrometer-based metabolite identification and semi-quantitative estimation approach in the investigation of dabigatran etexilate in vitro metabolism

    PubMed Central

    Hu, Zhe-Yi; Parker, Robert B.; Herring, Vanessa L.; Laizure, S. Casey

    2012-01-01

    Dabigatran etexilate (DABE) is an oral prodrug that is rapidly converted by esterases to dabigatran (DAB), a direct inhibitor of thrombin. To elucidate the esterase-mediated metabolic pathway of DABE, a high-performance liquid chromatography/mass spectrometer (LC-MS/MS)-based metabolite identification and semi-quantitative estimation approach was developed. To overcome the poor full-scan sensitivity of conventional triple quadrupole mass spectrometry, precursor-product ion pairs were predicted, to search for the potential in vitro metabolites. The detected metabolites were confirmed by the product ion scan. A dilution method was introduced to evaluate the matrix effects of tentatively identified metabolites without chemical standards. Quantitative information on detected metabolites was obtained using ‘metabolite standards’ generated from incubation samples that contain a high concentration of metabolite in combination with a correction factor for mass spectrometry response. Two in vitro metabolites of DABE (M1 and M2) were identified, and quantified by the semi-quantitative estimation approach. It is noteworthy that CES1 convert DABE to M1 while CES2 mediates the conversion of DABE to M2. M1 (or M2) was further metabolized to DAB by CES2 (or CES1). The approach presented here provides a solution to a bioanalytical need for fast identification and semi-quantitative estimation of CES metabolites in preclinical samples. PMID:23239178

  13. A systematic review on how to conduct evaluations in community-based rehabilitation.

    PubMed

    Grandisson, Marie; Hébert, Michèle; Thibeault, Rachel

    2014-01-01

    Community-based rehabilitation (CBR) must prove that it is making a significant difference for people with disabilities in low- and middle-income countries. Yet, evaluation is not a common practice and the evidence for its effectiveness is fragmented and largely insufficient. The objective of this article was to review the literature on best practices in program evaluation in CBR in relation to the evaluative process, the frameworks, and the methods of data collection. A systematic search was conducted on five rehabilitation databases and the World Health Organization website with keywords associated with CBR and program evaluation. Two independent researchers selected the articles. Twenty-two documents were included. The results suggest that (1) the evaluative process needs to be conducted in close collaboration with the local community, including people with disabilities, and to be followed by sharing the findings and taking actions, (2) many frameworks have been proposed to evaluate CBR but no agreement has been reached, and (3) qualitative methodologies have dominated the scene in CBR so far, but their combination with quantitative methods has a lot of potential to better capture the effectiveness of this strategy. In order to facilitate and improve evaluations in CBR, there is an urgent need to agree on a common framework, such as the CBR matrix, and to develop best practice guidelines based on the literature available and consensus among a group of experts. These will need to demonstrate a good balance between community development and standards for effective evaluations. Implications for Rehabilitation In the quest for evidence of the effectiveness of community-based rehabilitation (CBR), a shared program evaluation framework would better enable the combination of findings from different studies. The evaluation of CBR programs should always include sharing findings and taking action for the sake of the local community. Although qualitative methodologies have dominated the scene in CBR and remain highly relevant, there is also a call for the inclusion of quantitative indicators in order to capture the progress made by people participating in CBR programs. The production of best practice guidelines for evaluation in CBR could foster accountable and empowering program evaluations that are congruent with the principles at the heart of CBR and the standards for effective evaluations.

  14. A systematic review on how to conduct evaluations in community-based rehabilitation

    PubMed Central

    Hébert, Michèle; Thibeault, Rachel

    2014-01-01

    Purpose Community-based rehabilitation (CBR) must prove that it is making a significant difference for people with disabilities in low- and middle-income countries. Yet, evaluation is not a common practice and the evidence for its effectiveness is fragmented and largely insufficient. The objective of this article was to review the literature on best practices in program evaluation in CBR in relation to the evaluative process, the frameworks, and the methods of data collection. Method A systematic search was conducted on five rehabilitation databases and the World Health Organization website with keywords associated with CBR and program evaluation. Two independent researchers selected the articles. Results Twenty-two documents were included. The results suggest that (1) the evaluative process needs to be conducted in close collaboration with the local community, including people with disabilities, and to be followed by sharing the findings and taking actions, (2) many frameworks have been proposed to evaluate CBR but no agreement has been reached, and (3) qualitative methodologies have dominated the scene in CBR so far, but their combination with quantitative methods has a lot of potential to better capture the effectiveness of this strategy. Conclusions In order to facilitate and improve evaluations in CBR, there is an urgent need to agree on a common framework, such as the CBR matrix, and to develop best practice guidelines based on the literature available and consensus among a group of experts. These will need to demonstrate a good balance between community development and standards for effective evaluations. Implications for Rehabilitation In the quest for evidence of the effectiveness of community-based rehabilitation (CBR), a shared program evaluation framework would better enable the combination of findings from different studies. The evaluation of CBR programs should always include sharing findings and taking action for the sake of the local community. Although qualitative methodologies have dominated the scene in CBR and remain highly relevant, there is also a call for the inclusion of quantitative indicators in order to capture the progress made by people participating in CBR programs. The production of best practice guidelines for evaluation in CBR could foster accountable and empowering program evaluations that are congruent with the principles at the heart of CBR and the standards for effective evaluations. PMID:23614357

  15. Multi-observation PET image analysis for patient follow-up quantitation and therapy assessment

    NASA Astrophysics Data System (ADS)

    David, S.; Visvikis, D.; Roux, C.; Hatt, M.

    2011-09-01

    In positron emission tomography (PET) imaging, an early therapeutic response is usually characterized by variations in semi-quantitative parameters restricted to the maximum SUV measured in PET scans during treatment. Such measurements do not reflect overall tumor volume or radiotracer uptake variations. The proposed approach is based on multi-observation image analysis, merging several PET acquisitions to assess variations in tumor metabolic volume and uptake. The fusion algorithm is based on iterative estimation using a stochastic expectation maximization (SEM) algorithm. The proposed method was applied to simulated and clinical follow-up PET images, and its performance was compared with threshold-based methods proposed for assessing therapeutic response from functional volumes. On the simulated datasets, the adaptive threshold applied independently to both images led to higher errors than the ASEM fusion, and on the clinical datasets it failed to provide coherent measurements for four patients out of seven because of aberrant delineations. The ASEM method produced improved and more robust estimates, leading to more pertinent measurements. Future work will consist of extending the methodology and applying it to clinical multi-tracer datasets in order to evaluate its potential impact on biological tumor volume definition for radiotherapy applications.
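
    The stochastic flavour of EM can be shown on a 1D two-component Gaussian mixture: posterior responsibilities from the E-step are sampled into hard labels before re-estimating the parameters. This is a generic SEM sketch, not the authors' ASEM fusion of multiple PET observations.

    ```python
    import numpy as np
    rng = np.random.default_rng(0)

    def sem_two_gaussians(x, n_iter=200):
        """Stochastic EM for a two-component 1D Gaussian mixture."""
        mu = np.quantile(x, [0.25, 0.75])
        sd = np.array([x.std(), x.std()])
        w = np.array([0.5, 0.5])
        for _ in range(n_iter):
            # E-step: unnormalized densities (shared constants cancel below)
            p1 = w[0] * np.exp(-0.5 * ((x - mu[0]) / sd[0]) ** 2) / sd[0]
            p2 = w[1] * np.exp(-0.5 * ((x - mu[1]) / sd[1]) ** 2) / sd[1]
            r = p1 / (p1 + p2)
            z = rng.random(x.size) < r            # S-step: sample hard labels
            for k, mask in enumerate([z, ~z]):    # M-step on sampled labels
                if mask.any():
                    mu[k] = x[mask].mean()
                    sd[k] = x[mask].std() + 1e-6
                    w[k] = mask.mean()
        return mu, sd, w
    ```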

  16. Development of risk-based nanomaterial groups for occupational exposure control

    NASA Astrophysics Data System (ADS)

    Kuempel, E. D.; Castranova, V.; Geraci, C. L.; Schulte, P. A.

    2012-09-01

    Given the almost limitless variety of nanomaterials, it will be virtually impossible to assess the possible occupational health hazard of each nanomaterial individually. The development of science-based hazard and risk categories for nanomaterials is needed for decision-making about exposure control practices in the workplace. A possible strategy would be to select representative (benchmark) materials from various mode of action (MOA) classes, evaluate the hazard and develop risk estimates, and then apply a systematic comparison of new nanomaterials with the benchmark materials in the same MOA class. Poorly soluble particles are used here as an example to illustrate quantitative risk assessment methods for possible benchmark particles and occupational exposure control groups, given mode of action and relative toxicity. Linking such benchmark particles to specific exposure control bands would facilitate the translation of health hazard and quantitative risk information to the development of effective exposure control practices in the workplace. A key challenge is obtaining sufficient dose-response data, based on standard testing, to systematically evaluate the nanomaterials' physical-chemical factors influencing their biological activity. Categorization processes involve both science-based analyses and default assumptions in the absence of substance-specific information. Utilizing data and information from related materials may facilitate initial determinations of exposure control systems for nanomaterials.

  17. Estimation of hydrolysis rate constants for carbamates ...

    EPA Pesticide Factsheets

    Cheminformatics-based tools, such as the Chemical Transformation Simulator under development in EPA's Office of Research and Development, are increasingly used to evaluate chemicals for their potential to degrade in the environment or be transformed through metabolism. Hydrolysis represents a major environmental degradation pathway; unfortunately, hydrolysis rates are in the public domain for only a small fraction of the roughly 85,000 chemicals on the Toxic Substances Control Act (TSCA) inventory, making it critical to develop in silico approaches to estimate hydrolysis rate constants. In this presentation, we compare three complementary approaches to estimating hydrolysis rates for carbamates, an important chemical class widely used in agriculture as pesticides, herbicides and fungicides. Fragment-based Quantitative Structure-Activity Relationships (QSARs) using Hammett-Taft sigma constants are widely published and implemented for relatively simple functional groups such as carboxylic acid esters, phthalate esters, and organophosphate esters, and we extend these to carbamates. We also develop a pKa-based model and a quantitative structure-property relationship (QSPR) model, and evaluate all three against measured rate constants using R-squared and root-mean-square (RMS) error. Our work shows that, for our relatively small sample of carbamates, the Hammett-Taft-based fragment model performs best, followed by the pKa-based and QSPR models.
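
    For a congeneric series, a fragment-based Hammett-Taft QSAR reduces to a linear fit of log k against summed substituent constants; the sketch below uses purely illustrative sigma sums and rate constants, not the presentation's data or fitted coefficients.

    ```python
    import numpy as np

    # Hypothetical carbamates: summed Hammett-Taft substituent constants and
    # measured log k(hydrolysis); all values are illustrative only.
    sigma_sum = np.array([0.00, 0.23, 0.45, 0.71, 1.02])
    log_k     = np.array([-6.1, -5.4, -4.8, -4.1, -3.2])

    rho, c = np.polyfit(sigma_sum, log_k, 1)      # fit: log k = rho*sigma + c
    pred = rho * sigma_sum + c
    rmse = np.sqrt(np.mean((pred - log_k) ** 2))
    r2 = 1 - np.sum((pred - log_k)**2) / np.sum((log_k - log_k.mean())**2)
    print(f"rho={rho:.2f}  c={c:.2f}  R2={r2:.3f}  RMS={rmse:.3f}")
    ```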

  18. Diagnosis of breast cancer biopsies using quantitative phase imaging

    NASA Astrophysics Data System (ADS)

    Majeed, Hassaan; Kandel, Mikhail E.; Han, Kevin; Luo, Zelun; Macias, Virgilia; Tangella, Krishnarao; Balla, Andre; Popescu, Gabriel

    2015-03-01

    The standard practice in the histopathology of breast cancer is to examine a hematoxylin and eosin (H&E) stained tissue biopsy under a microscope. The pathologist looks at certain morphological features, visible under the stain, to diagnose whether a tumor is benign or malignant. This determination is made by qualitative inspection, making it subject to investigator bias; furthermore, because the method requires microscopic examination by a pathologist, it suffers from low throughput. A quantitative, label-free, and high-throughput method for detecting these morphological features in images of tissue biopsies is therefore highly desirable, as it would assist the pathologist in making a quicker and more accurate diagnosis. We present here preliminary results showing the potential of quantitative phase imaging for breast cancer screening and differential diagnosis. We generated optical path length maps of unstained breast tissue biopsies using Spatial Light Interference Microscopy (SLIM). As a first step towards diagnosis based on quantitative phase imaging, we carried out a qualitative evaluation of the imaging resolution and contrast of our label-free phase images. These images were shown to two pathologists, who marked the tumors present in the tissue as either benign or malignant. This diagnosis was then compared against the two pathologists' diagnoses on H&E-stained tissue images, and the number of agreements was counted. In our experiment, the agreement between SLIM- and H&E-based diagnosis was 88%. These preliminary results demonstrate the promise of SLIM for a future push towards quantitative, label-free, and high-throughput diagnosis.

  19. Assessing covariate balance when using the generalized propensity score with quantitative or continuous exposures.

    PubMed

    Austin, Peter C

    2018-01-01

    Propensity score methods are increasingly being used to estimate the effects of treatments and exposures in observational data. The propensity score was initially developed for use with binary exposures (e.g., active treatment vs. control). The generalized propensity score is an extension of the propensity score for use with quantitative or continuous exposures (e.g., dose or quantity of medication, income, years of education). A crucial component of any propensity score analysis is balance assessment: assessing the degree to which conditioning on the propensity score (via matching, weighting, or stratification) has balanced measured baseline covariates between exposure groups. Methods for balance assessment are well described and frequently implemented when using the propensity score with binary exposures, but there is a paucity of information on how to assess baseline covariate balance when using the generalized propensity score. We describe how methods based on the standardized difference can be adapted for use with quantitative exposures, and also describe a method based on assessing the correlation between the quantitative exposure and each covariate in the sample when weighted using generalized propensity score-based weights. We conducted a series of Monte Carlo simulations to evaluate the performance of these methods. We also compared two different methods of estimating the generalized propensity score: ordinary least squares regression and the covariate balancing propensity score method. We illustrate the application of these methods using data on patients hospitalized with a heart attack, with the quantitative exposure being creatinine level.
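
    A minimal sketch of the correlation-based balance diagnostic, assuming a normal linear model for the exposure and stabilized weights; the Monte Carlo design and the covariate balancing propensity score estimator from the paper are not reproduced here.

    ```python
    import numpy as np

    def weighted_corr(x, y, w):
        """Correlation between x and y under observation weights w."""
        mx, my = np.average(x, weights=w), np.average(y, weights=w)
        cov = np.average((x - mx) * (y - my), weights=w)
        return cov / np.sqrt(np.average((x - mx)**2, weights=w) *
                             np.average((y - my)**2, weights=w))

    def gps_balance(exposure, covariates):
        """Stabilized GPS weights from a normal linear exposure model, then
        weighted exposure-covariate correlations as the balance diagnostic."""
        X = np.column_stack([np.ones(len(exposure)), covariates])
        beta, *_ = np.linalg.lstsq(X, exposure, rcond=None)
        resid = exposure - X @ beta
        cond = np.exp(-0.5 * (resid / resid.std()) ** 2) / resid.std()
        z = (exposure - exposure.mean()) / exposure.std()
        marg = np.exp(-0.5 * z ** 2) / exposure.std()
        w = marg / cond                        # stabilized GPS weights
        return [weighted_corr(exposure, covariates[:, j], w)
                for j in range(covariates.shape[1])]
    ```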

  20. Fabrication, Characterization, and Evaluation of Bionanocomposites Based on Natural Polymers and Antibiotics for Wound Healing Applications.

    PubMed

    Rădulescu, Marius; Holban, Alina Maria; Mogoantă, Laurențiu; Bălşeanu, Tudor-Adrian; Mogoșanu, George Dan; Savu, Diana; Popescu, Roxana Cristina; Fufă, Oana; Grumezescu, Alexandru Mihai; Bezirtzoglou, Eugenia; Lazar, Veronica; Chifiriuc, Mariana Carmen

    2016-06-10

    The aim of our research was to obtain a biocompatible nanostructured composite based on naturally derived biopolymers (chitin and sodium alginate) loaded with commercial antibiotics (either Cefuroxime or Cefepime) with dual functions: promoting wound healing and providing local delivery of the loaded antibiotic. Compositional, structural, and morphological evaluations were performed using thermogravimetric analysis (TGA), scanning electron microscopy (SEM), and Fourier transform infrared spectroscopy (FTIR). To evaluate the biocompatibility of the obtained composites quantitatively and qualitatively, we performed the tetrazolium salt (MTT) and agar diffusion in vitro assays on the L929 cell line. The antimicrobial potential was evaluated by the viable cell count assay on strains belonging to two clinically relevant bacterial species (Escherichia coli and Staphylococcus aureus).

  1. Identification and Quantification of Gingerols and Related Compounds in Ginger Dietary Supplements Using High Performance Liquid Chromatography-Tandem Mass Spectrometry

    PubMed Central

    TAO, YI; LI, WENKUI; LIANG, WENZHONG; VAN BREEMEN, RICHARD B.

    2009-01-01

    Dietary supplements containing preparations of ginger roots/rhizomes (Zingiber officinale Roscoe) are used by consumers, and clinical trials using ginger dietary supplements have been carried out to evaluate their anti-inflammatory or anti-emetic properties, with inconsistent results. Chemical standardization of these products is needed for quality control and to facilitate the design of clinical trials and the evaluation of data from these studies. To address this issue, methods based on liquid chromatography-tandem mass spectrometry (LC-MS-MS) were developed for the detection, characterization, and quantitative analysis of gingerol-related compounds in botanical dietary supplements containing ginger roots/rhizomes. During negative ion electrospray with collision-induced dissociation, cleavage of the C4-C5 bond with a neutral loss of 194 u and benzylic cleavage leading to a neutral loss of 136 u were found to be class-characteristic fragmentation patterns of the pharmacologically active gingerols and shogaols, respectively. Based on these results, an assay using LC-MS-MS with neutral loss scanning (loss of 194 u or 136 u) was developed that is suitable for fingerprinting ginger dietary supplements through the selective detection of gingerols, shogaols, paradols, and gingerdiones. In addition, a quantitative assay based on LC-MS-MS with selected reaction monitoring was developed for the quantitative analysis of 6-gingerol, 8-gingerol, 10-gingerol, 6-shogaol, 8-shogaol, and 10-shogaol in ginger dietary supplements. After method validation, the quantities of these compounds in three commercially available ginger dietary supplements were determined. The assay showed excellent sensitivity, accuracy, and precision and may be used to address the need for quality control and standardization of ginger dietary supplements. PMID:19817455

  2. Performance assessments of Android-powered military applications operating on tactical handheld devices

    NASA Astrophysics Data System (ADS)

    Weiss, Brian A.; Fronczek, Lisa; Morse, Emile; Kootbally, Zeid; Schlenoff, Craig

    2013-05-01

    Transformative Apps (TransApps) is a Defense Advanced Research Projects Agency (DARPA) funded program whose goal is to develop a range of militarily relevant software applications ("apps") to enhance the operational effectiveness of military personnel on (and off) the battlefield. TransApps is also developing a military apps marketplace to facilitate rapid development and dissemination of applications that address user needs by connecting engaged communities of end-users with development groups. The National Institute of Standards and Technology's (NIST) role in the TransApps program is to design and implement evaluation procedures to assess the performance of: 1) the various software applications, 2) software-hardware interactions, and 3) the supporting online application marketplace. Specifically, NIST is responsible for evaluating 50+ tactically relevant applications operating on numerous Android™-powered platforms. NIST efforts include functional regression testing and quantitative performance testing. This paper discusses the evaluation methodologies employed to assess the performance of three key program elements: 1) handheld-based applications and their integration with various hardware platforms, 2) client-based applications, and 3) network technologies operating on both the handheld and client systems, along with their integration into the application marketplace. Handheld-based applications are assessed using a combination of utility- and usability-based checklists and quantitative performance tests. Client-based applications are assessed in a way that replicates current overseas disconnected operations (i.e., no network connectivity between handhelds) and the connected operations envisioned for later use. Finally, networked applications are assessed on handhelds to establish performance baselines for when connectivity becomes common usage.

  3. Designing a mixed methods study in primary care.

    PubMed

    Creswell, John W; Fetters, Michael D; Ivankova, Nataliya V

    2004-01-01

    Mixed methods or multimethod research holds potential for rigorous, methodologically sound investigations in primary care. The objective of this study was to use criteria from the literature to evaluate 5 mixed methods studies in primary care and to advance 3 models useful for designing such investigations. We first identified criteria from the social and behavioral sciences to analyze mixed methods studies in primary care research. We then used the criteria to evaluate 5 mixed methods investigations published in primary care research journals. Of the 5 studies analyzed, 3 included a rationale for mixing based on the need to develop a quantitative instrument from qualitative data or to converge information to best understand the research topic. Quantitative data collection involved structured interviews, observational checklists, and chart audits that were analyzed using descriptive and inferential statistical procedures. Qualitative data consisted of semistructured interviews and field observations that were analyzed using coding to develop themes and categories. The studies showed diverse forms of priority: equal priority, qualitative priority, and quantitative priority. Data collection involved quantitative and qualitative data gathered both concurrently and sequentially. The integration of the quantitative and qualitative data in these studies occurred between data analysis from one phase and data collection from a subsequent phase, while analyzing the data, and when reporting the results. We recommend instrument-building, triangulation, and data transformation models for mixed methods designs as useful frameworks to add rigor to investigations in primary care. We also discuss the limitations of our study and the need for future research.

  4. Investigation of the feasibility of non-invasive optical sensors for the quantitative assessment of dehydration.

    PubMed

    Visser, Cobus; Kieser, Eduard; Dellimore, Kiran; van den Heever, Dawie; Smith, Johan

    2017-10-01

    This study explores the feasibility of prospectively assessing infant dehydration using four non-invasive, optical sensors based on the quantitative and objective measurement of various clinical markers of dehydration. The sensors were investigated to objectively and unobtrusively assess the hydration state of an infant based on the quantification of capillary refill time (CRT), skin recoil time (SRT), skin temperature profile (STP) and skin tissue hydration by means of infrared spectrometry (ISP). To evaluate the performance of the sensors a clinical study was conducted on a cohort of 10 infants (aged 6-36 months) with acute gastroenteritis. High sensitivity and specificity were exhibited by the sensors, in particular the STP and SRT sensors, when combined into a fusion regression model (sensitivity: 0.90, specificity: 0.78). The SRT and STP sensors and the fusion model all outperformed the commonly used "gold standard" clinical dehydration scales including the Gorelick scale (sensitivity: 0.56, specificity: 0.56), CDS scale (sensitivity: 1.0, specificity: 0.2) and WHO scale (sensitivity: 0.13, specificity: 0.79). These results suggest that objective and quantitative assessment of infant dehydration may be possible using the sensors investigated. However, further evaluation of the sensors on a larger sample population is needed before deploying them in a clinical setting. Copyright © 2017 IPEM. Published by Elsevier Ltd. All rights reserved.

  5. A qualitative and quantitative laser-based computer-aided flow visualization method. M.S. Thesis, 1992 Final Report

    NASA Technical Reports Server (NTRS)

    Canacci, Victor A.; Braun, M. Jack

    1994-01-01

    The experimental approach presented here offers a nonintrusive, qualitative and quantitative evaluation of full-field flow patterns, applicable in various geometries and in a variety of fluids. This Full Flow Field Tracking (FFFT) Particle Image Velocimetry (PIV) technique, using particle tracers illuminated by a laser light sheet, offers an alternative to Laser Doppler Velocimetry (LDV) and to intrusive systems such as hot wire/film anemometry. The method captures the flow patterns and allows quantitative determination of the velocities, accelerations, and mass flows of an entire flow field. It uses a computer-based digitizing system attached through an imaging board to a low-luminosity camera, and a customized optical train allows the system to act as a long-distance microscope (LDM), with magnifications of areas of interest of up to 100 times. In addition to the method itself, studies are presented in which flow patterns and velocities were observed and evaluated in three distinct geometries with three different working fluids. The first study involved pressure and flow analysis of a brush seal in oil. The next application involved studying the velocity and flow patterns in a cowl lip cooling passage of an air-breathing aircraft engine, using water as the working fluid. Finally, the method was extended to a study in air examining the flows in a staggered pin arrangement located on one side of a branched duct.
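
    The quantitative core of a PIV system like this is the cross-correlation of interrogation windows between successive frames; the sketch below shows that step only, assuming uniform in-window motion, and is not the authors' FFFT implementation.

    ```python
    import numpy as np
    from scipy.signal import fftconvolve

    def window_displacement(win_a, win_b):
        """Pixel displacement (dy, dx) of one interrogation window between
        two frames, taken from the peak of their cross-correlation."""
        a = win_a - win_a.mean()
        b = win_b - win_b.mean()
        corr = fftconvolve(b, a[::-1, ::-1], mode='full')  # cross-correlation
        peak = np.unravel_index(np.argmax(corr), corr.shape)
        return (peak[0] - (win_a.shape[0] - 1),
                peak[1] - (win_a.shape[1] - 1))

    # Velocity = displacement * (pixel size / magnification) / frame interval;
    # repeating over a grid of windows yields the full-field vector map.
    ```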

  6. Quantitative Estimation of Plasma Free Drug Fraction in Patients With Varying Degrees of Hepatic Impairment: A Methodological Evaluation.

    PubMed

    Li, Guo-Fu; Yu, Guo; Li, Yanfei; Zheng, Yi; Zheng, Qing-Shan; Derendorf, Hartmut

    2018-07-01

    Quantitative prediction of the unbound drug fraction (fu) is essential for scaling pharmacokinetics through physiologically based approaches. However, few attempts have been made to evaluate the projection of fu values under pathological conditions. The primary objective of this study was to predict fu values (n = 105) of 56 compounds, with or without information on the predominant binding protein, in patients with varying degrees of hepatic insufficiency, by accounting for quantitative changes in the molar concentrations of either the major binding protein or albumin plus alpha-1-acid glycoprotein associated with differing levels of hepatic dysfunction. For the purpose of scaling, data on albumin and alpha-1-acid glycoprotein levels in response to differing degrees of hepatic impairment were systematically collected from 919 adult donors. The results of the present study demonstrate for the first time the feasibility of physiologically based scaling of fu in hepatic dysfunction, verified against experimentally measured data for a wide variety of compounds from individuals with varying degrees of hepatic insufficiency. Furthermore, the high predictive accuracy indicates that the inter-relation between the severity of hepatic impairment and these plasma protein levels is physiologically accurate. The present study enhances confidence in predicting fu in hepatic insufficiency, particularly for albumin-bound drugs. Copyright © 2018 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
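
    Assuming a single dominant binding protein and unchanged binding affinity, scaling of this kind reduces to adjusting the bound-to-free ratio by the ratio of protein concentrations; a minimal sketch with illustrative numbers follows.

    ```python
    def scale_fu(fu_healthy, protein_ratio):
        """Free fraction in a patient whose binding-protein molar level is
        protein_ratio times the healthy value, assuming one binding protein
        and unchanged affinity: bound/free scales with the protein level."""
        bound_over_free = (1 - fu_healthy) / fu_healthy
        return 1.0 / (1.0 + protein_ratio * bound_over_free)

    # e.g. an albumin-bound drug with fu = 0.05, albumin at 60% of normal:
    print(round(scale_fu(0.05, 0.60), 3))   # ~0.081: the free fraction rises
    ```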

  7. Study on evaluation of construction reliability for engineering project based on fuzzy language operator

    NASA Astrophysics Data System (ADS)

    Shi, Yu-Fang; Ma, Yi-Yi; Song, Ping-Ping

    2018-03-01

    System reliability theory has been a research hotspot in management science and systems engineering in recent years, and construction reliability is useful for the quantitative evaluation of project management level. Construction reliability is defined here in terms of reliability theory and the target system of engineering project management. Based on fuzzy mathematics theory and language operators, the value space of construction reliability is divided into seven fuzzy subsets; correspondingly, seven membership functions and fuzzy evaluation intervals are obtained through the operation of language operators, which provides the method and parameters for evaluating construction reliability. The method is shown to be scientific and reasonable for construction conditions and represents a useful attempt at theory and method research on engineering project system reliability.
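
    A minimal sketch of the seven-subset idea, assuming equally spaced triangular membership functions over the value space [0, 1]; the paper's actual membership functions and language operators may differ.

    ```python
    import numpy as np

    def triangular(x, a, b, c):
        """Triangular membership function peaking at b on support [a, c]."""
        return np.clip(np.minimum((x - a) / (b - a + 1e-12),
                                  (c - x) / (c - b + 1e-12)), 0.0, 1.0)

    peaks = np.linspace(0.0, 1.0, 7)   # seven fuzzy subsets over [0, 1]

    def memberships(r):
        """Membership degrees of a reliability value r in the seven subsets."""
        return [float(triangular(r, p - 1/6, p, p + 1/6)) for p in peaks]

    print([round(m, 2) for m in memberships(0.72)])
    # -> nonzero membership only in the fifth and sixth subsets
    ```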

  8. TElehealth in CHronic disease: mixed-methods study to develop the TECH conceptual model for intervention design and evaluation.

    PubMed

    Salisbury, Chris; Thomas, Clare; O'Cathain, Alicia; Rogers, Anne; Pope, Catherine; Yardley, Lucy; Hollinghurst, Sandra; Fahey, Tom; Lewis, Glyn; Large, Shirley; Edwards, Louisa; Rowsell, Alison; Segar, Julia; Brownsell, Simon; Montgomery, Alan A

    2015-02-06

    To develop a conceptual model for effective use of telehealth in the management of chronic health conditions, and to use this to develop and evaluate an intervention for people with two exemplar conditions: raised cardiovascular disease risk and depression. The model was based on several strands of evidence: a metareview and realist synthesis of quantitative and qualitative evidence on telehealth for chronic conditions; a qualitative study of patients' and health professionals' experience of telehealth; a quantitative survey of patients' interest in using telehealth; and review of existing models of chronic condition management and evidence-based treatment guidelines. Based on these evidence strands, a model was developed and then refined at a stakeholder workshop. Then a telehealth intervention ('Healthlines') was designed by incorporating strategies to address each of the model components. The model also provided a framework for evaluation of this intervention within parallel randomised controlled trials in the two exemplar conditions, and the accompanying process evaluations and economic evaluations. Primary care. The TElehealth in CHronic Disease (TECH) model proposes that attention to four components will offer interventions the best chance of success: (1) engagement of patients and health professionals, (2) effective chronic disease management (including subcomponents of self-management, optimisation of treatment, care coordination), (3) partnership between providers and (4) patient, social and health system context. Key intended outcomes are improved health, access to care, patient experience and cost-effective care. A conceptual model has been developed based on multiple sources of evidence which articulates how telehealth may best provide benefits for patients with chronic health conditions. It can be used to structure the design and evaluation of telehealth programmes which aim to be acceptable to patients and providers, and cost-effective. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  9. Rule based artificial intelligence expert system for determination of upper extremity impairment rating.

    PubMed

    Lim, I; Walkup, R K; Vannier, M W

    1993-04-01

    Quantitative evaluation of upper extremity impairment, a percentage rating most often determined using a rule-based procedure, has been implemented on a personal computer using an artificial-intelligence rule-based expert system (AI system). In this study, the rules given in Chapter 3 of the AMA Guides to the Evaluation of Permanent Impairment (Third Edition) were used to develop such an AI system for the Apple Macintosh. The program applies the rules from the Guides in a consistent and systematic fashion. It is faster and less error-prone than the manual method, and the results have a higher degree of precision, since intermediate values are not truncated.
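
    The Guides combine multiple impairments so that a total can never exceed 100%; the rule underlying the published combined-values chart is commonly written C = A + B(1 - A), applied largest-first. The sketch below assumes that closed form stands in for the chart lookup; whether the program used the chart or the formula internally is not stated.

    ```python
    def combine_impairments(ratings):
        """Combine impairment percentages via C = A + B*(1 - A), largest
        rating first, so the combined value stays below 100%."""
        total = 0.0
        for r in sorted(ratings, reverse=True):
            total = total + (r / 100.0) * (1.0 - total)
        return round(total * 100.0)

    print(combine_impairments([30, 20, 10]))   # 50, not the raw sum of 60
    ```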

  10. Evaluating Education and Science at the KSC Visitor Complex

    NASA Technical Reports Server (NTRS)

    Erickson, Lance K.

    2002-01-01

    As part of a two-year NASA-ASEE project, a preliminary evaluation and subsequent recommendations were developed to improve the education and science content of the Kennedy Space Center Visitor Complex exhibits. Recommendations for improvements in those exhibits were based on qualitative descriptions of the exhibits, on comparisons to similar exhibit collections, and on available evaluation processes. Because of the subjective nature of measuring content in a broad group of exhibits and displays, emphasis is placed on employing a survey format for a follow-on, more quantitative evaluation. The use of an external organization for this evaluation development is also recommended to reduce bias and increase validity.

  11. Evaluating Education and Science at the KSC Visitor Complex

    NASA Technical Reports Server (NTRS)

    Erickson, Lance K.

    2001-01-01

    As part of a two-year NASA-ASEE project, a preliminary evaluation and subsequent recommendations were developed to improve the education and science content of the Kennedy Space Center Visitor Complex exhibits. Recommendations for improvements in those exhibits were based on qualitative descriptions of the exhibits, on comparisons to similar exhibit collections, and on available evaluation processes. Because of the subjective nature of measuring content in a broad group of exhibits and displays, emphasis is placed on employing a survey format for a follow-on, more quantitative evaluation. The use of an external organization for this evaluation development is also recommended to reduce bias and increase validity.

  12. Process service quality evaluation based on Dempster-Shafer theory and support vector machine.

    PubMed

    Pei, Feng-Que; Li, Dong-Bo; Tong, Yi-Fei; He, Fei

    2017-01-01

    Human involvement influences traditional service quality evaluations, leading to low accuracy, poor reliability, and weak predictability. This paper proposes a method that employs support vector machines (SVMs) and Dempster-Shafer evidence theory to evaluate the service quality of a production process, handling a high number of input features with a small sampling data set; the method is called SVMs-DS. Features that can affect production quality are extracted by a large number of sensors, and preprocessing steps such as feature simplification and normalization are reduced. Based on three individual SVM models, basic probability assignments (BPAs) are constructed, which support the evaluation in both qualitative and quantitative ways. The process service quality evaluation results are validated by the Dempster rules; the decision threshold for resolving conflicting results is generated from the three SVM models. A case study is presented to demonstrate the effectiveness of the SVMs-DS method.
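
    Dempster's rule combines the BPAs from the individual SVM models by intersecting focal elements and renormalizing away the conflicting mass; a minimal sketch with hypothetical two-hypothesis BPAs follows (it assumes total conflict stays below 1).

    ```python
    from itertools import product

    def dempster_combine(m1, m2):
        """Dempster's rule for two BPAs over one frame of discernment;
        keys are frozensets of hypotheses and each BPA sums to 1."""
        combined, conflict = {}, 0.0
        for (a, pa), (b, pb) in product(m1.items(), m2.items()):
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + pa * pb
            else:
                conflict += pa * pb               # mass on empty intersection
        return {k: v / (1.0 - conflict) for k, v in combined.items()}

    good, bad = frozenset({'good'}), frozenset({'bad'})
    either = good | bad                           # ignorance mass
    m_svm1 = {good: 0.7, bad: 0.2, either: 0.1}   # hypothetical BPA, model 1
    m_svm2 = {good: 0.6, bad: 0.3, either: 0.1}   # hypothetical BPA, model 2
    print(dempster_combine(m_svm1, m_svm2))       # belief concentrates on 'good'
    ```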

  13. Using Mixed Methods to Evaluate a Community Intervention for Sexual Assault Survivors: A Methodological Tale.

    PubMed

    Campbell, Rebecca; Patterson, Debra; Bybee, Deborah

    2011-03-01

    This article reviews current epistemological and design issues in the mixed methods literature and then examines the application of one specific design, a sequential explanatory mixed methods design, in an evaluation of a community-based intervention to improve postassault care for sexual assault survivors. Guided by a pragmatist epistemological framework, this study collected quantitative and qualitative data to understand how the implementation of a Sexual Assault Nurse Examiner (SANE) program affected prosecution rates of adult sexual assault cases in a large midwestern community. Quantitative results indicated that the program was successful in affecting legal systems change and the qualitative data revealed the mediating mechanisms of the intervention's effectiveness. Challenges of implementing this design are discussed, including epistemological and practical difficulties that developed from blending methodologies into a single project. © The Author(s) 2011.

  14. A quantitative measure of the electrical activity of human rod photoreceptors using electroretinography.

    PubMed

    Hood, D C; Birch, D G

    1990-10-01

    An electrical potential recorded from the cornea, the a-wave of the ERG, is evaluated as a measure of human photoreceptor activity by comparing its behavior to a model derived from in vitro recordings from rod photoreceptors. The leading edge of the ERG exhibits both the linear and nonlinear behavior predicted by this model. The capability for recording the electrical activity of human photoreceptors in vivo opens new avenues for assessing normal and abnormal receptor activity in humans. Furthermore, the quantitative model of the receptor response can be used to isolate the inner retinal contribution, Granit's PII, to the gross ERG. Based on this analysis, the practice of using the trough-to-peak amplitude of the b-wave as a proxy for the amplitude of the inner nuclear layer activity is evaluated.
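
    Leading-edge receptor models in this family are often written as R(I, t) = Rmax(1 - exp(-I·S·(t - td)^2)); the sketch below evaluates that form with illustrative parameters, which should not be read as the paper's fitted values.

    ```python
    import numpy as np

    def rod_response(I, t, R_max=350.0, S=4.0, t_d=0.003):
        """Leading-edge rod model R = R_max*(1 - exp(-I*S*(t - t_d)^2)) for
        t > t_d; I is flash energy, S a sensitivity scale, t_d a short delay
        in seconds. All parameter values here are illustrative."""
        tt = np.maximum(t - t_d, 0.0)
        return R_max * (1.0 - np.exp(-I * S * tt ** 2))

    t = np.linspace(0.0, 0.02, 5)
    print(rod_response(100.0, t))   # rises toward R_max for a bright flash
    ```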

  15. The Investigation on Strain Strengthening Induced Martensitic Phase Transformation of Austenitic Stainless Steel: A Fundamental Research for the Quality Evaluation of Strain Strengthened Pressure Vessel

    NASA Astrophysics Data System (ADS)

    Li, Bo; Cai Ren, Fa; Tang, Xiao Ying

    2018-03-01

    The manufacture of pressure vessels using austenitic stainless steel strain-strengthening technology has become an important technical means for reducing the weight of cryogenic pressure vessels. As the strength of austenitic stainless steel is increased, strain can induce a martensitic phase transformation in the austenite phase. There is a quantitative relationship between the amount of transformed martensitic phase and the basic mechanical properties; the martensitic phase content can therefore be obtained by measurement, and the mechanical properties and safety performance can then be evaluated and calculated. On this basis, the quantitative relationship between strain hardening and deformation-induced martensite content is studied in this paper, and the mechanism of deformation-induced martensitic transformation in austenitic stainless steel is examined in detail.

  16. Evaluation of the capabilities of satellite imagery for monitoring regional air pollution episodes

    NASA Technical Reports Server (NTRS)

    Barnes, J. C.; Bowley, C. J.; Burke, H. H. K.

    1979-01-01

    A comparative analysis of satellite visible-channel imagery and ground-based aerosol measurements is carried out for three cases representing significant pollution episodes, based on low surface visibility and high sulfate levels. The feasibility of detecting pollution episodes from space is also investigated using a simulation model, and the model results are compared with quantitative information derived from digitized satellite data. The results show that when sulfate levels are at or above 30 micrograms/cu m, a haze pattern that correlates closely with the area of reported low surface visibilities and high sulfate levels can be detected in satellite visible-channel imagery. The model simulation demonstrates the potential of the satellite to monitor the magnitude and areal extent of pollution episodes. Quantitative information on total aerosol amount derived from the digitized satellite data using the atmospheric radiative-transfer model agrees well with the results obtained from the ground-based measurements.

  17. Study On The Application Of CBERS-02B To Quantitative Soil Erosion Monitoring

    NASA Astrophysics Data System (ADS)

    Shi, Mingchang; Xu, Jing; Wang, Lei; Wang, Xiaoyun; Mu, Jing

    2010-10-01

    Currently, reducing soil erosion is an important prerequisite for achieving ecological security. Since real-time, quantitative evaluation of regional soil erosion plays a significant role in reducing it, soil erosion models are increasingly widely used. Based on the RUSLE model, this paper carries out quantitative soil erosion monitoring in the Xi River Basin and its surrounding areas using CBERS-02B CCD, DEM, TRMM and other data, and validates the monitoring results against remote sensing survey results from 2005. The monitoring results show that in 2009 the total amount of soil erosion in the study area was 1.94×10⁶ t, the eroded area was 2055.2 km² (54.06% of the total area), and the average soil erosion modulus was 509.7 t km⁻² a⁻¹. As a case study using CBERS-02B data for quantitative soil erosion monitoring, this work provides experience with the application of CBERS-02B data to quantitative soil erosion monitoring and to local soil erosion management.
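
    RUSLE itself is a per-cell product of factor rasters, A = R·K·LS·C·P; a minimal raster sketch with hypothetical 2x2 grids standing in for the TRMM-, soil-, DEM- and CBERS-derived layers follows.

    ```python
    import numpy as np

    def rusle(R, K, LS, C, P):
        """Cell-by-cell RUSLE soil loss A = R*K*LS*C*P for co-registered
        factor rasters (units depend on how the factors are scaled)."""
        return R * K * LS * C * P

    # Hypothetical 2x2 factor grids (illustrative values only):
    R  = np.array([[4200., 4300.], [4150., 4250.]])   # rainfall erosivity
    K  = np.full((2, 2), 0.03)                        # soil erodibility
    LS = np.array([[1.2, 3.5], [0.8, 2.1]])           # slope length/steepness
    C  = np.array([[0.05, 0.30], [0.02, 0.15]])       # cover management
    P  = np.ones((2, 2))                              # support practices
    A  = rusle(R, K, LS, C, P)
    print(A.round(1), "mean:", A.mean().round(1))
    ```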

  18. Single-case synthesis tools II: Comparing quantitative outcome measures.

    PubMed

    Zimmerman, Kathleen N; Pustejovsky, James E; Ledford, Jennifer R; Barton, Erin E; Severini, Katherine E; Lloyd, Blair P

    2018-03-07

    Varying methods for evaluating the outcomes of single-case research designs (SCDs) are currently used in reviews and meta-analyses of interventions. Quantitative effect size measures are often presented alongside visual analysis conclusions. Six measures across two classes, overlap measures (percentage of non-overlapping data, improvement rate difference, and Tau) and parametric within-case effect sizes (standardized mean difference and log response ratio [increasing and decreasing]), were compared to determine whether the choice of synthesis method within and across classes affects conclusions regarding effectiveness. The effectiveness of sensory-based interventions (SBI), a commonly used class of treatments for young children, was evaluated. Separately from evaluations of rigor and quality, the authors evaluated behavior change between baseline and SBI conditions. SBI were unlikely to result in positive behavior change across all measures except IRD. However, subgroup analyses produced variable conclusions, indicating that the choice of measures for SCD meta-analyses can affect conclusions. Suggestions for using the log response ratio in SCD meta-analyses and considerations for understanding variability in SCD meta-analysis conclusions are discussed. Copyright © 2018 Elsevier Ltd. All rights reserved.
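
    Two of the compared measures are simple to state directly: the log response ratio is the log of the ratio of condition means, and percentage of non-overlapping data counts intervention points beyond the most extreme baseline point. A minimal sketch with hypothetical session data, assuming higher values are the desired direction:

    ```python
    import numpy as np

    def log_response_ratio(baseline, treatment):
        """Log response ratio (increasing): ln(mean(treatment)/mean(baseline))."""
        return np.log(np.mean(treatment) / np.mean(baseline))

    def percent_nonoverlap(baseline, treatment):
        """PND: share of treatment points above the highest baseline point."""
        return 100.0 * np.mean(np.asarray(treatment) > max(baseline))

    A = [2, 3, 4, 3]        # hypothetical baseline sessions
    B = [5, 6, 4, 7, 6]     # hypothetical intervention sessions
    print(round(log_response_ratio(A, B), 2), percent_nonoverlap(A, B))
    # -> 0.62 80.0
    ```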

  19. Evaluation of a dual-probe real time PCR system for detection of mandarin in commercial orange juice.

    PubMed

    Pardo, Miguel Angel

    2015-04-01

    A dual-probe real-time PCR assay, based on the simultaneous detection of two TaqMan® probes, was evaluated for the detection of mandarin in orange juice. A single conserved polymorphism, located at position 314 of an intron of the chloroplast trnL gene, was confirmed by sequencing in 30 mandarin cultivars, 28 orange cultivars, and 13 hybrids. The assay was also successfully evaluated in a blind trial analysing 60 samples from different industrial processes in different countries around the world. The detection limit of the assay was established at 1% mandarin in processed orange juice, with 100% precision. The quantitative application of the assay to citrus mixtures was also investigated, showing that the number of chloroplast DNA copies is too variable for use in quantitative analysis. This assay can be employed as a routine method to control accidental mixing during industrial processing and to deter intentional fraud. Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. NIST Efforts to Quality-Assure Gunpowder Measurements

    NASA Technical Reports Server (NTRS)

    MacCrehan, William A.; Reardon, Michelle R.

    2000-01-01

    In the past few years, the National Institute of Standards and Technology (NIST) has been promoting the idea of quantitatively determining the additives in smokeless gunpowder using micellar capillary electrophoresis as a means of investigating the criminal use of handguns and pipe bombs. As a part of this effort, we have evaluated both supercritical fluid and ultrasonic solvent extractions for the quantitative recovery of nitroglycerin (NG), diphenylamine (DPA), N-nitrosodiphenylamine (NnDPA), and ethyl centralite (EC) from gunpowder. Recoveries were evaluated by repeat-extraction and matrix-spiking experiments. The final extraction protocol provides greater than 95 percent recovery. To help other researchers validate their own analytical methods for additive determinations, NIST is exploring the development of a standard reference material, Additives in Smokeless Gunpowder. The evaluated method is being applied to two double-base (NG-containing) powders, one stabilized with diphenylamine and the other with ethyl centralite. As part of this reference material development effort, we are conducting an interlaboratory comparison exercise among the forensic and military gunpowder measurement community.

  1. Coatomer subunit beta 2 (COPB2), identified by label-free quantitative proteomics, regulates cell proliferation and apoptosis in human prostate carcinoma cells.

    PubMed

    Mi, Yuanyuan; Sun, Chuanyu; Wei, Bingbing; Sun, Feiyu; Guo, Yijun; Hu, Qingfeng; Ding, Weihong; Zhu, Lijie; Xia, Guowei

    2018-01-01

    Label-free quantitative proteomics has broad applications in the identification of differentially expressed proteins. Here, we applied this method to identify differentially expressed proteins (such as coatomer subunit beta 2 [COPB2]) and evaluated the functions and molecular mechanisms of these proteins in prostate cancer (PCA) cell proliferation. Proteins extracted from surgically resected PCA tissues and adjacent tissues of 3 patients were analyzed by label-free quantitative proteomics. The target protein was confirmed by bioinformatics and GEO dataset analyses. To investigate the role of the target protein in PCA, we used lentivirus-mediated small-interfering RNA (siRNA) to knock down protein expression in the prostate carcinoma cell line CWR22RV1 and assessed gene and protein expression by reverse transcription quantitative polymerase chain reaction (RT-qPCR) and western blotting. CCK8 and colony formation assays were conducted to evaluate cell proliferation. Cell cycle distributions and apoptosis were assayed by flow cytometry. We selected the differentially expressed protein COPB2 as our target protein based on the results of label-free quantitative proteomics. High expression of COPB2 was found in PCA tissue and was related to poor overall survival based on a public dataset. Cell proliferation was significantly inhibited in COPB2-knockdown CWR22RV1 cells, as demonstrated by CCK8 and colony formation assays. Additionally, the apoptosis rate and the percentage of cells in the G1 phase were increased in COPB2-knockdown cells compared with those in control cells. CDK2, CDK4, and cyclin D1 were downregulated, whereas p21(Waf1/Cip1) and p27(Kip1) were upregulated, affecting the cell cycle signaling pathway. COPB2 significantly promoted CWR22RV1 cell proliferation through the cell cycle signaling pathway. Thus, silencing of COPB2 may have therapeutic applications in PCA. Copyright © 2017 Elsevier Inc. All rights reserved.
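
    The RT-qPCR readout in studies like this is commonly converted to relative expression with the 2^-ΔΔCt method. The abstract does not state which quantification model was used, so the sketch below is an assumption, and the Ct values and the GAPDH reference gene are hypothetical.

    ```python
    def fold_change_ddct(ct_target_test, ct_ref_test, ct_target_ctrl, ct_ref_ctrl):
        """Relative expression via 2^-ddCt: dCt = Ct(target) - Ct(reference)
        within each sample; ddCt = dCt(test) - dCt(control)."""
        ddct = (ct_target_test - ct_ref_test) - (ct_target_ctrl - ct_ref_ctrl)
        return 2.0 ** (-ddct)

    # Hypothetical Ct values: COPB2 vs. a GAPDH reference, knockdown vs. control
    print(fold_change_ddct(26.5, 18.0, 24.0, 18.1))  # ~0.16, i.e. ~84% knockdown
    ```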

  2. 76 FR 72474 - Self-Regulatory Organizations; NYSE Arca, Inc.; Notice of Filing of Proposed Rule Change To List...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-23

    ... has developed a proprietary SectorSAM(TM) quantitative research and evaluation process that forecasts... and short portfolios as dictated by its proprietary SectorSAM quantitative research and evaluation... a proprietary quantitative analysis, to forecast each sector's excess return within a specific time...

  3. Diagnostic performance of semi-quantitative and quantitative stress CMR perfusion analysis: a meta-analysis.

    PubMed

    van Dijk, R; van Assen, M; Vliegenthart, R; de Bock, G H; van der Harst, P; Oudkerk, M

    2017-11-27

    Stress cardiovascular magnetic resonance (CMR) perfusion imaging is a promising modality for the evaluation of coronary artery disease (CAD) due to its high spatial resolution and absence of radiation. Semi-quantitative and quantitative analyses of CMR perfusion are based on signal-intensity curves produced during the first pass of gadolinium contrast. Multiple semi-quantitative and quantitative parameters have been introduced. Diagnostic performance of these parameters varies extensively among studies, and standardized protocols are lacking. This study aims to determine the diagnostic accuracy of semi-quantitative and quantitative CMR perfusion parameters, compared to multiple reference standards. PubMed, Web of Science, and Embase were systematically searched using predefined criteria (3272 articles). A check for duplicates was performed (1967 articles). Eligibility and relevance of the articles were determined by two reviewers using predefined criteria. The primary data extraction was performed independently by two researchers with the use of a predefined template. Differences in extracted data were resolved by discussion between the two researchers. The quality of the included studies was assessed using the Quality Assessment of Diagnostic Accuracy Studies tool (QUADAS-2). True positives, false positives, true negatives, and false negatives were extracted or calculated from the articles. The principal summary measures used to assess diagnostic accuracy were sensitivity, specificity, and area under the receiver operating characteristic curve (AUC). Data were pooled according to analysis territory, reference standard, and perfusion parameter. Twenty-two articles were eligible based on the predefined study eligibility criteria. The pooled diagnostic accuracy for segment-, territory- and patient-based analyses showed good diagnostic performance, with sensitivity of 0.88, 0.82, and 0.83, specificity of 0.72, 0.83, and 0.76, and AUC of 0.90, 0.84, and 0.87, respectively. In the per-territory analysis, our results show similar diagnostic accuracy when comparing anatomical (AUC 0.86 [0.83-0.89]) and functional reference standards (AUC 0.88 [0.84-0.90]). Only the per-territory analysis sensitivity did not show significant heterogeneity. None of the groups showed signs of publication bias. The clinical value of semi-quantitative and quantitative CMR perfusion analysis remains uncertain due to extensive inter-study heterogeneity and large differences in CMR perfusion acquisition protocols, reference standards, and methods of assessment of myocardial perfusion parameters. For widespread implementation, standardization of CMR perfusion techniques is essential. Registration: CRD42016040176.
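
    The per-study sensitivity and specificity pooled above come directly from 2x2 confusion counts. A minimal sketch of that extraction step, with hypothetical counts; the pooling itself (e.g., bivariate random-effects modeling) is not shown.

    ```python
    def diagnostic_accuracy(tp, fp, tn, fn):
        """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
        return tp / (tp + fn), tn / (tn + fp)

    # Hypothetical 2x2 counts extracted from a single study
    sens, spec = diagnostic_accuracy(tp=88, fp=28, tn=72, fn=12)
    print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")  # 0.88, 0.72
    ```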

  4. The Numbers Tell It All: Students Don't Like Numbers!

    PubMed Central

    Uttl, Bob; White, Carmela A.; Morin, Alain

    2013-01-01

    Undergraduate students' interest in taking quantitative vs. non-quantitative courses has received limited attention, even though it has important consequences for higher education. Previous studies have collected course interest ratings at the end of the courses as part of student evaluation of teaching (SET) ratings, which may confound prior interest in taking these courses with students' actual experience in taking them. This study is the first to examine undergraduate students' interest in quantitative vs. non-quantitative courses in their first year of studies, before they have taken any quantitative courses. Three hundred and forty students were presented with descriptions of 44 psychology courses and asked to rate their interest in taking each course. Student interest in taking quantitative courses was very low; the mean interest in statistics courses was nearly 6 SDs below the mean interest in non-quantitative courses. Moreover, women were less interested in taking quantitative courses than men. Our findings have several far-reaching implications. First, evaluating professors teaching quantitative vs. non-quantitative courses against the same SET standard may be inappropriate. Second, if the same SET standard is used for the evaluation of faculty teaching quantitative vs. non-quantitative courses, faculty are likely to teach to SETs rather than focus on student learning. Third, universities interested primarily in student satisfaction may want to expunge quantitative courses from their curricula. In contrast, universities interested in student learning may want to abandon SETs as a primary measure of faculty teaching effectiveness. Fourth, undergraduate students who are not interested in taking quantitative courses are unlikely to pursue graduate studies in quantitative psychology and unlikely to be able to competently analyze data independently. PMID:24358284

  5. Quantitative breast tissue characterization using grating-based x-ray phase-contrast imaging

    NASA Astrophysics Data System (ADS)

    Willner, M.; Herzen, J.; Grandl, S.; Auweter, S.; Mayr, D.; Hipp, A.; Chabior, M.; Sarapata, A.; Achterhold, K.; Zanette, I.; Weitkamp, T.; Sztrókay, A.; Hellerhoff, K.; Reiser, M.; Pfeiffer, F.

    2014-04-01

    X-ray phase-contrast imaging has received growing interest in recent years due to its high capability in visualizing soft tissue. Breast imaging became the focus of particular attention as it is considered the most promising candidate for a first clinical application of this contrast modality. In this study, we investigate quantitative breast tissue characterization using grating-based phase-contrast computed tomography (CT) at conventional polychromatic x-ray sources. Different breast specimens were scanned at a laboratory phase-contrast imaging setup and correlated to histopathology. Ascertained tumor types include phyllodes tumor, fibroadenoma and infiltrating lobular carcinoma. The identified tissue types, comprising adipose, fibroglandular and tumor tissue, were analyzed in terms of phase-contrast Hounsfield units and are compared to high-quality, high-resolution data obtained with monochromatic synchrotron radiation, as well as to calculated values based on tabulated tissue properties. The results give a good impression of the method’s prospects and limitations for potential tumor detection and the associated demands on such a phase-contrast breast CT system. Furthermore, the evaluated quantitative tissue values serve as a reference for simulations and the design of dedicated phantoms for phase-contrast mammography.
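
    Phase-contrast Hounsfield units are commonly defined by analogy with conventional CT numbers, scaling the refractive index decrement δ against that of water. A minimal sketch under that assumption; the exact definition used in the study and the δ values below are illustrative, not taken from the paper.

    ```python
    def phase_contrast_hu(delta_tissue, delta_water):
        """HU_p = 1000 * (delta_tissue - delta_water) / delta_water, by analogy
        with conventional CT numbers based on attenuation coefficients."""
        return 1000.0 * (delta_tissue - delta_water) / delta_water

    # Hypothetical refractive index decrements (order of magnitude ~1e-7)
    delta_water = 2.6e-7
    delta_adipose = 2.4e-7
    print(phase_contrast_hu(delta_adipose, delta_water))  # about -77
    ```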

  6. Smartphone-Based Dual-Modality Imaging System for Quantitative Detection of Color or Fluorescent Lateral Flow Immunochromatographic Strips

    NASA Astrophysics Data System (ADS)

    Hou, Yafei; Wang, Kan; Xiao, Kun; Qin, Weijian; Lu, Wenting; Tao, Wei; Cui, Daxiang

    2017-04-01

    Nowadays, lateral flow immunochromatographic assays are increasingly popular as a diagnostic tool for point-of-care (POC) testing because of their simplicity, specificity, and sensitivity. Hence, quantitative detection and broadly applicable readout methods are urgently needed in medical examination. In this study, a smartphone-based dual-modality imaging system was developed for quantitative detection of color or fluorescent lateral flow test strips, which can be operated anywhere at any time. In this system, the white and ultraviolet (UV) illumination of the optical device was designed to be switchable between the two strip types, and the Sobel operator was used in the software to improve recognition of the test area against the background using boundary information. Moreover, extracting all components of the RGB format (red, green, and blue) from color strips, or only the red component from fluorescent strips, markedly improves signal intensity and sensitivity. Fifty samples were used to evaluate the accuracy of this system, and detection limits were calculated separately for human chorionic gonadotropin (HCG) and carcinoembryonic antigen (CEA). The results indicated that the smartphone-controlled dual-modality imaging system could support various POC diagnoses, making it a promising technology for next-generation portable systems in the near future.
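
    The readout pipeline described above (channel extraction plus Sobel edge detection to locate the test line) can be sketched as follows. The function name, image dimensions, and processing details are illustrative assumptions, not the authors' implementation.

    ```python
    import numpy as np
    from scipy import ndimage

    def strip_signal(rgb_image, fluorescent=False):
        """Extract the analyte signal from a lateral-flow strip photo: use the
        mean of the RGB components for color strips, or the red component only
        for fluorescent strips; a Sobel gradient highlights test-line edges."""
        img = rgb_image.astype(float)
        channel = img[..., 0] if fluorescent else img.mean(axis=2)
        edges = ndimage.sobel(channel, axis=0)  # gradient along the strip axis
        profile = channel.mean(axis=1)          # intensity profile along the strip
        return edges, profile

    # Hypothetical 200x60-pixel photo of a strip
    photo = np.random.randint(0, 256, size=(200, 60, 3), dtype=np.uint8)
    edges, profile = strip_signal(photo, fluorescent=True)
    print(profile.shape)  # (200,) -- used to locate and quantify the test line
    ```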

  7. Steps in the Right Direction, against the Odds, an Evaluation of a Community-Based Programme Aiming to Reduce Inactivity and Improve Health and Morale in Overweight and Obese School-Age Children

    ERIC Educational Resources Information Center

    Fraser, Claire; Lewis, Kiara; Manby, Martin

    2012-01-01

    The study describes an evaluation of a 48-week physical activity and nutritional education programme for overweight/obese school-age children using quantitative and qualitative methods. The majority of participants were obese or severely obese when enrolled, and while some improvements in body mass index, self-esteem and engagement in a range of…

  8. Establishment of a Quantitative Medical Technology Evaluation System and Indicators within Medical Institutions.

    PubMed

    Wu, Suo-Wei; Chen, Tong; Pan, Qi; Wei, Liang-Yu; Wang, Qin; Li, Chao; Song, Jing-Chen; Luo, Ji

    2018-06-05

    The development and application of medical technologies reflect the medical quality and clinical capacity of a hospital. They are also an effective approach to upgrading medical services and core competitiveness among medical institutions. This study aimed to build a quantitative medical technology evaluation system, through a questionnaire survey within medical institutions, to assess medical technologies more objectively and accurately, promote the management of medical technology quality, and ensure the medical safety of various operations in hospitals. A two-level quantitative medical technology evaluation system was built through a two-round questionnaire survey of chosen experts. The Delphi method was applied in identifying the structure of the evaluation system and its indicators. The experts' judgments on the indicators were used to build comparison matrices, from which the weight coefficients, maximum eigenvalue (λmax), consistency index (CI), and random consistency ratio (CR) were obtained. The results were verified through consistency tests, and the index weight coefficient of each indicator was calculated through the analytic hierarchy process (AHP). Twenty-six experts from different medical fields were involved in the questionnaire survey, 25 of whom responded to both rounds. Altogether, 4 primary indicators (safety, effectiveness, innovativeness, and benefits), as well as 13 secondary indicators, were included in the evaluation system. Matrices were built to compute the λmax, CI, and CR for each expert in the survey; the index weight coefficients of the primary indicators were 0.33, 0.28, 0.27, and 0.12, respectively, and the index weight coefficients of the secondary indicators were calculated accordingly. As the two-round questionnaire survey and statistical analysis were performed and the credibility of the results was verified through consistency tests, the study established a quantitative medical technology evaluation system model and assessment indicators within medical institutions based on the Delphi method and AHP. Further verification, adjustment, and optimization of the system and indicators will be performed in follow-up studies.
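
    The AHP step described above derives indicator weights from each expert's pairwise comparison matrix and checks judgment consistency via CI = (λmax - n)/(n - 1) and CR = CI/RI. A minimal sketch on a hypothetical 4×4 judgment matrix; the entries below are illustrative and are not the survey's actual judgments.

    ```python
    import numpy as np

    RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}  # Saaty's random index

    def ahp_weights(M):
        """Weights from the principal eigenvector of a pairwise comparison
        matrix M; consistency: CI = (lambda_max - n) / (n - 1), CR = CI / RI."""
        n = M.shape[0]
        eigvals, eigvecs = np.linalg.eig(M)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()
        lam_max = eigvals[k].real
        ci = (lam_max - n) / (n - 1)
        return w, ci, ci / RI[n]

    # Hypothetical judgments for safety/effectiveness/innovativeness/benefits
    M = np.array([[1.0, 2.0, 2.0, 3.0],
                  [0.5, 1.0, 1.0, 2.0],
                  [0.5, 1.0, 1.0, 3.0],
                  [1/3, 0.5, 1/3, 1.0]])
    w, ci, cr = ahp_weights(M)
    print(w.round(2), f"CR={cr:.3f}")  # CR < 0.10 passes the consistency test
    ```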

  9. Evaluation of a rule-based compositing technique for Landsat-5 TM and Landsat-7 ETM+ images

    NASA Astrophysics Data System (ADS)

    Lück, W.; van Niekerk, A.

    2016-05-01

    Image compositing is a multi-objective optimization process. Its goal is to produce a seamless, cloud-free and artefact-free artificial image. This is achieved by aggregating image observations and by replacing poor and cloudy data with good observations from imagery acquired within the timeframe of interest. The compositing process aims to minimise the visual artefacts which could result from differing radiometric properties caused by atmospheric conditions, phenologic patterns and land cover changes. It has the following requirements: (1) the image composite must be cloud-free, which requires the detection of clouds and shadows, and (2) the image composite must be seamless, minimising artefacts and visible seams between input images. This study proposes a new rule-based compositing technique (RBC) that combines the strengths of several existing methods. A quantitative and qualitative evaluation of the RBC technique is made by comparing it to the maximum NDVI (MaxNDVI), minimum red (MinRed) and maximum ratio (MaxRatio) compositing techniques. A total of 174 Landsat TM and ETM+ images, covering three study sites and three different timeframes for each site, are used in the evaluation. A new set of quantitative and qualitative evaluation techniques for measuring compositing quality was developed and showed that the RBC technique outperformed all other techniques, with MaxRatio, MaxNDVI and MinRed following in order of performance from best to worst.
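
    The MaxNDVI baseline above is the simplest of these rules: for each pixel, keep the observation with the highest NDVI across the stack of candidate scenes. A minimal numpy sketch under the assumption of co-registered reflectance stacks; the arrays are synthetic.

    ```python
    import numpy as np

    def max_ndvi_composite(red_stack, nir_stack):
        """Pixel-wise MaxNDVI compositing: for each pixel, keep the scene whose
        NDVI = (NIR - red) / (NIR + red) is largest across the stack."""
        ndvi = (nir_stack - red_stack) / (nir_stack + red_stack + 1e-9)
        best = np.argmax(ndvi, axis=0)         # winning scene index per pixel
        rows, cols = np.indices(best.shape)
        return red_stack[best, rows, cols], nir_stack[best, rows, cols]

    # Synthetic stack: 3 co-registered scenes of 4x4 pixels, reflectance in [0, 1]
    rng = np.random.default_rng(0)
    red = rng.uniform(0.02, 0.20, size=(3, 4, 4))
    nir = rng.uniform(0.20, 0.50, size=(3, 4, 4))
    red_c, nir_c = max_ndvi_composite(red, nir)
    print(red_c.shape)  # (4, 4) composited red band
    ```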

  10. Evaluation of breeding strategies for polledness in dairy cattle using a newly developed simulation framework for quantitative and Mendelian traits.

    PubMed

    Scheper, Carsten; Wensch-Dorendorf, Monika; Yin, Tong; Dressel, Holger; Swalve, Herrmann; König, Sven

    2016-06-29

    Intensified selection of polled individuals has recently gained importance in predominantly horned dairy cattle breeds as an alternative to routine dehorning. The status quo of the current polled breeding pool, consisting of genetically closely related artificial insemination sires with lower breeding values for performance traits, raises questions regarding the effects of intensified selection based on this founder pool. We developed a stochastic simulation framework that combines the stochastic simulation software QMSim and a self-designed R program named QUALsim that acts as an external extension. Two traits were simulated in a dairy cattle population for 25 generations: one quantitative (QMSim) and one qualitative trait with Mendelian inheritance (i.e., polledness; QUALsim). The assignment scheme for qualitative-trait genotypes initiated realistic initial breeding situations regarding allele frequencies, true breeding values for the quantitative trait, and genetic relatedness. Intensified selection for polled cattle was achieved using an approach that weights estimated breeding values from the animal best linear unbiased prediction (BLUP) model for the quantitative trait, depending on genotypes or phenotypes for the polled trait, with a user-defined weighting factor. Selection response for the polled trait was highest in the selection scheme based on genotypes. Selection based on phenotypes led to significantly lower allele frequencies for polled. The male selection path played a significantly greater role in the fast dissemination of polled alleles compared to female selection strategies. Fixation of the polled allele implies selection based on polled genotypes among males. In comparison to a base breeding scenario that does not take polledness into account, intensive selection for polled substantially reduced genetic gain for the quantitative trait after 25 generations. Reducing selection intensity for polled males, while maintaining strong selection intensity among females, simultaneously decreased losses in genetic gain and achieved a final allele frequency of 0.93 for polled. A fast transition to a completely polled population through intensified selection conflicted with the preservation of high genetic gain for the quantitative trait. Selection on male polled genotypes with moderate weighting, and selection on female polled phenotypes with high weighting, could be a suitable compromise regarding all important breeding aspects.
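
    The weighting idea described above can be illustrated as a simple selection index in which the EBV for the quantitative trait is augmented by a user-defined factor times a polled score. The exact weighting inside the animal BLUP model is not given in the abstract, so the scoring scheme, weight, and candidate values below are hypothetical.

    ```python
    def selection_index(ebv, polled_score, weight):
        """EBV for the quantitative trait plus a user-defined weight times a
        polled score (e.g., genotype-based: 2 = PP, 1 = Pp, 0 = pp;
        phenotype-based: 1 = polled, 0 = horned)."""
        return ebv + weight * polled_score

    # Hypothetical candidates as (EBV, polled genotype score) pairs
    candidates = [(120.0, 0), (95.0, 2), (110.0, 1)]
    ranked = sorted(candidates,
                    key=lambda c: selection_index(c[0], c[1], weight=20.0),
                    reverse=True)
    print(ranked)  # a larger weight shifts the ranking toward polled carriers
    ```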

  11. Evaluation of Aution Max AX-4030 and 9UB Uriflet, 10PA Aution Sticks urine dipsticks in the automated urine test strip analysis.

    PubMed

    Rota, Cristina; Biondi, Marco; Trenti, Tommaso

    2011-09-26

    Aution Max AX-4030, a test strip analyzer recently introduced to the market, represents an upgrade of the Aution Max AX-4280 widely employed for urinalysis. This new instrument model can accommodate two different test strips at the same time. In the present study, the two instruments were compared, together with the Uriflet 9UB and the recently introduced Aution Sticks 10PA urine strips, the latter featuring an additional test area for the measurement of urinary creatinine. Imprecision and correlation between instruments and strips were evaluated for chemical-physical parameters. Accuracy was evaluated for protein, glucose, and creatinine by comparing the semi-quantitative results to those obtained by quantitative methods. The well-known interference of high ascorbic acid levels with urine glucose test strip determination was evaluated; the influence of ascorbic acid on protein and creatinine determination was also assessed. The two instruments demonstrated comparable performance: precision and correlation between instruments and strips were always good for the chemical-physical parameters. Furthermore, accuracy was always very good: semi-quantitative protein and glucose measurements were highly correlated with those obtained by quantitative methods. Moreover, the semi-quantitative measurements of creatinine, employing Aution Sticks 10PA urine strips, were highly comparable with quantitative results. The 10PA urine strips are thus suitable for urine creatinine determination, making it possible to correct urinalysis results for urinary creatinine concentration whenever necessary and to calculate the protein:creatinine ratio. Further studies should be carried out to evaluate the effectiveness and appropriateness of semi-quantitative creatinine analysis.

  12. Development of an Interactive Social Media Tool for Parents with Concerns about Vaccines

    ERIC Educational Resources Information Center

    Shoup, Jo Ann; Wagner, Nicole M.; Kraus, Courtney R.; Narwaney, Komal J.; Goddard, Kristin S.; Glanz, Jason M.

    2015-01-01

    Objective: Describe a process for designing, building, and evaluating a theory-driven social media intervention tool to help reduce parental concerns about vaccination. Method: We developed an interactive web-based tool using quantitative and qualitative methods (e.g., survey, focus groups, individual interviews, and usability testing). Results:…

  13. A Quantitative Analysis of Language Interventions for Children with Autism

    ERIC Educational Resources Information Center

    Kane, Meghan; Connell, James E.; Pellecchia, Melanie

    2010-01-01

    Research and services continue to expand to community-based programs serving individuals diagnosed with autism. A focus of great interest in those efforts is that of language acquisition and functional usage. For the purpose of this evaluation, language acquisition interventions are grouped into two broad categories, contrived and naturalistic.…

  14. Evaluation of fecal indicator and pathogenic bacteria originating from swine manure applied to agricultural lands using culture-based and quantitative real-time PCR methods.

    EPA Science Inventory

    Fecal bacteria, including those originating from concentrated animal feeding operations, are a leading contributor to water quality impairments in agricultural areas. Rapid and reliable methods are needed that can accurately characterize fecal pollution in agricultural settings....

  15. Evaluation of Fecal Indicator and Pathogenic Bacteria Originating from Swine Manure Applied to Agricultural Lands Using Culture-Based and Quantitative Real-Time PCR Methods

    EPA Science Inventory

    Fecal bacteria, including those originating from concentrated animal feeding operations, are a leading contributor to water quality impairments in agricultural areas. Rapid and reliable methods are needed that can accurately characterize fecal pollution in agricultural settings....

  16. Safeguards Technology Development Program 1st Quarter FY 2018 Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prasad, Manoj K.

    LLNL will evaluate the performance of a stilbene-based scintillation detector array for IAEA neutron multiplicity counting (NMC) applications. This effort will combine newly developed modeling methodologies and recently acquired high-efficiency stilbene detector units to quantitatively compare the prototype system's performance with that of conventional He-3 counters and liquid scintillator alternatives.

  17. Initial Validity Evidence for the State Mindfulness Scale for Physical Activity with Youth

    ERIC Educational Resources Information Center

    Ullrich-French, Sarah; Cox, Anne; Cole, Amy; Rhoades Cooper, Brittany; Gotch, Chad

    2017-01-01

    Experiencing mindfulness during movement-based interventions (e.g., yoga) may help support adaptive physical activity motivation processes in youth. However, there is currently no measure for assessing state mindfulness with youth within the context of physical activity. The purpose of this study was to qualitatively and quantitatively evaluate a…

  18. Implementing Adolescent Male Leadership Model to Enhance Behavior, Academic Success

    ERIC Educational Resources Information Center

    Beliele, Laressa

    2012-01-01

    Schools are challenged to assist struggling youth. This study used a mixed methods design to evaluate how the school-based program Men of Distinction helps struggling male students develop leadership skills, promoting academic and social success. Quantitative data included attendance, grade point averages, the number of days in in-school…

  19. Observational Learning among Older Adults Living in Nursing Homes

    ERIC Educational Resources Information Center

    Story, Colleen D.

    2010-01-01

    The purpose of this study was to evaluate learning by older adults living in nursing homes through observational learning based on Bandura's (1977) social learning theory. This quantitative study investigated if older adults could learn through observation. The nursing homes in the study were located in the midwestern United States. The…

  20. 34 CFR 263.6 - How does the Secretary evaluate applications for the Professional Development program?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... research and effective practices on how to improve teaching and learning to support student proficiency in...-based research and effective practice; (2) The extent to which the training or professional development... will produce both quantitative and qualitative data to the extent possible. (Approved by the Office of...
