Science.gov

Sample records for quantitative performance evaluation

  1. Quantitative evaluation of performance of three-dimensional printed lenses

    NASA Astrophysics Data System (ADS)

    Gawedzinski, John; Pawlowski, Michal E.; Tkaczyk, Tomasz S.

    2017-08-01

    We present an analysis of the shape, surface quality, and imaging capabilities of custom three-dimensional (3-D) printed lenses. 3-D printing technology enables lens prototypes to be fabricated without restrictions on surface geometry. Thus, spherical, aspherical, and rotationally nonsymmetric lenses can be manufactured in an integrated production process. This technique serves as a noteworthy alternative to multistage, labor-intensive, abrasive processes such as grinding, polishing, and diamond turning. Here, we evaluate the quality of lenses fabricated by Luxexcel using patented Printoptical© technology, which is based on an inkjet printing technique, by comparing them to lenses made with traditional glass processing technologies (grinding, polishing, etc.). The surface geometry and roughness of the lenses were evaluated using white-light and Fizeau interferometers. We have compared the peak-to-valley wavefront deviation, root mean square (RMS) wavefront error, radii of curvature, and arithmetic roughness average (Ra) profile of plastic and glass lenses. In addition, the imaging performance of selected pairs of lenses was tested using a 1951 USAF resolution target. The results indicate that 3-D printed optics can be manufactured with surface roughness comparable to that of injection-molded lenses (Ra < 20 nm). The RMS wavefront error of 3-D printed prototypes was at a minimum 18.8 times larger than that of equivalent glass prototypes for a lens with a 12.7 mm clear aperture; when measured within 63% of the clear aperture, however, the 3-D printed components' RMS wavefront error was comparable to that of the glass lenses.
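
    As a concrete illustration of the two wavefront metrics compared here, the sketch below computes peak-to-valley deviation and RMS wavefront error from a wavefront map over a clear aperture, including the reduced 63%-aperture case. The array shapes and synthetic test data are hypothetical, not the paper's measurements.

    ```python
    import numpy as np

    def wavefront_stats(wavefront, mask):
        """Peak-to-valley and RMS error of a measured wavefront map.

        wavefront : 2-D array of optical path differences (waves or nm)
        mask      : boolean array selecting pixels inside the clear aperture
        """
        w = wavefront[mask]
        w = w - w.mean()                 # remove the piston term
        pv = w.max() - w.min()           # peak-to-valley deviation
        rms = np.sqrt(np.mean(w ** 2))   # RMS wavefront error
        return pv, rms

    # Hypothetical example: a smooth bump plus noise, evaluated over the
    # full aperture and over 63% of the aperture diameter.
    yy, xx = np.mgrid[-64:64, -64:64]
    r = np.hypot(xx, yy)
    wf = 5.0 * np.exp(-r**2 / 2000.0) + np.random.normal(0.0, 0.2, r.shape)
    print(wavefront_stats(wf, r <= 64))
    print(wavefront_stats(wf, r <= 0.63 * 64))
    ```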

  2. Quantitative evaluation of wrist posture and typing performance: A comparative study of 4 computer keyboards

    SciTech Connect

    Burastero, S.

    1994-05-01

    The present study focuses on an ergonomic evaluation of 4 computer keyboards, based on subjective analyses of operator comfort and on a quantitative analysis of typing performance and wrist posture during typing. The objectives of this study are (1) to quantify differences in the wrist posture and in typing performance when the four different keyboards are used, and (2) to analyze the subjective preferences of the subjects for alternative keyboards compared to the standard flat keyboard with respect to the quantitative measurements.

  3. A General Quantitative Method for Evaluating the Visual Significance of Reflected Glare, Utilizing Visual Performance Data.

    ERIC Educational Resources Information Center

    Blackwell, H. Richard

    1963-01-01

    The results of all basic measurements and calculations of reflected glare for different lighting materials and conditions are presented in a series of tables and charts. All basic concepts of a quantitative method for evaluating the visual significance of reflected glare are identified in relationship to different types of visual performance. The…

  4. Quantitative evaluation on the performance and feature enhancement of stochastic resonance for bearing fault diagnosis

    NASA Astrophysics Data System (ADS)

    Li, Guoying; Li, Jimeng; Wang, Shibin; Chen, Xuefeng

    2016-12-01

    Stochastic resonance (SR) has been widely applied in the field of weak signal detection by virtue of its characteristic of utilizing noise to amplify the useful signal, rather than eliminating noise, in nonlinear dynamical systems. How to quantitatively evaluate the performance of SR, including the enhancement effect and the degree of waveform distortion, and how to accurately extract the signal amplitude have become two important issues in SR research. In this paper, the signal-to-noise ratio (SNR) of the main component to the residual in the SR output is constructed to quantitatively measure the enhancement effect of the SR method. Two further indices are constructed to quantitatively measure the degree of waveform distortion of the SR output: the correlation coefficient between the main component in the SR output and the original signal, and the zero-crossing ratio. These quantitative indices are combined into a comprehensive index for adaptive parameter selection of the SR method, so that the adaptive SR method can effectively enhance the weak component hidden in the original signal. Fast Fourier Transform and Fourier Transform (FFT+FT) spectrum correction technology can extract the signal amplitude from the original signal and effectively reduces the difficulty of extracting the signal amplitude from the distorted resonance output. The application to vibration analysis for bearing fault diagnosis verifies that the proposed quantitative evaluation method for adaptive SR can effectively detect the weak fault features of the vibration signal during the incipient stage of a bearing fault.
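
    The abstract names three indices but not their formulas. Below is a minimal numpy sketch of plausible forms — the SNR of the main component against the residual, the correlation coefficient with the original signal, and a zero-crossing ratio. All three definitions are assumptions for illustration, not the authors' exact constructions.

    ```python
    import numpy as np

    def sr_indices(sr_output, main_component, original):
        """Plausible forms of the abstract's three SR quality indices."""
        residual = sr_output - main_component
        # Enhancement effect: SNR of the main component vs. the residual (dB)
        snr = 10 * np.log10(np.sum(main_component**2) / np.sum(residual**2))
        # Distortion index 1: correlation with the original signal
        rho = np.corrcoef(main_component, original)[0, 1]
        # Distortion index 2: zero-crossing count ratio vs. the original
        zc = lambda x: np.sum(np.signbit(x[:-1]) != np.signbit(x[1:]))
        zcr = zc(main_component) / max(zc(original), 1)
        return snr, rho, zcr
    ```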

  5. Quantitative performance evaluation of the EM algorithm applied to radiographic images

    NASA Astrophysics Data System (ADS)

    Brailean, James C.; Giger, Maryellen L.; Chen, Chin-Tu; Sullivan, Barry J.

    1991-07-01

    In this study, the authors quantitatively evaluate the performance of the Expectation Maximization (EM) algorithm as a restoration technique for radiographic images. The 'perceived' signal-to-noise ratios (SNRs) of simple radiographic patterns processed by the EM algorithm are calculated on the basis of a statistical decision theory model that includes both the observer's visual response function and a noise component internal to the eye-brain system. The relative SNR (the ratio of the processed SNR to the original SNR) is calculated and used as a metric to quantitatively compare the effects of the EM algorithm to two popular image enhancement techniques: contrast enhancement (windowing) and unsharp mask filtering.
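
    The relative-SNR metric is straightforward to express in code. Below is a minimal numpy sketch assuming a simplified pixel-statistics SNR; the paper's 'perceived' SNR additionally weights by the observer's visual response function and internal eye-brain noise, which are omitted here.

    ```python
    import numpy as np

    def relative_snr(original, processed, signal_mask):
        """Relative SNR = SNR(processed) / SNR(original).

        signal_mask is a boolean array marking signal pixels; background
        statistics come from the complementary pixels.
        """
        def snr(img):
            contrast = img[signal_mask].mean() - img[~signal_mask].mean()
            return contrast / img[~signal_mask].std()
        return snr(processed) / snr(original)
    ```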

  6. Quantitative Evaluation of Performance in Interventional Neuroradiology: An Integrated Curriculum Featuring Theoretical and Practical Challenges

    PubMed Central

    Ernst, Marielle; Kriston, Levente; Romero, Javier M.; Frölich, Andreas M.; Jansen, Olav; Fiehler, Jens; Buhk, Jan-Hendrik

    2016-01-01

    Purpose We sought to develop a standardized curriculum capable of assessing key competencies in Interventional Neuroradiology by the use of models and simulators in an objective, quantitative, and efficient way. In this evaluation we analyzed the associations between the practical experience, theoretical knowledge, and skills lab performance of interventionalists. Materials and Methods We evaluated the endovascular skills of 26 participants of the Advanced Course in Endovascular Interventional Neuroradiology of the European Society of Neuroradiology with a set of three tasks (aneurysm coiling and thrombectomy in a virtual simulator, and placement of an intra-aneurysmal flow disruptor in a flow model). Practical experience was assessed by a survey. Participants completed a written and oral examination to evaluate theoretical knowledge. Bivariate and multivariate analyses were performed. Results In multivariate analysis, knowledge of materials and techniques in Interventional Neuroradiology was moderately associated with skills in aneurysm coiling and thrombectomy. Experience in mechanical thrombectomy was moderately associated with thrombectomy skills, while age was negatively associated with thrombectomy skills. We found no significant association between age, sex, or work experience and skills in aneurysm coiling. Conclusion Our study gives an example of what an integrated curriculum for the reasonable and cost-effective assessment of the key competences of an interventional neuroradiologist could look like. In addition to the traditional assessment of theoretical knowledge, practical skills are measured by the use of endovascular simulators, yielding objective, quantitative, and constructive data for evaluating both the current performance status of participants and the evolution of their technical competency over time. PMID:26848840

  7. Evaluation of the quantitative performances of supercritical fluid chromatography: from method development to validation.

    PubMed

    Dispas, Amandine; Lebrun, Pierre; Ziemons, Eric; Marini, Roland; Rozet, Eric; Hubert, Philippe

    2014-08-01

    Recently, the number of papers about SFC has increased drastically, but scientists have not truly focused on the quantitative performance of this technique. In order to prove the potential of UHPSFC, the present work discusses the different steps of the analytical life cycle of a method: from development to validation and application. Moreover, the quantitative performance of UHPSFC was evaluated in comparison with UHPLC, which is the main technique used for quality control in the pharmaceutical industry and can therefore be considered a reference. The methods were developed using a Design Space strategy, leading to the optimization of a robust method. In this context, when the Design Space optimization demonstrates a guarantee of quality, no further robustness study is required prior to validation. The methods were then geometrically transferred in order to reduce the analysis time. The UHPSFC and UHPLC methods were validated based on the total error approach using accuracy profiles. Even if UHPLC showed better precision and sensitivity, the UHPSFC method is able to give accurate results over a dosing range larger than the 80-120% range required by the European Medicines Agency. Consequently, UHPSFC results are valid and could be used for the control of the active substance in a finished pharmaceutical product. Finally, the validated UHPSFC method was used to analyse real samples and gave results similar to those of the reference method (UHPLC). Copyright © 2014 Elsevier B.V. All rights reserved.

  8. Quantitative Performance Evaluator for Proteomics (QPEP): Web-based Application for Reproducible Evaluation of Proteomics Preprocessing Methods.

    PubMed

    Strbenac, Dario; Zhong, Ling; Raftery, Mark J; Wang, Penghao; Wilson, Susan R; Armstrong, Nicola J; Yang, Jean Y H

    2017-07-07

    Tandem mass spectrometry is one of the most popular techniques for quantitation of proteomes. There exists a large variety of options at each stage of data preprocessing that impact the bias and variance of the summarized protein-level values. Using a newly released data set satisfying a replicated Latin squares design, a diverse set of performance metrics has been developed and implemented in a web-based application, Quantitative Performance Evaluator for Proteomics (QPEP). QPEP has the flexibility to allow users to apply their own method to preprocess this data set and share the results, allowing direct and straightforward comparison of new methodologies. Application of these new metrics to three case studies highlights that (i) the summarization of peptides to proteins is robust to the choice of peptide summary used, (ii) the differences between iTRAQ labels are stronger than the differences between experimental runs, and (iii) the commercial software ProteinPilot performs as well at between-sample normalization as more complicated methods developed by academics. Importantly, finding (ii) underscores the benefits of using the principles of randomization and blocking to avoid the experimental measurements being confounded by technical factors. Data are available via ProteomeXchange with identifier PXD003608.

  9. Evaluation of fourier transform profilometry performance: quantitative waste volume determination under simulated Hanford waste tank conditions

    SciTech Connect

    Jang, Ping-Rey; Leone, Teresa; Long, Zhiling; Mott, Melissa A.; Perry Norton, O.; Okhuysen, Walter P.; Monts, David L.

    2007-07-01

    The Hanford Site is currently in the process of an extensive effort to empty and close its radioactive single-shell and double-shell waste storage tanks. Before this can be accomplished, it is necessary to know how much residual material is left in a given waste tank and the chemical makeup of the residue. The objective of the efforts of Mississippi State University's Institute for Clean Energy Technology (ICET) is to develop, fabricate, and deploy inspection tools for the Hanford waste tanks that will (1) be remotely operable; (2) provide quantitative information on the amount of wastes remaining; and (3) provide information on the spatial distribution of chemical and radioactive species of interest. A collaborative arrangement has been established with the Hanford Site to develop probe-based inspection systems for deployment in the waste tanks. ICET is currently developing an in-tank inspection system based on Fourier Transform Profilometry (FTP). FTP is a non-contact, 3-D shape measurement technique. By projecting a fringe pattern onto a target surface and observing its deformation due to surface irregularities from a different view angle, FTP can determine the height (depth) distribution (and hence volume distribution) of the target surface, thus reproducing the profile of the target accurately under a wide variety of conditions. FTP therefore has the potential to be utilized for quantitative determination of residual wastes within Hanford waste tanks. We have completed a preliminary performance evaluation of FTP in order to document the accuracy, precision, and operator dependence (minimal) of FTP under conditions similar to those that can be expected to pertain within Hanford waste tanks. Based on a Hanford C-200 series tank with camera access through a riser with significant offset relative to the centerline, we devised a testing methodology that encompassed a range of obstacles likely to be encountered 'in tank'. These test objects were inspected by use
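
    FTP's fringe-analysis step can be made concrete with a short sketch. The following is a generic Takeda-style Fourier-transform profilometry phase extraction in numpy — an illustration of the technique the abstract names, not ICET's actual implementation; the carrier frequency, bandwidth, and geometric factor are assumptions.

    ```python
    import numpy as np

    def ftp_phase(fringe_image, carrier_freq, bandwidth):
        """Extract the phase map of a projected fringe pattern.

        Height is proportional to the unwrapped phase difference between
        the object and a flat reference plane.
        """
        spectrum = np.fft.fft(fringe_image, axis=1)
        freqs = np.fft.fftfreq(fringe_image.shape[1])
        # Keep only the fundamental carrier lobe of the fringe pattern
        window = np.abs(freqs - carrier_freq) < bandwidth
        filtered = np.fft.ifft(spectrum * window, axis=1)
        # Unwrap the phase of the filtered (analytic) fringe signal
        return np.unwrap(np.angle(filtered), axis=1)

    # height ~ (phase_object - phase_reference) * geometric_factor, where
    # the factor depends on the projector/camera geometry (not given here).
    ```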

  10. Quantitation of bone mineral by dual photon absorptiometry (DPA): Evaluation of instrument performance

    SciTech Connect

    Dunn, W.L.; O'Duffy, A.; Wahner, H.W.

    1984-01-01

    Quantitation of bone mineral is used with increasing frequency for clinical studies. This paper details the principle of DPA and presents an evaluation of the technique. DPA measurements were performed with a scanning dual photon system constructed at this institution and modeled after the device developed at the University of Wisconsin. The components are a rectilinear scanner frame, a 1.5 Ci Gd-153 source, a NaI(Tl) detector, and a PDP 11/03 computer. Dual discriminator windows are set on the 44 and 100 keV photon energies of Gd-153. Instrument linearity, accuracy, and reproducibility were evaluated with ashed bone standards and simulated tissue covering. In these experiments, computed and actual bone mineral had a correlation coefficient of 1.0 and an SEE of approximately 1.0% (linear regression analysis). Precision and accuracy of a standard were studied over a period of two years. The mean error between actual and measured bone mineral was 0.28%. In vivo precision in six subjects averaged 2.3% (CV) for lumbar spine measurements. The effect of soft tissue compositional change was studied with ashed bone standards and human cadaver spine specimens. Intraosseous fat changes of 50% produced an average bone mineral measurement error of 1.4%. A 20% change in fat thickness produced a 2.5% error. In situ and in vitro scans of 9 cadaver spines were performed to study the effect of extraosseous fat. The mean percent difference between the two measurements was 0.7% (SEE = 3.2%).
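
    The two-energy principle behind DPA can be written compactly. The following is a generic dual-photon decomposition in standard notation — an illustration of the principle, not the paper's own derivation:

    ```latex
    % Transmission at each Gd-153 photon energy E (44 and 100 keV), with
    % bone and soft-tissue mass attenuation coefficients \mu_b, \mu_s and
    % areal densities M_b, M_s:
    \[
    L_E \;=\; \ln\!\frac{I_{0,E}}{I_E} \;=\; \mu_b(E)\,M_b + \mu_s(E)\,M_s .
    \]
    % Two energies give two equations; eliminating the soft-tissue term
    % yields the bone mineral areal density:
    \[
    M_b \;=\; \frac{L_{44} - k\,L_{100}}{\mu_b(44) - k\,\mu_b(100)},
    \qquad
    k = \frac{\mu_s(44)}{\mu_s(100)} .
    \]
    ```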

  11. Performance evaluation of a commercial system for quantitative measurement of display resolution

    NASA Astrophysics Data System (ADS)

    Cleland, Esi W.; Samei, Ehsan

    2006-03-01

    One of the key metrics that carry information about image quality of medical displays is resolution. Until now, this property has been quantitatively assessed in laboratory settings. For the first time, a device consisting of a CCD camera and analysis software has been made commercially available for measuring the resolution of medical displays in a clinical setting. This study aimed to evaluate this new product in terms of accuracy and precision. In particular, attention was paid to determining whether the device is appropriate for clinical use. This work involved the measurement of the Modulation Transfer Function (MTF) of a medical Liquid Crystal Display (LCD) using the software/camera system. To check for accuracy, the results were compared with published values of the resolution for the same display. To assess the system's precision, measurements were made multiple times at the same setting. The performance of the system was also ascertained as a function of the focus setting of the camera. In terms of repeatability, the results indicate that when the camera is focused within ±0.64 mm of the optimum focus setting, the MTF values lie within approximately 14% of the best-focus MTF at the Nyquist frequency and 11% of the optimum total sharpness (∫MTF df). Similar results were obtained in the horizontal and vertical directions. Also, the MTF results track with luminance values as expected. In terms of accuracy, the device provides MTF figures within 10% to 20% of the previously measured values.
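
    The 'total sharpness' figure ∫MTF df used above can be computed directly from a measured line spread function. A minimal numpy sketch, assuming a 1-D line spread function sampled at the display's pixel pitch (function and parameter names are illustrative):

    ```python
    import numpy as np

    def mtf_from_lsf(lsf, pixel_pitch_mm):
        """MTF as the normalized Fourier magnitude of a line spread
        function, plus the 'total sharpness' integral (∫ MTF df) taken
        up to the display's Nyquist frequency."""
        mtf = np.abs(np.fft.rfft(lsf))
        mtf /= mtf[0]                      # normalize to DC
        # rfftfreq spans 0 .. Nyquist (= 0.5 / pixel pitch), in cycles/mm
        freqs = np.fft.rfftfreq(len(lsf), d=pixel_pitch_mm)
        total_sharpness = np.trapz(mtf, freqs)
        return freqs, mtf, total_sharpness
    ```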

  12. A novel approach for evaluating the performance of real time quantitative loop-mediated isothermal amplification-based methods.

    PubMed

    Nixon, Gavin J; Svenstrup, Helle F; Donald, Carol E; Carder, Caroline; Stephenson, Judith M; Morris-Jones, Stephen; Huggett, Jim F; Foy, Carole A

    2014-12-01

    Molecular diagnostic measurements are currently underpinned by the polymerase chain reaction (PCR). There are also a number of alternative nucleic acid amplification technologies, which, unlike PCR, work at a single temperature. These 'isothermal' methods reportedly offer potential advantages over PCR, such as simplicity, speed, and resistance to inhibitors, and could also be used for quantitative molecular analysis. However, there are currently limited mechanisms to evaluate their quantitative performance, which would assist assay development and study comparisons. This study uses a sexually transmitted infection diagnostic model in combination with an adapted metric termed isothermal doubling time (IDT), akin to PCR efficiency, to compare quantitative PCR and quantitative loop-mediated isothermal amplification (qLAMP) assays, and to quantify the impact of matrix interference. The performance metric described here facilitates the comparison of qLAMP assays and could assist assay development and validation activities.
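
    The abstract does not define IDT beyond its analogy to PCR efficiency. One plausible construction, sketched below under that assumption, fits the exponential phase of a real-time amplification curve and reports minutes per signal doubling; the phase-selection thresholds are arbitrary illustration values.

    ```python
    import numpy as np

    def isothermal_doubling_time(time_min, fluorescence):
        """One plausible reading of the IDT metric: fit the log-linear
        (exponential) phase of the amplification curve and report the
        time needed for the signal to double."""
        t = np.asarray(time_min, dtype=float)
        f = np.asarray(fluorescence, dtype=float)
        # Crude selection of the exponential phase: 10%-60% of max signal
        phase = (f > 0.1 * f.max()) & (f < 0.6 * f.max())
        slope, _ = np.polyfit(t[phase], np.log2(f[phase]), 1)
        return 1.0 / slope   # minutes per doubling
    ```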

  13. Optimizing experimental procedures for quantitative evaluation of crop plant performance in high throughput phenotyping systems.

    PubMed

    Junker, Astrid; Muraya, Moses M; Weigelt-Fischer, Kathleen; Arana-Ceballos, Fernando; Klukas, Christian; Melchinger, Albrecht E; Meyer, Rhonda C; Riewe, David; Altmann, Thomas

    2014-01-01

    Detailed and standardized protocols for plant cultivation in environmentally controlled conditions are an essential prerequisite to conduct reproducible experiments with precisely defined treatments. Setting up appropriate and well-defined experimental procedures is thus crucial for the generation of solid evidence and indispensable for successful plant research. Non-invasive and high throughput (HT) phenotyping technologies offer the opportunity to monitor and quantify the performance dynamics of several hundred plants at a time. Compared to small-scale plant cultivations, HT systems place much higher demands, from a conceptual and a logistic point of view, on experimental design, as well as on the actual plant cultivation conditions and the image analysis and statistical methods for data evaluation. Furthermore, cultivation conditions need to be designed to elicit plant performance characteristics corresponding to those under natural conditions. This manuscript describes critical steps in the optimization of procedures for HT plant phenotyping systems. Starting with the model plant Arabidopsis, HT-compatible methods were tested and optimized with regard to growth substrate, soil coverage, watering regime, and experimental design (considering environmental inhomogeneities) in automated plant cultivation and imaging systems. As revealed by metabolite profiling, plant movement did not affect the plants' physiological status. Based on these results, procedures for maize HT cultivation and monitoring were established. Variation of maize vegetative growth in the HT phenotyping system matched well with that observed in the field. The presented results outline important issues to be considered in the design of HT phenotyping experiments for model and crop plants. It thereby provides guidelines for the setup of HT experimental procedures, which are required for the generation of reliable and reproducible data on phenotypic variation for a broad range of applications.

  14. Optimizing experimental procedures for quantitative evaluation of crop plant performance in high throughput phenotyping systems

    PubMed Central

    Junker, Astrid; Muraya, Moses M.; Weigelt-Fischer, Kathleen; Arana-Ceballos, Fernando; Klukas, Christian; Melchinger, Albrecht E.; Meyer, Rhonda C.; Riewe, David; Altmann, Thomas

    2015-01-01

    Detailed and standardized protocols for plant cultivation in environmentally controlled conditions are an essential prerequisite to conduct reproducible experiments with precisely defined treatments. Setting up appropriate and well-defined experimental procedures is thus crucial for the generation of solid evidence and indispensable for successful plant research. Non-invasive and high throughput (HT) phenotyping technologies offer the opportunity to monitor and quantify the performance dynamics of several hundred plants at a time. Compared to small-scale plant cultivations, HT systems place much higher demands, from a conceptual and a logistic point of view, on experimental design, as well as on the actual plant cultivation conditions and the image analysis and statistical methods for data evaluation. Furthermore, cultivation conditions need to be designed to elicit plant performance characteristics corresponding to those under natural conditions. This manuscript describes critical steps in the optimization of procedures for HT plant phenotyping systems. Starting with the model plant Arabidopsis, HT-compatible methods were tested and optimized with regard to growth substrate, soil coverage, watering regime, and experimental design (considering environmental inhomogeneities) in automated plant cultivation and imaging systems. As revealed by metabolite profiling, plant movement did not affect the plants' physiological status. Based on these results, procedures for maize HT cultivation and monitoring were established. Variation of maize vegetative growth in the HT phenotyping system matched well with that observed in the field. The presented results outline important issues to be considered in the design of HT phenotyping experiments for model and crop plants. It thereby provides guidelines for the setup of HT experimental procedures, which are required for the generation of reliable and reproducible data on phenotypic variation for a broad range of applications. PMID

  15. A Computer-Aided Analysis Method of SPECT Brain Images for Quantitative Treatment Monitoring: Performance Evaluations and Clinical Applications.

    PubMed

    Zheng, Xiujuan; Wei, Wentao; Huang, Qiu; Song, Shaoli; Wan, Jieqing; Huang, Gang

    2017-01-01

    The objective and quantitative analysis of longitudinal single photon emission computed tomography (SPECT) images is significant for the treatment monitoring of brain disorders. Therefore, a computer-aided analysis (CAA) method is introduced to extract a change-rate map (CRM) as a parametric image for quantifying the changes of regional cerebral blood flow (rCBF) in longitudinal SPECT brain images. The performance of the CAA-CRM approach in treatment monitoring is evaluated by computer simulations and clinical applications. The results of the computer simulations show that the derived CRMs have high similarity with their ground truths when the lesion size is larger than the system spatial resolution and the change rate is higher than 20%. In clinical applications, the CAA-CRM approach is used to assess the treatment of 50 patients with brain ischemia. The results demonstrate that the CAA-CRM approach achieves 93.4% accuracy in localizing recovered regions. Moreover, the quantitative indexes of recovered regions derived from the CRM are all significantly different among the groups and highly correlated with the experienced clinical diagnosis. In conclusion, the proposed CAA-CRM approach provides a convenient solution to generate a parametric image and derive quantitative indexes from longitudinal SPECT brain images for treatment monitoring.
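
    The change-rate map itself admits a simple generic construction. The sketch below computes a voxel-wise fractional rCBF change between registered baseline and follow-up volumes; the thresholding mirrors the abstract's ~20% detectability limit, but the paper's exact CRM definition may differ in detail.

    ```python
    import numpy as np

    def change_rate_map(baseline, followup, threshold=0.2):
        """Voxel-wise change-rate map between two registered SPECT
        volumes; values are fractional rCBF change vs. baseline."""
        eps = 1e-6                                  # avoid division by zero
        crm = (followup - baseline) / (baseline + eps)
        # Per the simulations, changes below ~20% are hard to recover,
        # so small rates can optionally be suppressed.
        crm[np.abs(crm) < threshold] = 0.0
        return crm
    ```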

  16. A Computer-Aided Analysis Method of SPECT Brain Images for Quantitative Treatment Monitoring: Performance Evaluations and Clinical Applications

    PubMed Central

    Wei, Wentao; Huang, Qiu; Wan, Jieqing; Huang, Gang

    2017-01-01

    The objective and quantitative analysis of longitudinal single photon emission computed tomography (SPECT) images is significant for the treatment monitoring of brain disorders. Therefore, a computer-aided analysis (CAA) method is introduced to extract a change-rate map (CRM) as a parametric image for quantifying the changes of regional cerebral blood flow (rCBF) in longitudinal SPECT brain images. The performance of the CAA-CRM approach in treatment monitoring is evaluated by computer simulations and clinical applications. The results of the computer simulations show that the derived CRMs have high similarity with their ground truths when the lesion size is larger than the system spatial resolution and the change rate is higher than 20%. In clinical applications, the CAA-CRM approach is used to assess the treatment of 50 patients with brain ischemia. The results demonstrate that the CAA-CRM approach achieves 93.4% accuracy in localizing recovered regions. Moreover, the quantitative indexes of recovered regions derived from the CRM are all significantly different among the groups and highly correlated with the experienced clinical diagnosis. In conclusion, the proposed CAA-CRM approach provides a convenient solution to generate a parametric image and derive quantitative indexes from longitudinal SPECT brain images for treatment monitoring. PMID:28251150

  17. A quantitative method to evaluate the performance of topographic correction models used to improve land cover identification

    NASA Astrophysics Data System (ADS)

    Park, Sung-Hwan; Jung, Hyung-Sup; Choi, Jaewon; Jeon, Seongwoo

    2017-10-01

    Topographic correction methods have been widely used prior to land cover identification in sloping terrain because topographic variation on the Earth's surface can interfere with the classification. Topographic correction involves the normalization of brightness or surface reflectance values from the slanted to the horizontal plane. Several topographic correction models have been proposed, and a quantitative evaluation method is needed for these models because their performance can vary with surface cover type and spectral band. In this study, we propose an efficient method to evaluate the performance of topographic correction models by measuring the histogram structural similarity (HSSIM) index, estimated from the sunlit and sun-shaded slope areas before and after correction. We tested the HSSIM index using three different land cover types derived from Landsat-8 Operational Land Imager (OLI) images and eight commonly used topographic correction models. When the proposed HSSIM index was compared with the visual analysis technique, the results matched exactly. Using the HSSIM index, the best correction methods were then determined: the statistical-empirical or SCS+C methods (where SCS+C refers to the sun-canopy-sensor plus C-correction) for the R, G, and B bands, and the Minnaert+SCS method for the NIR, SWIR-1, and SWIR-2 bands. These results indicate that (i) the HSSIM index enables quantitative performance evaluation of topographic correction models and (ii) the HSSIM index can be used to determine the best topographic correction method for particular land cover identification applications.
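
    The HSSIM formula is not given in the abstract. The sketch below illustrates the underlying idea under stated assumptions: compare the reflectance histograms of sunlit and sun-shaded pixels after correction with an SSIM-style similarity, so that a good correction scores near 1. The stabilizing constants and the illumination split are illustrative choices, not the authors' values.

    ```python
    import numpy as np

    def histogram_similarity(corrected, illumination, bins=64):
        """SSIM-style similarity of sunlit vs. sun-shaded histograms;
        higher means the correction better removed the terrain effect."""
        split = np.median(illumination)
        sunlit = corrected[illumination > split]
        shaded = corrected[illumination <= split]
        lo, hi = corrected.min(), corrected.max()
        h1, _ = np.histogram(sunlit, bins=bins, range=(lo, hi), density=True)
        h2, _ = np.histogram(shaded, bins=bins, range=(lo, hi), density=True)
        c1, c2 = 1e-4, 9e-4                 # small stabilizing constants
        mu1, mu2 = h1.mean(), h2.mean()
        cov = np.mean((h1 - mu1) * (h2 - mu2))
        return ((2*mu1*mu2 + c1) * (2*cov + c2)) / \
               ((mu1**2 + mu2**2 + c1) * (h1.var() + h2.var() + c2))
    ```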

  18. Three-year randomised clinical trial to evaluate the clinical performance, quantitative and qualitative wear patterns of hybrid composite restorations

    PubMed Central

    Palaniappan, Senthamaraiselvi; Elsen, Liesbeth; Lijnen, Inge; Peumans, Marleen; Van Meerbeek, Bart

    2009-01-01

    The aim of the study was to compare the clinical performance and the quantitative and qualitative wear patterns of conventional hybrid (Tetric Ceram), micro-filled hybrid (Gradia Direct Posterior), and nano-hybrid (Tetric EvoCeram, TEC) posterior composite restorations in a 3-year randomised clinical trial. Sixteen Tetric Ceram, 17 TEC, and 16 Gradia Direct Posterior restorations were placed in human molars and evaluated at baseline and at 6, 12, 24, and 36 months of clinical service according to US Public Health Service criteria. The gypsum replicas at each recall were used for 3D laser scanning to quantify wear, and the epoxy resin replicas were observed under a scanning electron microscope to study the qualitative wear patterns. After 3 years of clinical service, the three hybrid restorative materials performed clinically well in posterior cavities. Within the observation period, the nano-hybrid and micro-filled hybrid restorations showed better polishability, with improved surface gloss retention, than their conventional hybrid counterpart. The three hybrid composites showed enamel-like vertical wear and a cavity-size-dependent magnitude of volume loss. Qualitatively, while the micro-filled hybrid and nano-hybrid composite restorations exhibited signs of fatigue similar to the conventional hybrid restorations at heavy occlusal contact areas, their light occlusal contact areas showed less surface pitting after 3 years of clinical service. PMID:19669176

  19. Three-year randomised clinical trial to evaluate the clinical performance, quantitative and qualitative wear patterns of hybrid composite restorations.

    PubMed

    Palaniappan, Senthamaraiselvi; Elsen, Liesbeth; Lijnen, Inge; Peumans, Marleen; Van Meerbeek, Bart; Lambrechts, Paul

    2010-08-01

    The aim of the study was to compare the clinical performance and the quantitative and qualitative wear patterns of conventional hybrid (Tetric Ceram), micro-filled hybrid (Gradia Direct Posterior), and nano-hybrid (Tetric EvoCeram, TEC) posterior composite restorations in a 3-year randomised clinical trial. Sixteen Tetric Ceram, 17 TEC, and 16 Gradia Direct Posterior restorations were placed in human molars and evaluated at baseline and at 6, 12, 24, and 36 months of clinical service according to US Public Health Service criteria. The gypsum replicas at each recall were used for 3D laser scanning to quantify wear, and the epoxy resin replicas were observed under a scanning electron microscope to study the qualitative wear patterns. After 3 years of clinical service, the three hybrid restorative materials performed clinically well in posterior cavities. Within the observation period, the nano-hybrid and micro-filled hybrid restorations showed better polishability, with improved surface gloss retention, than their conventional hybrid counterpart. The three hybrid composites showed enamel-like vertical wear and a cavity-size-dependent magnitude of volume loss. Qualitatively, while the micro-filled hybrid and nano-hybrid composite restorations exhibited signs of fatigue similar to the conventional hybrid restorations at heavy occlusal contact areas, their light occlusal contact areas showed less surface pitting after 3 years of clinical service.

  20. Performance evaluation of fourier transform profilometry for quantitative waste volume determination under simulated hanford waste tank conditions

    SciTech Connect

    Jang, P.R.; Leone, T.; Long, Z.; Mott, M.A.; Norton, O.P.; Okhuysen, W.P.; Monts, D.L.

    2007-07-01

    The Hanford Site is currently in the process of an extensive effort to empty and close its radioactive single-shell and double-shell waste storage tanks. Before this can be accomplished, it is necessary to know how much residual material is left in a given waste tank and the chemical makeup of the residue. The objective of the efforts of Mississippi State University's Institute for Clean Energy Technology (ICET) is to develop, fabricate, and deploy inspection tools for the Hanford waste tanks that will (1) be remotely operable; (2) provide quantitative information on the amount of wastes remaining; and (3) provide information on the spatial distribution of the residual waste. A collaborative arrangement has been established with the Hanford Site to develop probe-based inspection systems for deployment in the waste tanks. ICET is currently developing an in-tank inspection system based on Fourier Transform Profilometry (FTP). FTP is a non-contact, 3-D shape measurement technique. By projecting a fringe pattern onto a target surface and observing its deformation due to surface irregularities from a different view angle, FTP can determine the height (depth) distribution (and hence volume distribution) of the target surface, thus reproducing the profile of the target accurately under a wide variety of conditions. FTP therefore has the potential to be utilized for quantitative determination of residual wastes within Hanford waste tanks. We have completed a preliminary performance evaluation of FTP in order to document the accuracy, precision, and operator dependence (minimal) of FTP under conditions similar to those that can be expected to pertain within Hanford waste tanks. Based on a Hanford C-200 series tank with camera access through a riser with significant offset relative to the centerline, we devised a testing methodology that encompassed a range of obstacles likely to be encountered 'in-tank'. These test objects were inspected by use of FTP and the volume of

  1. Evaluating quantitative research reports.

    PubMed

    Russell, Cynthia L

    2005-01-01

    As a novice reviewer, it is often difficult to trust your evaluation of a research report. You may feel uncertain in your interpretations. These are common concerns and can be remedied by reading and discussing research reports on research listservs, through journal clubs, or with other nephrology nurses. Practice using the criteria for research report evaluation and you too can perfect critiquing a research report!

  2. Community Health Workers to Improve Antenatal Care and PMTCT Uptake in Dar es Salaam, Tanzania: A Quantitative Performance Evaluation

    PubMed Central

    Sando, David; Magesa, Lucy; Machumi, Lameck; Mungure, Esther; Mwanyika Sando, Mary; Geldsetzer, Pascal; Foster, Dawn; Kajoka, Deborah; Naburi, Helga; Ekström, Anna M.; Spiegelman, Donna; Li, Nan; Chalamilla, Guerino; Fawzi, Wafaie; Bärnighausen, Till

    2014-01-01

    Background: Home visits by community health workers (CHW) could be effective in identifying pregnant women in the community before they have presented to the health system. CHW could thus improve the uptake of antenatal care (ANC), HIV testing, and prevention of mother-to-child transmission (PMTCT) services. Methods: Over a 16-month period, we carried out a quantitative evaluation of the performance of CHW in reaching women early in pregnancy and before they have attended ANC in Dar es Salaam, Tanzania. Results: As part of the intervention, 213 CHW conducted more than 45,000 home visits to about 43,000 pregnant women. More than 75% of the pregnant women identified through home visits had not yet attended ANC at the time of the first contact with a CHW and about 40% of those who had not yet attended ANC were in the first trimester of pregnancy. Over time, the number of pregnant women the CHW identified each month increased, as did the proportion of women who had not yet attended ANC. The median gestational age of pregnant women contacted for the first time by a CHW decreased steadily and significantly over time (from 21/22 to 16 weeks, P-value for test of trend <0.0001). Conclusions: A large-scale CHW intervention was effective in identifying pregnant women in their homes early in pregnancy and before they had attended ANC. The intervention thus fulfills some of the conditions that are necessary for CHW to improve timely ANC uptake and early HIV testing and PMTCT enrollment in pregnancy. PMID:25436818

  3. Quantitative Methods for Software Selection and Evaluation

    DTIC Science & Technology

    Bandor, Michael S.

    2006-09-01

    When performing a "buy" analysis and selecting a product as part of a software acquisition strategy, most organizations will consider primarily

  4. Quantitative SERS by hot spot normalization - surface enhanced Rayleigh band intensity as an alternative evaluation parameter for SERS substrate performance.

    PubMed

    Wei, Haoran; McCarthy, Alexis; Song, Junyeob; Zhou, Wei; Vikesland, Peter J

    2017-09-19

    The performance of surface-enhanced Raman spectroscopy (SERS) substrates is typically evaluated by calculating an enhancement factor (EF). However, it is challenging to accurately calculate EF values since the calculation often requires the use of model analytes and requires assumptions about the number of analyte molecules within the laser excitation volume. Furthermore, the measured EF values are target analyte dependent and thus it is challenging to compare substrates with EF values obtained using different analytes. In this study, we propose an alternative evaluation parameter for SERS substrate performance that is based on the intensity of the surface plasmon enhanced Rayleigh band (IRayleigh) that originates from the amplified spontaneous emission (ASE) of the laser. Compared to the EF, IRayleigh reflects the enhancing capability of the substrate itself, is easy to measure without the use of any analytes, and is universally applicable for the comparison of SERS substrates. Six SERS substrates with different states (solid, suspended in liquid, and hydrogel), different plasmonic nanoparticle identities (silver and gold), as well as different nanoparticle sizes and shapes were used to support our hypothesis. The results show that there are excellent correlations between the measured SERS intensities and IRayleigh as well as between the SERS homogeneity and the variation of IRayleigh acquired with the six SERS substrates. These results suggest that IRayleigh can be used as an evaluation parameter for both SERS substrate efficiency and reproducibility.
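
    Hot-spot normalization reduces to a ratio of band intensities. A minimal sketch, with illustrative band windows; the authors' integration limits are not given in the abstract.

    ```python
    import numpy as np

    def normalized_sers(spectrum, raman_shift, analyte_band, rayleigh_band):
        """Normalize an analyte's SERS band by the surface-plasmon-enhanced
        Rayleigh band intensity (I_Rayleigh); bands are (lo, hi) windows
        in cm^-1 over arrays of equal length."""
        def band_intensity(lo, hi):
            sel = (raman_shift >= lo) & (raman_shift <= hi)
            return np.trapz(spectrum[sel], raman_shift[sel])
        return band_intensity(*analyte_band) / band_intensity(*rayleigh_band)
    ```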

  5. The design and characterization of a testing platform for quantitative evaluation of tread performance on multiple biological substrates.

    PubMed

    Sliker, Levin J; Rentschler, Mark E

    2012-09-01

    In this study, an experimental platform is developed to quantitatively measure the performance of robotic wheel treads in a dynamic environment. The platform imposes a dynamic driving condition on a single robot wheel: the wheel is rotated on a translating substrate, thereby inducing slip. The normal force of the wheel can be adjusted mechanically, while the rotational velocity of the wheel and the translational velocity of the substrate can be controlled using an open-loop control system. Wheel slip and translational speed can be varied autonomously while wheel traction force is measured using a load cell. The testing platform is characterized by testing one micropatterned polydimethylsiloxane (PDMS) tread on three substrates (dry synthetic tissue, hydrated synthetic tissue, and excised porcine small bowel tissue), at three normal forces (0.10, 0.20, and 0.30 N), 13 slip ratios (-0.30 to 0.30 in increments of 0.05), and three translational speeds (2, 3, and 6 mm/s). Additionally, two wheels (micropatterned and smooth PDMS) are tested on beef liver at the same three normal forces and translational speeds for a tread comparison. An analysis of variance revealed that the platform can detect statistically significant differences between means when observing normal forces, translational speeds, slip ratios, treads, and substrates. The variance within trials (platform error, P = 1) and between trials (human error, P = 0.152) is minimal compared to the variance due to normal force (P = 0.036), translational speed (P = 0.059), slip ratio (P = 0), tread (P = 0.004), and substrate (P = 0). In conclusion, this precision testing platform can be used to determine wheel tread performance differences on the three substrates and for each of the studied parameters. Future use of the platform could lead to an optimized micropattern-based mobility system, under given operating conditions, for implementation on a robotic capsule endoscope.
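
    The slip ratio driven by the platform can be written compactly. One common definition is sketched below; it is an assumption, since the abstract does not state the paper's formula.

    ```python
    def slip_ratio(v_wheel_surface, v_substrate):
        """Slip ratio for a wheel turning on a translating substrate:
        positive when the wheel surface outruns the substrate, zero for
        pure rolling (one common convention, assumed here)."""
        return (v_wheel_surface - v_substrate) / v_wheel_surface
    ```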

  6. A longitudinal evaluation of performance of automated BCR-ABL1 quantitation using cartridge-based detection system

    PubMed Central

    Enjeti, Anoop; Granter, Neil; Ashraf, Asma; Fletcher, Linda; Branford, Susan; Rowlings, Philip; Dooley, Susan

    2015-01-01

    An automated cartridge-based detection system (GeneXpert; Cepheid) is being widely adopted in low-throughput laboratories for monitoring the BCR-ABL1 transcript in chronic myelogenous leukaemia. This Australian study evaluated the longitudinal performance-specific characteristics of the automated system. The automated cartridge-based system was compared prospectively with the manual qRT-PCR-based reference method at SA Pathology, Adelaide, over a period of 2.5 years. A conversion factor determination was followed by four re-validations. Peripheral blood samples (n = 129) with international scale (IS) values within the detectable range were selected for assessment. The mean bias, the proportion of results within a specified fold difference (2-, 3- and 5-fold), the concordance rate of major molecular remission (MMR), and the concordance across a range of IS values on paired samples were evaluated. The initial conversion factor for the automated system was determined as 0.43. Except for the second re-validation, where a negative bias of 1.9-fold was detected, all other biases fell within desirable limits. A cartridge-specific conversion factor and efficiency value were introduced, and the conversion factor was confirmed to be stable in subsequent re-validation cycles. Concordance with the reference method/laboratory at >0.1–≤10 IS was 78.2% and at ≤0.001 was 80%, compared to 86.8% in the >0.01–≤0.1 IS range. The overall and MMR concordance were 85.7% and 94%, respectively, for samples that fell within ±5-fold of the reference laboratory value over the entire period of study. The conversion factor and performance-specific characteristics of the automated system were longitudinally stable in the clinically relevant range, following the manufacturer's introduction of lot-specific efficiency values. PMID:26166664

  7. Blind Analysis of Fortified Pesticide Residues in Carrot Extracts using GC-MS to Evaluate Qualitative and Quantitative Performance

    USDA-ARS?s Scientific Manuscript database

    Unlike quantitative analysis, the quality of the qualitative results in the analysis of pesticide residues in food is generally ignored in practice. Instead, chemists tend to rely on advanced mass spectrometric techniques and general subjective guidelines or fixed acceptability criteria when makin...

  8. Quantitative performance of advanced resolution recovery strategies on SPECT images: evaluation with use of digital phantom models.

    PubMed

    Onishi, Hideo; Motomura, Nobutoku; Fujino, Koichi; Natsume, Takahiro; Haramoto, Yasuhiro

    2013-01-01

    Several resolution recovery (RR) methods have been developed. This study aimed to validate the performance of the following advanced RR methods: Evolution, Astonish, Flash3D, and 3D-OSEM. We compared the advanced RR methods with filtered back projection (FBP) and standard ordered-subset expectation maximization (OSEM) using resolution (RES), cylinder/sphere (CYS), and myocardial (MYD) digital phantoms. Three spheres were placed in the RES phantom. Sixteen spheres (hot and cold) were placed in a concentric configuration (diameter: 96-9.6 mm) inside the CYS phantom. The MYD phantom was created by computer simulation with the Electron Gamma Shower 4 (EGS4) code and included two left ventricular defects in the myocardium. Performance was evaluated at source-to-detector distances (R-distance) of 166, 200, and 250 mm over a range of reconstruction parameters (SI: the product of subsets and iterations), using the resolution recovery factor, count recovery, normalized mean square error (NMSE), and %CV. As SI updates increased, the FWHM decreased, and the effect became more obvious as the R-distance increased. The spatial resolution of the advanced RR methods was 20% better than that of FBP and OSEM. The resolution recovery ratio was 80%, and count recovery was maintained only in objects with a diameter of >30 mm with the advanced RR methods. The NMSE and %CV improved by 50% and 30% over FBP and OSEM, respectively. The advanced RR methods caused overestimation due to the Gibbs phenomenon in the marginal region when the sphere diameter was 16-28.8 mm.
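
    The NMSE and %CV figures of merit are standard and easy to reproduce against a digital phantom. A minimal numpy sketch; the abstract does not spell out its exact normalization, so the common definitions are used here.

    ```python
    import numpy as np

    def nmse(reconstructed, truth):
        """Normalized mean square error against the digital phantom."""
        return np.sum((reconstructed - truth) ** 2) / np.sum(truth ** 2)

    def percent_cv(roi_values):
        """%CV: relative dispersion of counts inside a uniform region."""
        return 100.0 * np.std(roi_values) / np.mean(roi_values)
    ```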

  9. Comparison of two real-time quantitative polymerase chain reaction strategies for minimal residual disease evaluation in lymphoproliferative disorders: correlation between immunoglobulin gene mutation load and real-time quantitative polymerase chain reaction performance.

    PubMed

    Della Starza, Irene; Cavalli, Marzia; Del Giudice, Ilaria; Barbero, Daniela; Mantoan, Barbara; Genuardi, Elisa; Urbano, Marina; Mannu, Claudia; Gazzola, Anna; Ciabatti, Elena; Guarini, Anna; Foà, Robin; Galimberti, Sara; Piccaluga, Pierpaolo; Gaidano, Gianluca; Ladetto, Marco; Monitillo, Luigia

    2014-09-01

    We compared two strategies for minimal residual disease evaluation of B-cell lymphoproliferative disorders characterized by a variable immunoglobulin heavy chain (IGH) gene mutation load. Twenty-five samples from chronic lymphocytic leukaemia (n = 18) or mantle cell lymphoma (n = 7) patients were analyzed. Based on IGH variable region genes, 22/25 samples carried > 2% mutations, 20/25 > 5%. In the IGH joining region genes, 23/25 samples carried > 2% mutations, 18/25 > 5%. Real-time quantitative polymerase chain reaction was performed on IGH genes using two strategies: method A utilizes two patient-specific primers, whereas method B employs one patient-specific and one germline primer, with different positions on the variable, diversity, and joining regions. Twenty-three samples (92%) were evaluable using method A, but only six (24%) by method B. Method B's poor performance was particularly evident among mutated IGH variable/joining region cases, although no specific mutation load above which the real-time quantitative polymerase chain reaction failed was found. Molecular strategies for minimal residual disease evaluation should therefore be adapted to the B-cell receptor features of the disease investigated.

  10. Quantitative evaluation of Alzheimer's disease

    NASA Astrophysics Data System (ADS)

    Duchesne, S.; Frisoni, G. B.

    2009-02-01

    We propose a single, quantitative metric called the disease evaluation factor (DEF) and assess its efficiency at estimating disease burden in normal control subjects (CTRL) and probable Alzheimer's disease (AD) patients. The study group consisted of 75 patients with a diagnosis of probable AD and 75 age-matched normal CTRL without neurological or neuropsychological deficit. We calculated a reference eigenspace of MRI appearance from reference data, into which our CTRL and probable AD subjects were projected. We then calculated the multi-dimensional hyperplane separating the CTRL and probable AD groups. The DEF was estimated via a multidimensional weighted distance of eigencoordinates between a given subject and the CTRL group mean, along the salient principal components forming the separating hyperplane. We used quantile plots and Kolmogorov-Smirnov and χ2 tests to compare the DEF values and test that their distribution was normal. We used a linear discriminant test to separate CTRL from probable AD based on the DEF factor, and reached an accuracy of 87%. A quantitative biomarker in AD would act as an important surrogate marker of disease status and progression.
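
    The DEF construction can be sketched in a few lines. The following is a schematic numpy version of the weighted eigencoordinate distance the abstract describes; the choice of components and weights is an assumption for illustration.

    ```python
    import numpy as np

    def disease_evaluation_factor(x, ctrl_mean, components, weights):
        """Weighted distance between a subject's eigencoordinates and the
        control-group mean, along the principal components that form the
        CTRL/AD separating hyperplane.

        x, ctrl_mean : (n_features,) appearance vectors
        components   : (k, n_features) salient principal components
        weights      : (k,) per-component weights (schematic)
        """
        delta = (x - ctrl_mean) @ components.T   # project onto salient PCs
        return np.sqrt(np.sum(weights * delta ** 2))
    ```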

  11. Reliability and validity of a quantitative color scale to evaluate masticatory performance using color-changeable chewing gum.

    PubMed

    Hama, Yohei; Kanazawa, Manabu; Minakuchi, Shunsuke; Uchida, Tatsuro; Sasaki, Yoshiyuki

    2014-03-19

    In the present study, we developed a novel color scale for visual assessment, conforming to theoretical color changes of a gum, to evaluate masticatory performance; moreover, we investigated the reliability and validity of this evaluation method using the color scale. Ten participants (aged 26-30 years) with natural dentition chewed the gum for various numbers of chewing strokes. Changes in color were measured using a colorimeter, and linear regression expressions that represented changes in gum color were derived. The color scale was developed using these regression expressions. Thirty-two chewed gums were evaluated using the colorimeter and were assessed three times using the color scale by six dentists aged 25-27 (mean, 25.8) years, six preclinical dental students aged 21-23 (mean, 22.2) years, and six elderly individuals aged 68-84 (mean, 74.0) years. The intrarater and interrater reliability of the evaluations was assessed using intraclass correlation coefficients. Validity of the method compared with the colorimeter was assessed using Spearman's rank correlation coefficient. All intraclass correlation coefficients were > 0.90, and Spearman's rank correlation coefficients were > 0.95 in all groups. These results indicate that the evaluation method for the color-changeable chewing gum using the newly developed color scale is reliable and valid.

  12. An approach for relating the results of quantitative nondestructive evaluation to intrinsic properties of high-performance materials

    NASA Technical Reports Server (NTRS)

    Miller, James G.

    1990-01-01

    One of the most difficult problems the manufacturing community has faced during recent years has been to accurately assess the physical state of anisotropic high-performance materials by nondestructive means. In order to advance the design of ultrasonic nondestructive testing systems, a more fundamental understanding of how ultrasonic waves travel and interact within an anisotropic material is needed. The relationship between the ultrasonic and engineering parameters needs to be explored to understand their mutual dependence. One common denominator is provided by the elastic constants. The preparation of specific graphite/epoxy samples to be used in the experimental investigation of the anisotropic properties (through measurement of the elastic stiffness constants) is discussed. Accurate measurements of these constants will depend upon knowledge of refraction effects as well as the direction of group velocity propagation. The continuing effort to develop improved visualization techniques for physical parameters is discussed, and group velocity images are presented. To fully understand the relationship between the ultrasonic and the common engineering parameters, the physical interpretation of the linear elastic coefficients (the quantities that relate applied stresses to resulting strains) is discussed. This discussion builds a more intuitive understanding of how the ultrasonic parameters are related to the traditional engineering parameters.
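
    The 'common denominator' the report points to is the stiffness tensor. In generic notation (not taken from the report), generalized Hooke's law and its link to ultrasonic velocity read:

    ```latex
    % Generalized Hooke's law in Voigt notation: the stiffness constants
    % C_{ij} relate applied stresses \sigma_i to resulting strains
    % \epsilon_j, and the same constants govern ultrasonic phase
    % velocities, e.g. for propagation along a principal material axis:
    \[
    \sigma_i = \sum_{j=1}^{6} C_{ij}\,\epsilon_j ,
    \qquad
    v = \sqrt{C_{\mathrm{eff}}/\rho},
    \]
    % where C_eff is the stiffness combination for the chosen mode and
    % direction, and \rho is the material density.
    ```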

  13. Quantitative evaluation of signal integrity for magnetocardiography.

    PubMed

    Zhang, Shulin; Wang, Yongliang; Wang, Huiwu; Jiang, Shiqin; Xie, Xiaoming

    2009-08-07

    Magnetocardiography (MCG) is a non-invasive diagnostic tool used to investigate the activity of the heart. For applications in an unshielded environment, dedicated hardware configurations and sophisticated signal processing techniques have been developed over the last decades to extract the very weak signal of interest from the much higher background noise. While powerful for noise rejection, the signal processing may introduce signal distortions if not properly designed and applied. However, an effective tool to quantitatively evaluate signal integrity for MCG is currently lacking. In this paper, we introduce a very simple method that uses a small coil driven by a human ECG signal to generate a simulated MCG signal. Three key performance indexes are proposed to evaluate MCG system performance quantitatively: correlation in the time domain, relative heights of different peaks, and correlation in the frequency domain. This evaluation method was applied to a synthetic gradiometer consisting of a second-order axial gradiometer and three orthogonal reference magnetometers. The evaluation turned out to be very effective in optimizing the parameters for signal processing. In addition, the method can serve as a useful tool for hardware improvement.
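
    The three performance indexes can be given plausible generic forms. The sketch below computes time-domain correlation, a relative-peak-height ratio, and frequency-domain correlation between the coil-injected reference and the processed output; the exact definitions are assumptions, not the paper's.

    ```python
    import numpy as np

    def mcg_integrity_indexes(reference, processed):
        """Three signal-integrity indexes for a simulated MCG signal."""
        # 1. Correlation in the time domain
        r_time = np.corrcoef(reference, processed)[0, 1]
        # 2. Relative height of the two largest peaks (e.g. R and T waves)
        def top2_ratio(x):
            second, largest = np.sort(x)[-2:]
            return second / largest
        peak_ratio = top2_ratio(processed) / top2_ratio(reference)
        # 3. Correlation of the magnitude spectra (frequency domain)
        s_ref = np.abs(np.fft.rfft(reference))
        s_out = np.abs(np.fft.rfft(processed))
        r_freq = np.corrcoef(s_ref, s_out)[0, 1]
        return r_time, peak_ratio, r_freq
    ```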

  14. Quantitative evaluation of yeast's requirement for glycerol formation in very high ethanol performance fed-batch process

    PubMed Central

    2010-01-01

    Background: Glycerol is the major by-product, accounting for up to 5% of the carbon in Saccharomyces cerevisiae ethanolic fermentation. Decreasing glycerol formation may redirect part of the carbon toward ethanol production. However, abolishing glycerol formation strongly affects the yeast's robustness towards the different types of stress occurring in an industrial process. In order to assess whether glycerol production can be reduced to a certain extent without jeopardising growth and stress tolerance, the yeast's capacity to synthesize glycerol was adjusted by fine-tuning the activity of the rate-controlling enzyme glycerol 3-phosphate dehydrogenase (GPDH). Two engineered strains whose specific GPDH activity was significantly reduced by two different degrees were comprehensively characterized in a previously developed Very High Ethanol Performance (VHEP) fed-batch process. Results: The prototrophic strain CEN.PK113-7D was chosen for decreasing glycerol formation capacity. The fine-tuned reduction of specific GPDH activity was achieved by replacing the native GPD1 promoter in the yeast genome with previously generated, well-characterized TEF promoter mutant versions in a gpd2Δ background. Two TEF promoter mutant versions were selected for this study, resulting in residual GPDH activities of 55 and 6%, respectively. The corresponding strains are referred to here as TEFmut7 and TEFmut2. The genetic modifications were accompanied by a strong reduction in glycerol yield on glucose; the reduction compared to the wild type was 61% in TEFmut7 and 88% in TEFmut2. The overall ethanol production yield on glucose was improved from 0.43 g g-1 in the wild type to 0.44 g g-1 in TEFmut7 and 0.45 g g-1 in TEFmut2. Although the maximal growth rate in the engineered strains was reduced by 20 and 30% for TEFmut7 and TEFmut2, respectively, the strains' ethanol stress robustness was hardly affected; i.e. values for final ethanol concentration (117 ± 4 g L-1), growth

  15. Development of a combined in vitro cell culture--quantitative PCR assay for evaluating the disinfection performance of pulsed light for treating the waterborne enteroparasite Giardia lamblia.

    PubMed

    Garvey, Mary; Stocca, Alessia; Rowan, Neil

    2014-09-01

    Giardia lamblia is a flagellated protozoan parasite that is recognised as a frequent cause of water-borne disease in humans and animals. We report for the first time on the use of a combined in vitro HCT-8 cell culture-quantitative PCR assay for evaluating the efficacy of pulsed UV light for treating G. lamblia parasites. Findings showed that current methods, which are limited to using vital stains before and after cyst excystation, are not appropriate for monitoring or evaluating cyst destruction post PUV-treatment. Use of the human ileocecal HCT-8 cell line was superior to that of the human colon Caco-2 cell line for in vitro culture and for determining the PUV sensitivity of treated cysts. G. lamblia cysts were also shown to be more resistant to PUV irradiation than similar numbers of Cryptosporidium parvum oocysts. These observations also show that use of this HCT-8 cell culture assay may replace the use of animal models for determining the disinfection performance of PUV for treating both C. parvum and G. lamblia. Copyright © 2014 Elsevier Inc. All rights reserved.

  16. Quantitative and chemical fingerprint analysis for the quality evaluation of Isatis indigotica based on ultra-performance liquid chromatography with photodiode array detector combined with chemometric methods.

    PubMed

    Shi, Yan-Hong; Xie, Zhi-Yong; Wang, Rui; Huang, Shan-Jun; Li, Yi-Ming; Wang, Zheng-Tao

    2012-01-01

    A simple and reliable method using ultra-performance liquid chromatography with photodiode array detection (UPLC-PDA) was developed to control the quality of Radix Isatidis (the dried root of Isatis indigotica) through chemical fingerprint analysis and quantitative analysis of eight bioactive constituents: R,S-goitrin, progoitrin, epiprogoitrin, gluconapin, adenosine, uridine, guanosine, and hypoxanthine. In the quantitative analysis, the eight components showed good regression (R > 0.9997) within the test ranges, and recoveries ranged from 99.5% to 103.0%. The UPLC fingerprints of the Radix Isatidis samples were compared using chemometric procedures, including similarity analysis, hierarchical clustering analysis, and principal component analysis. The chemometric procedures classified Radix Isatidis and its finished products such that all samples could be successfully grouped into crude herbs, prepared slices, and the adulterant Baphicacanthis cusiae Rhizoma et Radix. The combination of quantitative and chromatographic fingerprint analysis can be used for the quality assessment of Radix Isatidis and its finished products.
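
    The chemometric workflow named in this record (similarity analysis, hierarchical clustering, PCA) operates on a samples-by-common-peaks matrix of peak areas. The sketch below is a minimal, generic version under that assumption, using cosine similarity against a mean reference fingerprint; it is not the Similarity Evaluation System software or the exact procedure used in the study.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import PCA

def fingerprint_chemometrics(peak_areas, n_groups=3):
    """peak_areas: (n_samples, n_common_peaks) matrix of chromatographic peak areas."""
    # Similarity analysis: cosine similarity of each sample to the mean fingerprint
    ref = peak_areas.mean(axis=0)
    sims = peak_areas @ ref / (np.linalg.norm(peak_areas, axis=1) * np.linalg.norm(ref))
    # Hierarchical clustering analysis on autoscaled data (Ward linkage)
    z = (peak_areas - peak_areas.mean(axis=0)) / peak_areas.std(axis=0)
    groups = fcluster(linkage(z, method="ward"), n_groups, criterion="maxclust")
    # Principal component analysis for a two-dimensional overview
    scores = PCA(n_components=2).fit_transform(z)
    return sims, groups, scores
```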

  17. Performance of phalangeal quantitative ultrasound parameters in the evaluation of reduced bone mineral density assessed by DX in patients with 21 hydroxylase deficiency.

    PubMed

    Gonçalves, Ezequiel M; Sewaybricker, Leticia E; Baptista, Fatima; Silva, Analiza M; Carvalho, Wellington R G; Santos, Allan O; de Mello, Maricilda P; Lemos-Marini, Sofia H V; Guerra, Gil

    2014-07-01

    The purpose of this study was to verify the performance of quantitative ultrasound (QUS) parameters of the proximal phalanges in the evaluation of reduced bone mineral density (BMD) in patients with congenital adrenal hyperplasia due to 21-hydroxylase deficiency (21 OHD). Seventy patients with 21 OHD (41 females and 29 males), aged 6-27 y, were assessed. The QUS measurements, amplitude-dependent speed of sound (AD-SoS), bone transmission time (BTT), and ultrasound bone profile index (UBPI), were obtained using the BMD Sonic device (IGEA, Carpi, Italy) on the last four proximal phalanges of the non-dominant hand. BMD was determined by dual-energy X-ray absorptiometry (DXA) for the total body and lumbar spine (LS). Total body and LS BMD were positively correlated with UBPI, BTT and AD-SoS (correlation coefficients ranged from 0.59-0.72, p < 0.001). In contrast, when comparing patients with normal and low (Z-score < -2) BMD, no differences were found in the QUS parameters. Furthermore, by receiver operating characteristic (ROC) curve analysis, UBPI, BTT and AD-SoS measurements were not effective for diagnosing reduced BMD. Although AD-SoS, BTT and UBPI showed significant correlations with the data obtained by DXA, they were not effective for diagnosing reduced bone mass in patients with 21 OHD. Copyright © 2014 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.

  18. Quantitative framework for prospective motion correction evaluation.

    PubMed

    Pannetier, Nicolas A; Stavrinos, Theano; Ng, Peter; Herbst, Michael; Zaitsev, Maxim; Young, Karl; Matson, Gerald; Schuff, Norbert

    2016-02-01

    The aim was to establish a framework for evaluating the performance of prospective motion correction (PMC) in MRI while considering motion variability between MRI scans. A framework was developed to obtain quantitative comparisons between different motion correction setups, considering that varying intrinsic motion patterns between acquisitions can induce bias. Intrinsic motion was accounted for by replaying, in a phantom experiment, the motion trajectories recorded from subjects. T1-weighted MRI on five volunteers and two different marker fixations (mouth guard and nose bridge) were used to test the framework. Two metrics were investigated to quantify the improvement of image quality with PMC. Motion patterns vary between subjects as well as between repeated scans within a subject. This variability can be approximated by replaying the motion in a distinct phantom experiment and used as a covariate in models comparing motion corrections. We show that considering the intrinsic motion alters the statistical significance when comparing marker fixations. As an example, two marker fixations, a mouth guard and a nose bridge, were evaluated in terms of their effectiveness for PMC; the mouth guard achieved better PMC performance. Intrinsic motion patterns can bias comparisons between PMC configurations and must be considered for robust evaluations. A framework for evaluating intrinsic motion patterns in PMC is presented. © 2015 Wiley Periodicals, Inc.

  19. Quantitative framework for prospective motion correction evaluation

    PubMed Central

    Pannetier, Nicolas; Stavrinos, Theano; Ng, Peter; Herbst, Michael; Zaitsev, Maxim; Young, Karl; Matson, Gerald; Schuff, Norbert

    2014-01-01

    Purpose To establish a framework for evaluating the performance of prospective motion correction (PMC) in MRI while considering motion variability between MRI scans. Method A framework was developed to obtain quantitative comparisons between different motion correction setups, considering that varying intrinsic motion patterns between acquisitions can induce bias. Intrinsic motion was accounted for by replaying, in a phantom experiment, the motion trajectories recorded from subjects. T1-weighted MRI on five volunteers and two different marker fixations (mouth guard and nose bridge) were used to test the framework. Two metrics were investigated to quantify the improvement of image quality with PMC. Results Motion patterns vary between subjects as well as between repeated scans within a subject. This variability can be approximated by replaying the motion in a distinct phantom experiment and used as a covariate in models comparing motion corrections. We show that considering the intrinsic motion alters the statistical significance when comparing marker fixations. As an example, two marker fixations, a mouth guard and a nose bridge, were evaluated in terms of their effectiveness for PMC; the mouth guard achieved better PMC performance. Conclusion Intrinsic motion patterns can bias comparisons between PMC configurations and must be considered for robust evaluations. A framework for evaluating intrinsic motion patterns in PMC is presented. PMID:25761550

  20. Evaluating quantitative research designs: Part 1.

    PubMed

    Haughey, B P

    1994-10-01

    This article has provided an overview of the three major types of quantitative designs commonly used in nursing research, as well as some criteria for evaluating the designs of published research. The next column will include additional criteria for critiquing quantitative research designs.

  1. C-arm cone beam CT guidance of sinus and skull base surgery: quantitative surgical performance evaluation and development of a novel high-fidelity phantom

    NASA Astrophysics Data System (ADS)

    Vescan, A. D.; Chan, H.; Daly, M. J.; Witterick, I.; Irish, J. C.; Siewerdsen, J. H.

    2009-02-01

    Surgical simulation has become a critical component of surgical practice and training in the era of high-precision image-guided surgery. While the ability to simulate surgery of the paranasal sinuses and skull base has been conventionally limited to 3D digital simulation or cadaveric dissection, we have developed novel methods employing rapid prototyping technology and 3D printing to create high-fidelity models from real patient images (CT or MR). Such advances allow creation of patient-specific models for preparation, simulation, and training before embarking on the actual surgery. A major challenge was the development of novel material formulations compatible with the rapid prototyping process while presenting anatomically realistic flexibility, cut-ability, drilling purchase, and density (CT number). Initial studies have yielded realistic models of the paranasal sinuses and skull base for simulation and training in image-guided surgery. The process of model development and material selection is reviewed along with the application of the phantoms in studies of high-precision surgery guided by C-arm cone-beam CT (CBCT). Surgical performance is quantitatively evaluated under CBCT guidance, with the high-fidelity phantoms providing an excellent test-bed for reproducible studies across a broad spectrum of challenging surgical tasks. Future work will broaden the atlas of models to include normal anatomical variations as well as a broad spectrum of benign and malignant disease. The role of high-fidelity models produced by rapid prototyping is discussed in the context of patient-specific case simulation, novel technology development (specifically CBCT guidance), and training of future generations of sinus and skull base surgeons.

  2. Influence of sulphur-fumigation on the quality of white ginseng: a quantitative evaluation of major ginsenosides by high performance liquid chromatography.

    PubMed

    Jin, Xin; Zhu, Ling-Ying; Shen, Hong; Xu, Jun; Li, Song-Lin; Jia, Xiao-Bin; Cai, Hao; Cai, Bao-Chang; Yan, Ru

    2012-12-01

    White ginseng has been reported to be sulphur-fumigated during post-harvest handling. In the present study, the influence of sulphur-fumigation on the quality of white ginseng and its decoction was quantitatively evaluated through simultaneous quantification of 14 major ginsenosides by a validated high performance liquid chromatography method. A Poroshell 120 EC-C18 (100 mm × 3.0 mm, 2.7 μm) column was chosen for the separation of the major ginsenosides, which were eluted with a water-acetonitrile gradient as the mobile phase. The analytes were monitored by UV at 203 nm. The method was validated in terms of linearity, sensitivity, precision, accuracy and stability. The sulphur-fumigated and non-fumigated white ginseng samples, as well as their respective decoctions, were comparatively analysed with the newly validated method. It was found that the contents of the nine ginsenosides detected in the raw materials decreased by about 3-85% after sulphur-fumigation, and that their total content decreased by almost 54%. On the other hand, the contents of the 10 ginsenosides detected in the decoctions of sulphur-fumigated white ginseng were decreased by about 33-83%, and their total content was decreased by up to 64%, compared with those of non-fumigated white ginseng. In addition, ginsenosides Rh(2) and Rg(5) could be detected in the decoctions of sulphur-fumigated white ginseng but not in those of non-fumigated white ginseng. It is suggested that sulphur-fumigation can significantly influence not only the contents of the original ginsenosides, but also the decocting-induced chemical transformation of ginsenosides in white ginseng.

  3. Combination of quantitative analysis and chemometric analysis for the quality evaluation of three different frankincenses by ultra high performance liquid chromatography and quadrupole time of flight mass spectrometry.

    PubMed

    Zhang, Chao; Sun, Lei; Tian, Run-tao; Jin, Hong-yu; Ma, Shuang-Cheng; Gu, Bing-ren

    2015-10-01

    Frankincense has gained increasing attention in the pharmaceutical industry because of its pharmacologically active components such as boswellic acids. However, the identification and overall quality evaluation of the three different frankincense species recognized in different pharmacopoeias and in the literature have rarely been reported. In this paper, quantitative analysis and chemometric evaluation were established and applied for the quality control of frankincense, with the quantitative and chemometric analyses conducted under the same analytical conditions. In total, 55 samples from four habitats (three species) of frankincense were collected, and six boswellic acids were chosen for quantitative analysis. Chemometric analyses such as similarity analysis, hierarchical cluster analysis, and principal component analysis were used to identify frankincense of the three species and to reveal the correlation between components and species. In addition, 12 chromatographic peaks were tentatively identified using reference substances and quadrupole time-of-flight mass spectrometry. The results indicated that the total boswellic acid profiles of the three species of frankincense are similar, and that their fingerprints can be used to differentiate between them.

  4. Evaluating mandibular cortical index quantitatively.

    PubMed

    Yasar, Fusun; Akgunlu, Faruk

    2008-10-01

    The aim was to assess whether Fractal Dimension and Lacunarity analysis can discriminate between patients having different mandibular cortical shapes. Panoramic radiographs of 52 patients were evaluated for the mandibular cortical index; weighted kappa between the observations varied between 0.718 and 0.805. The radiographs were scanned and converted to binary images, and Fractal Dimension and Lacunarity were calculated from the regions that best represent the cortical morphology. There were statistically significant differences in Fractal Dimension and Lacunarity between radiographs classified as Cl 1 and Cl 2 (Fractal Dimension P:0.000; Lacunarity P:0.003) and between Cl 1 and Cl 3 cortical morphology (Fractal Dimension P:0.008; Lacunarity P:0.001), but there was no statistically significant difference between radiographs classified as Cl 2 and Cl 3 cortical morphology (Fractal Dimension P:1.000; Lacunarity P:0.758). FD and L can differentiate Cl 1 mandibular cortical shape from both Cl 2 and Cl 3 mandibular cortical shape but cannot differentiate Cl 2 from Cl 3 on panoramic radiographs.

  5. Evaluating Mandibular Cortical Index Quantitatively

    PubMed Central

    Yasar, Fusun; Akgunlu, Faruk

    2008-01-01

    Objectives The aim was to assess whether Fractal Dimension and Lacunarity analysis can discriminate between patients having different mandibular cortical shapes. Methods Panoramic radiographs of 52 patients were evaluated for the mandibular cortical index; weighted kappa between the observations varied between 0.718 and 0.805. The radiographs were scanned and converted to binary images, and Fractal Dimension and Lacunarity were calculated from the regions that best represent the cortical morphology. Results There were statistically significant differences in Fractal Dimension and Lacunarity between radiographs classified as Cl 1 and Cl 2 (Fractal Dimension P:0.000; Lacunarity P:0.003) and between Cl 1 and Cl 3 cortical morphology (Fractal Dimension P:0.008; Lacunarity P:0.001), but there was no statistically significant difference between radiographs classified as Cl 2 and Cl 3 cortical morphology (Fractal Dimension P:1.000; Lacunarity P:0.758). Conclusions FD and L can differentiate Cl 1 mandibular cortical shape from both Cl 2 and Cl 3 mandibular cortical shape but cannot differentiate Cl 2 from Cl 3 on panoramic radiographs. PMID:19212535
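
    The two texture measures used in these records can be computed from a thresholded radiograph with a standard box-counting procedure. The sketch below is a minimal, generic implementation, not the authors' exact pipeline: it assumes a binary NumPy array of the cortical region, uses non-overlapping boxes (the classical lacunarity definition uses gliding boxes), and leaves region selection and thresholding to the reader.

```python
import numpy as np

def box_counting_fd(img, sizes=(2, 4, 8, 16, 32, 64)):
    """Fractal dimension of a binary image by box counting: the slope of
    log N(s) versus log(1/s), where N(s) counts s x s boxes with foreground."""
    counts = []
    for s in sizes:
        h, w = img.shape
        crop = img[:h - h % s, :w - w % s]        # trim so boxes tile exactly
        blocks = crop.reshape(crop.shape[0] // s, s, crop.shape[1] // s, s)
        counts.append(blocks.any(axis=(1, 3)).sum())
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

def lacunarity(img, s):
    """Lacunarity at box size s from non-overlapping boxes:
    second moment of box mass over squared first moment."""
    h, w = img.shape
    crop = img[:h - h % s, :w - w % s]
    masses = crop.reshape(crop.shape[0] // s, s, crop.shape[1] // s, s).sum(axis=(1, 3))
    return masses.var() / masses.mean() ** 2 + 1.0
```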

  6. Apprentice Performance Evaluation.

    ERIC Educational Resources Information Center

    Gast, Clyde W.

    The Granite City (Illinois) Steel apprentices are under a performance evaluation from entry to graduation. Federally approved, the program is guided by joint apprenticeship committees whose monthly meetings include performance evaluation from three information sources: journeymen, supervisors, and instructors. Journeymen's evaluations are made…

  7. Evaluation of chondral repair using quantitative MRI.

    PubMed

    Nieminen, Miika T; Nissi, Mikko J; Mattila, Lauri; Kiviranta, Ilkka

    2012-12-01

    Various quantitative magnetic resonance imaging (qMRI) biomarkers, including but not limited to parametric MRI mapping, semiquantitative evaluation, and morphological assessment, have been successfully applied to assess cartilage repair in both animal and human studies. Through the interaction between interstitial water and constituent macromolecules the compositional and structural properties of cartilage can be evaluated. In this review a comprehensive view of a variety of quantitative techniques, particularly those involving parametric mapping, and their relationship to the properties of cartilage repair is presented. Some techniques, such as T2 relaxation time mapping and delayed gadolinium-enhanced MRI of cartilage (dGEMRIC), are well established, while the full potential of more recently introduced techniques remain to be demonstrated. A combination of several MRI techniques is necessary for a comprehensive characterization of chondral repair. Copyright © 2012 Wiley Periodicals, Inc.

  8. [Quantitative evaluation of the Romberg test].

    PubMed

    Nieschalk, M; Delank, K W; Stoll, W

    1995-08-01

    Simple and economical measuring platforms are available to aid the ENT clinician in examining vestibulospinal disorders. The aim of our study was to interpret Romberg test measurements quantitatively. Calculating the area between the zero line and the sway curves in the sagittal and lateral directions, for the Romberg test with eyes closed and with eyes open, enables quantification of body sway. For data evaluation, we developed a triangular diagram that provides a graphic representation and quantifies the vestibulospinal reaction at a glance. We assessed a group of 80 persons without any symptoms of peripheral or central vestibular disturbance. Selected patients complaining of vestibular disorders illustrate the objective and quantitative interpretation of the Romberg test, which is otherwise often analysed more subjectively.

  9. Influence of processing procedure on the quality of Radix Scrophulariae: a quantitative evaluation of the main compounds obtained by accelerated solvent extraction and high-performance liquid chromatography.

    PubMed

    Cao, Gang; Wu, Xin; Li, Qinglin; Cai, Hao; Cai, Baochang; Zhu, Xuemei

    2015-02-01

    An improved method combining accelerated solvent extraction with high-performance liquid chromatography and diode array detection was used to simultaneously determine six compounds in crude and processed Radix Scrophulariae samples. Accelerated solvent extraction parameters such as the extraction solvent, temperature, number of cycles, and the analysis procedure were systematically optimized. The results indicated that, compared with crude Radix Scrophulariae samples, the processed samples had lower contents of harpagide and harpagoside but higher contents of catalpol, acteoside, angoroside C, and cinnamic acid. The established method was sufficiently rapid and reliable for the global quality evaluation of crude and processed herbal medicines. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Chemical fingerprint and quantitative analysis for the quality evaluation of Vitex negundo seeds by reversed-phase high-performance liquid chromatography coupled with hierarchical clustering analysis.

    PubMed

    Shu, Zhiheng; Li, Xiuqing; Rahman, Khalid; Qin, Luping; Zheng, Chengjian

    2016-01-01

    A simple and efficient method was developed for the chemical fingerprint analysis and simultaneous determination of four phenylnaphthalene-type lignans in Vitex negundo seeds using high-performance liquid chromatography with diode array detection. For fingerprint analysis, 13 V. negundo seed samples were collected from different regions in China, and the fingerprint chromatograms were matched by the computer-aided Similarity Evaluation System for Chromatographic Fingerprint of TCM (Version 2004A). A total of 21 common peaks found in all the chromatograms were used for evaluating the similarity between these samples. Additionally, simultaneous quantification of four major bioactive ingredients was conducted to assess the quality of V. negundo seeds. Our results indicated that the contents of four lignans in V. negundo seeds varied remarkably in herbal samples collected from different regions. Moreover, the hierarchical clustering analysis grouped these 13 samples into three categories, which was consistent with the chemotypes of those chromatograms. The method developed in this study provides a substantial foundation for the establishment of reasonable quality control standards for V. negundo seeds.

  11. Evaluating steam trap performance

    SciTech Connect

    Fuller, N.Y.

    1985-08-08

    This paper presents a method for evaluating the performance level of steam traps by preparing an economic analysis of several types to determine the equivalent uniform annual cost. A series of tests on steam traps supplied by six manufacturers provided data for determining the relative efficiencies of each unit. The comparison was made using a program developed for the Texas Instruments TI-59 programmable calculator to evaluate overall steam trap economics.

  12. Laboratory Evaluations of the Enterococcus qPCR Method for Recreational Water Quality Testing: Method Performance and Sources of Uncertainty in Quantitative Measurements

    EPA Science Inventory

    The BEACH Act of 2000 directed the U.S. EPA to establish more expeditious methods for the detection of pathogen indicators in coastal waters, as well as new water quality criteria based on these methods. Progress has been made in developing a quantitative PCR (qPCR) method for en...

  14. Evaluating Performance of Components

    NASA Technical Reports Server (NTRS)

    Katz, Daniel; Tisdale, Edwin; Norton, Charles

    2004-01-01

    Parallel Component Performance Benchmarks is a computer program developed to aid the evaluation of the Common Component Architecture (CCA) - a software architecture, based on a component model, that was conceived to foster high-performance computing, including parallel computing. More specifically, this program compares the performances (principally by measuring computing times) of componentized versus conventional versions of the Parallel Pyramid 2D Adaptive Mesh Refinement library - a software library that is used to generate computational meshes for solving physical problems and that is typical of software libraries in use at NASA's Jet Propulsion Laboratory.

  15. A QUANTITATIVE TECHNIQUE FOR PERFORMING PLASMAPHERESIS

    PubMed Central

    Melnick, Daniel; Cowgill, George R.

    1936-01-01

    1. A special apparatus and technique are described which permit one to conduct plasmapheresis quantitatively. 2. The validity of the methods employed for determining serum protein concentration and blood volume, as prerequisites for calculating the amount of blood to be withdrawn, is discussed. PMID:19870575

  16. Performing quantitative MFM measurements on soft magnetic nanostructures.

    PubMed

    Rawlings, Colin; Durkan, Colm

    2012-11-16

    We have extended our previous work (Rawlings et al 2010 Phys. Rev. B 82 085404) on simulating magnetic force microscopy (MFM) images for magnetically soft samples to include an accurate representation of coated MFM tips. We used an array of square 500 nm nanomagnets to evaluate our improved MFM model. A quantitative comparison between model and experiment was performed for lift heights ranging from 20 to 100 nm. No fitting parameters were used in our comparison. For all lift heights the qualitative agreement between model and experiment was significantly improved. At low lift heights, where the magnetic signal was strong, the difference between theory and experiment was less than 30%.

  17. Performing quantitative MFM measurements on soft magnetic nanostructures

    NASA Astrophysics Data System (ADS)

    Rawlings, Colin; Durkan, Colm

    2012-11-01

    We have extended our previous work (Rawlings et al 2010 Phys. Rev. B 82 085404) on simulating magnetic force microscopy (MFM) images for magnetically soft samples to include an accurate representation of coated MFM tips. We used an array of square 500 nm nanomagnets to evaluate our improved MFM model. A quantitative comparison between model and experiment was performed for lift heights ranging from 20 to 100 nm. No fitting parameters were used in our comparison. For all lift heights the qualitative agreement between model and experiment was significantly improved. At low lift heights, where the magnetic signal was strong, the difference between theory and experiment was less than 30%.

  18. Performance Evaluation Process.

    ERIC Educational Resources Information Center

    1998

    This document contains four papers from a symposium on the performance evaluation process and human resource development (HRD). "Assessing the Effectiveness of OJT (On the Job Training): A Case Study Approach" (Julie Furst-Bowe, Debra Gates) is a case study of the effectiveness of OJT in one of a high-tech manufacturing company's product…

  19. Quantitative nondestructive evaluation: Requirements for tomorrow's reliability

    NASA Technical Reports Server (NTRS)

    Heyman, Joseph S.

    1991-01-01

    Quantitative Nondestructive Evaluation (QNDE) is the technology of measurement, analysis, and prediction of the state of material/structural systems for safety, reliability, and mission assurance. QNDE has impact on everyday life from the cars we drive, the planes we fly, the buildings we work or live in, literally to the infrastructure of our world. Here, researchers highlight some of the new sciences and technologies that are part of a safer, cost effective tomorrow. Specific technologies that are discussed are thermal QNDE of aircraft structural integrity, ultrasonic QNDE for materials characterization, and technology spinoffs from aerospace to the medical sector. In each case, examples are given of how new requirements result in enabling measurement technologies, which in turn change the boundaries of design/practice.

  20. Evacuation performance evaluation tool.

    PubMed

    Farra, Sharon; Miller, Elaine T; Gneuhs, Matthew; Timm, Nathan; Li, Gengxin; Simon, Ashley; Brady, Whittney

    2016-01-01

    Hospitals conduct evacuation exercises to improve performance during emergency events. An essential aspect of this process is the creation of reliable and valid evaluation tools. The objective of this article is to describe the development and implications of a disaster evacuation performance tool that measures one portion of the very complex process of evacuation. Through the application of the Delphi technique and DeVellis's framework, disaster and neonatal experts provided input into developing this performance evaluation tool. Following development, the content validity and reliability of the tool were assessed. The setting was a large pediatric hospital and medical center in the Midwest. The tool was pilot tested with an administrative, medical, and nursing leadership group and then implemented with a group of 68 healthcare workers during a disaster exercise of a neonatal intensive care unit (NICU). The tool demonstrated high content validity, with a scale validity index of 0.979 and an inter-rater reliability G coefficient of 0.984 (95% CI: 0.948-0.9952). The Delphi process based on the conceptual framework of DeVellis yielded a psychometrically sound evacuation performance evaluation tool for a NICU.

  1. Assessment beyond Performance: Phenomenography in Educational Evaluation

    ERIC Educational Resources Information Center

    Micari, Marina; Light, Gregory; Calkins, Susanna; Streitwieser, Bernhard

    2007-01-01

    Increasing calls for accountability in education have promoted improvements in quantitative evaluation approaches that measure student performance; however, this has often been to the detriment of qualitative approaches, reducing the richness of educational evaluation as an enterprise. In this article the authors assert that it is not merely…

  2. Quantitative performance assessments for neuromagnetic imaging systems.

    PubMed

    Koga, Ryo; Hiyama, Ei; Matsumoto, Takuya; Sekihara, Kensuke

    2013-01-01

    We have developed a Monte-Carlo simulation method to assess the performance of neuromagnetic imaging systems using two kinds of performance metrics: A-prime metric and spatial resolution. We compute these performance metrics for virtual sensor systems having 80, 160, 320, and 640 sensors, and discuss how the system performance is improved, depending on the number of sensors. We also compute these metrics for existing whole-head MEG systems, MEGvision™ (Yokogawa Electric Corporation, Tokyo, Japan) that uses axial-gradiometer sensors, and TRIUX™ (Elekta Corporate, Stockholm, Sweden) that uses planar-gradiometer and magnetometer sensors. We discuss performance comparisons between these significantly different systems.

  3. A Novel Assessment Tool for Quantitative Evaluation of Science Literature Search Performance: Application to First-Year and Senior Undergraduate Biology Majors

    ERIC Educational Resources Information Center

    Blank, Jason M.; McGaughey, Karen J.; Keeling, Elena L.; Thorp, Kristen L.; Shannon, Conor C.; Scaramozzino, Jeanine M.

    2016-01-01

    Expertise in searching and evaluating scientific literature is a requisite skill of trained scientists and science students, yet information literacy instruction varies greatly among institutions and programs. To ensure that science students acquire information literacy skills, robust methods of assessment are needed. Here, we describe a novel…

  5. A quantitative method to evaluate corrosion products in tissues.

    PubMed

    Cabrini, R L; Olmedo, D G; Guglielmotti, M B

    2003-01-01

    The use of odontological or orthopedic metal implants requires techniques to assess the tissue response to corrosion processes. In previous experimental studies we showed the deposition of corrosion products not only locally (Olmedo et al., Implant Dent 2003; 12: 75-80) but also systemically (Olmedo et al., J Mater Sci: Mater Med 2002; 13: 793-796) in organs such as the liver, spleen and lung. The aim of the present study was to propose a method to quantitatively assess the tissue deposits of the corrosion products of the materials used to manufacture implants. The samples (liver and lung) were embedded in paraffin, and the histological sections were standardized for thickness. The quantitative evaluation of the deposits was performed with an MPM-800 microscope (Carl Zeiss). The light microscopy images were digitalized and then analyzed employing the DNA-IBAS-Kontron software, which allows for the identification and evaluation of cells loaded with corrosion products (20x objective). The following end-points were assessed: total field area, number of deposits of corrosion products, partial and total area of the deposits, and the ratio between deposit volume and tissue volume. The proposed method serves to quantitatively evaluate, at the light microscopy level, the deposition of corrosion products in tissues.
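
    The end-points listed in this record (deposit count, partial and total deposit areas, and the deposit-to-tissue ratio) can be reproduced on a segmented image with standard labeling tools. The sketch below is a generic illustration, not the DNA-IBAS-Kontron pipeline: it assumes the deposit-loaded cells have already been segmented into a binary mask, and it uses an area fraction as a stand-in for the volume ratio.

```python
import numpy as np
from scipy import ndimage

def quantify_deposits(mask, pixel_area_um2=1.0):
    """mask: boolean array, True where corrosion-product deposits were segmented."""
    labels, n_deposits = ndimage.label(mask)              # number of deposits
    partial_areas = np.asarray(
        ndimage.sum(mask, labels, index=range(1, n_deposits + 1))
    ) * pixel_area_um2                                    # area of each deposit
    total_field_area = mask.size * pixel_area_um2
    return {
        "n_deposits": n_deposits,
        "partial_areas": partial_areas,
        "total_deposit_area": float(partial_areas.sum()),
        "deposit_to_tissue_ratio": float(partial_areas.sum() / total_field_area),
    }
```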

  6. A Quantitative Evaluation of Dissolved Oxygen Instrumentation

    NASA Technical Reports Server (NTRS)

    Pijanowski, Barbara S.

    1971-01-01

    The implications of the presence of dissolved oxygen in water are discussed in terms of its deleterious or beneficial effects, depending on the functional consequences to those affected, e.g., the industrialist, the oceanographer, and the ecologist. The paper is devoted primarily to an examination of the performance of five commercially available dissolved oxygen meters. The design of each is briefly reviewed and ease or difficulty of use in the field described. Specifically, the evaluation program treated a number of parameters and user considerations including an initial check and trial calibration for each instrument and a discussion of the measurement methodology employed. Detailed test results are given relating to the effects of primary power variation, water-flow sensitivity, response time, relative accuracy of dissolved-oxygen readout, temperature accuracy (for those instruments which included this feature), error and repeatability, stability, pressure and other environmental effects, and test results obtained in the field. Overall instrument performance is summarized comparatively by chart.

  7. Quantitative evaluation of chemisorption processes on semiconductors

    NASA Astrophysics Data System (ADS)

    Rothschild, A.; Komem, Y.; Ashkenasy, N.

    2002-12-01

    This article presents a method for numerical computation of the degree of coverage of chemisorbates and the resultant surface band bending as a function of the ambient gas pressure, temperature, and semiconductor doping level. This method enables quantitative evaluation of the effect of chemisorption on the electronic properties of semiconductor surfaces, such as the work function and surface conductivity, which is of great importance for many applications such as solid-state chemical sensors and electro-optical devices. The method is applied for simulating the chemisorption behavior of oxygen on n-type CdS, a process that has been investigated extensively due to its impact on the photoconductive properties of CdS photodetectors. The simulation demonstrates that the chemisorption of adions saturates when the Fermi level becomes aligned with the chemisorption-induced surface states, limiting their coverage to a small fraction of a monolayer. The degree of coverage of chemisorbed adions is proportional to the square root of the doping level, while neutral adsorbates are independent of the doping level. It is shown that the chemisorption of neutral adsorbates behaves according to the well-known Langmuir model, regardless of the existence of charged species on the surface, while charged adions do not obey Langmuir's isotherm. In addition, it is found that in depletive chemisorption processes the resultant surface band bending increases by 2.3 kT (where k is the Boltzmann constant and T is the temperature) when the gas pressure increases by one order of magnitude or when the doping level increases by two orders of magnitude.
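
    The 2.3 kT figure quoted at the end follows directly from the Boltzmann factor: a tenfold pressure increase shifts the band bending by kT ln(10) ≈ 2.3 kT. A quick numerical check is below (constants from scipy.constants; the function name is illustrative):

```python
import math
from scipy.constants import k, e

def band_bending_step_eV(T=300.0, decades=1.0):
    """Increase in surface band bending per `decades` orders of magnitude
    in gas pressure: kT*ln(10) per decade (~59 meV at 300 K)."""
    return k * T * math.log(10.0) * decades / e

print(band_bending_step_eV())  # ~0.059 eV per decade at room temperature
```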

  8. Functional Performance Evaluation

    NASA Technical Reports Server (NTRS)

    Greenisen, Michael C.; Hayes, Judith C.; Siconolfi, Steven F.; Moore, Alan D.

    1999-01-01

    The Extended Duration Orbiter Medical Project (EDOMP) was established to address specific issues associated with optimizing the ability of crews to complete mission tasks deemed essential to entry, landing, and egress for spaceflights lasting up to 16 days. The main objectives of this functional performance evaluation were to investigate the physiological effects of long-duration spaceflight on skeletal muscle strength and endurance, as well as aerobic capacity and orthostatic function. Long-duration exposure to a microgravity environment may produce physiological alterations that affect crew ability to complete critical tasks such as extravehicular activity (EVA), intravehicular activity (IVA), and nominal or emergency egress. Ultimately, this information will be used to develop and verify countermeasures. The answers to three specific functional performance questions were sought: (1) What are the performance decrements resulting from missions of varying durations? (2) What are the physical requirements for successful entry, landing, and emergency egress from the Shuttle? and (3) What combination of preflight fitness training and in-flight countermeasures will minimize in-flight muscle performance decrements? To answer these questions, the Exercise Countermeasures Project looked at physiological changes associated with muscle degradation as well as orthostatic intolerance. A means of ensuring motor coordination was necessary to maintain proficiency in piloting skills, EVA, and IVA tasks. In addition, it was necessary to maintain musculoskeletal strength and function to meet the rigors associated with moderate altitude bailout and with nominal or emergency egress from the landed Orbiter. Eight investigations, referred to as Detailed Supplementary Objectives (DSOs) 475, 476, 477, 606, 608, 617, 618, and 624, were conducted to study muscle degradation and the effects of exercise on exercise capacity and orthostatic function (Table 3-1). This chapter is divided into

  9. Sample metallization for performance improvement in desorption/ionization of kilodalton molecules: quantitative evaluation, imaging secondary ion MS, and laser ablation.

    PubMed

    Delcorte, A; Bour, J; Aubriet, F; Muller, J-F; Bertrand, P

    2003-12-15

    The metallization procedure, proposed recently for signal improvement in organic secondary ion mass spectrometry (SIMS) (Delcorte, A.; Médard, N.; Bertrand, P. Anal. Chem. 2002, 74, 4955), has been thoroughly tested for a set of kilodalton molecules bearing various functional groups: Irganox 1010, polystyrene, polyalanine, and copper phthalocyanine. In addition to gold, we evaluate the effect of silver evaporation as a sample treatment prior to static SIMS analysis. Ion yields, damage cross sections, and emission efficiencies are compared for Ag- and Au-metallized molecular films, pristine coatings on silicon, and submonolayers of the same molecules adsorbed on silver and gold. The results are sample-dependent but as an example, the yield enhancement calculated for metallized Irganox films with respect to untreated coatings is larger than 2 orders of magnitude for the quasimolecular ion and a factor of 1-10 for characteristic fragments. Insights into the emission processes of quasimolecular ions from metallized surfaces are deduced from kinetic energy distribution measurements. The advantage of the method for imaging SIMS applications is illustrated by the study of a nonuniform coating of polystyrene oligomers on a 100-microm polypropylene film. The evaporated metal eliminates sample charging and allows us to obtain enhanced quality images of characteristic fragment ions as well as reasonably contrasted chemical mappings for cationized PS oligomers and large PP chain segments. Finally, we report on the benefit of using metal evaporation as a sample preparation procedure for laser ablation mass spectrometry. Our results show that the fingerprint spectra of Au-covered polystyrene, polypropylene, and Irganox films can be readily obtained under 337-nm irradiation, a wavelength for which the absorption of polyolefins is low. This is probably because the gold clusters embedded in the sample surface absorb and transfer the photon energy to the surrounding organic medium.

  10. A Computational Framework for Quantitative Evaluation of Movement during Rehabilitation

    NASA Astrophysics Data System (ADS)

    Chen, Yinpeng; Duff, Margaret; Lehrer, Nicole; Sundaram, Hari; He, Jiping; Wolf, Steven L.; Rikakis, Thanassis

    2011-06-01

    This paper presents a novel generalized computational framework for quantitative kinematic evaluation of movement in a rehabilitation clinic setting. The framework integrates clinical knowledge and computational data-driven analysis together in a systematic manner. The framework provides three key benefits to rehabilitation: (a) the resulting continuous normalized measure allows the clinician to monitor movement quality on a fine scale and easily compare impairments across participants, (b) the framework reveals the effect of individual movement components on the composite movement performance, helping the clinician decide the training foci, and (c) the evaluation runs in real-time, which allows the clinician to constantly track a patient's progress and make appropriate adaptations to the therapy protocol. The creation of such an evaluation is difficult because of the sparse amount of recorded clinical observations, the high dimensionality of movement, and the high variation in subjects' performance. We address these issues by modeling the evaluation function as a linear combination of multiple normalized kinematic attributes, y = Σ_i w_i φ_i(x_i), and estimating the attribute normalization functions φ_i(·) by integrating distributions of idealized movement and deviated movement. The weights w_i are derived from a therapist's pair-wise comparisons using a modified RankSVM algorithm. We have applied this framework to evaluate upper limb movement for stroke survivors with excellent results - the evaluation results are highly correlated to the therapist's observations.
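
    A minimal sketch of the two computational pieces described above: the composite score y = Σ_i w_i φ_i(x_i), and weight estimation from a therapist's pairwise preferences via the usual reduction of RankSVM to classification on attribute-difference vectors. The normalization functions φ_i and all data handling are assumed given; this is not the authors' modified algorithm.

```python
import numpy as np
from sklearn.svm import LinearSVC

def learn_weights(preferred_pairs):
    """preferred_pairs: list of (phi_better, phi_worse) normalized attribute
    vectors, where the therapist judged the first movement to be better."""
    diffs = [a - b for a, b in preferred_pairs]
    X = np.array(diffs + [-d for d in diffs])          # symmetric training set
    y = np.array([1] * len(diffs) + [-1] * len(diffs))
    svm = LinearSVC(fit_intercept=False).fit(X, y)     # linear ranking function
    return svm.coef_.ravel()                           # the weights w_i

def movement_score(phi, w):
    """Composite evaluation y = sum_i w_i * phi_i(x_i)."""
    return float(np.dot(w, phi))
```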

  11. A study on the quantitative evaluation of skin barrier function

    NASA Astrophysics Data System (ADS)

    Maruyama, Tomomi; Kabetani, Yasuhiro; Kido, Michiko; Yamada, Kenji; Oikaze, Hirotoshi; Takechi, Yohei; Furuta, Tomotaka; Ishii, Shoichi; Katayama, Haruna; Jeong, Hieyong; Ohno, Yuko

    2015-03-01

    We propose a quantitative method for evaluating skin barrier function using an Optical Coherence Microscopy (OCM) system that exploits the coherence of near-infrared light. Many skin problems, such as itching and irritation, are recognized to be caused by impairment of the skin barrier function, which prevents damage from various external stimuli and loss of water. The common strategy for evaluating skin barrier function is to observe the skin surface and ask patients about their skin condition; such judgements are subjective and are influenced by differences in examiner experience. Microscopy has been used to observe the inner structure of the skin in detail, but in vitro measurements of this kind require tissue sampling. An objective, quantitative evaluation method is therefore needed, one that is non-invasive and non-destructive and permits examination of changes over time, which makes in vivo measurement crucial for evaluating skin barrier function. In this study, we evaluate changes in the stratum corneum structure, which is important for skin barrier function, by comparing water-penetrated skin with normal skin using the OCM system. The proposed method obtains in vivo 3D images of the inner structure of body tissue non-invasively and non-destructively. We formulate the changes in skin ultrastructure after water penetration. Finally, we evaluate the performance limits of the OCM system in this work in order to discuss how to improve it.

  12. Evaluating quantitative proton-density-mapping methods.

    PubMed

    Mezer, Aviv; Rokem, Ariel; Berman, Shai; Hastie, Trevor; Wandell, Brian A

    2016-10-01

    Quantitative magnetic resonance imaging (qMRI) aims to quantify tissue parameters by eliminating instrumental bias. We describe qMRI theory, simulations, and software designed to estimate proton density (PD), the apparent local concentration of water protons in the living human brain. First, we show that, in the absence of noise, multichannel coil data contain enough information to separate PD and coil sensitivity, a limiting instrumental bias. Second, we show that, in the presence of noise, regularization by a constraint on the relationship between T1 and PD produces accurate coil sensitivity and PD maps. The ability to measure PD quantitatively has applications in the analysis of in-vivo human brain tissue and enables multisite comparisons between individuals and across instruments. Hum Brain Mapp 37:3623-3635, 2016. © 2016 Wiley Periodicals, Inc.

  13. Algorithm performance evaluation

    NASA Astrophysics Data System (ADS)

    Smith, Richard N.; Greci, Anthony M.; Bradley, Philip A.

    1995-03-01

    Traditionally, the performance of adaptive antenna systems is measured using automated antenna array pattern measuring equipment, which produces a plot of the receive gain of the antenna array as a function of angle. However, communications system users more readily accept and understand bit error rate (BER) as a performance measure. The work reported here was conducted to characterize adaptive antenna receiver performance in terms of overall communications system performance, using BER as the measure. The adaptive antenna system selected for this work featured a linear array, a least mean square (LMS) adaptive algorithm, and a high speed phase shift keyed (PSK) communications modem.
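
    For reference, the LMS algorithm mentioned here adapts the array weights from an error signal against a known reference (for a PSK modem, typically training symbols); BER would then be measured on the demodulated output. The sketch below is a textbook complex LMS update, not the specific system described in this record.

```python
import numpy as np

def lms_beamformer(snapshots, reference, mu=0.01):
    """snapshots: (n_snapshots, n_elements) complex array samples;
    reference: desired signal (e.g., known PSK training symbols)."""
    w = np.zeros(snapshots.shape[1], dtype=complex)
    for x, d in zip(snapshots, reference):
        y = np.vdot(w, x)              # array output y = w^H x
        e = d - y                      # error against the reference
        w = w + mu * np.conj(e) * x    # LMS update: w <- w + mu * e* * x
    return w
```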

  14. Design, Implementation and Multisite Evaluation of a System Suitability Protocol for the Quantitative Assessment of Instrument Performance in Liquid Chromatography-Multiple Reaction Monitoring-MS (LC-MRM-MS)*

    PubMed Central

    Abbatiello, Susan E.; Mani, D. R.; Schilling, Birgit; MacLean, Brendan; Zimmerman, Lisa J.; Feng, Xingdong; Cusack, Michael P.; Sedransk, Nell; Hall, Steven C.; Addona, Terri; Allen, Simon; Dodder, Nathan G.; Ghosh, Mousumi; Held, Jason M.; Hedrick, Victoria; Inerowicz, H. Dorota; Jackson, Angela; Keshishian, Hasmik; Kim, Jong Won; Lyssand, John S.; Riley, C. Paige; Rudnick, Paul; Sadowski, Pawel; Shaddox, Kent; Smith, Derek; Tomazela, Daniela; Wahlander, Asa; Waldemarson, Sofia; Whitwell, Corbin A.; You, Jinsam; Zhang, Shucha; Kinsinger, Christopher R.; Mesri, Mehdi; Rodriguez, Henry; Borchers, Christoph H.; Buck, Charles; Fisher, Susan J.; Gibson, Bradford W.; Liebler, Daniel; MacCoss, Michael; Neubert, Thomas A.; Paulovich, Amanda; Regnier, Fred; Skates, Steven J.; Tempst, Paul; Wang, Mu; Carr, Steven A.

    2013-01-01

    Multiple reaction monitoring (MRM) mass spectrometry coupled with stable isotope dilution (SID) and liquid chromatography (LC) is increasingly used in biological and clinical studies for precise and reproducible quantification of peptides and proteins in complex sample matrices. Robust LC-SID-MRM-MS-based assays that can be replicated across laboratories and ultimately in clinical laboratory settings require standardized protocols to demonstrate that the analysis platforms are performing adequately. We developed a system suitability protocol (SSP), which employs a predigested mixture of six proteins, to facilitate performance evaluation of LC-SID-MRM-MS instrument platforms, configured with nanoflow-LC systems interfaced to triple quadrupole mass spectrometers. The SSP was designed for use with low multiplex analyses as well as high multiplex approaches when software-driven scheduling of data acquisition is required. Performance was assessed by monitoring of a range of chromatographic and mass spectrometric metrics including peak width, chromatographic resolution, peak capacity, and the variability in peak area and analyte retention time (RT) stability. The SSP, which was evaluated in 11 laboratories on a total of 15 different instruments, enabled early diagnoses of LC and MS anomalies that indicated suboptimal LC-MRM-MS performance. The observed range in variation of each of the metrics scrutinized serves to define the criteria for optimized LC-SID-MRM-MS platforms for routine use, with pass/fail criteria for system suitability performance measures defined as peak area coefficient of variation <0.15, peak width coefficient of variation <0.15, standard deviation of RT <0.15 min (9 s), and the RT drift <0.5 min (30 s). The deleterious effect of a marginally performing LC-SID-MRM-MS system on the limit of quantification (LOQ) in targeted quantitative assays illustrates the use and need for a SSP to establish robust and reliable system performance. Use of a SSP helps

  15. Design, implementation and multisite evaluation of a system suitability protocol for the quantitative assessment of instrument performance in liquid chromatography-multiple reaction monitoring-MS (LC-MRM-MS).

    PubMed

    Abbatiello, Susan E; Mani, D R; Schilling, Birgit; Maclean, Brendan; Zimmerman, Lisa J; Feng, Xingdong; Cusack, Michael P; Sedransk, Nell; Hall, Steven C; Addona, Terri; Allen, Simon; Dodder, Nathan G; Ghosh, Mousumi; Held, Jason M; Hedrick, Victoria; Inerowicz, H Dorota; Jackson, Angela; Keshishian, Hasmik; Kim, Jong Won; Lyssand, John S; Riley, C Paige; Rudnick, Paul; Sadowski, Pawel; Shaddox, Kent; Smith, Derek; Tomazela, Daniela; Wahlander, Asa; Waldemarson, Sofia; Whitwell, Corbin A; You, Jinsam; Zhang, Shucha; Kinsinger, Christopher R; Mesri, Mehdi; Rodriguez, Henry; Borchers, Christoph H; Buck, Charles; Fisher, Susan J; Gibson, Bradford W; Liebler, Daniel; Maccoss, Michael; Neubert, Thomas A; Paulovich, Amanda; Regnier, Fred; Skates, Steven J; Tempst, Paul; Wang, Mu; Carr, Steven A

    2013-09-01

    Multiple reaction monitoring (MRM) mass spectrometry coupled with stable isotope dilution (SID) and liquid chromatography (LC) is increasingly used in biological and clinical studies for precise and reproducible quantification of peptides and proteins in complex sample matrices. Robust LC-SID-MRM-MS-based assays that can be replicated across laboratories and ultimately in clinical laboratory settings require standardized protocols to demonstrate that the analysis platforms are performing adequately. We developed a system suitability protocol (SSP), which employs a predigested mixture of six proteins, to facilitate performance evaluation of LC-SID-MRM-MS instrument platforms, configured with nanoflow-LC systems interfaced to triple quadrupole mass spectrometers. The SSP was designed for use with low multiplex analyses as well as high multiplex approaches when software-driven scheduling of data acquisition is required. Performance was assessed by monitoring of a range of chromatographic and mass spectrometric metrics including peak width, chromatographic resolution, peak capacity, and the variability in peak area and analyte retention time (RT) stability. The SSP, which was evaluated in 11 laboratories on a total of 15 different instruments, enabled early diagnoses of LC and MS anomalies that indicated suboptimal LC-MRM-MS performance. The observed range in variation of each of the metrics scrutinized serves to define the criteria for optimized LC-SID-MRM-MS platforms for routine use, with pass/fail criteria for system suitability performance measures defined as peak area coefficient of variation <0.15, peak width coefficient of variation <0.15, standard deviation of RT <0.15 min (9 s), and the RT drift <0.5 min (30 s). The deleterious effect of a marginally performing LC-SID-MRM-MS system on the limit of quantification (LOQ) in targeted quantitative assays illustrates the use and need for a SSP to establish robust and reliable system performance. Use of a SSP helps
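
    The pass/fail thresholds quoted in these two records translate directly into a replicate-level check. The sketch below applies them to per-replicate measurements for one peptide; the exact definition of RT drift (here, the max-min range in minutes) is an assumption.

```python
import numpy as np

def system_suitability(peak_areas, peak_widths, retention_times):
    """Apply the SSP pass/fail criteria: CV < 0.15 for peak area and
    peak width, SD of RT < 0.15 min, RT drift < 0.5 min."""
    cv = lambda x: np.std(x, ddof=1) / np.mean(x)
    checks = {
        "peak_area_cv": cv(peak_areas) < 0.15,
        "peak_width_cv": cv(peak_widths) < 0.15,
        "rt_sd_min": np.std(retention_times, ddof=1) < 0.15,
        "rt_drift_min": np.ptp(retention_times) < 0.5,
    }
    return all(checks.values()), checks
```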

  16. Milankovitch radiation variations: a quantitative evaluation.

    PubMed

    Shaw, D M; Donn, W L

    1968-12-13

    A quantitative determination of the changes in surface temperature caused by the variations in insolation calculated by Milankovitch has been made using the thermodynamic model of Adem. Under extreme conditions, mean coolings of 3.1 degrees and 2.7 degrees C, respectively, at latitudes 25 degrees and 65 degrees N are obtained for the Milankovitch radiation cycles. At the sensitive latitude 65 degrees N, the mean cooling below the present temperature for the times of radiation minimum is only 1.4 degrees C. This result indicates that the Milankovitch effect is too small to have triggered glacial climates.

  17. Intralaboratory development and evaluation of a high-performance liquid chromatography-fluorescence method for detection and quantitation of aflatoxins M1, B1, B2, G1, and G2 in animal liver.

    PubMed

    Shao, Dahai; Imerman, Paula M; Schrunk, Dwayne E; Ensley, Steve M; Rumbeiha, Wilson K

    2016-11-01

    Aflatoxins are potent mycotoxins with effects that include hepatotoxicity, immunosuppression, and suppression of animal growth and production. The etiologic diagnosis of aflatoxicosis, which is largely based on analysis of contaminated feed matrices, has significant disadvantages given the fact that representative feed samples may not be available and feed-based test methods are not confirmatory of an etiologic diagnosis. A tissue-based analytical method for biomarkers of exposure would be valuable for confirmation of aflatoxicosis. We describe in-house development and evaluation of a high-performance liquid chromatographic method with fluorescence detection and precolumn derivatization for determination of aflatoxins M1, B1, B2, G1, and G2 in animal liver. The method demonstrates good selectivity for the tested aflatoxins in the liver matrix. The overall range was 0.03-0.10 ng/g for limit of detection and 0.09-0.18 ng/g for limit of quantitation. The correlation coefficient (R²) of the calibration curves was >0.9978 for AFM1, 0.9995 for AFB1, 0.9986 for AFB2, 0.9983 for AFG1, and 0.9980 for AFG2. For fortification levels of 0.2-10 ng/g, repeatability was 10-18% for AFM1, 7-14% for AFB1, 5-14% for AFB2, 6-16% for AFG1, and 10-15% for AFG2. Recovery was 52-57% for AFM1, 54-62% for AFB1, 55-61% for AFB2, 57-67% for AFG1, and 61-65% for AFG2. There was no liver matrix effect found. The method is rugged against minor changes based on the selected factors. The results indicate that the proposed method is suitable for quantitative determination of aflatoxins M1, B1, B2, G1, and G2 in liver. © 2016 The Author(s).

  18. Vendor Performance Evaluation.

    ERIC Educational Resources Information Center

    Grant, Joan; Perelmuter, Susan

    1978-01-01

    Vendor selection can mean success or failure of an approval plan; this study evaluates three book vendors by comparing their plans on the basis of speed, bibliographic accuracy, and discounts. (Author/CWM)

  19. Foucault test: a quantitative evaluation method.

    PubMed

    Rodríguez, Gustavo; Villa, Jesús; Ivanov, Rumen; González, Efrén; Martínez, Geminiano

    2016-08-01

    Reliable and accurate testing methods are essential for guiding the polishing process during the figuring of optical telescope mirrors. With the natural advancement of technology, the procedures and instruments used to carry out this delicate task have consistently increased in sensitivity, but also in complexity and cost. Fortunately, throughout history, the Foucault knife-edge test has shown the potential to measure transverse aberrations on the order of the wavelength, mainly when described in terms of physical theory, which allows a quantitative interpretation of its characteristic shadow maps. Our previous publication on this topic derived a closed mathematical formulation that directly relates the knife-edge position to the observed irradiance pattern. The present work addresses the largely unexplored problem of estimating the wavefront's gradient from experimental captures of the test, which is achieved by means of an optimization algorithm featuring a proposed ad hoc cost function. The partial derivatives thereby calculated are then integrated by a Fourier-based algorithm to retrieve the mirror's actual surface profile. To date and to the best of our knowledge, this is the first time that a complete, mathematically grounded treatment of this optical phenomenon has been presented, complemented by an image-processing algorithm that allows a quantitative calculation of the slope at any given point of the mirror's surface, so that it becomes possible to accurately estimate the aberrations present in the analyzed concave device directly from its associated foucaultgrams.
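
    The final integration step, recovering a surface from its measured partial derivatives, is commonly done with a Fourier-based least-squares method such as Frankot-Chellappa. The record does not name the authors' specific algorithm, so the sketch below is one standard possibility, assuming gradient maps p = ∂z/∂x and q = ∂z/∂y sampled on a regular grid.

```python
import numpy as np

def frankot_chellappa(p, q):
    """Integrate a gradient field (p = dz/dx, q = dz/dy) into a surface z
    by least squares in the Fourier domain (Frankot-Chellappa)."""
    rows, cols = p.shape
    u = 2.0 * np.pi * np.fft.fftfreq(cols)     # spatial frequency along x
    v = 2.0 * np.pi * np.fft.fftfreq(rows)     # spatial frequency along y
    U, V = np.meshgrid(u, v)
    P, Q = np.fft.fft2(p), np.fft.fft2(q)
    denom = U**2 + V**2
    denom[0, 0] = 1.0                          # avoid dividing by zero at DC
    Z = (-1j * U * P - 1j * V * Q) / denom
    Z[0, 0] = 0.0                              # mean height is arbitrary
    return np.real(np.fft.ifft2(Z))
```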

  20. Simulation evaluation of quantitative myocardial perfusion assessment from cardiac CT

    NASA Astrophysics Data System (ADS)

    Bindschadler, Michael; Modgil, Dimple; Branch, Kelley R.; La Riviere, Patrick J.; Alessio, Adam M.

    2014-03-01

    Contrast enhancement on cardiac CT provides valuable information about myocardial perfusion, and methods have been proposed to assess perfusion with static and dynamic acquisitions. There is a lack of knowledge and consensus on the appropriate approach to ensure 1) sufficient diagnostic accuracy for clinical decisions and 2) low radiation doses for patient safety. This work developed a thorough dynamic CT simulation and several accepted blood flow estimation techniques to evaluate the performance of perfusion assessment across a range of acquisition and estimation scenarios. Cardiac CT acquisitions were simulated for a range of flow states (flow = 0.5, 1, 2, 3 ml/g/min; cardiac output = 3, 5, 8 L/min) with a validated CT simulator incorporating polyenergetic data acquisition and realistic x-ray flux levels, for dynamic acquisitions spanning a range of scenarios including 1, 2, 3 s sampling for 30 s with 25, 70, 140 mAs. Images were generated using conventional image reconstruction with additional image-based beam hardening correction to account for iodine content. Time attenuation curves were extracted for multiple regions around the myocardium and used to estimate flow. In total, 2,700 independent realizations of dynamic sequences were generated and multiple MBF estimation methods were applied to each. Evaluation of quantitative kinetic modeling yielded blood flow estimates with a root mean square error (RMSE) of ~0.6 ml/g/min averaged across multiple scenarios. Semi-quantitative modeling and qualitative static imaging resulted in significantly more error (RMSE of ~1.2 ml/g/min for each). For quantitative methods, dose reduction through reduced temporal sampling or reduced tube current had comparable impact on MBF estimate fidelity. On average, half-dose acquisitions increased the RMSE of estimates by only 18%, suggesting that substantial dose reductions can be employed in the context of quantitative myocardial perfusion assessment.
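
    One member of this family of estimators is the maximum-slope method, which divides the peak upslope of the myocardial time-attenuation curve by the peak arterial enhancement; the sketch below uses invented curves and is a simplification of the kinetic models evaluated in the paper:

```python
import numpy as np

t = np.arange(0.0, 30.0, 2.0)                    # s, 2-s temporal sampling
aif = 300.0 * np.exp(-((t - 12.0) / 4.0) ** 2)   # aortic input curve, HU
myo = 30.0 * (1.0 + np.tanh((t - 14.0) / 5.0))   # myocardial curve, HU

peak_upslope = np.gradient(myo, t).max()         # HU/s
# Maximum-slope model: flow ~ peak tissue upslope / peak arterial value,
# converted to ml/min per 100 g assuming roughly unit tissue density.
mbf = peak_upslope / aif.max() * 60.0 * 100.0
print(f"MBF estimate: {mbf:.0f} ml/min/100g")
```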

  1. Instrument performance evaluation

    SciTech Connect

    Swinth, K.L.

    1993-03-01

    Deficiencies exist in both the performance and the quality of health physics instruments. Recognizing the implications of such deficiencies for the protection of workers and the public, in the early 1980s the DOE and the NRC encouraged the development of a performance standard and established a program to test a series of instruments against criteria in the standard. The purpose of the testing was to establish the practicality of the criteria in the standard, to determine the performance of a cross section of available instruments, and to establish a testing capability. Over 100 instruments were tested, resulting in a practical standard and an understanding of the deficiencies in available instruments. In parallel with the instrument testing, a value-impact study clearly established the benefits of implementing a formal testing program. An ad hoc committee also met several times to establish recommendations for the voluntary implementation of a testing program based on the studies and the performance standard. For several reasons, a formal program did not materialize. Ongoing tests and studies have supported the development of specific instruments and have helped specific clients understand the performance of their instruments. The purpose of this presentation is to trace the history of instrument testing to date and suggest the benefits of a centralized formal program.

  2. Quantitative Evaluation of Management Courses: Part 1

    ERIC Educational Resources Information Center

    Cunningham, Cyril

    1973-01-01

    The author describes how he developed a method of evaluating and comparing management courses of different types and lengths by applying an ordinal system of relative values using a process of transmutation. (MS)

  3. Quantitative evaluation of Radix Paeoniae Alba sulfur-fumigated with different durations and purchased from herbal markets: simultaneous determination of twelve components belonging to three chemical types by improved high performance liquid chromatography-diode array detector.

    PubMed

    Kong, Ming; Liu, Huan-Huan; Xu, Jun; Wang, Chun-Ru; Lu, Ming; Wang, Xiao-Ning; Li, You-Bin; Li, Song-Lin

    2014-09-01

    In this study, an improved high performance liquid chromatography-diode array detector (HPLC-DAD) method for simultaneous quantification of twelve major components belonging to three chemical types was developed and validated, and was applied to quantitatively compare the quality of Radix Paeoniae Alba (RPA) sulfur-fumigated for different durations and purchased from commercial herbal markets. The contents of paeoniflorin, benzoylpaeoniflorin, oxypaeoniflorin, benzoic acid and paeonol decreased, whereas that of paeoniflorin sulfonate increased, as the sulfur-fumigation duration was extended. Different levels of paeoniflorin sulfonate were determined in ten of seventeen commercial RPA samples, indicating that these ten samples may have been sulfur-fumigated for different durations. Moreover, the relative standard deviation of the content of each component was higher in the commercial sulfur-fumigated RPA samples than in the commercial non-fumigated RPA samples, and the percentage of the total average content of monoterpene glycosides among the determined analytes was higher in the decoctions of commercial sulfur-fumigated RPA than in commercial non-fumigated RPA samples. All these results suggest that the established method is precise, accurate and sensitive enough for global quality evaluation of sulfur-fumigated RPA, and that sulfur-fumigation not only changes the proportions of bioactive components but also reduces the quality consistency of both raw materials and aqueous decoctions of RPA. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. Quantitative and Qualitative Change in Children's Mental Rotation Performance

    ERIC Educational Resources Information Center

    Geiser, Christian; Lehmann, Wolfgang; Corth, Martin; Eid, Michael

    2008-01-01

    This study investigated quantitative and qualitative changes in mental rotation performance and solution strategies with a focus on sex differences. German children (N = 519) completed the Mental Rotations Test (MRT) in the 5th and 6th grades (interval: one year; age range at time 1: 10-11 years). Boys on average outperformed girls on both…

  5. CONFOCAL MICROSCOPY SYSTEM PERFORMANCE: QA TESTS, QUANTITATION AND SPECTROSCOPY

    EPA Science Inventory

    Confocal Microscopy System Performance: QA tests, Quantitation and Spectroscopy.

    Robert M. Zucker 1 and Jeremy M. Lerner 2,
    1Reproductive Toxicology Division, National Health and Environmental Effects Research Laboratory, Office of Research Development, U.S. Environmen...

  7. Quantitative versus Qualitative Evaluation: A Tool to Decide Which to Use

    ERIC Educational Resources Information Center

    Dobrovolny, Jackie L.; Fuentes, Stephanie Christine G.

    2008-01-01

    Evaluation is often avoided in human performance technology (HPT), but it is an essential and frequently catalytic activity that adds significant value to projects. Knowing how to approach an evaluation and whether to use qualitative, quantitative, or both methods makes evaluation much easier. In this article, we provide tools to help determine…

  8. Quantitative evaluation of sparfloxacin binding to urological catheter surfaces.

    PubMed

    Kowalczuk, D; Gowin, E; Miazga-Karska, M

    2012-01-01

    Our aim was to apply a high-performance liquid chromatography method for quantitative evaluation of the total amount of sparfloxacin (SPA) immobilized on the surface of antimicrobial urological catheters. The amounts of SPA bound to the catheter were determined indirectly on the basis of the differences in SPA concentrations before and after the immobilization process (they were shown to vary from 0.11 to 5.66 mg/g of catheter). We estimated the immobilization yield, which varied from 14% to 70% depending on the SPA concentration used. As in vitro release studies show, the antibiotic binds to the catheter matrix in two modes: a relatively stable covalent bond and a weak non-covalent bond. Antibacterial activity of the modified catheter samples with SPA was controlled by using the zone-of-inhibition test against gram-positive and gram-negative bacteria.
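
    The indirect quantitation described here is a simple mass balance; a toy illustration with hypothetical numbers:

```python
# SPA bound to the catheter = drop in solution concentration x volume,
# normalized by catheter mass. All values below are hypothetical.
c_before = 2.00       # mg/mL SPA before immobilization
c_after = 1.40        # mg/mL measured by HPLC after immobilization
volume = 5.0          # mL of solution in contact with the catheter
catheter_mass = 0.8   # g

bound_per_g = (c_before - c_after) * volume / catheter_mass  # mg/g
yield_pct = 100.0 * (c_before - c_after) / c_before          # % immobilized

print(f"{bound_per_g:.2f} mg SPA per g catheter, yield {yield_pct:.0f}%")
```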

  9. QUANTITATIVE EVALUATION OF FIRE SEPARATION AND BARRIERS

    SciTech Connect

    Coutts, D

    2007-04-17

    Fire barriers and physical separation are key components in managing the fire risk in Nuclear Facilities. The expected performance of these features has often been predicted using rules of thumb or expert judgment. These approaches often lack the convincing technical bases that exist when addressing other Nuclear Facility accident events. This paper presents science-based approaches to demonstrate the effectiveness of fire separation methods.

  10. [Reconstituting evaluation methods based on both qualitative and quantitative paradigms].

    PubMed

    Miyata, Hiroaki; Okubo, Suguru; Yoshie, Satoru; Kai, Ichiro

    2011-01-01

    Debate about the relationship between quantitative and qualitative paradigms is often muddled and confusing, and the clutter of terms and arguments has resulted in the concepts becoming obscure and unrecognizable. In this study we conducted a content analysis of evaluation methods in qualitative healthcare research. We extracted descriptions of four types of evaluation paradigm (validity/credibility, reliability/dependability, objectivity/confirmability, and generalizability/transferability), and classified them into subcategories. In quantitative research, there have been many evaluation methods based on qualitative paradigms, and vice versa. Thus, it may not be useful to treat evaluation methods of the qualitative paradigm as isolated from those of quantitative methods. Choosing practical evaluation methods suited to the situation and prior conditions of each study is an important approach for researchers.

  11. Designer substrate library for quantitative, predictive modeling of reaction performance

    PubMed Central

    Bess, Elizabeth N.; Bischoff, Amanda J.; Sigman, Matthew S.

    2014-01-01

    Assessment of reaction substrate scope is often a qualitative endeavor that provides general indications of substrate sensitivity to a measured reaction outcome. Unfortunately, this field standard typically falls short of enabling the quantitative prediction of new substrates’ performance. The disconnection between a reaction’s development and the quantitative prediction of new substrates’ behavior limits the applicative usefulness of many methodologies. Herein, we present a method by which substrate libraries can be systematically developed to enable quantitative modeling of reaction systems and the prediction of new reaction outcomes. Presented in the context of rhodium-catalyzed asymmetric transfer hydrogenation, these models quantify the molecular features that influence enantioselection and, in so doing, lend mechanistic insight to the modes of asymmetric induction. PMID:25267648

  12. A quantitative evaluation of alcohol withdrawal tremors.

    PubMed

    Aarabi, Parham; Norouzi, Narges; Dear, Taylor; Carver, Sally; Bromberg, Simon; Gray, Sara; Kahan, Mel; Borgundvaag, Bjug

    2015-01-01

    This paper evaluates the relation between Alcohol Withdrawal Syndrome tremors in the left and right hands of patients. By analyzing 122 recordings from 61 patients in emergency departments, we found a weak relationship between the left and right hand tremor frequencies (correlation coefficient of 0.63). We found a much stronger relationship between the expert physician tremor ratings (on the CIWA-Ar 0-7 scale) of the two hands, with a correlation coefficient of 0.923. Next, using a smartphone to collect the tremor data and a previously developed model for obtaining estimated tremor ratings, we also found a strong correlation (correlation coefficient of 0.852) between the estimates for each hand. Finally, we evaluated different methods of combining the data from the two hands to obtain a single tremor rating estimate, and found that simply averaging the tremor ratings of the two hands results in the lowest tremor estimate error (an RMSE of 0.977). Looking at the frequency dependence of this error, we found that higher frequency tremors had a much lower estimation error (an RMSE of 1.102 for tremors with frequencies in the 3-6 Hz range as compared to 0.625 for tremors with frequencies in the 7-10 Hz range).
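
    The best-performing combination reported, averaging the per-hand estimates and scoring them against the physician's rating with an RMSE, is straightforward to express; the arrays below are invented:

```python
import numpy as np

# Hypothetical smartphone-model rating estimates (CIWA-Ar 0-7 scale)
# for each hand, and the expert physician rating per recording.
left = np.array([3.1, 5.8, 1.2, 6.5])
right = np.array([2.7, 6.2, 1.9, 6.1])
physician = np.array([3.0, 6.0, 2.0, 6.0])

combined = (left + right) / 2.0                    # two-hand average
rmse = np.sqrt(np.mean((combined - physician) ** 2))
lr_corr = np.corrcoef(left, right)[0, 1]           # left-right agreement

print(f"RMSE = {rmse:.3f}, left-right correlation = {lr_corr:.3f}")
```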

  13. Evaluation of Performance Based Logistics

    DTIC Science & Technology

    2006-08-01

    Evaluation of Performance Based Logistics, by Jacques S. Gansler and William Lucyshyn, August 2006. (The remainder of this record is fragmentary; the recoverable portion, Figure 11 of the report, diagrams a third-party logistics (3PL) warehouse-provider arrangement among NAVICP-P, prime contractor Lockheed Martin, and subcontractor Michelin.)

  14. Evaluating and Improving Teacher Performance.

    ERIC Educational Resources Information Center

    Manatt, Richard P.

    This workbook, coordinated with Manatt Teacher Performance Evaluation (TPE) workshops, summarizes the large group presentations in sequence with the transparencies used. The first four modules of the workbook deal with the state of the art of evaluating and improving teacher performance; the development of the TPE system, including selection of…

  15. Evaluation (not validation) of quantitative models.

    PubMed

    Oreskes, N

    1998-12-01

    The present regulatory climate has led to increasing demands for scientists to attest to the predictive reliability of numerical simulation models used to help set public policy, a process frequently referred to as model validation. But while model validation may reveal useful information, this paper argues that it is not possible to demonstrate the predictive reliability of any model of a complex natural system in advance of its actual use. All models embed uncertainties, and these uncertainties can and frequently do undermine predictive reliability. In the case of lead in the environment, we may categorize model uncertainties as theoretical, empirical, parametrical, and temporal. Theoretical uncertainties are aspects of the system that are not fully understood, such as the biokinetic pathways of lead metabolism. Empirical uncertainties are aspects of the system that are difficult (or impossible) to measure, such as actual lead ingestion by an individual child. Parametrical uncertainties arise when complexities in the system are simplified to provide manageable model input, such as representing longitudinal lead exposure by cross-sectional measurements. Temporal uncertainties arise from the assumption that systems are stable in time. A model may also be conceptually flawed. The Ptolemaic system of astronomy is a historical example of a model that was empirically adequate but based on a wrong conceptualization. Yet had it been computerized--and had the word then existed--its users would have had every right to call it validated. Thus, rather than talking about strategies for validation, we should be talking about means of evaluation. That is not to say that language alone will solve our problems or that the problems of model evaluation are primarily linguistic. The uncertainties inherent in large, complex models will not go away simply because we change the way we talk about them. But this is precisely the point: calling a model validated does not make it valid.

  16. Evaluation (not validation) of quantitative models.

    PubMed Central

    Oreskes, N

    1998-01-01

    The present regulatory climate has led to increasing demands for scientists to attest to the predictive reliability of numerical simulation models used to help set public policy, a process frequently referred to as model validation. But while model validation may reveal useful information, this paper argues that it is not possible to demonstrate the predictive reliability of any model of a complex natural system in advance of its actual use. All models embed uncertainties, and these uncertainties can and frequently do undermine predictive reliability. In the case of lead in the environment, we may categorize model uncertainties as theoretical, empirical, parametrical, and temporal. Theoretical uncertainties are aspects of the system that are not fully understood, such as the biokinetic pathways of lead metabolism. Empirical uncertainties are aspects of the system that are difficult (or impossible) to measure, such as actual lead ingestion by an individual child. Parametrical uncertainties arise when complexities in the system are simplified to provide manageable model input, such as representing longitudinal lead exposure by cross-sectional measurements. Temporal uncertainties arise from the assumption that systems are stable in time. A model may also be conceptually flawed. The Ptolemaic system of astronomy is a historical example of a model that was empirically adequate but based on a wrong conceptualization. Yet had it been computerized--and had the word then existed--its users would have had every right to call it validated. Thus, rather than talking about strategies for validation, we should be talking about means of evaluation. That is not to say that language alone will solve our problems or that the problems of model evaluation are primarily linguistic. The uncertainties inherent in large, complex models will not go away simply because we change the way we talk about them. But this is precisely the point: calling a model validated does not make it valid.

  17. Quantitative code accuracy evaluation of ISP33

    SciTech Connect

    Kalli, H.; Miwrrin, A.; Purhonen, H.

    1995-09-01

    Aiming at quantifying code accuracy, a methodology based on the Fast Fourier Transform has been developed at the University of Pisa, Italy. The paper gives a short presentation of the methodology and its application to pre-test and post-test calculations submitted to the International Standard Problem ISP33. This was a double-blind natural circulation exercise with a stepwise reduced primary coolant inventory, performed in the PACTEL facility in Finland. PACTEL is a 1/305 volumetrically scaled, full-height simulator of the Russian type VVER-440 pressurized water reactor, with horizontal steam generators and loop seals in both cold and hot legs. Fifteen foreign organizations participated in ISP33, with 21 blind calculations and 20 post-test calculations; altogether, 10 different thermal hydraulic codes and code versions were used. The results of applying the methodology to nine selected measured quantities are summarized.
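
    The Pisa FFT-based method (FFTBM) condenses the code-versus-experiment discrepancy into spectral figures of merit; a minimal sketch of its usual average amplitude (AA) and weighted frequency (WF), assuming the two signals share a common time base (the test signals below are invented):

```python
import numpy as np

def fftbm(exp, calc):
    """Average amplitude (AA) and weighted frequency (WF) of the FFTBM.

    AA compares the spectrum of the code-experiment error with the
    spectrum of the experimental signal; lower AA means better accuracy.
    """
    F_err = np.abs(np.fft.rfft(calc - exp))
    F_exp = np.abs(np.fft.rfft(exp))
    freqs = np.fft.rfftfreq(len(exp))
    aa = F_err.sum() / F_exp.sum()
    wf = (F_err * freqs).sum() / F_err.sum()
    return aa, wf

# Usage with made-up traces: a measured decay and a slightly biased calculation.
t = np.linspace(0.0, 30.0, 301)
aa, wf = fftbm(np.exp(-t / 10.0), np.exp(-t / 9.0))
print(f"AA = {aa:.3f}, WF = {wf:.3f}")
```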

  18. Quantitative evaluation of cerebrospinal fluid shunt flow

    SciTech Connect

    Chervu, S.; Chervu, L.R.; Vallabhajosyula, B.; Milstein, D.M.; Shapiro, K.M.; Shulman, K.; Blaufox, M.D.

    1984-01-01

    The authors describe a rigorous method for measuring the flow of cerebrospinal fluid (CSF) in shunt circuits implanted for the relief of obstructive hydrocephalus. Clearance of radioactivity for several calibrated flow rates was determined with a Harvard infusion pump by injecting the Rickham reservoir of a Rickham-Holter valve system with 100 μCi of Tc-99m as pertechnetate. The elliptical and the cylindrical Holter valves used as adjunct valves with the Rickham reservoir yielded two different regression lines when the clearances were plotted against flow rates. The experimental regression lines were used to determine the in vivo flow rates from clearances calculated after injecting the Rickham reservoirs of the patients. The unique clearance characteristics of the individual shunt systems available require that calibration curves be derived for an entire system identical to the one implanted in the patient being evaluated, rather than just the injected chamber. Excellent correlation between flow rates and the clinical findings supports the reliability of this method of quantification of CSF shunt flow, and the results are fully accepted by neurosurgeons.
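
    The calibration step amounts to regressing measured clearance on pump-controlled flow rate and inverting the fit for patient studies; a sketch with invented calibration points for a single valve geometry (each valve type needs its own regression line, as the abstract notes):

```python
import numpy as np

flow = np.array([5.0, 10.0, 20.0, 40.0])      # pump flow rates, mL/h
clearance = np.array([0.9, 1.6, 3.1, 6.0])    # measured Tc-99m clearance

slope, intercept = np.polyfit(flow, clearance, 1)

def in_vivo_flow(measured_clearance):
    """Invert the bench calibration line to estimate patient shunt flow."""
    return (measured_clearance - intercept) / slope

print(f"Estimated CSF shunt flow: {in_vivo_flow(2.4):.1f} mL/h")
```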

  19. Quantitative evaluation of ocean thermal energy conversion (OTEC): executive briefing

    SciTech Connect

    Gritton, E.C.; Pei, R.Y.; Hess, R.W.

    1980-08-01

    Documentation is provided of a briefing summarizing the results of an independent quantitative evaluation of Ocean Thermal Energy Conversion (OTEC) for central station applications. The study concentrated on a central station power plant located in the Gulf of Mexico and delivering power to the mainland United States. The evaluation of OTEC is based on three important issues: resource availability, technical feasibility, and cost.

  20. Relevance of MTF and NPS in quantitative CT: towards developing a predictable model of quantitative performance

    NASA Astrophysics Data System (ADS)

    Chen, Baiyu; Richard, Samuel; Samei, Ehsan

    2012-03-01

    The quantification of lung nodule volume based on CT images provides valuable information for disease diagnosis and staging. However, the precision of the quantification is protocol, system, and technique dependent and needs to be evaluated for each specific case. To efficiently investigate the quantitative precision and find an optimal operating point, it is important to develop a predictive model based on basic system parameters. In this study, a Fourier-based metric, the estimability index (e'), was proposed as such a predictor and validated across a variety of imaging conditions. To first obtain the ground truth of quantitative precision, an anthropomorphic chest phantom with synthetic spherical nodules was imaged on a 64-slice CT scanner across a range of protocols (five exposure levels and two reconstruction algorithms). The volumes of the nodules were quantified from the images using clinical software, with the precision of the quantification calculated for each protocol. To predict the precision, e' was calculated for each protocol based on several Fourier-based figures of merit, which modeled the characteristics of the quantitation task and the imaging condition (resolution, noise, etc.) of a particular protocol. Results showed a strong correlation (R² = 0.92) between the measured and predicted precision across all protocols, indicating that e' is an effective predictor of quantitative precision. This study provides a useful framework for quantification-oriented optimization of CT protocols.

  1. Quantitative image quality evaluation for cardiac CT reconstructions

    NASA Astrophysics Data System (ADS)

    Tseng, Hsin-Wu; Fan, Jiahua; Kupinski, Matthew A.; Balhorn, William; Okerlund, Darin R.

    2016-03-01

    Maintaining image quality in the presence of motion is always desirable and challenging in clinical Cardiac CT imaging. Different image-reconstruction algorithms are available on current commercial CT systems that attempt to achieve this goal. It is widely accepted that image-quality assessment should be task-based and involve specific tasks, observers, and associated figures of merit. In this work, we developed an observer model that performed the task of estimating the percentage of plaque in a vessel from CT images. We compared task performance of Cardiac CT image data reconstructed using a conventional FBP reconstruction algorithm and the SnapShot Freeze (SSF) algorithm, each at default and optimal reconstruction cardiac phases. The purpose of this work is to design an approach for quantitative image-quality evaluation of temporal resolution for Cardiac CT systems. To simulate heart motion, a moving coronary-type phantom synchronized with an ECG signal was used. Three different percentage plaques embedded in a 3 mm vessel phantom were imaged multiple times under motion-free, 60 bpm, and 80 bpm heart rates. Static (motion-free) images of this phantom were taken as reference images for image template generation. Independent ROIs from the 60 bpm and 80 bpm images were generated by vessel tracking. The observer performed estimation tasks using these ROIs. Ensemble mean square error (EMSE) was used as the figure of merit. Results suggest that the quality of SSF images is superior to the quality of FBP images in higher heart-rate scans.

  2. Quantitative analysis of regional myocardial performance in coronary artery disease

    NASA Technical Reports Server (NTRS)

    Stewart, D. K.; Dodge, H. T.; Frimer, M.

    1975-01-01

    Findings from a group of subjects with significant coronary artery stenosis and from a group of controls are presented, determined by use of a quantitative method for the study of regional myocardial performance based on frame-by-frame analysis of biplane left ventricular angiograms. Particular emphasis was placed upon the analysis of wall motion in terms of normalized segment dimensions, timing and velocity of contraction. The results were compared with the method of subjective assessment used clinically.

  4. Performance Evaluation and Benchmarking of Intelligent Systems

    SciTech Connect

    Madhavan, Raj; Messina, Elena; Tunstel, Edward

    2009-09-01

    To design and develop capable, dependable, and affordable intelligent systems, their performance must be measurable. Scientific methodologies for standardization and benchmarking are crucial for quantitatively evaluating the performance of emerging robotic and intelligent systems technologies. There is currently no accepted standard for quantitatively measuring the performance of these systems against user-defined requirements; and furthermore, there is no consensus on what objective evaluation procedures need to be followed to understand the performance of these systems. The lack of reproducible and repeatable test methods has precluded researchers working towards a common goal from exchanging and communicating results, inter-comparing system performance, and leveraging previous work that could otherwise avoid duplication and expedite technology transfer. Currently, this lack of cohesion in the community hinders progress in many domains, such as manufacturing, service, healthcare, and security. By providing the research community with access to standardized tools, reference data sets, and open source libraries of solutions, researchers and consumers will be able to evaluate the cost and benefits associated with intelligent systems and associated technologies. In this vein, the edited book volume addresses performance evaluation and metrics for intelligent systems, in general, while emphasizing the need and solutions for standardized methods. To the knowledge of the editors, there is not a single book on the market that is solely dedicated to the subject of performance evaluation and benchmarking of intelligent systems. Even books that address this topic do so only marginally or are out of date. The research work presented in this volume fills this void by drawing from the experiences and insights of experts gained both through theoretical development and practical implementation of intelligent systems in a variety of diverse application domains. The book presents

  5. Performance evaluation soil samples utilizing encapsulation technology

    DOEpatents

    Dahlgran, James R.

    1999-01-01

    Performance evaluation soil samples, and a method of their preparation, use encapsulation technology to encapsulate analytes that are introduced into a soil matrix for analysis and evaluation by analytical laboratories. Target analytes are mixed in an appropriate solvent at predetermined concentrations. The mixture is emulsified in a solution of polymeric film-forming material. The emulsified solution is polymerized to form microcapsules. The microcapsules are recovered, quantitated and introduced into a soil matrix in a predetermined ratio to form soil samples with the desired analyte concentration.

  6. Performance evaluation soil samples utilizing encapsulation technology

    DOEpatents

    Dahlgran, J.R.

    1999-08-17

    Performance evaluation soil samples and method of their preparation uses encapsulation technology to encapsulate analytes which are introduced into a soil matrix for analysis and evaluation by analytical laboratories. Target analytes are mixed in an appropriate solvent at predetermined concentrations. The mixture is emulsified in a solution of polymeric film forming material. The emulsified solution is polymerized to form microcapsules. The microcapsules are recovered, quantitated and introduced into a soil matrix in a predetermined ratio to form soil samples with the desired analyte concentration. 1 fig.

  7. Performance evaluation soil samples utilizing encapsulation technology

    SciTech Connect

    Dahlgran, James R.

    1997-12-01

    Performance evaluation soil samples and method of their preparation are described using encapsulation technology to encapsulate analytes which are introduced into a soil matrix for analysis and evaluation by analytical laboratories. Target analytes are mixed in an appropriate solvent at predetermined concentrations. The mixture is emulsified in a solution of polymeric film forming material. The emulsified solution is polymerized to form microcapsules. The microcapsules are recovered, quantitated and introduced into a soil matrix in a predetermined ratio to form soil samples with the desired analyte concentration.

  8. Quantitative imaging biomarkers: a review of statistical methods for technical performance assessment.

    PubMed

    Raunig, David L; McShane, Lisa M; Pennello, Gene; Gatsonis, Constantine; Carson, Paul L; Voyvodic, James T; Wahl, Richard L; Kurland, Brenda F; Schwarz, Adam J; Gönen, Mithat; Zahlmann, Gudrun; Kondratovich, Marina V; O'Donnell, Kevin; Petrick, Nicholas; Cole, Patricia E; Garra, Brian; Sullivan, Daniel C

    2015-02-01

    Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers to measure changes in these features. Critical to the performance of a quantitative imaging biomarker in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in designs, analysis methods, and the metrics used to assess a quantitative imaging biomarker for clinical use. It is therefore difficult or impossible to integrate results from different studies or to use reported results to design studies. The Radiological Society of North America and the Quantitative Imaging Biomarker Alliance, with technical, radiological, and statistical experts, developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of quantitative imaging biomarker performance studies so that results from multiple studies can be compared, contrasted, or combined.
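
    One core metric standardized by this framework is repeatability from test-retest data; a minimal sketch using the usual convention that the repeatability coefficient is RC = 2.77 × within-subject SD (the measurements below are invented):

```python
import numpy as np

# Hypothetical test-retest biomarker measurements (e.g., volume in mL),
# one pair of scans per subject.
scan1 = np.array([10.2, 25.1, 7.9, 14.6, 31.0])
scan2 = np.array([10.8, 24.0, 8.4, 15.3, 29.7])

diff = scan2 - scan1
wsd = np.sqrt(np.mean(diff ** 2) / 2.0)  # within-subject standard deviation
rc = 2.77 * wsd                          # 95% repeatability coefficient

print(f"wSD = {wsd:.2f}, RC = {rc:.2f} (biomarker units)")
```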

  9. Colorimetric evaluation of display performance

    NASA Astrophysics Data System (ADS)

    Kosmowski, Bogdan B.

    2001-08-01

    The development of information techniques, using new technologies, physical phenomena and coding schemes, enables new application areas to benefit from the introduction of displays. Full utilization of the visual perception of a human operator requires the color coding process to be implemented. The evolution of displays, from achromatic (B&W) and monochromatic to multicolor and full-color, enhances the possibilities of information coding, creating, however, a need for quantitative methods of display parameter assessment. Quantitative assessment of color displays restricted to photometric measurements of their parameters is an estimate leading to considerable errors. Therefore, the measurements of a display's color properties have to be based on spectral measurements of the display and its elements. The quantitative assessment of display system parameters should be made using colorimetric systems like CIE1931, CIE1976 LAB or LUV. In the paper, the constraints on measurement method selection for color display evaluation are discussed, and the relations between their qualitative assessment and the ergonomic conditions of their application are presented. The paper presents examples of using the LUV colorimetric system and the color difference ΔE in the optimization of color liquid crystal displays.
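
    A minimal sketch of the kind of computation the paper advocates: converting spectroradiometric XYZ measurements of two display patches to CIE1976 L*u*v* and reporting their color difference ΔE*uv (the D65 reference white and the patch values are assumptions for illustration):

```python
import numpy as np

def xyz_to_luv(xyz, white=(95.047, 100.0, 108.883)):  # D65 reference white
    X, Y, Z = xyz
    def uv_prime(X, Y, Z):
        d = X + 15.0 * Y + 3.0 * Z
        return 4.0 * X / d, 9.0 * Y / d
    u, v = uv_prime(X, Y, Z)
    un, vn = uv_prime(*white)
    yr = Y / white[1]
    # CIE lightness with the standard low-luminance linear segment.
    L = 116.0 * yr ** (1 / 3) - 16.0 if yr > (6 / 29) ** 3 else (29 / 3) ** 3 * yr
    return np.array([L, 13.0 * L * (u - un), 13.0 * L * (v - vn)])

def delta_e_uv(xyz1, xyz2):
    """CIE1976 color difference between two measured patches."""
    return np.linalg.norm(xyz_to_luv(xyz1) - xyz_to_luv(xyz2))

# Hypothetical XYZ tristimulus values for two nominally identical patches.
print(f"dE*uv = {delta_e_uv((41.0, 21.0, 2.0), (40.0, 22.5, 2.4)):.2f}")
```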

  10. Room for Improvement: Performance Evaluations.

    ERIC Educational Resources Information Center

    Webb, Gisela

    1989-01-01

    Describes a performance management approach to library personnel management that stresses communication, clarification of goals, and reinforcement of new practices and behaviors. Each phase of the evaluation process (preparation, rating, administrative review, appraisal interview, and follow-up) and special evaluations to be used in cases of…

  11. The supervisor's performance appraisal: evaluating the evaluator.

    PubMed

    McConnell, C R

    1993-04-01

    The focus of much performance appraisal in the coming decade or so will likely be on the level of customer satisfaction achieved through performance. Ultimately, evaluating the evaluator--that is, appraising the supervisor--will likely become a matter of assessing how well the supervisor's department meets the needs of its customers. Since meeting the needs of one's customers can well become the strongest determinant of organizational success or failure, it follows that relative success in ensuring these needs are met can become the primary indicator of one's relative success as a supervisor. This has the effect of placing the emphasis on supervisory performance exactly at the point it belongs, right on the bottom-line results of the supervisor's efforts.

  12. Performance comparison between static and dynamic cardiac CT on perfusion quantitation and patient classification tasks

    NASA Astrophysics Data System (ADS)

    Bindschadler, Michael; Modgil, Dimple; Branch, Kelley R.; La Riviere, Patrick J.; Alessio, Adam M.

    2015-03-01

    Cardiac CT acquisitions for perfusion assessment can be performed in a dynamic or static mode. In this simulation study, we evaluate the relative classification and quantification performance of these modes for assessing myocardial blood flow (MBF). In the dynamic method, a series of low dose cardiac CT acquisitions yields data on contrast bolus dynamics over time; these data are fit with a model to give a quantitative MBF estimate. In the static method, a single CT acquisition is obtained, and the relative CT numbers in the myocardium are used to infer perfusion states. The static method does not directly yield a quantitative estimate of MBF, but these estimates can be roughly approximated by introducing assumed linear relationships between CT number and MBF, consistent with the ways such images are typically visually interpreted. Data obtained by either method may be used for a variety of clinical tasks, including 1) stratifying patients into differing categories of ischemia and 2) using the quantitative MBF estimate directly to evaluate ischemic disease severity. Through simulations, we evaluate the performance on each of these tasks. The dynamic method has very low bias in MBF estimates, making it particularly suitable for quantitative estimation. At matched radiation dose levels, ROC analysis demonstrated that the static method, with its high bias but generally lower variance, has superior performance in stratifying patients, especially for larger patients.
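
    The patient-stratification comparison described here reduces to rank-based ROC analysis; the sketch below computes AUC through the Mann-Whitney statistic, with invented scores for the two acquisition modes:

```python
import numpy as np

def auc(pos, neg):
    """AUC = probability a random positive case outscores a random negative."""
    pos, neg = np.asarray(pos, float), np.asarray(neg, float)
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (pos.size * neg.size)

# Hypothetical per-patient outputs: static scores (higher = more ischemic
# here) and dynamic MBF estimates (lower = more ischemic, so negate).
static_isch, static_norm = [0.80, 0.70, 0.90, 0.60], [0.30, 0.50, 0.20]
mbf_isch, mbf_norm = [0.6, 0.9, 1.1, 0.8], [2.2, 1.9, 2.8]

print("static AUC:", auc(static_isch, static_norm))
print("dynamic AUC:", auc(-np.array(mbf_isch), -np.array(mbf_norm)))
```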

  13. Quantitative Synthesis: An Actuarial Base for Planning Impact Evaluations.

    ERIC Educational Resources Information Center

    Cordray, David S.; Sonnefeld, L. Joseph

    1985-01-01

    There are numerous micro-level methods decisions associated with planning an impact evaluation. Quantitative synthesis methods can be used to construct an actuarial data base for establishing the likelihood of achieving desired sample sizes, statistical power, and measurement characteristics. (Author/BS)

  14. Quantitative evaluation of phase processing approaches in susceptibility weighted imaging

    NASA Astrophysics Data System (ADS)

    Li, Ningzhi; Wang, Wen-Tung; Sati, Pascal; Pham, Dzung L.; Butman, John A.

    2012-03-01

    Susceptibility weighted imaging (SWI) takes advantage of the local variation in susceptibility between different tissues to enable highly detailed visualization of the cerebral venous system and sensitive detection of intracranial hemorrhages. Thus, it has been increasingly used in magnetic resonance imaging studies of traumatic brain injury as well as other intracranial pathologies. In SWI, magnitude information is combined with phase information to enhance the susceptibility induced image contrast. Because of global susceptibility variations across the image, the rate of phase accumulation varies widely across the image resulting in phase wrapping artifacts that interfere with the local assessment of phase variation. Homodyne filtering is a common approach to eliminate this global phase variation. However, filter size requires careful selection in order to preserve image contrast and avoid errors resulting from residual phase wraps. An alternative approach is to apply phase unwrapping prior to high pass filtering. A suitable phase unwrapping algorithm guarantees no residual phase wraps but additional computational steps are required. In this work, we quantitatively evaluate these two phase processing approaches on both simulated and real data using different filters and cutoff frequencies. Our analysis leads to an improved understanding of the relationship between phase wraps, susceptibility effects, and acquisition parameters. Although homodyne filtering approaches are faster and more straightforward, phase unwrapping approaches perform more accurately in a wider variety of acquisition scenarios.
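
    A minimal sketch of the homodyne high-pass approach discussed here: divide the complex image by a low-pass-filtered copy of itself, so only locally varying phase survives; the filter size (the tuning knob whose selection the paper examines) is a free parameter:

```python
import numpy as np

def homodyne_highpass_phase(complex_img, filt_size=64):
    """High-pass filtered phase of a complex MR image (homodyne method).

    A centered k-space Hanning window of side filt_size defines the
    low-pass reference image; assumes the image is at least filt_size wide.
    """
    ny, nx = complex_img.shape
    k = np.fft.fftshift(np.fft.fft2(complex_img))
    win = np.zeros((ny, nx))
    h2d = np.hanning(filt_size)[:, None] * np.hanning(filt_size)[None, :]
    y0, x0 = ny // 2 - filt_size // 2, nx // 2 - filt_size // 2
    win[y0:y0 + filt_size, x0:x0 + filt_size] = h2d
    lowpass = np.fft.ifft2(np.fft.ifftshift(k * win))
    return np.angle(complex_img / (lowpass + 1e-12))

# Usage: phase_hp = homodyne_highpass_phase(mag * np.exp(1j * raw_phase))
```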

  15. Evaluating the Quantitative Capabilities of Metagenomic Analysis Software.

    PubMed

    Kerepesi, Csaba; Grolmusz, Vince

    2016-05-01

    DNA sequencing technologies are applied widely and frequently today to describe metagenomes, i.e., microbial communities in environmental or clinical samples, without the need for culturing them. These technologies usually return short (100-300 base-pair) DNA reads, and these reads are processed by metagenomic analysis software that assigns phylogenetic composition information to the dataset. Here we evaluate three metagenomic analysis software packages (AmphoraNet, a webserver implementation of AMPHORA2; MG-RAST; and MEGAN5) for their capability of assigning quantitative phylogenetic information to the data, describing the frequency of appearance of microorganisms of the same taxa in the sample. The difficulty of the task arises from the fact that longer genomes produce more reads from the same organism than shorter genomes, and some software assigns higher frequencies to species with longer genomes than to those with shorter ones. This phenomenon is called the "genome length bias." Dozens of complex artificial metagenome benchmarks can be found in the literature. Because of the complexity of those benchmarks, it is usually difficult to judge the resistance of metagenomic software to this genome length bias. Therefore, we have made a simple benchmark for the evaluation of taxon counting in a metagenomic sample: we took the same number of copies of three full bacterial genomes of different lengths, broke them up randomly into short reads of average length 150 bp, and mixed the reads. Because of its simplicity, the benchmark is not supposed to serve as a mock metagenome, but if a software package fails on this simple task, it will surely fail on most real metagenomes. We applied the three packages to the benchmark. The ideal quantitative solution would assign the same proportion to the three bacterial taxa. We found that AMPHORA2/AmphoraNet gave the most accurate results and the other two packages were under-performers.
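
    The benchmark construction the authors describe, equal copy numbers of three genomes shredded into fixed-length reads, is easy to reproduce; in the toy sketch below (invented "genomes"), raw read counts scale with genome length, which is exactly the bias an ideal taxon counter must correct:

```python
import random
from collections import Counter

random.seed(0)

def shred(genome, n_reads, read_len=150):
    """Sample n_reads random fixed-length substrings from a genome string."""
    starts = (random.randrange(len(genome) - read_len) for _ in range(n_reads))
    return [genome[i:i + read_len] for i in starts]

# Toy genomes of very different lengths, same copy number for each taxon.
genomes = {"taxonA": "AT" * 50_000,     # 100 kb
           "taxonB": "GC" * 250_000,    # 500 kb
           "taxonC": "AG" * 1_000_000}  # 2 Mb
copies = 10

reads = []
for name, g in genomes.items():
    n_reads = copies * len(g) // 1000   # read count grows with genome length
    reads += [(name, r) for r in shred(g, n_reads)]

print(Counter(name for name, _ in reads))  # heavily length-biased counts
```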

  16. Quantitative autoradiographic microimaging in the development and evaluation of radiopharmaceuticals

    SciTech Connect

    Som, P.; Oster, Z.H.

    1994-04-01

    Autoradiographic (ARG) microimaging is the method for depicting the biodistribution of radiocompounds with the highest spatial resolution. ARG is applicable to gamma, positron and negatron emitting radiotracers. Dual- or multiple-isotope studies can be performed using half-lives and energies for discrimination of isotopes. Quantitation can be performed by digital videodensitometry and by newer filmless technologies. ARGs obtained at different time intervals provide the time dimension for determination of kinetics.

  17. A quantitative method for evaluating alternatives. [aid to decision making

    NASA Technical Reports Server (NTRS)

    Forthofer, M. J.

    1981-01-01

    When faced with choosing between alternatives, people tend to use a number of criteria (often subjective rather than objective) to decide which is the best alternative for them given their unique situation. The subjectivity inherent in the decision-making process can be reduced by the definition and use of a quantitative method for evaluating alternatives. This type of method can help decision makers achieve a degree of uniformity and completeness in the evaluation process, as well as an increased sensitivity to the factors involved. Additional side effects are better documentation and visibility of the rationale behind the resulting decisions. General guidelines for defining a quantitative method are presented, and a particular method (called 'hierarchical weighted average') is defined and applied to the evaluation of design alternatives for a hypothetical computer system capability.
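
    A minimal sketch of a hierarchical weighted average: each criterion group scores as the weighted average of its subcriteria, and the alternative's overall score is the weighted average of the group scores (all weights and scores below are invented):

```python
# Criteria tree for one design alternative: group weight plus weighted
# subcriterion scores on a 0-10 scale; values are purely illustrative.
criteria = {
    "performance": (0.5, {"throughput": (0.6, 8.0), "latency": (0.4, 6.0)}),
    "cost":        (0.3, {"hardware":   (0.7, 5.0), "licensing": (0.3, 9.0)}),
    "reliability": (0.2, {"mtbf":       (1.0, 7.0)}),
}

def hierarchical_score(tree):
    total = 0.0
    for group_weight, subs in tree.values():
        group_score = sum(w * s for w, s in subs.values())
        total += group_weight * group_score
    return total

print(f"Alternative score: {hierarchical_score(criteria):.2f} / 10")
```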

  18. [Quantitative evaluation of soil hyperspectra denoising with different filters].

    PubMed

    Huang, Ming-Xiang; Wang, Ke; Shi, Zhou; Gong, Jian-Hua; Li, Hong-Yi; Chen, Jie-Liang

    2009-03-01

    The noise distribution of soil hyperspectra measured by an ASD FieldSpec Pro FR was described, and quantitative evaluations of spectral denoising with six filters were compared. From interpretation of the soil hyperspectra and the continuum-removed, first-order differential and high-frequency curves, the UV/VNIR region (350-1050 nm) exhibits hardly any noise except for the first 40 nm beginning at 350 nm. The SWIR region (1000-2500 nm), however, shows a different noise distribution: the latter half of SWIR 2 (1800-2500 nm) shows more noise, and the spectra at the junctions of the three spectrometers are noisier than the neighboring spectra. Six filters were chosen for spectral denoising. A smoothing index (SI), a horizontal feature reservation index (HFRI) and a vertical feature reservation index (VFRI) were designed for evaluating the denoising performance of these filters. Comparison of these indexes shows that the WD and MA filters are the optimal choice for filtering the noise, in terms of balancing the contradiction between smoothing and feature-reservation ability. Furthermore, the first-order differential data of 66 denoised soil spectra from the 6 filters were respectively used as input to the same PLSR model to predict sand content. The different prediction accuracies produced by the different filters show that, compared to feature-reservation ability, a filter's smoothing ability is the principal factor influencing accuracy. The study can benefit spectral preprocessing and analysis, and also provide a scientific foundation for related spectroscopy applications.

  19. Simplified method for video performance evaluation

    NASA Astrophysics Data System (ADS)

    Harshbarger, John H.

    1996-04-01

    Meaningful performance evaluation of video equipment can be complex, requiring specialized equipment in which results must be interpreted by technically trained operators. The alternative to this has been to attempt evaluation by visual inspection of patterns such as the SMPTE RP-133 Medical Imaging Standard. However this involves subjective interpretation and does not indicate the point in a system at which degradation has occurred. The video waveform of such a pattern is complex and not suitable for quantitative analysis. The principal factors which influence quality of a video image on a day-to-day basis are resolution, gray scale and color, if employed. If these qualities are transmitted and displayed without degradation beyond acceptable limits, suitable performance is assured. Performance evaluation by inspection of the image produced on a video display monitor is subject to interpretation; this is resolved by inserting, at the display, the original 'perfect' electronically generated waveform to serve as a reference. Thus the viewer has a specific visual comparison as the basis for performance evaluation. Another valuable feature of the test pattern insert is that a test segment can be placed on recorded images. Thus each image recalled by tape playback or from digital storage will carry an integral means for quality assurance.

  20. γ+ index: A new evaluation parameter for quantitative quality assurance.

    PubMed

    Stathakis, Sotirios; Mavroidis, Panayiotis; Shi, Chengyu; Xu, Jun; Kauweloa, Kevin I; Narayanasamy, Ganesh; Papanikolaou, Niko

    2014-04-01

    The accuracy of dose delivery and the evaluation of differences between calculated and delivered dose distributions have been studied by several groups. The aim of this investigation is to extend the gamma index by including radiobiological information and to propose a new index that we will henceforth refer to as the gamma plus (γ+); furthermore, to validate the robustness of this new index in performing a quality control analysis of an IMRT treatment plan using pure radiobiological measures such as the biologically effective uniform dose (D) and the complication-free tumor control probability (P+). The new quality assurance index, γ+, is based on the theoretical concept of the gamma index presented by Low et al. (1998). In this study, the dose difference incorporating radiobiological dose information (biologically effective dose, BED) is used instead of just the physical dose difference when performing the γ+ calculation. In-house software was developed to compare different dose distributions based on the γ+ concept. A test pattern for a two-dimensional dose comparison was built using the in-house software platform. The γ+ index was tested using planar dose distributions (exported from the treatment planning system) and delivered (film) dose distributions acquired in a solid water phantom, using the test pattern and a theoretical clinical case. Furthermore, a lung cancer case for a patient treated with IMRT was also selected for the analysis. The respective planar dose distributions from the treatment plan and the film were compared based on the γ+ index and were evaluated using the radiobiological measures P+ and D. The results for the test pattern analysis indicate that the γ+ index distributions differ from those of the gamma index, since the former considers radiobiological parameters that may affect treatment outcome. For the theoretical clinical case, it is observed that the γ+ index varies for different treatment parameters (e.g. dose per fraction).
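
    The underlying gamma index of Low et al., which γ+ extends by substituting biologically effective dose for physical dose, can be sketched compactly; the brute-force 2D implementation below is illustrative, with dose-difference and distance-to-agreement criteria as parameters:

```python
import numpy as np

def gamma_index(ref, evl, pixel_mm=1.0, dd=0.03, dta_mm=3.0):
    """Brute-force 2D gamma (Low et al., 1998); gamma <= 1 means pass.

    dd is the dose-difference criterion as a fraction of the maximum
    reference dose; dta_mm is the distance-to-agreement criterion. For a
    gamma-plus style analysis, ref and evl would first be converted to BED.
    """
    dd_abs = dd * ref.max()
    ny, nx = ref.shape
    yy, xx = np.mgrid[0:ny, 0:nx] * pixel_mm
    gamma = np.empty_like(ref, dtype=float)
    for iy in range(ny):
        for ix in range(nx):
            dist2 = (yy - iy * pixel_mm) ** 2 + (xx - ix * pixel_mm) ** 2
            dose2 = (evl - ref[iy, ix]) ** 2
            gamma[iy, ix] = np.sqrt(
                (dist2 / dta_mm ** 2 + dose2 / dd_abs ** 2).min())
    return gamma

# Usage: pass_rate = (gamma_index(planned_dose, film_dose) <= 1.0).mean()
```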

  1. Skylab experiment performance evaluation manual

    NASA Technical Reports Server (NTRS)

    Meyers, J. E.

    1972-01-01

    A series of preparation analyses used for evaluating the performance of the Skylab corollary experiments under preflight, in-flight, and postflight conditions is given. Experiment contingency plan work-around procedures and malfunction analyses are presented to assist in making the experiments operationally successful.

  2. Quantitative Motor Performance and Sleep Benefit in Parkinson Disease

    PubMed Central

    van Gilst, Merel M.; van Mierlo, Petra; Bloem, Bastiaan R.; Overeem, Sebastiaan

    2015-01-01

    Study Objectives: Many people with Parkinson disease experience “sleep benefit”: temporarily improved mobility upon awakening. Here we used quantitative motor tasks to assess the influence of sleep on motor functioning in Parkinson disease. Design: Eighteen Parkinson patients with and 20 without subjective sleep benefit and 20 healthy controls participated. Before and directly after a regular night sleep and an afternoon nap, subjects performed the timed pegboard dexterity task and a quantified finger tapping task. Subjective ratings of motor functioning and mood/vigilance were included. Sleep was monitored using polysomnography. Results: On both tasks, patients were overall slower than healthy controls (night: F2,55 = 16.938, P < 0.001; nap: F2,55 = 15.331, P < 0.001). On the pegboard task, there was a small overall effect of night sleep (F1,55 = 9.695, P = 0.003); both patients and controls were on average slightly slower in the morning. However, in both tasks there was no sleep*group interaction, either for nighttime sleep or for the afternoon nap. There was a modest correlation between the score on the pegboard task and self-rated motor symptoms among patients (rho = 0.233, P = 0.004). No correlations between task performance and mood/vigilance or sleep time/efficiency were found. Conclusions: A positive effect of sleep on motor function is commonly reported by Parkinson patients. Here we show that the subjective experience of sleep benefit is not paralleled by an actual improvement in motor functioning. Sleep benefit therefore appears to be a subjective phenomenon and not a Parkinson-specific reduction in symptoms. Citation: van Gilst MM, van Mierlo P, Bloem BR, Overeem S. Quantitative Motor Performance and Sleep Benefit in Parkinson Disease. SLEEP 2015;38(10):1567–1573. PMID:25902811

  3. Poem Generator: A Comparative Quantitative Evaluation of a Microworlds-Based Learning Approach for Teaching English

    ERIC Educational Resources Information Center

    Jenkins, Craig

    2015-01-01

    This paper is a comparative quantitative evaluation of an approach to teaching poetry in the subject domain of English that employs a "guided discovery" pedagogy using computer-based microworlds. It uses a quasi-experimental design in order to measure performance gains in computational thinking and poetic thinking following a…

  4. Quantitative evaluation of CBM reservoir fracturing quality using logging data

    NASA Astrophysics Data System (ADS)

    Tang, Xiaoyan

    2017-03-01

    This paper presents a method for the quantitative evaluation of the fracturing quality of coalbed methane (CBM) reservoirs using logging data, which will help optimize selection of the reservoir layer to be fractured. First, to make full use of logging and laboratory analysis data from coal cores, a method to determine the brittleness index of CBM reservoirs is deduced using coal industrial components. Second, the paper briefly introduces methodology to compute the horizontal principal stress difference coefficient of coal seams and the minimum horizontal principal stress difference between the coal seam and its roof and floor. Third, an evaluation model for the coal structure index is established using logging data, which fully accounts for the effect of coal structure on the fracturing quality of CBM reservoirs. Fourth, the development degree of the coal reservoir is evaluated. The evaluation standard for fracturing quality of CBM reservoirs based on these five evaluation parameters is then used for quantitative evaluation. The results show that the combination of methods proposed in this paper is effective and consistent with post-fracturing dynamic drainage data. A coal seam with a large brittleness index, a large stress difference between the coal seam and its roof and floor, a small stress difference coefficient and a high coal structure index has strong fracturing quality.

  5. Objective evaluation of reconstruction methods for quantitative SPECT imaging in the absence of ground truth.

    PubMed

    Jha, Abhinav K; Song, Na; Caffo, Brian; Frey, Eric C

    2015-04-13

    Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective method to evaluate these systems is by comparing their performance in the end task that is required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e. we have a gold standard. However, very rarely is this gold standard known in human studies. Thus, no-gold-standard techniques to optimize and evaluate systems and algorithms in the absence of gold standard are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output.

  6. Objective evaluation of reconstruction methods for quantitative SPECT imaging in the absence of ground truth

    NASA Astrophysics Data System (ADS)

    Jha, Abhinav K.; Song, Na; Caffo, Brian; Frey, Eric C.

    2015-03-01

    Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective method to evaluate these systems is by comparing their performance in the end task that is required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e. we have a gold standard. However, very rarely is this gold standard known in human studies. Thus, no-gold-standard techniques to optimize and evaluate systems and algorithms in the absence of gold standard are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output.

  7. Quantitatively evaluating the CBM reservoir using logging data

    NASA Astrophysics Data System (ADS)

    Liu, Zhidi; Zhao, Jingzhou

    2016-02-01

    In order to evaluate coal bed methane (CBM) reservoirs, this paper selects five parameters: porosity, permeability, CBM content, coal structure index, and effective thickness of the coal seam. Making full use of logging data and laboratory analyses of coal cores, the logging evaluation methods for the five parameters were discussed in detail, and a comprehensive evaluation model of the CBM reservoir was established. The #5 coal seam of the Hancheng mine on the eastern edge of the Ordos Basin in China was quantitatively evaluated using this method. The results show that the CBM reservoir in the study area is better than in the central and northern regions. The actual development of CBM shows that regions with a good reservoir have high gas production, indicating that the method introduced in this paper can evaluate the CBM reservoir effectively.

  8. High-performance quantitative robust switching control for optical telescopes

    NASA Astrophysics Data System (ADS)

    Lounsbury, William P.; Garcia-Sanz, Mario

    2014-07-01

    This paper introduces an innovative robust and nonlinear control design methodology for high-performance servosystems in optical telescopes. The dynamics of optical telescopes typically vary according to azimuth and altitude angles, temperature, friction, speed and acceleration, leading to nonlinearities and plant parameter uncertainty. The methodology proposed in this paper combines robust Quantitative Feedback Theory (QFT) techniques with nonlinear switching strategies that achieve simultaneously the best characteristics of a set of very active (fast) robust QFT controllers and very stable (slow) robust QFT controllers. A general dynamic model and a variety of specifications from several different commercially available amateur Newtonian telescopes are used for the controller design as well as the simulation and validation. It is also proven that the nonlinear/switching controller is stable for any switching strategy and switching velocity, according to described frequency conditions based on common quadratic Lyapunov functions (CQLF) and the circle criterion.
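
    The core switching idea can be illustrated with a toy servo loop: a "fast" controller acts when the tracking error is large, and a "slow" one takes over near the setpoint. The sketch below is a hypothetical Python illustration with an assumed first-order plant and PI controllers standing in for the QFT designs; bumpless-transfer logic and the CQLF stability analysis are omitted.

    ```python
    # Toy fast/slow switching control on an assumed first-order servo plant:
    # y[k+1] = y[k] + dt * (u[k] - y[k]).  Gains and threshold are invented.
    dt, n = 0.01, 2000
    fast = dict(kp=8.0, ki=4.0)   # aggressive controller for large errors
    slow = dict(kp=1.5, ki=0.8)   # smooth controller near the setpoint
    y, integ, setpoint, thresh = 0.0, 0.0, 1.0, 0.1

    for k in range(n):
        e = setpoint - y
        gains = fast if abs(e) > thresh else slow   # switching law
        integ += e * dt
        u = gains["kp"] * e + gains["ki"] * integ
        y += dt * (u - y)

    print(f"final output = {y:.4f} (setpoint {setpoint})")
    ```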

  9. Quantitative imaging to evaluate malignant potential of IPMNs

    PubMed Central

    Hanania, Alexander N.; Bantis, Leonidas E.; Feng, Ziding; Wang, Huamin; Tamm, Eric P.; Katz, Matthew H.; Maitra, Anirban; Koay, Eugene J.

    2016-01-01

    Objective To investigate using quantitative imaging to assess the malignant potential of intraductal papillary mucinous neoplasms (IPMNs) in the pancreas. Background Pancreatic cysts are identified in over 2% of the population and a subset of these, including intraductal papillary mucinous neoplasms (IPMNs), represents pre-malignant lesions. Unfortunately, clinicians cannot accurately predict which of these lesions are likely to progress to pancreatic ductal adenocarcinoma (PDAC). Methods We investigated 360 imaging features within the domains of intensity, texture and shape using pancreatic protocol CT images in 53 patients diagnosed with IPMN (34 “high-grade” [HG] and 19 “low-grade” [LG]) who subsequently underwent surgical resection. We evaluated the performance of these features as well as the Fukuoka criteria for pancreatic cyst resection. Results In our cohort, the Fukuoka criteria had a false positive rate of 36%. We identified 14 imaging biomarkers within the Gray-Level Co-Occurrence Matrix (GLCM) feature family that predicted histopathological grade within cyst contours. The most predictive marker differentiated LG and HG lesions with an area under the curve (AUC) of 0.82 at a sensitivity of 85% and specificity of 68%. Using a cross-validated design, the best logistic regression yielded an AUC of 0.96 (σ = 0.05) at a sensitivity of 97% and specificity of 88%. Based on principal component analysis, HG IPMNs demonstrated a pattern of separation from LG IPMNs. Conclusions HG IPMNs appear to have distinct imaging properties. Further validation of these findings may address a major clinical need in this population by identifying those most likely to benefit from surgical resection. PMID:27588410
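
    A hedged sketch of the texture pipeline: GLCM properties are extracted from image patches and fed to a cross-validated logistic regression, as in the paper's analysis. The synthetic patches and their noise levels below are stand-ins; real use would extract features within the cyst contours of pancreatic-protocol CT.

    ```python
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)

    def glcm_features(img):
        # 8-bit patch -> co-occurrence matrix -> a few Haralick-style props
        g = graycomatrix(img, distances=[1], angles=[0, np.pi / 2],
                         levels=256, symmetric=True, normed=True)
        return [graycoprops(g, p).mean()
                for p in ("contrast", "homogeneity", "energy", "correlation")]

    # Invented stand-in data: "HG" patches noisier than "LG" patches
    X, y = [], []
    for label, noise in ((0, 8), (1, 40)):
        for _ in range(30):
            patch = np.clip(128 + rng.normal(0, noise, (32, 32)), 0, 255)
            X.append(glcm_features(patch.astype(np.uint8)))
            y.append(label)

    auc = cross_val_score(LogisticRegression(max_iter=1000), np.array(X),
                          np.array(y), cv=5, scoring="roc_auc").mean()
    print(f"cross-validated AUC = {auc:.2f}")
    ```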

  10. Quantitative evaluation of hybridization and the impact on biodiversity conservation.

    PubMed

    van Wyk, Anna M; Dalton, Desiré L; Hoban, Sean; Bruford, Michael W; Russo, Isa-Rita M; Birss, Coral; Grobler, Paul; van Vuuren, Bettine Janse; Kotzé, Antoinette

    2017-01-01

    Anthropogenic hybridization is an increasing conservation threat worldwide. In South Africa, recent hybridization is threatening numerous ungulate taxa. For example, the genetic integrity of the near-threatened bontebok (Damaliscus pygargus pygargus) is threatened by hybridization with the more common blesbok (D. p. phillipsi). Identifying nonadmixed parental and admixed individuals is challenging based on morphological traits alone; however, molecular analyses may allow for accurate detection. Once hybrids are identified, population simulation software may assist in determining the optimal conservation management strategy, although quantitative evaluation of hybrid management is rarely performed. In this study, our objectives were to describe species-wide and localized rates of hybridization in nearly 3,000 individuals based on 12 microsatellite loci, quantify the accuracy of hybrid assignment software (STRUCTURE and NEWHYBRIDS), and determine an optimal threshold of bontebok ancestry for management purposes. According to multiple methods, we identified 2,051 bontebok, 657 hybrids, and 29 blesbok. More than two-thirds of locations contained at least some hybrid individuals, with populations varying in the degree of introgression. HYBRIDLAB was used to simulate four generations of coexistence between bontebok and blesbok, and to optimize a threshold of ancestry such that most hybrids would be detected and removed while the fewest nonadmixed bontebok individuals were misclassified as hybrids. Overall, a threshold Q-value (admixture coefficient) of 0.90 would remove 94% of hybrid animals, while a threshold of 0.95 would remove 98% of hybrid animals but also 8% of nonadmixed bontebok. To this end, a threshold of 0.90 was identified as optimal and has since been implemented in formal policy by a provincial nature conservation agency. Due to widespread hybridization, effective conservation plans should be established and enforced to conserve native populations that are
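
    The threshold optimization reduces to sweeping a Q-value cutoff over nonadmixed and hybrid individuals. The sketch below uses invented Q-value distributions (the real ones come from HYBRIDLAB simulations) to show how the two error rates trade off.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Invented admixture coefficients (Q-values) standing in for the
    # STRUCTURE/HYBRIDLAB output; sample sizes mirror the abstract.
    q_pure = np.clip(rng.normal(0.97, 0.015, 2051), 0, 1)   # nonadmixed bontebok
    q_hybrid = np.clip(rng.normal(0.78, 0.10, 657), 0, 1)   # hybrids

    for thresh in (0.90, 0.95):
        removed_hybrids = np.mean(q_hybrid < thresh)     # hybrids caught
        misclassified = np.mean(q_pure < thresh)         # collateral loss
        print(f"Q >= {thresh:.2f}: removes {removed_hybrids:.0%} of hybrids, "
              f"misclassifies {misclassified:.0%} of nonadmixed bontebok")
    ```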

  11. Evaluation of four genes in rice for their suitability as endogenous reference standards in quantitative PCR.

    PubMed

    Wang, Chong; Jiang, Lingxi; Rao, Jun; Liu, Yinan; Yang, Litao; Zhang, Dabing

    2010-11-24

    Quantification of genetically modified (GM) food/feed depends on reliable detection systems for endogenous reference genes. Currently, four endogenous reference genes of rice, sucrose phosphate synthase (SPS), GOS9, phospholipase D (PLD), and ppi phosphofructokinase (ppi-PPF), have been used in GM rice detection. To compare the applicability of these four rice reference genes in quantitative PCR systems, we analyzed target nucleotide sequence variation in 58 conventional rice varieties from various geographic and phylogenetic origins; their quantification performance was also evaluated using quantitative real-time PCR and GeNorm analysis, in which a series of statistical calculations yields an "M value" that is negatively correlated with gene stability. The sequencing analysis showed that the reported GOS9 and PLD TaqMan probe regions had detectable single nucleotide polymorphisms (SNPs) among the tested rice cultivars, while no SNPs were observed in the SPS and ppi-PPF amplicons. Poor quantitative performance was also detectable in the cultivars with SNPs using the GOS9 and PLD quantitative PCR systems. Even though the PCR efficiency of the ppi-PPF system was slightly lower, the comprehensive quantitative PCR comparison and GeNorm analysis showed the SPS and ppi-PPF quantitative PCR systems to be applicable for rice endogenous reference assays, with less variation among C(t) values, good reproducibility in quantitative assays, and low M values.
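
    The geNorm M value itself is a simple computation: for each candidate gene, the mean standard deviation of its pairwise log-ratios with every other candidate across samples. A minimal sketch with invented expression data; the mapping of noise levels to gene names is an assumption made only so the output echoes the abstract's ranking.

    ```python
    import numpy as np

    def genorm_m(expr):
        """geNorm stability measure M per candidate reference gene.

        expr: (n_samples, n_genes) array of relative expression quantities.
        M_j = mean over k != j of std across samples of log2(expr_j / expr_k);
        lower M means a more stable reference gene.
        """
        log_expr = np.log2(expr)
        n_genes = expr.shape[1]
        m = np.empty(n_genes)
        for j in range(n_genes):
            v = [np.std(log_expr[:, j] - log_expr[:, k], ddof=1)
                 for k in range(n_genes) if k != j]
            m[j] = np.mean(v)
        return m

    rng = np.random.default_rng(3)
    # Invented data: 10 rice samples x 4 genes, two stable and two noisy
    expr = rng.lognormal(mean=0.0, sigma=[0.1, 0.12, 0.3, 0.35], size=(10, 4))
    print(dict(zip(["SPS", "ppi-PPF", "GOS9", "PLD"], genorm_m(expr).round(2))))
    ```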

  12. Evaluation of absolute peptide quantitation strategies using selected reaction monitoring.

    PubMed

    Campbell, James; Rezai, Taha; Prakash, Amol; Krastins, Bryan; Dayon, Loïc; Ward, Malcolm; Robinson, Sarah; Lopez, Mary

    2011-03-01

    The use of internal peptide standards in selected reaction monitoring experiments enables absolute quantitation. Here, we describe three approaches addressing calibration of peptide concentrations in complex matrices and assess their performance in terms of trueness and precision. The simplest approach described is single reference point quantitation where a heavy peptide is spiked into test samples and the endogenous analyte quantified relative to the heavy peptide internal standard. We refer to the second approach as normal curve quantitation. Here, a constant amount of heavy peptide and a varying amount of light peptide are spiked into matrix to construct a calibration curve. This accounts for matrix effects but due to the presence of endogenous analyte, it is usually not possible to determine the lower LOQ. We refer to the third method as reverse curve quantitation. Here, a constant amount of light peptide and a varying amount of heavy peptide are spiked into matrix to construct a calibration curve. Because there is no contribution to the heavy peptide signal from endogenous analyte, it is possible to measure the equivalent of a blank sample and determine LOQ. These approaches are applied to human plasma samples and used to assay peptides of a set of apolipoproteins. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
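
    The arithmetic behind two of these approaches, with invented peak areas and spike amounts; the exact ratio orientation follows the assay design described above.

    ```python
    import numpy as np

    # Single reference point quantitation: one heavy spike, one area ratio.
    heavy_spiked = 100.0                     # fmol of heavy standard added
    light_area, heavy_area = 8.0e5, 4.0e5    # invented SRM peak areas
    print("single-point:", light_area / heavy_area * heavy_spiked, "fmol endogenous")

    # Reverse curve: constant light spike, varying heavy spike into matrix.
    heavy_amounts = np.array([1.0, 5.0, 25.0, 125.0])      # fmol spiked
    ratios = np.array([0.011, 0.052, 0.26, 1.30])          # heavy/light areas
    slope, intercept = np.polyfit(heavy_amounts, ratios, 1)
    # Invert the calibration line to turn a measured ratio into an amount; a
    # blank (no heavy signal from endogenous analyte) defines the LOQ.
    print("reverse-curve:", (0.40 - intercept) / slope, "fmol for a ratio of 0.40")
    ```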

  13. Chinese Middle School Teachers' Preferences Regarding Performance Evaluation Measures

    ERIC Educational Resources Information Center

    Liu, Shujie; Xu, Xianxuan; Stronge, James H.

    2016-01-01

    Teacher performance evaluation currently is receiving unprecedented attention from policy makers, scholars, and practitioners worldwide. This study is one of the few studies of teacher perceptions regarding teacher performance measures that focus on China. We employed a quantitative dominant mixed research design to investigate Chinese teachers'…

  14. Chinese Middle School Teachers' Preferences Regarding Performance Evaluation Measures

    ERIC Educational Resources Information Center

    Liu, Shujie; Xu, Xianxuan; Stronge, James H.

    2016-01-01

    Teacher performance evaluation currently is receiving unprecedented attention from policy makers, scholars, and practitioners worldwide. This study is one of the few studies of teacher perceptions regarding teacher performance measures that focus on China. We employed a quantitative dominant mixed research design to investigate Chinese teachers'…

  15. Quantitative projections of a quality measure: Performance of a complex task

    NASA Astrophysics Data System (ADS)

    Christensen, K.; Kleppe, Gisle; Vold, Martin; Frette, Vidar

    2014-12-01

    Complex data series that arise during interaction between humans (operators) and advanced technology in a controlled and realistic setting have been explored. The purpose is to obtain quantitative measures that reflect quality in task performance: on a ship simulator, nine crews have solved the same exercise, and detailed maneuvering histories have been logged. There are many degrees of freedom, some of them connected to the fact that the vessels may be freely moved in any direction. To compare maneuvering histories, several measures were used: the time needed to reach the position of operation, the integrated angle between the hull direction and the direction of motion, and the extent of movement when the vessel is to be manually kept in a fixed position. These measures are expected to reflect quality in performance. We have also obtained expert quality evaluations of the crews. The quantitative measures and the expert evaluations, taken together, allow a ranking of crew performance. However, except for time and integrated angle, there is no correlation between the individual measures. This may indicate that complex situations with social and man-machine interactions need complex measures of quality in task performance. In general terms, we have established a context-dependent and flexible framework that connects quantitative measures to a social-science concept that is hard to define. This approach may be useful for other (qualitative) concepts in social science that carry important information about society.
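
    The integrated-angle measure is a straightforward trajectory integral. A minimal sketch with a synthetic maneuvering log; the heading and course traces are invented.

    ```python
    import numpy as np

    # Integrate the unsigned angle between hull heading and direction of
    # motion over a (synthetic) logged trajectory.
    t = np.linspace(0.0, 600.0, 601)                       # seconds
    heading = np.deg2rad(10.0 * np.sin(t / 60.0))          # hull direction
    course = np.deg2rad(10.0 * np.sin(t / 60.0 - 0.3))     # direction of motion

    # Wrap the difference into [-pi, pi] before taking magnitudes
    diff = np.angle(np.exp(1j * (heading - course)))
    integrated_angle = np.trapz(np.abs(diff), t)           # rad * s
    print(f"integrated angle = {integrated_angle:.1f} rad*s (lower is better)")
    ```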

  16. Quantitative analysis of the reconstruction performance of interpolants

    NASA Technical Reports Server (NTRS)

    Lansing, Donald L.; Park, Stephen K.

    1987-01-01

    The analysis presented provides a quantitative measure of the reconstruction or interpolation performance of linear, shift-invariant interpolants. The performance criterion is the mean square error of the difference between the sampled and reconstructed functions. The analysis is applicable to reconstruction algorithms used in image processing and to many types of splines used in numerical analysis and computer graphics. When formulated in the frequency domain, the mean square error clearly separates the contribution of the interpolation method from the contribution of the sampled data. The equations provide a rational basis for selecting an optimal interpolant; that is, one which minimizes the mean square error. The analysis has been applied to a selection of frequently used data splines and reconstruction algorithms: parametric cubic and quintic Hermite splines, exponential and nu splines (including the special case of the cubic spline), parametric cubic convolution, Keys' fourth-order cubic, and a cubic with a discontinuous first derivative. The emphasis in this paper is on the image-dependent case in which no a priori knowledge of the frequency spectrum of the sampled function is assumed.
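
    A direct time-domain way to see the comparison the paper formalizes in the frequency domain: sample a test signal, reconstruct with two interpolants, and compute the mean square error on a fine grid. The test signal and grids below are arbitrary choices.

    ```python
    import numpy as np
    from scipy.interpolate import CubicSpline, interp1d

    x = np.linspace(0.0, 1.0, 17)                 # sample grid
    xf = np.linspace(0.0, 1.0, 1001)              # evaluation grid
    f = lambda t: np.sin(2 * np.pi * 3 * t)       # sampled function

    truth = f(xf)
    for name, recon in (("linear", interp1d(x, f(x), kind="linear")(xf)),
                        ("cubic spline", CubicSpline(x, f(x))(xf))):
        mse = np.mean((truth - recon) ** 2)
        print(f"{name:13s} MSE = {mse:.2e}")
    # The paper's frequency-domain formulation separates this error into an
    # interpolant-dependent kernel weighted by the signal's power spectrum.
    ```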

  17. Metrics for Offline Evaluation of Prognostic Performance

    NASA Technical Reports Server (NTRS)

    Saxena, Abhinav; Celaya, Jose; Saha, Bhaskar; Saha, Sankalita; Goebel, Kai

    2010-01-01

    Prognostic performance evaluation has gained significant attention in the past few years. Currently, prognostics concepts lack standard definitions and suffer from ambiguous and inconsistent interpretations. This lack of standards is in part due to the varied end-user requirements for different applications, time scales, available information, domain dynamics, etc. to name a few. The research community has used a variety of metrics largely based on convenience and their respective requirements. Very little attention has been focused on establishing a standardized approach to compare different efforts. This paper presents several new evaluation metrics tailored for prognostics that were recently introduced and were shown to effectively evaluate various algorithms as compared to other conventional metrics. Specifically, this paper presents a detailed discussion on how these metrics should be interpreted and used. These metrics have the capability of incorporating probabilistic uncertainty estimates from prognostic algorithms. In addition to quantitative assessment they also offer a comprehensive visual perspective that can be used in designing the prognostic system. Several methods are suggested to customize these metrics for different applications. Guidelines are provided to help choose one method over another based on distribution characteristics. Various issues faced by prognostics and its performance evaluation are discussed followed by a formal notational framework to help standardize subsequent developments.
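
    One widely used prognostic metric of the kind discussed here checks whether the predicted remaining useful life (RUL) stays within an accuracy cone around the true RUL. A minimal sketch with an invented run-to-failure case; the exact metric definitions in the paper may differ.

    ```python
    import numpy as np

    def alpha_lambda_accuracy(t, rul_pred, t_start, t_eol, alpha=0.2):
        """Fraction of prediction times at which predicted RUL lies within
        +/- alpha * true RUL of the truth."""
        rul_true = t_eol - t
        ok = np.abs(rul_pred - rul_true) <= alpha * rul_true
        return np.mean(ok[t >= t_start])

    # Hypothetical case: end of life at t = 100, imperfect predictor
    t = np.arange(10, 95, 5.0)
    rul_pred = (100 - t) * (1 + 0.15 * np.sin(t / 10.0))
    print(f"alpha-lambda accuracy = {alpha_lambda_accuracy(t, rul_pred, 10, 100):.0%}")
    ```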

  18. A Qualitative and Quantitative Evaluation of 8 Clear Sky Models.

    PubMed

    Bruneton, Eric

    2016-10-27

    We provide a qualitative and quantitative evaluation of 8 clear sky models used in Computer Graphics. We compare the models with each other as well as with measurements and with a reference model from the physics community. After a short summary of the physics of the problem, we present the measurements and the reference model, and how we "invert" it to get the model parameters. We then give an overview of each CG model and detail its scope, its algorithmic complexity, and its results using the same parameters as in the reference model. We also compare the models with a perceptual study. Our quantitative results confirm that the fewer simplifications and approximations are used to solve the physical equations, the more accurate the results. We conclude with a discussion of the advantages and drawbacks of each model, and how to further improve their accuracy.

  19. Evaluation of errors in quantitative determination of asbestos in rock

    NASA Astrophysics Data System (ADS)

    Baietto, Oliviero; Marini, Paola; Vitaliti, Martina

    2016-04-01

    The quantitative determination of the asbestos content of rock matrices is a complex operation that is susceptible to significant errors. The principal methodologies for the analysis are Scanning Electron Microscopy (SEM) and Phase Contrast Optical Microscopy (PCOM). Although the resolution of PCOM is inferior to that of SEM, PCOM analysis has several advantages, including greater representativeness of the analyzed sample, more effective recognition of chrysotile, and lower cost. The DIATI LAA internal methodology for PCOM analysis is based on mild grinding of a rock sample, its subdivision into 5-6 grain-size classes smaller than 2 mm, and a subsequent microscopic analysis of a portion of each class. PCOM relies on the optical properties of asbestos and of liquids of known refractive index in which the particles under analysis are immersed. The error evaluation in the analysis of rock samples, contrary to the analysis of airborne filters, cannot be based on a statistical distribution. For airborne filters, a binomial (Poisson) distribution can be applied, which theoretically defines the variation in fiber counts resulting from the observation of analysis fields chosen randomly on the filter. The analysis of rock matrices, instead, cannot rely on any statistical distribution, because the most important object of the analysis is the size of the asbestiform fibers and fiber bundles observed, and the resulting ratio between the weight of the fibrous component and that of the granular one. The error estimates generally provided by public and private institutions vary between 50 and 150 percent, but there are no specific studies that discuss the origin of the error or that link it to the asbestos content. Our work aims to provide a reliable estimate of the error in relation to the applied methodologies and to the total asbestos content, especially for values close to the legal limits. The error assessments must

  20. The Quantitative Science of Evaluating Imaging Evidence.

    PubMed

    Genders, Tessa S S; Ferket, Bart S; Hunink, M G Myriam

    2017-03-01

    Cardiovascular diagnostic imaging tests are increasingly used in everyday clinical practice, but are often imperfect, just like any other diagnostic test. The performance of a cardiovascular diagnostic imaging test is usually expressed in terms of sensitivity and specificity compared with the reference standard (gold standard) for diagnosing the disease. However, evidence-based application of a diagnostic test also requires knowledge about the pre-test probability of disease, the benefit of making a correct diagnosis, the harm caused by false-positive imaging test results, and potential adverse effects of performing the test itself. To assist in clinical decision making regarding appropriate use of cardiovascular diagnostic imaging tests, we reviewed quantitative concepts related to diagnostic performance (e.g., sensitivity, specificity, predictive values, likelihood ratios), as well as possible biases and solutions in diagnostic performance studies, Bayesian principles, and the threshold approach to decision making. Copyright © 2017 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
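
    The Bayesian pre-test/post-test reasoning reviewed here is compact enough to show directly. A worked sketch with assumed sensitivity and specificity, not values from the paper:

    ```python
    # Bayesian update of disease probability with a diagnostic imaging test.
    sens, spec = 0.85, 0.90
    lr_pos = sens / (1 - spec)       # positive likelihood ratio = 8.5
    lr_neg = (1 - sens) / spec       # negative likelihood ratio ~ 0.17

    def post_test_prob(pre_test_prob, lr):
        odds = pre_test_prob / (1 - pre_test_prob)   # probability -> odds
        post_odds = odds * lr                        # Bayes on the odds scale
        return post_odds / (1 + post_odds)           # odds -> probability

    for p in (0.10, 0.50):
        print(f"pre-test {p:.0%}: positive -> {post_test_prob(p, lr_pos):.0%}, "
              f"negative -> {post_test_prob(p, lr_neg):.0%}")
    ```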

  1. Evaluation of breast lesions by contrast enhanced ultrasound: qualitative and quantitative analysis.

    PubMed

    Wan, Caifeng; Du, Jing; Fang, Hua; Li, Fenghua; Wang, Lin

    2012-04-01

    To evaluate and compare the diagnostic performance of qualitative, quantitative and combined analysis for characterization of breast lesions in contrast enhanced ultrasound (CEUS), with histological results used as the reference standard. Ninety-one patients with 91 breast lesions classified as BI-RADS 3-5 at US or mammography underwent CEUS. All lesions underwent qualitative and quantitative enhancement evaluation. Receiver operating characteristic (ROC) curve analysis was performed to evaluate the diagnostic performance of each analytical method for discriminating between benign and malignant breast lesions. Histopathologic analysis of the 91 lesions revealed 44 benign and 47 malignant lesions. For qualitative analysis, benign and malignant lesions differed significantly in enhancement patterns (p<0.05). Malignant lesions more often showed heterogeneous and centripetal enhancement, whereas benign lesions mainly showed homogeneous and centrifugal enhancement. The detectable rate of peripheral radial or penetrating vessels was significantly higher in malignant lesions than in benign ones (p<0.001). For quantitative analysis, malignant lesions showed significantly higher (p=0.031) and faster enhancement (p=0.025) than benign ones, and their time to peak was significantly shorter (p=0.002). The areas under the ROC curve for qualitative, quantitative and combined analysis were 0.910 (A(z1)), 0.768 (A(z2)) and 0.926 (A(z3)), respectively. The values of A(z1) and A(z3) were significantly higher than that of A(z2) (p=0.024 and p=0.008, respectively). But there was no significant difference between the values of A(z1) and A(z3) (p=0.625). The diagnostic performance of qualitative and combined analysis was significantly higher than that of quantitative analysis. Although quantitative analysis has the potential to differentiate benign from malignant lesions, it has not yet improved the final diagnostic accuracy. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  2. Evaluation of a rapid quantitative determination method of PSA concentration with gold immunochromatographic strips.

    PubMed

    Wu, Cheng-Ching; Lin, Hung-Yu; Wang, Chao-Ping; Lu, Li-Fen; Yu, Teng-Hung; Hung, Wei-Chin; Houng, Jer-Yiing; Chung, Fu-Mei; Lee, Yau-Jiunn; Hu, Jin-Jia

    2015-11-03

    Prostate cancer remains the most common cancer in men. Qualitative or semi-quantitative immunochromatographic measurements of prostate specific antigen (PSA) have been shown to be simple, noninvasive, and feasible. The aim of this study was to evaluate an optimized gold immunochromatographic strip device for the detection of PSA, in which the results can be analysed using a Chromogenic Rapid Test Reader to quantitatively assess the test results. This reader measures the reflectance of the signal line via a charge-coupled device camera. For quantitative analysis, PSA concentration was computed via a calibration equation. Capillary blood samples from 305 men were evaluated, and two independent observers interpreted the test results after 12 min. Blood samples were also collected and tested with a conventional quantitative assay. Sensitivity, specificity, positive and negative predictive values, and accuracy of the PSA rapid quantitative test system were 100, 96.6, 89.5, 100, and 97.4 %, respectively. Reproducibility of the test was 99.2 %, and interobserver variation was 8 % with a false positive rate of 3.4 %. The correlation coefficient between the ordinary quantitative assay and the rapid quantitative test was 0.960. The PSA rapid quantitative test system provided results quickly and was easy to use, so that tests using this system can easily be performed at outpatient clinics or elsewhere. This system may also be useful for initial cancer screening and for point-of-care testing, because results can be obtained within 12 min and at a cost lower than that of conventional quantitative assays.
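
    The reported performance figures derive from a 2x2 confusion matrix. A minimal sketch with assumed counts, chosen only to illustrate the calculation rather than to reproduce the study's raw data:

    ```python
    # Diagnostic-performance metrics from a 2x2 confusion matrix.
    tp, fp, tn, fn = 85, 10, 205, 0   # invented counts

    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)              # positive predictive value
    npv = tn / (tn + fn)              # negative predictive value
    accuracy = (tp + tn) / (tp + fp + tn + fn)

    for name, v in [("sensitivity", sensitivity), ("specificity", specificity),
                    ("PPV", ppv), ("NPV", npv), ("accuracy", accuracy)]:
        print(f"{name:12s} {v:.1%}")
    ```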

  3. [Quantitative evaluation of uveoscleral outflow for optimization of glaucoma treatment].

    PubMed

    Stolyarov, G M; Tikhomirova, N V; Tikhomirov, I V

    2016-01-01

    The leading role in glaucoma treatment is now played by prostaglandin analogues (PGAs), whose point of application is the uveoscleral outflow of aqueous humor. Quantitative evaluation of the latter, however, remains an unsolved problem. To assess the clinical applicability of a new method for quantitative evaluation of the uveoscleral outflow in human eyes, which is meant to help optimize glaucoma therapy. Patients with early (n=33) and advanced (n=30) primary open-angle glaucoma (POAG) were enrolled. Besides the routine ophthalmic examination, all patients had their uveoscleral outflow quantified with our method. Based on these findings, we analyzed the effect of different hypotensive eye drops, namely, betaxolol 0.5% (selective beta-1-blocker), brinzolamide 1% (carbonic anhydrase inhibitor), travoprost 0.004% (prostaglandin analogue), and the travoprost 0.004%/timolol 0.5% fixed combination (TTFC; prostaglandin analogue plus non-selective beta-blocker). In early POAG, the uveoscleral outflow facility (Cfu) was 0.06±0.06 without treatment, 0.05±0.03 after betaxolol 0.5% or brinzolamide 1% use, and 0.10±0.06 and 0.08±0.05 after travoprost 0.004% and TTFC use, respectively. In advanced POAG, Cfu was 0.04±0.03 without treatment, 0.06±0.04 after betaxolol 0.5% or brinzolamide 1% use, 0.1±0.05 after travoprost 0.004% use, and 0.1±0.04 after TTFC use. Quantitative evaluation of the uveoscleral outflow with the new method, which has been both justified and clinically tested, provides an opportunity to optimize POAG treatment.

  4. Modeling logistic performance in quantitative microbial risk assessment.

    PubMed

    Rijgersberg, Hajo; Tromp, Seth; Jacxsens, Liesbeth; Uyttendaele, Mieke

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage times, temperatures, gas conditions, and their distributions) are determined. However, the logistic chain with its queues (storages, shelves) and mechanisms for ordering products is usually not taken into account. As a consequence, storage times, which are mutually dependent across successive steps in the chain, cannot be described adequately. This may have a great impact on the tails of risk distributions. Because food safety risks are generally very small, it is crucial to model the tails of (underlying) distributions as accurately as possible. Logistic performance can be modeled by describing the underlying planning and scheduling mechanisms in discrete-event modeling. This is common practice in operations research, specifically in supply chain management. In this article, we present the application of discrete-event modeling in the context of a QMRA for Listeria monocytogenes in fresh-cut iceberg lettuce. We show the potential value of discrete-event modeling in QMRA by calculating logistic interventions (modifications in the logistic chain) and determining their significance with respect to food safety.
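
    A toy discrete-event sketch of the logistic idea: a FIFO shelf with an order-up-to replenishment rule, where each unit's storage time emerges from the queue dynamics rather than being sampled independently. All rates and policies are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # FIFO shelf replenished daily to a fixed stock level. Storage time =
    # sale time - arrival time, so times in successive chain steps are
    # coupled through the queue instead of drawn from independent fits.
    shelf, storage_times = [], []
    for day in range(365):
        shelf += [day] * max(0, 40 - len(shelf))   # order-up-to-40 policy
        for _ in range(min(rng.poisson(20), len(shelf))):
            storage_times.append(day - shelf.pop(0))

    st = np.array(storage_times, dtype=float)
    print(f"mean storage {st.mean():.1f} d, "
          f"95th percentile {np.percentile(st, 95):.1f} d")
    ```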

  5. Quantitative Motor Performance and Sleep Benefit in Parkinson Disease.

    PubMed

    van Gilst, Merel M; van Mierlo, Petra; Bloem, Bastiaan R; Overeem, Sebastiaan

    2015-10-01

    Many people with Parkinson disease experience "sleep benefit": temporarily improved mobility upon awakening. Here we used quantitative motor tasks to assess the influence of sleep on motor functioning in Parkinson disease. Eighteen Parkinson patients with and 20 without subjective sleep benefit and 20 healthy controls participated. Before and directly after a regular night's sleep and an afternoon nap, subjects performed the timed pegboard dexterity task and a quantified finger tapping task. Subjective ratings of motor functioning and mood/vigilance were included. Sleep was monitored using polysomnography. On both tasks, patients were overall slower than healthy controls (night: F2,55 = 16.938, P < 0.001; nap: F2,55 = 15.331, P < 0.001). On the pegboard task, there was a small overall effect of night sleep (F1,55 = 9.695, P = 0.003); both patients and controls were on average slightly slower in the morning. However, in both tasks there was no sleep × group interaction for either nighttime sleep or the afternoon nap. There was a modest correlation between the score on the pegboard task and self-rated motor symptoms among patients (rho = 0.233, P = 0.004). No correlations between task performance and mood/vigilance or sleep time/efficiency were found. A positive effect of sleep on motor function is commonly reported by Parkinson patients. Here we show that the subjective experience of sleep benefit is not paralleled by an actual improvement in motor functioning. Sleep benefit therefore appears to be a subjective phenomenon and not a Parkinson-specific reduction in symptoms. © 2015 Associated Professional Sleep Societies, LLC.

  6. Longitudinal flexural mode utility in quantitative guided wave evaluation

    NASA Astrophysics Data System (ADS)

    Li, Jian

    2001-07-01

    Longitudinal non-axisymmetric flexural mode utility in quantitative guided wave evaluation is examined for pipe and tube inspection. Attention is focused on hollow cylinders. Several source loading problems, such as a partial-loading angle beam, an axisymmetric comb transducer, and an angle beam array, are studied. The normal mode expansion method is employed to simulate the generated guided wave fields. For non-axisymmetric sources, an important angular profile feature is studied. Based on numerical calculations, an angular profile varies with frequency, mode, and propagation distance. Since an angular profile determines the energy distribution of the guided waves, it has a great impact on the pipe inspection capability of guided waves. The simulation of non-axisymmetric angular profiles generated by partial loading is verified by experiments. An angular profile is the superposition of harmonic axisymmetric and non-axisymmetric modes with various phase velocities. A simpler equation is derived to calculate the phase velocities of the non-axisymmetric guided waves and is used to discuss their characteristics. Angular profiles have many applications in practical pipe testing. The procedure for building desired angular profiles, and for angular profile tuning, is discussed. The tuning process is implemented by a phased transducer array and a special computational algorithm. Since a transducer array plays a critical role in guided wave inspection, its performance is discussed in terms of guided wave mode control ability and excitation sensitivity. With time-delay inputs, a transducer array's mode control ability and sensitivity are greatly improved. The algorithms for setting time delays are derived based on frequency, element spacing, and phase velocity. With the help of the conclusions drawn on non-axisymmetric guided waves, a phased circumferential partial-loading array is
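
    The angular-profile concept can be sketched as a superposition of circumferential orders. The amplitudes and phases below are invented rather than computed from the normal mode expansion:

    ```python
    import numpy as np

    # Angular profile as a sum of axisymmetric (n = 0) and non-axisymmetric
    # (n >= 1) circumferential orders; amplitudes/phases are assumed.
    theta = np.linspace(0.0, 2 * np.pi, 361)
    orders = {0: (1.0, 0.0), 1: (0.8, 0.4), 2: (0.5, 1.1), 3: (0.2, 2.0)}

    profile = sum(a * np.cos(n * theta + phi) for n, (a, phi) in orders.items())
    peak = np.rad2deg(theta[np.argmax(np.abs(profile))])
    print(f"angular profile peaks near {peak:.0f} deg")
    # Tuning: a phased array adjusts the per-order phases (via element time
    # delays) to steer this profile toward a circumferential region of interest.
    ```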

  7. A peptide-retrieval strategy enables significant improvement of quantitative performance without compromising confidence of identification.

    PubMed

    Tu, Chengjian; Shen, Shichen; Sheng, Quanhu; Shyr, Yu; Qu, Jun

    2017-01-30

    Reliable quantification of low-abundance proteins in complex proteomes is challenging, largely owing to the limited number of spectra/peptides identified. In this study we developed a straightforward method to improve the quantitative accuracy and precision of proteins by strategically retrieving the less confident peptides that were previously filtered out using the standard target-decoy search strategy. The filtered-out MS/MS spectra matched to confidently identified proteins were recovered, and the peptide-spectrum-match FDR was re-calculated and controlled at a confident level of FDR ≤ 1%, while the protein FDR was maintained at ~1%. We evaluated the performance of this strategy in both spectral count- and ion current-based methods. Increases of >60% in total quantified spectra/peptides were achieved, respectively, for a spike-in sample set and a public dataset from CPTAC. Incorporating the peptide retrieval strategy significantly improved quantitative accuracy and precision, especially for low-abundance proteins (e.g. one-hit proteins). Moreover, the capacity to confidently discover significantly altered proteins was also enhanced substantially, as demonstrated with two spike-in datasets. In summary, improved quantitative performance was achieved by this peptide recovery strategy without compromising confidence of protein identification, and the strategy can be readily implemented in a broad range of quantitative proteomics techniques, including label-free and labeling approaches.
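
    Decoy-based FDR estimation, which underlies the retrieval strategy, is easy to sketch on simulated PSM scores. Note this shows only the FDR calculation at two score thresholds; in the paper, the retrieved subset (spectra matching confidently identified proteins) has a more favorable score distribution, which is what allows its FDR to be re-controlled at 1%.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    # Simulated PSM scores for target and decoy matches
    targets = np.concatenate([rng.normal(3.0, 1.0, 900),    # correct PSMs
                              rng.normal(0.0, 1.0, 100)])   # incorrect targets
    decoys = rng.normal(0.0, 1.0, 1000)

    def fdr_at(threshold):
        n_t = np.sum(targets >= threshold)
        n_d = np.sum(decoys >= threshold)
        return n_d / max(n_t, 1)      # decoy-based FDR estimate

    for thr in (2.5, 1.8):            # strict filter vs. relaxed filter
        print(f"score >= {thr}: {np.sum(targets >= thr)} PSMs kept, "
              f"estimated FDR = {fdr_at(thr):.2%}")
    ```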

  8. An anthropomorphic phantom for quantitative evaluation of breast MRI

    PubMed Central

    Freed, Melanie; de Zwart, Jacco A.; Loud, Jennifer T.; El Khouli, Riham H.; Myers, Kyle J.; Greene, Mark H.; Duyn, Jeff H.; Badano, Aldo

    2011-01-01

    Purpose: In this study, the authors aim to develop a physical, tissue-mimicking phantom for quantitative evaluation of breast MRI protocols. The objective of this phantom is to address the need for improved standardization in breast MRI and provide a platform for evaluating the influence of image protocol parameters on lesion detection and discrimination. Quantitative comparisons between patient and phantom image properties are presented. Methods: The phantom is constructed using a mixture of lard and egg whites, resulting in a random structure with separate adipose- and glandular-mimicking components. T1 and T2 relaxation times of the lard and egg components of the phantom were estimated at 1.5 T from inversion recovery and spin-echo scans, respectively, using maximum-likelihood methods. The image structure was examined quantitatively by calculating and comparing spatial covariance matrices of phantom and patient images. A static, enhancing lesion was introduced by creating a hollow mold with stereolithography and filling it with a gadolinium-doped water solution. Results: Measured phantom relaxation values fall within 2 standard errors of human values from the literature and are reasonably stable over 9 months of testing. Comparison of the covariance matrices of phantom and patient data demonstrates that the phantom and patient data have similar image structure. Their covariance matrices are the same to within error bars in the anterior-posterior direction and to within about two error bars in the right-left direction. The signal from the phantom’s adipose-mimicking material can be suppressed using active fat-suppression protocols. A static, enhancing lesion can also be included with the ability to change morphology and contrast agent concentration. Conclusions: The authors have constructed a phantom and demonstrated its ability to mimic human breast images in terms of key physical properties that are relevant to breast MRI. This phantom provides a platform for
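
    A hedged sketch of one piece of the characterization: fitting T1 from inversion-recovery signal magnitudes. Synthetic data with an assumed T1 stand in for the phantom measurements, and a least-squares fit stands in for the paper's maximum-likelihood estimation.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def ir_signal(ti, m0, t1):
        # Magnitude inversion-recovery signal model
        return np.abs(m0 * (1.0 - 2.0 * np.exp(-ti / t1)))

    ti = np.array([50., 150., 400., 800., 1500., 3000.])   # inversion times, ms
    rng = np.random.default_rng(6)
    data = ir_signal(ti, 100.0, 260.0) + rng.normal(0, 1.0, ti.size)  # T1 assumed

    (m0_hat, t1_hat), _ = curve_fit(ir_signal, ti, data, p0=[90.0, 300.0])
    print(f"estimated T1 = {t1_hat:.0f} ms")
    ```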

  9. An anthropomorphic phantom for quantitative evaluation of breast MRI.

    PubMed

    Freed, Melanie; de Zwart, Jacco A; Loud, Jennifer T; El Khouli, Riham H; Myers, Kyle J; Greene, Mark H; Duyn, Jeff H; Badano, Aldo

    2011-02-01

    In this study, the authors aim to develop a physical, tissue-mimicking phantom for quantitative evaluation of breast MRI protocols. The objective of this phantom is to address the need for improved standardization in breast MRI and provide a platform for evaluating the influence of image protocol parameters on lesion detection and discrimination. Quantitative comparisons between patient and phantom image properties are presented. The phantom is constructed using a mixture of lard and egg whites, resulting in a random structure with separate adipose- and glandular-mimicking components. T1 and T2 relaxation times of the lard and egg components of the phantom were estimated at 1.5 T from inversion recovery and spin-echo scans, respectively, using maximum-likelihood methods. The image structure was examined quantitatively by calculating and comparing spatial covariance matrices of phantom and patient images. A static, enhancing lesion was introduced by creating a hollow mold with stereolithography and filling it with a gadolinium-doped water solution. Measured phantom relaxation values fall within 2 standard errors of human values from the literature and are reasonably stable over 9 months of testing. Comparison of the covariance matrices of phantom and patient data demonstrates that the phantom and patient data have similar image structure. Their covariance matrices are the same to within error bars in the anterior-posterior direction and to within about two error bars in the right-left direction. The signal from the phantom's adipose-mimicking material can be suppressed using active fat-suppression protocols. A static, enhancing lesion can also be included with the ability to change morphology and contrast agent concentration. The authors have constructed a phantom and demonstrated its ability to mimic human breast images in terms of key physical properties that are relevant to breast MRI. This phantom provides a platform for the optimization and standardization of

  10. Quantitative Imaging Biomarkers: A Review of Statistical Methods for Technical Performance Assessment

    PubMed Central

    2017-01-01

    Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers (QIBs) to measure changes in these features. Critical to the performance of a QIB in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in design, analysis methods, and the metrics used to assess a QIB for clinical use. It is therefore difficult or impossible to integrate results from different studies or to use reported results to design studies. The Radiological Society of North America (RSNA) and the Quantitative Imaging Biomarkers Alliance (QIBA), with technical, radiological, and statistical experts, developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of QIB performance studies so that results from multiple studies can be compared, contrasted, or combined. PMID:24919831
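
    Two of the metrology quantities discussed here, bias and the repeatability coefficient, computed on synthetic test-retest data. A minimal sketch; the study designs in the recommendations are richer than this.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    truth = rng.uniform(10, 50, 40)              # reference ("truth") values
    scan1 = truth + rng.normal(0.5, 1.2, 40)     # test measurement
    scan2 = truth + rng.normal(0.5, 1.2, 40)     # retest measurement

    bias = np.mean((scan1 + scan2) / 2 - truth)
    within_sd = np.sqrt(np.mean((scan1 - scan2) ** 2) / 2)  # within-subject SD
    rc = 2.77 * within_sd   # 95% repeatability coefficient (1.96*sqrt(2)*wSD)
    print(f"bias = {bias:.2f}, repeatability coefficient = {rc:.2f}")
    ```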

  11. Review of progress in quantitative nondestructive evaluation. Vol. 3B

    SciTech Connect

    Thompson, D.O.; Chimenti, D.E.

    1984-01-01

    This two-book volume constitutes the Proceedings of the Tenth Annual Review of Progress in Quantitative Nondestructive Evaluation held in California in 1983. Topics considered include nondestructive evaluation (NDE) reliability, ultrasonics (probability of detection, scattering, sizing, transducers, signal processing, imaging and reconstruction), eddy currents (probability of detection, modeling, sizing, probes), acoustic emission, thermal wave imaging, optical techniques, new techniques (e.g., maximum entropy reconstruction, near-surface inspection of flaws using bulk ultrasonic waves, inversion and reconstruction), composite materials, material properties, acoustoelasticity, residual stress, and new NDE systems (e.g., retirement-for-cause procedures for gas turbine engine components, pulsed eddy current flaw detection and characterization, an ultrasonic inspection protocol for IN100 jet engine materials, electromagnetic on-line monitoring of rotating turbine-generator components). Basic research and early engineering applications are emphasized.

  12. Quantitative fuel motion determination with the CABRI fast neutron hodoscope: Evaluation methods and results

    SciTech Connect

    Baumung, K.; Augier, G.

    1991-12-01

    The fast neutron hodoscope installed at the CABRI reactor in Cadarache, France, is employed to provide quantitative fuel motion data during experiments in which single liquid-metal fast breeder reactor test pins are subjected to simulated accident conditions. Instrument design and performance are reviewed, the methods for the quantitative evaluation are presented, and error sources are discussed. The most important findings are the axial expansion as a function of time, phenomena related to pin failure (such as time, location, pin failure mode, and fuel mass ejected after failure), and linear fuel mass distributions with a 2-cm axial resolution. In this paper the hodoscope results of the CABRI-1 program are summarized.

  13. The Nuclear Renaissance - Implications on Quantitative Nondestructive Evaluations

    SciTech Connect

    Matzie, Regis A.

    2007-03-21

    The world demand for energy is growing rapidly, particularly in developing countries that are trying to raise the standard of living for billions of people, many of whom do not even have access to electricity. With this increased energy demand and the high and volatile price of fossil fuels, nuclear energy is experiencing resurgence. This so-called nuclear renaissance is broad based, reaching across Asia, the United States, Europe, as well as selected countries in Africa and South America. Some countries, such as Italy, that have actually turned away from nuclear energy are reconsidering the advisability of this decision. This renaissance provides the opportunity to deploy more advanced reactor designs than those operating today, with improved safety, economy, and operations. In this keynote address, I will briefly present three such advanced reactor designs in whose development Westinghouse is participating. These designs include the advanced passive PWR, AP1000, which recently received design certification from the US Nuclear Regulatory Commission; the Pebble Bed Modular Reactor (PBMR), which is being demonstrated in South Africa; and the International Reactor Innovative and Secure (IRIS), which was showcased in the US Department of Energy's recently announced Global Nuclear Energy Partnership (GNEP) program. The salient features of these designs that impact future requirements on quantitative nondestructive evaluations will be discussed. Such features as reactor vessel materials, operating temperature regimes, and new geometric configurations will be described, and mention will be made of the impact on quantitative nondestructive evaluation (NDE) approaches.

  14. The Nuclear Renaissance — Implications on Quantitative Nondestructive Evaluations

    NASA Astrophysics Data System (ADS)

    Matzie, Regis A.

    2007-03-01

    The world demand for energy is growing rapidly, particularly in developing countries that are trying to raise the standard of living for billions of people, many of whom do not even have access to electricity. With this increased energy demand and the high and volatile price of fossil fuels, nuclear energy is experiencing resurgence. This so-called nuclear renaissance is broad based, reaching across Asia, the United States, Europe, as well as selected countries in Africa and South America. Some countries, such as Italy, that have actually turned away from nuclear energy are reconsidering the advisability of this decision. This renaissance provides the opportunity to deploy more advanced reactor designs than those operating today, with improved safety, economy, and operations. In this keynote address, I will briefly present three such advanced reactor designs in whose development Westinghouse is participating. These designs include the advanced passive PWR, AP1000, which recently received design certification from the US Nuclear Regulatory Commission; the Pebble Bed Modular Reactor (PBMR), which is being demonstrated in South Africa; and the International Reactor Innovative and Secure (IRIS), which was showcased in the US Department of Energy's recently announced Global Nuclear Energy Partnership (GNEP) program. The salient features of these designs that impact future requirements on quantitative nondestructive evaluations will be discussed. Such features as reactor vessel materials, operating temperature regimes, and new geometric configurations will be described, and mention will be made of the impact on quantitative nondestructive evaluation (NDE) approaches.

  15. Quantitative evaluation of bone density using the Hounsfield index.

    PubMed

    Shapurian, Tannaz; Damoulis, Petros D; Reiser, Gary M; Griffin, Terrence J; Rand, William M

    2006-01-01

    The primary aims of this retrospective study were to: (1) evaluate bone quality in different segments of the edentulous jaw and correlate it with demographic data and (2) establish a quantitative and objective assessment of bone quality based on the Hounsfield scale. One hundred one randomly selected computerized tomographic (CT) scans were used for the analysis. Edentulous segments ranging from 10 to 30 mm were selected for evaluation, and the findings were analyzed and correlated to demographics. Implant recipient sites were evaluated visually for bone classification by 2 independent examiners. The same sites were subsequently evaluated digitally using the Hounsfield scale, and the results were correlated with the visual classification. The 4 quadrants of the mouth displayed Hounsfield unit (HU) values ranging from -240 HU to 1,159 HU. The highest mean density value (559 ± 208 HU) was found in the anterior mandible, followed by 517 ± 177 HU for the anterior maxilla, 333 ± 199 HU for the posterior maxilla, and 321 ± 132 HU for the posterior mandible. There was no association between Hounsfield-based bone density and age or gender. When subjective bone quality was correlated to Hounsfield index findings, only the relationship between HU and type 4 bone was found to be significant. Knowledge of the Hounsfield value as a quantitative measurement of bone density can be helpful as a diagnostic tool. It can provide the implant surgeon with an objective assessment of bone density, which could result in modification of surgical techniques or extended healing time, especially in situations where poor bone quality is suspected.

  16. High Performance Liquid Chromatography of Vitamin A: A Quantitative Determination.

    ERIC Educational Resources Information Center

    Bohman, Ove; And Others

    1982-01-01

    Experimental procedures are provided for the quantitative determination of Vitamin A (retinol) in food products by analytical liquid chromatography. Standard addition and calibration curve extraction methods are outlined. (SK)
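
    The standard addition method mentioned here amounts to a linear fit and an x-intercept. A minimal sketch with invented peak areas:

    ```python
    import numpy as np

    # Standard addition: spike increasing amounts of retinol standard into
    # equal aliquots, fit peak area vs. added amount, and read the unknown
    # as the magnitude of the x-intercept. All numbers are invented.
    added = np.array([0.0, 5.0, 10.0, 15.0])        # ug retinol added
    area = np.array([120.0, 180.0, 242.0, 300.0])   # HPLC peak areas

    slope, intercept = np.polyfit(added, area, 1)
    unknown = intercept / slope                     # = |x-intercept|
    print(f"sample contains ~{unknown:.1f} ug retinol per aliquot")
    ```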

  17. High Performance Liquid Chromatography of Vitamin A: A Quantitative Determination.

    ERIC Educational Resources Information Center

    Bohman, Ove; And Others

    1982-01-01

    Experimental procedures are provided for the quantitative determination of Vitamin A (retinol) in food products by analytical liquid chromatography. Standard addition and calibration curve extraction methods are outlined. (SK)

  18. SEASAT SAR performance evaluation study

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The performance of the SEASAT synthetic aperture radar (SAR) sensor was evaluated using data processed by the MDA digital processor. Two particular aspects are considered: the location accuracy of image data, and the calibration of the measured backscatter amplitude of a set of corner reflectors. The image location accuracy was assessed by selecting identifiable targets in several scenes, converting their image locations to UTM coordinates, and comparing the results to map sheets. The error standard deviation is measured to be approximately 30 meters. The amplitude was calibrated by measuring the responses of the Goldstone corner reflector array and comparing the results to theoretical values. A linear regression of the measured against theoretical values results in a slope of 0.954 with a correlation coefficient of 0.970.
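
    The amplitude-calibration check is a plain linear regression of measured against theoretical responses. A sketch with invented dB figures (the study's actual result was a slope of 0.954 with correlation 0.970):

    ```python
    import numpy as np

    # Regress measured corner-reflector responses against theoretical values
    theoretical = np.array([20.0, 23.0, 26.0, 29.0, 32.0, 35.0])  # invented dB
    measured = np.array([19.4, 22.1, 24.9, 27.6, 30.8, 33.2])     # invented dB

    slope, intercept = np.polyfit(theoretical, measured, 1)
    r = np.corrcoef(theoretical, measured)[0, 1]
    print(f"slope = {slope:.3f}, correlation = {r:.3f}")
    ```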

  19. Student evaluations of teaching: teaching quantitative courses can be hazardous to one’s career

    PubMed Central

    Uttl, Bob; Smibert, Dylan

    2017-01-01

    Anonymous student evaluations of teaching (SETs) are used by colleges and universities to measure teaching effectiveness and to make decisions about faculty hiring, firing, re-appointment, promotion, tenure, and merit pay. Although numerous studies have found that SETs correlate with various teaching effectiveness irrelevant factors (TEIFs) such as subject, class size, and grading standards, it has been argued that such correlations are small and do not undermine the validity of SETs as measures of professors’ teaching effectiveness. However, previous research has generally used inappropriate parametric statistics and effect sizes to examine and to evaluate the significance of TEIFs on personnel decisions. Accordingly, we examined the influence of quantitative vs. non-quantitative courses on SET ratings and SET based personnel decisions using 14,872 publicly posted class evaluations where each evaluation represents a summary of SET ratings provided by individual students responding in each class. In total, 325,538 individual student evaluations from a US mid-size university contributed to these class evaluations. The results demonstrate that class subject (math vs. English) is strongly associated with SET ratings, has a substantial impact on professors being labeled satisfactory vs. unsatisfactory and excellent vs. non-excellent, and the impact varies substantially depending on the criteria used to classify professors as satisfactory vs. unsatisfactory. Professors teaching quantitative courses are far more likely not to receive tenure, promotion, and/or merit pay when their performance is evaluated against common standards. PMID:28503380

  20. Student evaluations of teaching: teaching quantitative courses can be hazardous to one's career.

    PubMed

    Uttl, Bob; Smibert, Dylan

    2017-01-01

    Anonymous student evaluations of teaching (SETs) are used by colleges and universities to measure teaching effectiveness and to make decisions about faculty hiring, firing, re-appointment, promotion, tenure, and merit pay. Although numerous studies have found that SETs correlate with various teaching effectiveness irrelevant factors (TEIFs) such as subject, class size, and grading standards, it has been argued that such correlations are small and do not undermine the validity of SETs as measures of professors' teaching effectiveness. However, previous research has generally used inappropriate parametric statistics and effect sizes to examine and to evaluate the significance of TEIFs on personnel decisions. Accordingly, we examined the influence of quantitative vs. non-quantitative courses on SET ratings and SET based personnel decisions using 14,872 publicly posted class evaluations where each evaluation represents a summary of SET ratings provided by individual students responding in each class. In total, 325,538 individual student evaluations from a US mid-size university contributed to these class evaluations. The results demonstrate that class subject (math vs. English) is strongly associated with SET ratings, has a substantial impact on professors being labeled satisfactory vs. unsatisfactory and excellent vs. non-excellent, and the impact varies substantially depending on the criteria used to classify professors as satisfactory vs. unsatisfactory. Professors teaching quantitative courses are far more likely not to receive tenure, promotion, and/or merit pay when their performance is evaluated against common standards.

  1. Formative Evaluation in the Performance Context.

    ERIC Educational Resources Information Center

    Dick, Walter; King, Debby

    1994-01-01

    Reviews the traditional formative evaluation model used by instructional designers; summarizes Kirkpatrick's model of evaluation; proposes the integration of part of Kirkpatrick's model with traditional formative evaluation; and discusses performance-context formative evaluation. (three references) (LRW)

  2. Comparative analysis of quantitative efficiency evaluation methods for transportation networks

    PubMed Central

    He, Yuxin; Hong, Jian

    2017-01-01

    An effective evaluation of transportation network efficiency could offer guidance for the optimal control of urban traffic. Based on the introduction and related mathematical analysis of three quantitative evaluation methods for transportation network efficiency, this paper compares the information measured by them, including network structure, traffic demand, travel choice behavior, and other factors that affect network efficiency. Accordingly, the applicability of the various evaluation methods is discussed. Analysis of different transportation network examples shows that the Q-H method reflects well the influence of network structure, traffic demand, and user route choice behavior on transportation network efficiency. In addition, the transportation network efficiency measured by this method and Braess's Paradox can be explained in terms of each other, indicating a better evaluation of the real operating condition of a transportation network. Analysis of the network efficiency calculated by the Q-H method also shows that a specific appropriate demand exists for a given transportation network. Meanwhile, under fixed demand, it is possible to identify both the critical network structure that guarantees the stability and basic operation of the network and the specific network structure that yields the largest value of transportation network efficiency. PMID:28399165
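
    A sketch of a demand-over-latency efficiency measure of the kind compared here, evaluated on the classic Braess example. Whether this matches the exact Q-H formulation is an assumption, but it shows how such a measure and Braess's Paradox explain each other.

    ```python
    # Efficiency E = (1/|W|) * sum over O-D pairs w of demand_w / time_w,
    # where time_w is the equilibrium travel time. The travel times below
    # are the well-known values for the classic Braess example (demand 6).

    def efficiency(od_pairs):
        # od_pairs: list of (demand, equilibrium travel time) per O-D pair
        return sum(d / t for d, t in od_pairs) / len(od_pairs)

    print("without new link:", round(efficiency([(6.0, 83.0)]), 4))
    print("with new link:   ", round(efficiency([(6.0, 92.0)]), 4))
    # Adding the link raises equilibrium travel time (Braess's paradox),
    # and the efficiency measure drops accordingly.
    ```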

  3. Evaluation of Lung Metastasis in Mouse Mammary Tumor Models by Quantitative Real-time PCR

    PubMed Central

    Abt, Melissa A.; Grek, Christina L.; Ghatnekar, Gautam S.; Yeh, Elizabeth S.

    2016-01-01

    Metastatic disease is the spread of malignant tumor cells from the primary cancer site to a distant organ and is the primary cause of cancer associated death [1]. Common sites of metastatic spread include lung, lymph node, brain, and bone [2]. Mechanisms that drive metastasis are intense areas of cancer research. Consequently, effective assays to measure metastatic burden in distant sites of metastasis are instrumental for cancer research. Evaluation of lung metastases in mammary tumor models is generally performed by gross qualitative observation of lung tissue following dissection. Quantitative methods of evaluating metastasis are currently limited to ex vivo and in vivo imaging based techniques that require user defined parameters. Many of these techniques are at the whole organism level rather than the cellular level [3-6]. Although newer imaging methods utilizing multi-photon microscopy are able to evaluate metastasis at the cellular level [7], these highly elegant procedures are more suited to evaluating mechanisms of dissemination rather than quantitative assessment of metastatic burden. Here, a simple in vitro method to quantitatively assess metastasis is presented. Using quantitative Real-time PCR (QRT-PCR), tumor cell specific mRNA can be detected within the mouse lung tissue. PMID:26862835
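
    A common way to turn QRT-PCR Ct values into a relative measure of transcript abundance is the 2^-ΔΔCt calculation; whether the paper uses exactly this normalization is an assumption, and the Ct values below are hypothetical.

    ```python
    # Relative quantification of tumor-cell-specific mRNA in lung tissue by
    # the 2^-ddCt method (all Ct values invented for illustration).
    ct_target_tumor, ct_ref_tumor = 24.0, 18.0        # tumor-bearing lung
    ct_target_control, ct_ref_control = 31.5, 18.2    # control lung

    d_ct_sample = ct_target_tumor - ct_ref_tumor      # normalize to reference
    d_ct_control = ct_target_control - ct_ref_control
    fold_change = 2.0 ** -(d_ct_sample - d_ct_control)
    print(f"tumor-specific transcript enriched {fold_change:.0f}-fold over control")
    ```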

  4. Evaluation of Lung Metastasis in Mouse Mammary Tumor Models by Quantitative Real-time PCR.

    PubMed

    Abt, Melissa A; Grek, Christina L; Ghatnekar, Gautam S; Yeh, Elizabeth S

    2016-01-29

    Metastatic disease is the spread of malignant tumor cells from the primary cancer site to a distant organ and is the primary cause of cancer associated death. Common sites of metastatic spread include lung, lymph node, brain, and bone. Mechanisms that drive metastasis are intense areas of cancer research. Consequently, effective assays to measure metastatic burden in distant sites of metastasis are instrumental for cancer research. Evaluation of lung metastases in mammary tumor models is generally performed by gross qualitative observation of lung tissue following dissection. Quantitative methods of evaluating metastasis are currently limited to ex vivo and in vivo imaging based techniques that require user defined parameters. Many of these techniques are at the whole organism level rather than the cellular level. Although newer imaging methods utilizing multi-photon microscopy are able to evaluate metastasis at the cellular level, these highly elegant procedures are more suited to evaluating mechanisms of dissemination rather than quantitative assessment of metastatic burden. Here, a simple in vitro method to quantitatively assess metastasis is presented. Using quantitative Real-time PCR (QRT-PCR), tumor cell specific mRNA can be detected within the mouse lung tissue.

  5. Exploring quantitative methods for evaluation of lip function.

    PubMed

    Sjögreen, L; Lohmander, A; Kiliaridis, S

    2011-06-01

    The objective was to explore quantitative methods for the measurement of lip mobility and lip force and to relate these to qualitative assessments of lip function. Fifty healthy adults (mean age 45 years) and 23 adults with diagnoses affecting the facial muscles (mean age 37 years) participated in the study. Diagnoses were Möbius syndrome (n=5), Facioscapulohumeral muscular dystrophy (n=6) and Myotonic dystrophy type 1 (n=12). A system for computerised 3D analysis of lip mobility and a lip force meter were tested, and the results were related to results from qualitative assessments of lip mobility, speech (articulation), eating ability and saliva control. Facial expressions studied were open mouth smile and lip pucker. Normative data and cut-off values for adults on lip mobility and lip force were proposed, and the diagnostic value of these thresholds was tested. The proposed cut-off values could identify all individuals with moderate or severe impairment of lip mobility but not always the milder cases. There were significant correlations between the results from quantitative measurements and qualitative assessments. The examined instruments for measuring lip function were found to be reliable with an acceptable measuring error. The combination of quantitative and qualitative ways to evaluate lip function made it possible to show the strong relation between lip contraction, lip force, eating ability and saliva control. The same combination of assessments can be used in the future to study whether oral motor exercises aimed at improving lip mobility and strength could have a positive effect on lip function. © 2010 Blackwell Publishing Ltd.

  6. Verifying performance characteristics of quantitative analytical systems: calibration verification, linearity, and analytical measurement range.

    PubMed

    Killeen, Anthony A; Long, Tom; Souers, Rhona; Styer, Patricia; Ventura, Christina B; Klee, George G

    2014-09-01

    Both the regulations in the Clinical Laboratory Improvement Amendments of 1988 (CLIA) and the checklists of the College of American Pathologists (CAP) Laboratory Accreditation Program require clinical laboratories to verify performance characteristics of quantitative test systems. Laboratories must verify performance claims when introducing an unmodified, US Food and Drug Administration-cleared or approved test system, and they must comply with requirements for periodic calibration and calibration verification for existing test systems. They must also periodically verify the analytical measurement range of many quantitative test systems. The objectives of this review were to provide definitions for many of the terms used in these regulations; to describe a set of basic analyses that laboratories may adapt to demonstrate compliance with both CLIA and the CAP Laboratory Accreditation Program checklists for performing calibration verification and for verifying the analytical measurement range of test systems; to review some of the recommended procedures for establishing performance goals; and to provide data illustrating the performance goals used in some of the CAP's calibration verification and linearity surveys. The CAP's calibration verification and linearity survey programs, the CLIA regulations, the Laboratory Accreditation Program requirements, and published literature were used to meet these objectives. Calibration verification, linearity, and analytical measurement range verification should be performed using suitable materials, with assessment of results using well-defined evaluation protocols. We describe the CAP's calibration verification and linearity programs that may be used for these purposes.
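
    In practice, calibration verification and linearity checks reduce to regressing measured values against assigned values across the claimed range and comparing the recovered line with identity. Below is a minimal sketch of that analysis; the acceptance limits are illustrative assumptions, not CAP or CLIA criteria.

      import numpy as np

      # Assigned (target) and measured concentrations for a five-level
      # linearity/calibration-verification set; values are illustrative only.
      assigned = np.array([10.0, 50.0, 100.0, 200.0, 400.0])
      measured = np.array([10.4, 49.1, 101.7, 196.8, 405.9])

      slope, intercept = np.polyfit(assigned, measured, 1)
      recovery = 100.0 * measured / assigned  # percent recovery at each level

      # Example acceptance rules (assumed for illustration, not CAP criteria):
      ok = (0.97 <= slope <= 1.03 and abs(intercept) < 2.0
            and np.all(np.abs(recovery - 100.0) < 10.0))
      print(f"slope={slope:.3f}, intercept={intercept:.2f}, pass={ok}")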

  7. Quantitative and morphometric evaluation of the angiogenic effects of leptin.

    PubMed

    Talavera-Adame, Dodanim; Xiong, Yizhi; Zhao, Tong; Arias, Ana E; Sierra-Honigmann, M Rocio; Farkas, Daniel L

    2008-01-01

    Angiogenesis is a dynamic process that requires an interaction of pro- and antiangiogenic factors. It is known that the cytokine leptin stimulates endothelial cell growth and angiogenesis, but further quantitative analysis is necessary to understand the angiogenic effects of leptin. The quail chorioallantoic membrane (CAM) assay has been used to study angiogenesis in vivo by focusing on morphometric parameters that quantify vascular complexity and density. We quantify the angiogenic activity of leptin using the CAM assay by digital morphometry and computer-assisted image analysis to evaluate more precisely vessel length, diameter, branching, and tortuosity. CAM images are obtained from ex ovo cultures of E8-E9 quail embryos. MATLAB and custom software are used for our analysis. The effects of leptin, vascular endothelial growth factor-165 (VEGF(165)), and their corresponding neutralizing antibodies are compared. Our results show that CAM treated with leptin and VEGF(165) exhibits a significant increase in vascular complexity and density. A corresponding decrease is observed using neutralizing antibodies. Notably, leptin induced more significant changes than VEGF in vessel length and tortuosity. Conversely, VEGF induced a greater increase in vessel branching than leptin. These results underscore the importance of using multiparametric quantitative methods to assess several aspects of angiogenesis and enable us to understand the proangiogenic effects of leptin.

  8. New method for quantitative evaluation of esophageal sensibility.

    PubMed

    López-Merino, V; Benages, A; Molina, R; Marcos-Buscheck, C; Tomás-Ridocci, M; Mora, F; Moreno-Osset, E; Mínguez, M

    1986-06-01

    A method for quantitating esophageal sensibility by an electrical stimulation test is described. Square-wave stimuli at different voltages and durations were delivered to the esophagus; three series of electrical stimuli were applied at successive durations (0.5, 1, 2, 4, 8 and 16 ms). In each series the discharge voltage was increased progressively from 0 mV until the subject noted the first sensation. This procedure was carried out at all esophageal levels. The following parameters were analyzed: the sensitivity threshold along the esophagus; the relation between the sensitivity threshold (mV) and the stimulus duration (ms); and the rheobase and chronaxie for each esophageal level. At all esophageal levels the sensitivity threshold was regular and coherent; the middle esophagus contained a zone with a higher sensitivity threshold than the proximal and distal esophageal zones. The relationship between the sensitivity threshold and the inverse of the stimulus duration indicated that esophageal sensibility follows Weiss's basic law of excitation, at least with this type of stimulus, with rheobase and chronaxie being representative of the sensitivity threshold along the esophagus. Quantitative esophageal sensibility is therefore concluded to be particularly well suited to evaluation by electrical stimulation.
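
    Weiss's law of excitation, referenced above, says the threshold varies linearly with the inverse of the stimulus duration: V(t) = b + a/t, with rheobase b and chronaxie a/b. A minimal sketch of extracting both constants from strength-duration data, using invented numbers rather than the study's measurements:

      import numpy as np

      # Hypothetical threshold voltages (mV) at the stimulus durations used in
      # the study; the numbers are invented for illustration.
      durations_ms = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])
      thresholds_mv = np.array([520.0, 310.0, 205.0, 152.0, 126.0, 113.0])

      # Weiss's law: V(t) = b + a/t, i.e. linear in 1/t.
      a, b = np.polyfit(1.0 / durations_ms, thresholds_mv, 1)

      rheobase = b       # threshold for an arbitrarily long stimulus (mV)
      chronaxie = a / b  # duration at which the threshold doubles (ms)
      print(f"rheobase = {rheobase:.0f} mV, chronaxie = {chronaxie:.2f} ms")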

  9. Image processing system performance prediction and product quality evaluation

    NASA Technical Reports Server (NTRS)

    Stein, E. K.; Hammill, H. B. (Principal Investigator)

    1976-01-01

    The author has identified the following significant results. A new technique for image processing system performance prediction and product quality evaluation was developed. It was entirely objective, quantitative, and general, and should prove useful in system design and quality control. The technique and its application to determination of quality control procedures for the Earth Resources Technology Satellite NASA Data Processing Facility are described.

  10. 48 CFR 2936.604 - Performance evaluation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 7 2011-10-01 2011-10-01 false Performance evaluation... Performance evaluation. (a) The HCA must establish procedures to evaluate architect-engineer contractor performance as required in FAR 36.604. Normally, the performance report must be prepared by the...

  11. Evaluation of a quantitative plasma PCR plate assay for detecting cytomegalovirus infection in marrow transplant recipients.

    PubMed Central

    Gallez-Hawkins, G M; Tegtmeier, B R; ter Veer, A; Niland, J C; Forman, S J; Zaia, J A

    1997-01-01

    A plasma PCR test, using a nonradioactive PCR plate assay, was evaluated for detection of human cytomegalovirus reactivation. This assay was compared to Southern blotting and found to perform well. As a noncompetitive method of quantitation, it was similar to a competitive method for detecting the number of genome copies per milliliter of plasma in marrow transplant recipients. This is a technically simplified assay with potential for adaptation to automation. PMID:9041438

  12. Quantitative surface evaluation by matching experimental and simulated ronchigram images

    NASA Astrophysics Data System (ADS)

    Kantún Montiel, Juana Rosaura; Cordero Dávila, Alberto; González García, Jorge

    2011-09-01

    To estimate surface errors qualitatively with the Ronchi test, the experimental and simulated ronchigrams are compared. Recently, surface errors have been obtained quantitatively by matching the intersection points of the ronchigram fringes with the x-axis. In this case, a Gaussian fit must be done for each fringe, and interference orders are used in the Malacara algorithm for the simulations. In order to evaluate surface errors, we added an error function, described with cubic splines, to the sagitta function of the ideal surface in the simulations. We used the vectorial transversal aberration formula and a ruling with cosinusoidal transmittance, because such rulings reproduce experimental ronchigram fringe profiles better. Several error functions are tried until the whole experimental ronchigram image is reproduced. The optimization process was done using genetic algorithms.

  13. Shadow photogrammetric apparatus for the quantitative evaluation of corneal buttons.

    PubMed

    Denham, D; Mandelbaum, S; Parel, J M; Holland, S; Pflugfelder, S; Parel, J M

    1989-11-01

    We have developed a technique for the accurate, quantitative, geometric evaluation of trephined and punched corneal buttons. A magnified shadow of the frontal and edge views of a corneal button mounted on the rotary stage of a modified optical comparator is projected onto the screen of the comparator and photographed. This process takes approximately three minutes. The diameters and edge profile at any meridian photographed can subsequently be analyzed from the film. The precision is +/- 23 microns in measuring the diameters of well-cut corneal buttons and +/- 1 degree in measuring the angle of the edge profile. Statistical analysis of interobserver variability indicated excellent reproducibility of measurements. Shadow photogrammetry offers a standardized, accurate, and reproducible method for analysis of corneal trephination.

  14. Quantitative evaluation of the transplanted lin(-) hematopoietic cell migration kinetics.

    PubMed

    Kašėta, Vytautas; Vaitkuvienė, Aida; Liubavičiūtė, Aušra; Maciulevičienė, Rūta; Stirkė, Arūnas; Biziulevičienė, Genė

    2016-02-01

    Stem cells take part in organogenesis, cell maturation and injury repair. Migration is necessary for each of these functions to occur. The aim of this study was to investigate the kinetics of a transplanted hematopoietic lin(-) cell population (which consists mainly of stem and progenitor cells) in a BALB/c mouse contact hypersensitivity model and to quantify the migration to the site of inflammation in the affected foot and to other, healthy organs. Quantitative analysis was carried out with the real-time polymerase chain reaction method. Spleen, kidney, bone marrow, lung, liver, and damaged and healthy foot tissue samples were collected for analysis at different time points. The quantitative data normalization was performed according to the comparative quantification method. The analysis of foot samples shows significant migration of transplanted cells to the affected foot of recipient mice. The quantity was more than 1000 times higher than in the untreated foot. Due to the inflammation, the number of cells of donor origin migrating to the lungs, liver, spleen and bone marrow was found to be decreased. Our data show that the transplanted cells selectively migrated into the inflamed areas of the foot edema. The inflammation also caused a secondary migration within the ectopic hematopoietic stem cell niches of the spleen, and re-homing from the spleen to the bone marrow took place.
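
    The comparative quantification method named here is commonly implemented as the 2^-ddCt calculation. A minimal sketch, assuming near-100% PCR efficiency, with illustrative Ct values chosen so that the treated foot comes out roughly 1000-fold above the untreated foot:

      def ddct_fold_change(ct_target_sample, ct_ref_sample,
                           ct_target_calibrator, ct_ref_calibrator):
          """Comparative (2^-ddCt) quantification, assuming ~100% efficiency."""
          d_ct_sample = ct_target_sample - ct_ref_sample
          d_ct_calib = ct_target_calibrator - ct_ref_calibrator
          return 2.0 ** -(d_ct_sample - d_ct_calib)

      # Illustrative Ct values (not the study's data): donor-specific
      # transcript in the treated vs. untreated foot, normalized to a
      # housekeeping gene.
      fold = ddct_fold_change(22.1, 18.0, 32.3, 18.2)
      print(f"relative quantity: {fold:.0f}-fold")  # ~1024-fold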

  15. Quantitative genetic activity graphical profiles for use in chemical evaluation

    SciTech Connect

    Waters, M.D.; Stack, H.F.; Garrett, N.E.; Jackson, M.A.

    1990-12-31

    A graphic approach, termed a Genetic Activity Profile (GAP), was developed to display a matrix of data on the genetic and related effects of selected chemical agents. The profiles provide a visual overview of the quantitative (doses) and qualitative (test results) data for each chemical. Either the lowest effective dose or the highest ineffective dose is recorded for each agent and bioassay. Up to 200 different test systems are represented across the GAP. Bioassay systems are organized according to the phylogeny of the test organisms and the end points of genetic activity. The methodology for producing and evaluating genetic activity profiles was developed in collaboration with the International Agency for Research on Cancer (IARC). Data on individual chemicals were compiled by IARC and by the US Environmental Protection Agency (EPA). Data are available on 343 compounds selected from volumes 1-53 of the IARC Monographs and on 115 compounds identified as Superfund Priority Substances. Software to display the GAPs on an IBM-compatible personal computer is available from the authors. Structurally similar compounds frequently display qualitatively and quantitatively similar profiles of genetic activity. Through examination of the patterns of GAPs of pairs and groups of chemicals, it is possible to make more informed decisions regarding the selection of test batteries to be used in evaluation of chemical analogs. GAPs provide useful data for the development of weight-of-evidence hazard ranking schemes. Also, some knowledge of the potential genetic activity of complex environmental mixtures may be gained from an assessment of the genetic activity profiles of component chemicals. The fundamental techniques and computer programs devised for the GAP database may be used to develop similar databases in other disciplines. 36 refs., 2 figs.

  16. A quantitative evaluation of the public response to climate engineering

    NASA Astrophysics Data System (ADS)

    Wright, Malcolm J.; Teagle, Damon A. H.; Feetham, Pamela M.

    2014-02-01

    Atmospheric greenhouse gas concentrations continue to increase, with CO2 passing 400 parts per million in May 2013. To avoid severe climate change and the attendant economic and social dislocation, existing energy efficiency and emissions control initiatives may need support from some form of climate engineering. As climate engineering will be controversial, there is a pressing need to inform the public and understand their concerns before policy decisions are taken. So far, engagement has been exploratory, small-scale or technique-specific. We depart from past research to draw on the associative methods used by corporations to evaluate brands. A systematic, quantitative and comparative approach for evaluating public reaction to climate engineering is developed. Its application reveals that the overall public evaluation of climate engineering is negative. Where there are positive associations they favour carbon dioxide removal (CDR) over solar radiation management (SRM) techniques. Therefore, as SRM techniques become more widely known they are more likely to elicit negative reactions. Two climate engineering techniques, enhanced weathering and cloud brightening, have indistinct concept images and so are less likely to draw public attention than other CDR or SRM techniques.

  17. Quantitative Evaluation and Selection of Reference Genes for Quantitative RT-PCR in Mouse Acute Pancreatitis

    PubMed Central

    Yan, Zhaoping; Gao, Jinhang; Lv, Xiuhe; Yang, Wenjuan; Wen, Shilei; Tong, Huan; Tang, Chengwei

    2016-01-01

    The analysis of differences in gene expression is dependent on normalization using reference genes. However, the expression of many of these reference genes, as evaluated by quantitative RT-PCR, is upregulated in acute pancreatitis, so they cannot be used as the standard for gene expression in this condition. For this reason, we sought to identify a stable reference gene, or a suitable combination, for expression analysis in acute pancreatitis. The expression stability of 10 reference genes (ACTB, GAPDH, 18sRNA, TUBB, B2M, HPRT1, UBC, YWHAZ, EF-1α, and RPL-13A) was analyzed using geNorm, NormFinder, and BestKeeper software and evaluated according to variations in the raw Ct values. These reference genes were evaluated using a comprehensive method, which ranked the expression stability of these genes as follows (from most stable to least stable): RPL-13A, YWHAZ > HPRT1 > GAPDH > UBC > EF-1α > 18sRNA > B2M > TUBB > ACTB. RPL-13A was the most suitable reference gene, and the combination of RPL-13A and YWHAZ was the most stable group of reference genes in our experiments. The expression levels of ACTB, TUBB, and B2M were found to be significantly upregulated during acute pancreatitis, whereas the expression level of 18sRNA was downregulated. Thus, we recommend the use of RPL-13A or a combination of RPL-13A and YWHAZ for normalization in qRT-PCR analyses of gene expression in mouse models of acute pancreatitis. PMID:27069927
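
    Of the three stability tools named above, geNorm's measure is the easiest to reproduce: a candidate gene's M value is the mean standard deviation of its pairwise log-expression ratios against all other candidates, and a lower M means a more stable gene. A minimal sketch on synthetic data (not the study's measurements):

      import numpy as np

      def genorm_m(expr):
          """geNorm-style stability M for each candidate reference gene.

          expr: (n_samples, n_genes) array of linearized relative expression
          quantities (e.g. 2**-Ct). Lower M means more stable.
          """
          n_genes = expr.shape[1]
          log_expr = np.log2(expr)
          m = np.empty(n_genes)
          for j in range(n_genes):
              # SD across samples of the log-ratio against every other gene
              v = [np.std(log_expr[:, j] - log_expr[:, k], ddof=1)
                   for k in range(n_genes) if k != j]
              m[j] = np.mean(v)
          return m

      # Illustrative data: 6 samples x 3 candidate genes; the third gene is
      # deliberately noisy and should score worst (highest M).
      rng = np.random.default_rng(0)
      expr = 2.0 ** rng.normal(loc=[-20.0, -21.0, -18.0],
                               scale=[0.2, 0.25, 1.0], size=(6, 3))
      print(genorm_m(expr))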

  18. Evaluation of a virucidal quantitative carrier test for surface disinfectants.

    PubMed

    Rabenau, Holger F; Steinmann, Jochen; Rapp, Ingrid; Schwebke, Ingeborg; Eggers, Maren

    2014-01-01

    Surface disinfectants are part of broader preventive strategies against the transmission of bacteria, fungi and viruses in medical institutions. To evaluate their virucidal efficacy, these products must be tested with appropriate model viruses with different physico-chemical properties under conditions representing practical application in hospitals. The aim of this study was to evaluate a quantitative carrier assay. Furthermore, different putative model viruses like adenovirus type 5 (AdV-5) and different animal parvoviruses were evaluated with respect to their tenacity and practicability in laboratory handling. To evaluate the robustness of the method, some of the viruses were tested in parallel in different laboratories in a multi-center study. Different biocides, which are common active ingredients of surface disinfectants, were used in the test. After drying on stainless steel discs as the carrier, model viruses were exposed to different concentrations of three alcohols, peracetic acid (PAA) or glutaraldehyde (GDA), with a fixed exposure time of 5 minutes. Residual virus was determined after treatment by endpoint titration. All parvoviruses exhibited a similar stability with respect to GDA, while AdV-5 was more susceptible. For PAA, the porcine parvovirus was more sensitive than the other parvoviruses, and again, AdV-5 presented a higher susceptibility than the parvoviruses. All parvoviruses were resistant to alcohols, while AdV-5 was only stable when treated with 2-propanol. The analysis of the results of the multi-center study showed a high reproducibility of this test system. In conclusion, two viruses with different physico-chemical properties can be recommended as appropriate model viruses for the evaluation of the virucidal efficacy of surface disinfectants: AdV-5, which has a high clinical impact, and murine parvovirus (MVM) with the highest practicability among the parvoviruses tested.

  19. Evaluation of a Virucidal Quantitative Carrier Test for Surface Disinfectants

    PubMed Central

    Rabenau, Holger F.; Steinmann, Jochen; Rapp, Ingrid; Schwebke, Ingeborg; Eggers, Maren

    2014-01-01

    Surface disinfectants are part of broader preventive strategies against the transmission of bacteria, fungi and viruses in medical institutions. To evaluate their virucidal efficacy, these products must be tested with appropriate model viruses with different physico-chemical properties under conditions representing practical application in hospitals. The aim of this study was to evaluate a quantitative carrier assay. Furthermore, different putative model viruses like adenovirus type 5 (AdV-5) and different animal parvoviruses were evaluated with respect to their tenacity and practicability in laboratory handling. To evaluate the robustness of the method, some of the viruses were tested in parallel in different laboratories in a multi-center study. Different biocides, which are common active ingredients of surface disinfectants, were used in the test. After drying on stainless steel discs as the carrier, model viruses were exposed to different concentrations of three alcohols, peracetic acid (PAA) or glutaraldehyde (GDA), with a fixed exposure time of 5 minutes. Residual virus was determined after treatment by endpoint titration. All parvoviruses exhibited a similar stability with respect to GDA, while AdV-5 was more susceptible. For PAA, the porcine parvovirus was more sensitive than the other parvoviruses, and again, AdV-5 presented a higher susceptibility than the parvoviruses. All parvoviruses were resistant to alcohols, while AdV-5 was only stable when treated with 2-propanol. The analysis of the results of the multi-center study showed a high reproducibility of this test system. In conclusion, two viruses with different physico-chemical properties can be recommended as appropriate model viruses for the evaluation of the virucidal efficacy of surface disinfectants: AdV-5, which has a high clinical impact, and murine parvovirus (MVM) with the highest practicability among the parvoviruses tested. PMID:24475079
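
    Efficacy in such carrier tests is conventionally summarized as a log10 reduction factor computed from the endpoint titrations. A minimal sketch with invented titers; the RF >= 4 pass criterion is a common convention assumed here, not a value taken from this study:

      import numpy as np

      def log10_reduction(titer_control, titer_treated):
          """RF = log10(control titer) - log10(residual titer after exposure).

          Titers are infectious units per mL (e.g. TCID50/mL) recovered from
          the carrier."""
          return np.log10(titer_control) - np.log10(titer_treated)

      # Illustrative titers, not the study's data. An RF of at least 4 log10
      # (99.99% inactivation) is a commonly used pass criterion (assumption).
      rf = log10_reduction(1.0e7, 5.0e2)
      print(f"RF = {rf:.1f} log10, virucidal = {rf >= 4.0}")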

  20. Predictive Heterosis in Multibreed Evaluations Using Quantitative and Molecular Approaches

    USDA-ARS?s Scientific Manuscript database

    Heterosis is the extra genetic boost in performance obtained by crossing two cattle breeds. It is an important tool for increasing the efficiency of beef production. It is also important to adjust data used to calculate genetic evaluations for differences in heterosis. Good estimates of heterosis...
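
    Percent heterosis itself is simple arithmetic: the advantage of the crossbred mean over the average of the two parental breed means. A minimal sketch with hypothetical weaning weights, not USDA estimates:

      def percent_heterosis(crossbred_mean, sire_breed_mean, dam_breed_mean):
          """Heterosis as the percent advantage of the F1 cross over the
          mid-parent (parental average) value."""
          midparent = (sire_breed_mean + dam_breed_mean) / 2.0
          return 100.0 * (crossbred_mean - midparent) / midparent

      # Illustrative weaning weights (kg):
      print(f"{percent_heterosis(236.0, 225.0, 229.0):.1f}% heterosis")  # ~4.0%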

  1. A potential quantitative method for assessing individual tree performance

    Treesearch

    Lance A. Vickers; David R. Larsen; Daniel C. Dey; John M. Kabrick; Benjamin O. Knapp

    2014-01-01

    By what standard should a tree be judged? This question, perhaps unknowingly, is posed almost daily by practicing foresters. Unfortunately, there are few cases in which clearly defined quantitative (i.e., directly measurable) references have been established in forestry. A lack of common references may be an unnecessary source of error in silvicultural application and...

  2. Improving Student Retention and Performance in Quantitative Courses Using Clickers

    ERIC Educational Resources Information Center

    Liu, Wallace C.; Stengel, Donald N.

    2011-01-01

    Clickers offer instructors of mathematics-related courses an opportunity to involve students actively in class sessions while diminishing the embarrassment of being wrong. This paper reports on the use of clickers in two university-level courses in quantitative analysis and business statistics. Results for student retention and examination…

  3. Improving Library Performance: Quantitative Approaches to Library Planning.

    ERIC Educational Resources Information Center

    Webster, Duane E.

    The use of analytical models and quantitative methods for both short- and long-range problem solving offer library managers an excellent opportunity to improve and rationalize decision-making for strategic and organizational planning. The first step is to identify the problems confronting the library and understand its current capabilities.…

  4. Quantitative evaluation of magnetic immunoassay with remanence measurement

    NASA Astrophysics Data System (ADS)

    Enpuku, K.; Soejima, K.; Nishimoto, T.; Kuma, H.; Hamasaki, N.; Tsukamoto, A.; Saitoh, K.; Kandori, A.

    2006-05-01

    Magnetic immunoassays utilizing magnetic markers and a high-Tc SQUID have been performed. The marker was designed so as to generate remanence, and its remanence field was measured with the SQUID. The SQUID system was developed so as to measure 12 samples in one measurement sequence. We first detected the antigen human IgE using an IgE standard solution, demonstrating detection of IgE down to 2 attomol. The binding process between IgE and the marker could be semi-quantitatively explained with the Langmuir-type adsorption model. We also measured IgE in human sera, and demonstrated the usefulness of the present method for practical diagnosis.

  5. Ground truth and benchmarks for performance evaluation

    NASA Astrophysics Data System (ADS)

    Takeuchi, Ayako; Shneier, Michael; Hong, Tsai Hong; Chang, Tommy; Scrapper, Christopher; Cheok, Geraldine S.

    2003-09-01

    Progress in algorithm development and transfer of results to practical applications such as military robotics requires the setup of standard tasks and of standard qualitative and quantitative measurements for performance evaluation and validation. Although the evaluation and validation of algorithms have been discussed for over a decade, the research community still faces a lack of well-defined and standardized methodology. The fundamental problems include a lack of quantifiable measures of performance, a lack of data from state-of-the-art sensors in calibrated real-world environments, and a lack of facilities for conducting realistic experiments. In this research, we propose three methods for creating ground truth databases and benchmarks using multiple sensors. The databases and benchmarks will provide researchers with high quality data from suites of sensors operating in complex environments representing real problems of great relevance to the development of autonomous driving systems. At NIST, we have prototyped a High Mobility Multi-purpose Wheeled Vehicle (HMMWV) system with a suite of sensors including a Riegl ladar, GDRS ladar, stereo CCD, several color cameras, Global Positioning System (GPS), Inertial Navigation System (INS), pan/tilt encoders, and odometry. All sensors are calibrated with respect to each other in space and time. This allows a database of features and terrain elevation to be built. Ground truth for each sensor can then be extracted from the database. The main goal of this research is to provide ground truth databases for researchers and engineers to evaluate algorithms for effectiveness, efficiency, reliability, and robustness, thus advancing the development of algorithms.

  6. Performance evaluation methodology for historical document image binarization.

    PubMed

    Ntirogiannis, Konstantinos; Gatos, Basilis; Pratikakis, Ioannis

    2013-02-01

    Document image binarization is of great importance in the document image analysis and recognition pipeline since it affects further stages of the recognition process. The evaluation of a binarization method aids in studying its algorithmic behavior, as well as verifying its effectiveness, by providing qualitative and quantitative indication of its performance. This paper addresses a pixel-based binarization evaluation methodology for historical handwritten/machine-printed document images. In the proposed evaluation scheme, the recall and precision evaluation measures are properly modified using a weighting scheme that diminishes any potential evaluation bias. Additional performance metrics of the proposed evaluation scheme consist of the percentage rates of broken and missed text, false alarms, background noise, character enlargement, and merging. Several experiments conducted in comparison with other pixel-based evaluation measures demonstrate the validity of the proposed evaluation scheme.
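
    The core of any pixel-based binarization evaluation is counting true and false positives against a ground-truth image. The sketch below shows the unweighted recall, precision, and F-measure; the paper's contribution, the bias-reducing weighting scheme, is not reproduced here.

      import numpy as np

      def binarization_scores(result, ground_truth):
          """Pixel-based recall, precision and F-measure.

          result, ground_truth: boolean arrays where True marks text
          (foreground) pixels."""
          tp = np.logical_and(result, ground_truth).sum()
          fp = np.logical_and(result, ~ground_truth).sum()
          fn = np.logical_and(~result, ground_truth).sum()
          recall = tp / (tp + fn)
          precision = tp / (tp + fp)
          f_measure = 2 * recall * precision / (recall + precision)
          return recall, precision, f_measure

      # Toy 1-D "images" standing in for document pixels:
      gt  = np.array([1, 1, 1, 0, 0, 1, 0, 0], dtype=bool)
      out = np.array([1, 1, 0, 0, 1, 1, 0, 0], dtype=bool)
      print(binarization_scores(out, gt))  # (0.75, 0.75, 0.75)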

  7. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation)

    SciTech Connect

    Beck, R.N.; Cooper, M.; Chen, C.T.

    1992-07-01

    This document is the annual progress report for the project entitled 'Instrumentation and Quantitative Methods of Evaluation.' Progress is reported in separate sections, individually abstracted and indexed for the database. Subject areas reported include theoretical studies of imaging systems and methods, hardware developments, quantitative methods of evaluation, and knowledge transfer: education in quantitative nuclear medicine imaging.

  8. Energy performance evaluation of AAC

    NASA Astrophysics Data System (ADS)

    Aybek, Hulya

    The U.S. building industry constitutes the largest consumer of energy (i.e., electricity, natural gas, petroleum) in the world. The building sector uses almost 41 percent of the primary energy and approximately 72 percent of the available electricity in the United States. As global energy-generating resources are being depleted at exponential rates, the amount of energy consumed and wasted cannot be ignored. Professionals concerned about the environment have placed a high priority on finding solutions that reduce energy consumption while maintaining occupant comfort. Sustainable design and the judicious combination of building materials comprise one solution to this problem. A future including sustainable energy may result from using energy simulation software to accurately estimate energy consumption and from applying building materials that achieve the potential results derived through simulation analysis. Energy-modeling tools assist professionals with making informed decisions about energy performance during the early planning phases of a design project, such as determining the most advantageous combination of building materials, choosing mechanical systems, and determining building orientation on the site. By implementing energy simulation software to estimate the effect of these factors on the energy consumption of a building, designers can make adjustments to their designs during the design phase when the effect on cost is minimal. The primary objective of this research consisted of identifying a method with which to properly select energy-efficient building materials and involved evaluating the potential of these materials to earn LEED credits when properly applied to a structure. In addition, this objective included establishing a framework that provides suggestions for improvements to currently available simulation software that enhance the viability of the estimates concerning energy efficiency and the achievements of LEED credits.

  9. Evaluating GC/MS Performance

    SciTech Connect

    Alcaraz, A; Dougan, A

    2006-11-26

    'Air and Water Check': By selecting View - Diagnostics/Vacuum Control - Vacuum - Air and Water Check, a Yes/No dialogue box will appear; select No (use current values). It is very important to select No! Otherwise the tune values are drastically altered. The software program will generate a water/air report similar to figure 3. Evaluating the GC/MS system with a performance standard: This procedure should allow the analyst to verify that the chromatographic column and associated components are working adequately to separate the various classes of chemical compounds (e.g., hydrocarbons, alcohols, fatty acids, aromatics, etc.). Use the same GC/MS conditions used to collect the system background and solvent check (part 1 of this document). Figure 5 is an example of a commercial GC/MS column test mixture used to evaluate a GC/MS system prior to analysis.

  10. Qualitative and quantitative evaluation of enamel after various stripping methods.

    PubMed

    Arman, Ayca; Cehreli, S Burcak; Ozel, Emre; Arhun, Neslihan; Cetinşahin, Alev; Soyman, Mubin

    2006-08-01

    In this study, we investigated ultramorphology, surface roughness, and microhardness of permanent and deciduous tooth enamel after various stripping methods. One hundred twenty deciduous and permanent teeth (n = 60 each) were used. Qualitative (scanning electron microscopy) and quantitative (surface roughness and microhardness tests) experiments were carried out in the following experimental groups: group 1, stripping disk; group 2, diamond-coated metal strip; group 3, stripping disk and Sof-Lex discs (3M-ESPE, Seefeld, Germany); group 4, diamond-coated metal strip and Sof-Lex discs; group 5 (chemical stripping), 37% orthophosphoric acid in conjunction with diamond-coated metal strip; group 6 (control), no stripping. Surface roughness values (Ra) for permanent and deciduous enamel were evaluated with Welch analysis of variance (ANOVA) and Tamhane tests, and Kruskal-Wallis and Mann-Whitney tests, respectively. Microhardness values were evaluated statistically with Kruskal-Wallis, 1-way ANOVA, and Duncan tests. Deciduous and permanent teeth showed similar results in terms of surface roughness and surface morphology. Groups 3 and 4 had the smoothest deciduous and permanent enamel surfaces, whereas chemical stripping (group 5) produced the roughest surfaces in both enamel types. Stripping did not lead to a significant change in the microhardness of permanent enamel. All stripping methods significantly roughened the enamel surfaces. Polishing the stripped surface with Sof-Lex discs decreased the roughness.

  11. Quantitative evaluation of midpalatal suture maturation via fractal analysis

    PubMed Central

    Kwak, Kyoung Ho; Kim, Yong-Il; Kim, Yong-Deok

    2016-01-01

    Objective The purpose of this study was to determine whether the results of fractal analysis can be used as criteria for midpalatal suture maturation evaluation. Methods The study included 131 subjects over 18 years of age (range, 18.1-53.4 years) who underwent cone-beam computed tomography. Skeletonized images of the midpalatal suture were obtained via image processing software and used to calculate fractal dimensions. Correlations between maturation stage and fractal dimension were calculated using Spearman's correlation coefficient. Optimal fractal dimension cut-off values were determined using a receiver operating characteristic curve. Results The distribution of maturation stages of the midpalatal suture according to the cervical vertebrae maturation index was highly variable, and there was a strong negative correlation between maturation stage and fractal dimension (−0.623, p < 0.001). Fractal dimension was a statistically significant indicator of dichotomous results with regard to maturation stage (area under curve = 0.794, p < 0.001). A test in which fractal dimension was used to predict the dichotomized maturation stage (stages A-C versus D or E) yielded an optimal fractal dimension cut-off value of 1.0235. Conclusions There was a strong negative correlation between fractal dimension and midpalatal suture maturation. Fractal analysis is an objective quantitative method, and therefore we suggest that it may be useful for the evaluation of midpalatal suture maturation. PMID:27668195
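
    The fractal dimension of a skeletonized suture is typically estimated by box counting: cover the image with boxes of decreasing size and regress the log of the occupied-box count on the log of the inverse box size. A minimal sketch (the study used commercial software; this toy version is only illustrative):

      import numpy as np

      def box_counting_dimension(mask, sizes=(2, 4, 8, 16, 32)):
          """Box-counting estimate of the fractal dimension of a binary
          skeleton. mask: 2-D boolean array (True on the skeleton)."""
          counts = []
          for s in sizes:
              h = mask.shape[0] // s * s
              w = mask.shape[1] // s * s
              blocks = mask[:h, :w].reshape(h // s, s, w // s, s)
              counts.append(blocks.any(axis=(1, 3)).sum())
          slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)),
                                np.log(counts), 1)
          return slope

      # Toy example: a straight line should come out near dimension 1.
      img = np.zeros((64, 64), dtype=bool)
      img[32, :] = True
      print(f"D ~ {box_counting_dimension(img):.2f}")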

  12. [Quantitative determination of niphensamide by high performance liquid chromatography (HPLC)].

    PubMed

    Long, C; Chen, S; Shi, T

    1998-01-01

    An HPLC method for the quantitative determination of Niphensamide in pesticide powder was developed. Column: Micropak-CH, 5 microns (300 mm x 4.0 mm i.d.); mobile phase: CH3OH-H2O (1:1); detector: UV 254 nm; flow rate: 0.7 mL/min; column temperature: 25 degrees C. Under the above conditions, Niphensamide and other components were separated from each other. The method is simple, rapid, sensitive and accurate.

  13. Quantitative evaluation of activation state in functional brain imaging.

    PubMed

    Hu, Zhenghui; Ni, Pengyu; Liu, Cong; Zhao, Xiaohu; Liu, Huafeng; Shi, Pengcheng

    2012-10-01

    Neuronal activity can evoke the hemodynamic change that gives rise to the observed functional magnetic resonance imaging (fMRI) signal. These increases are also regulated by the resting blood volume fraction (V0) associated with regional vasculature. The activation locus detected by means of the change in blood-oxygen-level-dependent (BOLD) signal intensity may thereby deviate from the actual active site due to varied vascular density in the cortex. Furthermore, conventional detection techniques evaluate the statistical significance of the hemodynamic observations. In this sense, the significance level relies not only upon the intensity of the BOLD signal change, but also upon the spatially inhomogeneous fMRI noise distribution that complicates the expression of the results. In this paper, we propose a quantitative strategy for the calibration of activation states to address these challenging problems. The quantitative assessment is based on the estimated neuronal efficacy parameter [Formula: see text] of the hemodynamic model, computed on a voxel-by-voxel basis. It is partly immune to the inhomogeneous fMRI noise by virtue of the strength of the optimization strategy. Moreover, it is easy to incorporate regional vascular information into the activation detection procedure. By combining MR angiography images, this approach can remove large-vessel contamination in fMRI signals and provide more accurate functional localization than classical statistical techniques for clinical applications. It is also helpful for investigating the nonlinear nature of the coupling between synaptic activity and the evoked BOLD response. The proposed method might be considered a potentially useful complement to existing statistical approaches.

  14. 48 CFR 36.604 - Performance evaluation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Performance evaluation. 36.604 Section 36.604 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL... Performance evaluation. See 42.1502(f) for the requirements for preparing past performance evaluations for...

  15. 13 CFR 304.4 - Performance evaluations.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 13 Business Credit and Assistance 1 2011-01-01 2011-01-01 false Performance evaluations. 304.4... ECONOMIC DEVELOPMENT DISTRICTS § 304.4 Performance evaluations. (a) EDA shall evaluate the management standards, financial accountability and program performance of each District Organization within three...

  16. Image performance evaluation of a 3D surgical imaging platform

    NASA Astrophysics Data System (ADS)

    Petrov, Ivailo E.; Nikolov, Hristo N.; Holdsworth, David W.; Drangova, Maria

    2011-03-01

    The O-arm (Medtronic Inc.) is a multi-dimensional surgical imaging platform. The purpose of this study was to perform a quantitative evaluation of the imaging performance of the O-arm in an effort to understand its potential for future nonorthopedic applications. Performance of the reconstructed 3D images was evaluated, using a custom-built phantom, in terms of resolution, linearity, uniformity and geometrical accuracy. Both the standard (SD, 13 s) and high definition (HD, 26 s) modes were evaluated, with the imaging parameters set to image the head (120 kVp, 100 mAs and 150 mAs, respectively). For quantitative noise characterization, the images were converted to Hounsfield units (HU) off-line. Measurement of the modulation transfer function revealed a limiting resolution (at 10% level) of 1.0 mm^-1 in the axial dimension. Image noise varied between 15 and 19 HU for the HD and SD modes, respectively. Image intensities varied linearly over the measured range, up to 1300 HU. Geometric accuracy was maintained in all three dimensions over the field of view. The present study has evaluated the performance characteristics of the O-arm, and demonstrates feasibility for use in interventional applications and quantitative imaging tasks outside those currently targeted by the manufacturer. Further improvements to the reconstruction algorithms may further enhance performance for lower-contrast applications.
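
    The limiting resolution quoted above is the spatial frequency at which the measured MTF falls to 10%. A minimal sketch that interpolates this crossing from sampled MTF points, using invented values rather than O-arm data:

      import numpy as np

      def limiting_resolution(freqs, mtf, level=0.10):
          """Spatial frequency at which the MTF falls to the given level,
          by linear interpolation between measured points."""
          freqs = np.asarray(freqs, dtype=float)
          mtf = np.asarray(mtf, dtype=float)
          below = np.nonzero(mtf < level)[0][0]  # first sample below the level
          f0, f1 = freqs[below - 1], freqs[below]
          m0, m1 = mtf[below - 1], mtf[below]
          return f0 + (m0 - level) * (f1 - f0) / (m0 - m1)

      # Illustrative MTF samples (cycles/mm), not O-arm measurements:
      freqs = [0.2, 0.4, 0.6, 0.8, 1.0, 1.2]
      mtf   = [0.95, 0.75, 0.45, 0.22, 0.11, 0.06]
      print(f"10% MTF at {limiting_resolution(freqs, mtf):.2f} mm^-1")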

  17. Qualitative and quantitative evaluation of solvent systems for countercurrent separation.

    PubMed

    Friesen, J Brent; Ahmed, Sana; Pauli, Guido F

    2015-01-16

    Rational solvent system selection for countercurrent chromatography and centrifugal partition chromatography technology (collectively known as countercurrent separation) studies continues to be a scientific challenge as the fundamental questions of comparing polarity range and selectivity within a solvent system family and between putative orthogonal solvent systems remain unanswered. The current emphasis on metabolomic investigations and analysis of complex mixtures necessitates the use of successive orthogonal countercurrent separation (CS) steps as part of complex fractionation protocols. Addressing the broad range of metabolite polarities demands development of new CS solvent systems with appropriate composition, polarity (π), selectivity (σ), and suitability. In this study, a mixture of twenty commercially available natural products, called the GUESSmix, was utilized to evaluate both solvent system polarity and selectively characteristics. Comparisons of GUESSmix analyte partition coefficient (K) values give rise to a measure of solvent system polarity range called the GUESSmix polarity index (GUPI). Solvatochromic dye and electrical permittivity measurements were also evaluated in quantitatively assessing solvent system polarity. The relative selectivity of solvent systems were evaluated with the GUESSmix by calculating the pairwise resolution (αip), the number of analytes found in the sweet spot (Nsw), and the pairwise resolution of those sweet spot analytes (αsw). The combination of these parameters allowed for both intra- and inter-family comparison of solvent system selectivity. Finally, 2-dimensional reciprocal shifted symmetry plots (ReSS(2)) were created to visually compare both the polarities and selectivities of solvent system pairs. This study helps to pave the way to the development of new solvent systems that are amenable to successive orthogonal CS protocols employed in metabolomic studies.
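
    The quantities compared in this study are built from the partition coefficients K measured for each GUESSmix analyte. The sketch below computes a sweet-spot count and a log-K polarity spread as simplified stand-ins for Nsw and GUPI; the 0.4 <= K <= 2.5 window is a common countercurrent-separation convention assumed here, not necessarily the paper's definition.

      import numpy as np

      # Partition coefficient K = (concentration in stationary phase) /
      # (concentration in mobile phase) per analyte; values invented for
      # illustration.
      k_values = np.array([0.05, 0.2, 0.5, 0.9, 1.4, 2.1, 3.0, 8.0])

      # Assumed "sweet spot" window for efficient countercurrent separation:
      sweet = (k_values >= 0.4) & (k_values <= 2.5)
      n_sw = sweet.sum()

      # Polarity range covered by the system, on a log-K scale:
      polarity_spread = np.log10(k_values.max() / k_values.min())

      print(f"N_sw = {n_sw}, log10 K spread = {polarity_spread:.1f}")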

  18. Quantitative Evaluation of Pain with Pain Index Extracted from Electroencephalogram

    PubMed Central

    An, Jian-Xiong; Wang, Yong; Cope, Doris K; Williams, John P

    2017-01-01

    Background: The current pain assessment methods are strongly subjective and easily affected by outside influences, and there is an urgent need to develop a reliable, objective, and quantitative pain-monitoring indicator. The aim of this study was to evaluate the feasibility of using the Pain index (Pi) to assess pain symptoms in pain patients. Methods: Subjects were enrolled from patients seeking treatment at the Pain Medicine Center of China Medical University Aviation General Hospital from October 2015 to December 2016, with conditions such as postherpetic neuralgia, spinal cord injury, femoral head necrosis, lumbar disc herniation, trigeminal neuralgia, complex regional pain syndrome, perineal pain, and phantom limb pain (pain group, n = 111), as well as healthy volunteers without subjective pain (control group, n = 100). The subjective pain symptoms in pain patients were evaluated by Pi and visual analogue scale/numerical rating scales (VAS/NRS), respectively, and the relationship between them was analyzed using single-factor correlation analysis and multiple-factor regression analysis. Results: Pi levels in the pain group were significantly higher than those of the control group (t = 6.273, P < 0.001). The correlation analysis of Pi and VAS/NRS score in the pain group showed that the Pearson correlation coefficient was 0.797 (P < 0.001); after adjustment for type of pain, pain site, medication, gender, and age, Pi was found to be independently correlated with VAS/NRS score (P < 0.001). Conclusions: Pi significantly correlates with VAS/NRS score, might be used to evaluate subjective pain symptoms in patients, and has good research and application value as an objective pain assessment tool. PMID:28776544

  19. Evaluation of cardiac valvular disease with MR imaging: qualitative and quantitative techniques.

    PubMed

    Glockner, James F; Johnston, Donald L; McGee, Kiaran P

    2003-01-01

    Magnetic resonance (MR) imaging is almost never performed as the initial imaging test in cardiac valvular disease; that role is dominated by echocardiography. Nevertheless, MR imaging has much to offer in selected patients. Quantitative information regarding the severity of regurgitant or stenotic lesions can be obtained by using a combination of cine gradient-echo or steady-state free precession and cine phase-contrast sequences. In addition to providing measurements of peak velocity and flow, MR imaging is the standard of reference for evaluation of ventricular function, which can be a critical factor in determining when surgical intervention is indicated. Improvements in cardiac MR imaging technology have been particularly striking in the past few years, and these developments can easily be applied to the examination of cardiac valves. The authors briefly describe the pathophysiology of valvular disease, discuss standard MR techniques for qualitative and quantitative evaluation of valvular lesions, and illustrate these concepts with several case studies.

  20. Evaluation of the Quantitative Prediction of a Trend Reversal on the Japanese Stock Market in 1999

    NASA Astrophysics Data System (ADS)

    Johansen, Anders; Sornette, Didier

    In January 1999, the authors published a quantitative prediction that the Nikkei index should recover from its 14-year low in January 1999 and reach ~20 500 a year later. The purpose of the present paper is to evaluate the performance of this specific prediction as well as the underlying model: the forecast, performed at a time when the Nikkei was at its lowest (as we can now judge in hindsight), has correctly captured the change of trend as well as the quantitative evolution of the Nikkei index since its inception. As the change of trend from sluggish to recovery was estimated quite unlikely by many observers at that time, a Bayesian analysis shows that a skeptical (resp. neutral) Bayesian sees prior belief in our model amplified into a posterior belief 19 times larger (resp. reach the 95% level).
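
    The 19-fold amplification statement is an application of Bayes' rule: the posterior odds on the model equal the prior odds times the Bayes factor of the observed forecast success. A sketch of the arithmetic, with numbers chosen only for illustration (not taken from the paper):

      \[
      P(M \mid D) = \frac{B\,p}{B\,p + (1 - p)}, \qquad
      B = \frac{P(D \mid M)}{P(D \mid \neg M)},
      \]

    so the amplification of belief is P(M|D)/p = B / (Bp + 1 - p). For an illustrative skeptical prior p = 0.01, a Bayes factor B of about 23 gives a posterior of about 0.19, i.e. 19 times the prior.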

  1. Quantitative evaluation of solar wind time-shifting methods

    NASA Astrophysics Data System (ADS)

    Cameron, Taylor; Jackel, Brian

    2016-11-01

    Nine years of solar wind dynamic pressure and geosynchronous magnetic field data are used for a large-scale statistical comparison of uncertainties associated with several different algorithms for propagating solar wind measurements. The MVAB-0 scheme is best overall, performing on average a minute more accurately than a flat time-shift. We also evaluate the accuracy of these time-shifting methods as a function of solar wind magnetic field orientation. We find that all time-shifting algorithms perform significantly worse (>5 min) due to geometric effects when the solar wind magnetic field is radial (parallel or antiparallel to the Earth-Sun line). Finally, we present an empirical scheme that performs almost as well as MVAB-0 on average and slightly better than MVAB-0 for intervals with nonradial B.
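
    The flat time shift that serves as the baseline in this comparison is purely ballistic: the monitor-to-target distance along the Sun-Earth line divided by the solar wind speed. A minimal sketch with typical (illustrative) values:

      def flat_time_shift(x_monitor_km, x_target_km, vx_km_s):
          """Ballistic ('flat') propagation delay along the Sun-Earth line.

          Positions are GSE x (km, positive sunward); vx is the solar wind
          x-velocity (negative, anti-sunward). Returns seconds."""
          return (x_monitor_km - x_target_km) / -vx_km_s

      RE_KM = 6371.0
      # Monitor near L1 (~1.5 million km sunward), target at 10 Earth radii,
      # 400 km/s solar wind:
      dt = flat_time_shift(1.5e6, 10 * RE_KM, -400.0)
      print(f"delay ~ {dt / 60:.0f} min")  # ~60 min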

  2. How Successful Is Performance Evaluation?

    ERIC Educational Resources Information Center

    Gray, Frank

    We should no longer be wondering if we should have performance appraisal; rather, we should be researching the elements necessary for it to be successfully implemented and to ensure that we receive maximum benefits for improved learning for our students. Performance appraisal can be defined as "a positive, systematic, individualized due…

  3. A Proposed RTN Officer Performance Evaluation System

    DTIC Science & Technology

    1989-12-01

    studied at the Naval Postgraduate School and practical theories relating to personnel management and performance evaluation. The research method includes... various systems are discussed as the researcher perceives them. The fact that there is probably no agreed-upon, fool-proof method of evaluating an... Performance Evaluation System. The research methodology includes the following three components: (1) a study of pertinent performance evaluation

  4. Tophaceous gout: quantitative evaluation by direct physical measurement.

    PubMed

    Schumacher, H Ralph; Becker, Michael A; Palo, William A; Streit, Janet; MacDonald, Patricia A; Joseph-Ridge, Nancy

    2005-12-01

    The absence of accepted standardized methods for monitoring tophaceous gout limits the ability to track tophus progression or regression. This multicenter study assessed intra- and interrater reproducibility of a simple and direct physical measurement. The quantitative evaluation was the area (mm2) of each measurable tophus and was determined independently by 2 raters on 2 occasions within 10 days. Intra- and interrater reproducibilities were determined by calculating mean differences and average percentage differences (APD) in measurements of areas for the same tophus at each of 2 visits and by each rater, respectively. Fifty-two tophi were measured in 13 subjects: 22 on the hand/wrist, 16 on the elbow, and 14 on the foot/ankle. The mean (+/- SD) difference in tophus areas between visits was -0.2 +/- 835 mm2 (95% CI -162 to 162 mm2) and the mean (+/- SD) APD was 29% +/- 33%. The mean (+/- SD) APD between raters was 32% +/- 27%. The largest variations in measurements were noted for elbow tophi and variations were least for well demarcated tophi on the hands. This simple and reproducible method can be easily utilized in clinical trials and in practice as a measure of efficacy of urate-lowering treatment in tophaceous gout. Among factors contributing to variability in these measurements were the anatomic site of tophi and rater experience with the method. Restriction of measurements to well circumscribed hand or foot tophi could improve reliability, but major changes, as expected with effective therapy, can clearly be documented with this simple technique.
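
    The average percentage difference used here is, per pair of repeat measurements, the absolute difference expressed as a percentage of the pair mean (an assumption consistent with how such reproducibility indices are usually defined). A minimal sketch with invented areas:

      def average_percentage_difference(area1_mm2, area2_mm2):
          """APD between two measurements of the same tophus: the absolute
          difference expressed as a percentage of the mean of the pair."""
          mean = (area1_mm2 + area2_mm2) / 2.0
          return 100.0 * abs(area1_mm2 - area2_mm2) / mean

      # Illustrative repeat measurements of one tophus (mm^2), not study data:
      print(f"APD = {average_percentage_difference(420.0, 560.0):.0f}%")  # 29%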

  5. Quantitative evaluation of decay patterns on artificially weathered sandstone specimens

    NASA Astrophysics Data System (ADS)

    Prikryl, Richard

    2017-04-01

    Natural stone affected by weathering processes exhibits the development of specific weathering forms / patterns. These features are controlled by numerous factors; however, their extent is generally considered to be proportional to the weathering grade. The present study focused on the quantitative evaluation of decay patterns on artificially weathered sandstones and on the correlation of the extent of decay forms with conventionally used parameters such as weight loss or porosity increase. Macroscopically visible decay patterns were recorded after completion of a certain number of cycles of freezing/thawing and/or salt crystallization applied to several types of building sandstones. Using prismatic specimens, the preservation of (1) corners, (2) edges, and (3) flat surfaces, plus the overall integrity of the specimens, was captured by digital photography. Individual photos were processed with image analysis software to quantify the percentage loss of original shape (i.e., rounding of corners and edges, material loss on flat surfaces, etc.) and the formation of cracks. The obtained data were correlated with the results of non-destructive measurements of selected physical properties such as porosity, ultrasonic velocity or weight loss.

  6. Quantitative evaluation of statistical inference in resting state functional MRI.

    PubMed

    Yang, Xue; Kang, Hakmook; Newton, Allen; Landman, Bennett A

    2012-01-01

    Modern statistical inference techniques may be able to improve the sensitivity and specificity of resting state functional MRI (rs-fMRI) connectivity analysis through more realistic characterization of distributional assumptions. In simulation, the advantages of such modern methods are readily demonstrable. However quantitative empirical validation remains elusive in vivo as the true connectivity patterns are unknown and noise/artifact distributions are challenging to characterize with high fidelity. Recent innovations in capturing finite sample behavior of asymptotically consistent estimators (i.e., SIMulation and EXtrapolation - SIMEX) have enabled direct estimation of bias given single datasets. Herein, we leverage the theoretical core of SIMEX to study the properties of inference methods in the face of diminishing data (in contrast to increasing noise). The stability of inference methods with respect to synthetic loss of empirical data (defined as resilience) is used to quantify the empirical performance of one inference method relative to another. We illustrate this new approach in a comparison of ordinary and robust inference methods with rs-fMRI.
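
    The resilience idea (stability of an estimate under synthetic loss of data) can be prototyped by re-estimating a connectivity statistic on random sub-samples of the time series and tracking the spread of the estimates. The sketch below is only that subsampling skeleton, not the SIMEX machinery or the robust estimators studied in the paper:

      import numpy as np

      def resilience(x, y, estimator, fractions=(0.9, 0.8, 0.7), n_rep=200):
          """Spread of an estimate under synthetic loss of data.

          Re-estimates the statistic on random sub-samples of the paired
          time series; a smaller spread at a given retained fraction means
          the inference is more resilient."""
          rng = np.random.default_rng(0)
          n = len(x)
          spreads = []
          for f in fractions:
              k = int(round(f * n))
              estimates = []
              for _ in range(n_rep):
                  idx = np.sort(rng.choice(n, size=k, replace=False))
                  estimates.append(estimator(x[idx], y[idx]))
              spreads.append(float(np.std(estimates)))
          return spreads

      pearson = lambda a, b: np.corrcoef(a, b)[0, 1]

      # Toy pair of correlated "voxel" time series:
      rng = np.random.default_rng(1)
      x = rng.normal(size=300)
      y = 0.5 * x + rng.normal(size=300)
      print(resilience(x, y, pearson))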

  7. Performance evaluation of IVC systems.

    PubMed

    Brandstetter, H; Scheer, M; Heinekamp, C; Gippner-Steppert, C; Loge, O; Ruprecht, L; Thull, B; Wagner, R; Wilhelm, P; Scheuber, H-P

    2005-01-01

    An expert Working Group was set up in December 2000 to develop recommendations for users and industry on the evaluation of proper function and operation of individually ventilated cage (IVC) systems. The full report of their recommendations is in two parts--'Part 1: Test Instructions' and 'Part 2: Evaluation Criteria'--both of which have been published in full on the Laboratory Animals Ltd website. They can be found at http://www.lal.org.uk/IVC/index.html. Evaluation of and feedback on the recommendations to further refine their use and scientific basis is encouraged. This Summary Report provides a brief overview of the background to the development of the full report and the issues it addresses.

  8. Intentional Movement Performance Ability (IMPA): a method for robot-aided quantitative assessment of motor function.

    PubMed

    Shin, Sung Yul; Kim, Jung Yoon; Lee, Sanghyeop; Lee, Junwon; Kim, Seung-Jong; Kim, ChangHwan

    2013-06-01

    The purpose of this paper is to propose a new assessment method for evaluating the motor function of patients who are suffering from physical weakness after stroke, incomplete spinal cord injury (iSCI) or other diseases. In this work, we use a robotic device to obtain information about the interaction that occurs between the patient and the robot, and use it as a measure for assessing the patients. The Intentional Movement Performance Ability (IMPA) is defined as the root mean square of the interactive torque while the subject performs a given periodic movement with the robot. IMPA is proposed to quantitatively determine the level of the subject's impaired motor function. The method is indirectly tested by asking healthy subjects to lift a barbell to disturb their motor function. The experimental result shows that IMPA has the potential to provide proper information on the subject's level of motor function.
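
    As defined above, IMPA is just the root mean square of the recorded interactive torque over a trial. A minimal sketch on a synthetic torque trace (illustrative signal, not experimental data):

      import numpy as np

      def impa(interactive_torque):
          """IMPA as defined in the paper: the RMS of the human-robot
          interactive torque recorded over a periodic movement trial."""
          tau = np.asarray(interactive_torque)
          return float(np.sqrt(np.mean(tau ** 2)))

      # Synthetic torque trace (N*m): a 0.5 Hz periodic effort plus noise.
      t = np.linspace(0.0, 10.0, 1000)
      tau = 2.0 * np.sin(2 * np.pi * 0.5 * t)
      tau += 0.3 * np.random.default_rng(2).normal(size=t.size)
      print(f"IMPA = {impa(tau):.2f} N*m")  # ~2/sqrt(2) plus noise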

  9. 48 CFR 2936.604 - Performance evaluation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... CONTRACTING REQUIREMENTS CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 2936.604 Performance evaluation. (a) The HCA must establish procedures to evaluate architect-engineer contractor... reports must be made using Standard Form 1421, Performance Evaluation (Architect-Engineer) as...

  10. 13 CFR 304.4 - Performance evaluations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Performance evaluations. 304.4 Section 304.4 Business Credit and Assistance ECONOMIC DEVELOPMENT ADMINISTRATION, DEPARTMENT OF COMMERCE ECONOMIC DEVELOPMENT DISTRICTS § 304.4 Performance evaluations. (a) EDA shall evaluate the...

  11. Improving Teacher Performance through Evaluation and Supervision.

    ERIC Educational Resources Information Center

    Drake, Jackson M.

    To guarantee an efficient educational system, effective evaluation and supervision of teacher performance are necessary. However, the evaluation of teacher performance presents two major problems: first, no clear definition or measure of effective teaching exists, and second, evaluation is perceived to have conflicting purposes, either as a…

  12. Using hybrid method to evaluate the green performance in uncertainty.

    PubMed

    Tseng, Ming-Lang; Lan, Lawrence W; Wang, Ray; Chiu, Anthony; Cheng, Hui-Ping

    2011-04-01

    Green performance measurement is vital for enterprises in making continuous improvements to maintain sustainable competitive advantages. Evaluation of green performance, however, is a challenging task due to the complexity of the dependences among aspects and criteria, together with the linguistic vagueness of some qualitative information and quantitative data. To deal with this issue, this study proposes a novel approach to evaluate the dependent aspects and criteria of a firm's green performance. The rationale of the proposed approach, namely the green network balanced scorecard, is to use the balanced scorecard to combine fuzzy set theory with the analytical network process (ANP) and importance-performance analysis (IPA) methods, wherein fuzzy set theory accounts for the linguistic vagueness of qualitative criteria and ANP converts the relations among the dependent aspects and criteria into an intelligible structural model used in IPA. For the empirical case study, four dependent aspects and 34 green performance criteria for PCB firms in Taiwan were evaluated. The managerial implications are discussed.

  13. Consistency in Piano Performance Evaluation.

    ERIC Educational Resources Information Center

    Wapnick, Joel; And Others

    1993-01-01

    Reports on a study of the use of musical scores and rating scales by 80 pianists who listened to 21 trials of solo piano music. Found that the use of musical scores and rating scales did not improve interrater reliability. Discovered that the subjects were less consistent when evaluating slow musical pieces than faster pieces. (CFR)

  14. A novel quantitative approach for evaluating contact mechanics of meniscal replacements.

    PubMed

    Linder-Ganz, E; Elsner, J J; Danino, A; Guilak, F; Shterling, A

    2010-02-01

    One of the functions of the meniscus is to distribute contact forces over the articular surfaces by increasing the joint contact areas. It is widely accepted that total/partial loss of the meniscus increases the risk of joint degeneration. A short-term method for evaluating whether degenerative arthritis can be prevented or not would be to determine if the peak pressure and contact area coverage of the tibial plateau (TP) in the knee are restored at the time of implantation. Although several published studies have already utilized TP contact pressure measurements as an indicator of the biomechanical performance of allograft menisci, there is a paucity of quantitative methods for evaluating these parameters in situ with a single effective parameter. In the present study, we developed such a method and used it to assess the load distribution ability of various meniscal implant configurations in human cadaveric knees (n=3). Contact pressures under the intact meniscus were measured under compression (1200 N, 0 deg flexion). Next, total meniscectomy was performed and the protocol was repeated with meniscal implants. Resultant pressure maps were evaluated for the peak pressure value, total contact area, and its distribution pattern, all with respect to the natural meniscus output. Two other measures--implant dislocation and implant impingement on the ligaments--were also considered. If either of these occurred, the score was zeroed. The total implant score was based on an adjusted calculation of the aforementioned measures, where the natural meniscus score was always 100. Laboratory experiments demonstrated a good correlation between qualitative and quantitative evaluations of the same pressure map outputs, especially in cases where there were contradicting indications between different parameters. Overall, the proposed approach provides a novel, validated method for quantitative assessment of the biomechanical performance of meniscal implants, which can be used in various
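
    The abstract specifies the scoring behavior (natural meniscus = 100, zeroed on dislocation or impingement) but not the exact weighting, so the sketch below is a hypothetical reading of that scheme; the weights and ratio definitions are assumptions, not the published formula.

    ```python
    def implant_score(peak_pressure, contact_area, natural_peak, natural_area,
                      dislocated=False, impinged=False,
                      w_pressure=0.5, w_area=0.5):
        """Hypothetical single-parameter implant score.

        The natural meniscus scores 100 by construction; dislocation or
        ligament impingement zeroes the score, as in the abstract."""
        if dislocated or impinged:
            return 0.0
        pressure_ratio = min(natural_peak / peak_pressure, 1.0)  # lower peak is better
        area_ratio = min(contact_area / natural_area, 1.0)       # more coverage is better
        return 100.0 * (w_pressure * pressure_ratio + w_area * area_ratio)

    # Illustrative values (MPa, mm^2): implant with a higher peak pressure and
    # slightly smaller contact footprint than the natural meniscus.
    print(implant_score(peak_pressure=4.1, contact_area=520.0,
                        natural_peak=3.5, natural_area=600.0))
    ```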

  15. A Stereological Method for the Quantitative Evaluation of Cartilage Repair Tissue

    PubMed Central

    Nyengaard, Jens Randel; Lind, Martin; Spector, Myron

    2015-01-01

    Objective To implement stereological principles to develop an easily applicable algorithm for unbiased and quantitative evaluation of cartilage repair. Design Design-unbiased sampling was performed by systematically sectioning the defect perpendicular to the joint surface in parallel planes, providing 7 to 10 hematoxylin–eosin stained histological sections. Counting windows were systematically selected and converted into image files (40-50 per defect). The quantification was performed by two-step point counting: (1) calculation of defect volume and (2) quantitative analysis of tissue composition. Step 2 was performed by assigning each point to one of the following categories based on validated and easily distinguishable morphological characteristics: (1) hyaline cartilage (rounded cells in lacunae in hyaline matrix), (2) fibrocartilage (rounded cells in lacunae in fibrous matrix), (3) fibrous tissue (elongated cells in fibrous tissue), (4) bone, (5) scaffold material, and (6) others. The ability to discriminate between the tissue types was determined using conventional or polarized light microscopy, and the interobserver variability was evaluated. Results We describe the application of the stereological method. In the example, we assessed the defect repair tissue volume to be 4.4 mm3 (CE = 0.01). The tissue fractions were subsequently evaluated. Polarized light illumination of the slides improved discrimination between hyaline cartilage and fibrocartilage and increased the interobserver agreement compared with conventional transmitted light. Conclusion We have applied a design-unbiased method for quantitative evaluation of cartilage repair, and we propose this algorithm as a natural supplement to existing descriptive semiquantitative scoring systems. We also propose that polarized light is effective for discrimination between hyaline cartilage and fibrocartilage. PMID:26069715

  16. A Stereological Method for the Quantitative Evaluation of Cartilage Repair Tissue.

    PubMed

    Foldager, Casper Bindzus; Nyengaard, Jens Randel; Lind, Martin; Spector, Myron

    2015-04-01

    To implement stereological principles to develop an easily applicable algorithm for unbiased and quantitative evaluation of cartilage repair. Design-unbiased sampling was performed by systematically sectioning the defect perpendicular to the joint surface in parallel planes, providing 7 to 10 hematoxylin-eosin stained histological sections. Counting windows were systematically selected and converted into image files (40-50 per defect). The quantification was performed by two-step point counting: (1) calculation of defect volume and (2) quantitative analysis of tissue composition. Step 2 was performed by assigning each point to one of the following categories based on validated and easily distinguishable morphological characteristics: (1) hyaline cartilage (rounded cells in lacunae in hyaline matrix), (2) fibrocartilage (rounded cells in lacunae in fibrous matrix), (3) fibrous tissue (elongated cells in fibrous tissue), (4) bone, (5) scaffold material, and (6) others. The ability to discriminate between the tissue types was determined using conventional or polarized light microscopy, and the interobserver variability was evaluated. We describe the application of the stereological method. In the example, we assessed the defect repair tissue volume to be 4.4 mm(3) (CE = 0.01). The tissue fractions were subsequently evaluated. Polarized light illumination of the slides improved discrimination between hyaline cartilage and fibrocartilage and increased the interobserver agreement compared with conventional transmitted light. We have applied a design-unbiased method for quantitative evaluation of cartilage repair, and we propose this algorithm as a natural supplement to existing descriptive semiquantitative scoring systems. We also propose that polarized light is effective for discrimination between hyaline cartilage and fibrocartilage.
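
    The two-step point counting described in both records maps directly onto a Cavalieri volume estimate plus category point fractions; the sketch below illustrates that arithmetic with invented counts and calibration values (section spacing and area per point are assumptions for the example).

    ```python
    from collections import Counter

    def cavalieri_volume(points_per_section, section_spacing_mm, area_per_point_mm2):
        """Step 1, Cavalieri estimator: V = t * a(p) * sum of points hitting
        the defect across the systematically sampled sections."""
        return section_spacing_mm * area_per_point_mm2 * sum(points_per_section)

    def tissue_fractions(point_counts):
        """Step 2: fraction of points assigned to each tissue category."""
        total = sum(point_counts.values())
        return {tissue: n / total for tissue, n in point_counts.items()}

    counts = Counter({"hyaline cartilage": 112, "fibrocartilage": 64,
                      "fibrous tissue": 38, "bone": 20, "scaffold": 6, "other": 10})
    print(cavalieri_volume([30, 34, 28, 31], section_spacing_mm=0.4,
                           area_per_point_mm2=0.009))
    print(tissue_fractions(counts))
    ```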

  17. Performance evaluation of generalized MSK

    NASA Astrophysics Data System (ADS)

    Galko, P.; Pasupathy, S.

    The computation of the performance of several optimal and suboptimal receivers for generalized MSK is discussed. The optimal receivers considered are Viterbi receivers and the optimal receivers based on a finite observation interval. Two suboptimal receivers, (1) optimized linear receivers based on finite observation intervals and (2) linear receivers based on approximating generalized MSK as an OQPSK signal, are considered as well. It is shown that the former receiver's performance may be computed exactly, while for the latter receiver it is possible to provide arbitrarily tight bounds on the performance. These analyses are illustrated by application to the two popular generalized MSK schemes of duobinary MSK (or FSOQ) and (1 + D)^2/4 MSK.

  18. 48 CFR 236.604 - Performance evaluation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ..., DEPARTMENT OF DEFENSE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 236.604 Performance evaluation. (a) Preparation of performance reports. Use DD Form 2631, Performance Evaluation (Architect-Engineer), instead of SF 1421. (2) Prepare a...

  19. Gas turbine coatings eddy current quantitative and qualitative evaluation

    NASA Astrophysics Data System (ADS)

    Ribichini, Remo; Giolli, Carlo; Scrinzi, Erica

    2017-02-01

    Gas turbine blades (buckets) are among the most critical and expensive components of the engine. Buckets rely on protective coatings in order to withstand the harsh environment in which they operate. The thickness and the microstructure of coatings during the lifespan of a unit are fundamental to evaluating their fitness for service. A frequency-scanning eddy current instrument allows the measurement of the thickness and physical properties of coatings in a non-destructive manner. The method employed relies on the acquisition of impedance spectra and on the inversion of the experimental data to derive the coating properties and structure under certain assumptions. This article describes the experimental validation performed on several samples and real components in order to assess the performance of the instrument as a coating thickness gage. The application of the technique to support residual life assessment of serviced buckets is also presented.

  20. Quantitative magnetic resonance (MR) neurography for evaluation of peripheral nerves and plexus injuries

    PubMed Central

    Barousse, Rafael; Socolovsky, Mariano; Luna, Antonio

    2017-01-01

    Traumatic conditions of peripheral nerves and plexus have classically been evaluated by morphological imaging techniques and electrophysiological tests. New magnetic resonance imaging (MRI) studies based on 3D fat-suppressed techniques are providing high accuracy for peripheral nerve injury evaluation from a qualitative point of view. However, these techniques do not provide quantitative information. Diffusion weighted imaging (DWI) and diffusion tensor imaging (DTI) are functional MRI techniques that are able to evaluate and quantify the movement of water molecules within different biological structures. These techniques have been successfully applied in other anatomical areas, especially in the assessment of the central nervous system, and are now being adapted, with promising results, to peripheral nerve and plexus evaluation. DWI and DTI allow both qualitative and quantitative peripheral nerve analysis, providing valuable pathophysiological information about the functional integrity of these structures. In the field of trauma and peripheral nerve or plexus injury, several parameters derived from DWI and DTI studies, such as the apparent diffusion coefficient (ADC) or fractional anisotropy (FA) among others, can be used as potential biomarkers of neural damage, providing information about fiber organization, axonal flow or myelin integrity. Proper knowledge of the physical basis of these techniques and their limitations is important for an optimal interpretation of the imaging findings and derived data. In this paper, a comprehensive review of the potential applications of DWI and DTI neurographic studies is performed, with a focus on traumatic conditions, including the main nerve entrapment syndromes of both peripheral nerves and the brachial or lumbar plexus. PMID:28932698

  1. Quantitative magnetic resonance (MR) neurography for evaluation of peripheral nerves and plexus injuries.

    PubMed

    Martín Noguerol, Teodoro; Barousse, Rafael; Socolovsky, Mariano; Luna, Antonio

    2017-08-01

    Traumatic conditions of peripheral nerves and plexus have classically been evaluated by morphological imaging techniques and electrophysiological tests. New magnetic resonance imaging (MRI) studies based on 3D fat-suppressed techniques are providing high accuracy for peripheral nerve injury evaluation from a qualitative point of view. However, these techniques do not provide quantitative information. Diffusion weighted imaging (DWI) and diffusion tensor imaging (DTI) are functional MRI techniques that are able to evaluate and quantify the movement of water molecules within different biological structures. These techniques have been successfully applied in other anatomical areas, especially in the assessment of the central nervous system, and are now being adapted, with promising results, to peripheral nerve and plexus evaluation. DWI and DTI allow both qualitative and quantitative peripheral nerve analysis, providing valuable pathophysiological information about the functional integrity of these structures. In the field of trauma and peripheral nerve or plexus injury, several parameters derived from DWI and DTI studies, such as the apparent diffusion coefficient (ADC) or fractional anisotropy (FA) among others, can be used as potential biomarkers of neural damage, providing information about fiber organization, axonal flow or myelin integrity. Proper knowledge of the physical basis of these techniques and their limitations is important for an optimal interpretation of the imaging findings and derived data. In this paper, a comprehensive review of the potential applications of DWI and DTI neurographic studies is performed, with a focus on traumatic conditions, including the main nerve entrapment syndromes of both peripheral nerves and the brachial or lumbar plexus.
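
    As a concrete example of one of the quantitative parameters mentioned above, the apparent diffusion coefficient can be computed from a simple two-point DWI acquisition; the sketch below shows that textbook formula (signal values are illustrative, not from the paper).

    ```python
    import numpy as np

    def adc(signal_b0, signal_b, b_value):
        """Apparent diffusion coefficient from two diffusion weightings:
        ADC = ln(S0 / Sb) / b, in mm^2/s when b is given in s/mm^2."""
        return np.log(signal_b0 / signal_b) / b_value

    print(adc(signal_b0=1000.0, signal_b=450.0, b_value=800.0))  # ~1.0e-3 mm^2/s
    ```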

  2. Metrics for Performance Evaluation of Patient Exercises during Physical Therapy.

    PubMed

    Vakanski, Aleksandar; Ferguson, Jake M; Lee, Stephen

    2017-06-01

    The article proposes a set of metrics for the evaluation of patient performance in physical therapy exercises. A taxonomy is employed that classifies the metrics into quantitative and qualitative categories, based on the level of abstraction of the captured motion sequences. Further, the quantitative metrics are classified into model-less and model-based metrics, according to whether the evaluation employs the raw measurements of patient-performed motions, or whether the evaluation is based on a mathematical model of the motions. The reviewed metrics include root-mean-square distance, Kullback-Leibler divergence, log-likelihood, heuristic consistency, Fugl-Meyer Assessment, and similar. The metrics are evaluated for a set of five human motions captured with a Kinect sensor. The metrics can potentially be integrated into a system that employs machine learning for modelling and assessment of the consistency of patient performance in a home-based therapy setting. Automated performance evaluation can overcome the inherent subjectivity of human-performed therapy assessment, increase adherence to prescribed therapy plans, and reduce healthcare costs.
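
    The simplest of the reviewed model-less metrics, the root-mean-square distance between a reference motion and a patient-performed motion, is shown below as a minimal sketch; the array shapes and synthetic data are assumptions for illustration.

    ```python
    import numpy as np

    def rms_distance(reference, performed):
        """Model-less metric: RMS distance between two aligned motion
        sequences of shape (frames, coordinates)."""
        ref, per = np.asarray(reference), np.asarray(performed)
        return np.sqrt(np.mean(np.sum((ref - per) ** 2, axis=1)))

    rng = np.random.default_rng(0)
    reference = rng.normal(size=(200, 3))  # 200 frames of one joint's x, y, z
    performed = reference + rng.normal(scale=0.05, size=(200, 3))
    print(rms_distance(reference, performed))
    ```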

  3. Evaluating judge performance in sport.

    PubMed

    Looney, Marilyn A

    2004-01-01

    Many sports, such as gymnastics, diving, ski jumping, and figure skating, use judges' scores to determine the winner of a competition. These judges use some type of rating scale when judging performances (e.g., figure skating: 0.0-6.0). Sport governing bodies have the responsibility of setting and enforcing quality control parameters for judge performance. Given the judging scandals in figure skating at the 1998 and 2002 Olympics, judge performance in sport is receiving greater scrutiny. The purpose of this article is to illustrate how results from Rasch analyses can be used to provide in-depth feedback to judges about their scoring patterns. Nine judges' scores for 20 pairs of figure skaters who competed at the 2002 Winter Olympics were analyzed using a four-faceted (skater pair ability, skating aspect difficulty, program difficulty, and judge severity) Rasch rating scale model that was not common to all judges. Fit statistics, the logical ordering of skating aspects, skating programs, and separation indices all indicated a good fit of the data to the model. The type of feedback that can be given to judges about their scoring pattern was illustrated for one judge (USA) whose performance was flagged as being unpredictable. Feedback included a detailed description of how the rating scale was used; for example, 10% of all marks given by the American judge were unexpected by the model (|Z| > 2). Three figures illustrated differences between the judge's observed and expected marks arranged according to the pairs' skating order and final placement in the competition. Scores which may represent "nationalistic bias" or a skating order influence were flagged by looking at these figures. If sport governing bodies wish to improve the performance of their judges, they need to employ methods that monitor the internal consistency of each judge, as a many-facet Rasch analysis does.

  4. Quantitative Integrated Evaluation in the Mars Basin, Gulf of Mexico

    NASA Astrophysics Data System (ADS)

    Tichelaar, B. W.; Detomo, R.

    2005-05-01

    Today's exploitation of hydrocarbons in the Deepwater Gulf of Mexico requires a subtle, sophisticated class of opportunities for which uncertainties must be quantified to reduce risk. The explorer is often faced with non-amplitude supported hydrocarbon accumulations, limitations of seismic imaging, and uncertainty in stratigraphy and hydrocarbon kitchens, all in an environment of still-maturing technology and rising drilling costs. However, many of the fundamental Exploration processes that drove the industry in the past in the Gulf of Mexico still apply today. Integration of these historically proven processes with each other and with new technologies, supported by a growing body of knowledge, has provided a significant new methodology for wildcat and near-field Exploration. Even in mature fields, additional opportunities are seldom characterized by unambiguous attributes of direct hydrocarbon indicators or amplitude support. Shell's Quantitative Integrated Evaluation process relies upon visualization of integrated volume-based stratigraphic models of rock and fluid properties, and by relating these properties to measured and predicted seismic responses. An attribute referred to as the Differential Generalized Attribute, which summarizes the differences between multiple scenario response predictions and actual measured data, can then be used to distinguish likely scenarios from unlikely scenarios. This methodology allows competing scenarios to be rapidly tested against the data, and is built upon proprietary knowledge of the physical processes and relationships that likely drive vertical and lateral variation in these models. We will demonstrate the methodology by showing a portion of the Mars Basin and describing the integrated capability that is emplaced at the Exploration phase, and matured throughout the Appraisal, Development and Production life cycle of a basin discovery.

  5. Dual-band infrared thermography for quantitative nondestructive evaluation

    SciTech Connect

    Durbin, P.F.; Del Grande, N.K.; Dolan, K.W.; Perkins, D.E.; Shapiro, A.B.

    1993-04-01

    The authors have developed dual-band infrared (DBIR) thermography that is being applied to quantitative nondestructive evaluation (NDE) of aging aircraft. The DBIR technique resolves 0.2 °C surface temperature differences for inspecting interior flaws in heated aircraft structures. It locates cracks, corrosion sites, disbonds or delaminations in metallic laps and composite patches. By removing clutter from surface roughness effects, the authors clarify interpretation of subsurface flaws. To accomplish this, the authors ratio images recorded at two infrared bands, centered near 5 microns and 10 microns. These image ratios are used to decouple temperature patterns associated with interior flaw sites from spatially varying surface emissivity noise. They also discuss three-dimensional (3D) dynamic thermal imaging of structural flaws using dual-band infrared (DBIR) computed tomography. Conventional thermography provides single-band infrared images which are difficult to interpret. Standard procedures yield imprecise (or qualitative) information about subsurface flaw sites which are typically masked by surface clutter. They use a DBIR imaging technique pioneered at LLNL to capture the time history of surface temperature difference patterns for flash-heated targets. They relate these patterns to the location, size, shape and depth of subsurface flaws. They have demonstrated temperature accuracies of 0.2 °C, timing synchronization of 3 ms (after onset of heat flash) and intervals of 42 ms between images, during an 8 s cooling (and heating) interval characterizing the front (and back) surface temperature-time history of an epoxy-glue disbond site in a flash-heated aluminum lap joint.
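
    The core of the DBIR trick, ratioing co-registered images from the two bands so that spatially varying emissivity largely cancels while temperature patterns remain, can be sketched in a few lines; the array contents below are synthetic stand-ins, not real radiance data.

    ```python
    import numpy as np

    def dbir_ratio(band_5um, band_10um, eps=1e-6):
        """Elementwise ratio of co-registered 5-micron and 10-micron images.
        Emissivity clutter scales both bands similarly and largely cancels;
        temperature differences from subsurface flaws do not."""
        return np.asarray(band_5um, float) / (np.asarray(band_10um, float) + eps)

    rng = np.random.default_rng(0)
    b5 = rng.uniform(0.9, 1.1, (64, 64))   # synthetic 5-micron radiance image
    b10 = 0.98 * b5 + 0.01                 # crudely correlated clutter
    print(dbir_ratio(b5, b10).mean())
    ```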

  6. Predicting Team Performance through Human Behavioral Sensing and Quantitative Workflow Instrumentation

    DTIC Science & Technology

    2016-07-27

    on team performance. Proceedings of the Human Factors and Ergonomics Society, Seattle, WA, pp. 630-634 (1993). 14. Andres, H.P. The impact of... Predicting Team Performance Through Human Behavioral Sensing and Quantitative Workflow Instrumentation. Matthew Daggett, Kyle O'Brien, Michael... study of human-system interactions, and joint qualitative-quantitative methodologies are being developed to improve human performance characterization

  7. A new performance evaluation tool

    SciTech Connect

    Kindl, F.H.

    1996-12-31

    The paper describes a Steam Cycle Diagnostic Program (SCDP) that has been specifically designed to respond to the increasing need of electric power generators for periodic performance monitoring and quick identification of the causes of any observed increase in fuel consumption. There is a description of program objectives, modeling and test data inputs, results, underlying program logic, validation of program accuracy by comparison with acceptance-test-quality data, and examples of program usage.

  8. GPS User Equipment Performance Evaluation.

    DTIC Science & Technology

    1979-11-01

    Unit Device Controller Assembly (UDCA) * Satellite Signal Generator Assembly (SSGA) * Dynamic Frequency Synthesizer Assembly (DFSA) * Jamming... Post Run Data (PR)) under operator control. The UDCA performs the following primary functions: * Receive digital data from the DPA, buffer and transfer to... collected by the UDCA, reformatted, and sent to the DPA, and vice versa, via a data bus interface. The UDCA consists of the following elements

  9. Lubricant Evaluation and Performance 2

    DTIC Science & Technology

    1994-02-01

    POWER DIRECTORATE WRIGHT LABORATORY AIR FORCE MATERIEL COMMAND WRIGHT-PATTERSON AIR FORCE BASE, OHIO 45433-7103 NOTICE When government... INTRODUCTION 1 II. DEVELOPMENT OF IMPROVED METHODS FOR MEASURING LUBRICANT PERFORMANCE 3 1. OXIDATIVE STABILITY OF ESTER BASE LUBRICANTS 3 a. Introduction 3 b... (7) Conclusions 134 i. Stability Testing of Cyclophosphazene Based Fluids 134 (1) Introduction 134 (2) Effect of Metal Specimens 134 (3) Effect of a

  10. Quantitative performance metrics for robustness in circadian rhythms.

    PubMed

    Bagheri, Neda; Stelling, Jörg; Doyle, Francis J

    2007-02-01

    Sensitivity analysis provides key measures that aid in unraveling the design principles responsible for the robust performance of biological networks. Such metrics allow researchers to comprehensively investigate model performance, to develop more realistic models, and to design informative experiments. However, sensitivity analysis of oscillatory systems has focused on period and amplitude characteristics, while biologically relevant effects on phase are neglected. Here, we introduce a novel set of phase-based sensitivity metrics for performance: period, phase, corrected phase and relative phase. Both state- and phase-based tools are applied to free-running Drosophila melanogaster and Mus musculus circadian models. Each metric produces unique sensitivity values used to rank parameters from least to most sensitive. Similarities among the resulting rank distributions strongly suggest a conservation of sensitivity with respect to parameter function and type. A consistent result, for instance, is that the model performance of biological oscillators is more sensitive to global parameters than to local (i.e. circadian specific) parameters. Discrepancies among these distributions highlight the individual metrics' definitions of performance, as specific parametric sensitivity values depend on the defined metric, or output. An implementation of the algorithm in MATLAB (Mathworks, Inc.) is available from the authors. Supplementary Data are available at Bioinformatics online.
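
    The kind of normalized period sensitivity described here can be approximated by finite differences on any oscillator model; the sketch below does this for a van der Pol oscillator as a stand-in (the paper's Drosophila and Mus models are not reproduced), and the step sizes and crossing heuristics are assumptions.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    def period(mu, t_end=200.0):
        """Limit-cycle period of a van der Pol oscillator, estimated from
        upward zero crossings of x after the transient has decayed."""
        rhs = lambda t, y: [y[1], mu * (1 - y[0] ** 2) * y[1] - y[0]]
        sol = solve_ivp(rhs, (0.0, t_end), [2.0, 0.0], max_step=0.01)
        x, t = sol.y[0], sol.t
        crossings = t[1:][(x[:-1] < 0) & (x[1:] >= 0)]
        return np.mean(np.diff(crossings[5:]))  # skip early transient cycles

    def period_sensitivity(mu, rel_step=1e-3):
        """Normalized sensitivity S = (dT/T) / (dmu/mu) by central differences."""
        d = rel_step * mu
        return (period(mu + d) - period(mu - d)) / (2 * d) * mu / period(mu)

    print(period_sensitivity(1.0))
    ```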

  11. Quantitative evaluation of all hexamers as exonic splicing elements.

    PubMed

    Ke, Shengdong; Shang, Shulian; Kalachikov, Sergey M; Morozova, Irina; Yu, Lin; Russo, James J; Ju, Jingyue; Chasin, Lawrence A

    2011-08-01

    We describe a comprehensive quantitative measure of the splicing impact of a complete set of RNA 6-mer sequences by deep sequencing successfully spliced transcripts. All 4096 6-mers were substituted at five positions within two different internal exons in a 3-exon minigene, and millions of successfully spliced transcripts were sequenced after transfection of human cells. The results allowed the assignment of a relative splicing strength score to each mutant molecule. The effect of 6-mers on splicing often depended on their location; much of this context effect could be ascribed to the creation of different overlapping sequences at each site. Taking these overlaps into account, the splicing effect of each 6-mer could be quantified, and 6-mers could be designated as enhancers (ESEseqs) and silencers (ESSseqs), with an ESRseq score indicating their strength. Some 6-mers exhibited positional bias relative to the two splice sites. The distribution and conservation of these ESRseqs in and around human exons supported their classification. Predicted RNA secondary structure effects were also seen: Effective enhancers, silencers and 3' splice sites tend to be single stranded, and effective 5' splice sites tend to be double stranded. 6-mers that may form positive or negative synergy with another were also identified. Chromatin structure may also influence the splicing enhancement observed, as a good correspondence was found between splicing performance and the predicted nucleosome occupancy scores of 6-mers. This approach may prove of general use in defining nucleic acid regulatory motifs, substitute for functional SELEX in most cases, and provide insights about splicing mechanisms.
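
    The scoring idea, an enrichment of each 6-mer among successfully spliced transcripts relative to the transfected pool, can be illustrated with a log-ratio; note that the published ESRseq score also accounts for insertion sites and overlap corrections, so the sketch below captures only the core enrichment step, with invented counts.

    ```python
    import math

    def esr_score(spliced_count, input_count, total_spliced, total_input):
        """Simplified relative splicing strength for one 6-mer: log2 of its
        frequency among spliced reads over its frequency in the input pool.
        Positive values are enhancer-like (ESEseq), negative silencer-like
        (ESSseq)."""
        return math.log2((spliced_count / total_spliced) /
                         (input_count / total_input))

    print(esr_score(spliced_count=900, input_count=500,
                    total_spliced=1_000_000, total_input=1_000_000))  # ~0.85
    ```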

  12. INTEGRATED WATER TREATMENT SYSTEM PERFORMANCE EVALUATION

    SciTech Connect

    SEXTON RA; MEEUWSEN WE

    2009-03-12

    This document describes the results of an evaluation of the current Integrated Water Treatment System (IWTS) operation against design performance, and a determination of short-term and long-term actions recommended to sustain IWTS performance.

  13. Model Performance Evaluation and Scenario Analysis (MPESA)

    EPA Pesticide Factsheets

    Model Performance Evaluation and Scenario Analysis (MPESA) assesses the performance with which models predict time series data. The tool was developed for the Hydrological Simulation Program-Fortran (HSPF) and the Stormwater Management Model (SWMM).

  14. Infrared radiometric technique for rapid quantitative evaluation of heat flux distribution over large areas

    NASA Technical Reports Server (NTRS)

    Glazer, Stuart; Siebes, Georg

    1989-01-01

    This paper describes a novel approach for rapid, quantitative measurement of spatially distributed heat flux incident on a plane. The technique utilizes the spatial temperature distribution on an opaque thin film at the location of interest, as measured by an imaging infrared radiometer. Knowledge of the film's radiative properties, plus quantitative estimates of convection cooling, permits the steady-state energy balance at any location on the film sheet to be solved for the incident heat flux. Absolute accuracies on the order of 10-15 percent have been obtained in tests performed in air. The method is particularly useful for evaluation of spatial heat flux uniformity from distributed heat sources over large areas. It has recently been used in several applications at the Jet Propulsion Laboratory, including flux uniformity measurements from large distributed quartz lamp arrays used during thermal vacuum testing of several spacecraft components, and flux mapping of a low power Nd:YAG laser beam.
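
    One illustrative form of the steady-state energy balance described above, absorbed flux balancing radiative emission plus convective loss for an opaque film, is sketched below; the single-face balance and all numeric values are simplifying assumptions (the actual analysis may also account for both film faces, conduction, and ambient irradiation).

    ```python
    SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

    def incident_flux(t_film_k, t_ambient_k, emissivity, absorptivity, h_conv):
        """Solve alpha * q = eps * sigma * T^4 + h * (T - T_amb) for the
        incident heat flux q (W/m^2) at one point on the film."""
        radiated = emissivity * SIGMA * t_film_k ** 4
        convected = h_conv * (t_film_k - t_ambient_k)
        return (radiated + convected) / absorptivity

    print(incident_flux(t_film_k=330.0, t_ambient_k=295.0,
                        emissivity=0.95, absorptivity=0.95, h_conv=10.0))
    ```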

  15. S-191 sensor performance evaluation

    NASA Technical Reports Server (NTRS)

    Hughes, C. L.

    1975-01-01

    A final analysis was performed on the Skylab S-191 spectrometer data received from missions SL-2, SL-3, and SL-4. The repeatability and accuracy of the S-191 spectroradiometric internal calibration was determined by correlation to the output obtained from well-defined external targets. These included targets on the moon and earth as well as deep space. In addition, the accuracy of the S-191 short wavelength autocalibration was flight checked by correlation of the earth resources experimental package S-191 outputs and the Backup Unit S-191 outputs after viewing selected targets on the moon.

  16. Evaluation of solar pond performance

    SciTech Connect

    Wittenberg, L.J.

    1980-01-01

    The City of Miamisburg, Ohio, constructed during 1978 a large, salt-gradient solar pond as part of its community park development project. The thermal energy stored in the pond is being used to heat an outdoor swimming pool in the summer and an adjacent recreational building during part of the winter. This solar pond, which occupies an area of 2020 m² (22,000 sq. ft.), was designed from experience obtained at smaller research ponds located at Ohio State University, the University of New Mexico and similar ponds operated in Israel. During the summer of 1979, the initial heat (40,000 kWh, 136 million Btu) was withdrawn from the solar pond to heat the outdoor swimming pool. All of the data collection systems were installed and functioned as designed so that operational data were obtained. The observed performance of the pond was compared with several of the predicted models for this type of pond. (MHR)

  17. Quantitative polarization and flow evaluation of choroid and sclera by multifunctional Jones matrix optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Sugiyama, S.; Hong, Y.-J.; Kasaragod, D.; Makita, S.; Miura, M.; Ikuno, Y.; Yasuno, Y.

    2016-03-01

    Quantitative evaluation of the optical properties of the choroid and sclera is performed by multifunctional optical coherence tomography. Five normal eyes, five glaucoma eyes and one choroidal atrophy eye are examined. The refractive error was found to be correlated with choroidal birefringence, polarization uniformity, and flow, in addition to scleral birefringence, among normal eyes. Significant differences were observed between the normal and the glaucoma eyes for choroidal polarization uniformity, flow and scleral birefringence. An automatic segmentation algorithm for the retinal pigment epithelium and chorioscleral interface based on multifunctional signals is also presented.

  18. Diffusion tensor imaging with quantitative evaluation and fiber tractography of lumbar nerve roots in sciatica.

    PubMed

    Shi, Yin; Zong, Min; Xu, Xiaoquan; Zou, Yuefen; Feng, Yang; Liu, Wei; Wang, Chuanbing; Wang, Dehang

    2015-04-01

    To quantitatively evaluate nerve roots by measuring fractional anisotropy (FA) values in healthy volunteers and sciatica patients, to visualize nerve roots by tractography, and to compare the diagnostic efficacy of conventional magnetic resonance imaging (MRI) and DTI. Seventy-five sciatica patients and thirty-six healthy volunteers underwent MR imaging using DTI. FA values for the L5-S1 lumbar nerve roots were calculated at three levels from the DTI images. Tractography was performed on the L3-S1 nerve roots. ROC analysis was performed for the FA values. The lumbar nerve roots were visualized and FA values were calculated in all subjects. FA values decreased in compressed nerve roots and declined from proximal to distal along the compressed nerve tracts. Mean FA values were more sensitive and specific than MR imaging for differentiating compressed nerve roots, especially in the far lateral zone at the distal nerves. DTI can quantitatively evaluate compressed nerve roots, and diffusion tensor tractography (DTT) enables visualization of abnormal nerve tracts, providing vivid anatomic information and localization of probable nerve compression. DTI has great potential utility for evaluating lumbar nerve compression in sciatica. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
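
    FA itself is a fixed function of the diffusion tensor eigenvalues; the short sketch below computes it for illustrative eigenvalue triples (the numeric values are examples, not measurements from this study).

    ```python
    import numpy as np

    def fractional_anisotropy(l1, l2, l3):
        """FA from the three diffusion tensor eigenvalues:
        FA = sqrt(3/2) * ||lambda - MD|| / ||lambda||, with MD the mean."""
        lam = np.array([l1, l2, l3], dtype=float)
        md = lam.mean()
        return np.sqrt(1.5) * np.linalg.norm(lam - md) / np.linalg.norm(lam)

    # Strongly anisotropic (healthy-nerve-like) vs more isotropic diffusion.
    print(fractional_anisotropy(1.7e-3, 0.3e-3, 0.3e-3))  # ~0.80
    print(fractional_anisotropy(1.0e-3, 0.7e-3, 0.7e-3))  # ~0.21
    ```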

  19. Evaluating quantitative formulas for dose-response assessment of chemical mixtures.

    PubMed

    Hertzberg, Richard C; Teuschler, Linda K

    2002-12-01

    Risk assessment formulas are often distinguished from dose-response models by being rough but necessary. The evaluation of these rough formulas is described here, using the example of mixture risk assessment. Two conditions make the dose-response part of mixture risk assessment difficult: a lack of data on mixture dose-response relationships, and the need to address risk from combinations of chemicals because of public demands and statutory requirements. Consequently, the U.S. Environmental Protection Agency has developed methods for carrying out quantitative dose-response assessment for chemical mixtures that require information only on the toxicity of single chemicals and of chemical pair interactions. These formulas are based on plausible ideas and default parameters but minimal supporting data on whole mixtures. Because of this lack of mixture data, the usual evaluation of accuracy (predicted vs. observed) cannot be performed. Two approaches to the evaluation of such formulas are to consider fundamental biological concepts that support the quantitative formulas (e.g., toxicologic similarity) and to determine how well the proposed method performs under simplifying constraints (e.g., as the toxicologic interactions disappear). These ideas are illustrated using dose addition and two weight-of-evidence formulas for incorporating toxicologic interactions.
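
    The dose-addition approach mentioned above is commonly operationalized as a hazard index; the sketch below shows that standard screening calculation (values are illustrative, and the interaction-based weight-of-evidence adjustments evaluated in the paper are not reproduced).

    ```python
    def hazard_index(doses, reference_doses):
        """Dose-addition screening formula: HI = sum(D_i / RfD_i).
        HI > 1 flags potential concern under the additivity assumption."""
        return sum(d / rfd for d, rfd in zip(doses, reference_doses))

    # Three components at half, a quarter, and a tenth of their reference doses.
    print(hazard_index([0.5, 0.25, 0.1], [1.0, 1.0, 1.0]))  # 0.85, below 1
    ```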

  20. Quantitative evaluation of peptide-extraction methods by HPLC-triple-quad MS-MS.

    PubMed

    Du, Yan; Wu, Dapeng; Wu, Qian; Guan, Yafeng

    2015-02-01

    In this study, the efficiency of five peptide-extraction methods—acetonitrile (ACN) precipitation, ultrafiltration, C18 solid-phase extraction (SPE), dispersed SPE with mesoporous carbon CMK-3, and mesoporous silica MCM-41—was quantitatively investigated. With 28 tryptic peptides as target analytes, these methods were evaluated on the basis of recovery and reproducibility by using high-performance liquid chromatography-triple-quad tandem mass spectrometry in selected-reaction-monitoring mode. Because of the distinct extraction mechanisms of the methods, their preferences for extracting peptides of different properties were revealed to be quite different, usually depending on the pI values or hydrophobicity of peptides. When target peptides were spiked in bovine serum albumin (BSA) solution, the extraction efficiency of all the methods except ACN precipitation changed significantly. The binding of BSA with target peptides and nonspecific adsorption on adsorbents were believed to be the ways through which BSA affected the extraction behavior. When spiked in plasma, the performance of all five methods deteriorated substantially, with the number of peptides having recoveries exceeding 70% being 15 for ACN precipitation, and none for the other methods. Finally, the methods were evaluated in terms of the number of identified peptides for extraction of endogenous plasma peptides. Only ultrafiltration and CMK-3 dispersed SPE performed differently from the quantitative results with target peptides, and the wider distribution of the properties of endogenous peptides was believed to be the main reason.

  1. Quantitative evaluation of noise reduction and vesselness filters for liver vessel segmentation on abdominal CTA images

    NASA Astrophysics Data System (ADS)

    Luu, Ha Manh; Klink, Camiel; Moelker, Adriaan; Niessen, Wiro; van Walsum, Theo

    2015-05-01

    Liver vessel segmentation in CTA images is a challenging task, especially in the case of noisy images. This paper investigates whether pre-filtering improves liver vessel segmentation in 3D CTA images. We introduce a quantitative evaluation of several well-known filters based on a proposed liver vessel segmentation method on CTA images. We compare the effect of different diffusion techniques, i.e. Regularized Perona-Malik, Hybrid Diffusion with Continuous Switch and Vessel Enhancing Diffusion, as well as the vesselness approaches proposed by Sato, Frangi and Erdt. Liver vessel segmentation of the pre-processed images is performed using histogram-based region growing with local maxima as seed points. Quantitative measurements (sensitivity, specificity and accuracy) are determined based on manual landmarks inside and outside the vessels, followed by t-tests for statistical comparisons on 51 clinical CTA images. The evaluation demonstrates that all the filters give liver vessel segmentation a significantly higher accuracy than not using a filter (p < 0.05); Hybrid Diffusion with Continuous Switch achieves the best performance. Compared to the diffusion filters, vesselness filters have a greater sensitivity but lower specificity. In addition, the proposed liver vessel segmentation method with pre-filtering is shown to perform robustly on a clinical dataset with a low contrast-to-noise ratio (up to 3 dB). The results indicate that the pre-filtering step significantly improves liver vessel segmentation on 3D CTA images.

  2. Segmentation and quantitative evaluation of brain MRI data with a multiphase 3D implicit deformable model

    NASA Astrophysics Data System (ADS)

    Angelini, Elsa D.; Song, Ting; Mensh, Brett D.; Laine, Andrew

    2004-05-01

    Segmentation of three-dimensional anatomical brain images into tissue classes has applications in both clinical and research settings. This paper presents the implementation and quantitative evaluation of a four-phase, three-dimensional active contour implemented within a level set framework for automated segmentation of brain MRIs. The segmentation algorithm performs an optimal partitioning of three-dimensional data based on homogeneity measures that naturally evolves to the extraction of different tissue types in the brain. Random seed initialization was used to speed up numerical computation and avoid the need for a priori information. This random initialization ensures robustness of the method to variation of user expertise, biased a priori information, and errors in input information that could be influenced by variations in image quality. Experimentation on three MRI brain data sets showed that an optimal partitioning successfully labeled regions that accurately identified white matter, gray matter and cerebrospinal fluid in the ventricles. Quantitative evaluation of the segmentation was performed by comparison with manually labeled data, computing false positive and false negative assignments of voxels for the three tissue types. We report high accuracy for the two comparison cases. These results demonstrate the efficiency and flexibility of this segmentation framework to perform the challenging task of automatically extracting brain tissue volume contours.

  3. Quantitative evaluation of noise reduction and vesselness filters for liver vessel segmentation on abdominal CTA images.

    PubMed

    Luu, Ha Manh; Klink, Camiel; Moelker, Adriaan; Niessen, Wiro; van Walsum, Theo

    2015-05-21

    Liver vessel segmentation in CTA images is a challenging task, especially in the case of noisy images. This paper investigates whether pre-filtering improves liver vessel segmentation in 3D CTA images. We introduce a quantitative evaluation of several well-known filters based on a proposed liver vessel segmentation method on CTA images. We compare the effect of different diffusion techniques, i.e. Regularized Perona-Malik, Hybrid Diffusion with Continuous Switch and Vessel Enhancing Diffusion, as well as the vesselness approaches proposed by Sato, Frangi and Erdt. Liver vessel segmentation of the pre-processed images is performed using histogram-based region growing with local maxima as seed points. Quantitative measurements (sensitivity, specificity and accuracy) are determined based on manual landmarks inside and outside the vessels, followed by t-tests for statistical comparisons on 51 clinical CTA images. The evaluation demonstrates that all the filters give liver vessel segmentation a significantly higher accuracy than not using a filter (p < 0.05); Hybrid Diffusion with Continuous Switch achieves the best performance. Compared to the diffusion filters, vesselness filters have a greater sensitivity but lower specificity. In addition, the proposed liver vessel segmentation method with pre-filtering is shown to perform robustly on a clinical dataset with a low contrast-to-noise ratio (up to 3 dB). The results indicate that the pre-filtering step significantly improves liver vessel segmentation on 3D CTA images.
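
    Of the filters compared in these two records, the Frangi vesselness filter has a widely available implementation (skimage.filters.frangi, used below); the sketch applies it to a synthetic volume and thresholds the response as a crude stand-in for the paper's histogram-based region growing, with the sigmas and threshold chosen arbitrarily.

    ```python
    import numpy as np
    from skimage.filters import frangi

    def prefilter_and_segment(volume, sigmas=(1, 2, 3), threshold=0.05):
        """Enhance tubular structures with Frangi vesselness, then apply a
        global threshold as a simple placeholder segmentation."""
        vesselness = frangi(volume.astype(float), sigmas=sigmas,
                            black_ridges=False)  # bright vessels, dark background
        return vesselness > threshold

    volume = np.random.default_rng(0).normal(size=(32, 64, 64))
    mask = prefilter_and_segment(volume)
    print(mask.sum(), "voxels flagged as vessel")
    ```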

  4. Laser Plasma Microthruster Performance Evaluation

    NASA Astrophysics Data System (ADS)

    Luke, James R.; Phipps, Claude R.

    2003-05-01

    The micro laser plasma thruster (μLPT) is a sub-kilogram thruster that is capable of meeting the Air Force requirements for the attitude control system on a 100-kg class small satellite. The μLPT uses one or more 4 W diode lasers to ablate a solid fuel, producing a jet of hot gas or plasma that creates thrust with a high thrust-to-power ratio. A pre-prototype continuous thrust experiment has been constructed and tested. The continuous thrust experiment uses a 505 mm long continuous-loop fuel tape, which consists of a black laser-absorbing fuel material on a transparent plastic substrate. When the laser is operated continuously, the exhaust plume and thrust vector are steered in the direction of the tape motion. Thrust steering can be avoided by pulsing the laser. A torsion pendulum thrust stand has been constructed and calibrated. Many fuel materials and substrates have been tested. Best performance from a non-energetic fuel material was obtained with black polyvinyl chloride (PVC), which produced an average thrust of 70 μN and a coupling coefficient (Cm) of 190 μN/W. A proprietary energetic material was also tested, in which the laser initiates a non-propagating detonation. This material produced 500 μN of thrust.

  5. [Evaluation of dental plaque by quantitative digital image analysis system].

    PubMed

    Huang, Z; Luan, Q X

    2016-04-18

    To analyze plaque staining images using image analysis software, to verify the maneuverability, practicability and repeatability of this technique, and to evaluate the influence of different plaque stains. In the study, 30 volunteers were enrolled from among the new dental students of Peking University Health Science Center in accordance with the inclusion criteria. Digital images of the anterior teeth were acquired after plaque staining, according to a standardized photographic protocol. The image analysis was performed using Image Pro Plus 7.0, and the Quigley-Hein plaque indexes of the anterior teeth were evaluated. The plaque stain area percentage and the corresponding dental plaque index were highly correlated, with a Spearman correlation coefficient of 0.776 (P<0.01). Intraclass correlation coefficients of the tooth area and plaque area, as calculated by two researchers using the software, were 0.956 and 0.930 (P<0.01). The Bland-Altman analysis chart showed only a few points outside the 95% consistency boundaries. Analysis of images with different plaque stains showed that the difference in tooth area measurements was not significant, while the difference in plaque area measurements was significant (P<0.01). This method is easy to operate and control, correlates strongly with the calculated percentage of plaque area and the traditional plaque index, and has good reproducibility. The plaque staining method has little effect on image segmentation results. A plaque stain sensitive for image analysis is suggested.

  6. Distance estimation from acceleration for quantitative evaluation of Parkinson tremor.

    PubMed

    Jeon, Hyoseon; Kim, Sang Kyong; Jeon, BeomSeok; Park, Kwang Suk

    2011-01-01

    The purpose of this paper is to assess Parkinson tremor by estimating its actual distance amplitude. We propose a practical, useful and simple method for evaluating Parkinson tremor as a distance value. We measured the resting tremor of 7 Parkinson's disease (PD) patients with a triaxial accelerometer. The resting tremor of the participants was rated on the Unified Parkinson's Disease Rating Scale (UPDRS) by a neurologist. First, we segmented 7-second windows of the acceleration signal from the recorded data. To estimate the displacement of the tremor, we performed double integration of the acceleration. Prior to double integration, a moving-average method was used to reduce the error from the integration constant. After estimation of the displacement, we calculated the tremor distance during 1 s from the segmented signal using Euclidean distance. We evaluated the distance values against the UPDRS. The averaged moving distance during 1 second corresponding to UPDRS 1 was 11.52 mm, that of UPDRS 2 was 33.58 mm, and the tremor distance of UPDRS 3 was 382.22 mm. The estimated moving distance during 1 s was proportional to the clinical rating scale--UPDRS.
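
    The processing chain described (moving-average detrending, double integration, distance accumulated over 1-s windows) is easy to prototype; the sketch below works on a single acceleration axis for brevity, and the sampling rate, window lengths and synthetic signal are assumptions rather than the paper's settings.

    ```python
    import numpy as np
    from scipy.integrate import cumulative_trapezoid

    def tremor_distance(accel, fs, window_s=1.0, ma_len=50):
        """Per-window path length from one acceleration axis: moving-average
        detrend, double integration, then summed point-to-point distance."""
        a = np.asarray(accel, float)
        a = a - np.convolve(a, np.ones(ma_len) / ma_len, mode="same")
        t = np.arange(a.size) / fs
        vel = cumulative_trapezoid(a, t, initial=0.0)
        vel = vel - np.convolve(vel, np.ones(ma_len) / ma_len, mode="same")
        pos = cumulative_trapezoid(vel, t, initial=0.0)
        win = int(window_s * fs)
        return [np.abs(np.diff(pos[i:i + win])).sum()
                for i in range(0, a.size - win + 1, win)]

    fs = 100.0
    t = np.arange(0, 7, 1 / fs)
    accel = 2.0 * np.sin(2 * np.pi * 5 * t)  # synthetic 5 Hz rest tremor, m/s^2
    print([f"{d * 1000:.1f} mm" for d in tremor_distance(accel, fs)])
    ```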

  7. Quantitative Evaluation System of Soft Neurological Signs for Children with Attention Deficit Hyperactivity Disorder.

    PubMed

    Kaneko, Miki; Yamashita, Yushiro; Iramina, Keiji

    2016-01-18

    Attention deficit hyperactivity disorder (ADHD) is a neurodevelopmental disorder characterized by symptoms of inattention, hyperactivity, and impulsivity. Soft neurological signs (SNS) are minor neurological abnormalities in motor performance, and are used as one evaluation method for neurodevelopmental delays in children with ADHD. Our aim is to establish a quantitative evaluation system for children with ADHD. We focused on the arm movement called pronation and supination, which is one such soft neurological sign. Thirty-three children with ADHD aged 7-11 years (27 males, six females) and twenty-five adult participants aged 21-29 years (19 males, six females) participated in our experiments. Our results suggest that the pronation and supination function in children with ADHD tends to lag behind that of typically developing children by several years. From these results, our system shows potential for objectively evaluating the neurodevelopmental delay of children with ADHD.

  8. Clinical evaluator reliability for quantitative and manual muscle testing measures of strength in children.

    PubMed

    Escolar, D M; Henricson, E K; Mayhew, J; Florence, J; Leshner, R; Patel, K M; Clemens, P R

    2001-06-01

    Measurements of muscle strength in clinical trials of Duchenne muscular dystrophy have relied heavily on manual muscle testing (MMT). The high level of intra- and interrater variability of MMT compromises clinical study results. We compared the reliability of 12 clinical evaluators in performing MMT and quantitative muscle testing (QMT) on 12 children with muscular dystrophy. QMT was reliable, with an intraclass correlation coefficient (ICC) of >0.9 for biceps and grip strength, and >0.8 for quadriceps strength. Training of both subjects and evaluators was easily accomplished. MMT was not as reliable, and required repeated training of evaluators to bring all groups to an ICC >0.75 for shoulder abduction, elbow and hip flexion, knee extension, and ankle dorsiflexion. We conclude that QMT shows greater reliability and is easier to implement than MMT. Consequently, QMT will be a superior measure of strength for use in pediatric, neuromuscular, multicenter clinical trials.

  9. Evaluation of board performance in Iran's universities of medical sciences.

    PubMed

    Sajadi, Haniye Sadat; Maleki, Mohammadreza; Ravaghi, Hamid; Farzan, Homayoun; Aminlou, Hasan; Hadi, Mohammad

    2014-10-01

    The critical role that the board plays in the governance of universities makes it necessary to evaluate board performance. This study aimed to evaluate the performance of the boards of medical universities and to provide solutions for enhancing it. The first phase of the present study was qualitative research in which data were collected through face-to-face semi-structured interviews and analyzed using a thematic approach. The second phase was a mixed qualitative and quantitative study, with the quantitative part in cross-sectional format and the qualitative part in content analysis format. In the quantitative part, data were collected through the Ministry of Health and Medical Education (MoHME). In the qualitative part, the content of 2,148 resolutions, selected using a stratified sampling method, was analyzed. Participants believed that the boards had not performed acceptably for a long time. Results also indicated an increasing number of board meetings and resolutions over these 21 years. The boards' resolutions were mostly operational in domain and administrative in nature. The share of specific resolutions was greater than that of general ones. Given the current pace of change and development and the need to respond to them in a timely manner, it is recommended that the boards' slow process of improvement be accelerated. More delegation of authority and strengthening of the boards' position appear to be effective strategies for speeding up this process.

  10. Theory and Practice on Teacher Performance Evaluation

    ERIC Educational Resources Information Center

    Yonghong, Cai; Chongde, Lin

    2006-01-01

    Teacher performance evaluation plays a key role in educational personnel reform, so it has been an important yet difficult issue in educational reform. Previous evaluations of teachers failed to make a strict distinction among the three dominant types of evaluation, namely, capability, achievement, and effectiveness. Moreover, teacher performance…

  11. Quantitative Guidance for Stove Usage and Performance to Achieve Health and Environmental Targets.

    PubMed

    Johnson, Michael A; Chiang, Ranyee A

    2015-08-01

    Displacing the use of polluting and inefficient cookstoves in developing countries is necessary to achieve the potential health and environmental benefits sought through clean cooking solutions. Yet little quantitative context has been provided on how much displacement of traditional technologies is needed to achieve targets for household air pollutant concentrations or fuel savings. This paper provides instructive guidance on the usage of cooking technologies required to achieve health and environmental improvements. We evaluated different scenarios of displacement of traditional stoves with use of higher performing technologies. The air quality and fuel consumption impacts were estimated for these scenarios using a single-zone box model of indoor air quality and ratios of thermal efficiency. Stove performance and usage should be considered together, as lower performing stoves can result in similar or greater benefits than a higher performing stove if the lower performing stove has considerably higher displacement of the baseline stove. Based on the indoor air quality model, there are multiple performance-usage scenarios for achieving modest indoor air quality improvements. To meet World Health Organization guidance levels, however, three-stone fire and basic charcoal stove usage must be nearly eliminated to achieve the particulate matter target (< 1-3 hr/week), and substantially limited to meet the carbon monoxide guideline (< 7-9 hr/week). Moderate health gains may be achieved with various performance-usage scenarios. The greatest benefits are estimated to be achieved by near-complete displacement of traditional stoves with clean technologies, emphasizing the need to shift in the long term to near exclusive use of clean fuels and stoves. The performance-usage scenarios are also provided as a tool to guide technology selection and prioritize behavior change opportunities to maximize impact.
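
    The "single-zone box model" invoked above has a simple steady-state form that makes the performance-usage trade-off concrete; the kitchen volume, air exchange rate and emission rates below are illustrative assumptions, not the paper's inputs.

    ```python
    def steady_state_concentration(emission_mg_h, volume_m3, aer_per_h,
                                   background_mg_m3=0.0):
        """Steady-state single-zone box model: C = background + E / (V * AER),
        where E is the pollutant emission rate into the room."""
        return background_mg_m3 + emission_mg_h / (volume_m3 * aer_per_h)

    # A 30 m^3 kitchen at 15 air changes per hour, for three stove emission rates.
    for emission in (50.0, 5.0, 0.5):  # mg/h of PM2.5, dirtier to cleaner stoves
        c = steady_state_concentration(emission, volume_m3=30.0, aer_per_h=15.0)
        print(f"E = {emission:5.1f} mg/h -> C = {c * 1000:6.1f} ug/m^3")
    ```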

  12. Evaluating IPMN and pancreatic carcinoma utilizing quantitative histopathology.

    PubMed

    Glazer, Evan S; Zhang, Hao Helen; Hill, Kimberly A; Patel, Charmi; Kha, Stephanie T; Yozwiak, Michael L; Bartels, Hubert; Nafissi, Nellie N; Watkins, Joseph C; Alberts, David S; Krouse, Robert S

    2016-10-01

    Intraductal papillary mucinous neoplasms (IPMN) are pancreatic lesions with uncertain biologic behavior. This study sought objective, accurate prediction tools, through the use of quantitative histopathological signatures of nuclear images, for classifying lesions as chronic pancreatitis (CP), IPMN, or pancreatic carcinoma (PC). Forty-four pancreatic resection patients were retrospectively identified for this study (12 CP; 16 IPMN; 16 PC). Regularized multinomial regression quantitatively classified each specimen as CP, IPMN, or PC in an automated, blinded fashion. Classification certainty was determined by subtracting the smallest classification probability from the largest probability (of the three groups). The certainty function varied from 1.0 (perfectly classified) to 0.0 (random). From each lesion, 180 ± 22 nuclei were imaged. Overall classification accuracy was 89.6% with six unique nuclear features. No CP cases were misclassified, 1/16 IPMN cases were misclassified, and 4/16 PC cases were misclassified. Certainty function was 0.75 ± 0.16 for correctly classified lesions and 0.47 ± 0.10 for incorrectly classified lesions (P = 0.0005). Uncertainty was identified in four of the five misclassified lesions. Quantitative histopathology provides a robust, novel method to distinguish among CP, IPMN, and PC with a quantitative measure of uncertainty. This may be useful when there is uncertainty in diagnosis.
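
    The certainty function is fully specified in the abstract and amounts to a one-liner; the sketch below simply restates it, with probability vectors invented for illustration.

    ```python
    def classification_certainty(probabilities):
        """Largest class probability minus smallest: 1.0 for a perfectly
        classified lesion, approaching 0.0 for a random three-way call."""
        return max(probabilities) - min(probabilities)

    print(classification_certainty([0.85, 0.10, 0.05]))  # confident: 0.80
    print(classification_certainty([0.34, 0.33, 0.33]))  # near-random: ~0.01
    ```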

  13. Gender, Math Confidence, and Grit: Relationships with Quantitative Skills and Performance in an Undergraduate Biology Course

    PubMed Central

    Flanagan, K. M.; Einarson, J.

    2017-01-01

    In a world filled with big data, mathematical models, and statistics, the development of strong quantitative skills is becoming increasingly critical for modern biologists. Teachers in this field must understand how students acquire quantitative skills and explore barriers experienced by students when developing these skills. In this study, we examine the interrelationships among gender, grit, and math confidence for student performance on a pre–post quantitative skills assessment and overall performance in an undergraduate biology course. Here, we show that females significantly underperformed relative to males on a quantitative skills assessment at the start of term. However, females showed significantly higher gains over the semester, such that the gender gap in performance was nearly eliminated by the end of the semester. Math confidence plays an important role in the performance on both the pre and post quantitative skills assessments and overall performance in the course. The effect of grit on student performance, however, is mediated by a student’s math confidence; as math confidence increases, the positive effect of grit decreases. Consequently, the positive impact of a student’s grittiness is observed most strongly for those students with low math confidence. We also found grit to be positively associated with the midterm score and the final grade in the course. Given the relationships established in this study among gender, grit, and math confidence, we provide “instructor actions” from the literature that can be applied in the classroom to promote the development of quantitative skills in light of our findings. PMID:28798209

  14. Impact of quantitative feedback and benchmark selection on radiation use by cardiologists performing cardiac angiography.

    PubMed

    Smith, Ian R; Cameron, James; Brighouse, Russell D; Ryan, Claire M; Foster, Kelley A; Rivers, John T

    2013-06-01

Audit of and feedback on both group and individual data, provided immediately after the point of care and compared with realistic benchmarks of excellence, have been demonstrated to drive change. This study sought to evaluate the impact of immediate benchmarked quantitative case-based performance feedback on the clinical practice of cardiologists practicing at a private hospital in Brisbane, Australia. The participating cardiologists were assigned to one of two groups: Group 1 received patient and procedural details for review, and Group 2 received the Group 1 data plus detailed radiation data relating to the procedures and comparative benchmarks. In Group 2, Linear-by-Linear Association analysis suggests a link between change in radiation use and initial radiation dose category (p=0.014), with only those initially 'challenged' by the benchmarks showing improvement; those starting well below the benchmarks showed the greatest increase in radiation use. Conversely, those blinded to their radiation use (Group 1) showed a general improvement in radiation use throughout the study, with those initially performing close to the benchmarks showing the greatest improvement. This study shows that the use of non-challenging benchmarks in case-based radiation risk feedback does not promote a reduction in radiation use; indeed, it may contribute to increased doses. Paradoxically, cardiologists who are aware of performance monitoring but blinded to individual case data appear to maintain, if not reduce, their radiation use.

  15. Evaluation of static and dynamic perfusion cardiac computed tomography for quantitation and classification tasks.

    PubMed

    Bindschadler, Michael; Modgil, Dimple; Branch, Kelley R; La Riviere, Patrick J; Alessio, Adam M

    2016-04-01

Cardiac computed tomography (CT) acquisitions for perfusion assessment can be performed in a dynamic or static mode. Either method may be used for a variety of clinical tasks, including (1) stratifying patients into categories of ischemia and (2) using a quantitative myocardial blood flow (MBF) estimate to evaluate disease severity. In this simulation study, we compare method performance on these classification and quantification tasks for matched radiation dose levels and for different flow states, patient sizes, and injected contrast levels. Under the conditions simulated, the dynamic method has low bias in MBF estimates (0 to [Formula: see text]) compared to linearly interpreted static assessment (0.45 to [Formula: see text]), making it more suitable for quantitative estimation. At matched radiation dose levels, receiver operating characteristic analysis demonstrated that the static method, with its high bias but generally lower variance, had superior performance ([Formula: see text]) in stratifying patients, especially for larger patients and lower contrast doses (area under the curve [Formula: see text] to 0.96 versus 0.86). We also demonstrate that static assessment with a correctly tuned exponential relationship between the apparent CT number and MBF has superior quantification performance to static assessment with a linear relationship and to dynamic assessment. However, tuning the exponential relationship to the patient and scan characteristics will likely prove challenging. This study demonstrates that the selection and optimization of static or dynamic acquisition modes should depend on the specific clinical task.
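
    The bias behaviour described for linearly interpreted static assessment can be illustrated with a toy saturating enhancement model; the functional form and constants below are assumptions for illustration, not the paper's fitted values:

        import numpy as np

        # Assumed ground-truth flows (mL/min/g) and a saturating enhancement
        # curve: apparent CT enhancement rises sublinearly with flow.
        mbf_true = np.linspace(0.5, 3.0, 6)
        enhancement = 100.0 * (1.0 - np.exp(-mbf_true / 1.5))  # HU, hypothetical

        # Linear static estimate: scale enhancement by one calibration factor.
        cal = mbf_true[0] / enhancement[0]
        mbf_linear = cal * enhancement

        # Exponential static estimate: invert the assumed saturating relation.
        mbf_exp = -1.5 * np.log(1.0 - enhancement / 100.0)

        for t, l, e in zip(mbf_true, mbf_linear, mbf_exp):
            print(f"true {t:.2f}  linear {l:.2f} (bias {l - t:+.2f})  exp {e:.2f}")

    The linear estimate increasingly underestimates high flows, which mirrors the bias the study attributes to linearly interpreted static assessment.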

  16. The role of quantitative safety evaluation in regulatory decision making of drugs.

    PubMed

    Chakravarty, Aloka G; Izem, Rima; Keeton, Stephine; Kim, Clara Y; Levenson, Mark S; Soukup, Mat

    2016-01-01

    Evaluation of safety is a critical component of drug review at the US Food and Drug Administration (FDA). Statisticians are playing an increasingly visible role in quantitative safety evaluation and regulatory decision-making. This article reviews the history and the recent events relating to quantitative drug safety evaluation at the FDA. The article then focuses on five active areas of quantitative drug safety evaluation and the role Division of Biometrics VII (DBVII) plays in these areas, namely meta-analysis for safety evaluation, large safety outcome trials, post-marketing requirements (PMRs), the Sentinel Initiative, and the evaluation of risk from extended/long-acting opioids. This article will focus chiefly on developments related to quantitative drug safety evaluation and not on the many additional developments in drug safety in general.

  17. Evaluation of chemotherapy response in ovarian cancer treatment using quantitative CT image biomarkers: a preliminary study

    NASA Astrophysics Data System (ADS)

    Qiu, Yuchen; Tan, Maxine; McMeekin, Scott; Thai, Theresa; Moore, Kathleen; Ding, Kai; Liu, Hong; Zheng, Bin

    2015-03-01

The purpose of this study is to identify and apply quantitative image biomarkers for early prediction of tumor response to chemotherapy among ovarian cancer patients who participated in clinical trials testing new drugs. In the experiment, we retrospectively selected 30 cases from patients who participated in Phase I clinical trials of new drugs or drug agents for ovarian cancer treatment. Each case comprised two sets of CT images acquired pre- and post-treatment (4-6 weeks after starting treatment). A computer-aided detection (CAD) scheme was developed to extract and analyze the quantitative image features of the metastatic tumors previously tracked by the radiologists using the standard Response Evaluation Criteria in Solid Tumors (RECIST) guideline. The CAD scheme first segmented 3-D tumor volumes from the background using a hybrid tumor segmentation scheme. Then, for each segmented tumor, CAD computed three quantitative image features: the change in tumor volume, tumor CT number (density) and density variance. The feature changes were calculated between the matched tumors tracked on the CT images acquired pre- and post-treatment. Finally, CAD predicted each patient's 6-month progression-free survival (PFS) using a decision-tree based classifier. The performance of the CAD scheme was compared with the RECIST category. The results show that the CAD scheme achieved a prediction accuracy of 76.7% (23/30 cases) with a Kappa coefficient of 0.493, which is significantly higher than the performance of RECIST prediction, with a prediction accuracy and Kappa coefficient of 60% (17/30) and 0.062, respectively. This study demonstrated the feasibility of analyzing quantitative image features to improve the accuracy of early prediction of tumor response to new drugs or therapeutic methods for ovarian cancer patients.
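
    The agreement statistics used to compare predictions against outcome (accuracy and Cohen's kappa) can be reproduced with scikit-learn; the label vectors below are placeholders, not the study data:

        from sklearn.metrics import cohen_kappa_score

        # Hypothetical binary labels: 1 = progression-free at 6 months, 0 = not.
        truth = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
        cad_pred = [1, 1, 0, 1, 0, 1, 1, 0, 0, 1]

        accuracy = sum(t == p for t, p in zip(truth, cad_pred)) / len(truth)
        kappa = cohen_kappa_score(truth, cad_pred)  # chance-corrected agreement
        print(f"accuracy {accuracy:.2f}, kappa {kappa:.3f}")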

  18. Toward objective and quantitative evaluation of imaging systems using images of phantoms.

    PubMed

    Gagne, Robert M; Gallas, Brandon D; Myers, Kyle J

    2006-01-01

The use of imaging phantoms is a common method of evaluating image quality in the clinical setting. These evaluations rely on a subjective decision by a human observer with respect to the faintest detectable signal(s) in the image. Because of the variable and subjective nature of the human-observer scores, the evaluations manifest a lack of precision and a potential for bias. The advent of digital imaging systems with their inherent digital data provides the opportunity to use techniques that do not rely on human-observer decisions and thresholds. Using the digital data, signal-detection theory (SDT) provides the basis for more objective and quantitative evaluations which are independent of a human-observer decision threshold. In a SDT framework, the evaluation of imaging phantoms represents a "signal-known-exactly/background-known-exactly" ("SKE/BKE") detection task. In this study, we compute the performance of prewhitening and nonprewhitening model observers in terms of the observer signal-to-noise ratio (SNR) for these "SKE/BKE" tasks. We apply the evaluation methods to a number of imaging systems. For example, we use data from a laboratory implementation of digital radiography and from a full-field digital mammography system in a clinical setting. In addition, we make a comparison of our methods to human-observer scoring of a set of digital images of the CDMAM phantom available from the internet (EUREF-European Reference Organization). In the latter case, we show a significant increase in the precision of the quantitative methods versus the variability in the scores from human observers on the same set of images. As regards bias, the performance of a model observer estimated from a finite data set is known to be biased. In this study, we minimize the bias and estimate the variance of the observer SNR using statistical resampling techniques, namely, "bootstrapping" and "shuffling" of the data sets. Our methods provide objective and quantitative evaluation of imaging systems using images of phantoms.
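
    For an SKE/BKE task, the prewhitening and nonprewhitening observer SNRs have standard closed forms, SNR_PW^2 = s^T K^{-1} s and SNR_NPW^2 = (s^T s)^2 / (s^T K s), with s the known signal and K the noise covariance; a small numpy sketch under these textbook definitions (the signal profile and covariance are illustrative):

        import numpy as np

        n = 16                                   # pixels in a (flattened) ROI
        s = np.exp(-0.5 * ((np.arange(n) - n / 2) / 2.0) ** 2)  # known signal

        # Assumed stationary correlated noise covariance (AR(1)-like, illustrative).
        idx = np.arange(n)
        K = 0.5 ** np.abs(idx[:, None] - idx[None, :])

        snr_pw = np.sqrt(s @ np.linalg.solve(K, s))   # prewhitening observer
        snr_npw = (s @ s) / np.sqrt(s @ K @ s)        # nonprewhitening observer
        print(f"SNR_PW = {snr_pw:.2f}, SNR_NPW = {snr_npw:.2f}")  # PW >= NPW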

  19. An adaptive model approach for quantitative wrist rigidity evaluation during deep brain stimulation surgery.

    PubMed

    Assis, Sofia; Costa, Pedro; Rosas, Maria Jose; Vaz, Rui; Silva Cunha, Joao Paulo

    2016-08-01

Intraoperative evaluation of the efficacy of Deep Brain Stimulation (DBS) includes evaluation of its effect on rigidity. A subjective semi-quantitative scale is used, dependent on the examiner's perception and experience. A system was proposed previously that aimed to tackle this subjectivity, using quantitative data and providing real-time feedback of the computed rigidity reduction, thus supporting the physician's decision. The system comprised a gyroscope-based motion sensor in a textile band placed on the patient's hand, which communicated its measurements to a laptop. The latter computed a signal descriptor from the angular velocity of the hand during wrist flexion in DBS surgery. The first approach relied on a general rigidity-reduction model, regardless of the initial severity of the symptom. Thus, to enhance the performance of the previously presented system, we aimed to develop separate models for high and low baseline rigidity, according to the examiner's assessment before any stimulation, allowing a more patient-oriented approach. Additionally, usability was improved by having in situ processing on a smartphone instead of a computer. The system proved reliable, presenting an accuracy of 82.0% and a mean error of 3.4%. Relative to previous results, the performance was similar, further supporting the importance of considering cogwheel rigidity to better infer the reduction in rigidity. Overall, we present a simple, wearable, mobile system, suitable for intraoperative conditions during DBS, that supports the physician in decision-making when setting stimulation parameters.

  20. Evaluation of Satellite Quantitative Precipitation Estimates (QPEs) Products

    NASA Astrophysics Data System (ADS)

    Prat, O. P.; Nelson, B. R.

    2016-12-01

In this work, we conduct a long-term assessment of different satellite-based precipitation products from the Reference Environmental Data Records (PERSIANN-CDR, GPCP, CMORPH-CDR) and from the PMM/GPM suite of products (TMPA, TMPA-RT, IMERG). PERSIANN-CDR is a 30-year record of daily adjusted global precipitation. GPCP is an approximately 30-year record of monthly and pentad adjusted global precipitation and a 17-year record of daily adjusted global precipitation. CMORPH-CDR is a 17-year record of daily and sub-daily adjusted global precipitation. The product intercomparisons are performed at various temporal and spatial scales over the concurrent period of record. The evaluation of the different products will include trend analysis and comparison with in-situ data sets from the Global Historical Climatology Network (GHCN-Daily). In addition, we will compare the datasets' ability to capture global precipitation patterns and local extreme precipitation events in order to derive a detailed picture of each product's strengths and weaknesses.

  1. Taste characteristics based quantitative and qualitative evaluation of ginseng adulteration.

    PubMed

    Cui, Shaoqing; Yang, Liangcheng; Wang, Jun; Wang, Xinlei

    2015-05-01

Adulteration of American ginseng with Asian ginseng is common and has caused much harm to consumers. Panel evaluation is commonly used to determine their differences, but it is subjective. Chemical instruments can identify critical compounds but are time-consuming and expensive. Therefore, a fast, accurate and convenient method is required. A taste sensing system, combining the advantages of the above two technologies, provides a novel potential technology for determining ginseng adulteration. The aim was to build appropriate models to distinguish and predict ginseng adulteration using taste characteristics. It was found that ginsenoside contents decreased linearly (R² = 0.92) with the mixing ratio. A biplot of principal component analysis showed good performance in classifying samples, with the first two principal components explaining 89.7% of the variance, and it was the bitterness, astringency, aftertaste of bitterness and astringency, and saltiness that led to the successful determination. After factor screening, bitterness, astringency, aftertaste of bitterness and saltiness were employed to build latent-variable models. Bitterness, astringency and aftertaste of bitterness were demonstrated to be most effective in predicting the adulteration ratio, while bitterness and aftertaste of bitterness turned out to be most effective in ginsenoside content prediction. Taste characteristics of adulterated ginsengs, considered as a taste fingerprint, can provide novel guidance for determining the adulteration of American and Asian ginseng. © 2014 Society of Chemical Industry.

  2. Technology Efficacy in Active Prosthetic Knees for Transfemoral Amputees: A Quantitative Evaluation

    PubMed Central

    El-Sayed, Amr M.; Abu Osman, Noor Azuan

    2014-01-01

Several studies have presented technological ensembles of active knee systems for transfemoral prostheses. Other studies have examined amputees' gait performance while wearing a specific active prosthesis. This paper combined both insights: a technical examination of the components used and an evaluation of how these improved the gait of the respective users. This study aims to offer a quantitative understanding of the potential enhancement derived from strategic integration of core elements in developing an effective device. The study systematically discussed the current technology in active transfemoral prostheses with respect to its functional walking performance among above-knee amputee users, to evaluate each system's efficacy in producing close-to-normal user performance. The performance of the actuator, sensory system, and control technique incorporated in each reported system was evaluated separately, and numerical comparisons were conducted based on the percentage of amputees' gait deviation from normal gait profile points. The results identified particular components that contributed closest to normal gait parameters. However, the conclusions are of limited generalizability due to the small number of studies. Thus, more clinical validation of active prosthetic knee technology is needed to better understand the extent of each component's contribution to the most functional development. PMID:25110727

  3. Technology efficacy in active prosthetic knees for transfemoral amputees: a quantitative evaluation.

    PubMed

    El-Sayed, Amr M; Hamzaid, Nur Azah; Abu Osman, Noor Azuan

    2014-01-01

Several studies have presented technological ensembles of active knee systems for transfemoral prostheses. Other studies have examined amputees' gait performance while wearing a specific active prosthesis. This paper combined both insights: a technical examination of the components used and an evaluation of how these improved the gait of the respective users. This study aims to offer a quantitative understanding of the potential enhancement derived from strategic integration of core elements in developing an effective device. The study systematically discussed the current technology in active transfemoral prostheses with respect to its functional walking performance among above-knee amputee users, to evaluate each system's efficacy in producing close-to-normal user performance. The performance of the actuator, sensory system, and control technique incorporated in each reported system was evaluated separately, and numerical comparisons were conducted based on the percentage of amputees' gait deviation from normal gait profile points. The results identified particular components that contributed closest to normal gait parameters. However, the conclusions are of limited generalizability due to the small number of studies. Thus, more clinical validation of active prosthetic knee technology is needed to better understand the extent of each component's contribution to the most functional development.

  4. Method of quantitative evaluation of decision maker's preferences

    NASA Astrophysics Data System (ADS)

    Duchaczek, Artur; Skorupka, Dariusz

    2017-07-01

In the optimisation process, the validity coefficients γi (the so-called weights) of each criterion make it possible to take into account the individual preferences of a decision maker. This paper presents an original method of calculating these coefficients. The application of the presented method permits quantitative consideration of the actual preferences of a decision maker on the basis of a fairly simple calculation method, rather than relying on the decision maker's intuition alone.

  5. Review of progress in quantitative NDE. [Nondestructive Evaluation (NDE)

    SciTech Connect

    Not Available

    1991-01-01

This booklet is composed of abstracts from papers submitted at a meeting on quantitative NDE. A multitude of topics are discussed, including analysis of composite materials, NMR uses, x-ray instruments and techniques, manufacturing uses, neural networks, eddy currents, stress measurements, magnetic materials, adhesive bonds, signal processing, NDE of mechanical structures, tomography, defect sizing, NDE of plastics and ceramics, new techniques, optical and electromagnetic techniques, and nonlinear techniques. (GHH)

  6. Conductor gestures influence evaluations of ensemble performance

    PubMed Central

    Morrison, Steven J.; Price, Harry E.; Smedley, Eric M.; Meals, Cory D.

    2014-01-01

    Previous research has found that listener evaluations of ensemble performances vary depending on the expressivity of the conductor’s gestures, even when performances are otherwise identical. It was the purpose of the present study to test whether this effect of visual information was evident in the evaluation of specific aspects of ensemble performance: articulation and dynamics. We constructed a set of 32 music performances that combined auditory and visual information and were designed to feature a high degree of contrast along one of two target characteristics: articulation and dynamics. We paired each of four music excerpts recorded by a chamber ensemble in both a high- and low-contrast condition with video of four conductors demonstrating high- and low-contrast gesture specifically appropriate to either articulation or dynamics. Using one of two equivalent test forms, college music majors and non-majors (N = 285) viewed sixteen 30 s performances and evaluated the quality of the ensemble’s articulation, dynamics, technique, and tempo along with overall expressivity. Results showed significantly higher evaluations for performances featuring high rather than low conducting expressivity regardless of the ensemble’s performance quality. Evaluations for both articulation and dynamics were strongly and positively correlated with evaluations of overall ensemble expressivity. PMID:25104944

  7. Quantitative performance of a quadrupole-orbitrap-MS in targeted LC-MS determinations of small molecules.

    PubMed

    Grund, Baptiste; Marvin, Laure; Rochat, Bertrand

    2016-05-30

High-resolution mass spectrometry (HRMS) has traditionally been associated with qualitative and research analysis, and QQQ-MS with quantitative and routine analysis. This view is now challenged, and for this reason we have evaluated the quantitative LC-MS performance of a new high-resolution mass spectrometer, a Q-orbitrap-MS, and compared the results with those obtained on a recent triple-quadrupole MS (QQQ-MS). High-resolution full-scan (HR-FS) and MS/MS acquisitions were tested with real plasma extracts or pure standards. Limits of detection, dynamic range, mass accuracy and false positive or false negative detections were determined or investigated with protease inhibitors, tyrosine kinase inhibitors, steroids and metanephrines. Our quantitative results show that today's available HRMS instruments are reliable and sensitive quantitative tools whose performance is comparable to that of QQQ-MS. Taking into account their versatility, user-friendliness and robustness, we believe that HRMS should be seen more and more as key instruments in quantitative LC-MS analyses. In this scenario, most targeted LC-HRMS analyses should be performed by HR-FS recording of virtually "all" ions. In addition to absolute quantifications, HR-FS will allow the relative quantification of hundreds of metabolites in plasma, revealing an individual's metabolome and exposome. This phenotyping of known metabolites should promote HRMS in the clinical environment. A few other LC-HRMS analyses should be performed in single-ion-monitoring or MS/MS mode when increased sensitivity and/or detection selectivity is necessary.

  8. A quantitative approach to evaluate image quality of whole slide imaging scanners

    PubMed Central

    Shrestha, Prarthana; Kneepkens, R.; Vrijnsen, J.; Vossen, D.; Abels, E.; Hulsken, B.

    2016-01-01

Context: The quality of images produced by whole slide imaging (WSI) scanners has a direct influence on readers' performance and the reliability of the clinical diagnosis. Therefore, WSI scanners should produce not only high-quality but also consistent-quality images. Aim: We aim to evaluate the reproducibility of WSI scanners based on the quality of images produced over time and among multiple scanners. The evaluation is independent of the content or context of the test specimen. Methods: The ultimate judge of image quality is a pathologist; however, subjective evaluations are heavily influenced by the complexity of a case, and subtle variations introduced by a scanner can be easily overlooked. Therefore, we employed a quantitative image quality assessment method based on clinically relevant parameters, such as sharpness and brightness, acquired in a survey of pathologists. The acceptable level of quality per parameter was determined in a subjective study. The evaluation of scanner reproducibility was conducted with Philips Ultra-Fast Scanners. A set of 36 HercepTest™ slides was used in three sub-studies addressing variations due to systems and time, producing 8640 test images for evaluation. Results: The results showed that the majority of images in all the sub-studies are within the acceptable quality level; however, some scanners produce higher-quality images more often than others. The results are independent of case types, and they match our perception of quality. Conclusion: The quantitative image quality assessment method was successfully applied to the HercepTest™ slides to evaluate WSI scanner reproducibility. The proposed method is generic and applicable to any other types of slide stains and scanners. PMID:28197359
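
    One plausible way to score the two example parameters named above on a scanned tile is sketched below, using a variance-of-Laplacian focus measure for sharpness and mean luminance for brightness; the acceptance thresholds are hypothetical, not the values derived from the pathologist survey:

        import numpy as np
        from scipy.ndimage import laplace

        def tile_quality(gray_tile, sharp_min=50.0, bright_range=(0.55, 0.95)):
            """gray_tile: 2-D float array in [0, 1]. Thresholds are hypothetical."""
            sharpness = laplace(gray_tile).var() * 255.0 ** 2  # focus measure
            brightness = gray_tile.mean()                      # mean luminance
            ok = (sharpness >= sharp_min
                  and bright_range[0] <= brightness <= bright_range[1])
            return sharpness, brightness, ok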

  9. Quantitative evaluation of personal exposure to UV radiation of workers and general public.

    PubMed

    Sisto, R; Borra, M; Casale, G R; Militello, A; Siani, A M

    2009-12-01

Owing to the variability of meteorological conditions and of exposure patterns, which can differ greatly over a working day, the use of personal dosemeters can be necessary to obtain a correct quantitative evaluation of the radiation dose absorbed by an exposed worker. Different classes of personal dosemeters exist, among them electronic dosemeters and polysulphone film dosemeters. An experimental campaign conducted in a cultivated area of Tuscany is presented, and some aspects of an experimental campaign performed on a population of volunteers on a central Italy beach near Rome are discussed. The aim of the present work is to show some relevant issues in a dosimetric approach to the exposure evaluation of outdoor workers and, in general, of the public during recreational activities.

  10. Quantitative Guidance for Stove Usage and Performance to Achieve Health and Environmental Targets

    PubMed Central

    Chiang, Ranyee A.

    2015-01-01

Background Displacing the use of polluting and inefficient cookstoves in developing countries is necessary to achieve the potential health and environmental benefits sought through clean cooking solutions. Yet little quantitative context has been provided on how much displacement of traditional technologies is needed to achieve targets for household air pollutant concentrations or fuel savings. Objectives This paper provides instructive guidance on the usage of cooking technologies required to achieve health and environmental improvements. Methods We evaluated different scenarios of displacement of traditional stoves with use of higher performing technologies. The air quality and fuel consumption impacts were estimated for these scenarios using a single-zone box model of indoor air quality and ratios of thermal efficiency. Results Stove performance and usage should be considered together, as lower performing stoves can result in similar or greater benefits than a higher performing stove if the lower performing stove has considerably higher displacement of the baseline stove. Based on the indoor air quality model, there are multiple performance–usage scenarios for achieving modest indoor air quality improvements. To meet World Health Organization guidance levels, however, three-stone fire and basic charcoal stove usage must be nearly eliminated to achieve the particulate matter target (< 1–3 hr/week), and substantially limited to meet the carbon monoxide guideline (< 7–9 hr/week). Conclusions Moderate health gains may be achieved with various performance–usage scenarios. The greatest benefits are estimated to be achieved by near-complete displacement of traditional stoves with clean technologies, emphasizing the need to shift in the long term to near exclusive use of clean fuels and stoves. The performance–usage scenarios are also provided as a tool to guide technology selection and prioritize behavior change opportunities to maximize impact.
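
    At steady state, the single-zone box model mentioned in the Methods reduces to C = E/(aV): concentration equals emission rate divided by air-exchange rate times room volume. A sketch with illustrative, assumed parameters (emission rates, hours of use, kitchen size are not the paper's inputs):

        # Steady-state single-zone box model: C = E / (a * V)
        # E: emission rate (mg/h), a: air-exchange rate (1/h), V: volume (m^3).

        def steady_state_conc(E_mg_per_h, aer_per_h, volume_m3):
            return E_mg_per_h / (aer_per_h * volume_m3) * 1000.0  # ug/m^3

        # Usage-weighted emission for two stoves sharing the cooking burden:
        hours = {"three_stone": 2.0, "clean_stove": 12.0}       # h/week
        emission = {"three_stone": 800.0, "clean_stove": 50.0}  # mg PM2.5 / h
        mean_E = sum(hours[s] * emission[s] for s in hours) / sum(hours.values())
        print(f"usage-weighted emission: {mean_E:.0f} mg/h")
        print(f"steady-state PM2.5: {steady_state_conc(mean_E, 15.0, 30.0):.0f} ug/m^3")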

  11. Methodology for Evaluation of Diagnostic Performance

    SciTech Connect

    Metz, Charles E.

    2003-02-19

The proliferation of expensive technology in diagnostic medicine demands objective, meaningful assessments of diagnostic performance. Receiver Operating Characteristic (ROC) analysis is now recognized widely as the best approach to the task of measuring and specifying diagnostic accuracy (Metz, 1978; Swets and Pickett, 1982; Beck and Schultz, 1986; Metz, 1986; Hanley, 1989; Zweig and Campbell, 1993), which is defined as the extent to which diagnoses agree with actual states of health or disease (Fryback and Thornbury, 1991; National Council on Radiation Protection and Measurements, 1995). The primary advantage of ROC analysis over alternative methodologies is that it separates differences among diagnostic decisions that are due to actual differences in discrimination capacity from those that are due to decision-threshold effects (e.g., "under-reading" or "over-reading"). An ROC curve measures diagnostic accuracy by displaying True Positive Fraction (TPF: the fraction of patients actually having the disease in question that is diagnosed correctly as "positive") as a function of False Positive Fraction (FPF: the fraction of patients actually without the disease that is diagnosed incorrectly as "positive"). Different points on the ROC curve, i.e., different compromises between the specificity and the sensitivity of a diagnostic test for a given inherent accuracy, can be achieved by adopting different critical values of the diagnostic test's "decision variable", e.g., the observer's degree of confidence that each case is positive or negative in a diagnostic image-reading task, or the numerical value of the result of a quantitative diagnostic test. ROC techniques have been used to measure and specify the diagnostic performance of medical imaging systems since the early 1970s, and the needs that arise in this application have spurred a variety of new methodological developments. In particular, substantial progress has been made in ROC curve fitting and in
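
    The TPF-versus-FPF construction is straightforward to trace from a set of confidence scores; a minimal empirical-ROC sketch with synthetic scores (no binormal fitting, purely illustrative):

        import numpy as np

        def empirical_roc(scores, truth):
            """truth: 1 = diseased, 0 = non-diseased; higher score = more suspicious."""
            thresholds = np.sort(np.unique(scores))[::-1]
            tpf = np.array([(scores[truth == 1] >= t).mean() for t in thresholds])
            fpf = np.array([(scores[truth == 0] >= t).mean() for t in thresholds])
            return np.concatenate(([0.0], fpf)), np.concatenate(([0.0], tpf))

        rng = np.random.default_rng(1)
        truth = rng.integers(0, 2, 200)
        scores = truth + rng.normal(0.0, 1.0, 200)  # synthetic decision variable
        fpf, tpf = empirical_roc(scores, truth)
        print(f"AUC = {np.trapz(tpf, fpf):.2f}")    # area under the empirical curve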

  12. Performance Evaluation of Undulator Radiation at CEBAF

    SciTech Connect

    Chuyu Liu, Geoffrey Krafft, Guimei Wang

    2010-05-01

    The performance of undulator radiation (UR) at CEBAF with a 3.5 m helical undulator is evaluated and compared with APS undulator-A radiation in terms of brilliance, peak brilliance, spectral flux, flux density and intensity distribution.

  13. ATAMM enhancement and multiprocessor performance evaluation

    NASA Technical Reports Server (NTRS)

    Stoughton, John W.; Mielke, Roland R.; Som, Sukhamoy; Obando, Rodrigo; Malekpour, Mahyar R.; Jones, Robert L., III; Mandala, Brij Mohan V.

    1991-01-01

    ATAMM (Algorithm To Architecture Mapping Model) enhancement and multiprocessor performance evaluation is discussed. The following topics are included: the ATAMM model; ATAMM enhancement; ADM (Advanced Development Model) implementation of ATAMM; and ATAMM support tools.

  14. Promising quantitative nondestructive evaluation techniques for composite materials

    NASA Technical Reports Server (NTRS)

    Williams, J. H., Jr.; Lee, S. S.

    1985-01-01

    Some recent results in the area of the ultrasonic, acoustic emission, thermographic, and acousto-ultrasonic NDE of composites are reviewed. In particular, attention is given to the progress in the use of ultrasonic attenuation, acoustic emission (parameter) delay, liquid-crystal thermography, and the stress wave factor in structural integrity monitoring of composite materials. The importance of NDE flaw significance characterizations is emphasized since such characterizations can directly indicate the appropriate NDE technique sensitivity requirements. The role of the NDE of flawed composites with and without overt defects in establishing quantitative accept/reject criteria for structural integrity assessment is discussed.

  15. Quantitative evaluation of atrial radio frequency ablation using intracardiac shear-wave elastography.

    PubMed

    Kwiecinski, Wojciech; Provost, Jean; Dubois, Rémi; Sacher, Frédéric; Haïssaguerre, Michel; Legros, Mathieu; Nguyen-Dinh, An; Dufait, Rémi; Tanter, Mickaël; Pernot, Mathieu

    2014-11-01

Radio frequency catheter ablation (RFCA) is a well-established clinical procedure for the treatment of atrial fibrillation (AF) but suffers from a low single-procedure success rate. Recurrence of AF is most likely attributable to discontinuous or nontransmural ablation lesions. Yet, despite this urgent clinical need, there is no clinically available imaging modality that can reliably map the lesion transmural extent in real time. In this study, the authors demonstrated the feasibility of shear-wave elastography (SWE) to map quantitatively the stiffness of RFCA-induced thermal lesions in cardiac tissues in vitro and in vivo using an intracardiac transducer array. SWE was first validated in ex vivo porcine ventricular samples (N = 5). Both B-mode imaging and SWE were performed on normal cardiac tissue before and after RFCA. Areas of the lesions were determined by tissue color change with gross pathology and compared against the SWE stiffness maps. SWE was then performed in vivo in three sheep (N = 3). First, the stiffness of normal atrial tissues was assessed quantitatively, as well as its variation during the cardiac cycle. SWE was then performed in atrial tissue after RFCA. A large increase in stiffness was observed in the ablated ex vivo regions (average shear modulus across samples in normal tissue: 22 ± 5 kPa, average shear-wave speed (ct): 4.5 ± 0.4 m s(-1); in the identified ablated zones: 99 ± 17 kPa, average ct: 9.0 ± 0.5 m s(-1), for a mean shear modulus increase ratio of 4.5 ± 0.9). In vivo, a threefold increase of the shear modulus was measured in the ablated regions, and the lesion extension was clearly visible on the stiffness maps. Owing to its quantitative and real-time capabilities, intracardiac SWE is a promising intraoperative imaging technique for the evaluation of thermal ablation during RFCA.
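
    The stiffness values above follow the standard elastography relation μ = ρ·ct², with tissue density ρ assumed near 1000 kg/m³; a short check against the reported shear-wave speeds (the density is an assumption, and the reported moduli will not match exactly):

        RHO = 1000.0  # kg/m^3, assumed soft-tissue density

        def shear_modulus_kpa(ct_m_per_s):
            """mu = rho * ct^2: dynamic shear modulus from shear-wave speed."""
            return RHO * ct_m_per_s ** 2 / 1000.0

        print(shear_modulus_kpa(4.5))  # ~20 kPa, normal tissue (reported ~22 kPa)
        print(shear_modulus_kpa(9.0))  # ~81 kPa, ablated zone (reported ~99 kPa)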

  16. Improvement of Automotive Part Supplier Performance Evaluation

    NASA Astrophysics Data System (ADS)

    Kongmunee, Chalermkwan; Chutima, Parames

    2016-05-01

This research investigates the problem of part supplier performance evaluation in a major Japanese automotive plant in Thailand. The plant's current evaluation scheme is based on the experience and personal opinions of the evaluators. As a result, many poorly performing suppliers are still rated as good suppliers and are allowed to supply parts to the plant without any obligation to improve. To alleviate this problem, brainstorming sessions among stakeholders and evaluators were formally conducted, yielding an appropriate set of evaluation criteria and sub-criteria. The analytic hierarchy process is also used to find suitable weights for each criterion and sub-criterion. The results show that the newly developed evaluation method is significantly better than the previous one in segregating good from poor suppliers.
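
    The analytic hierarchy process step derives criterion weights from a pairwise-comparison matrix via its principal eigenvector; a minimal sketch with a hypothetical three-criterion matrix (not the plant's actual judgments):

        import numpy as np

        # Hypothetical pairwise comparisons (Saaty scale):
        # quality vs delivery vs cost.
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)                    # principal eigenvalue
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                                   # normalized weights
        ci = (eigvals.real[k] - len(A)) / (len(A) - 1) # consistency index
        print(f"weights = {np.round(w, 3)}, CI = {ci:.3f}")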

  17. Evaluation of high-performance computing software

    SciTech Connect

    Browne, S.; Dongarra, J.; Rowan, T.

    1996-12-31

The absence of unbiased and up-to-date comparative evaluations of high-performance computing software complicates a user's search for the appropriate software package. The National HPCC Software Exchange (NHSE) is attacking this problem using an approach that includes independent evaluations of software, incorporation of author and user feedback into the evaluations, and Web access to the evaluations. We are applying this approach to the Parallel Tools Library (PTLIB), a new software repository for parallel systems software and tools, and HPC-Netlib, a high-performance branch of the Netlib mathematical software repository. Updating the evaluations with feedback and making them available via the Web helps ensure accuracy and timeliness, and using independent reviewers produces unbiased comparative evaluations that are difficult to find elsewhere.

  18. Building Leadership Talent through Performance Evaluation

    ERIC Educational Resources Information Center

    Clifford, Matthew

    2015-01-01

    Most states and districts scramble to provide professional development to support principals, but "principal evaluation" is often lost amid competing priorities. Evaluation is an important method for supporting principal growth, communicating performance expectations to principals, and improving leadership practice. It provides leaders…

  19. Neglected Areas in Evaluating Writing Performance.

    ERIC Educational Resources Information Center

    Keech, Catharine Lucas

    The heavy concentration of time and funds to measure writing performance is the major reason other areas deserving scrutiny are so often neglected by evaluators. Three failings typical of the field of writing assessment as it is conducted for the purpose of program evaluation are: (1) a failure to view writing as a multiple construct; (2) a…

  20. Reference Service Standards, Performance Criteria, and Evaluation.

    ERIC Educational Resources Information Center

    Schwartz, Diane G.; Eakin, Dottie

    1986-01-01

    Describes process by which reference service standards were developed at a university medical library and their impact on the evaluation of work of librarians. Highlights include establishment of preliminary criteria, literature review, reference service standards, performance evaluation, peer review, and staff development. Checklist of reference…

  1. Automatic Singing Performance Evaluation for Untrained Singers

    NASA Astrophysics Data System (ADS)

    Cao, Chuan; Li, Ming; Wu, Xiao; Suo, Hongbin; Liu, Jian; Yan, Yonghong

In this letter, we present an automatic approach to objective singing performance evaluation for untrained singers by relating acoustic measurements to perceptual ratings of singing voice quality. Several acoustic parameters and their combination features are investigated to find objective correspondences to the perceptual evaluation criteria. Experimental results show a relatively strong correlation between perceptual ratings and the combined features, and the reliability of the proposed evaluation system is shown to be comparable to that of human judges.

  2. New Performance Metrics for Quantitative Polymerase Chain Reaction-Based Microbial Source Tracking Methods

    EPA Science Inventory

    Binary sensitivity and specificity metrics are not adequate to describe the performance of quantitative microbial source tracking methods because the estimates depend on the amount of material tested and limit of detection. We introduce a new framework to compare the performance ...

  4. Quantitative analytical method to evaluate the metabolism of vitamin D.

    PubMed

    Mena-Bravo, A; Ferreiro-Vera, C; Priego-Capote, F; Maestro, M A; Mouriño, A; Quesada-Gómez, J M; Luque de Castro, M D

    2015-03-10

A method for the quantitative analysis of vitamin D (both D2 and D3) and its main metabolites - monohydroxylated vitamin D (25-hydroxyvitamin D2 and 25-hydroxyvitamin D3) and dihydroxylated metabolites (1,25-dihydroxyvitamin D2, 1,25-dihydroxyvitamin D3 and 24,25-dihydroxyvitamin D3) - in human serum is reported here. The method is based on direct analysis of serum by an automated platform involving on-line coupling of a solid-phase extraction workstation to a liquid chromatograph-tandem mass spectrometer. Detection of the seven analytes was carried out in the selected reaction monitoring (SRM) mode, and quantitative analysis was supported by the use of stable isotopically labeled internal standards (SIL-ISs). The detection limits were between 0.3-75 pg/mL for the target compounds, while precision (expressed as relative standard deviation) was below 13.0% for between-day variability. The method was externally validated according to the vitamin D External Quality Assurance Scheme (DEQAS) through the analysis of ten serum samples provided by this scheme. The analytical features of the method support its applicability in nutritional and clinical studies targeted at elucidating the role of vitamin D metabolism.
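
    Quantification against stable-isotope-labeled internal standards reduces, per analyte, to calibrating analyte/IS response ratios against known concentrations; a generic sketch (all numbers are placeholders, not DEQAS data):

        import numpy as np

        # Calibrators: known concentrations (pg/mL) and measured analyte/SIL-IS
        # peak-area ratios (hypothetical values for illustration).
        conc = np.array([10.0, 50.0, 100.0, 500.0, 1000.0])
        ratio = np.array([0.021, 0.102, 0.198, 1.010, 1.990])

        slope, intercept = np.polyfit(conc, ratio, 1)   # linear calibration

        def quantify(sample_ratio):
            """Back-calculate concentration from a measured response ratio."""
            return (sample_ratio - intercept) / slope

        print(f"{quantify(0.55):.0f} pg/mL")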

  5. Quantitative evaluation of bioorthogonal chemistries for surface functionalization of nanoparticles.

    PubMed

    Feldborg, Lise N; Jølck, Rasmus I; Andresen, Thomas L

    2012-12-19

    We present here a highly efficient and chemoselective liposome functionalization method based on oxime bond formation between a hydroxylamine and an aldehyde-modified lipid component. We have conducted a systematic and quantitative comparison of this new approach with other state-of-the-art conjugation reactions in the field. Targeted liposomes that recognize overexpressed receptors or antigens on diseased cells have great potential in therapeutic and diagnostic applications. However, chemical modifications of nanoparticle surfaces by postfunctionalization approaches are less effective than in solution and often not high-yielding. In addition, the conjugation efficiency is often challenging to characterize and therefore not addressed in many reports. We present here an investigation of PEGylated liposomes functionalized with a neuroendocrine tumor targeting peptide (TATE), synthesized with a variety of functionalities that have been used for surface conjugation of nanoparticles. The reaction kinetics and overall yield were quantified by HPLC. Reactions were conducted in solution as well as by postfunctionalization of liposomes in order to study the effects of steric hindrance and possible affinity between the peptide and the liposome surface. These studies demonstrate the importance of choosing the correct chemistry in order to obtain a quantitative surface functionalization of liposomes.

  6. Course Evaluation. II: Interpretation of Student Performance on Evaluative Tests

    ERIC Educational Resources Information Center

    Aikenhead, Glen S.

    1974-01-01

    Reports the results of a comparative evaluation of Harvard Project Physics (HPP) and non-HPP student performance, and demonstrates the ability of a new test construction paradigm to generate valuable feedback for curriculum developers, teachers, and students. (JR)

  7. Principal Performance Areas and Principal Evaluation.

    ERIC Educational Resources Information Center

    Fletcher, Thomas E.; McInerney, William D.

    1995-01-01

    Summarizes a study that surveyed Indiana superintendents and examined their principal-evaluation instruments. Superintendents were asked which of 21 performance domains were most important and whether these were currently being assessed. Respondents generally agreed that all 21 performance domains identified by the National Policy Board for…

  8. Quantitative analysis and chromatographic fingerprinting for the quality evaluation of Scutellaria baicalensis Georgi using capillary electrophoresis.

    PubMed

    Yu, Ke; Gong, Yifei; Lin, Zhongying; Cheng, Yiyu

    2007-01-17

Quantitative analysis and chromatographic fingerprinting for the quality evaluation of the Chinese herb Scutellaria baicalensis Georgi using the capillary electrophoresis (CE) technique was developed. The separation was performed with a 50.0 cm (42.0 cm to the detector window) × 75 μm i.d. fused-silica capillary, and the CE fingerprint condition was optimized using the combination of central composite design and multivariate analysis. The optimized buffer system, containing 15 mM borate, 40 mM phosphate, 15 mM SDS, 15% (v/v) acetonitrile and 7.5% (v/v) 2-propanol, was employed for the method development, and baseline separation was achieved within 15 min. The determination of the major active components (baicalin, baicalein and wogonin) was carried out using the optimized CE condition. Good linear relationships were obtained over the investigated concentration ranges (R² values: 0.9997 for baicalin, 0.9992 for baicalein, and 0.9983 for wogonin). The average recoveries of these target components ranged between 96.1-105.6%, 98.6-105.2%, and 96.3-105.0%, respectively. CE fingerprints combined with the quantitative analysis can be used for the quality evaluation of S. baicalensis.

  9. Evaluating the performance of clinical dietitians.

    PubMed

    Gates, G E; Holdt, C S

    1993-05-01

    The purposes of this study were to describe how clinical managers evaluate the performance of clinical dietitians and to examine managers' opinions about performance appraisal. Managers from 55 acute-care hospitals in seven midwestern states responded to a telephone survey about their appraisal of the performance of clinical dietitians. Most of the clinical managers had developed criteria with written standards for evaluating performance. Respondents evaluated the dietitians once a year and relied primarily on chart audits, other work samples, and critical incidents to judge performance. Managers in 32 of the hospitals asked their subordinates to complete a self-appraisal, and almost all of the managers negotiated with the dietitians to identify goals for professional improvement. Respondents' reasons for conducting performance appraisals were indicative of a participative management style. During the interviews, many clinical managers requested help in improving their performance appraisal systems, which suggests a need for additional training in conducting performance appraisals. The findings indicate that most clinical managers were following recommended guidelines for conducting performance appraisals.

  10. Supplier Performance Evaluation and Rating System (SPEARS)

    SciTech Connect

    Oged, M.; Warner, D.; Gurbuz, E.

    1993-03-01

The SSCL Magnet Quality Assurance Department has implemented a Supplier Performance Evaluation and Rating System (SPEARS) to assess supplier performance throughout the development and production stages of the SSCL program. The main objectives of SPEARS are to promote teamwork and recognize performance. This paper examines the current implementation of SPEARS. MSD QA supports the development and production of SSC superconducting magnets while implementing the requirements of DOE Order 5700.6C. The MSD QA program is based on the concept of continuous improvement in quality and productivity. The QA program requires that procurement of items and services be controlled to assure conformance to specification. SPEARS has been implemented to meet DOE requirements and to enhance overall confidence in supplier performance. Key elements of SPEARS include supplier evaluation and selection as well as evaluation of furnished quality through source inspection, audit, and receipt inspection. These elements are described in this paper.

  11. An evaluation of recent quantitative magnetospheric magnetic field models

    NASA Technical Reports Server (NTRS)

    Walker, R. J.

    1976-01-01

Magnetospheric field models involving dipole tilt effects are discussed, with particular reference to defined-magnetopause models and boundary-surface models. The models are compared with observations and with each other whenever possible. It is shown that models containing only contributions from magnetopause and tail current systems reproduce the observed quiet-time field only qualitatively. The best quantitative agreement between models and observations occurs when currents distributed in the inner magnetosphere are added to the magnetopause and tail current systems. One region in which all the models fall short is the region around the polar cusp. Obtaining physically reasonable gradients there should have high priority in the development of future models.

  12. A New Simple Interferometer for Obtaining Quantitatively Evaluable Flow Patterns

    NASA Technical Reports Server (NTRS)

    Erdmann, S F

    1953-01-01

The method described in the present report makes it possible to obtain interferometer records with the aid of any available Schlieren optics by the addition of very simple expedients; these records need not be fundamentally inferior to those obtained by other methods, such as the Mach-Zehnder interferometer. The method is based on the fundamental concept of the phase-contrast process developed by Zernike, but has in principle been extended to such an extent that it practically represents an independent interference method for general applications. Moreover, the method offers the possibility, in case of necessity, of superposing any apparent wedge field on the density field to be gauged. The theory is explained on a purely physical basis and is illustrated and supported by experimental data. A number of typical cases are cited and some quantitative results reported.

  13. Quantitative comparison of PET performance-Siemens Biograph mCT and mMR.

    PubMed

    Karlberg, Anna M; Sæther, Oddbjørn; Eikenes, Live; Goa, Pål Erik

    2016-12-01

Integrated clinical whole-body PET/MR systems were introduced in 2010. In order to bring this technology into clinical usage, it is of great importance to compare its performance with the well-established PET/CT. The aim of this study was to evaluate PET performance, with focus on image quality, on the Siemens Biograph mMR (PET/MR) and Siemens Biograph mCT (PET/CT). A direct quantitative comparison of the performance characteristics between the mMR and mCT systems was performed according to the National Electrical Manufacturers Association (NEMA) NU 2-2007 protocol. Spatial resolution, sensitivity, count rate and image quality were evaluated. The evaluation was supplemented with additional standardized uptake value (SUV) measurements. The spatial resolution was similar for the two systems. Average sensitivity was higher for the mMR (13.3 kcps/MBq) compared to the mCT system (10.0 kcps/MBq). Peak noise equivalent count rate (NECR) was slightly higher for the mMR (196 kcps @ 24.4 kBq/mL) compared to the mCT (186 kcps @ 30.1 kBq/mL). Scatter fractions in the clinical activity concentration range were lower for the mCT (34.9%) than for the mMR (37.0%). The best image quality of the systems resulted in approximately the same mean hot sphere contrast and a difference of 19 percentage points (pp) in mean cold contrast, in favour of the mCT. In general, point spread function (PSF) modelling increased hot contrast and time of flight (TOF) increased both hot and cold contrast. The highest hot contrast for the smallest sphere (10 mm) was achieved with the combination of TOF and PSF on the mCT. Lung residual error was higher for the mMR (22%) than for the mCT (17%), with no effect of PSF. With TOF, lung residual error was reduced to 8% (mCT). SUV was accurate for both systems, but PSF caused overestimations for the 13-, 17- and 22-mm spheres. Both systems showed good performance characteristics, and the PET image quality of the mMR was close to that of the mCT.
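
    The NECR figures quoted follow the NEMA definition NECR = T²/(T + S + R), where T, S and R are the true, scattered and random coincidence rates; a sketch with illustrative rates, not the measured Biograph data:

        def necr(trues_kcps, scatter_kcps, randoms_kcps):
            """NEMA noise-equivalent count rate: T^2 / (T + S + R)."""
            total = trues_kcps + scatter_kcps + randoms_kcps
            return trues_kcps ** 2 / total

        # Illustrative rates (kcps); scatter fraction S/(S+T) ~ 37%,
        # in the range reported above.
        print(f"NECR = {necr(400.0, 230.0, 150.0):.0f} kcps")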

  14. Quantitative Percussion Diagnostics For Evaluating Bond Integrity Between Composite Laminates

    NASA Astrophysics Data System (ADS)

    Poveromo, Scott Leonard

Conventional nondestructive testing (NDT) techniques used to detect defects in composites are not able to determine intact bond integrity within a composite structure and are costly to use on large and complex-shaped surfaces. To overcome current NDT limitations, a new technology was utilized based on quantitative percussion diagnostics (QPD) to better quantify bond quality in fiber-reinforced composite materials. Experimental results indicate that this technology is capable of detecting 'kiss' bonds (very low adhesive shear strength), caused by the application of release agents on the bonding surfaces, between flat composite laminates bonded together with epoxy adhesive. Specifically, the local value of the loss coefficient determined from quantitative percussion testing was found to be significantly greater for a release-coated panel than for a well-bonded sample. Also, the local value of the probe force, or force returned to the probe after impact, was observed to be lower for the release-coated panels. The increase in loss coefficient and decrease in probe force are thought to be due to greater internal friction during the percussion event for poorly bonded specimens. NDT standards were also fabricated by varying the cure parameters of an epoxy film adhesive. Results from QPD for the variable-cure NDT standards and lap shear strength measurements taken of mechanical test specimens were compared and analyzed. Finally, experimental results were compared to a finite element analysis to understand the viscoelastic behavior of the laminates during percussion testing. This comparison shows how a lower-quality bond leads to a reduction in the percussion force by biasing strain in the percussion-tested side of the panel.

  15. Quantitative pharmaco-EEG and performance after administration of brotizolam to healthy volunteers

    PubMed Central

    Saletu, B.; Grünberger, J.; Linzmayer, L.

    1983-01-01

    1 The activity of brotizolam (0.1, 0.3 and 0.5 mg) was studied in normal subjects using quantitative pharmaco-EEG, psychometric and clinical evaluation. 2 Power spectral density analysis showed no changes after placebo, while brotizolam increased beta-activity, decreased alpha-activity and increased the average frequency (anxiolytic pharmaco-EEG profile). In addition, 0.3 and 0.5 mg brotizolam augmented delta-activity indicating hypnotic activity. 3 The highest dose (0.5 mg) of brotizolam decreased attention, concentration, psychomotor performance and affectivity, and increased reaction time. The lower doses of brotizolam also caused a decrease in attention and concentration, but tended to improve psychomotor performance, shorten reaction time, and did not influence mood or affectivity. 4 Brotizolam (0.1 mg) is the minimal effective psychoactive dose with a tranquillizing effect, while 0.5 mg and to some extent 0.3 mg induce a sedative effect and may be regarded as hypnotic doses. PMID:6661379
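
    Power spectral density analysis of the kind reported is conventionally done by integrating a Welch periodogram over the classical EEG bands; a generic sketch with a synthetic signal and assumed standard band limits (not the study's recordings or exact band definitions):

        import numpy as np
        from scipy.signal import welch

        FS = 256                                   # sampling rate, Hz (assumed)
        BANDS = {"delta": (1, 4), "theta": (4, 8),
                 "alpha": (8, 13), "beta": (13, 30)}

        rng = np.random.default_rng(0)
        eeg = rng.normal(size=30 * FS)             # placeholder for a recording

        f, psd = welch(eeg, fs=FS, nperseg=4 * FS)
        for name, (lo, hi) in BANDS.items():
            band = (f >= lo) & (f < hi)
            power = np.trapz(psd[band], f[band])   # absolute band power
            print(f"{name}: {power:.3f}")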

  16. Performance Evaluation Model for Application Layer Firewalls

    PubMed Central

    Yang, Wu; Dong, Hui; Zhang, Jiangchuan

    2016-01-01

    Application layer firewalls protect the trusted area network against information security risks. However, firewall performance may affect user experience. Therefore, performance analysis plays a significant role in the evaluation of application layer firewalls. This paper presents an analytic model of the application layer firewall, based on a system analysis to evaluate the capability of the firewall. In order to enable users to improve the performance of the application layer firewall with limited resources, resource allocation was evaluated to obtain the optimal resource allocation scheme in terms of throughput, delay, and packet loss rate. The proposed model employs the Erlangian queuing model to analyze the performance parameters of the system with regard to the three layers (network, transport, and application layers). Then, the analysis results of all the layers are combined to obtain the overall system performance indicators. A discrete event simulation method was used to evaluate the proposed model. Finally, limited service desk resources were allocated to obtain the values of the performance indicators under different resource allocation scenarios in order to determine the optimal allocation scheme. Under limited resource allocation, this scheme enables users to maximize the performance of the application layer firewall. PMID:27893803
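
    One layer of the Erlangian queuing analysis can be sketched as an M/M/c service-desk model, where the Erlang C formula gives the probability that an arriving request waits, from which the mean queueing delay follows; the arrival rate, service rate and desk count below are illustrative assumptions:

        from math import factorial

        def erlang_c(c, lam, mu):
            """M/M/c wait probability; lam: arrival rate, mu: rate per desk."""
            a = lam / mu                  # offered load (Erlangs)
            rho = a / c                   # utilization, must be < 1
            top = a ** c / (factorial(c) * (1 - rho))
            bottom = sum(a ** k / factorial(k) for k in range(c)) + top
            return top / bottom

        lam, mu, c = 80.0, 30.0, 4        # requests/s, services/s per desk, desks
        pw = erlang_c(c, lam, mu)
        wq = pw / (c * mu - lam)          # mean queueing delay (s)
        print(f"P(wait) = {pw:.3f}, mean delay = {1000 * wq:.1f} ms")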

  17. Performance Evaluation Model for Application Layer Firewalls.

    PubMed

    Xuan, Shichang; Yang, Wu; Dong, Hui; Zhang, Jiangchuan

    2016-01-01

    Application layer firewalls protect the trusted area network against information security risks. However, firewall performance may affect user experience. Therefore, performance analysis plays a significant role in the evaluation of application layer firewalls. This paper presents an analytic model of the application layer firewall, based on a system analysis to evaluate the capability of the firewall. In order to enable users to improve the performance of the application layer firewall with limited resources, resource allocation was evaluated to obtain the optimal resource allocation scheme in terms of throughput, delay, and packet loss rate. The proposed model employs the Erlangian queuing model to analyze the performance parameters of the system with regard to the three layers (network, transport, and application layers). Then, the analysis results of all the layers are combined to obtain the overall system performance indicators. A discrete event simulation method was used to evaluate the proposed model. Finally, limited service desk resources were allocated to obtain the values of the performance indicators under different resource allocation scenarios in order to determine the optimal allocation scheme. Under limited resource allocation, this scheme enables users to maximize the performance of the application layer firewall.

  18. A Quantitative Approach to Evaluating Training Curriculum Content Sampling Adequacy.

    ERIC Educational Resources Information Center

    Bownas, David A.; And Others

    1985-01-01

    Developed and illustrated a technique depicting the fit between training curriculum content and job performance requirements for three Coast Guard schools. Generated a listing of tasks which receive undue emphasis in training, tasks not being taught, and tasks instructors intend to train, but which course graduates are unable to perform.…

  19. Using quantitative interference phase microscopy for sperm acrosome evaluation (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Balberg, Michal; Kalinowski, Ksawery; Levi, Mattan; Shaked, Natan T.

    2016-03-01

    We demonstrate quantitative assessment of sperm cell morphology, primarily acrosomal volume, using quantitative interference phase microscopy (IPM). Normally, the area of the acrosome is assessed using dyes that stain the acrosomal part of the cell. We imaged fixed individual sperm cells using IPM; the sample was then stained, and the same cells were imaged using bright field microscopy (BFM). We identified the acrosome in the stained BFM image and used it to define the corresponding area in the IPM image and to determine a quantitative threshold for evaluating the volume of the acrosome.
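
    This conference abstract is not accompanied by an implementation; the sketch below only illustrates the arithmetic of converting an IPM phase map into a volume estimate over a masked region, such as the acrosome area transferred from the stained BFM image. The wavelength, refractive-index difference, and pixel size are assumed values, and the phase map and mask are synthetic.

```python
import numpy as np

# Synthetic stand-ins: an unwrapped phase map (radians) from IPM and a
# boolean acrosome mask that would, in practice, come from the stained
# bright-field image of the same cell.
rng = np.random.default_rng(0)
phase = np.clip(rng.normal(1.0, 0.2, (64, 64)), 0, None)   # radians
mask = np.zeros((64, 64), dtype=bool)
mask[20:40, 20:40] = True

wavelength_um = 0.633          # illumination wavelength (assumed)
dn = 0.03                      # refractive-index difference, cell vs. medium (assumed)
pixel_area_um2 = 0.25 * 0.25   # pixel footprint in the sample plane (assumed)

opd = phase * wavelength_um / (2 * np.pi)   # optical path difference per pixel
thickness = opd / dn                        # physical thickness, micrometres
volume_um3 = float(np.sum(thickness[mask]) * pixel_area_um2)
print(f"estimated acrosomal volume: {volume_um3:.1f} um^3")
```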

  1. Evaluation of Quantitative Environmental Stress Screening (ESS) Methods. Volume 1

    DTIC Science & Technology

    1991-11-01

    The objective of this study was to evaluate Environmental Stress Screening (ESS) techniques contained in DOD-HDBK-344 by applying the methodology to several electronic products during actual factory production. Validation of the techniques, the development of improved, simplified, and automated procedures, and subsequent revisions to the Handbook were the objectives of the evaluation. The Rome Laboratory has developed techniques which…

  2. Effects of Performers' External Characteristics on Performance Evaluations.

    ERIC Educational Resources Information Center

    Bermingham, Gudrun A.

    2000-01-01

    States that fairness has been a major concern in the field of music adjudication. Reviews the research literature to reveal information about three external characteristics (race, gender, and physical attractiveness) that may affect judges' performance evaluations and influence fairness of music adjudication. Includes references. (CMK)

  3. Performance Evaluation of Nano-JASMINE

    NASA Astrophysics Data System (ADS)

    Hatsutori, Y.; Kobayashi, Y.; Gouda, N.; Yano, T.; Murooka, J.; Niwa, Y.; Yamada, Y.

    2011-02-01

    We report the results of the performance evaluation of the first Japanese astrometry satellite, Nano-JASMINE. It is a very small satellite, weighing only 35 kg, and aims to carry out astrometric measurement of nearby bright stars (z ≤ 7.5 mag) with an accuracy of 3 milli-arcseconds. Nano-JASMINE will be launched by a Cyclone-4 rocket in August 2011 from Brazil. Its performance is currently being evaluated: a series of performance tests and numerical analyses were conducted. As a result, the engineering model (EM) of the telescope was measured to achieve diffraction-limited performance, confirming that it performs well enough for scientific astrometry.

  4. Use of the Behaviorally Anchored Rating Scale in Evaluating Teacher Performance.

    ERIC Educational Resources Information Center

    Beebe, Robert J.

    Behaviorally anchored rating scales (BARS), a new quantitative method of employee performance evaluation, are advocated for teacher evaluation. Development of a BARS consists generally of five steps: a representative sample of potential raters generates the scales; the group identifies the broad qualities to be evaluated; the group formulates…

  5. Gender, Math Confidence, and Grit: Relationships with Quantitative Skills and Performance in an Undergraduate Biology Course.

    PubMed

    Flanagan, K M; Einarson, J

    2017-01-01

    In a world filled with big data, mathematical models, and statistics, the development of strong quantitative skills is becoming increasingly critical for modern biologists. Teachers in this field must understand how students acquire quantitative skills and explore barriers experienced by students when developing these skills. In this study, we examine the interrelationships among gender, grit, and math confidence for student performance on a pre-post quantitative skills assessment and overall performance in an undergraduate biology course. Here, we show that females significantly underperformed relative to males on a quantitative skills assessment at the start of term. However, females showed significantly higher gains over the semester, such that the gender gap in performance was nearly eliminated by the end of the semester. Math confidence plays an important role in the performance on both the pre and post quantitative skills assessments and overall performance in the course. The effect of grit on student performance, however, is mediated by a student's math confidence; as math confidence increases, the positive effect of grit decreases. Consequently, the positive impact of a student's grittiness is observed most strongly for those students with low math confidence. We also found grit to be positively associated with the midterm score and the final grade in the course. Given the relationships established in this study among gender, grit, and math confidence, we provide "instructor actions" from the literature that can be applied in the classroom to promote the development of quantitative skills in light of our findings.

  6. Evaluating student performance in clinical dietetics.

    PubMed

    Novascone, M A

    1985-06-01

    The focus of this study was on the development and field-testing of a set of behaviorally anchored rating scales for evaluating the clinical performance of dietetic students. The scales emphasized the application of skills and knowledge. A variation of the Smith-Kendall technique was used to develop the scales. The 42 participants involved in instrument development included dietetic students, didactic and clinical instructors, and dietetic practitioners. The completed instrument contained 8 dimension statements and 70 behavioral anchors. The instrument was field-tested in 16 clinical rotations within 8 dietetic education programs. Evaluators not only rated student performance but also critiqued the format and content of the scales. The mid-to-upper portions of each scale were used most frequently, and little score variation within or across programs was noted. The scales were deemed appropriate for formative evaluation; however, some evaluators who had to grade students' performance expressed a desire for performance standards defined in terms of grades. Because the process used to develop the instrument facilitated the articulation of performance criteria, it is recommended as a practical approach to setting performance standards.

  7. Online versus Paper Evaluations: Differences in Both Quantitative and Qualitative Data

    ERIC Educational Resources Information Center

    Burton, William B.; Civitano, Adele; Steiner-Grossman, Penny

    2012-01-01

    This study sought to determine if differences exist in the quantitative and qualitative data collected with paper and online versions of a medical school clerkship evaluation form. Data from six-and-a-half years of clerkship evaluations were used, some collected before and some after the conversion from a paper to an online evaluation system. The…

  8. Quantitative evaluation of noise reduction strategies in dual-energy imaging.

    PubMed

    Warp, Richard J; Dobbins, James T

    2003-02-01

    In this paper we describe a quantitative evaluation of the performance of three dual-energy noise reduction algorithms: Kalender's correlated noise reduction (KCNR), noise clipping (NOC), and edge-predictive adaptive smoothing (EPAS). These algorithms were compared to a simple smoothing filter approach, using the variance and noise power spectrum measurements of the residual noise in dual-energy images acquired with an a-Si TFT flat-panel x-ray detector. An estimate of the true noise was made through a new method with subpixel accuracy by subtracting an individual image from an ensemble average image. The results indicate that in the lung regions of the tissue image, all three algorithms reduced the noise by similar percentages at high spatial frequencies (KCNR=88%, NOC=88%, EPAS=84%, NOC/KCNR=88%) and somewhat less at low spatial frequencies (KCNR=45%, NOC=54%, EPAS=52%, NOC/KCNR=55%). At low frequencies, the presence of edge artifacts from KCNR made the performance worse, thus NOC or NOC combined with KCNR performed best. At high frequencies, KCNR performed best in the bone image, yet NOC performed best in the tissue image. Noise reduction strategies in dual-energy imaging can be effective and should focus on blending various algorithms depending on anatomical locations.
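
    As a minimal sketch of the ensemble-subtraction noise estimate described here (omitting the subpixel registration step the authors apply), the code below estimates residual noise from a stack of repeated images and splits its power into low- and high-frequency bands with a radial Fourier mask. The image stack and the 0.25 cycles/pixel cutoff are synthetic assumptions.

```python
import numpy as np

def residual_noise(stack):
    """Estimate per-image noise by subtracting the ensemble average.
    Each raw residual has variance (n-1)/n of the true noise, so the
    result is rescaled to be unbiased."""
    stack = np.asarray(stack, dtype=float)
    resid = stack - stack.mean(axis=0)
    n = stack.shape[0]
    return resid * np.sqrt(n / (n - 1))

def band_variances(resid, cutoff=0.25):
    """Split residual noise power into low/high spatial-frequency bands
    using a radial mask in the 2-D Fourier domain (cutoff in cycles/pixel)."""
    f = np.fft.fftshift(np.fft.fft2(resid))
    ny, nx = resid.shape
    fy = np.fft.fftshift(np.fft.fftfreq(ny))[:, None]
    fx = np.fft.fftshift(np.fft.fftfreq(nx))[None, :]
    r = np.hypot(fy, fx)
    power = np.abs(f) ** 2 / (nx * ny)        # Parseval normalization
    return power[r <= cutoff].sum() / (nx * ny), power[r > cutoff].sum() / (nx * ny)

# Toy ensemble: constant "anatomy" plus white noise.
rng = np.random.default_rng(1)
imgs = 100.0 + rng.normal(0, 5, (16, 128, 128))
res = residual_noise(imgs)[0]
lo, hi = band_variances(res)
print("residual variance:", round(res.var(), 2), "| low/high band:", round(lo, 2), round(hi, 2))
```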

  9. A quantitative evaluation of confidence measures for stereo vision.

    PubMed

    Hu, Xiaoyan; Mordohai, Philippos

    2012-11-01

    We present an extensive evaluation of 17 confidence measures for stereo matching that compares the most widely used measures as well as several novel techniques proposed here. We begin by categorizing these methods according to which aspects of stereo cost estimation they take into account and then assess their strengths and weaknesses. The evaluation is conducted using a winner-take-all framework on binocular and multibaseline datasets with ground truth. It measures the capability of each confidence method to rank depth estimates according to their likelihood for being correct, to detect occluded pixels, and to generate low-error depth maps by selecting among multiple hypotheses for each pixel. Our work was motivated by the observation that such an evaluation is missing from the rapidly maturing stereo literature and that our findings would be helpful to researchers in binocular and multiview stereo.
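
    The evaluation framework itself is not reproduced here, but the sketch below shows its flavor: a winner-take-all disparity from a cost volume, one simple "best versus second-best cost" confidence measure (one of many kinds the paper surveys), and a check that the most confident pixels have a lower error rate. The cost volume and ground truth are synthetic.

```python
import numpy as np

def wta_with_confidence(cost):
    """Winner-take-all disparity from a cost volume (H x W x D, lower is
    better), plus a peak-ratio style confidence: the second-best cost
    divided by the best. Larger values mean a more distinct winner."""
    d_hat = cost.argmin(axis=2)
    sorted_costs = np.sort(cost, axis=2)
    conf = sorted_costs[..., 1] / (sorted_costs[..., 0] + 1e-9)
    return d_hat, conf

def rank_accuracy(d_hat, conf, d_true, tol=1):
    """Sort pixels by decreasing confidence and compare the error rate of
    the most-confident 20% with the overall error rate, mimicking the
    ranking evaluation described in the abstract."""
    order = np.argsort(-conf.ravel())
    err = (np.abs(d_hat.ravel() - d_true.ravel()) > tol)[order]
    frac = int(0.2 * err.size)
    return err[:frac].mean(), err.mean()

# Toy cost volume with a known constant true disparity.
rng = np.random.default_rng(2)
H, W, D = 32, 32, 16
d_true = np.full((H, W), 5)
cost = rng.uniform(0.5, 1.0, (H, W, D))
cost[np.arange(H)[:, None], np.arange(W)[None, :], d_true] = rng.uniform(0.0, 0.6, (H, W))

d_hat, conf = wta_with_confidence(cost)
top_err, all_err = rank_accuracy(d_hat, conf, d_true)
print(f"error rate: top-20% confident {top_err:.3f} vs overall {all_err:.3f}")
```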

  10. Preprocessing of Edge of Light images: towards a quantitative evaluation

    NASA Astrophysics Data System (ADS)

    Liu, Zheng; Forsyth, David S.; Marincak, Anton

    2003-08-01

    A computer vision inspection system, named Edge of Light™ (EOL), was invented and developed at the Institute for Aerospace Research of the National Research Council Canada. One application of interest is the detection and quantitative measurement of "pillowing" caused by corrosion in the faying surfaces of aircraft fuselage joints. To quantify the hidden corrosion, one approach is to relate the average corrosion of a region to the peak-to-peak amplitude between two diagonally adjacent rivet centers. This raises the requirement for automatically locating the rivet centers. The first step to achieve this is rivet edge detection. In this study, gradient-based edge detection, local energy based feature extraction, and an adaptive threshold method were employed to identify the edges of rivets, which facilitated the first step in the EOL quantification procedure. Furthermore, the brightness profile is processed by the derivative operation, which locates the pillowing along the scanning direction. The derivative curves present an estimation of the inspected surface.

  11. Evaluation of various real-time reverse transcription quantitative PCR assays for norovirus detection.

    PubMed

    Yoo, Ju Eun; Lee, Cheonghoon; Park, SungJun; Ko, GwangPyo

    2017-02-01

    Human noroviruses are widespread and contagious viruses causing nonbacterial gastroenteritis. Real-time reverse transcription quantitative PCR (real-time RT-qPCR) is currently the gold standard for sensitive and accurate detection of these pathogens and serves as a critical tool in outbreak prevention and control. Different surveillance teams, however, may use different assays, and variability in specimen conditions may lead to disagreement in results. Furthermore, the norovirus genome is highly variable and continuously evolving. These issues necessitate the re-examination of real-time RT-qPCR's robustness in the context of accurate detection, as well as the investigation of practical strategies to enhance assay performance. Four widely referenced real-time RT-qPCR assays (Assays A-D) were performed simultaneously to evaluate characteristics such as PCR efficiency and detection limit, as well as sensitivity and specificity with RT-PCR, and to assess the most accurate method for detecting norovirus genogroups I and II. Overall, Assay D was found to be the most precise and accurate assay in this study. A ZEN internal quencher, which decreases nonspecific fluorescence during the PCR reaction, was added to Assay D's probe, which further improved assay performance. This study compared several detection assays for noroviruses, and an improvement strategy based on such comparisons provided a useful characterization of a highly optimized real-time RT-qPCR assay for norovirus detection.
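
    The PCR-efficiency comparison mentioned above rests on standard-curve arithmetic that is easy to state concretely: efficiency is estimated from the slope of Cq versus log10 template amount as E = 10^(-1/slope) - 1, with a slope near -3.32 corresponding to 100% efficiency (perfect doubling each cycle). The dilution-series data below are invented for illustration.

```python
import numpy as np

# Hypothetical 10-fold dilution series: log10(copies/reaction) vs measured Cq.
log10_copies = np.array([6, 5, 4, 3, 2, 1], dtype=float)
cq = np.array([18.1, 21.5, 24.9, 28.4, 31.8, 35.3])

slope, intercept = np.polyfit(log10_copies, cq, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0      # 1.0 means 100% efficiency
r2 = np.corrcoef(log10_copies, cq)[0, 1] ** 2

print(f"slope = {slope:.3f}, efficiency = {100 * efficiency:.1f}%, R^2 = {r2:.4f}")
# Efficiencies far outside roughly 90-110% usually prompt re-optimization.
```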

  12. Evaluate reformer performance at a glance

    SciTech Connect

    Nag, A.

    1996-02-01

    Catalytic reforming is becoming increasingly important in replacing octane lost as the removal of lead from worldwide gasoline pools continues. A method has been developed that can quickly evaluate the performance of any catalytic reformer. The catalytic naphtha reforming process primarily involves three well-known reactions. These are aromatization of naphthenes, cyclization of paraffins and hydrocracking of paraffins. Hydrogen is produced in the process of aromatization and dehydrocyclization of paraffins. Reformer performance is normally evaluated with a reformate analysis (PONA) and yield of C5+ reformate. This method of quick evaluation of reformer performance is based upon the main assumption that the increase in hydrocarbon moles in the process is equal to the number of C-C bond ruptures and one mole of hydrogen is absorbed to saturate the same. This new method calculates aromatization efficiency, paraffin conversion, aromatic selectivity and finally the paraffin, naphthene and aromatic content of C5+ reformate.

  13. Digital holographic microscopy for quantitative cell dynamic evaluation during laser microsurgery

    PubMed Central

    Yu, Lingfeng; Mohanty, Samarendra; Zhang, Jun; Genc, Suzanne; Kim, Myung K.; Berns, Michael W.; Chen, Zhongping

    2010-01-01

    Digital holographic microscopy allows determination of dynamic changes in the optical thickness profile of a transparent object with subwavelength accuracy. Here, we report a quantitative phase laser microsurgery system for evaluation of cellular/ sub-cellular dynamic changes during laser micro-dissection. The proposed method takes advantage of the precise optical manipulation by the laser microbeam and quantitative phase imaging by digital holographic microscopy with high spatial and temporal resolution. This system will permit quantitative evaluation of the damage and/or the repair of the cell or cell organelles in real time. PMID:19582118

  14. Evaluation of static and dynamic perfusion cardiac computed tomography for quantitation and classification tasks

    PubMed Central

    Bindschadler, Michael; Modgil, Dimple; Branch, Kelley R.; La Riviere, Patrick J.; Alessio, Adam M.

    2016-01-01

    Cardiac computed tomography (CT) acquisitions for perfusion assessment can be performed in a dynamic or static mode. Either method may be used for a variety of clinical tasks, including (1) stratifying patients into categories of ischemia and (2) using a quantitative myocardial blood flow (MBF) estimate to evaluate disease severity. In this simulation study, we compare method performance on these classification and quantification tasks for matched radiation dose levels and for different flow states, patient sizes, and injected contrast levels. Under the conditions simulated, the dynamic method has low bias in MBF estimates (0 to 0.1 ml/min/g) compared to linearly interpreted static assessment (0.45 to 0.48 ml/min/g), making it more suitable for quantitative estimation. At matched radiation dose levels, receiver operating characteristic analysis demonstrated that the static method, with its high bias but generally lower variance, had superior performance (p<0.05) in stratifying patients, especially for larger patients and lower contrast doses [area under the curve (AUC) = 0.95 to 0.96 versus 0.86]. We also demonstrate that static assessment with a correctly tuned exponential relationship between the apparent CT number and MBF has superior quantification performance to static assessment with a linear relationship and to dynamic assessment. However, tuning the exponential relationship to the patient and scan characteristics will likely prove challenging. This study demonstrates that the selection and optimization of static or dynamic acquisition modes should depend on the specific clinical task. PMID:27175377
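
    To make the bias/variance trade-off behind the classification result concrete, the sketch below computes ROC areas with the Mann-Whitney formulation for two caricatured estimators: a biased but low-variance "static" one and an unbiased but noisier "dynamic" one. All flow values, biases, and noise levels are invented and far cruder than the paper's simulations.

```python
import numpy as np

def auc(pos, neg):
    """ROC area via the Mann-Whitney statistic: the probability that a
    random diseased case scores above a random healthy case, with ties
    counted as one half."""
    pos = np.asarray(pos, float)[:, None]
    neg = np.asarray(neg, float)[None, :]
    return float((pos > neg).mean() + 0.5 * (pos == neg).mean())

rng = np.random.default_rng(3)
n = 500
mbf_ischemic = rng.normal(0.7, 0.15, n)   # true flow, ml/min/g (invented)
mbf_normal = rng.normal(1.6, 0.25, n)

# Caricature of the trade-off: static is biased but tight, dynamic is
# unbiased but noisy.
static = lambda m: 0.7 * m + rng.normal(0, 0.10, m.shape)
dynamic = lambda m: m + rng.normal(0, 0.35, m.shape)

for name, est in [("static", static), ("dynamic", dynamic)]:
    # Lower estimated MBF indicates ischemia, so score with the negative.
    a = auc(-est(mbf_ischemic), -est(mbf_normal))
    print(f"{name:7s} AUC = {a:.3f}")
```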

  15. Quantitative Evaluation of Passive Muscle Stiffness in Chronic Stroke.

    PubMed

    Eby, Sarah; Zhao, Heng; Song, Pengfei; Vareberg, Barbara J; Kinnick, Randall; Greenleaf, James F; An, Kai-Nan; Chen, Shigao; Brown, Allen W

    2016-12-01

    The aim of this study was to evaluate the potential for shear wave elastography (SWE) to measure passive biceps brachii individual muscle stiffness as a musculoskeletal manifestation of chronic stroke. This was a cross-sectional study. Nine subjects with stroke were evaluated using the Fugl-Meyer and Modified Ashworth scales. Electromyography, joint torque, and SWE of the biceps brachii were obtained during passive elbow extension in subjects with stroke and four controls. Torque values at the time points corresponding to each SWE measurement during all trials were selected for direct comparison with the respective SWE stiffness using regression analysis. Intraclass correlation coefficients (ICC(1,1)) were used to evaluate the reliability of expressing alterations in material properties. Torque and passive stiffness increased with elbow extension-minimally for the controls and most pronounced in the contralateral limb of those with stroke. In the stroke group, several patterns of shear moduli and torque responses to passive elbow extension were identified, with a subset of several subjects displaying a very strong torque response coupled with minimal stiffness responses (y = 2.712x + 6.676; R = 0.181; P = 0.0310). Values of ICC(1,1) indicate consistent muscle stiffness throughout testing for the dominant side of controls, but largely inconsistent stiffness for other study conditions. SWE shows promise for enhancing evaluation of skeletal muscle after stroke. The wide variability between subjects with stroke highlights the need for precise, individualized measures.

  16. Smith Newton Vehicle Performance Evaluation (Brochure)

    SciTech Connect

    Not Available

    2012-08-01

    The Fleet Test and Evaluation Team at the U.S. Department of Energy's National Renewable Energy Laboratory is evaluating and documenting the performance of electric and plug-in hybrid electric drive systems in medium-duty trucks across the nation. Through this project, Smith Electric Vehicles will build and deploy 500 all-electric medium-duty trucks. The trucks will be deployed in diverse climates across the country.

  17. Quantitative Evaluation of 3 DBMS: ORACLE, SEED AND INGRES

    NASA Technical Reports Server (NTRS)

    Sylto, R.

    1984-01-01

    Characteristics required for NASA scientific data base management applications are listed, as well as performance testing objectives. Results obtained for the ORACLE, SEED, and INGRES packages are presented in charts. It is concluded that vendor packages can manage 130 megabytes of data at acceptable load and query rates. Performance tests varying data base designs and various data base management system parameters are valuable to applications for choosing packages and critical to designing effective data bases. An application's productivity increases with the use of a data base management system because of enhanced capabilities such as a screen formatter, a report writer, and a data dictionary.

  1. Performance Evaluation and Analysis for Gravity Matching Aided Navigation

    PubMed Central

    Wu, Lin; Wang, Hubiao; Chai, Hua; Zhang, Lu; Hsu, Houtse; Wang, Yong

    2017-01-01

    Simulation tests were performed in this paper to evaluate the performance of gravity matching aided navigation (GMAN). The study focused on four essential factors to quantitatively evaluate the performance: gravity database (DB) resolution, fitting degree of gravity measurements, number of samples in matching, and gravity changes in the matching area. A marine gravity anomaly DB derived from satellite altimetry was employed. Actual dynamic gravimetry accuracy and operating conditions were referenced to design the simulation parameters. The results verified that improvements in DB resolution, gravimetry accuracy, number of measurement samples, or gravity changes in the matching area generally led to higher positioning accuracy, although their effects differed and were interrelated. Moreover, three typical positioning accuracy targets of GMAN were proposed, and the conditions needed to achieve them were derived from the analysis of several different system requirements. Finally, various approaches were provided to improve the positioning accuracy of GMAN. PMID:28379178
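
    A minimal sketch of the matching step such a simulation rests on, under invented assumptions: a synthetic gravity-anomaly grid stands in for the satellite-altimetry DB, and a noisy along-track measurement profile is located by a brute-force minimum-RMS search. The actual GMAN algorithms and parameter values are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical gravity-anomaly database on a regular grid (mGal), lightly
# smoothed so neighbouring cells are correlated, as real anomaly maps are.
db = rng.normal(0, 30, (200, 200))
db = (db + np.roll(db, 1, 0) + np.roll(db, 1, 1)
         + np.roll(db, -1, 0) + np.roll(db, -1, 1)) / 5

true_row, true_col = 120, 85
n_samples = 20                                   # along-track measurements
track_cols = true_col + np.arange(n_samples)
measured = db[true_row, track_cols] + rng.normal(0, 1.0, n_samples)  # 1 mGal noise

# Brute-force matching: slide the measured profile over every row/offset
# and keep the minimum-RMS position.
best = (np.inf, None)
for r in range(db.shape[0]):
    for c in range(db.shape[1] - n_samples):
        rms = np.sqrt(np.mean((db[r, c:c + n_samples] - measured) ** 2))
        if rms < best[0]:
            best = (rms, (r, c))

err = np.hypot(best[1][0] - true_row, best[1][1] - true_col)
print("estimated start position:", best[1], "| grid-cell error:", err)
```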

  2. Skin moisturization by hydrogenated polyisobutene--quantitative and visual evaluation.

    PubMed

    Dayan, Nava; Sivalenka, Rajarajeswari; Chase, John

    2009-01-01

    Hydrogenated polyisobutene (HP) is used in topically applied cosmetic/personal care formulations as an emollient that leaves a pleasing skin feel when applied and rubbed in. This effect, although distinguishable to the user, is difficult to define and quantify. Recognizing that some of the physical properties of HP, such as film formation and wear resistance, may contribute to skin moisturization, we designed a short-term pilot study to follow changes in skin moisturization. Incorporating HP into an o/w emulsion at 8% yielded increased viscosity and reduced emulsion droplet size as compared to the emollient ester CCT (capric/caprylic triglyceride) or a control formulation. Quantitative data indicate that application of the o/w emulsion formulation containing either HP or CCT significantly elevated skin moisture content and thus reduced transepidermal water loss (TEWL) by a maximum of approximately 33% relative to the control formulation within 3 h, and maintained this up to 6 h. Visual observation of skin treated with the HP-containing formulation showed fine texture and clear contrast as compared to the control or the CCT formulation, confirming this effect. As a result of increased hydration, skin conductivity, as measured by corneometer values, was also elevated significantly, by about tenfold, as early as 20 min after HP or CCT application and was maintained throughout the test period. Throughout the test period the HP formulation was 5-10% more effective than the CCT formulation, both in reduction of TEWL and in increased skin conductivity. Thus, compared to the emollient ester (CCT), HP showed a unique capability for a long-lasting effect in retaining moisture and improving skin texture.

  3. Quantitative evaluation of the major determinants of human gait.

    PubMed

    Lin, Yi-Chung; Gfoehler, Margit; Pandy, Marcus G

    2014-04-11

    Accurate knowledge of the isolated contributions of joint movements to the three-dimensional displacement of the center of mass (COM) is fundamental for understanding the kinematics of normal walking and for improving the treatment of gait disabilities. Saunders et al. (1953) identified six kinematic mechanisms to explain the efficient progression of the whole-body COM in the sagittal, transverse, and coronal planes. These mechanisms, referred to as the major determinants of gait, were pelvic rotation, pelvic list, stance knee flexion, foot and knee mechanisms, and hip adduction. The aim of the present study was to quantitatively assess the contribution of each major gait determinant to the anteroposterior, vertical, and mediolateral displacements of the COM over one gait cycle. The contribution of each gait determinant was found by applying the concept of an 'influence coefficient', wherein the partial derivative of the COM displacement with respect to a prescribed determinant was calculated. The analysis was based on three-dimensional measurements of joint angular displacements obtained from 23 healthy young adults walking at slow, normal and fast speeds. We found that hip flexion, stance knee flexion, and ankle-foot interaction (comprised of ankle plantarflexion, toe flexion and the displacement of the center of pressure) are the major determinants of the displacements of the COM in the sagittal plane, while hip adduction and pelvic list contribute most significantly to the mediolateral displacement of the COM in the coronal plane. Pelvic rotation and pelvic list contribute little to the vertical displacement of the COM at all walking speeds. Pelvic tilt, hip rotation, subtalar inversion, and back extension, abduction and rotation make negligible contributions to the displacements of the COM in all three anatomical planes.
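
    The "influence coefficient" is a partial derivative of a COM coordinate with respect to one prescribed determinant, which can be approximated by central differences. The toy planar stance-limb model below is purely illustrative, with invented segment lengths; the study differentiates a full three-dimensional whole-body kinematic model.

```python
import numpy as np

def com_height(hip_height, thigh, shank, knee_flexion_rad):
    """Toy stand-in for a COM coordinate: with the thigh kept vertical,
    knee flexion lowers the hip (and COM) by shortening the effective
    limb length. Only meant to illustrate the influence-coefficient idea."""
    limb = thigh + shank * np.cos(knee_flexion_rad)
    return hip_height - (thigh + shank) + limb

def influence_coefficient(f, x0, eps=1e-6, **kw):
    """Central-difference partial derivative of a COM coordinate with
    respect to one prescribed gait determinant."""
    return (f(knee_flexion_rad=x0 + eps, **kw)
            - f(knee_flexion_rad=x0 - eps, **kw)) / (2 * eps)

kw = dict(hip_height=0.95, thigh=0.45, shank=0.43)   # metres, invented
for deg in (0, 10, 20):
    x0 = np.deg2rad(deg)
    dz = influence_coefficient(com_height, x0, **kw)
    # At zero flexion the toy model sits at a stationary point, so dz is 0.
    print(f"knee flexion {deg:2d} deg: dCOMz/dtheta = {dz:.4f} m/rad")
```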

  4. Quantitative Evaluation of Papilledema from Stereoscopic Color Fundus Photographs

    PubMed Central

    Tang, Li; Kardon, Randy H.; Wang, Jui-Kai; Garvin, Mona K.; Lee, Kyungmoo; Abràmoff, Michael D.

    2012-01-01

    Purpose. To derive a computerized measurement of optic disc volume from digital stereoscopic fundus photographs for the purpose of diagnosing and managing papilledema. Methods. Twenty-nine pairs of stereoscopic fundus photographs and optic nerve head (ONH) centered spectral domain optical coherence tomography (SD-OCT) scans were obtained at the same visit in 15 patients with papilledema. Some patients were imaged at multiple visits in order to assess their changes. Three-dimensional shape of the ONH was estimated from stereo fundus photographs using an automated multi-scale stereo correspondence algorithm. We assessed the correlation of the stereo volume measurements with the SD-OCT volume measurements quantitatively, in terms of volume of retinal surface elevation above a reference plane, and also with expert grading of papilledema from digital fundus photographs using the Frisén grading scale. Results. The volumetric measurements of retinal surface elevation estimated from stereo fundus photographs and OCT scans were positively correlated (r² = 0.60; P < 0.001) and were positively correlated with Frisén grade (Spearman correlation coefficient r = 0.59; P < 0.001). Conclusions. Retinal surface elevation among papilledema patients obtained from stereo fundus photographs compares favorably with that from OCT scans and with expert grading of papilledema severity. Stereoscopic color imaging of the ONH combined with a method of automated shape reconstruction is a low-cost alternative to SD-OCT scans that has potential for a more cost-effective diagnosis and management of papilledema in a telemedical setting. An automated three-dimensional image analysis method was validated that quantifies the retinal surface topography with an imaging modality that has lacked prior objective assessment. PMID:22661468

  5. Quantitative evaluation of material degradation by Barkhausen noise method

    SciTech Connect

    Yamaguchi, Atsunori; Maeda, Noriyoshi; Sugibayashi, Takuya

    1995-12-01

    Evaluating the life of nuclear power plants becomes inevitable as plant operating periods are extended. This paper applied a magnetic method using Barkhausen noise (BHN) to detect degradation by fatigue and thermal aging. Low alloy steel (SA 508 cl.2) was fatigued at strain amplitudes of ±1% and ±0.4%, and duplex stainless steel (SCS14A) was heated at 400 °C for a long period (thermal aging). For the material degraded by thermal aging, BHN was measured and good correlation between the magnetic properties and the absorption energy of the material was obtained. For the fatigued material, BHN was measured at each predetermined cycle, the effect of the stress or strain present in the material at the time of measurement was evaluated, and good correlation between BHN and fatigue damage ratio was obtained.

  6. Prospective safety performance evaluation on construction sites.

    PubMed

    Wu, Xianguo; Liu, Qian; Zhang, Limao; Skibniewski, Miroslaw J; Wang, Yanhong

    2015-05-01

    This paper presents a systematic Structural Equation Modeling (SEM) based approach for Prospective Safety Performance Evaluation (PSPE) on construction sites, with causal relationships and interactions between enablers and the goals of PSPE taken into account. Based on a sample of 450 valid questionnaire surveys from 30 Chinese construction enterprises, a SEM model with 26 items for PSPE in the context of the Chinese construction industry is established and then verified through a goodness-of-fit test. Three typical types of construction enterprises, namely state-owned enterprises, private enterprises and Sino-foreign joint ventures, are selected as samples to measure the level of safety performance, given that their scale, ownership and business strategy differ. Results provide a full understanding of safety performance practice in the construction industry, and indicate that overall safety performance on working sites is rated at level III (Fair) or above. This can be explained by the fact that the construction industry has gradually matured under established norms, and construction enterprises must improve their safety performance so as not to be eliminated from the government-led construction industry. The differences in safety performance practice among the construction enterprise categories are compared and analyzed according to the evaluation results. This research provides insights into cause-effect relationships among safety performance factors and goals, which, in turn, can facilitate the improvement of safety performance in the construction industry. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Quantitative vertebral compression fracture evaluation using a height compass

    NASA Astrophysics Data System (ADS)

    Yao, Jianhua; Burns, Joseph E.; Wiese, Tatjana; Summers, Ronald M.

    2012-03-01

    Vertebral compression fractures can be caused by even minor trauma in patients with pathological conditions such as osteoporosis, varying greatly in vertebral body location and compression geometry. The location and morphology of the compression injury can guide decision making for treatment modality (vertebroplasty versus surgical fixation), and can be important for pre-surgical planning. We propose a height compass to evaluate the axial plane spatial distribution of compression injury (anterior, posterior, lateral, and central), and distinguish it from physiologic height variations of normal vertebrae. The method includes four steps: spine segmentation and partition, endplate detection, height compass computation and compression fracture evaluation. A height compass is computed for each vertebra, where the vertebral body is partitioned in the axial plane into 17 cells oriented about concentric rings. In the compass structure, a crown-like geometry is produced by three concentric rings which are divided into 8 equal length arcs by rays which are subtended by 8 common central angles. The radius of each ring increases multiplicatively, with resultant structure of a central node and two concentric surrounding bands of cells, each divided into octants. The height value for each octant is calculated and plotted against octants in neighboring vertebrae. The height compass shows intuitive display of the height distribution and can be used to easily identify the fracture regions. Our technique was evaluated on 8 thoraco-abdominal CT scans of patients with reported compression fractures and showed statistically significant differences in height value at the sites of the fractures.
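
    A rough sketch of the 17-cell compass geometry described above: one central disc plus two concentric bands, each band split into octants, with a mean height per cell. The ring radii, centring, and the toy height map are assumptions for illustration; the paper derives these quantities from its spine segmentation and endplate detection steps.

```python
import numpy as np

def height_compass(height_map, center, r1, r2, r3):
    """Partition an axial height map of a vertebral body into 17 cells:
    a central disc (r <= r1) plus two concentric bands ((r1, r2] and
    (r2, r3]), each split into 8 octants. Returns the mean height per
    cell, NaN where a cell is empty."""
    ny, nx = height_map.shape
    y, x = np.mgrid[0:ny, 0:nx]
    r = np.hypot(y - center[0], x - center[1])
    theta = np.mod(np.arctan2(y - center[0], x - center[1]), 2 * np.pi)
    octant = (theta // (np.pi / 4)).astype(int)          # 0..7

    cells = [height_map[r <= r1]]                        # central node
    for lo, hi in ((r1, r2), (r2, r3)):                  # two bands
        band = (r > lo) & (r <= hi)
        cells += [height_map[band & (octant == k)] for k in range(8)]
    return np.array([c.mean() if c.size else np.nan for c in cells])

# Toy vertebra: uniform 25 mm height with an anterior wedge compression.
hm = np.full((64, 64), 25.0)
hm[:20, :] -= 8.0                                        # collapsed anterior rows
cells = height_compass(hm, center=(32, 32), r1=8, r2=16, r3=24)
print("central cell:", round(cells[0], 1))
print("inner band octants:", np.round(cells[1:9], 1))
print("outer band octants:", np.round(cells[9:], 1))
```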

  8. An evaluation of protein assays for quantitative determination of drugs.

    PubMed

    Williams, Katherine M; Arthur, Sarah J; Burrell, Gillian; Kelly, Fionnuala; Phillips, Darren W; Marshall, Thomas

    2003-07-31

    We have evaluated the response of six protein assays [the biuret, Lowry, bicinchoninic acid (BCA), Coomassie Brilliant Blue (CBB), Pyrogallol Red-Molybdate (PRM), and benzethonium chloride (BEC)] to 21 pharmaceutical drugs. The drugs evaluated were analgesics (acetaminophen, aspirin, codeine, methadone, morphine and pethidine), antibiotics (amoxicillin, ampicillin, gentamicin, neomycin, penicillin G and vancomycin), antipsychotics (chlorpromazine, fluphenazine, prochlorperazine, promazine and thioridazine) and water-soluble vitamins (ascorbic acid, niacinamide, pantothenic acid and pyridoxine). The biuret, Lowry and BCA assays responded strongly to most of the drugs tested. The PRM assay gave a sensitive response to the aminoglycoside antibiotics (gentamicin and neomycin) and the antipsychotic drugs. In contrast, the CBB assay showed little response to the aminoglycosides and gave a relatively poor response with the antipsychotics. The BEC assay did not respond significantly to the drugs tested. The response of the protein assays to the drugs was further evaluated by investigating the linearity of the response and the combined response of drug plus protein. The results are discussed with reference to drug interference in protein assays and the development of new methods for the quantification of drugs in protein-free solution.

  9. Smith Newton Vehicle Performance Evaluation - Cumulative (Brochure)

    SciTech Connect

    Not Available

    2014-08-01

    The Fleet Test and Evaluation Team at the U.S. Department of Energy's National Renewable Energy Laboratory is evaluating and documenting the performance of electric and plug-in hybrid electric drive systems in medium-duty trucks across the nation. U.S. companies participating in this evaluation project received funding from the American Recovery and Reinvestment Act to cover part of the cost of purchasing these vehicles. Through this project, Smith Electric Vehicles is building and deploying 500 all-electric medium-duty trucks that will be deployed by a variety of companies in diverse climates across the country.

  10. Performance evaluations of demountable electrical connections

    NASA Astrophysics Data System (ADS)

    Niemann, R. C.; Cha, Y. S.; Hull, J. R.; Buckles, W. E.; Daugherty, M. A.

    Electrical conductors operating in cryogenic environments can require demountable connections along their lengths. The connections must have low resistance and high reliability and should allow ready assembly and disassembly. In this work, the performance of two types of connections has been evaluated. The first connection type is a clamped surface-to-surface joint. The second connection type is a screwed joint that incorporates male and female machine-thread components. The connections for copper conductors have been evaluated experimentally at 77 K. Experimental variables included thread surface treatment and assembly methods. The results of the evaluations are presented.

  11. Quantitative evaluation of scintillation camera imaging characteristics of isotopes used in liver radioembolization.

    PubMed

    Elschot, Mattijs; Nijsen, Johannes Franciscus Wilhelmus; Dam, Alida Johanna; de Jong, Hugo Wilhelmus Antonius Maria

    2011-01-01

    Scintillation camera imaging is used for treatment planning and post-treatment dosimetry in liver radioembolization (RE). In yttrium-90 (90Y) RE, scintigraphic images of technetium-99m (99mTc) are used for treatment planning, while 90Y Bremsstrahlung images are used for post-treatment dosimetry. In holmium-166 (166Ho) RE, scintigraphic images of 166Ho can be used for both treatment planning and post-treatment dosimetry. The aim of this study is to quantitatively evaluate and compare the imaging characteristics of these three isotopes, in order that imaging protocols can be optimized and RE studies with varying isotopes can be compared. Phantom experiments were performed in line with NEMA guidelines to assess the spatial resolution, sensitivity, count rate linearity, and contrast recovery of 99mTc, 90Y and 166Ho. In addition, Monte Carlo simulations were performed to obtain detailed information about the history of detected photons. The results showed that the use of a broad energy window and the high-energy collimator gave optimal combination of sensitivity, spatial resolution, and primary photon fraction for 90Y Bremsstrahlung imaging, although differences with the medium-energy collimator were small. For 166Ho, the high-energy collimator also slightly outperformed the medium-energy collimator. In comparison with 99mTc, the image quality of both 90Y and 166Ho is degraded by a lower spatial resolution, a lower sensitivity, and larger scatter and collimator penetration fractions. The quantitative evaluation of the scintillation camera characteristics presented in this study helps to optimize acquisition parameters and supports future analysis of clinical comparisons between RE studies.

  12. Methods for the field evaluation of quantitative G6PD diagnostics: a review.

    PubMed

    Ley, Benedikt; Bancone, Germana; von Seidlein, Lorenz; Thriemer, Kamala; Richards, Jack S; Domingo, Gonzalo J; Price, Ric N

    2017-09-11

    Individuals with glucose-6-phosphate dehydrogenase (G6PD) deficiency are at risk of severe haemolysis following the administration of 8-aminoquinoline compounds. Primaquine is the only widely available 8-aminoquinoline for the radical cure of Plasmodium vivax. Tafenoquine is under development with the potential to simplify treatment regimens, but point-of-care (PoC) tests will be needed to provide quantitative measurement of G6PD activity prior to its administration. There is currently a lack of appropriate G6PD PoC tests, but a number of new tests are in development and are likely to enter the market in the coming years. As these are implemented, they will need to be validated in field studies. This article outlines the technical details for the field evaluation of novel quantitative G6PD diagnostics such as sample handling, reference testing and statistical analysis. Field evaluation is based on the comparison of paired samples, including one sample tested by the new assay at point of care and one sample tested by the gold-standard reference method, UV spectrophotometry in an established laboratory. Samples can be collected as capillary or venous blood; the existing literature suggests that potential differences in capillary or venous blood are unlikely to affect results substantially. The collection and storage of samples is critical to ensure preservation of enzyme activity, it is recommended that samples are stored at 4 °C and testing occurs within 4 days of collection. Test results can be visually presented as scatter plot, Bland-Altman plot, and a histogram of the G6PD activity distribution of the study population. Calculating the adjusted male median allows categorizing results according to G6PD activity to calculate standard performance indicators and to perform receiver operating characteristic (ROC) analysis.
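
    The "adjusted male median" mentioned above has a conventional construction that is easy to sketch: take the median male activity, exclude males below some fraction of it (10% is a common choice), and re-take the median; 100% activity is then defined as this value, with 30%/70% thresholds categorizing samples. The cutoff choice and the synthetic survey data below are assumptions, not prescriptions from this article.

```python
import numpy as np

def adjusted_male_median(male_activities, cutoff_fraction=0.10):
    """Adjusted male median (AMM): exclude severely deficient males
    (below cutoff_fraction of the crude male median), then take the
    median of the remainder. The AMM defines 100% activity."""
    male = np.asarray(male_activities, float)
    crude = np.median(male)
    return np.median(male[male >= cutoff_fraction * crude])

def categorize(activity, amm):
    """Classify by percent of AMM using the conventional 30%/70% cut-offs."""
    pct = 100.0 * activity / amm
    if pct < 30:
        return "deficient"
    return "intermediate" if pct < 70 else "normal"

rng = np.random.default_rng(5)
males = np.concatenate([
    rng.normal(8.0, 1.5, 180),                      # normal activity (U/g Hb)
    rng.normal(0.8, 0.4, 20).clip(0.05, None),      # hemizygous deficient
])
amm = adjusted_male_median(males)
print(f"AMM = {amm:.2f} U/g Hb")
for a in (1.0, 4.0, 7.5):
    print(f"{a:.1f} U/g Hb -> {categorize(a, amm)} ({100 * a / amm:.0f}% of AMM)")
```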

  13. Quantitative Evaluation of Scintillation Camera Imaging Characteristics of Isotopes Used in Liver Radioembolization

    PubMed Central

    Elschot, Mattijs; Nijsen, Johannes Franciscus Wilhelmus; Dam, Alida Johanna; de Jong, Hugo Wilhelmus Antonius Maria

    2011-01-01

    Background Scintillation camera imaging is used for treatment planning and post-treatment dosimetry in liver radioembolization (RE). In yttrium-90 (90Y) RE, scintigraphic images of technetium-99m (99mTc) are used for treatment planning, while 90Y Bremsstrahlung images are used for post-treatment dosimetry. In holmium-166 (166Ho) RE, scintigraphic images of 166Ho can be used for both treatment planning and post-treatment dosimetry. The aim of this study is to quantitatively evaluate and compare the imaging characteristics of these three isotopes, in order that imaging protocols can be optimized and RE studies with varying isotopes can be compared. Methodology/Principal Findings Phantom experiments were performed in line with NEMA guidelines to assess the spatial resolution, sensitivity, count rate linearity, and contrast recovery of 99mTc, 90Y and 166Ho. In addition, Monte Carlo simulations were performed to obtain detailed information about the history of detected photons. The results showed that the use of a broad energy window and the high-energy collimator gave optimal combination of sensitivity, spatial resolution, and primary photon fraction for 90Y Bremsstrahlung imaging, although differences with the medium-energy collimator were small. For 166Ho, the high-energy collimator also slightly outperformed the medium-energy collimator. In comparison with 99mTc, the image quality of both 90Y and 166Ho is degraded by a lower spatial resolution, a lower sensitivity, and larger scatter and collimator penetration fractions. Conclusions/Significance The quantitative evaluation of the scintillation camera characteristics presented in this study helps to optimize acquisition parameters and supports future analysis of clinical comparisons between RE studies. PMID:22073149
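
    As one concrete piece of the NEMA-style analysis mentioned above (spatial resolution), the sketch below fits a Gaussian to a line-source profile and reports its full width at half maximum. The pixel pitch, count levels, and the reduction of the NEMA procedure to a single fit are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, a, mu, sigma, b):
    return a * np.exp(-0.5 * ((x - mu) / sigma) ** 2) + b

def fwhm_from_profile(x_mm, counts):
    """Spatial resolution as the FWHM of a line-source profile obtained
    from a Gaussian fit; FWHM = 2 * sqrt(2 * ln 2) * sigma."""
    p0 = [counts.max() - counts.min(), x_mm[np.argmax(counts)], 5.0, counts.min()]
    (a, mu, sigma, b), _ = curve_fit(gaussian, x_mm, counts, p0=p0)
    return 2 * np.sqrt(2 * np.log(2)) * abs(sigma)

# Toy line-source profile with Poisson noise; 2.4 mm pixel pitch (invented).
rng = np.random.default_rng(6)
x = np.arange(-40, 40, 2.4)
truth = gaussian(x, 1000, 0.0, 6.0, 20)      # sigma 6 mm -> FWHM ~14.1 mm
counts = rng.poisson(truth).astype(float)
print(f"fitted FWHM: {fwhm_from_profile(x, counts):.1f} mm")
```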

  14. A new method to evaluate human-robot system performance.

    PubMed

    Rodriguez, G; Weisbin, C R

    2003-01-01

    One of the key issues in space exploration is that of deciding what space tasks are best done with humans, with robots, or a suitable combination of each. In general, human and robot skills are complementary. Humans provide as yet unmatched capabilities to perceive, think, and act when faced with anomalies and unforeseen events, but there can be huge potential risks to human safety in getting these benefits. Robots provide complementary skills in being able to work in extremely risky environments, but their ability to perceive, think, and act by themselves is currently not error-free, although these capabilities are continually improving with the emergence of new technologies. Substantial past experience validates these generally qualitative notions. However, there is a need for more rigorously systematic evaluation of human and robot roles, in order to optimize the design and performance of human-robot system architectures using well-defined performance evaluation metrics. This article summarizes a new analytical method to conduct such quantitative evaluations. While the article focuses on evaluating human-robot systems, the method is generally applicable to a much broader class of systems whose performance needs to be evaluated.

  15. A new method to evaluate human-robot system performance

    NASA Technical Reports Server (NTRS)

    Rodriguez, G.; Weisbin, C. R.

    2003-01-01

    One of the key issues in space exploration is that of deciding what space tasks are best done with humans, with robots, or a suitable combination of each. In general, human and robot skills are complementary. Humans provide as yet unmatched capabilities to perceive, think, and act when faced with anomalies and unforeseen events, but there can be huge potential risks to human safety in getting these benefits. Robots provide complementary skills in being able to work in extremely risky environments, but their ability to perceive, think, and act by themselves is currently not error-free, although these capabilities are continually improving with the emergence of new technologies. Substantial past experience validates these generally qualitative notions. However, there is a need for more rigorously systematic evaluation of human and robot roles, in order to optimize the design and performance of human-robot system architectures using well-defined performance evaluation metrics. This article summarizes a new analytical method to conduct such quantitative evaluations. While the article focuses on evaluating human-robot systems, the method is generally applicable to a much broader class of systems whose performance needs to be evaluated.

  16. Quantitative image analysis for evaluating the abrasion resistance of nanoporous silica films on glass

    PubMed Central

    Nielsen, Karsten H.; Karlsson, Stefan; Limbach, Rene; Wondraczek, Lothar

    2015-01-01

    The abrasion resistance of coated glass surfaces is an important parameter for judging lifetime performance, but practical testing procedures remain overly simplistic and often do not allow for direct conclusions on real-world degradation. Here, we combine quantitative two-dimensional image analysis and mechanical abrasion into a facile tool for probing the abrasion resistance of anti-reflective (AR) coatings. We determine variations in the average coated area, during and after controlled abrasion. Through comparison with other experimental techniques, we show that this method provides a practical, rapid and versatile tool for the evaluation of the abrasion resistance of sol-gel-derived thin films on glass. The method yields informative data, which correlates with measurements of diffuse reflectance and is further supported by qualitative investigations through scanning electron microscopy. In particular, the method directly addresses degradation of coating performance, i.e., the gradual areal loss of antireflective functionality. As an exemplary subject, we studied the abrasion resistance of state-of-the-art nanoporous SiO2 thin films which were derived from 5–6 wt% aqueous solutions of potassium silicates, or from colloidal suspensions of SiO2 nanoparticles. It is shown how abrasion resistance is governed by coating density and film adhesion, defining the trade-off between optimal AR performance and acceptable mechanical performance. PMID:26656260

  17. Quantitative image analysis for evaluating the abrasion resistance of nanoporous silica films on glass

    NASA Astrophysics Data System (ADS)

    Nielsen, Karsten H.; Karlsson, Stefan; Limbach, Rene; Wondraczek, Lothar

    2015-12-01

    The abrasion resistance of coated glass surfaces is an important parameter for judging lifetime performance, but practical testing procedures remain overly simplistic and often do not allow for direct conclusions on real-world degradation. Here, we combine quantitative two-dimensional image analysis and mechanical abrasion into a facile tool for probing the abrasion resistance of anti-reflective (AR) coatings. We determine variations in the average coated area, during and after controlled abrasion. Through comparison with other experimental techniques, we show that this method provides a practical, rapid and versatile tool for the evaluation of the abrasion resistance of sol-gel-derived thin films on glass. The method yields informative data, which correlates with measurements of diffuse reflectance and is further supported by qualitative investigations through scanning electron microscopy. In particular, the method directly addresses degradation of coating performance, i.e., the gradual areal loss of antireflective functionality. As an exemplary subject, we studied the abrasion resistance of state-of-the-art nanoporous SiO2 thin films which were derived from 5-6 wt% aqueous solutions of potassium silicates, or from colloidal suspensions of SiO2 nanoparticles. It is shown how abrasion resistance is governed by coating density and film adhesion, defining the trade-off between optimal AR performance and acceptable mechanical performance.
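
    A minimal sketch of the paper's central quantity, the coated-area fraction, under the simplifying assumption that coated and abraded pixels separate by a fixed intensity threshold. The synthetic micrographs and wear model below are invented; the published method calibrates its image analysis far more carefully.

```python
import numpy as np

def coated_area_fraction(image, threshold):
    """Fraction of pixels still covered by the coating, from a grayscale
    micrograph in which coated (dark) and abraded (bright) regions differ
    in intensity. Threshold selection is the crux in practice; a fixed
    value is assumed here."""
    return float((np.asarray(image, float) < threshold).mean())

# Toy abrasion series: the dark coating is progressively worn away.
rng = np.random.default_rng(7)
base = rng.normal(60, 5, (256, 256))                  # intact coating intensity
for cycles in (0, 100, 500, 1000):
    img = base.copy()
    worn = rng.random((256, 256)) < cycles / 2000.0   # worn spots expose glass
    img[worn] = rng.normal(180, 8, worn.sum())
    frac = coated_area_fraction(img, threshold=120)
    print(f"{cycles:5d} abrasion cycles: {100 * frac:.1f}% of area still coated")
```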

  18. A quantitative evaluation of two methods for preserving hair samples

    USGS Publications Warehouse

    Roon, David A.; Waits, L.P.; Kendall, K.C.

    2003-01-01

    Hair samples are an increasingly important DNA source for wildlife studies, yet optimal storage methods and DNA degradation rates have not been rigorously evaluated. We tested amplification success rates over a one-year storage period for DNA extracted from brown bear (Ursus arctos) hair samples preserved using silica desiccation and freezing at −20 °C. For three nuclear DNA microsatellites, success rates decreased significantly after a six-month time point, regardless of storage method. For a 1000 bp mitochondrial fragment, a similar decrease occurred after a two-week time point. Minimizing delays between collection and DNA extraction will maximize success rates for hair-based noninvasive genetic sampling projects.

  19. Evaluating Performance Portability of OpenACC

    SciTech Connect

    Sabne, Amit J; Sakdhnagool, Putt; Lee, Seyong; Vetter, Jeffrey S

    2015-01-01

    Accelerator-based heterogeneous computing is gaining momentum in High Performance Computing arena. However, the increased complexity of the accelerator architectures demands more generic, high-level programming models. OpenACC is one such attempt to tackle the problem. While the abstraction endowed by OpenACC offers productivity, it raises questions on its portability. This paper evaluates the performance portability obtained by OpenACC on twelve OpenACC programs on NVIDIA CUDA, AMD GCN, and Intel MIC architectures. We study the effects of various compiler optimizations and OpenACC program settings on these architectures to provide insights into the achieved performance portability.

  1. Quantitative research on critical thinking and predicting nursing students' NCLEX-RN performance.

    PubMed

    Romeo, Elizabeth M

    2010-07-01

    The concept of critical thinking has been influential in several disciplines. Both education and nursing in general have been attempting to define, teach, and measure this concept for decades. Nurse educators realize that critical thinking is the cornerstone of the objectives and goals for nursing students. The purpose of this article is to review and analyze quantitative research findings relevant to the measurement of critical thinking abilities and skills in undergraduate nursing students and the usefulness of critical thinking as a predictor of National Council Licensure Examination-Registered Nurse (NCLEX-RN) performance. The specific issues that this integrative review examined include assessment and analysis of the theoretical and operational definitions of critical thinking, theoretical frameworks used to guide the studies, instruments used to evaluate critical thinking skills and abilities, and the role of critical thinking as a predictor of NCLEX-RN outcomes. A list of key assumptions related to critical thinking was formulated. The limitations and gaps in the literature were identified, as well as the types of future research needed in this arena.

  2. Performance of an automatic quantitative ultrasound analysis of the fetal lung to predict fetal lung maturity.

    PubMed

    Palacio, Montse; Cobo, Teresa; Martínez-Terrón, Mònica; Rattá, Giuseppe A; Bonet-Carné, Elisenda; Amat-Roldán, Ivan; Gratacós, Eduard

    2012-12-01

    The objective of the study was to evaluate the performance of the automatic quantitative ultrasound analysis (AQUA) texture extractor in predicting fetal lung maturity test results in amniotic fluid. Singleton pregnancies (24.0-41.0 weeks) undergoing amniocentesis to assess fetal lung maturity (TDx fetal lung maturity assay [FLM]) were included. A manually delineated box was placed in the lung area of a 4-chamber view of the fetal thorax. AQUA transformed the information into a set of descriptors. Genetic algorithms extracted the most relevant descriptors and then created and validated a model that could distinguish between mature and immature fetal lungs using TDx-FLM as a reference. Mean (SD) gestational age at enrollment was 32.2 (4.5) weeks. According to the TDx-FLM results, 41 samples were mature and 62 were not. The imaging biomarker based on AQUA presented a sensitivity of 95.1%, specificity of 85.7%, and accuracy of 90.3% in predicting a mature or immature lung. Fetal lung ultrasound textures extracted by AQUA provided robust features to predict TDx-FLM results. Copyright © 2012 Mosby, Inc. All rights reserved.

  3. Mechanism of variable structural colour in the neon tetra: quantitative evaluation of the Venetian blind model

    PubMed Central

    Yoshioka, S.; Matsuhana, B.; Tanaka, S.; Inouye, Y.; Oshima, N.; Kinoshita, S.

    2011-01-01

    The structural colour of the neon tetra is distinguishable from those of, e.g., butterfly wings and bird feathers, because it can change in response to the light intensity of the surrounding environment. This fact clearly indicates the variability of the colour-producing microstructures. It has been known that an iridophore of the neon tetra contains a few stacks of periodically arranged light-reflecting platelets, which can cause multilayer optical interference phenomena. As a mechanism of the colour variability, the Venetian blind model has been proposed, in which the light-reflecting platelets are assumed to be tilted during colour change, resulting in a variation in the spacing between the platelets. In order to quantitatively evaluate the validity of this model, we have performed a detailed optical study of a single stack of platelets inside an iridophore. In particular, we have prepared a new optical system that can simultaneously measure both the spectrum and direction of the reflected light, which are expected to be closely related to each other in the Venetian blind model. The experimental results and detailed analysis are found to quantitatively verify the model. PMID:20554565
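
    The optics behind the model can be illustrated with the ideal multilayer interference condition, lambda = 2(n_p d_p + n_c d_c) at normal incidence. A toy Python calculation, assuming that tilting the platelets by phi shrinks the effective cytoplasm gap roughly as cos(phi); the refractive indices are textbook values for guanine and cytoplasm, and the thicknesses are purely illustrative:

      import math

      # Toy model: ideal multilayer reflection peak for guanine platelets in
      # cytoplasm. Indices are textbook values; thicknesses are illustrative,
      # and the cos(phi) gap shrinkage is a simplification of the model.
      N_GUANINE, N_CYTOPLASM = 1.83, 1.33
      D_PLATELET, D_GAP = 0.005, 0.180   # micrometres (illustrative)

      def peak_wavelength_nm(tilt_deg):
          gap = D_GAP * math.cos(math.radians(tilt_deg))
          return 1000 * 2 * (N_GUANINE * D_PLATELET + N_CYTOPLASM * gap)

      for tilt in (0, 15, 30):
          print(f"tilt {tilt:2d} deg -> peak ~{peak_wavelength_nm(tilt):.0f} nm")

    Under these assumptions the peak shifts as the platelets tilt; the sign and magnitude in the real iridophore depend on the geometry actually measured in the paper.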

  4. Mechanism of variable structural colour in the neon tetra: quantitative evaluation of the Venetian blind model.

    PubMed

    Yoshioka, S; Matsuhana, B; Tanaka, S; Inouye, Y; Oshima, N; Kinoshita, S

    2011-01-06

    The structural colour of the neon tetra is distinguishable from those of, e.g., butterfly wings and bird feathers, because it can change in response to the light intensity of the surrounding environment. This fact clearly indicates the variability of the colour-producing microstructures. It has been known that an iridophore of the neon tetra contains a few stacks of periodically arranged light-reflecting platelets, which can cause multilayer optical interference phenomena. As a mechanism of the colour variability, the Venetian blind model has been proposed, in which the light-reflecting platelets are assumed to be tilted during colour change, resulting in a variation in the spacing between the platelets. In order to quantitatively evaluate the validity of this model, we have performed a detailed optical study of a single stack of platelets inside an iridophore. In particular, we have prepared a new optical system that can simultaneously measure both the spectrum and direction of the reflected light, which are expected to be closely related to each other in the Venetian blind model. The experimental results and detailed analysis are found to quantitatively verify the model.

  5. 48 CFR 436.604 - Performance evaluation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Service 436.604... require a performance evaluation report on the work done by the architect-engineer after the completion of or during the construction of the designed project....

  6. GENERAL METHODS FOR REMEDIAL PERFORMANCE EVALUATIONS

    EPA Science Inventory

    This document was developed by an EPA-funded project to explain technical considerations and principles necessary to evaluate the performance of ground-water contamination remediations at hazardous waste sites. This is neither a "cookbook", nor an encyclopedia of recommended fi...

  7. Performance Evaluation of a Semantic Perception Classifier

    DTIC Science & Technology

    2013-09-01

    Performance Evaluation of a Semantic Perception Classifier, by Craig Lennon, Barry Bodt, and Marshal Childers; Rick Camden and Nicoleta Florea, Engility Corporation; Luis Navarro-Serment and Arne Suppe, Carnegie Mellon University.

  8. Performance evaluation of an air solar collector

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Indoor tests on a single-glazed flat-plate collector are described in the report. The Marshall Space Flight Center solar simulator was used to make the tests, which included evaluation of thermal performance under various combinations of flow rate, incident flux, inlet temperature, and wind speed. Results are presented in graph/table form.

  9. ASBESTOS IN DRINKING WATER PERFORMANCE EVALUATION STUDIES

    EPA Science Inventory

    Performance evaluations of laboratories testing for asbestos in drinking water according to USEPA Test Method 100.1 or 100.2 are complicated by the difficulty of providing stable sample dispersions of asbestos in water. Reference samples of a graduated series of chrysotile asbes...

  10. ASBESTOS IN DRINKING WATER PERFORMANCE EVALUATION STUDIES

    EPA Science Inventory

    Performance evaluations of laboratories testing for asbestos in drinking water according to USEPA Test Method 100.1 or 100.2 are complicated by the difficulty of providing stable sample dispersions of asbestos in water. Reference samples of a graduated series of chrysotile asbest...

  11. EVALUATION OF CONFOCAL MICROSCOPY SYSTEM PERFORMANCE

    EPA Science Inventory

    BACKGROUND. The confocal laser scanning microscope (CLSM) has enormous potential in many biological fields. Currently there is a subjective nature in the assessment of a confocal microscope's performance by primarily evaluating the system with a specific test slide provided by ea...

  12. The Performance Evaluation of Corporate Universities

    ERIC Educational Resources Information Center

    Cappiello, Giuseppe; Pedrini, Giulio

    2017-01-01

    The aim of this paper is to illustrate the phenomenon of corporate universities from the perspective of the evaluation of their performance. Corporate universities have a hybrid nature that can be referred to both as a business unit and as a higher education institution. Having reviewed the literature on corporate universities and performance…

  13. Performance Evaluation for Non-Teaching Professionals.

    ERIC Educational Resources Information Center

    Panebianco, Anthony F.

    The program Performance Evaluation for Non-Teaching Professionals at the State University of New York Institute of Technology at Utica/Rome provides periodic assessments as required by institutional policy. The system is intended to establish a standard for judging quality of an employee's work and a rational and uniform basis for appraising…

  14. Performance evaluation of lightweight piezocomposite curved actuator

    NASA Astrophysics Data System (ADS)

    Goo, Nam Seo; Kim, Cheol; Park, Hoon C.; Yoon, Kwang J.

    2001-07-01

    A numerical method for the performance evaluation of LIPCA actuators is proposed using a finite element method. Fully coupled formulations for piezoelectric materials are introduced and eight-node incompatible elements are used. After verifying the developed code, the behavior of LIPCA actuators is investigated.

  15. GENERAL METHODS FOR REMEDIAL PERFORMANCE EVALUATIONS

    EPA Science Inventory

    This document was developed by an EPA-funded project to explain technical considerations and principles necessary to evaluate the performance of ground-water contamination remediations at hazardous waste sites. This is neither a "cookbook", nor an encyclopedia of recommended fi...

  16. Measures of Searcher Performance: A Psychometric Evaluation.

    ERIC Educational Resources Information Center

    Wildemuth, Barbara M.; And Others

    1993-01-01

    Describes a study of medical students that was conducted to evaluate measures of performance on factual searches of INQUIRER, a full-text database in microbiology. Measures relating to recall, precision, search term overlap, and efficiency are discussed; reliability and construct validity are considered; and implications for future research are…

  17. 48 CFR 36.604 - Performance evaluation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Performance evaluation. 36.604 Section 36.604 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 36.604...

  18. 48 CFR 36.604 - Performance evaluation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 1 2014-10-01 2014-10-01 false Performance evaluation. 36.604 Section 36.604 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 36.604...

  19. 48 CFR 236.604 - Performance evaluation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 3 2011-10-01 2011-10-01 false Performance evaluation. 236.604 Section 236.604 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  20. 48 CFR 436.604 - Performance evaluation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 4 2014-10-01 2014-10-01 false Performance evaluation. 436.604 Section 436.604 Federal Acquisition Regulations System DEPARTMENT OF AGRICULTURE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Service 436.604...

  1. 48 CFR 236.604 - Performance evaluation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 3 2013-10-01 2013-10-01 false Performance evaluation. 236.604 Section 236.604 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  2. 48 CFR 436.604 - Performance evaluation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 4 2013-10-01 2013-10-01 false Performance evaluation. 436.604 Section 436.604 Federal Acquisition Regulations System DEPARTMENT OF AGRICULTURE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Service 436.604...

  3. 48 CFR 2936.604 - Performance evaluation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 7 2013-10-01 2012-10-01 true Performance evaluation. 2936.604 Section 2936.604 Federal Acquisition Regulations System DEPARTMENT OF LABOR GENERAL CONTRACTING REQUIREMENTS CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 2936.604...

  4. 48 CFR 236.604 - Performance evaluation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 3 2012-10-01 2012-10-01 false Performance evaluation. 236.604 Section 236.604 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  5. 48 CFR 236.604 - Performance evaluation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 3 2014-10-01 2014-10-01 false Performance evaluation. 236.604 Section 236.604 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  6. 48 CFR 36.604 - Performance evaluation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 1 2013-10-01 2013-10-01 false Performance evaluation. 36.604 Section 36.604 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 36.604...

  7. 48 CFR 436.604 - Performance evaluation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 4 2011-10-01 2011-10-01 false Performance evaluation. 436.604 Section 436.604 Federal Acquisition Regulations System DEPARTMENT OF AGRICULTURE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Service 436.604...

  8. 48 CFR 36.604 - Performance evaluation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 1 2012-10-01 2012-10-01 false Performance evaluation. 36.604 Section 36.604 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 36.604...

  9. 48 CFR 436.604 - Performance evaluation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 4 2012-10-01 2012-10-01 false Performance evaluation. 436.604 Section 436.604 Federal Acquisition Regulations System DEPARTMENT OF AGRICULTURE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Service 436.604...

  10. 48 CFR 2936.604 - Performance evaluation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 7 2014-10-01 2014-10-01 false Performance evaluation. 2936.604 Section 2936.604 Federal Acquisition Regulations System DEPARTMENT OF LABOR GENERAL CONTRACTING REQUIREMENTS CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 2936.604...

  11. EVALUATION OF CONFOCAL MICROSCOPY SYSTEM PERFORMANCE

    EPA Science Inventory

    BACKGROUND. The confocal laser scanning microscope (CLSM) has enormous potential in many biological fields. Currently there is a subjective nature in the assessment of a confocal microscope's performance by primarily evaluating the system with a specific test slide provided by ea...

  12. Autonomous Road Driving Arenas for Performance Evaluation

    DTIC Science & Technology

    2004-08-01

    The development of performance metrics is critical in the evaluation and advancement of intelligent systems. Obtaining the pinnacle of intelligence...in autonomous vehicles requires evolutionary standards and community support. In order to analyze and compare competing implementations of intelligent...systems, the critical components of the system must be decoupled to facilitate repeatable trials that target specific aspects of the system’s overall

  13. Optical Storage Performance Modeling and Evaluation.

    ERIC Educational Resources Information Center

    Behera, Bailochan; Singh, Harpreet

    1990-01-01

    Evaluates different types of storage media for long-term archival storage of large amounts of data. Existing storage media are reviewed, including optical disks, optical tape, magnetic storage, and microfilm; three models are proposed based on document storage requirements; performance analysis is considered; and cost effectiveness is discussed.…

  14. Quantitative Evaluation of the Reticuloendothelial System Function with Dynamic MRI

    PubMed Central

    Liu, Ting; Choi, Hoon; Zhou, Rong; Chen, I-Wei

    2014-01-01

    Purpose To evaluate the reticuloendothelial system (RES) function by real-time imaging of blood clearance as well as hepatic uptake of superparamagnetic iron oxide nanoparticles (SPIO) using dynamic magnetic resonance imaging (MRI) with two-compartment pharmacokinetic modeling. Materials and Methods Kinetics of blood clearance and hepatic accumulation were recorded in young adult male 01b74 athymic nude mice by dynamic T2*-weighted MRI after the injection of different doses of SPIO nanoparticles (0.5, 3, or 10 mg Fe/kg). The association parameter, Kin, dissociation parameter, Kout, and elimination constant, Ke, derived from the dynamic data with a two-compartment model, were used to describe active binding to Kupffer cells and extrahepatic clearance. Clodrosome and liposome treatments were utilized to deplete macrophages and to block the RES function, respectively, in order to evaluate the capability of the kinetic parameters for investigating macrophage function and density. Results The two-compartment model provided a good description for all data and showed a low sum squared residual for all mice (0.27±0.03). A lower Kin, a lower Kout, and a lower Ke were found after clodrosome treatment, whereas a lower Kin, a higher Kout, and a lower Ke were observed after liposome treatment in comparison to saline treatment (P<0.005). Conclusion Dynamic SPIO-enhanced MR imaging with two-compartment modeling can provide information on RES function at both a cell number and receptor function level. PMID:25090653
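
    A minimal sketch of the modeling idea in Python, assuming a generic linear two-compartment form; the abstract does not give the paper's exact equations, so the structure and rate constants below are illustrative:

      import numpy as np
      from scipy.integrate import odeint

      # Generic linear two-compartment form (illustrative, not the paper's
      # exact equations): blood loses tracer to the liver (Kin) and to
      # extrahepatic elimination (Ke); the liver releases it back (Kout).
      def model(y, t, kin, kout, ke):
          blood, liver = y
          return [-(kin + ke) * blood + kout * liver,
                  kin * blood - kout * liver]

      t = np.linspace(0, 60, 121)                       # minutes
      curves = odeint(model, [1.0, 0.0], t, args=(0.15, 0.02, 0.01))
      blood, liver = curves.T
      print(f"blood fraction remaining at 60 min: {blood[-1]:.2f}")

    Fitting Kin, Kout, and Ke to the signal-derived concentration curves would then be a least-squares problem, e.g. via scipy.optimize.curve_fit.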

  15. Object-oriented fault tree evaluation program for quantitative analyses

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, F. A.; Koen, B. V.

    1988-01-01

    Object-oriented programming can be combined with fault tree techniques to give a significantly improved environment for evaluating the safety and reliability of large complex systems for space missions. Deep knowledge about system components and interactions, available from reliability studies and other sources, can be described using objects that make up a knowledge base. This knowledge base can be interrogated throughout the design process, during system testing, and during operation, and can be easily modified to reflect design changes in order to maintain a consistent information source. An object-oriented environment for reliability assessment has been developed on a Texas Instruments (TI) Explorer LISP workstation. The program, which directly evaluates system fault trees, utilizes the object-oriented extension to LISP called Flavors that is available on the Explorer. The object representation of a fault tree facilitates the storage and retrieval of information associated with each event in the tree, including tree structural information and intermediate results obtained during the tree reduction process. Reliability data associated with each basic event are stored in the fault tree objects. The object-oriented environment on the Explorer also includes a graphical tree editor which was modified to display and edit the fault trees.
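
    A minimal sketch of the same idea in Python rather than Flavors/LISP: events and gates are objects that carry their own reliability data and evaluate themselves, assuming independent basic events (the names and probabilities are hypothetical):

      # Events and gates are objects that carry their own reliability data
      # and evaluate themselves; probabilities assume independent events.
      class BasicEvent:
          def __init__(self, name, prob):
              self.name, self.prob = name, prob

          def probability(self):
              return self.prob

      class Gate:
          def __init__(self, name, kind, children):
              self.name, self.kind, self.children = name, kind, children

          def probability(self):
              ps = [c.probability() for c in self.children]
              prod = 1.0
              if self.kind == "AND":            # all children must fail
                  for p in ps:
                      prod *= p
                  return prod
              for p in ps:                      # OR: at least one child fails
                  prod *= (1.0 - p)
              return 1.0 - prod

      top = Gate("system-loss", "OR", [
          BasicEvent("pump-fails", 1e-3),
          Gate("both-valves-fail", "AND",
               [BasicEvent("valve-A", 2e-2), BasicEvent("valve-B", 2e-2)]),
      ])
      print(f"top event probability: {top.probability():.3e}")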

  16. A statistical model for assessing performance standards for quantitative and semiquantitative disinfectant test methods.

    PubMed

    Parker, Albert E; Hamilton, Martin A; Tomasino, Stephen F

    2014-01-01

    A performance standard for a disinfectant test method can be evaluated by quantifying the (Type I) pass-error rate for ineffective products and the (Type II) fail-error rate for highly effective products. This paper shows how to calculate these error rates for test methods where the log reduction in a microbial population is used as a measure of antimicrobial efficacy. The calculations can be used to assess performance standards that may require multiple tests of multiple microbes at multiple laboratories. Notably, the error rates account for among-laboratory variance of the log reductions estimated from a multilaboratory data set and the correlation among tests of different microbes conducted in the same laboratory. Performance standards that require a disinfectant product to pass all tests, or multiple tests on average, are considered. The proposed statistical methodology is flexible and allows for a different acceptable outcome for each microbe tested, since, for example, variability may be different for different microbes. The approach can also be applied to semiquantitative methods for which product efficacy is reported as the number of positive carriers out of a treated set and the density of the microbes on control carriers is quantified, thereby allowing a log reduction to be calculated. Therefore, using the approach described in this paper, the error rates can also be calculated for semiquantitative method performance standards specified solely in terms of the maximum allowable number of positive carriers per test. The calculations are demonstrated in a case study of the current performance standard for the semiquantitative AOAC Use-Dilution Methods for Pseudomonas aeruginosa (964.02) and Staphylococcus aureus (955.15), which allow up to one positive carrier out of a set of 60 inoculated and treated carriers in each test. A simulation study was also conducted to verify the validity of the model's assumptions and accuracy. Our approach, easily implemented
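
    The error-rate idea can be sketched by simulation. A minimal Python example estimating the (Type I) pass-error rate of an ineffective product under a hypothetical standard requiring a log reduction of at least 5 in each of three tests at one laboratory; the 4.5 "ineffective" mean and the variance components are illustrative, and the shared lab effect induces the within-laboratory correlation the paper accounts for:

      import numpy as np

      # Hypothetical standard: log reduction >= 5 in each of three tests at
      # one laboratory. Mean and variance components are illustrative; the
      # shared lab effect correlates tests done in the same laboratory.
      rng = np.random.default_rng(0)
      n_sim, mu_true = 100_000, 4.5
      sd_lab, sd_test = 0.4, 0.5

      lab = rng.normal(0, sd_lab, n_sim)                 # one draw per lab
      tests = mu_true + lab[:, None] + rng.normal(0, sd_test, (n_sim, 3))
      pass_all = (tests >= 5.0).all(axis=1)
      print(f"estimated Type I pass-error rate: {pass_all.mean():.3f}")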

  17. Performance Evaluation of Dense Gas Dispersion Models.

    NASA Astrophysics Data System (ADS)

    Touma, Jawad S.; Cox, William M.; Thistle, Harold; Zapert, James G.

    1995-03-01

    This paper summarizes the results of a study to evaluate the performance of seven dense gas dispersion models using data from three field experiments. Two models (DEGADIS and SLAB) are in the public domain and the other five (AIRTOX, CHARM, FOCUS, SAFEMODE, and TRACE) are proprietary. The field data used are the Desert Tortoise pressurized ammonia releases, Burro liquefied natural gas spill tests, and the Goldfish anhydrous hydrofluoric acid spill experiments. Desert Tortoise and Goldfish releases were simulated as horizontal jet releases, and Burro as a liquid pool. Performance statistics were used to compare maximum observed concentrations and plume half-width to those predicted by each model. Model performance varied and no model exhibited consistently good performance across all three databases. However, when combined across the three databases, all models performed within a factor of 2. Problems encountered are discussed in order to help future investigators.
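
    The "factor of 2" summary corresponds to a standard model-evaluation statistic, FAC2, the fraction of predictions within a factor of two of the observations. A minimal sketch with illustrative values:

      import numpy as np

      def fac2(observed, predicted):
          # Fraction of predictions within a factor of two of observations.
          ratio = np.asarray(predicted) / np.asarray(observed)
          return np.mean((ratio >= 0.5) & (ratio <= 2.0))

      obs = np.array([1.0, 2.5, 4.0, 8.0])     # illustrative concentrations
      pred = np.array([1.4, 1.1, 6.5, 9.0])
      print(f"FAC2 = {fac2(obs, pred):.2f}")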

  18. The quantitative evaluation of the Clinical and Translational Science Awards (CTSA) program based on science mapping and scientometric analysis.

    PubMed

    Zhang, Yin; Wang, Lei; Diao, Tianxi

    2013-12-01

    The Clinical and Translational Science Awards (CTSA) program is one of the most important initiatives in translational medical funding. The quantitative evaluation of the efficiency and performance of the CTSA program provides an important reference for decision making in global translational medical funding. Using science mapping and scientometric analytic tools, this study quantitatively analyzed the scientific articles funded by the CTSA program. The results of the study showed that the quantitative productivity of the CTSA program has increased steadily since 2008. In addition, the emerging trends of the research funded by the CTSA program covered clinical and basic medical research fields. The academic benefits of the CTSA program included helping its members build a robust academic home for clinical and translational science and attract additional financial support. This study provided a quantitative evaluation of the CTSA program based on science mapping and scientometric analysis. Further research is required to compare and optimize other quantitative methods and to integrate various research results.

  19. Performance evaluation of two personal bioaerosol samplers.

    PubMed

    Tolchinsky, Alexander D; Sigaev, Vladimir I; Varfolomeev, Alexander N; Uspenskaya, Svetlana N; Cheng, Yung S; Su, Wei-Chung

    2011-01-01

    In this study, the performance of two newly developed personal bioaerosol samplers for monitoring the level of environmental and occupational airborne microorganisms was evaluated. These new personal bioaerosol samplers were designed based on a swirling cyclone with a recirculating liquid film. The performance evaluation included collection efficiency tests using inert aerosols, a bioaerosol survival test using viable airborne microorganisms, and an evaluation of using a non-aqueous collection liquid for long-period sampling. The test results showed that these two newly developed personal bioaerosol samplers are capable of high-efficiency aerosol sampling (the cutoff diameters are around 0.7 μm for both samplers) and provide acceptable survival for the collected bioaerosols. By using an appropriate non-aqueous collection liquid, these two personal bioaerosol samplers should be able to permit continuous, long-period bioaerosol sampling with considerable viability for the captured bioaerosols.

  20. DAPAR & ProStaR: software to perform statistical analyses in quantitative discovery proteomics.

    PubMed

    Wieczorek, Samuel; Combes, Florence; Lazar, Cosmin; Giai Gianetto, Quentin; Gatto, Laurent; Dorffer, Alexia; Hesse, Anne-Marie; Couté, Yohann; Ferro, Myriam; Bruley, Christophe; Burger, Thomas

    2017-01-01

    DAPAR and ProStaR are software tools to perform the statistical analysis of label-free XIC-based quantitative discovery proteomics experiments. DAPAR contains procedures to filter, normalize, impute missing values, aggregate peptide intensities, perform null hypothesis significance tests, and select the most likely differentially abundant proteins with a corresponding false discovery rate. ProStaR is a graphical user interface that allows friendly access to the DAPAR functionalities through a web browser.
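
    DAPAR and ProStaR themselves are distributed for the R environment; the Python sketch below only illustrates, on simulated intensities, the core steps the abstract lists (log-transform, normalize, test each protein, control the false discovery rate), with a Welch t-test and Benjamini-Hochberg adjustment as stand-ins for DAPAR's procedures:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      a = rng.lognormal(10, 1, (200, 3))       # condition A, 3 replicates
      b = rng.lognormal(10, 1, (200, 3))       # condition B
      b[:20] *= 4                              # 20 truly changed proteins

      la, lb = np.log2(a), np.log2(b)          # log-transform
      la -= np.median(la, axis=0)              # median normalization
      lb -= np.median(lb, axis=0)
      p = stats.ttest_ind(la, lb, axis=1, equal_var=False).pvalue

      # Benjamini-Hochberg adjustment.
      order = np.argsort(p)
      raw = p[order] * len(p) / np.arange(1, len(p) + 1)
      q = np.empty_like(p)
      q[order] = np.minimum.accumulate(raw[::-1])[::-1]
      print(f"proteins at FDR < 0.05: {(q < 0.05).sum()}")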

  1. Quantitative evaluation of the clinical effects of S-adenosylmethionine on mood and behavior in Lesch-Nyhan patients.

    PubMed

    Dolcetta, Diego; Parmigiani, Pietro; Salmaso, Luigi; Bernardelle, Roberta; Cesari, Ugo; Andrighetto, Gilberto; Baschirotto, Giuseppe; Nyhan, William L; Hladnik, Uros

    2013-01-01

    BACKGROUND, RATIONALE, AND METHODS: Lesch-Nyhan disease is a rare, X-linked disorder due to hypoxanthine phosphoribosyltransferase deficiency. To evaluate the reported benefit on mood and behavior obtained by the administration of S-adenosyl-L-methionine in this condition, we developed two quantitative evaluation tools, the weekly questionnaire and the resistance to self-injurious behavior test, and used them to assess the effects of the drug in our population. We performed an open-label, dose-escalation trial of the drug on 14 patients. Four patients tolerated the drug and reported beneficial effects. The majority experienced worsened behavior. The two assessment tools demonstrated effectiveness in quantitatively evaluating the self-injurious behavior.

  2. Quantitative Ultrasonic Evaluation of Mechanical Properties of Engineering Materials

    NASA Technical Reports Server (NTRS)

    Vary, A.

    1978-01-01

    Progress in the application of ultrasonic techniques to nondestructive measurement of mechanical strength of engineering materials is reviewed. A dormant concept in nondestructive evaluation (NDE) is invoked. The availability of ultrasonic methods that can be applied to actual parts to assess their potential susceptibility to failure under design conditions is discussed. It was shown that ultrasonic methods yield measurements of elastic moduli, microstructure, hardness, fracture toughness, tensile strength, yield strength, and shear strength for a wide range of materials (including many types of metals, ceramics, and fiber composites). It was also indicated that although most of these methods were shown feasible in laboratory studies, more work is needed before they can be used on actual parts in processing, assembly, inspection, and maintenance lines.

  3. Quantitative evaluation of mechanosensing of cells on dynamically tunable hydrogels.

    PubMed

    Yoshikawa, Hiroshi Y; Rossetti, Fernanda F; Kaufmann, Stefan; Kaindl, Thomas; Madsen, Jeppe; Engel, Ulrike; Lewis, Andrew L; Armes, Steven P; Tanaka, Motomu

    2011-02-09

    Thin hydrogel films based on an ABA triblock copolymer gelator [where A is pH-sensitive poly(2-(diisopropylamino)ethyl methacrylate) (PDPA) and B is biocompatible poly(2-(methacryloyloxy)ethyl phosphorylcholine) (PMPC)] were used as a stimulus-responsive substrate that allows fine adjustment of the mechanical environment experienced by mouse myoblast cells. The hydrogel film elasticity could be reversibly modulated by a factor of 40 via careful pH adjustment without adversely affecting cell viability. Myoblast cells exhibited pronounced stress fiber formation and flattening on increasing the hydrogel elasticity. As a new tool to evaluate the strength of cell adhesion, we combined a picosecond laser with an inverted microscope and utilized the strong shock wave created by the laser pulse to determine the critical pressure required for cell detachment. Furthermore, we demonstrate that an abrupt jump in the hydrogel elasticity can be utilized to monitor how cells adapt their morphology to changes in their mechanical environment.

  4. Improved field experimental designs and quantitative evaluation of aquatic ecosystems

    SciTech Connect

    McKenzie, D.H.; Thomas, J.M.

    1984-05-01

    The paired-station concept and a log-transformed analysis of variance were used as methods to evaluate zooplankton density data collected during five years at an electrical generation station on Lake Michigan. To discuss the example and the field design necessary for a valid statistical analysis, considerable background is provided on the questions of selecting (1) sampling station pairs, (2) experimentwise error rates for multi-species analyses, (3) levels of Type I and II error rates, (4) procedures for conducting the field monitoring program, and (5) the consequences of violating statistical assumptions. Details for estimating sample sizes necessary to detect changes of a specified magnitude are included. Both statistical and biological problems with monitoring programs (as now conducted) are addressed; serial correlation of successive observations in the time series obtained was identified as one principal statistical difficulty. The procedure reduces this problem to a level where statistical methods can be used confidently. 27 references, 4 figures, 2 tables.
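
    The paired-station idea reduces, in its simplest form, to analyzing log ratios of densities between an impact station and its control, so that multiplicative site effects cancel. A minimal Python sketch with simulated densities (the serial-correlation caveat in the abstract still applies: successive monthly samples are not independent, so the nominal p-value is optimistic):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)
      control = rng.lognormal(mean=3.0, sigma=0.5, size=24)  # monthly samples
      impact = rng.lognormal(mean=2.8, sigma=0.5, size=24)

      log_ratio = np.log(impact) - np.log(control)   # paired differences
      t, p = stats.ttest_1samp(log_ratio, 0.0)
      print(f"mean log-ratio {log_ratio.mean():+.2f}, t = {t:.2f}, p = {p:.3f}")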

  5. Quantitative Evaluation of Strain Near Tooth Fillet by Image Processing

    NASA Astrophysics Data System (ADS)

    Masuyama, Tomoya; Yoshiizumi, Satoshi; Inoue, Katsumi

    The accurate measurement of strain and stress in a tooth is important for the reliable evaluation of the strength or life of gears. In this research, a strain measurement method based on image processing is applied to the analysis of strain near the tooth fillet. The loaded tooth is photographed using a CCD camera and stored as a digital image. The displacement of each point on the tooth flank is tracked by the cross-correlation method, and the strain is then calculated. The interrogation window size of the correlation method and the overlap amount affect the accuracy and resolution. In the case of measurements of structures with complicated profiles such as fillets, the interrogation window should be kept large and the overlap amount should be large. The surface condition also affects the accuracy. A white painted surface with small black particles is suitable for measurement.
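
    A minimal sketch of the underlying step, integer-pixel tracking of one interrogation window by normalized cross-correlation; strain would then follow from gradients of the displacement field over many windows:

      import numpy as np

      def track_window(ref, cur, top, left, size, search=5):
          # Find the integer-pixel displacement of one interrogation window.
          win = ref[top:top+size, left:left+size].astype(float)
          win = (win - win.mean()) / win.std()
          best, best_dv = -2.0, (0, 0)
          for dy in range(-search, search + 1):
              for dx in range(-search, search + 1):
                  cand = cur[top+dy:top+dy+size, left+dx:left+dx+size].astype(float)
                  cand = (cand - cand.mean()) / cand.std()
                  score = (win * cand).mean()   # normalized cross-correlation
                  if score > best:
                      best, best_dv = score, (dy, dx)
          return best_dv

      # Synthetic check: shift a random texture by (2, 1) pixels.
      rng = np.random.default_rng(3)
      ref = rng.random((64, 64))
      cur = np.roll(ref, shift=(2, 1), axis=(0, 1))
      print(track_window(ref, cur, top=20, left=20, size=16))   # -> (2, 1)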

  6. A Quantitative Evaluation of Medication Histories and Reconciliation by Discipline

    PubMed Central

    Stewart, Michael R.; Fogg, Sarah M.; Schminke, Brandon C.; Zackula, Rosalee E.; Nester, Tina M.; Eidem, Leslie A.; Rosendale, James C.; Ragan, Robert H.; Bond, Jack A.; Goertzen, Kreg W.

    2014-01-01

    Abstract Background/Objective: Medication reconciliation at transitions of care decreases medication errors, hospitalizations, and adverse drug events. We compared inpatient medication histories and reconciliation across disciplines and evaluated the nature of discrepancies. Methods: We conducted a prospective cohort study of patients admitted from the emergency department at our 760-bed hospital. Eligible patients had their medication histories conducted and reconciled in order by the admitting nurse (RN), certified pharmacy technician (CPhT), and pharmacist (RPh). Discharge medication reconciliation was not altered. Admission and discharge discrepancies were categorized by discipline, error type, and drug class and were assigned a criticality index score. A discrepancy rating system systematically measured discrepancies. Results: Of 175 consented patients, 153 were evaluated. Total admission and discharge discrepancies were 1,461 and 369, respectively. The average number of medications per participant at admission was 8.59 (1,314) with 9.41 (1,374) at discharge. Most discrepancies were committed by RNs: 53.2% (777) at admission and 56.1% (207) at discharge. The majority were omitted or incorrect. RNs had significantly higher admission discrepancy rates per medication (0.59) compared with CPhTs (0.36) and RPhs (0.16) (P < .001). RPhs corrected significantly more discrepancies per participant than RNs (6.39 vs 0.48; P < .001); average criticality index reduction was 79.0%. Estimated prevented adverse drug events (pADEs) cost savings were $589,744. Conclusions: RPhs committed the fewest discrepancies compared with RNs and CPhTs, resulting in more accurate medication histories and reconciliation. RPh involvement also prevented the greatest number of medication errors, contributing to considerable pADE-related cost savings. PMID:25477614

  7. Evaluation of quantitative image analysis criteria for the high-resolution microendoscopic detection of neoplasia in Barrett's esophagus

    NASA Astrophysics Data System (ADS)

    Muldoon, Timothy J.; Thekkek, Nadhi; Roblyer, Darren; Maru, Dipen; Harpaz, Noam; Potack, Jonathan; Anandasabapathy, Sharmila; Richards-Kortum, Rebecca

    2010-03-01

    Early detection of neoplasia in patients with Barrett's esophagus is essential to improve outcomes. The aim of this ex vivo study was to evaluate the ability of high-resolution microendoscopic imaging and quantitative image analysis to identify neoplastic lesions in patients with Barrett's esophagus. Nine patients with pathologically confirmed Barrett's esophagus underwent endoscopic examination with biopsies or endoscopic mucosal resection. Resected fresh tissue was imaged with fiber-bundle microendoscopy; images were analyzed by visual interpretation or by quantitative image analysis to predict whether the imaged sites were non-neoplastic or neoplastic. The best performing pair of quantitative features was chosen based on its ability to correctly classify the data into the two groups. Predictions were compared to the gold standard of histopathology. Subjective analysis of the images by expert clinicians achieved an average sensitivity and specificity of 87% and 61%, respectively. The best performing quantitative classification algorithm relied on two image textural features and achieved a sensitivity and specificity of 87% and 85%, respectively. This ex vivo pilot trial demonstrates that quantitative analysis of images obtained with a simple microendoscope system can distinguish neoplasia in Barrett's esophagus with good sensitivity and specificity when compared to histopathology and to subjective image interpretation.

  8. Quantitative evaluation of stone fragments in extracorporeal shock wave lithotripsy using a time reversal operator

    NASA Astrophysics Data System (ADS)

    Wang, Jen-Chieh; Zhou, Yufeng

    2017-03-01

    Extracorporeal shock wave lithotripsy (ESWL) has been widely used in the noninvasive treatment of kidney calculi. Fine fragments less than 2 mm in size can be discharged by urination, which determines the success of ESWL. Although ultrasonic and fluoroscopic imaging are used to localize the calculi, it is challenging to monitor the stone comminution progress, especially at the late stage of ESWL when fragments spread out as a cloud. The lack of real-time, quantitative evaluation makes this procedure semi-blind, resulting in either under- or over-treatment after the legal number of pulses required by the FDA. The time reversal operator (TRO) method has the ability to detect point-like scatterers, and the number of non-zero eigenvalues of the TRO is equal to the number of scatterers. In this study, the validity of the TRO method for identifying stones is illustrated with both numerical and experimental results for one to two stones with various sizes and locations. Furthermore, the parameters affecting the performance of the TRO method have also been investigated. Overall, the TRO method is effective in identifying the fragments in a stone cluster in real time. Further development of a detection system and evaluation of its performance both in vitro and in vivo during ESWL is necessary for application.
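
    The eigenvalue property is easy to demonstrate numerically. A minimal Python sketch of a DORT-style calculation, building a multistatic response matrix for two point scatterers and counting the significant eigenvalues of the time reversal operator K*K; the array geometry, frequency, and threshold are illustrative:

      import numpy as np

      c, f = 1500.0, 1.0e6                    # sound speed (m/s), frequency (Hz)
      k = 2 * np.pi * f / c
      elems = np.stack([np.linspace(-0.02, 0.02, 32), np.zeros(32)], axis=1)
      stones = np.array([[0.000, 0.060], [0.004, 0.055]])  # two "fragments" (m)

      def green(a, b):
          # Free-space Green's function between two points, constants dropped.
          r = np.linalg.norm(a - b)
          return np.exp(1j * k * r) / np.sqrt(r)

      G = np.array([[green(e, s) for s in stones] for e in elems])  # N x M
      K = G @ G.T                              # multistatic response matrix
      eig = np.linalg.eigvalsh(K.conj().T @ K) # time reversal operator spectrum
      sig = np.sort(eig)[::-1]
      print((sig > 0.01 * sig[0]).sum(), "significant eigenvalues")  # -> 2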

  9. Quantitative evaluation of fatty liver by computed tomography in rabbits

    SciTech Connect

    Kawata, R.; Sakata, K.; Kunieda, T.; Saji, S.; Doi, H.; Nozawa, Y.

    1984-04-01

    Biochemical, histologic, and computed tomographic (CT) examinations of the liver were performed in 32 rabbits in which fatty liver was induced by prolonged intravenous fat infusion. In the two groups of rabbits that received 2 and 4 g/kg/day of fat emulsion, respectively, a mild posttreatment reduction in CT value was observed. In the group that received 8 g/kg/day of fat emulsion, the posttreatment change in CT value was sufficient for a diagnosis of fatty liver of moderate degree. The reduction in CT value in fatty liver might be due largely to accumulation of triglyceride and cholesterol in the liver cells. Significant correlation was found between changes in CT value of the liver and the degree of histological fat accumulation in the liver cells. Consecutive measurement of CT values of the liver during prolonged intravenous hyperalimentation is a nonaggressive method of diagnosing fatty liver.

  10. PEAPOL (Program Evaluation at the Performance Objective Level) Outside Evaluation.

    ERIC Educational Resources Information Center

    Auvil, Mary S.

    In evaluating this pilot project, which developed a computer system for assessing student progress and cost effectiveness as related to achievement of performance objectives, interviews were conducted with project participants, including project staff, school administrators, and the auto shop instructors. Project documents were reviewed and a…

  11. An identity verifier evaluation of performance

    SciTech Connect

    Maxwell, R.L.

    1987-01-01

    Because the development of personnel identity verifiers is active in several areas, it is important that an independent comparative evaluation of such devices be continuously pursued so that the security industry can apply them. An evaluation of several verifiers was recently conducted (in the winter of 1986/1987) at Sandia National Laboratories. In a nonrigorous attempt to comparatively evaluate these verifiers in a field security environment, about 80 individuals were enrolled in five different verifiers. The enrollees were then encouraged to attempt a verification on each device several times a day for about four months, so that both single-try and multiple-try information could be extracted from the data. Results indicated a general improvement in verifier performance with regard to accuracy and operating time compared to previous similar evaluations of verifiers at Sandia.

  12. Quantitative evaluation of the tonic vibration reflex (TVR) in the masseter muscle.

    PubMed

    Takata, Y; Nakajima, T; Yamada, Y

    1996-11-01

    This study evaluated the efficacy of the tonic vibration reflex (TVR) elicited by high-frequency vibration in evaluating masticatory muscle excitability. The experiment was performed on 16 male adult volunteers, 20 to 45 years of age, without spontaneous pain or tenderness in the masticatory muscles. The subjects were seated in a chair in a fixed head position with the mouth kept open with a bite block. TVR was elicited by vibratory stimulation applied to the mandible (approximately 15 m/s2, 160 Hz). An electromyogram (EMG) was recorded bilaterally from the masseter muscles and analyzed quantitatively using an arbitrary index (TVR index) calculated from the response. Bite force was measured during clenching using a pressure-sensitive foil. Wide variations in the TVR index (maximum, 22.7%; minimum, 0.9%, average, 7.7%) were observed among individuals. The mean index for five subjects with a clenching habit was significantly higher than that for 11 subjects without a history of clenching. Tolperisone HCl (100 mg taken orally), a gamma-drive depressant, was found to reduce the response for 2 hours. There was a negative correlation (r = -.504, P < .05) between bite force and TVR index when the values on both sides were compared. The TVR may be of use in evaluating masseter muscle excitability.

  13. Development of the local magnification method for quantitative evaluation of endoscope geometric distortion

    NASA Astrophysics Data System (ADS)

    Wang, Quanzeng; Cheng, Wei-Chung; Suresh, Nitin; Hua, Hong

    2016-05-01

    With improved diagnostic capabilities and complex optical designs, endoscopic technologies are advancing. As one of several important optical performance characteristics, geometric distortion can negatively affect size estimation and feature-identification-related diagnosis. Therefore, a quantitative and simple distortion evaluation method is imperative for both the endoscopic industry and medical device regulatory agencies. However, no such method is available yet. While image correction techniques are rather mature, they depend heavily on computational power to process multidimensional image data based on complex mathematical models, which makes them difficult to understand. Some commonly used distortion evaluation methods, such as the picture height distortion (DPH) or radial distortion (DRAD), are either too simple to accurately describe the distortion or subject to the error of deriving a reference image. We developed the basic local magnification (ML) method to evaluate endoscope distortion. Based on this method, we also developed ways to calculate DPH and DRAD. The method overcomes the aforementioned limitations, has a clear physical meaning over the whole field of view, and can facilitate lesion size estimation during diagnosis. Most importantly, the method can help bring endoscopic technologies to market and could potentially be adopted in an international endoscope standard.
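
    For reference, the conventional picture height distortion that the paper derives from local magnification is a one-line formula: DPH = 100(H_d - H_p)/H_p, with H_d the distorted image height of the target edge and H_p the height a paraxial, distortion-free projection would give. A minimal sketch with illustrative values:

      def picture_height_distortion(h_distorted, h_paraxial):
          # DPH in percent; negative for barrel, positive for pincushion.
          return 100.0 * (h_distorted - h_paraxial) / h_paraxial

      print(f"DPH = {picture_height_distortion(9.1, 10.0):+.1f}%")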

  14. Quantitative evaluation of MPTP-treated nonhuman parkinsonian primates in the HALLWAY task.

    PubMed

    Campos-Romo, Aurelio; Ojeda-Flores, Rafael; Moreno-Briseño, Pablo; Fernandez-Ruiz, Juan

    2009-03-15

    Parkinson's disease (PD) is a progressive neurodegenerative disorder. An experimental model of this disease is produced in nonhuman primates by the administration of the neurotoxin 1-methyl-4-phenyl-1,2,3,6-tetrahydropyridine (MPTP). In this work, we put forward a new quantitative evaluation method that uses video recordings to measure the displacement, gait, and gross and fine motor performance of freely moving subjects. Four Vervet monkeys (Cercopithecus aethiops) were trained in a behavioral observation hallway while being recorded with digital video cameras from four different angles. After MPTP intoxication the animals were tested without any drug and after 30 and 90 min of Levodopa/Carbidopa administration. Using a personal computer the following behaviors were measured and evaluated from the video recordings: displacement time across the hallway, reaching time towards rewards, ingestion time, number of attempts to obtain rewards, number of rewards obtained, and level of the highest shelf reached for rewards. Our results show that there was an overall behavioral deterioration after MPTP administration and an overall improvement after Levodopa/Carbidopa treatment. This demonstrates that the HALLWAY task is a sensitive and objective method that allows detailed behavioral evaluation of freely moving monkeys in the MPTP Parkinson's disease model.

  15. Noninvasive Quantitative Evaluation of the Dentin Layer during Dental Procedures Using Optical Coherence Tomography

    PubMed Central

    Sinescu, Cosmin; Negrutiu, Meda Lavinia; Bradu, Adrian; Duma, Virgil-Florin; Podoleanu, Adrian Gh.

    2015-01-01

    A routine cavity preparation of a tooth may lead to opening the pulp chamber. The present study evaluates quantitatively, in real time, and for the first time to the best of our knowledge, the drilled cavities during dental procedures. An established noninvasive imaging technique, Optical Coherence Tomography (OCT), is used. The main aim is to prevent accidental openings of the dental pulp chamber. Six teeth with dental cavities were used in this ex vivo study. The real-time assessment of the distances between the bottom of the drilled cavities and the top of the pulp chamber was performed using an in-house assembled OCT system. The evaluation of the remaining dentin thickness (RDT) allowed for the positioning of the drilling tools in the cavities in relation to the pulp horns. Estimations of the safe and of the critical RDT were made; for the latter, the opening of the pulp chamber becomes unavoidable. Also, by following the fractures that can occur when the extent of the decay is too large, the dentist can decide upon the right therapy to follow, endodontic or conventional filling. The study demonstrates the usefulness of OCT imaging in guiding such evaluations during dental procedures. PMID:26078779

  16. Noninvasive Quantitative Evaluation of the Dentin Layer during Dental Procedures Using Optical Coherence Tomography.

    PubMed

    Sinescu, Cosmin; Negrutiu, Meda Lavinia; Bradu, Adrian; Duma, Virgil-Florin; Podoleanu, Adrian Gh

    2015-01-01

    A routine cavity preparation of a tooth may lead to opening the pulp chamber. The present study evaluates quantitatively, in real time, and for the first time to the best of our knowledge, the drilled cavities during dental procedures. An established noninvasive imaging technique, Optical Coherence Tomography (OCT), is used. The main aim is to prevent accidental openings of the dental pulp chamber. Six teeth with dental cavities were used in this ex vivo study. The real-time assessment of the distances between the bottom of the drilled cavities and the top of the pulp chamber was performed using an in-house assembled OCT system. The evaluation of the remaining dentin thickness (RDT) allowed for the positioning of the drilling tools in the cavities in relation to the pulp horns. Estimations of the safe and of the critical RDT were made; for the latter, the opening of the pulp chamber becomes unavoidable. Also, by following the fractures that can occur when the extent of the decay is too large, the dentist can decide upon the right therapy to follow, endodontic or conventional filling. The study demonstrates the usefulness of OCT imaging in guiding such evaluations during dental procedures.

  17. Simulation based quantitative evaluation for display uniformity in a directional backlight auto-stereoscopic display

    NASA Astrophysics Data System (ADS)

    He, Jieyong; Liang, Haowen; Zhang, Quanquan; Feng, Shirui; Wang, Jiahui; Zhou, Jianying

    2016-03-01

    In this article, we propose a quantitative evaluation of display uniformity in a directional backlight system. Display uniformity is divided into two research aspects: static uniformity and motional uniformity. Factors influencing uniformity deterioration are then discussed in our evaluation. Furthermore, a visualized simulation based on a ray-tracing model is proposed to analyze this display uniformity in quantitative depth. The optical distribution on the screen is obtained in this simulation to provide visualized results that are compared with the experimental results. Our work helps to fill a gap in the evaluation of display uniformity for directional backlight 3D displays.

  18. Behavioral patterns of environmental performance evaluation programs.

    PubMed

    Li, Wanxin; Mauerhofer, Volker

    2016-11-01

    During the past decades numerous environmental performance evaluation programs have been developed and implemented on different geographic scales. This paper develops a taxonomy of environmental management behavioral patterns in order to provide a practical comparison tool for environmental performance evaluation programs. Ten such programs, purposively selected, are mapped against the four identified behavioral patterns: diagnosis, negotiation, learning, and socialization and learning. Overall, we found that schemes which serve to diagnose environmental abnormalities are mainly externally imposed and have been developed as a result of technical debates concerning data sources, methodology, and ranking criteria. The learning-oriented scheme is characterized by processes through which free exchange of ideas and mutual, adaptive learning can occur. Schemes developed by a higher authority to influence the behavior of lower levels of government have been adopted by the evaluated parties to signal their excellent environmental performance. The socialization-and-learning evaluation schemes have incorporated dialogue, participation, and capacity building in program design. In conclusion we consider the 'fitness for purpose' of the various schemes, the merits of our analytical model, and the future possibilities of fostering capacity building in the realm of wicked environmental challenges. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Error Reduction Program. [combustor performance evaluation codes

    NASA Technical Reports Server (NTRS)

    Syed, S. A.; Chiappetta, L. M.; Gosman, A. D.

    1985-01-01

    The details of a study to select, incorporate, and evaluate the best available finite difference scheme to reduce numerical error in combustor performance evaluation codes are described. The combustor performance computer programs chosen were the two-dimensional and three-dimensional versions of Pratt & Whitney's TEACH code. The criteria used to select schemes required that the difference equations mirror the properties of the governing differential equation, be more accurate than the current hybrid difference scheme, be stable and economical, be compatible with TEACH codes, use only modest amounts of additional storage, and be relatively simple. The methods of assessment used in the selection process consisted of examination of the difference equation, evaluation of the properties of the coefficient matrix, Taylor series analysis, and performance on model problems. Five schemes from the literature and three schemes developed during the course of the study were evaluated. This effort resulted in the incorporation of a scheme in 3D-TEACH which is usually more accurate than the hybrid differencing method and never less accurate.

  20. Evaluation of the National Science Foundation's Local Course Improvement Program, Volume II: Quantitative Analyses.

    ERIC Educational Resources Information Center

    Kulik, James A.; And Others

    This report is the second of three volumes describing the results of the evaluation of the National Science Foundation (NSF) Local Course Improvement (LOCI) program. This volume describes the quantitative results of the program. Evaluation of the LOCI program involved answering questions in the areas of the need for science course improvement as…

  1. Performance evaluation of fingerprint verification systems.

    PubMed

    Cappelli, Raffaele; Maio, Dario; Maltoni, Davide; Wayman, James L; Jain, Anil K

    2006-01-01

    This paper is concerned with the performance evaluation of fingerprint verification systems. After an initial classification of biometric testing initiatives, we explore both the theoretical and practical issues related to performance evaluation by presenting the outcome of the recent Fingerprint Verification Competition (FVC2004). FVC2004 was organized by the authors of this work for the purpose of assessing the state-of-the-art in this challenging pattern recognition application and making available a new common benchmark for an unambiguous comparison of fingerprint-based biometric systems. FVC2004 is an independent, strongly supervised evaluation performed at the evaluators' site on evaluators' hardware. This allowed the test to be completely controlled and the computation times of different algorithms to be fairly compared. The experience and feedback received from previous, similar competitions (FVC2000 and FVC2002) allowed us to improve the organization and methodology of FVC2004 and to capture the attention of a significantly higher number of academic and commercial organizations (67 algorithms were submitted for FVC2004). A new, "Light" competition category was included to estimate the loss of matching performance caused by imposing computational constraints. This paper discusses data collection and testing protocols, and includes a detailed analysis of the results. We introduce a simple but effective method for comparing algorithms at the score level, allowing us to isolate difficult cases (images) and to study error correlations and algorithm "fusion." The huge amount of information obtained, including a structured classification of the submitted algorithms on the basis of their features, makes it possible to better understand how current fingerprint recognition systems work and to delineate useful research directions for the future.

  2. Blind Source Parameters for Performance Evaluation of Despeckling Filters

    PubMed Central

    Biradar, Nagashettappa; Dewal, M. L.; Rohit, ManojKumar; Gowre, Sanjaykumar; Gundge, Yogesh

    2016-01-01

    The speckle noise is inherent to transthoracic echocardiographic images. A standard noise-free reference echocardiographic image does not exist. The evaluation of filters based on the traditional parameters such as peak signal-to-noise ratio, mean square error, and structural similarity index may not reflect the true filter performance on echocardiographic images. Therefore, the performance of despeckling can be evaluated using blind assessment metrics like the speckle suppression index, speckle suppression and mean preservation index (SMPI), and beta metric. The need for noise-free reference image is overcome using these three parameters. This paper presents a comprehensive analysis and evaluation of eleven types of despeckling filters for echocardiographic images in terms of blind and traditional performance parameters along with clinical validation. The noise is effectively suppressed using the logarithmic neighborhood shrinkage (NeighShrink) embedded with Stein's unbiased risk estimation (SURE). The SMPI is three times more effective compared to the wavelet based generalized likelihood estimation approach. The quantitative evaluation and clinical validation reveal that the filters such as the nonlocal mean, posterior sampling based Bayesian estimation, hybrid median, and probabilistic patch based filters are acceptable whereas median, anisotropic diffusion, fuzzy, and Ripplet nonlinear approximation filters have limited applications for echocardiographic images. PMID:27298618
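
    Of the blind metrics named above, the speckle suppression index (SSI) is the simplest: the ratio of the coefficient of variation after filtering to that before, with values below 1 indicating suppression. A minimal Python sketch using simulated multiplicative speckle and a median filter as stand-ins (the gamma parameters and filter choice are illustrative, not the paper's filters):

      import numpy as np
      from scipy.ndimage import median_filter

      def ssi(noisy, filtered):
          cv = lambda img: img.std() / img.mean()   # coefficient of variation
          return cv(filtered) / cv(noisy)

      rng = np.random.default_rng(4)
      clean = np.full((128, 128), 100.0)
      noisy = clean * rng.gamma(shape=4.0, scale=0.25, size=clean.shape)
      print(f"SSI (5x5 median): {ssi(noisy, median_filter(noisy, size=5)):.2f}")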

  3. Evaluating Melanoma Drug Response and Therapeutic Escape with Quantitative Proteomics*

    PubMed Central

    Rebecca, Vito W.; Wood, Elizabeth; Fedorenko, Inna V.; Paraiso, Kim H. T.; Haarberg, H. Eirik; Chen, Yi; Xiang, Yun; Sarnaik, Amod; Gibney, Geoffrey T.; Sondak, Vernon K.; Koomen, John M.; Smalley, Keiran S. M.

    2014-01-01

    The evolution of cancer therapy into complex regimens with multiple drugs requires novel approaches for the development and evaluation of companion biomarkers. Liquid chromatography-multiple reaction monitoring mass spectrometry (LC-MRM) is a versatile platform for biomarker measurement. In this study, we describe the development and use of the LC-MRM platform to study the adaptive signaling responses of melanoma cells to inhibitors of HSP90 (XL888) and MEK (AZD6244). XL888 had good anti-tumor activity against NRAS mutant melanoma cell lines as well as BRAF mutant cells with acquired resistance to BRAF inhibitors both in vitro and in vivo. LC-MRM analysis showed HSP90 inhibition to be associated with decreased expression of multiple receptor tyrosine kinases, modules in the PI3K/AKT/mammalian target of rapamycin pathway, and the MAPK/CDK4 signaling axis in NRAS mutant melanoma cell lines and the inhibition of PI3K/AKT signaling in BRAF mutant melanoma xenografts with acquired vemurafenib resistance. The LC-MRM approach targeting more than 80 cancer signaling proteins was highly sensitive and could be applied to fine needle aspirates from xenografts and clinical melanoma specimens (using 50 μg of total protein). We further showed MEK inhibition to be associated with signaling through the NFκB and WNT signaling pathways, as well as increased receptor tyrosine kinase expression and activation. Validation studies identified PDGF receptor β signaling as a potential escape mechanism from MEK inhibition, which could be overcome through combined use of AZD6244 and the PDGF receptor inhibitor, crenolanib. Together, our studies show LC-MRM to have unique value as a platform for the systems level understanding of the molecular mechanisms of drug response and therapeutic escape. This work provides the proof-of-principle for the future development of LC-MRM assays for monitoring drug responses in the clinic. PMID:24760959

  4. Performance Evaluation of a Data Validation System

    NASA Technical Reports Server (NTRS)

    Wong, Edmond (Technical Monitor); Sowers, T. Shane; Santi, L. Michael; Bickford, Randall L.

    2005-01-01

    Online data validation is a performance-enhancing component of modern control and health management systems. It is essential that performance of the data validation system be verified prior to its use in a control and health management system. A new Data Qualification and Validation (DQV) Test-bed application was developed to provide a systematic test environment for this performance verification. The DQV Test-bed was used to evaluate a model-based data validation package known as the Data Quality Validation Studio (DQVS). DQVS was employed as the primary data validation component of a rocket engine health management (EHM) system developed under NASA's NGLT (Next Generation Launch Technology) program. In this paper, the DQVS and DQV Test-bed software applications are described, and the DQV Test-bed verification procedure for this EHM system application is presented. Test-bed results are summarized and implications for EHM system performance improvements are discussed.

  5. Cavitation performance evaluation for a condensate pump

    NASA Astrophysics Data System (ADS)

    Yu, A.; Yu, W. P.; Pan, Z. B.; Luo, X. W.; Ji, B.; Xu, H. Y.

    2013-12-01

    Cavitation in a condensate pump with a specific speed of 95 m·m3s-1·min-1 was treated in this study. Cavitation performance for the pump was tested experimentally, and the steady-state cavitating flows in the pump impeller were simulated by a RANS method with a homogeneous cavitation model. It is noted that the cavitating flow simulation reasonably depicted cavitation development in the pump. Compared to the tested results, however, the numerical simulation predicted the cavitation-induced performance drop at a later stage than observed. Unfortunately, the cavitation simulation at the operating condition of 50% best efficiency point could not predict the 3% head drop. By applying the concept of relative cavity length, cavitation performance evaluation is achieved. For better application, future study is necessary to establish the relation between relative cavity length and performance drop.

  6. Applying Quantitative Approaches to the Formative Evaluation of Antismoking Campaign Messages

    PubMed Central

    Parvanta, Sarah; Gibson, Laura; Forquer, Heather; Shapiro-Luft, Dina; Dean, Lorraine; Freres, Derek; Lerman, Caryn; Mallya, Giridhar; Moldovan-Johnson, Mihaela; Tan, Andy; Cappella, Joseph; Hornik, Robert

    2014-01-01

    This article shares an in-depth summary of a formative evaluation that used quantitative data to inform the development and selection of promotional ads for the antismoking communication component of a social marketing campaign. A foundational survey provided cross-sectional data to identify beliefs about quitting smoking that campaign messages should target, as well as beliefs to avoid. Pretesting draft ads against quantitative indicators of message effectiveness further facilitated the selection and rejection of final campaign ads. Finally, we consider lessons learned from the process of balancing quantitative methods and judgment to make formative decisions about more and less promising persuasive messages for campaigns. PMID:24817829

  7. Applying Quantitative Approaches to the Formative Evaluation of Antismoking Campaign Messages.

    PubMed

    Parvanta, Sarah; Gibson, Laura; Forquer, Heather; Shapiro-Luft, Dina; Dean, Lorraine; Freres, Derek; Lerman, Caryn; Mallya, Giridhar; Moldovan-Johnson, Mihaela; Tan, Andy; Cappella, Joseph; Hornik, Robert

    2013-12-01

    This article shares an in-depth summary of a formative evaluation that used quantitative data to inform the development and selection of promotional ads for the antismoking communication component of a social marketing campaign. A foundational survey provided cross-sectional data to identify beliefs about quitting smoking that campaign messages should target, as well as beliefs to avoid. Pretesting draft ads against quantitative indicators of message effectiveness further facilitated the selection and rejection of final campaign ads. Finally, we consider lessons learned from the process of balancing quantitative methods and judgment to make formative decisions about more and less promising persuasive messages for campaigns.

  8. [E-test quantitative determination for evaluating Neisseria gonorrhoeae resistance].

    PubMed

    Filipiuc, Silvia; Iancu, Luminiţa Smaranda

    2010-01-01

    Because of underreporting and difficulties of isolation, the antibiotic susceptibility profile of circulating N. gonorrhoeae strains is not sufficiently known in our country, including Suceava county. In addition, WHO experts recommend establishing MIC (minimum inhibitory concentration) values using E-test strips, completing the database at the European level. The aim was to determine the type of resistance of N. gonorrhoeae strains by E-test in patients with gonorrhoea in Suceava county in the 2009-2010 period. We tested the sensitivity of 32 strains of N. gonorrhoeae isolated using the classical algorithm and E-test strips according to the CLSI 2008 (Clinical and Laboratory Standards Institute) standard. We tested the sensitivity to penicillin, amoxicillin, Augmentin, clarithromycin, tetracycline, ceftriaxone, ciprofloxacin, and spectinomycin. Beta-lactamase production was determined using the API NH test (Neisseria-Haemophilus, bioMérieux). Of the strains, 96.9% were sensitive to ceftriaxone and spectinomycin; 10 strains each (31.2%) were resistant to penicillin and tetracycline; 34.5% of strains were sensitive to amoxicillin, 37.5% to ciprofloxacin, and 13/32 strains (40.6%) to Augmentin. Seven strains were beta-lactamase positive and sensitive to all antibiotics except penicillin and tetracycline. Our results, especially the low rate of sensitivity to penicillin and tetracycline (68.8%), were similar to reports from Asia, America, and Africa, including the Iaşi region. Using E-test strips, our results demonstrate for the first time in the studied area the level of resistance of N. gonorrhoeae, offering useful information for clinicians in order to treat patients with ceftriaxone and spectinomycin as empirical treatment, and with other antibiotics according to antibiogram results.

  9. A quantitative evaluation of dry-sensor electroencephalography

    NASA Astrophysics Data System (ADS)

    Uy, E. Timothy

    Neurologists, neuroscientists, and experimental psychologists study electrical activity within the brain by recording voltage fluctuations at the scalp. This is electroencephalography (EEG). In conventional or "wet" EEG, scalp abrasion and use of electrolytic paste are required to ensure good electrical connection between sensor and skin. Repeated abrasion quickly becomes irritating to subjects, severely limiting the number and frequency of sessions. Several groups have produced "dry" EEG sensors that do not require abrasion or conductive paste. These, in addition to sidestepping the issue of abrasion, promise to reduce setup time from about 30 minutes with a technician to less than 30 seconds without one. The availability of such an instrument would (1) reduce the cost of brain-related medical care, (2) lower the barrier of entry on brain experimentation, and (3) allow individual subjects to contribute substantially more data without fear of abrasion or fatigue. Accuracy of the EEG is paramount in the medical diagnosis of epilepsy, in experimental psychology, and in the burgeoning field of brain-computer interfaces. Without a sufficiently accurate measurement, the advantages of dry sensors remain a moot point. However, even after nearly a decade, demonstrations of dry EEG accuracy with respect to wet have been limited to visual comparison of short snippets of spontaneous EEG, averaged event-related potentials, or plots of power spectrum. In this dissertation, I propose a detailed methodology based on single-trial EEG classification for comparing dry EEG sensors to their wet counterparts. Applied to a set of commercially fabricated dry sensors, this work reveals that dry sensors can perform as well as their wet counterparts with careful screening and attention to the bandwidth of interest.
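    A single-trial comparison of the kind described could look like the following sketch: classify trials from wet and dry recordings separately and compare cross-validated accuracies. The feature set, classifier choice, and all data here are hypothetical placeholders, not the dissertation's pipeline.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    def single_trial_accuracy(features, labels):
        # Mean 5-fold cross-validated accuracy of a linear classifier.
        return cross_val_score(LinearDiscriminantAnalysis(), features, labels, cv=5).mean()

    rng = np.random.default_rng(0)
    wet = rng.normal(size=(200, 32))    # 200 trials x 32 band-power features (placeholder)
    dry = rng.normal(size=(200, 32))    # same trials recorded with dry sensors (placeholder)
    labels = rng.integers(0, 2, 200)    # two stimulus conditions

    print(single_trial_accuracy(wet, labels), single_trial_accuracy(dry, labels))
    ```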

  10. Multi-laboratory assessment of reproducibility, qualitative and quantitative performance of SWATH-mass spectrometry.

    PubMed

    Collins, Ben C; Hunter, Christie L; Liu, Yansheng; Schilling, Birgit; Rosenberger, George; Bader, Samuel L; Chan, Daniel W; Gibson, Bradford W; Gingras, Anne-Claude; Held, Jason M; Hirayama-Kurogi, Mio; Hou, Guixue; Krisp, Christoph; Larsen, Brett; Lin, Liang; Liu, Siqi; Molloy, Mark P; Moritz, Robert L; Ohtsuki, Sumio; Schlapbach, Ralph; Selevsek, Nathalie; Thomas, Stefani N; Tzeng, Shin-Cheng; Zhang, Hui; Aebersold, Ruedi

    2017-08-21

    Quantitative proteomics employing mass spectrometry is an indispensable tool in life science research. Targeted proteomics has emerged as a powerful approach for reproducible quantification but is limited in the number of proteins quantified. SWATH-mass spectrometry consists of data-independent acquisition and a targeted data analysis strategy that aims to maintain the favorable quantitative characteristics (accuracy, sensitivity, and selectivity) of targeted proteomics at large scale. While previous SWATH-mass spectrometry studies have shown high intra-lab reproducibility, this has not been evaluated between labs. In this multi-laboratory evaluation study including 11 sites worldwide, we demonstrate that using SWATH-mass spectrometry data acquisition we can consistently detect and reproducibly quantify >4000 proteins from HEK293 cells. Using synthetic peptide dilution series, we show that the sensitivity, dynamic range and reproducibility established with SWATH-mass spectrometry are uniformly achieved. This study demonstrates that the acquisition of reproducible quantitative proteomics data by multiple labs is achievable, and broadly serves to increase confidence in SWATH-mass spectrometry data acquisition as a reproducible method for large-scale protein quantification. SWATH-mass spectrometry consists of a data-independent acquisition and a targeted data analysis strategy that aims to maintain the favorable quantitative characteristics on the scale of thousands of proteins. Here, using data generated by eleven groups worldwide, the authors show that SWATH-MS is capable of generating highly reproducible data across different laboratories.

  11. Using probit regression to disclose the analytical performance of qualitative and semi-quantitative tests.

    PubMed

    Åsberg, Arne; Johnsen, Harald; Mikkelsen, Gustav; Hov, Gunhild Garmo

    2016-11-01

    The analytical performance of qualitative and semi-quantitative tests is usually studied by calculating the fraction of positive results after replicate testing of a few specimens with known concentrations of the analyte. We propose using probit regression to model the probability of positive results as a function of the analyte concentration, based on testing many specimens once with a qualitative and a quantitative test. We collected laboratory data where urine specimens had been analyzed by both a urine albumin ('protein') dipstick test (Combur-Test strips) and a quantitative test (BN ProSpec System). For each dipstick cut-off level probit regression was used to estimate the probability of positive results as a function of urine albumin concentration. We also used probit regression to estimate the standard deviation of the continuous measurement signal that lies behind the binary test response. Finally, we used probit regression to estimate the probability of reading a specific semi-quantitative dipstick result as a function of urine albumin concentration. Based on analyses of 3259 specimens, the concentration of urine albumin with a 0.5 (50%) probability of positive result was 57 mg/L at the lowest possible cut-off limit, and 246 and 750 mg/L at the next (higher) levels. The corresponding standard deviations were 29, 83, and 217 mg/L, respectively. Semi-quantitatively, the maximum probability of these three readings occurred at a u-albumin of 117, 420, and 1200 mg/L, respectively. Probit regression is a useful tool to study the analytical performance of qualitative and semi-quantitative tests.
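    A minimal sketch of the proposed analysis, assuming a statsmodels-style probit fit and fabricated example data (the study itself used 3259 real specimens):

    ```python
    import numpy as np
    import statsmodels.api as sm

    # Hypothetical data: urine albumin concentration (mg/L) and the binary
    # dipstick result (1 = positive) at one cut-off level.
    conc = np.array([10, 25, 40, 55, 70, 100, 150, 250, 400, 800], dtype=float)
    positive = np.array([0, 0, 0, 1, 0, 1, 1, 1, 1, 1])

    fit = sm.Probit(positive, sm.add_constant(conc)).fit(disp=False)
    a, b = fit.params   # P(positive | c) = Phi(a + b*c)

    c50 = -a / b        # concentration with 50% probability of a positive result
    sigma = 1.0 / b     # SD of the latent measurement signal behind the binary response
    print(f"c50 = {c50:.0f} mg/L, sigma = {sigma:.0f} mg/L")
    ```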

  12. Hydroponic isotope labeling of entire plants and high-performance mass spectrometry for quantitative plant proteomics.

    PubMed

    Bindschedler, Laurence V; Mills, Davinia J S; Cramer, Rainer

    2012-01-01

    Hydroponic isotope labeling of entire plants (HILEP) combines hydroponic plant cultivation and metabolic labeling with stable isotopes using (15)N-containing inorganic salts to label whole and mature plants. Employing (15)N salts as the sole nitrogen source for HILEP leads to the production of healthy-looking plants which contain (15)N proteins labeled to nearly 100%. Therefore, HILEP is suitable for quantitative plant proteomic analysis, where plants are grown in either (14)N- or (15)N-hydroponic media and pooled when the biological samples are collected for relative proteome quantitation. The pooled (14)N-/(15)N-protein extracts can be fractionated in any suitable way and digested with a protease for shotgun proteomics, typically using reverse-phase liquid chromatography nanoelectrospray ionization tandem mass spectrometry (RPLC-nESI-MS/MS). Best results were obtained with a hybrid ion trap/FT-MS mass spectrometer, combining high mass accuracy and sensitivity for the MS data acquisition with fast, high-throughput MS/MS data acquisition, increasing the number of proteins identified and quantified and improving protein quantitation. Peak processing and picking from raw MS data files, protein identification, and quantitation were performed in a highly automated way using integrated MS data analysis software with minimal manual intervention, thus easing the analytical workflow. In this methodology paper, we describe how to grow Arabidopsis plants hydroponically for isotope labeling using (15)N salts and how to quantitate the resulting proteomes using a convenient workflow that does not require extensive bioinformatics skills.

  13. Performance Evaluation Methods for Assistive Robotic Technology

    NASA Astrophysics Data System (ADS)

    Tsui, Katherine M.; Feil-Seifer, David J.; Matarić, Maja J.; Yanco, Holly A.

    Robots have been developed for several assistive technology domains, including intervention for Autism Spectrum Disorders, eldercare, and post-stroke rehabilitation. Assistive robots have also been used to promote independent living through the use of devices such as intelligent wheelchairs, assistive robotic arms, and external limb prostheses. Work in the broad field of assistive robotic technology can be divided into two major research phases: technology development, in which new devices, software, and interfaces are created; and clinical, in which assistive technology is applied to a given end-user population. Moving from technology development towards clinical applications is a significant challenge. Developing performance metrics for assistive robots poses a related set of challenges. In this paper, we survey several areas of assistive robotic technology in order to derive and demonstrate domain-specific means for evaluating the performance of such systems. We also present two case studies of applied performance measures and a discussion regarding the ubiquity of functional performance measures across the sampled domains. Finally, we present guidelines for incorporating human performance metrics into end-user evaluations of assistive robotic technologies.

  14. Simulation-based evaluation of the resolution and quantitative accuracy of temperature-modulated fluorescence tomography

    PubMed Central

    Lin, Yuting; Nouizi, Farouk; Kwong, Tiffany C.; Gulsen, Gultekin

    2016-01-01

    Conventional fluorescence tomography (FT) can recover the distribution of fluorescent agents within a highly scattering medium. However, poor spatial resolution remains its foremost limitation. Previously, we introduced a new fluorescence imaging technique termed "temperature-modulated fluorescence tomography" (TM-FT), which provides high-resolution images of fluorophore distribution. TM-FT is a multimodality technique that combines fluorescence imaging with focused ultrasound to locate thermo-sensitive fluorescence probes using a priori spatial information to drastically improve the resolution of conventional FT. In this paper, we present an extensive simulation study to evaluate the performance of the TM-FT technique on complex phantoms with multiple fluorescent targets of various sizes located at different depths. In addition, the performance of the TM-FT is tested in the presence of background fluorescence. The results obtained using our new method are systematically compared with those obtained with the conventional FT. Overall, TM-FT provides higher resolution and superior quantitative accuracy, making it an ideal candidate for in vivo preclinical and clinical imaging. For example, a 4 mm diameter inclusion positioned in the middle of a synthetic slab geometry phantom (D: 40 mm × W: 100 mm) is recovered as an elongated object in the conventional FT (x = 4.5 mm; y = 10.4 mm), while TM-FT recovers it successfully in both directions (x = 3.8 mm; y = 4.6 mm). As a result, the quantitative accuracy of the TM-FT is superior because it recovers the concentration of the agent with a 22% error, which is in contrast with the 83% error of the conventional FT. PMID:26368884

  15. [Municipalities Stratification for Health Performance Evaluation].

    PubMed

    Calvo, Maria Cristina Marino; Lacerda, Josimari Telino de; Colussi, Claudia Flemming; Schneider, Ione Jayce Ceola; Rocha, Thiago Augusto Hernandes

    2016-01-01

    Objective: to propose and present a stratification of Brazilian municipalities into homogeneous groups for evaluation studies of health management performance. Methods: this was a methodological study, with selected indicators which classify municipalities according to conditions that influence health management and population size; data for the year 2010 were collected from demographic and health databases; correlation tests and factor analysis were used. Results: seven strata were identified - Large-sized; Medium-sized with favorable, regular or unfavorable influences; and Small-sized with favorable, regular or unfavorable influences; there was a concentration of municipalities with favorable influences in strata with better purchasing power and funding, as well as a concentration of municipalities with unfavorable influences in the North and Northeast regions. Conclusion: the proposed classification grouped similar municipalities regarding influential factors in health management, which allowed the identification of comparable groups of municipalities, setting up a consistent alternative for performance evaluation studies.

  16. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation). Progress report, January 15, 1992--January 14, 1993

    SciTech Connect

    Beck, R.N.; Cooper, M.; Chen, C.T.

    1992-07-01

    This document is the annual progress report for the project entitled "Instrumentation and Quantitative Methods of Evaluation." Progress is reported in separate sections individually abstracted and indexed for the database. Subject areas reported include theoretical studies of imaging systems and methods, hardware developments, quantitative methods of evaluation, and knowledge transfer: education in quantitative nuclear medicine imaging.

  17. Design and Implementation of Performance Metrics for Evaluation of Assessments Data

    ERIC Educational Resources Information Center

    Ahmed, Irfan; Bhatti, Arif

    2016-01-01

    Evocative evaluation of assessment data is essential to quantify the achievements at course and program levels. The objective of this paper is to design performance metrics and respective formulas to quantitatively evaluate the achievement of set objectives and expected outcomes at the course levels for program accreditation. Even though…

  18. Performance Evaluation of Kitchen Exhaust Draft Hoods.

    DTIC Science & Technology

    1980-03-01

    Final report, March 1980. P. B. Shepherd and R. H. Neisel, Johns-Manville Sales Corporation, Research & Development Center, Ken-Caryl Ranch, Denver, Colorado 80217.

  19. Quantitative high-performance liquid chromatographic determination of delta 4-3-ketosteroids in adrenocortical extracts.

    PubMed

    Ballerini, R; Chinol, M; Ghelardoni, M

    1980-05-30

    A high-performance liquid chromatographic method is described for the determination of seven steroids in adrenocortical extracts showing a delta 4-3-ketonic conjugated system. The seven steroids (cortisol, cortisone, 11-dehydrocorticosterone, corticosterone, 11-deoxycortisol, aldosterone and 11-deoxycorticosterone) were separated with a chloroform-methanol gradient on a 5-micron silica column and with a water-acetonitrile gradient on a 10-micron RP-8 column. Effluents were monitored by UV absorption at 242 nm. Quantitative analysis was performed by comparing peak areas, which are proportional to the amounts of the individual substances (external standard method). The method is rapid, sensitive, easy to perform and reproducible.
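    The external standard method reduces to a single-point proportionality between peak area and injected amount; a minimal sketch follows (the numbers are illustrative, not from the paper):

    ```python
    def external_standard_amount(area_sample, area_standard, amount_standard):
        # Peak area is proportional to the injected amount, so a single
        # standard injection calibrates the detector response.
        return area_sample / area_standard * amount_standard

    # e.g. a cortisol peak of 5.2e4 area counts against a 1.0 ug standard
    # measuring 4.0e4 counts gives 1.3 ug (hypothetical values).
    print(external_standard_amount(5.2e4, 4.0e4, 1.0))
    ```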

  20. Project Performance Evaluation Using Deep Belief Networks

    NASA Astrophysics Data System (ADS)

    Nguvulu, Alick; Yamato, Shoso; Honma, Toshihisa

    A Project Assessment Indicator (PAI) Model has recently been applied to evaluate monthly project performance based on 15 project elements derived from the project management (PM) knowledge areas. While the PAI Model comprehensively evaluates project performance, it lacks objectivity and universality. It lacks objectivity because experts assign model weights intuitively based on their PM skills and experience. It lacks universality because the allocation of ceiling scores to project elements is done ad hoc based on the empirical rule without taking into account the interactions between the project elements. This study overcomes these limitations by applying a DBN approach where the model automatically assigns weights and allocates ceiling scores to the project elements based on the DBN weights which capture the interaction between the project elements. We train our DBN on 5 IT projects of 12 months duration and test it on 8 IT projects with less than 12 months duration. We completely eliminate the manual assigning of weights and compute ceiling scores of project elements based on DBN weights. Our trained DBN evaluates monthly project performance of the 8 test projects based on the 15 project elements to within a monthly relative error margin of between ±1.03 and ±3.30%.

  1. Performance evaluation and clinical applications of 3D plenoptic cameras

    NASA Astrophysics Data System (ADS)

    Decker, Ryan; Shademan, Azad; Opfermann, Justin; Leonard, Simon; Kim, Peter C. W.; Krieger, Axel

    2015-06-01

    The observation and 3D quantification of arbitrary scenes using optical imaging systems is challenging, but increasingly necessary in many fields. This paper provides a technical basis for the application of plenoptic cameras in medical and medical robotics applications, and rigorously evaluates camera integration and performance in the clinical setting. It discusses plenoptic camera calibration and setup, and assesses plenoptic imaging in a clinically relevant context and in the context of other quantitative imaging technologies. We report the methods used for camera calibration, and precision and accuracy results in an ideal and simulated surgical setting. Afterwards, we report performance during a surgical task. Test results showed the average precision of the plenoptic camera to be 0.90 mm, increasing to 1.37 mm for tissue across the calibrated FOV. The ideal accuracy was 1.14 mm. The camera showed submillimeter error during a simulated surgical task.

  2. Model Performance Evaluation and Scenario Analysis (MPESA) Tutorial

    EPA Science Inventory

    This tool consists of two parts: model performance evaluation and scenario analysis (MPESA). The model performance evaluation consists of two components: model performance evaluation metrics and model diagnostics. These metrics provides modelers with statistical goodness-of-fit m...

  4. A Monte Carlo and physical phantom evaluation of quantitative In-111 SPECT

    NASA Astrophysics Data System (ADS)

    He, Bin; Du, Yong; Song, Xiyun; Segars, W. Paul; Frey, Eric C.

    2005-09-01

    Accurate estimation of the 3D in vivo activity distribution is important for dose estimation in targeted radionuclide therapy (TRT). Although SPECT can potentially provide such estimates, SPECT without compensation for image degrading factors is not quantitatively accurate. In this work, we evaluated quantitative SPECT (QSPECT) reconstruction methods that include compensation for various physical effects. Experimental projection data were obtained using a GE VH/Hawkeye system and an RSD torso phantom. Known activities of In-111 chloride were placed in the lungs, liver, heart, background and two spherical compartments with inner diameters of 22 mm and 34 mm. The 3D NCAT phantom with organ activities based on clinically derived In-111 ibritumomab tiuxetan data was used for the Monte Carlo (MC) simulation studies. Low-noise projection data were simulated using previously validated MC simulation methods. Fifty sets of noisy projections with realistic count levels were generated. Reconstructions were performed using the OS-EM algorithm with various combinations of attenuation (A), scatter (S), geometric response (G), collimator-detector response (D) and partial volume compensation (PVC). The QSPECT images from the various combinations of compensations were evaluated in terms of the accuracy and precision of the estimates of the total activity in each organ. For experimental data, the errors in organ activities for ADS and PVC compensation were less than 6.5% except the smaller sphere (-11.9%). For the noisy simulated data, the errors in organ activity for ADS compensation were less than 5.5% except the lungs (20.9%) and blood vessels (15.2%). Errors for other combinations of compensations were significantly (A, AS) or somewhat (AGS) larger. With added PVC, the error in the organ activities improved slightly except for the lungs (11.5%) and blood vessels (3.6%) where the improvement was more substantial. The standard deviation/mean ratios were all less than 1.5%. We

  5. Quantitative evaluation of 3D dosimetry for stereotactic volumetric-modulated arc delivery using COMPASS.

    PubMed

    Vikraman, Subramani; Manigandan, Durai; Karrthick, Karukkupalayam Palaniappan; Sambasivaselli, Raju; Senniandavar, Vellaingiri; Ramu, Mahendran; Rajesh, Thiyagarajan; Lutz, Muller; Muthukumaran, Manavalan; Karthikeyan, Nithyanantham; Tejinder, Kataria

    2014-01-07

    The purpose of this study was to quantitatively evaluate the patient-specific 3D dosimetry tool COMPASS with the 2D array MatriXX detector for stereotactic volumetric-modulated arc delivery. Twenty-five patients' CT images and RT structures from different sites (brain, head & neck, thorax, abdomen, and spine) were taken from the CyberKnife Multiplan planning system for this study. All these patients underwent radical stereotactic treatment on CyberKnife. For each patient, linac-based volumetric-modulated arc therapy (VMAT) stereotactic plans were generated in Monaco TPS v3.1 using the Elekta Beam Modulator MLC. Dose prescription was in the range of 5-20 Gy per fraction. Target prescription and critical organ constraints were matched as closely as possible to the delivered treatment plans. The quality of each plan was analyzed using conformity index (CI), conformity number (CN), gradient index (GI), target coverage (TC), and dose to 95% of volume (D95). Monaco Monte Carlo (MC)-calculated treatment plan delivery accuracy was quantitatively evaluated with COMPASS-calculated (CCA) dose and COMPASS indirectly measured (CME) dose based on dose-volume histogram metrics. In order to ascertain the potential of COMPASS 3D dosimetry for stereotactic plan delivery, 2D fluence verification was performed with MatriXX using a MultiCube phantom. Routine quality assurance of absolute point dose verification was performed to check the overall delivery accuracy. Quantitative analyses of dose delivery verification were performed against pass/fail criteria of 3 mm distance to agreement and 3% dose difference. Gamma passing rate was compared with 2D fluence verification from MatriXX with MultiCube. Comparison of COMPASS reconstructed dose from measured fluence and COMPASS computed dose has shown a very good agreement with TPS calculated dose. Each plan was evaluated based on dose volume parameters for target volumes such as dose at 95% of volume (D95) and average dose. For critical organs dose at 20% of volume (D20
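    The 3 mm / 3% pass-fail comparison referred to above is a gamma analysis; a simplified 1D global-gamma sketch (not the COMPASS implementation, and with synthetic dose profiles) is:

    ```python
    import numpy as np

    def gamma_pass_rate(ref, eva, coords, dta=3.0, dd=0.03):
        # Global gamma analysis: for each reference point, take the minimum
        # combined distance-to-agreement / dose-difference metric over the
        # evaluated profile, then report the percentage with gamma <= 1.
        norm = dd * ref.max()                  # global dose-difference normalization
        gamma = np.empty(len(ref))
        for i, (x, d) in enumerate(zip(coords, ref)):
            dist = (coords - x) / dta
            dose = (eva - d) / norm
            gamma[i] = np.sqrt(dist**2 + dose**2).min()
        return 100.0 * np.mean(gamma <= 1.0)

    x = np.arange(0.0, 50.0, 1.0)                    # positions in mm (synthetic)
    ref = np.exp(-((x - 25.0) / 10.0) ** 2)          # reference dose profile
    eva = 1.02 * np.exp(-((x - 25.5) / 10.0) ** 2)   # slightly shifted/scaled profile
    print(gamma_pass_rate(ref, eva, x))
    ```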

  6. Analytical performance evaluation for autonomous sensor fusion

    NASA Astrophysics Data System (ADS)

    Chang, K. C.

    2008-04-01

    A distributed data fusion system consists of a network of sensors, each capable of local processing and fusion of sensor data. There has been a great deal of work in developing distributed fusion algorithms applicable to a network centric architecture. Currently there are at least a few approaches, including naive fusion, cross-correlation fusion, information graph fusion, maximum a posteriori (MAP) fusion, channel filter fusion, and covariance intersection fusion. However, in general, in a distributed system such as an ad hoc sensor network, the communication architecture is not fixed. Each node has knowledge of only its local connectivity but not the global network topology. In those cases, a distributed fusion algorithm based on the information graph type of approach may not scale, due to its requirement to carry long pedigree information for decorrelation. In this paper, we focus on scalable fusion algorithms and conduct analytical performance evaluation to compare their performance. The goal is to understand the performance of those algorithms under different operating conditions. Specifically, we evaluate the performance of channel filter fusion, Chernoff fusion, Shannon fusion, and Bhattacharyya fusion algorithms. We also compare their results to naive fusion and "optimal" centralized fusion algorithms under a specific communication pattern.
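    Of the algorithms compared, covariance intersection has a particularly compact closed form; a sketch under the usual assumptions (Gaussian estimates, unknown cross-correlation) is:

    ```python
    import numpy as np

    def covariance_intersection(x1, P1, x2, P2, w=0.5):
        # Fuse two estimates whose cross-correlation is unknown; the weight
        # w in (0, 1) is often chosen to minimize trace(P).
        I1, I2 = np.linalg.inv(P1), np.linalg.inv(P2)
        P = np.linalg.inv(w * I1 + (1.0 - w) * I2)
        x = P @ (w * I1 @ x1 + (1.0 - w) * I2 @ x2)
        return x, P
    ```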

  7. Evaluation of Performance Management in State Schools: A Case of North Cyprus

    ERIC Educational Resources Information Center

    Atamturk, Hakan; Aksal, Fahriye A.; Gazi, Zehra A.; Atamturk, A. Nurdan

    2011-01-01

    The research study aims to evaluate performance management in the state secondary schools in North Cyprus. This study is significant by shedding a light on perceptions of teachers and headmasters regarding quality control of schools through performance management. In this research, quantitative research was employed, and a survey was conducted to…

  9. Preclinical Performance Evaluation of Percutaneous Glucose Biosensors

    PubMed Central

    Soto, Robert J.; Schoenfisch, Mark H.

    2015-01-01

    The utility of continuous glucose monitoring devices remains limited by an obstinate foreign body response (FBR) that degrades the analytical performance of the in vivo sensor. A number of novel materials that resist or delay the FBR have been proposed as outer, tissue-contacting glucose sensor membranes as a strategy to improve sensor accuracy. Traditionally, researchers have examined the ability of a material to minimize the host response by assessing adsorbed cell morphology and tissue histology. However, these techniques do not adequately predict in vivo glucose sensor function, necessitating sensor performance evaluation in a relevant animal model prior to human testing. Herein, the effects of critical experimental parameters, including the animal model and data processing methods, on the reliability and usefulness of preclinical sensor performance data are considered. PMID:26085566

  10. Group 3: Performance evaluation and assessment

    NASA Technical Reports Server (NTRS)

    Frink, A.

    1981-01-01

    Line-oriented flight training provides a unique learning experience and an opportunity to look at aspects of performance that other types of training did not provide. Areas such as crew coordination, resource management, leadership, and so forth can be readily evaluated in such a format. While individual performance is of the utmost importance, crew performance deserves equal emphasis; therefore, these areas should be carefully observed by the instructors as an area for discussion in the same way that individual performance is observed. To be effective, it must be accepted by the crew members and administered by the instructors as pure training - learning through experience. To keep open minds and to benefit most from the experience, both in the doing and in the follow-on discussion, it is essential that it be entered into with a feeling of freedom, openness, and enthusiasm. Reserve or defensiveness because of concern for failure must not be allowed to inhibit participation.

  11. Evaluating Algorithm Performance Metrics Tailored for Prognostics

    NASA Technical Reports Server (NTRS)

    Saxena, Abhinav; Celaya, Jose; Saha, Bhaskar; Saha, Sankalita; Goebel, Kai

    2009-01-01

    Prognostics has taken center stage in Condition Based Maintenance (CBM), where it is desired to estimate the Remaining Useful Life (RUL) of the system so that remedial measures may be taken in advance to avoid catastrophic events or unwanted downtimes. Validation of such predictions is an important but difficult proposition, and a lack of appropriate evaluation methods renders prognostics meaningless. Evaluation methods currently used in the research community are not standardized and in many cases do not sufficiently assess the key performance aspects expected of a prognostics algorithm. In this paper we introduce several new evaluation metrics tailored for prognostics and show that they can effectively evaluate various algorithms as compared to other conventional metrics. Specifically, four algorithms are compared: Relevance Vector Machine (RVM), Gaussian Process Regression (GPR), Artificial Neural Network (ANN), and Polynomial Regression (PR). These algorithms vary in complexity and in their ability to manage uncertainty around predicted estimates. Results show that the new metrics rank these algorithms differently, and suitable metrics may be chosen depending on the requirements and constraints. Beyond these results, these metrics offer ideas about how metrics suitable for prognostics may be designed so that the evaluation procedure can be standardized.
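    One conventional ingredient behind such metrics is the relative accuracy of an RUL prediction at a given time; the sketch below shows one plausible formulation, not necessarily the paper's exact definition.

    ```python
    def relative_accuracy(rul_true, rul_pred):
        # 1.0 means a perfect prediction; the value falls toward 0 (or below)
        # as the predicted RUL departs from the true RUL.
        return 1.0 - abs(rul_true - rul_pred) / rul_true

    print(relative_accuracy(100.0, 90.0))  # 0.9
    ```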

  12. Quantitative evaluation of the disintegration of orally rapid disintegrating tablets by X-ray computed tomography.

    PubMed

    Otsuka, Makoto; Yamanaka, Azusa; Uchino, Tomohiro; Otsuka, Kuniko; Sadamoto, Kiyomi; Ohshima, Hiroyuki

    2012-01-01

    To measure the rapid disintegration of Oral Disintegrating Tablets (ODT), a new test (XCT) was developed using X-ray computing tomography (X-ray CT). Placebo ODT, rapid disintegration candy (RDC) and Gaster®-D-Tablets (GAS) were used as model samples. All these ODTs were used to measure oral disintegration time (DT) in distilled water at 37±2°C by XCT. DTs were affected by the width of mesh screens, and degree to which the tablet holder vibrated from air bubbles. An in-vivo tablet disintegration test was performed for RDC using 11 volunteers. DT by the in-vivo method was significantly longer than that using the conventional tester. The experimental conditions for XCT such as the width of the mesh screen and degree of vibration were adjusted to be consistent with human DT values. Since DTs by the XCT method were almost the same as the human data, this method was able to quantitatively evaluate the rapid disintegration of ODT under the same conditions as inside the oral cavity. The DTs of four commercially available ODTs were comparatively evaluated by the XCT method, conventional tablet disintegration test and in-vivo method.

  13. Detection and quantitation of HBV DNA in miniaturized samples: multi centre study to evaluate the performance of the COBAS ® AmpliPrep/COBAS ® TaqMan ® hepatitis B virus (HBV) test v2.0 by the use of plasma or serum specimens.

    PubMed

    Berger, Annemarie; Gohl, Peter; Stürmer, Martin; Rabenau, Holger Felix; Nauck, Markus; Doerr, Hans Wilhelm

    2010-11-01

    Laboratory analysis of blood specimens is an increasingly important tool for rapid diagnosis and control of therapy. Miniaturization of test systems is therefore needed, but reduced specimen volumes might impair test quality. For rapid detection and quantitation of HBV DNA, the COBAS(®) AmpliPrep/COBAS(®) TaqMan(®) HBV test has proved a robust instrument in routine diagnostic services. The test system has recently been modified for application to reduced samples of blood plasma and to blood serum. The performance of this modified COBAS(®) AmpliPrep/COBAS(®) TaqMan(®) HBV v2.0 (HBV v2.0; this test is currently not available in the USA) test was evaluated by comparison with the former COBAS(®) AmpliPrep/COBAS(®) TaqMan(®) HBV v1.0 (HBV v1.0) test. In this study a platform correlation of both assay versions was performed, including 275 HBV DNA positive EDTA plasma samples. Comparable results were obtained (R(2)=0.97, mean difference -0.03 log(10)IU/ml). The verification of equivalency of the sample matrix (plasma vs. serum samples tested with HBV v2.0 in the same run) showed comparable results for all 278 samples, with an R(2)=0.99 and a mean difference of 0.06 log(10)IU/ml. In conclusion, the new test version HBV v2.0 is highly specific and reproducible and accurately quantifies HBV DNA in EDTA plasma and serum samples from patients with chronic HBV infection.

  14. Dynamic quantitative echocardiographic evaluation of mitral regurgitation in the operating department.

    PubMed

    Gisbert, Alejandro; Soulière, Vicky; Denault, André Y; Bouchard, Denis; Couture, Pierre; Pellerin, Michel; Carrier, Michel; Levesque, Sylvie; Ducharme, Anique; Basmadjian, Arsène J

    2006-02-01

    Hemodynamic modifications induced by general anesthesia could lead to underestimation of mitral regurgitation (MR) severity in the operating department and potentially serious consequences. The intraoperative severity of MR was prospectively compared with the preoperative baseline evaluation using dynamic quantitative transesophageal echocardiography in 25 patients who were stable with MR 2/4 or greater undergoing coronary bypass, mitral valve operation, or both. Significant changes in the severity of MR using transesophageal echocardiographic criteria occurred after the induction of general anesthesia and with phenylephrine. Quantitative transesophageal echocardiographic evaluation of MR using effective orifice area and vena contracta, and the use of phenylephrine challenge, were useful to avoid underestimating MR severity in the operating department.

  15. A quantitative evaluation of cell migration by the phagokinetic track motility assay.

    PubMed

    Nogalski, Maciej T; Chan, Gary C T; Stevenson, Emily V; Collins-McMillen, Donna K; Yurochko, Andrew D

    2012-12-04

    Cellular motility is an important biological process for both unicellular and multicellular organisms. It is essential for movement of unicellular organisms towards a source of nutrients or away from unsuitable conditions, as well as in multicellular organisms for tissue development, immune surveillance and wound healing, just to mention a few roles(1,2,3). Deregulation of this process can lead to serious neurological, cardiovascular and immunological diseases, as well as exacerbated tumor formation and spread(4,5). Molecularly, actin polymerization and receptor recycling have been shown to play important roles in creating cellular extensions (lamellipodia), that drive the forward movement of the cell(6,7,8). However, many biological questions about cell migration remain unanswered. The central role for cellular motility in human health and disease underlines the importance of understanding the specific mechanisms involved in this process and makes accurate methods for evaluating cell motility particularly important. Microscopes are usually used to visualize the movement of cells. However, cells move rather slowly, making the quantitative measurement of cell migration a resource-consuming process requiring expensive cameras and software to create quantitative time-lapsed movies of motile cells. Therefore, the ability to perform a quantitative measurement of cell migration that is cost-effective, non-laborious, and that utilizes common laboratory equipment is a great need for many researchers. The phagokinetic track motility assay utilizes the ability of a moving cell to clear gold particles from its path to create a measurable track on a colloidal gold-coated glass coverslip(9,10). With the use of freely available software, multiple tracks can be evaluated for each treatment to accomplish statistical requirements. The assay can be utilized to assess motility of many cell types, such as cancer cells(11,12), fibroblasts(9), neutrophils(13), skeletal muscle cells(14

  16. Quantitative evaluation of the requirements for the promotion as associate professor at German Medical Faculties

    PubMed Central

    Sorg, Heiko; Knobloch, Karsten

    2012-01-01

    Background: First quantitative evaluation of the requirements for the promotion as associate professor (AP) at German Medical Faculties. Material and methods: Analysis of the AP-regulations of German Medical Faculties according to a validated scoring system, which has been adapted to this study. Results: The overall scoring for the AP-requirements at 35 German Medical Faculties was 13.5±0.6 of 20 possible scoring points (95% confidence interval 12.2-14.7). More than 88% of the AP-regulations demand sufficient performance in teaching and research with adequate scientific publication. Furthermore, 83% of the faculties expect an expert review of the candidate's performance. Conference presentations required as an assistant professor as well as the reduction of the minimum time as an assistant professor do only play minor roles. Conclusion: The requirements for assistant professors to get nominated as an associate professor at German Medical Faculties are high with an only small range. In detail, however, it can be seen that there still exists large heterogeneity, which hinders equal opportunities and career possibilities. These data might be used for the ongoing objective discussion. PMID:23255964

  17. Quantitative evaluation of the requirements for the promotion as associate professor at German medical faculties.

    PubMed

    Sorg, Heiko; Knobloch, Karsten

    2012-01-01

    First quantitative evaluation of the requirements for the promotion as associate professor (AP) at German medical faculties. Analysis of the AP-regulations of German medical faculties according to a validated scoring system, which has been adapted to this study. The overall scoring for the AP-requirements at 35 German medical faculties was 13.5±0.6 of 20 possible scoring points (95% confidence interval 12.2-14.7). More than 88% of the AP-regulations demand sufficient performance in teaching and research with adequate scientific publication. Furthermore, 83% of the faculties expect an expert review of the candidate's performance. Conference presentations required as an assistant professor as well as the reduction of the minimum time as an assistant professor do only play minor roles. The requirements for assistant professors to get nominated as an associate professor at German medical faculties are high with an only small range. In detail, however, it can be seen that there still exists large heterogeneity, which hinders equal opportunities and career possibilities. These data might be used for the ongoing objective discussion.

  18. Determination of a quantitative parameter to evaluate swimming technique based on the maximal tethered swimming test.

    PubMed

    Soncin, Rafael; Mezêncio, Bruno; Ferreira, Jacielle Carolina; Rodrigues, Sara Andrade; Huebner, Rudolf; Serrão, Julio Cerca; Szmuchrowski, Leszek

    2017-06-01

    The aim of this study was to propose a new force parameter, associated with swimmers' technique and performance. Twelve swimmers performed five repetitions of 25 m sprint crawl and a tethered swimming test with maximal effort. The parameters calculated were: the mean swimming velocity for crawl sprint, the mean propulsive force of the tethered swimming test as well as an oscillation parameter calculated from force fluctuation. The oscillation parameter evaluates the force variation around the mean force during the tethered test as a measure of swimming technique. Two parameters showed significant correlations with swimming velocity: the mean force during the tethered swimming (r = 0.85) and the product of the mean force square root and the oscillation (r = 0.86). However, the intercept coefficient was significantly different from zero only for the mean force, suggesting that although the correlation coefficient of the parameters was similar, part of the mean velocity magnitude that was not associated with the mean force was associated with the product of the mean force square root and the oscillation. Thus, force fluctuation during tethered swimming can be used as a quantitative index of swimmers' technique.
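    Since the abstract does not spell out the oscillation formula, the sketch below uses the coefficient of variation of the tethered force as a stand-in fluctuation measure and combines it with the mean force as described; treat both choices as assumptions.

    ```python
    import numpy as np

    def technique_parameters(force):
        # force: sampled tethered-swimming force trace (N).
        f_mean = force.mean()
        oscillation = force.std() / f_mean         # assumed fluctuation measure
        combined = np.sqrt(f_mean) * oscillation   # product reported to correlate with velocity (r = 0.86)
        return f_mean, oscillation, combined
    ```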

  19. Quantitative evaluation of lipid concentration in atherosclerotic plaque phantom by near-infrared multispectral angioscope at wavelengths around 1200 nm

    NASA Astrophysics Data System (ADS)

    Matsui, Daichi; Ishii, Katsunori; Awazu, Kunio

    2015-07-01

    Atherosclerosis is a primary cause of critical ischemic diseases such as myocardial infarction and stroke. A method that can provide detailed information about the stability of atherosclerotic plaques is required. We focused on spectroscopic techniques that could evaluate the chemical composition of lipid in plaques. A novel angioscope using multispectral imaging at wavelengths around 1200 nm for quantitative evaluation of atherosclerotic plaques was developed. The angioscope consists of a halogen lamp, an indium gallium arsenide (InGaAs) camera, 3 optical band-pass filters transmitting wavelengths of 1150, 1200, and 1300 nm, an image fiber having a 0.7 mm outer diameter, and an irradiation fiber bundle consisting of 7 multimode fibers. Atherosclerotic plaque phantoms with 100, 60, and 20 vol.% of lipid were prepared and measured by the multispectral angioscope. The acquired datasets were processed by the spectral angle mapper (SAM) method. As a result, simulated plaque areas in atherosclerotic plaque phantoms that could not be detected in an angioscopic visible image could be clearly enhanced. In addition, quantitative evaluation of atherosclerotic plaque phantoms based on the lipid volume fractions was performed down to 20 vol.%. These results show the potential of a multispectral angioscope at wavelengths around 1200 nm for quantitative evaluation of the stability of atherosclerotic plaques.
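    The SAM classification step is standard: each pixel's 3-band spectrum is compared with a reference spectrum by their angle. A minimal sketch with hypothetical reflectance values:

    ```python
    import numpy as np

    def spectral_angle(pixel, reference):
        # Angle between two spectra; smaller angles indicate a closer match.
        c = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
        return np.arccos(np.clip(c, -1.0, 1.0))

    img = np.random.rand(64, 64, 3)           # bands at 1150, 1200, 1300 nm (placeholder)
    lipid_ref = np.array([0.42, 0.18, 0.35])  # hypothetical lipid reference spectrum
    angles = np.apply_along_axis(spectral_angle, 2, img, lipid_ref)
    lipid_mask = angles < 0.1                 # threshold in radians, chosen for illustration
    ```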

  20. Quantitative evaluation of regional vegetation ecological environment quality by using remotely sensed data over Qingjiang, Hubei

    NASA Astrophysics Data System (ADS)

    Wang, Cheng; Sun, Yan; Li, Lijun; Zhang, Qiuwen

    2007-11-01

    Vegetation cover is an important component and the best indicator of regional ecological environment quality. This paper adopts a new method integrating remote sensing technology with a composite index appraisal model based on multiple linear regression for quantitatively evaluating regional vegetation ecological environment quality (VEEQ). This method differs from traditional ecological environment research methods. It fully utilizes the advantages of quantitative remote sensing technology: the key influencing factors of VEEQ, such as vegetation indices (RVI, NDVI, ARVI, TMG), humidity indices (NDMI, MI, TMW), and soil and landform indices (NDSI, TMB, GRABS), are extracted directly from Landsat 5/TM remotely sensed images as evaluation parameters and then entered into the multiple linear regression evaluation model. Ultimately, we obtain the VEEQ evaluation rank figure of the experimental field, part of the Qingjiang region. The straightforward multiple linear regression model proved to fit the experimental field well for vegetation ecological environment evaluation research.
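    A minimal sketch of the composite-index idea: derive vegetation indices from the red and near-infrared bands, then score quality with a fitted multiple linear regression. The band arrays and regression coefficients below are hypothetical placeholders, not values from the paper.

    ```python
    import numpy as np

    red, nir = np.random.rand(2, 100, 100) + 0.01   # Landsat 5 TM red/NIR bands (placeholder)
    ndvi = (nir - red) / (nir + red)                # normalized difference vegetation index
    rvi = nir / red                                 # ratio vegetation index

    # Coefficients would come from regressing ground-truth quality scores
    # on the extracted indices; these values are invented for illustration.
    b0, b1, b2 = 0.2, 0.5, 0.05
    veeq_score = b0 + b1 * ndvi + b2 * rvi
    ```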

  1. Role of the Quantitative Imaging Biomarker Alliance in optimizing CT for the evaluation of lung cancer screen-detected nodules.

    PubMed

    Mulshine, James L; Gierada, David S; Armato, Samuel G; Avila, Rick S; Yankelevitz, David F; Kazerooni, Ella A; McNitt-Gray, Michael F; Buckler, Andrew J; Sullivan, Daniel C

    2015-04-01

    The Quantitative Imaging Biomarker Alliance (QIBA) is a multidisciplinary consortium sponsored by the RSNA to define processes that enable the implementation and advancement of quantitative imaging methods described in a QIBA profile document that outlines the process to reliably and accurately measure imaging features. A QIBA profile includes factors such as technical (product-specific) standards, user activities, and relationship to a clinically meaningful metric, such as with nodule measurement in the course of CT screening for lung cancer. In this report, the authors describe how the QIBA approach is being applied to the measurement of small pulmonary nodules such as those found during low-dose CT-based lung cancer screening. All sources of variance with imaging measurement were defined for this process. Through a process of experimentation, literature review, and assembly of expert opinion, the strongest evidence was used to define how to best implement each step in the imaging acquisition and evaluation process. This systematic approach to implementing a quantitative imaging biomarker with standardized specifications for image acquisition and postprocessing for a specific quantitative measurement of a pulmonary nodule results in consistent performance characteristics of the measurement (eg, bias and variance). Implementation of the QIBA small nodule profile may allow more efficient and effective clinical management of the diagnostic workup of individuals found to have suspicious pulmonary nodules in the course of lung cancer screening evaluation.

  2. Quantitative evaluation of mechanical properties in tissue-engineered auricular cartilage.

    PubMed

    Nimeskern, Luc; van Osch, Gerjo J V M; Müller, Ralph; Stok, Kathryn S

    2014-02-01

    Tissue-engineering (TE) efforts for ear reconstruction often fail due to mechanical incompetency. It is therefore key for successful auricular cartilage (AUC) TE to ensure functional competency, that is, to mimic the mechanical properties of the native ear tissue. A review of past attempts to engineer AUC shows unsatisfactory functional outcomes with various cell-seeded biodegradable polymeric scaffolds in immunocompetent animal models. However, promising improvements to construct stability were reported with either mechanically reinforced scaffolds or novel two-stage implantation techniques. Nonetheless, quantitative mechanical evaluation of the constructs is usually overlooked, and such an evaluation of TE constructs alongside a benchmark of native AUC would allow real-time monitoring and improve functional outcomes of auricular TE strategies. Although quantitative mechanical evaluation techniques are readily available for cartilage, these techniques are designed to characterize the main functional components of hyaline and fibrous cartilage such as the collagen matrix or the glycosaminoglycan network, but they overlook the functional role of elastin, which is a major constituent of AUC. Hence, for monitoring AUC TE, novel evaluation techniques need to be designed. These should include a characterization of the specific composition and architecture of AUC, as well as mechanical evaluation of all functional components. Therefore, this article reviews the existing literature on AUC TE as well as cartilage mechanical evaluation and proposes recommendations for designing a mechanical evaluation protocol specific for AUC, and establishing a benchmark for native AUC to be used for quantitative evaluation of TE AUC.

  3. Performance evaluation of triangulation based range sensors.

    PubMed

    Guidi, Gabriele; Russo, Michele; Magrassi, Grazia; Bordegoni, Monica

    2010-01-01

    The performance of 2D digital imaging systems depends on several factors related with both optical and electronic processing. These concepts have originated standards, which have been conceived for photographic equipment and bi-dimensional scanning systems, and which have been aimed at estimating different parameters such as resolution, noise or dynamic range. Conversely, no standard test protocols currently exist for evaluating the corresponding performances of 3D imaging systems such as laser scanners or pattern projection range cameras. This paper is focused on investigating experimental processes for evaluating some critical parameters of 3D equipment, by extending the concepts defined by the ISO standards to the 3D domain. The experimental part of this work concerns the characterization of different range sensors through the extraction of their resolution, accuracy and uncertainty from sets of 3D data acquisitions of specifically designed test objects whose geometrical characteristics are known in advance. The major objective of this contribution is to suggest an easy characterization process for generating a reliable comparison between the performances of different range sensors and to check if a specific piece of equipment is compliant with the expected characteristics.

  4. Performance evaluation of an automotive thermoelectric generator

    NASA Astrophysics Data System (ADS)

    Dubitsky, Andrei O.

    Around 40% of the total fuel energy in typical internal combustion engines (ICEs) is rejected to the environment in the form of exhaust gas waste heat. Efficient recovery of this waste heat in automobiles can promise a fuel economy improvement of 5%. The thermal energy can be harvested through thermoelectric generators (TEGs) utilizing the Seebeck effect. In the present work, a versatile test bench has been designed and built in order to simulate conditions found on test vehicles. This allows experimental performance evaluation and model validation of automotive thermoelectric generators. An electrically heated exhaust gas circuit and a circulator based coolant loop enable integrated system testing of hot and cold side heat exchangers, thermoelectric modules (TEMs), and thermal interface materials at various scales. A transient thermal model of the coolant loop was created in order to design a system which can maintain constant coolant temperature under variable heat input. Additionally, as electrical heaters cannot match the transient response of an ICE, modelling was completed in order to design a relaxed exhaust flow and temperature history utilizing the system thermal lag. This profile reduced required heating power and gas flow rates by over 50%. The test bench was used to evaluate a DOE/GM initial prototype automotive TEG and validate analytical performance models. The maximum electrical power generation was found to be 54 W with a thermal conversion efficiency of 1.8%. It has been found that thermal interface management is critical for achieving maximum system performance, with novel designs being considered for further improvement.
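    For orientation, the Seebeck-effect output of a module under a matched electrical load follows a back-of-envelope formula; all values below are illustrative assumptions, not measurements from the prototype.

    ```python
    S = 0.05    # effective module Seebeck coefficient, V/K (assumed)
    dT = 150.0  # hot-side minus cold-side temperature difference, K (assumed)
    R = 2.6     # module internal electrical resistance, ohm (assumed)

    # Matched-load power: P = V^2 / (4R) with open-circuit voltage V = S * dT.
    P = (S * dT) ** 2 / (4.0 * R)
    print(f"{P:.1f} W per module")   # ~5.4 W with these numbers
    ```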

  5. Cartilage Repair Surgery: Outcome Evaluation by Using Noninvasive Cartilage Biomarkers Based on Quantitative MRI Techniques?

    PubMed Central

    Jungmann, Pia M.; Baum, Thomas; Bauer, Jan S.; Karampinos, Dimitrios C.; Link, Thomas M.; Li, Xiaojuan; Trattnig, Siegfried; Rummeny, Ernst J.; Woertler, Klaus; Welsch, Goetz H.

    2014-01-01

    Background. New quantitative magnetic resonance imaging (MRI) techniques are increasingly applied as outcome measures after cartilage repair. Objective. To review the current literature on the use of quantitative MRI biomarkers for evaluation of cartilage repair at the knee and ankle. Methods. Using a PubMed literature search, studies on biochemical, quantitative MR imaging of cartilage repair were identified and reviewed. Results. Quantitative MR biomarkers detect early degeneration of articular cartilage, mainly represented by an increasing water content, collagen disruption, and proteoglycan loss. Recently, feasibility of biochemical MR imaging of cartilage repair tissue and surrounding cartilage was demonstrated. Ultrastructural properties of the tissue after different repair procedures resulted in differences in imaging characteristics. T2 mapping, T1rho mapping, delayed gadolinium-enhanced MRI of cartilage (dGEMRIC), and diffusion weighted imaging (DWI) are applicable on most clinical 1.5 T and 3 T MR scanners. Currently, a standard of reference is difficult to define and knowledge is limited concerning correlation of clinical and MR findings. The lack of histological correlations complicates the identification of the exact tissue composition. Conclusions. A multimodal approach combining several quantitative MRI techniques in addition to morphological and clinical evaluation might be promising. Further investigations are required to demonstrate the potential for outcome evaluation after cartilage repair. PMID:24877139

  6. 'Stories' or 'snapshots'? A study directed at comparing qualitative and quantitative approaches to curriculum evaluation.

    PubMed

    Pateman, B; Jinks, A M

    1999-01-01

    The focus of this paper is a study designed to explore the validity of quantitative approaches to student evaluation in a pre-registration degree programme. As managers of the students' education, we were concerned that the quantitative method, which used lecturer criteria, might not fully represent students' views. The approach taken is that of a process-type strategy for curriculum evaluation as described by Parlett and Hamilton (1972). The aim of the study is to produce illuminative data, or students' 'stories' of their educational experiences, through use of semi-structured interviews. The results are then compared to the current quantitative measurement tools designed to obtain 'snapshots' of the educational effectiveness of the curriculum. The quantitative measurement tools use Likert scale measurements of teacher-devised criterion statements. The results of the study give a rich source of qualitative data which can be used to inform future curriculum development. However, complete validation of the current quantitative instruments was not achieved in this study. Student and teacher agendas in respect of important issues pertaining to the course programme were found to differ. Limitations of the study are given. There is discussion of the options open to the management team with regard to future development of curriculum evaluation systems.

  7. Cartilage repair surgery: outcome evaluation by using noninvasive cartilage biomarkers based on quantitative MRI techniques?

    PubMed

    Jungmann, Pia M; Baum, Thomas; Bauer, Jan S; Karampinos, Dimitrios C; Erdle, Benjamin; Link, Thomas M; Li, Xiaojuan; Trattnig, Siegfried; Rummeny, Ernst J; Woertler, Klaus; Welsch, Goetz H

    2014-01-01

    New quantitative magnetic resonance imaging (MRI) techniques are increasingly applied as outcome measures after cartilage repair. To review the current literature on the use of quantitative MRI biomarkers for evaluation of cartilage repair at the knee and ankle. Using a PubMed literature search, studies on biochemical, quantitative MR imaging of cartilage repair were identified and reviewed. Quantitative MR biomarkers detect early degeneration of articular cartilage, mainly represented by an increasing water content, collagen disruption, and proteoglycan loss. Recently, feasibility of biochemical MR imaging of cartilage repair tissue and surrounding cartilage was demonstrated. Ultrastructural properties of the tissue after different repair procedures resulted in differences in imaging characteristics. T2 mapping, T1rho mapping, delayed gadolinium-enhanced MRI of cartilage (dGEMRIC), and diffusion weighted imaging (DWI) are applicable on most clinical 1.5 T and 3 T MR scanners. Currently, a standard of reference is difficult to define and knowledge is limited concerning correlation of clinical and MR findings. The lack of histological correlations complicates the identification of the exact tissue composition. A multimodal approach combining several quantitative MRI techniques in addition to morphological and clinical evaluation might be promising. Further investigations are required to demonstrate the potential for outcome evaluation after cartilage repair.

  8. Evaluating iterative reconstruction performance in computed tomography

    SciTech Connect

    Chen, Baiyu Solomon, Justin; Ramirez Giraldo, Juan Carlos; Samei, Ehsan

    2014-12-15

    Purpose: Iterative reconstruction (IR) offers notable advantages in computed tomography (CT). However, its performance characterization is complicated by its potentially nonlinear behavior, impacting performance in terms of specific tasks. This study aimed to evaluate the performance of IR with both task-specific and task-generic strategies. Methods: The performance of IR in CT was mathematically assessed with an observer model that predicted the detection accuracy in terms of the detectability index (d′). d′ was calculated based on the properties of the image noise and resolution, the observer, and the detection task. The characterizations of image noise and resolution were extended to accommodate the nonlinearity of IR. A library of tasks was mathematically modeled at a range of sizes (radius 1–4 mm), contrast levels (10–100 HU), and edge profiles (sharp and soft). Unique d′ values were calculated for each task with respect to five radiation exposure levels (volume CT dose index, CTDIvol: 3.4–64.8 mGy) and four reconstruction algorithms (filtered backprojection reconstruction, FBP; iterative reconstruction in imaging space, IRIS; and sinogram affirmed iterative reconstruction with strengths of 3 and 5, SAFIRE3 and SAFIRE5; all provided by Siemens Healthcare, Forchheim, Germany). The d′ values were translated into the areas under the receiver operating characteristic curve (AUC) to represent human observer performance. For each task and reconstruction algorithm, a threshold dose was derived as the minimum dose required to achieve a threshold AUC of 0.9. A task-specific dose reduction potential of IR was calculated as the difference between the threshold doses for IR and FBP. A task-generic comparison was further made between IR and FBP in terms of the percent of all tasks yielding an AUC higher than the threshold. Results: IR required less dose than FBP to achieve the threshold AUC. In general, SAFIRE5 showed the most significant dose reduction potentials (11–54 mGy, 77%–84%).

  9. Evaluating iterative reconstruction performance in computed tomography.

    PubMed

    Chen, Baiyu; Ramirez Giraldo, Juan Carlos; Solomon, Justin; Samei, Ehsan

    2014-12-01

    Iterative reconstruction (IR) offers notable advantages in computed tomography (CT). However, its performance characterization is complicated by its potentially nonlinear behavior, impacting performance in terms of specific tasks. This study aimed to evaluate the performance of IR with both task-specific and task-generic strategies. The performance of IR in CT was mathematically assessed with an observer model that predicted the detection accuracy in terms of the detectability index (d'). d' was calculated based on the properties of the image noise and resolution, the observer, and the detection task. The characterizations of image noise and resolution were extended to accommodate the nonlinearity of IR. A library of tasks was mathematically modeled at a range of sizes (radius 1-4 mm), contrast levels (10-100 HU), and edge profiles (sharp and soft). Unique d' values were calculated for each task with respect to five radiation exposure levels (volume CT dose index, CTDIvol: 3.4-64.8 mGy) and four reconstruction algorithms (filtered backprojection reconstruction, FBP; iterative reconstruction in imaging space, IRIS; and sinogram affirmed iterative reconstruction with strengths of 3 and 5, SAFIRE3 and SAFIRE5; all provided by Siemens Healthcare, Forchheim, Germany). The d' values were translated into the areas under the receiver operating characteristic curve (AUC) to represent human observer performance. For each task and reconstruction algorithm, a threshold dose was derived as the minimum dose required to achieve a threshold AUC of 0.9. A task-specific dose reduction potential of IR was calculated as the difference between the threshold doses for IR and FBP. A task-generic comparison was further made between IR and FBP in terms of the percent of all tasks yielding an AUC higher than the threshold. IR required less dose than FBP to achieve the threshold AUC. In general, SAFIRE5 showed the most significant dose reduction potentials (11-54 mGy, 77%-84%), followed by
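
    The pipeline from detectability index to threshold dose can be illustrated compactly. Under the standard equal-variance Gaussian observer assumption, AUC = Φ(d′/√2); the sketch below inverts predicted AUC-versus-dose curves to estimate a task-specific dose reduction. All numeric values are made up for illustration and are not the paper's data:

```python
import numpy as np
from scipy.stats import norm

def auc_from_dprime(d_prime):
    """Equal-variance Gaussian relation between detectability and AUC."""
    return norm.cdf(np.asarray(d_prime, float) / np.sqrt(2.0))

def threshold_dose(doses, d_primes, auc_threshold=0.9):
    """Interpolate the minimum dose whose predicted AUC reaches the
    threshold; doses and d_primes are ordered by increasing dose."""
    aucs = auc_from_dprime(d_primes)
    return np.interp(auc_threshold, aucs, np.asarray(doses, float))

# Illustrative values only (not the study's measurements):
doses = [3.4, 8.1, 16.2, 32.4, 64.8]   # CTDIvol [mGy]
d_fbp = [0.6, 1.0, 1.5, 2.1, 2.9]      # detectability with FBP
d_ir  = [0.9, 1.4, 2.0, 2.7, 3.6]      # detectability with IR
saving = threshold_dose(doses, d_fbp) - threshold_dose(doses, d_ir)
print(f"task-specific dose reduction ~ {saving:.1f} mGy")
```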

  10. Evaluation of extracapsular extension in prostate cancer using qualitative and quantitative multiparametric MRI.

    PubMed

    Kim, Wooil; Kim, Chan Kyo; Park, Jung Jae; Kim, Minji; Kim, Jae-Hun

    2017-06-01

    To investigate the value of multiparametric magnetic resonance imaging (mpMRI) for extracapsular extension (ECE) in prostate cancer (PCa). In all, 292 patients who received radical prostatectomy and underwent preoperative mpMRI at 3T were enrolled retrospectively. For determining the associations with ECE, the likelihood of ECE was assessed qualitatively on T2-weighted imaging (T2WI) and combined T2WI and diffusion-weighted imaging (DWI) or dynamic contrast-enhanced imaging (DCEI). Quantitative MRI parameters were measured in PCa based on histopathological findings. Two models for detecting ECE including imaging and clinical parameters were developed using multivariate analysis: Model 1 excluding combined T2WI, DWI and DCEI, and Model 2 excluding combined T2WI and DWI, and combined T2WI and DCEI. Diagnostic performance of imaging parameters and models was evaluated using the area under the receiver operating characteristics curve (Az). For detecting ECE, the specificity, accuracy, and Az of combined T2WI and DWI or DCEI were statistically better than those of T2WI (P < 0.05), and all quantitative MRI parameters showed a statistical difference between the patients with and without ECE (P < 0.05). On multivariate analysis, significant independent markers in Model 1 were combined T2WI and DWI, combined T2WI and DCEI, and Ktrans (P < 0.05). In Model 2, significant markers were combined T2WI, DWI and DCEI, Ktrans, Kep, and Ve (P < 0.05). The Az values of Models 1 and 2 were 0.944 and 0.957, respectively. mpMRI may be useful to improve diagnostic accuracy of the models for determining the associations with ECE in PCa. Level of Evidence: 4. Technical Efficacy: Stage 2. J. Magn. Reson. Imaging 2017;45:1760-1770.

  11. Evaluation of Fourier Transform Profilometry for Quantitative Waste Volume Determination under Simulated Hanford Tank Conditions

    SciTech Connect

    Etheridge, J.A.; Jang, P.R.; Leone, T.; Long, Z.; Norton, O.P.; Okhuysen, W.P.; Monts, D.L.; Coggins, T.L.

    2008-07-01

    The Hanford Site is currently in the process of an extensive effort to empty and close its radioactive single-shell and double-shell waste storage tanks. Before this can be accomplished, it is necessary to know how much residual material is left in a given waste tank and the chemical makeup of the residue. The objective of Mississippi State University's Institute for Clean Energy Technology's (ICET) efforts is to develop, fabricate, and deploy inspection tools for the Hanford waste tanks that will (1) be remotely operable; (2) provide quantitative information on the amount of wastes remaining; and (3) provide information on the spatial distribution of chemical and radioactive species of interest. A collaborative arrangement has been established with the Hanford Site to develop probe-based inspection systems for deployment in the waste tanks. ICET is currently developing an in-tank inspection system based on Fourier Transform Profilometry, FTP. FTP is a non-contact, 3-D shape measurement technique. By projecting a fringe pattern onto a target surface and observing its deformation due to surface irregularities from a different view angle, FTP is capable of determining the height (depth) distribution (and hence volume distribution) of the target surface, thus reproducing the profile of the target accurately under a wide variety of conditions. Hence FTP has the potential to be utilized for quantitative determination of residual wastes within Hanford waste tanks. We are conducting a multi-stage performance evaluation of FTP in order to document the accuracy, precision, and operator dependence (minimal) of FTP under conditions similar to those that can be expected to pertain within Hanford waste tanks. The successive stages impose aspects that present increasing difficulty and increasingly more accurate approximations of in-tank environments. In this paper, we report our investigations of the dependence of the analyst upon FTP volume determination results and of the
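
    As a concrete illustration of the FTP principle described above, here is a minimal single-row phase-recovery sketch in the style of the classic Takeda algorithm; the synthetic fringe, carrier frequency, and filter window are assumptions for illustration, not ICET's implementation:

```python
import numpy as np

def ftp_phase(fringe_row, carrier_bin):
    """Recover the wrapped phase of one row of a fringe image by
    isolating the fundamental carrier peak in the Fourier domain
    (classic Fourier Transform Profilometry)."""
    spectrum = np.fft.fft(fringe_row)
    # Band-pass filter around the carrier frequency, zero elsewhere.
    filtered = np.zeros_like(spectrum)
    lo, hi = carrier_bin // 2, 3 * carrier_bin // 2
    filtered[lo:hi] = spectrum[lo:hi]
    # Shift the carrier to baseband and invert; the angle is the phase.
    analytic = np.fft.ifft(np.roll(filtered, -carrier_bin))
    return np.angle(analytic)

# Synthetic demo: a surface bump phase-modulates a 16-cycle carrier.
n, cycles = 512, 16
x = np.arange(n)
true_phase = 1.5 * np.exp(-((x - 256) / 60.0) ** 2)   # "height" profile
row = 128 + 100 * np.cos(2 * np.pi * cycles * x / n + true_phase)
recovered = np.unwrap(ftp_phase(row, cycles))
# Height is proportional to the recovered phase after subtracting a
# reference-plane measurement.
```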

  12. Quantitative Evaluation of DNA Hypermethylation in Malignant and Benign Breast Tissue and Fluids

    PubMed Central

    Zhu, Weizhu; Qin, Wenyi; Hewett, John E.; Sauter, Edward R.

    2012-01-01

    Assessment of DNA methylation has demonstrated altered methylation in malignant compared to benign breast tissue. The purpose of our study was to 1) confirm the predictive ability of methylation assessment in breast tissue, and 2) use the genes found to be cancer predictive in tissue to evaluate the diagnostic potential of hypermethylation assessment in nipple aspirate fluid (NAF) and mammary ductoscopic (MD) samples. Quantitative methylation specific (qMS)-PCR was conducted on three specimen sets: 44 malignant (CA) and 34 normal (NL) tissue specimens, 18 matched CA, adjacent normal (ANL) tissue and NAF specimens, and 119 MD specimens. Training and validation tissue sets were analyzed to determine the optimal group of cancer predictive genes for NAF and MD analysis. NAF and MD cytologic review were also performed. Methylation of CCND-2, p16, RAR-β and RASSF-1a was significantly more prevalent in tumor than in normal tissue specimens. Receiver operating characteristic curve analysis demonstrated an area under the curve of 0.96. For the 18 matched CA, ANL and NAF specimens, the four predictive genes identified in cancer tissue contained increased methylation in CA vs. ANL tissue; NAF samples had higher methylation than ANL specimens. Methylation frequency was higher in MD specimens from breasts with cancer than benign samples for p16 and RASSF-1a. In summary, 1) routine quantitative DNA methylation assessment in NAF and MD samples is possible, and 2) genes hypermethylated in malignant breast tissue are also altered in matched NAF and in MD samples, and may be useful to assist in early breast cancer detection. PMID:19618401

  13. Quantitative performance measurements of bent crystal Laue analyzers for X-ray fluorescence spectroscopy

    PubMed Central

    Karanfil, C.; Bunker, G.; Newville, M.; Segre, C. U.; Chapman, D.

    2012-01-01

    Third-generation synchrotron radiation sources pose difficult challenges for energy-dispersive detectors for XAFS because of their count rate limitations. One solution to this problem is the bent crystal Laue analyzer (BCLA), which removes most of the undesired scatter and fluorescence before it reaches the detector, effectively eliminating detector saturation due to background. In this paper, experimental measurements of BCLA performance in conjunction with a 13-element germanium detector, and a quantitative analysis of the signal-to-noise improvement of BCLAs, are presented. The performance of BCLAs is compared with filters and slits. PMID:22514172

  14. Performance Evaluation of Emerging High Performance Computing Technologies using WRF

    NASA Astrophysics Data System (ADS)

    Newby, G. B.; Morton, D.

    2008-12-01

    The Arctic Region Supercomputing Center (ARSC) has evaluated multicore processors and other emerging processor technologies for a variety of high performance computing applications in the earth and space sciences, especially climate and weather applications. A flagship effort has been to assess dual core processor nodes on ARSC's Midnight supercomputer, in which two-socket systems were compared to eight-socket systems. Midnight is utilized for ARSC's twice-daily weather research and forecasting (WRF) model runs, available at weather.arsc.edu. Among other findings on Midnight, it was found that the Hypertransport system for interconnecting Opteron processors, memory, and other subsystems does not scale as well on eight-socket (sixteen-processor) systems as on two-socket (four-processor) systems. A fundamental limitation is the cache snooping operation performed whenever a computational thread accesses main memory. This increases memory latency as the number of processor sockets increases. This is particularly noticeable on applications such as WRF that are primarily CPU-bound, versus applications that are bound by input/output or communication. The new Cray XT5 supercomputer at ARSC features quad core processors, and will host a variety of scaling experiments for WRF, CCSM4, and other models. Early results will be presented, including a series of WRF runs for Alaska with grid resolutions under 2 km. ARSC will discuss a set of standardized test cases for the Alaska domain, similar to existing test cases for CONUS. These test cases will provide different configuration sizes and resolutions, suitable for single processors up to thousands. Beyond multi-core Opteron-based supercomputers, ARSC has examined WRF and other applications on additional emerging technologies. One such technology is the graphics processing unit, or GPU. The 9800-series nVidia GPU was evaluated with the cuBLAS software library. While in-socket GPUs might be forthcoming in the future, current

  15. Quantitative sleep EEG and polysomnographic predictors of driving simulator performance in obstructive sleep apnea.

    PubMed

    Vakulin, Andrew; D'Rozario, Angela; Kim, Jong-Won; Watson, Brooke; Cross, Nathan; Wang, David; Coeytaux, Alessandra; Bartlett, Delwyn; Wong, Keith; Grunstein, Ronald

    2016-02-01

    To improve identification of obstructive sleep apnea (OSA) patients at risk of driving impairment, this study explored predictors of driving performance impairment in untreated OSA patients using clinical PSG metrics, sleepiness questionnaires and quantitative EEG markers from routine sleep studies. Seventy-six OSA patients completed sleepiness questionnaires and driving simulator tests in the evening of their diagnostic sleep study. All sleep EEGs were subjected to quantitative power spectral analysis. Correlation and multivariate linear regression were used to identify the strongest predictors of driving simulator performance. Absolute EEG spectral power across all frequencies (0.5-32 Hz) throughout the entire sleep period and separately in REM and NREM sleep (r range 0.239-0.473, all p<0.05), as well as sleep onset latency (r=0.273, p<0.017), positively correlated with driving simulator steering deviation. Regression models revealed that amongst clinical and qEEG variables, the significant predictors of worse steering deviation were greater total EEG power during NREM and REM sleep, greater beta EEG power in NREM and greater delta EEG power in REM (range of variance explained 5-17%, t range 2.29-4.0, all p<0.05) and sleep onset latency (range of variance explained 4-9%, t range 2.15-2.5, all p<0.05). In OSA patients, increased EEG power, especially in the faster frequency (beta) range during NREM sleep and the slower frequency (delta) range in REM sleep, was associated with worse driving performance, while no relationships were observed with clinical metrics, e.g. apnea, arousal or oxygen indices. Quantitative EEG analysis in OSA may provide useful markers of driving impairment risk. Future studies are necessary to confirm these findings and assess the clinical significance of quantitative EEG as a predictor of driving impairment in OSA.
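
    A hedged sketch of the quantitative-EEG step described here: absolute band power from a Welch periodogram, then a simple linear fit against steering deviation. The synthetic cohort below only mimics the direction of the reported association and stands in for real recordings:

```python
import numpy as np
from scipy.signal import welch
from scipy.stats import linregress

def band_power(eeg, fs, f_lo, f_hi):
    """Absolute spectral power of one EEG channel in [f_lo, f_hi] Hz,
    integrated from a Welch power spectral density estimate."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(4 * fs))
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return np.trapz(psd[band], freqs[band])

# Illustrative cross-subject regression (synthetic stand-in data):
rng = np.random.default_rng(1)
beta_power = rng.lognormal(mean=1.0, sigma=0.3, size=76)     # NREM beta
steering_dev = 30 + 5 * np.log(beta_power) + rng.normal(0, 2, size=76)
fit = linregress(np.log(beta_power), steering_dev)
print(f"r = {fit.rvalue:.2f}, p = {fit.pvalue:.3g}")
```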

  16. Seismic Performance Evaluation of Concentrically Braced Frames

    NASA Astrophysics Data System (ADS)

    Hsiao, Po-Chien

    Concentrically braced frames (CBFs) are broadly used as lateral-load resisting systems in buildings throughout the US. In high seismic regions, special concentrically braced frames (SCBFs) are used where ductility under seismic loading is necessary. Their large elastic stiffness and strength efficiently sustain the seismic demands during smaller, more frequent earthquakes. During large, infrequent earthquakes, SCBFs exhibit highly nonlinear behavior due to brace buckling and yielding and the inelastic behavior induced by secondary deformation of the framing system. These response modes reduce the system demands relative to an elastic system without supplemental damping. In design, these reduced demands are estimated using a response modification coefficient, commonly termed the R factor. The R factor values are important to the seismic performance of a building. Procedures put forth in FEMA P695 were developed to establish R factors through a formalized procedure with the objective of a consistent level of collapse potential for all building types. The primary objective of the research was to evaluate the seismic performance of SCBFs. To achieve this goal, an improved model, including a proposed gusset plate connection model for SCBFs that permits accurate simulation of inelastic deformations of the brace, gusset plate connections, beams and columns, as well as brace fracture, was developed and validated using a large number of experiments. Response history analyses were conducted using the validated model. A series of SCBF buildings of different story heights were designed and evaluated. The FEMA P695 method and an alternate procedure were applied to SCBFs and NCBFs, where NCBFs are designed without ductile detailing. The evaluation using the P695 method shows results contrary to the alternate evaluation procedure and to current knowledge, namely that short-story SCBF structures are more vulnerable than their taller counterparts and that NCBFs are more vulnerable than SCBFs.

  17. Iowa Flood Center Model Performance Evaluation

    NASA Astrophysics Data System (ADS)

    Quintero, F.; Krajewski, W. F.; Mantilla, R.; Seo, B. C.

    2016-12-01

    We evaluated the performance of a hydrologic model which produces real-time flow forecasts. The model was developed by the Iowa Flood Center (IFC) and is implemented operationally to produce streamflow forecasts for the communities of the state of Iowa in the United States. The model parameters are calibration-free. It has a parsimonious structure that reproduces the more significant processes involved in the transformation from rainfall to runoff. The operational model uses a rainfall forcing produced by IFC, derived from the combination of rainfall fields of seven NEXRAD radars. This rainfall forcing does not include bias adjustment from rain gauges, due to the absence of a rain gauge network that would enable such correction in real time. Therefore, the model was also evaluated using the bias-adjusted rainfall product Stage IV. We used six years of IFC rainfall and Stage IV data to evaluate the performance of the hydrologic model and the sensitivity of the flow simulations to the model input. The model was not calibrated to any particular rainfall product. The distributed structure of the model allows results to be obtained at any channel of the drainage network. We produced simulated hydrographs at about 140 locations with different sub-basin spatial scales, where USGS streamflow observations are available. We compared flow simulations to observations and obtained several error metrics, including Nash-Sutcliffe efficiency, normalized root mean square error, volume error and time-to-peak error. We also evaluated the number of occurrences of hits and false alarms of discharge forecasts exceeding flood stage.
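
    The error metrics named above are standard and compact to implement. A minimal sketch, assuming aligned observed and simulated discharge series; the function names are illustrative:

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """NSE = 1 - sum((sim - obs)^2) / sum((obs - mean(obs))^2);
    1 is a perfect fit, 0 means no better than the observed mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def normalized_rmse(obs, sim):
    """Root mean square error normalized by the observed mean flow."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return np.sqrt(np.mean((sim - obs) ** 2)) / obs.mean()

def volume_error(obs, sim):
    """Relative error in total flow volume over the evaluation period."""
    return (np.sum(sim) - np.sum(obs)) / np.sum(obs)

def time_to_peak_error(obs, sim, dt_hours=1.0):
    """Timing offset of the simulated peak relative to the observed one [h]."""
    return (np.argmax(sim) - np.argmax(obs)) * dt_hours
```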

  18. Performance evaluation of swimmers: scientific tools.

    PubMed

    Smith, David J; Norris, Stephen R; Hogg, John M

    2002-01-01

    The purpose of this article is to provide a critical commentary of the physiological and psychological tools used in the evaluation of swimmers. The first-level evaluation should be the competitive performance itself, since it is at this juncture that all elements interplay and provide the 'highest form' of assessment. Competition video analysis of major swimming events has progressed to the point where it has become an indispensable tool for coaches, athletes, sport scientists, equipment manufacturers, and even the media. The breakdown of each swimming performance at the individual level to its constituent parts allows for comparison with the predicted or sought after execution, as well as allowing for comparison with identified world competition levels. The use of other 'on-going' monitoring protocols to evaluate training efficacy typically involves criterion 'effort' swims and specific training sets where certain aspects are scrutinised in depth. Physiological parameters that are often examined alongside swimming speed and technical aspects include oxygen uptake, heart rate, blood lactate concentration, blood lactate accumulation and clearance rates. Simple and more complex procedures are available for in-training examination of technical issues. Strength and power may be quantified via several modalities although, typically, tethered swimming and dry-land isokinetic devices are used. The availability of a 'swimming flume' does afford coaches and sport scientists a higher degree of flexibility in the type of monitoring and evaluation that can be undertaken. There is convincing evidence that athletes can be distinguished on the basis of their psychological skills and emotional competencies and that these differences become further accentuated as the athlete improves. No matter what test format is used (physiological, biomechanical or psychological), similar criteria of validity must be ensured so that the test provides useful and associative information

  19. Performance evaluation of MPEG internet video coding

    NASA Astrophysics Data System (ADS)

    Luo, Jiajia; Wang, Ronggang; Fan, Kui; Wang, Zhenyu; Li, Ge; Wang, Wenmin

    2016-09-01

    Internet Video Coding (IVC) has been developed in MPEG by combining well-known existing technology elements and new coding tools with royalty-free declarations. In June 2015, the IVC project was approved as ISO/IEC 14496-33 (MPEG-4 Internet Video Coding). It is believed that this standard can be highly beneficial for video services in the Internet domain. This paper evaluates the objective and subjective performance of IVC by comparing it against Web Video Coding (WVC), Video Coding for Browsers (VCB) and AVC High Profile. Experimental results show that IVC's compression performance is approximately equal to that of the AVC High Profile for typical operational settings, both for streaming and low-delay applications, and is better than WVC and VCB.

  20. Evaluation of tests of maximum kicking performance.

    PubMed

    Markovic, G; Dizdar, D; Jaric, S

    2006-06-01

    Despite the important role of kicking in various athletic activities, the reliability of tests of maximum kicking performance has not been evaluated. The aim of the present study was to assess the reproducibility of performance in the standing kick, instep kick and drop kick. Male physical education students (n=77) were tested on maximum kicking performance by means of a standard Doppler radar gun. The maximal ball speeds in the standing kick, instep kick and drop kick (averaged across subjects and trials) were 19.8 ± 1.9 m/s, 26.7 ± 2.7 m/s and 25.3 ± 2.2 m/s, respectively. There were no significant differences in the tested performances among the consecutive kicking trials of each test. The intraclass correlation coefficients ranged between 0.94 and 0.96 (95% confidence intervals 0.93-0.97). The limits of agreement for maximum ball speed in all three tests ranged from 0.2 ± 1.4 m/s to 0.3 ± 1.3 m/s, suggesting that in 95% of repeated trials the ball speed might be from 1.2 m/s less to 1.6 m/s greater than the original estimate. The coefficients of variation for all kicking tests were between 2.6% and 3.3% (95% confidence intervals 2.2-3.9%), suggesting a low intra-subject variability. Due to their high reliability, relative simplicity, and the small number of participants needed to detect worthwhile changes, the evaluated kicking tests can be highly recommended for sport-specific profiling and early selection of young athletes, as well as for the assessment of training procedures and other interventions applied to individual teams of elite soccer, rugby or American football players.
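
    The reliability statistics used here (Bland-Altman limits of agreement, typical-error coefficient of variation) are standard. A minimal sketch, assuming two repeated trials per subject; the function names are illustrative:

```python
import numpy as np

def limits_of_agreement(trial1, trial2):
    """Bland-Altman 95% limits of agreement between two repeated trials."""
    d = np.asarray(trial1, float) - np.asarray(trial2, float)
    half_width = 1.96 * d.std(ddof=1)
    return d.mean() - half_width, d.mean() + half_width

def coefficient_of_variation(trial1, trial2):
    """Typical-error CV (%) for test-retest data: the SD of the
    trial-to-trial differences divided by sqrt(2), over the grand mean."""
    t1, t2 = np.asarray(trial1, float), np.asarray(trial2, float)
    typical_error = (t1 - t2).std(ddof=1) / np.sqrt(2.0)
    return 100.0 * typical_error / np.concatenate([t1, t2]).mean()
```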

  1. Performance evaluation of bound diamond ring tools

    SciTech Connect

    Piscotty, M.A.; Taylor, J.S.; Blaedel, K.L.

    1995-07-14

    LLNL is collaborating with the Center for Optics Manufacturing (COM) and the American Precision Optics Manufacturers Association (APOMA) to optimize bound diamond ring tools for the spherical generation of high quality optical surfaces. An important element of this work is establishing an experimentally-verified link between tooling properties and workpiece quality indicators such as roughness, subsurface damage and removal rate. In this paper, we report on a standardized methodology for assessing ring tool performance and its preliminary application to a set of commercially-available wheels. Our goals are to (1) assist optics manufacturers (users of the ring tools) in evaluating tools and in assessing their applicability for a given operation, and (2) provide performance feedback to wheel manufacturers to help optimize tooling for the optics industry. Our paper includes measurements of wheel performance for three 2-4 micron diamond bronze-bond wheels that were supplied by different manufacturers to nominally-identical specifications. Preliminary data suggest that the differences in performance levels among the wheels were small.

  2. Evaluating cryostat performance for naval applications

    NASA Astrophysics Data System (ADS)

    Knoll, David; Willen, Dag; Fesmire, James; Johnson, Wesley; Smith, Jonathan; Meneghelli, Barry; Demko, Jonathan; George, Daniel; Fowler, Brian; Huber, Patti

    2012-06-01

    The Navy intends to use High Temperature Superconducting Degaussing (HTSDG) coil systems on future Navy platforms. The Navy Metalworking Center (NMC) is leading a team that is addressing cryostat configuration and manufacturing issues associated with fabricating long lengths of flexible, vacuum-jacketed cryostats that meet Navy shipboard performance requirements. The project includes provisions to evaluate reliability performance, as well as proofing of fabrication techniques. Navy cryostat performance specifications include less than 1 W/m heat loss, 2 MPa working pressure, and a 25-year vacuum life. Cryostat multilayer insulation (MLI) systems developed on the project have been validated using a standardized cryogenic test facility and implemented on 5-meter-long test samples. Performance data from these test samples, which were characterized using both LN2 boiloff and flow-through measurement techniques, will be presented. NMC is working with an Integrated Project Team consisting of Naval Sea Systems Command, Naval Surface Warfare Center-Carderock Division, Southwire Company, nkt cables, Oak Ridge National Laboratory (ORNL), ASRC Aerospace, and NASA Kennedy Space Center (NASA-KSC) to complete these efforts.

  3. Quantitative analysis of benzodiazepines in vitreous humor by high-performance liquid chromatography

    PubMed Central

    Bazmi, Elham; Behnoush, Behnam; Akhgari, Maryam; Bahmanabadi, Leila

    2016-01-01

    Objective: Benzodiazepines are frequently screened drugs in emergency toxicology, drugs-of-abuse testing, and in forensic cases. Because variations in benzodiazepine concentrations in biological samples due to bleeding, postmortem changes, and redistribution can bias forensic medicine examinations, selecting a suitable sample and a validated, accurate method is essential for the quantitative analysis of this major drug category. The aim of this study was to develop a valid method for the determination of four benzodiazepines (flurazepam, lorazepam, alprazolam, and diazepam) in vitreous humor using liquid–liquid extraction and high-performance liquid chromatography. Methods: Sample preparation was carried out using liquid–liquid extraction with n-hexane:ethyl acetate and subsequent detection by a high-performance liquid chromatography method coupled to a diode array detector. This method was applied to quantify benzodiazepines in 21 authentic vitreous humor samples. A linear curve for each drug was obtained within the range of 30–3000 ng/mL with a coefficient of correlation higher than 0.99. Results: The limits of detection and quantitation were 30 and 100 ng/mL, respectively, for all four drugs. The method showed appropriate intra- and inter-day precision (coefficient of variation < 10%). Benzodiazepine recoveries were estimated to be over 80%. The method showed high selectivity; no additional peak due to interfering substances in samples was observed. Conclusion: The present method was selective, sensitive, accurate, and precise for the quantitative analysis of benzodiazepines in vitreous humor samples in the forensic toxicology laboratory. PMID:27635251
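
    To make the validation figures concrete, here is a small sketch of linear calibration with ICH-style LOD/LOQ estimation from residual noise and slope; the calibration points and peak areas are invented for illustration, not the paper's data:

```python
import numpy as np
from scipy.stats import linregress

# Hypothetical calibration standards for one analyte: conc [ng/mL] vs peak area.
conc = np.array([30, 100, 300, 1000, 3000], dtype=float)
area = np.array([0.9, 3.1, 9.2, 30.5, 91.0])

fit = linregress(conc, area)
print(f"r = {fit.rvalue:.4f}")          # linearity acceptance: r > 0.99

# ICH-style detection/quantitation limits from the calibration residuals:
residual_sd = np.std(area - (fit.slope * conc + fit.intercept), ddof=2)
lod = 3.3 * residual_sd / fit.slope
loq = 10.0 * residual_sd / fit.slope

# Back-calculating an unknown from its measured peak area:
unknown_area = 12.4
print(f"LOD ~ {lod:.0f} ng/mL, LOQ ~ {loq:.0f} ng/mL, "
      f"C_unknown = {(unknown_area - fit.intercept) / fit.slope:.0f} ng/mL")
```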

  4. Analytical and clinical performance of a new molecular assay for Epstein-Barr virus DNA quantitation.

    PubMed

    Hübner, Margit; Bozic, Michael; Konrad, Petra M; Grohs, Katharina; Santner, Brigitte I; Kessler, Harald H

    2015-02-01

    Quantitation of EBV DNA has been shown to be a useful tool to identify and monitor patients with immunosuppression and high risk for EBV-associated disease. In this study, the analytical and clinical performance of the new Realquality RS-EBV Kit (AB Analitica, Padova, Italy) was investigated. The clinical performance was compared to that of the EBV R-gene (bioMerieux, Varilhes, France) assay. When the accuracy of the new assay was tested, all results except one were found to be within ±0.5 log10 unit of the expected panel results. Determination of linearity showed a quasilinear curve; the between-day imprecision ranged from 18% to 88% and the within-run imprecision from 16% to 53%. When 96 clinical EDTA whole blood samples were tested, 77 concordant and 19 discordant results were obtained. When the results for the 69 samples quantifiable with both assays were compared, the new assay yielded measurements that were on average 0.31 log10 unit higher. The new assay proved to be suitable for the detection and quantitation of EBV DNA in EDTA whole blood in the routine diagnostic laboratory. The variation between quantitative results obtained by the assays used in this study reinforces the use of calibrators traceable to the existing international WHO standard, making different assays more comparable.

  5. Mechanical Model Analysis for Quantitative Evaluation of Liver Fibrosis Based on Ultrasound Tissue Elasticity Imaging

    NASA Astrophysics Data System (ADS)

    Shiina, Tsuyoshi; Maki, Tomonori; Yamakawa, Makoto; Mitake, Tsuyoshi; Kudo, Masatoshi; Fujimoto, Kenji

    2012-07-01

    Precise evaluation of the stage of chronic hepatitis C with respect to fibrosis has become an important issue to prevent the occurrence of cirrhosis and to initiate appropriate therapeutic intervention such as viral eradication using interferon. Ultrasound tissue elasticity imaging, i.e., elastography, can visualize tissue hardness/softness, and its clinical usefulness has been studied to detect and evaluate tumors. We have recently reported that the texture of the elasticity image changes as fibrosis progresses. To evaluate fibrosis progression quantitatively on the basis of ultrasound tissue elasticity imaging, we introduced a mechanical model of fibrosis progression, simulated the process by which hepatic fibrosis affects elasticity images, and compared the results with those of clinical data analysis. As a result, it was confirmed that even in diffuse diseases like chronic hepatitis, the patterns of elasticity images are related to fibrous structural changes caused by hepatic disease and can be used to derive features for quantitative evaluation of fibrosis stage.

  6. Evaluating the Effectiveness of Remedial Reading Courses at Community Colleges: A Quantitative Study

    ERIC Educational Resources Information Center

    Lavonier, Nicole

    2014-01-01

    The present study evaluated the effectiveness of two instructional approaches for remedial reading courses at a community college. The instructional approaches were strategic reading and traditional, textbook-based instruction. The two research questions that guided the quantitative, quasi-experimental study were: (a) what is the effect of…

  8. An Elephant in the Room: Bias in Evaluating a Required Quantitative Methods Course

    ERIC Educational Resources Information Center

    Fletcher, Joseph F.; Painter-Main, Michael A.

    2014-01-01

    Undergraduate Political Science programs often require students to take a quantitative research methods course. Such courses are typically among the most poorly rated. This can be due, in part, to the way in which courses are evaluated. Students are generally asked to provide an overall rating, which, in turn, is widely used by students, faculty,…

  10. Complementarity as a Program Evaluation Strategy: A Focus on Qualitative and Quantitative Methods.

    ERIC Educational Resources Information Center

    Lafleur, Clay

    Use of complementarity as a deliberate and necessary program evaluation strategy is discussed. Quantitative and qualitative approaches are viewed as complementary and can be integrated into a single study. The synergy that results from using complementary methods in a single study seems to enhance understanding and interpretation. A review of the…

  11. Are Teacher Course Evaluations Biased against Faculty That Teach Quantitative Methods Courses?

    ERIC Educational Resources Information Center

    Royal, Kenneth D.; Stockdale, Myrah R.

    2015-01-01

    The present study investigated graduate students' responses to teacher/course evaluations (TCE) to determine if students' responses were inherently biased against faculty who teach quantitative methods courses. Item response theory (IRT) and Differential Item Functioning (DIF) techniques were utilized for data analysis. Results indicate students…

  12. Quantitative Skills as a Graduate Learning Outcome: Exploring Students' Evaluative Expertise

    ERIC Educational Resources Information Center

    Matthews, Kelly E.; Adams, Peter; Goos, Merrilyn

    2017-01-01

    In the biosciences, quantitative skills are an essential graduate learning outcome. Efforts to evidence student attainment at the whole of degree programme level are rare and making sense of such data is complex. We draw on assessment theories from Sadler (evaluative expertise) and Boud (sustainable assessment) to interpret final-year bioscience…

  13. Raman spectral imaging for quantitative contaminant evaluation in skim milk powder

    USDA-ARS?s Scientific Manuscript database

    This study uses a point-scan Raman spectral imaging system for quantitative detection of melamine in milk powder. A sample depth of 2 mm and corresponding laser intensity of 200 mW were selected after evaluating the penetration of a 785 nm laser through milk powder. Horizontal and vertical spatial r...

  15. Quantitative Evaluation of a First Year Seminar Program: Relationships to Persistence and Academic Success

    ERIC Educational Resources Information Center

    Jenkins-Guarnieri, Michael A.; Horne, Melissa M.; Wallis, Aaron L.; Rings, Jeffrey A.; Vaughan, Angela L.

    2015-01-01

    In the present study, we conducted a quantitative evaluation of a novel First Year Seminar (FYS) program with a coordinated curriculum implemented at a public, four-year university to assess its potential role in undergraduate student persistence decisions and academic success. Participants were 2,188 first-year students, 342 of whom completed the…

  17. Toward Web-Site Quantitative Evaluation: Defining Quality Characteristics and Attributes.

    ERIC Educational Resources Information Center

    Olsina, L; Rossi, G.

    This paper identifies World Wide Web site characteristics and attributes and groups them in a hierarchy. The primary goal is to classify the elements that might be part of a quantitative evaluation and comparison process. In order to effectively select quality characteristics, different users' needs and behaviors are considered. Following an…

  18. Integrating Qualitative Methods in a Predominantly Quantitative Evaluation: A Case Study and Some Reflections.

    ERIC Educational Resources Information Center

    Mark, Melvin M.; Feller, Irwin; Button, Scott B.

    1997-01-01

    A review of qualitative methods used in a predominantly quantitative evaluation indicates a variety of roles for such a mixing of methods, including framing and revising research questions, assessing the validity of measures and adaptations to program implementation, and gauging the degree of uncertainty and generalizability of conclusions.…

  19. Evaluation of a quantitative phosphorus transport model for potential improvement of southern phosphorus indices

    USDA-ARS?s Scientific Manuscript database

    Due to a shortage of available phosphorus (P) loss data sets, simulated data from a quantitative P transport model could be used to evaluate a P-index. However, the model would need to accurately predict the P loss data sets that are available. The objective of this study was to compare predictions ...

  20. Evaluation of stroke performance in tennis.

    PubMed

    Vergauwen, L; Spaepen, A J; Lefevre, J; Hespel, P

    1998-08-01

    In the present studies, the Leuven Tennis Performance Test (LTPT), a newly developed test procedure to measure stroke performance under match-like conditions in elite tennis players, was evaluated as to its value for research purposes. The LTPT is enacted on a regular tennis court. It consists of first and second services, and of returning balls projected by a machine to target zones indicated by a lighted sign. Neutral, defensive, and offensive tactical situations are elicited by appropriately programming the machine. Stroke quality is determined from simultaneous measurements of error rate, ball velocity, and precision of ball placement. A velocity/precision (VP) and a velocity/precision/error (VPE) index are also calculated. The validity and sensitivity of the LTPT were determined by verifying whether LTPT scores reflect minor differences in tennis ranking on the one hand and the effects of fatigue on the other hand. Compared with lower ranked players, higher ranked players made fewer errors (P < 0.05). In addition, stroke velocity was higher (P < 0.05), and lateral stroke precision, VP, and VPE scores were better (P < 0.05) in the latter. Furthermore, fatigue induced by a prolonged tennis load increased (P < 0.05) error rate and decreased (P < 0.05) stroke velocity and the VP and VPE indices. It is concluded that the LTPT is an accurate, reliable, and valid instrument for the evaluation of stroke quality in high-level tennis players.

  1. Improvement and quantitative performance estimation of the back support muscle suit.

    PubMed

    Muramatsu, Y; Umehara, H; Kobayashi, H

    2013-01-01

    We have been developing a wearable muscle suit for direct and physical motion support. The use of the McKibben artificial muscle has opened the way to the introduction of "muscle suits": compact, lightweight, reliable, wearable "assist-bots" enabling manual workers to lift and carry weights. Since back pain is the most serious problem for manual workers, this paper presents improvements to the back support muscle suit made during a feasibility study, together with a quantitative performance estimation. The structure of the upper body frame, the method of attachment to the body, and the addition of axes are explained as improvements. In the experiments, we investigated quantitative performance and efficiency of the back support muscle suit in terms of vertical lifting of heavy weights by employing integral electromyography (IEMG). The results indicated that IEMG values were reduced by about 40% by using the muscle suit.
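
    IEMG is simply the time integral of the rectified EMG signal, so a reduction figure like the ~40% reported above can be computed in a few lines. The signals below are synthetic stand-ins, not the study's recordings:

```python
import numpy as np

def iemg(emg, fs):
    """Integrated EMG: time integral of the full-wave rectified signal."""
    return np.trapz(np.abs(emg), dx=1.0 / fs)

# Illustrative comparison of a 5 s lift with vs. without the suit
# (zero-mean noise amplitude stands in for muscle activation level).
rng = np.random.default_rng(2)
fs = 1000.0
without_suit = rng.normal(0, 1.0, int(5 * fs))  # higher activation
with_suit = rng.normal(0, 0.6, int(5 * fs))     # assist lowers activation
reduction = 1.0 - iemg(with_suit, fs) / iemg(without_suit, fs)
print(f"IEMG reduction ~ {100 * reduction:.0f} %")  # ~40% here
```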

  2. Style-independent document labeling: design and performance evaluation

    NASA Astrophysics Data System (ADS)

    Mao, Song; Kim, Jong Woo; Thoma, George R.

    2003-12-01

    The Medical Article Records System or MARS has been developed at the U.S. National Library of Medicine (NLM) for automated data entry of bibliographical information from medical journals into MEDLINE, the premier bibliographic citation database at NLM. Currently, a rule-based algorithm (called ZoneCzar) is used for labeling important bibliographical fields (title, author, affiliation, and abstract) on medical journal article page images. While rules have been created for medical journals with regular layout types, new rules have to be manually created for any input journals with arbitrary or new layout types. Therefore, it is of interest to label journal articles independently of their layout styles. In this paper, we first describe a system (called ZoneMatch) for automated generation of crucial geometric and non-geometric features of important bibliographical fields based on string-matching and clustering techniques. The rule-based algorithm is then modified to use these features to perform style-independent labeling. We then describe a performance evaluation method for quantitatively evaluating our algorithm and characterizing its error distributions. Experimental results show that the labeling performance of the rule-based algorithm is significantly improved when the generated features are used.

  3. Universal and Quantitative Method To Evaluate Inhibitor Potency for Cysteinome Proteins Using a Nonspecific Activity-Based Protein Profiling Probe.

    PubMed

    Sameshima, Tomoya; Tanaka, Yukiya; Miyahisa, Ikuo

    2017-06-13

    Recently, there have been a limited number of new, validated targets for small-molecule drug discovery in the pharmaceutical industry. Although there are approximately 30 000 genes in the human genome, only 2% are targeted by currently approved small-molecule drugs. One reason that many targets remain neglected by drug discovery programs is the absence of biochemical assays enabling evaluation of the potency of inhibitors in a quantitative and high-throughput manner. To overcome this issue, we developed a biochemical assay to evaluate the potency of both reversible and irreversible inhibitors using a nonspecific thiol-labeling fluorescent probe. The assay can be applied to any targets with a cysteine residue in a pocket that can accommodate small-molecule ligands. By constructing a mathematical model, we showed that the potency of compounds can be quantitatively evaluated by performing an activity-based protein profiling assay. In addition, the validity of the theory was confirmed experimentally using epidermal growth factor receptor kinase as a model target. This approach provides an assay system for targets for which biochemical assays cannot be developed. Our approach can potentially not only expand the number of exploitable targets but also accelerate the lead optimization process by providing quantitative structure-activity relationship information.

  4. Performance Assessment of Human and Cattle Associated Quantitative Real-time PCR Assays - slides

    EPA Science Inventory

    The presentation overview is (1) Single laboratory performance assessment of human- and cattle associated PCR assays and (2) A Field Study: Evaluation of two human fecal waste management practices in Ohio watershed.

  6. Genetic toxicology at the crossroads-from qualitative hazard evaluation to quantitative risk assessment.

    PubMed

    White, Paul A; Johnson, George E

    2016-05-01

    Applied genetic toxicology is undergoing a transition from qualitative hazard identification to quantitative dose-response analysis and risk assessment. To facilitate this change, the Health and Environmental Sciences Institute (HESI) Genetic Toxicology Technical Committee (GTTC) sponsored a workshop held in Lancaster, UK on July 10-11, 2014. The event included invited speakers from several institutions, and the content was divided into three themes: (1) Point-of-departure Metrics for Quantitative Dose-Response Analysis in Genetic Toxicology; (2) Measurement and Estimation of Exposures for Better Extrapolation to Humans; and (3) The Use of Quantitative Approaches in Genetic Toxicology for human health risk assessment (HHRA). A host of pertinent issues were discussed relating to the use of in vitro and in vivo dose-response data, the development of methods for in vitro to in vivo extrapolation, and approaches to using in vivo dose-response data to determine human exposure limits for regulatory evaluations and decision-making. This Special Issue, which was inspired by the workshop, contains a series of papers that collectively address topics related to the aforementioned themes. The Issue includes contributions that collectively evaluate, describe and discuss in silico, in vitro, in vivo and statistical approaches that are facilitating the shift from qualitative hazard evaluation to quantitative risk assessment. The use and application of the benchmark dose approach was a central theme in many of the workshop presentations and discussions, and the Special Issue includes several contributions that outline novel applications for the analysis and interpretation of genetic toxicity data. Although the contents of the Special Issue constitute an important step towards the adoption of quantitative methods for regulatory assessment of genetic toxicity, formal acceptance of quantitative methods for HHRA and regulatory decision-making will require consensus regarding the

  7. Flexor and extensor muscle tone evaluated using the quantitative pendulum test in stroke and parkinsonian patients.

    PubMed

    Huang, Han-Wei; Ju, Ming-Shaung; Lin, Chou-Ching K

    2016-05-01

    The aim of this study was to evaluate the flexor and extensor muscle tone of the upper limbs in patients with spasticity or rigidity and to investigate the difference in hypertonia between spasticity and rigidity. The two experimental groups consisted of stroke patients and parkinsonian patients. The control group consisted of age- and sex-matched normal subjects. Quantitative upper limb pendulum tests starting from both flexed and extended joint positions were conducted. System identification with a simple linear model was performed and model parameters were derived. The differences between the three groups and the two starting positions were investigated through these model parameters and tested by two-way analysis of variance. In total, 57 subjects were recruited, including 22 controls, 14 stroke patients and 21 parkinsonian patients. While the stiffness coefficient showed no difference among groups, the number of swings, relaxation index and damping coefficient showed changes indicating significant hypertonia in the two patient groups. There was no difference between these two patient groups. The test starting from the extended position consistently manifested higher muscle tone in all three groups. In conclusion, the hypertonia of parkinsonian and stroke patients could not be differentiated by the modified pendulum test; the elbow extensors showed a higher muscle tone in both control and patient groups; and hypertonia of both parkinsonian and stroke patients is velocity dependent.

  8. Development of a Novel Automated Hair Counting System for the Quantitative Evaluation of Laser Hair Removal.

    PubMed

    Lim, Hyoung-Woo; Cho, Minwoo; Lee, Dong-Hun; Koh, Wooseok; Kim, Youdan; Chung, Jin Ho; Kim, Sungwan

    2017-02-01

    We aimed to develop and validate a novel computer-assisted automated hair counting system for the quantitative evaluation of laser hair removal (LHR). We developed a computer-aided image processing system to count hairs on shaved skin and validated its performance through clinical trials. Five subjects of Fitzpatrick skin type III-IV were enrolled and tested on both thighs. The system automatically detects hair and places a "+" sign on each hair site for every positive detection. This method allows clinicians to check whether a hair has been counted or not. We analyzed the differences in hair counts between the proposed system (automatic) and human observers (manual). The hair counts from the proposed system and the manual counts were compared. The percentage error between automatic and manual counting was <5% in each subject. The data of the two groups were statistically verified with Student's independent t-test. The averages were statistically equivalent between the two groups. The proposed system showed significant time saving in terms of counting. A dependable, accurate, and fast method of counting hairs on shaved skin through a computer-aided image processing system was developed and validated. The "+" signs on the image indicating detections allow clinicians to compare with the original image and detect any omission or redundancy. The proposed system is expected to be reliable in analyzing the results of multiple skin-related treatments, including LHR and hair transplantation. Further, it is expected to be widely applicable for use in the clinic.
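
    A plausible core for such a counter, assuming dark hairs on lighter shaved skin, is thresholding followed by connected-component labelling. This sketch is not the authors' algorithm; the threshold and minimum-size values are illustrative:

```python
import numpy as np
from scipy import ndimage

def count_hairs(gray_image, threshold=60, min_pixels=5):
    """Count dark hair segments on shaved skin by thresholding and
    connected-component labelling; returns the count and the centroid
    of each detection (where a "+" mark would be drawn)."""
    mask = gray_image < threshold                   # hairs darker than skin
    labels, n = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
    keep = np.flatnonzero(sizes >= min_pixels) + 1  # drop speckle noise
    centroids = ndimage.center_of_mass(mask, labels, keep)
    return len(keep), centroids

def percent_error(automatic, manual):
    """Percentage disagreement against the manual (reference) count."""
    return 100.0 * abs(automatic - manual) / manual
```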

  9. Hygienization by anaerobic digestion: comparison between evaluation by cultivation and quantitative real-time PCR.

    PubMed

    Lebuhn, M; Effenberger, M; Garcés, G; Gronauer, A; Wilderer, P A

    2005-01-01

    In order to assess hygienization by anaerobic digestion, evaluation by cultivation was compared with quantitative real-time PCR (qPCR), including optimized DNA extraction and quantification, for samples from a full-scale fermenter cascade (F1, mesophilic; F2, thermophilic; F3, mesophilic). The system was highly effective in inactivating (pathogenic) viable microorganisms, except for spore-formers. Conventionally performed cultivation underestimated viable organisms, particularly in F2 and F3, by a factor of at least 10, as shown by data from extended incubation times, probably due to the presence of sublethally injured (active but not cultivable) cells. Incubation should hence be extended adequately in incubation-based hygiene monitoring of stressed samples in order to minimize contamination risks. Although results from qPCR and cultivation agreed for the equilibrated compartments, considerably higher qPCR values were obtained for the fermenters. The difference probably corresponded to DNA copies from decayed cells that had not yet been degraded by the residual microbial activity. Extrapolation from qPCR determination to the quantity of viable organisms is hence not justified for samples that have been exposed to lethal stress.

  10. Quantitative evaluation of radiation-induced changes in sperm morphology and chromatin distribution

    SciTech Connect

    Aubele, M.; Juetting, U.R.; Rodenacker, K.; Gais, P.; Burger, G.; Hacker-Klom, U.

    1990-01-01

    Sperm head cytometry provides a useful assay for the detection of radiation-induced damage in mouse germ cells. Exposure of the gonads to radiation is known to lead to an increase in diploid and higher-polyploid sperm and in sperm with head-shape abnormalities. In the pilot studies reported here, quantitative analysis of the total DNA content, morphology, and chromatin distribution of mouse sperm was performed. The goal was to evaluate the discriminative power of features derived by high-resolution image cytometry in distinguishing sperm of control and irradiated mice. Our results suggest that, besides the above-mentioned variations in DNA content and sperm-head shape, changes in the nonhomogeneous chromatin distribution within the sperm may also be used to quantify the radiation effect on sperm cells. Whereas the chromatin distribution features show larger variations for sperm 21 days post-irradiation (dpr), the shape parameters seem to be more important for discriminating sperm 35 dpr. This may be explained by differentiation processes that take place at different stages of mouse spermatogenesis.

  11. Quantitative susceptibility mapping to evaluate the early stage of Alzheimer's disease.

    PubMed

    Kim, Hyug-Gi; Park, Soonchan; Rhee, Hak Young; Lee, Kyung Mi; Ryu, Chang-Woo; Rhee, Sun Jung; Lee, Soo Yeol; Wang, Yi; Jahng, Geon-Ho

    2017-01-01

    The objective of this study was to evaluate susceptibility changes caused by iron accumulation in cognitively normal (CN) elderly subjects, subjects with amnestic mild cognitive impairment (aMCI), and subjects with early-stage AD, and to compare the findings with gray matter volume (GMV) changes caused by neuronal loss. The participants comprised 19 elderly CN, 19 aMCI, and 19 AD subjects. Voxel-based quantitative susceptibility maps (QSM) and GMV maps of the brain were calculated for each participant. Differences in the QSM data and GMVs among the three groups were investigated by voxel-based and region-of-interest (ROI)-based comparisons using a one-way analysis of covariance (ANCOVA) with gender and age as covariates. Finally, a receiver-operating-characteristic (ROC) curve analysis was performed. The voxel-based results showed that QSM demonstrated more areas with significant differences between the CN and AD groups than GMV did. GMVs were decreased, whereas QSM values were increased, in the aMCI and AD groups compared with the CN group. QSM differentiated aMCI from CN better than GMV in the precuneus and allocortex regions. In regions where iron and amyloid-β accumulate, QSM can be used to differentiate between the CN and aMCI groups, indicating that it may serve as a useful auxiliary imaging method for the early diagnosis of AD.
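
    As an illustration of the ROC analysis mentioned above, the sketch below scores how well a mean regional QSM value separates CN from aMCI subjects. The group sizes match the study, but the susceptibility values are simulated placeholders, not the study's measurements.

    ```python
    # Sketch of a group-discrimination ROC analysis on a regional QSM value.
    import numpy as np
    from sklearn.metrics import roc_auc_score, roc_curve

    # Hypothetical mean susceptibility (ppb) in a precuneus ROI per subject.
    qsm_cn = np.random.normal(10, 3, 19)     # cognitively normal
    qsm_amci = np.random.normal(15, 3, 19)   # amnestic MCI

    y = np.r_[np.zeros(19), np.ones(19)]     # 0 = CN, 1 = aMCI
    score = np.r_[qsm_cn, qsm_amci]

    auc = roc_auc_score(y, score)
    fpr, tpr, thresholds = roc_curve(y, score)
    best = np.argmax(tpr - fpr)              # Youden's J picks a cutoff
    print(f"AUC = {auc:.2f}, cutoff = {thresholds[best]:.1f} ppb")
    ```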

  12. The Evaluation and Quantitation of Dihydrogen Metabolism Using Deuterium Isotope in Rats

    PubMed Central

    Hyspler, Radomir; Ticha, Alena; Schierbeek, Henk; Galkin, Alexander; Zadak, Zdenek

    2015-01-01

    Purpose: Despite the significant interest in molecular hydrogen as an antioxidant in the last eight years, its quantitative metabolic parameters in vivo are still lacking, as is an appropriate method for determining hydrogen effectiveness in the mammalian organism under various conditions. Basic Procedures: Intraperitoneally applied deuterium gas was used as a metabolic tracer and deuterium enrichment was determined in the body water pool. In addition, in vitro experiments were performed using bovine heart submitochondrial particles to evaluate superoxide formation in Complex I of the respiratory chain. Main Findings: A significant oxidation of about 10% of the applied dose was found under physiological conditions in rats, proving its antioxidant properties. Hypoxia or endotoxin application did not exert any effect, whilst pure oxygen inhalation reduced deuterium oxidation. In the in vitro experiments, a significant reduction of superoxide formation by Complex I of the respiratory chain was found under the influence of hydrogen. The possible molecular mechanisms of the beneficial effects of hydrogen are discussed, with an emphasis on the role of iron-sulphur clusters in reactive oxygen species generation and on iron species-dihydrogen interaction. Principal Conclusions: According to our findings, hydrogen may be an efficient, non-toxic, highly bioavailable and low-cost antioxidant supplement for patients with pathological conditions involving ROS-induced oxidative stress. PMID:26103048

  13. The Evaluation and Quantitation of Dihydrogen Metabolism Using Deuterium Isotope in Rats.

    PubMed

    Hyspler, Radomir; Ticha, Alena; Schierbeek, Henk; Galkin, Alexander; Zadak, Zdenek

    2015-01-01

    Despite the significant interest in molecular hydrogen as an antioxidant in the last eight years, its quantitative metabolic parameters in vivo are still lacking, as is an appropriate method for determining hydrogen effectiveness in the mammalian organism under various conditions. Intraperitoneally applied deuterium gas was used as a metabolic tracer and deuterium enrichment was determined in the body water pool. In addition, in vitro experiments were performed using bovine heart submitochondrial particles to evaluate superoxide formation in Complex I of the respiratory chain. A significant oxidation of about 10% of the applied dose was found under physiological conditions in rats, proving its antioxidant properties. Hypoxia or endotoxin application did not exert any effect, whilst pure oxygen inhalation reduced deuterium oxidation. In the in vitro experiments, a significant reduction of superoxide formation by Complex I of the respiratory chain was found under the influence of hydrogen. The possible molecular mechanisms of the beneficial effects of hydrogen are discussed, with an emphasis on the role of iron-sulphur clusters in reactive oxygen species generation and on iron species-dihydrogen interaction. According to our findings, hydrogen may be an efficient, non-toxic, highly bioavailable and low-cost antioxidant supplement for patients with pathological conditions involving ROS-induced oxidative stress.
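
    The reported ~10% oxidation is the kind of number that falls out of a simple tracer mass balance: deuterium appearing in the body water pool divided by deuterium in the applied dose. The sketch below illustrates that arithmetic only; every number in it (rat mass, body-water fraction, dose, measured enrichment) is an assumed placeholder chosen to give a round result, not data from the paper.

    ```python
    # Back-of-the-envelope D2 tracer mass balance for a ~300 g rat.
    body_mass_g = 300.0
    water_g = 0.67 * body_mass_g          # total body water, rough estimate
    mol_h2o = water_g / 18.0
    mol_h_atoms = 2 * mol_h2o             # hydrogen atoms in the water pool

    dose_ml_d2 = 10.0                     # i.p. D2 gas dose at STP (assumed)
    mol_d2 = dose_ml_d2 / 22400.0
    mol_d_atoms_dose = 2 * mol_d2         # each D2 carries two D atoms

    ape = 4.0e-6                          # atom fraction excess in body water (assumed)
    mol_d_in_water = ape * mol_h_atoms    # D atoms that ended up as D2O/HDO

    fraction_oxidized = mol_d_in_water / mol_d_atoms_dose
    print(f"fraction of dose oxidized ≈ {fraction_oxidized:.1%}")   # ≈ 10%
    ```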

  14. Efficacy of fluoride varnishes for preventing enamel demineralization after interproximal enamel reduction. Qualitative and quantitative evaluation

    PubMed Central

    González Paz, Belén Manuela; García López, José

    2017-01-01

    Objectives: To evaluate quantitatively and qualitatively the changes produced in enamel subjected to demineralization cycles after interproximal reduction, following application of a fluoride varnish (Profluorid) or a fluoride varnish containing tricalcium phosphate modified by fumaric acid (Clinpro White). Materials and methods: 138 interproximal dental surfaces were divided into six groups: 1) intact enamel; 2) intact enamel + demineralization cycles (DC); 3) interproximal reduction (IR); 4) IR + DC; 5) IR + Profluorid + DC; 6) IR + Clinpro White + DC. IR was performed with a 0.5 mm cylindrical diamond bur. The weight percentages of calcium (Ca), phosphorus (P) and fluoride (F) were quantified by energy-dispersive X-ray spectrometry (EDX). Samples were examined under scanning electron microscopy (SEM). Results: The weight percentage of Ca was significantly higher (p<0.05) in Groups 1, 2 and 5 than in Groups 4 and 6. No significant differences in the weight percentage of Ca were detected between Group 3 and the other groups (p>0.05). The weight percentage of P was similar among all six groups (p>0.05). F was detected on 65% of Group 6 surfaces. SEM images of Groups 4 and 6 showed signs of demineralization, while those of Group 5 did not. Conclusions: Profluorid application acts as a barrier against demineralization of interproximally reduced enamel. PMID:28430810

  15. Standardizing evaluation of pQCT image quality in the presence of subject movement: qualitative versus quantitative assessment.

    PubMed

    Blew, Robert M; Lee, Vinson R; Farr, Joshua N; Schiferl, Daniel J; Going, Scott B

    2014-02-01

    Peripheral quantitative computed tomography (pQCT) is an essential tool for assessing bone parameters of the limbs, but subject movement and its impact on image quality remain a challenge to manage. The current approach to determining image viability is visual inspection, but pQCT lacks a quantitative evaluation. The aims of this study were therefore to (1) examine the reliability of a qualitative visual inspection scale and (2) establish a quantitative motion-assessment methodology. Scans were performed on 506 healthy girls (9-13 years) at diaphyseal regions of the femur and tibia. Scans were rated for movement independently by three technicians using a linear, nominal scale. Quantitatively, the ratio of movement to limb size (%Move) provided a measure of movement artifact. A repeat-scan subsample (n = 46) was examined to determine the impact of %Move on bone parameters. Agreement between raters was strong (intraclass correlation coefficient = 0.732 for the tibia, 0.812 for the femur), but greater variability was observed in scans rated 3 or 4, the delineation between repeat and no repeat. The quantitative approach found that ≥95% of subjects had %Move <25%. Comparison of initial and repeat scans, grouped by whether initial movement was above or below 25%, showed significant differences in the >25% group. A pQCT visual inspection scale can be a reliable metric of image quality, but technicians may periodically mischaracterize subject motion. The presented quantitative methodology yields more consistent movement assessment and could unify procedures across laboratories. The data suggest a threshold of 25% movement for determining whether a diaphyseal scan is viable or requires a repeat.
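
    The inter-rater reliability reported above is an intraclass correlation. As an illustration of how such a figure can be computed, the sketch below implements ICC(2,1) (two-way random effects, absolute agreement, single rater) on made-up ratings; it is not the authors' analysis code.

    ```python
    # Sketch: ICC(2,1) per Shrout & Fleiss for inter-rater agreement.
    import numpy as np

    def icc_2_1(x):
        """x: (n_targets, k_raters) matrix of ratings."""
        n, k = x.shape
        grand = x.mean()
        ss_rows = k * ((x.mean(axis=1) - grand) ** 2).sum()
        ss_cols = n * ((x.mean(axis=0) - grand) ** 2).sum()
        ss_err = ((x - grand) ** 2).sum() - ss_rows - ss_cols
        ms_rows = ss_rows / (n - 1)
        ms_cols = ss_cols / (k - 1)
        ms_err = ss_err / ((n - 1) * (k - 1))
        return (ms_rows - ms_err) / (
            ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
        )

    # Hypothetical movement ratings from three technicians on six scans.
    ratings = np.array([
        [1, 1, 2],
        [3, 3, 4],
        [2, 2, 2],
        [5, 4, 5],
        [1, 2, 1],
        [4, 4, 3],
    ])
    print(f"ICC(2,1) = {icc_2_1(ratings):.3f}")
    ```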

  16. Quantitative evaluation of translational medicine based on scientometric analysis and information extraction.

    PubMed

    Zhang, Yin; Diao, Tianxi; Wang, Lei

    2014-12-01

    Designed to advance the two-way translational process between basic research and clinical practice, translational medicine has become one of the most important areas in biomedicine. Quantitative evaluation of translational medicine is valuable for decision making in global translational medical research and funding. Using scientometric analysis and information-extraction techniques, this study quantitatively analyzed the scientific articles on translational medicine. The results showed that translational medicine has significant scientific output and impact, specific core fields and institutes, and outstanding academic status and benefit. Although not considered in this study, patent data are another important indicator that should be integrated into future research on this topic.

  17. Manipulator Performance Evaluation Using Fitts' Taping Task

    SciTech Connect

    Draper, J.V.; Jared, B.C.; Noakes, M.W.

    1999-04-25

    Metaphorically, a teleoperator with master controllers projects the user's arms and hands into a remote area. Therefore, human users interact with teleoperators at a more fundamental level than they do with most human-machine systems. Instead of inputting decisions about how the system should function, teleoperator users input the movements they might make if they were truly in the remote area, and the remote machine must recreate their trajectories and impedance. This intense human-machine interaction requires displays and controls more carefully attuned to human motor capabilities than is necessary with most systems. It is important for teleoperated manipulators to be able to recreate human trajectories and impedance in real time. One method for assessing manipulator performance is to observe how well a system behaves while a human user completes human dexterity tasks with it. Fitts' tapping task has been used many times in the past for this purpose. This report describes such a performance assessment. The International Submarine Engineering (ISE) Autonomous/Teleoperated Operations Manipulator (ATOM) servomanipulator system was evaluated using a generic positioning-accuracy task. The task is a simple one but has the merits of (1) producing a performance-function estimate rather than a point estimate and (2) having been widely used in the past for human and servomanipulator dexterity tests. Results of testing using this task may therefore allow comparison with other manipulators, and the task is generically representative of a broad class of tasks. Results of the testing indicate that the ATOM manipulator is capable of performing the task. Force reflection had a negative impact on task efficiency in these data, most likely because of the high resistance to movement the master controller exhibited with force reflection engaged. Measurements of exerted forces were not made, so it is not possible to say whether the force reflection helped participants.
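
    Fitts' tapping task reduces to a linear law relating movement time to an index of difficulty, which is what makes cross-device comparison possible. The sketch below shows the standard reduction; the distances, widths, and times are illustrative placeholders, not ATOM test data.

    ```python
    # Sketch of Fitts' law data reduction: ID = log2(2D/W), MT = a + b*ID.
    import numpy as np

    # Target distance D and width W (same units) per condition,
    # with hypothetical mean movement time MT (seconds).
    D = np.array([100.0, 200.0, 400.0, 400.0])
    W = np.array([40.0, 40.0, 40.0, 10.0])
    MT = np.array([0.55, 0.70, 0.85, 1.25])

    ID = np.log2(2 * D / W)            # Fitts' index of difficulty (bits)
    b, a = np.polyfit(ID, MT, 1)       # slope b and intercept a
    print(f"MT ≈ {a:.2f} + {b:.2f}·ID; bandwidth ≈ {1/b:.2f} bits/s")
    ```

    The fitted line is the "performance function" the report refers to: the slope's reciprocal gives an information-throughput figure, so two manipulators (or force-reflection settings) can be compared across the whole difficulty range rather than at a single point.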

  18. Quantitative evaluation of CART-containing cells in urinary bladder of rats with renovascular hypertension.

    PubMed

    Janiuk, I; Kasacka, I

    2015-04-13

    Recent biological advances make it possible to discover new peptides associated with hypertension. The cocaine- and amphetamine-regulated transcript (CART) is a known factor in appetite and feeding behaviour. Various lines of evidence suggest that this peptide participates not only in the control of feeding behaviour but also in the regulation of the cardiovascular and sympathetic systems and blood pressure. The role of CART in blood pressure regulation led us to undertake a study aimed at analysing quantitative changes in CART-containing cells in the urinary bladders (UB) of rats with renovascular hypertension. We used the Goldblatt model of arterial hypertension (two-kidney, one-clip) to evaluate quantitative changes. This model provides researchers with a commonly used tool for analysing the renin-angiotensin system of blood pressure control and, eventually, for developing drugs to treat chronic hypertension. The study was performed on sections of rat urinary bladders taken 3, 14, 28, 42 and 91 days after hypertension induction. Immunohistochemical identification of CART cells was performed on paraffin sections of the UBs of all the study animals. CART was detected in endocrine cells, especially numerous in the submucosa and muscularis layers, with a few found in the transitional epithelium and only occasionally in the serosa. Hypertension significantly increased the number of CART-positive cells in the rat UBs. At 3 and 42 days after the procedure, statistically significantly higher numbers of CART-positive cells were identified in comparison with the control animals. The differences between the hypertensive rats and the control animals concerned not only the number density of CART-immunoreactive cells but also their localization. After a 6-week period, each of the rats subjected to the renal artery clipping procedure had developed stable hypertension. CART appeared in numerous transitional epithelium cells. As this study provides novel findings, the question

  19. Quantitative Evaluation of CART-Containing Cells in Urinary Bladder of Rats with Renovascular Hypertension

    PubMed Central

    Janiuk, I.; Kasacka, I.

    2015-01-01

    Recent biological advances make it possible to discover new peptides associated with hypertension. The cocaine- and amphetamine-regulated transcript (CART) is a known factor in appetite and feeding behaviour. Various lines of evidence suggest that this peptide participates not only in the control of feeding behaviour but also in the regulation of the cardiovascular and sympathetic systems and blood pressure. The role of CART in blood pressure regulation led us to undertake a study aimed at analysing quantitative changes in CART-containing cells in the urinary bladders (UB) of rats with renovascular hypertension. We used the Goldblatt model of arterial hypertension (two-kidney, one-clip) to evaluate quantitative changes. This model provides researchers with a commonly used tool for analysing the renin-angiotensin system of blood pressure control and, eventually, for developing drugs to treat chronic hypertension. The study was performed on sections of rat urinary bladders taken 3, 14, 28, 42 and 91 days after hypertension induction. Immunohistochemical identification of CART cells was performed on paraffin sections of the UBs of all the study animals. CART was detected in endocrine cells, especially numerous in the submucosa and muscularis layers, with a few found in the transitional epithelium and only occasionally in the serosa. Hypertension significantly increased the number of CART-positive cells in the rat UBs. At 3 and 42 days after the procedure, statistically significantly higher numbers of CART-positive cells were identified in comparison with the control animals. The differences between the hypertensive rats and the control animals concerned not only the number density of CART-immunoreactive cells but also their localization. After a 6-week period, each of the rats subjected to the renal artery clipping procedure had developed stable hypertension. CART appeared in numerous transitional epithelium cells. As this study provides novel findings, the question

  20. Performance Evaluation of the SPT-140

    NASA Technical Reports Server (NTRS)

    Manzella, David; Sarmiento, Charles; Sankovic, John; Haag, Tom

    1997-01-01

    As part of an ongoing cooperative program with industry, an engineering-model SPT-140 Hall thruster, which may be suitable for orbit insertion and station-keeping of geosynchronous communication satellites, was evaluated with respect to thrust and radiated electromagnetic interference at the NASA Lewis Research Center. Performance measurements were made using a laboratory-model propellant feed system and commercial power supplies. The engine was operated in a space simulation chamber capable of providing background pressures of 4 x 10^-6 Torr or less during thruster operation. Thrust was measured at input powers ranging from 1.5 to 5 kilowatts with two different output filter configurations. The broadband electromagnetic emission spectra generated by the engine were also measured over a frequency range of 0.01 to 18,000 MHz. These results are compared to the noise threshold of the measurement system and to MIL-STD-461C where appropriate.
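
    From thrust, propellant mass flow rate, and input power, the standard Hall-thruster figures of merit follow directly. The sketch below shows that arithmetic with assumed round numbers typical of a thruster in the SPT-140's class, not the measured NASA Lewis data.

    ```python
    # Sketch: specific impulse and thrust efficiency from assumed values.
    G0 = 9.80665                      # standard gravity, m/s^2

    thrust_mN = 270.0                 # assumed thrust at ~4.5 kW
    mdot_mg_s = 15.0                  # assumed xenon mass flow rate
    power_W = 4500.0                  # assumed input power

    T = thrust_mN * 1e-3              # N
    mdot = mdot_mg_s * 1e-6           # kg/s
    isp = T / (mdot * G0)             # specific impulse, s
    efficiency = T**2 / (2 * mdot * power_W)   # jet power / input power

    print(f"Isp ≈ {isp:.0f} s, thrust efficiency ≈ {efficiency:.1%}")
    ```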